
CHINA: ‘The State Is Using Generative AI to Engineer Reality Through Informational Gaslighting’

By CIVICUS
Mar 18 2026 (IPS)

 
CIVICUS discusses China’s tech-enabled repression with Fergus Ryan, a Senior Analyst at the Australian Strategic Policy Institute (ASPI), where he specialises in how the Chinese Communist Party shapes global information environments through censorship, propaganda and platform governance. His research includes a major study on China’s AI ecosystem and its human rights impacts, as well as investigations into China’s use of foreign influencers.

Fergus Ryan

China’s authoritarian government is deploying AI at scale to censor, control and monitor its population. As these tools grow more sophisticated and are exported abroad, the implications for civic space extend far beyond China’s borders.

What AI systems is China developing?

Based on our research, China is rapidly developing a multi-layered AI ecosystem designed to expand state control.

Tech giants are building multimodal large language models (LLMs) such as Alibaba’s Qwen and Baidu’s Ernie Bot, which censor and reshape descriptions of politically sensitive images. Hardware companies including Dahua, Hikvision and SenseTime supply the camera networks that feed into these systems.

The state is building what amounts to an AI-driven criminal justice pipeline. This includes City Brain operations centres, such as the one in Shanghai’s Pudong district, which process massive volumes of surveillance data, as well as the 206 System, developed by iFlyTek, which analyses evidence and recommends criminal sentences. Inside prisons, AI monitors inmates’ facial expressions and tracks their emotions.

AI-enabled satellite surveillance, such as the Xinjiang Jiaotong-01, enables autonomous real-time tracking over politically sensitive regions. Additionally, AI-enabled fishing platforms such as Sea Eagle expand economic extraction in the exclusive economic zones of countries including Mauritania and Vanuatu, displacing artisanal fishing communities.

How does China use AI for censorship and policing?

China relies on a hybrid model of censorship that fuses the speed of AI with human political judgement. The government requires companies to self-censor, creating a commercial market for AI moderation tools. Tech giants such as Baidu and Tencent have industrialised this process: systems automatically scan images, text and videos to detect content deemed to be risky in real time, while human reviewers handle nuanced or coded speech.

In policing, City Brains ingest data from millions of cameras, drones and Internet of Things sensors and use AI to identify suspects, track vehicles and predict unrest before it happens. In Xinjiang, the Integrated Joint Operations Platform aggregates data from cameras, phone scanners and informants to generate risk scores for individuals, enabling pre-emptive detention based on behavioural patterns rather than specific crimes.

On platforms such as Douyin, the state does not just delete content; it algorithmically suppresses dissent while amplifying ‘positive energy’. AI links surveillance data directly to narrative control and police action.

What are the human rights impacts?

These AI systems erode the rights to freedom of expression, privacy and a fair trial.

Historically, online censorship meant deleting a post. Today, generative AI engages in ‘informational gaslighting’. When ASPI researchers showed an Alibaba LLM a photograph of a protest against human rights violations in Xinjiang, the AI described it as ‘individuals in a public setting holding signs with incorrect statements’ based on ‘prejudice and lies’. The technology subtly engineers reality, preventing users from accessing objective historical truths.

AI also undermines the right to a fair trial. In courts that lack judicial independence, AI systems that recommend sentences or predict recidivism act as a black box that defence lawyers cannot scrutinise.

Pervasive surveillance changes behaviour even when not actively used, so its chilling effect may be as significant as direct deployment. Knowing their conversations may be monitored, people self-censor online and in private messaging. Emotion recognition in prisons takes this further: people can theoretically be flagged for their internal states of mind. It’s not just actions that are punished, but also thoughts.

Which groups are most affected?

While AI-enabled surveillance affects all people, ethnic minorities such as Koreans, Mongolians, Tibetans and Uyghurs are disproportionately targeted.

Mainstream LLMs are trained primarily in Mandarin, leaving little commercial incentive to develop AI for minority languages. The Chinese state, however, views those languages as a security vulnerability. State-funded institutions, including the National Key Laboratory at Minzu University, are building LLMs in minority languages, not for cultural preservation, but to power public-opinion control and prevention platforms. These scan text, audio and video in Tibetan and Uyghur to detect cultural advocacy, dissent or religious activity.

Feminist activists, human rights lawyers (particularly since the 709 crackdown in 2015), labour activists and religious minorities including Falun Gong practitioners face disproportionate targeting. Chinese models consistently adopt state-aligned narratives about such groups, labelling Falun Gong a cult and avoiding human rights framing. Since 2020, Hong Kongers have also been subject to National Security Law surveillance using many of the same tools deployed on the mainland, a reminder that this infrastructure can be rapidly extended.

How can activists in China protect themselves?

Protecting oneself inside China is increasingly difficult. AI leaves very few blind spots. But the system is not perfectly omniscient.

Activists have historically relied on coded speech, euphemisms and satire, the classic example being the use of ‘Winnie the Pooh’ to refer to President Xi Jinping. Because AI struggles with cultural nuance and evolving memes, new linguistic workarounds can temporarily bypass automated filters. But this is a relentless game of Whac-a-Mole: Chinese tech companies employ thousands of human content reviewers whose only job is to catch new memes and feed them back into the AI.

The most practical steps are to use VPNs to access blocked platforms, secure communications apps such as Signal and separate devices for sensitive work. None of these are foolproof: VPN use is technically illegal and increasingly detected, and Signal itself can only be accessed via a VPN. It helps to keep a minimal digital footprint and communicate face-to-face on sensitive matters. For activists in Xinjiang, however, surveillance is so pervasive that individual precautions offer little protection. Strong international networks and rigorous documentation practices are essential.

Is China exporting these technologies?

China is the world’s largest exporter of AI-powered surveillance technology, marketing these systems globally, particularly to the global south.

The Chinese state is purposefully expanding its minority-language public-opinion monitoring software throughout Belt and Road Initiative countries, effectively extending its censorship apparatus to monitor Tibetan and Uyghur diaspora communities abroad. Chinese companies including Dahua, Hikvision, Huawei and ZTE have deployed surveillance and ‘safe city’ systems across over 100 countries, with Saudi Arabia and the United Arab Emirates among the most significant recipients. Critically, these companies operate under China’s 2017 National Intelligence Law, which requires cooperation with state intelligence, meaning data flowing through these systems could be accessible to Beijing as well as to purchasing governments.

China is also exporting its governance model through the open-source release of its LLMs, embedding Chinese censorship norms into foundational infrastructure used by developers worldwide.

What should the international community do?

The international community must recognise that countering these practices requires regulatory pushback.

First, democratic states should set minimum transparency standards for public procurement. This means refusing to purchase AI models that conceal political or historical censorship and mandating that providers publish a ‘moderation log’ with refusal reason codes so users know when content is restricted for political reasons.

Second, states should enact ‘safe-harbour laws’ to protect civil society organisations, journalists and researchers who audit AI models for hidden censorship. Currently, doing so can breach corporate terms of service.

Third, strict export controls should block the transfer of repression-enabling technologies to authoritarian regimes, while companies providing public-opinion management services should be excluded from democratic markets. Existing targeted sanctions on companies such as Dahua and Hikvision for their role in Xinjiang should be enforced more rigorously.

Finally, the international community must recognise that Chinese surveillance extends beyond China’s borders. Spyware targeting Tibetan and Uyghur activists in exile is well-documented, as is pressure on family members remaining in China. Rigorous documentation by international civil society remains essential for building the evidentiary record for future accountability.

CIVICUS interviews a wide range of civil society activists, experts and leaders to gather diverse perspectives on civil society action and current issues for publication on its CIVICUS Lens platform. The views expressed in interviews are the interviewees’ and do not necessarily reflect those of CIVICUS. Publication does not imply endorsement of interviewees or the organisations they represent.

GET IN TOUCH
Website
LinkedIn
Twitter/X

SEE ALSO
Technology: innovation without accountability CIVICUS | 2026 State of Civil Society Report
The silencing of Hong Kong CIVICUS Lens 25.Jun.2025
The long reach of authoritarianism CIVICUS Lens 20.Mar.2024

 


  