
How China’s Social Credit System Works

by Camilla Fatticcioni

China’s new social credit system guidelines tighten measures against dishonest behaviour. Companies classified as seriously dishonest will face significant restrictions, including bans on accessing government funds, tax incentives, and stock offerings. Sector-specific blacklists are also expanding, signalling stricter oversight in the real estate, internet services, human resources, and energy sectors.

In 2016, the dystopias portrayed in Black Mirror, the famous British sci-fi series, seemed light-years away from our everyday lives. Today, nearly a decade after the release of its third season, episodes like Nosedive (S3, E1) no longer surprise us—they feel like unsettling predictions of the present. In the world of Nosedive, for example, every individual receives a “reputation score” based on their interactions and online behaviour: a rating that becomes social currency, determining access to services, job opportunities, and even personal relationships.

Around the same time, China was already discussing how to make its 2014 social credit system proposal a tangible reality. At the time, international headlines sceptically compared Beijing’s decision to the TV series: “China, like an episode of Black Mirror.” Today, China’s social credit system is neither fully unified nor mature, but it represents an unprecedented experiment in digital social engineering.

The system proved particularly effective in 2020 during the Covid pandemic: the government relied on this infrastructure to track citizens’ movements and curb the spread of the virus, encouraging people to comply with public health measures. By cross-referencing data from transport systems, financial transactions, and online behaviour, it was possible to quickly isolate outbreaks and penalise those who broke the rules, helping, albeit partially, to contain the virus within the country.

China’s social credit system is a set of regulations and digital infrastructure designed to assess the “trustworthiness” of individuals, companies, and public bodies. It gathers data from a wide range of sources—such as payment histories, legal records, and online behaviour—to assign scores or place citizens on blacklists. The stated goal is to promote virtuous behaviour, deter law-breaking, and improve social “trust”. However, critics warn that the system results in pervasive surveillance, restrictions on individual freedoms, and the risk of errors or arbitrary discrimination.

In everyday life, the system can influence one’s ability to travel, obtain loans, book hotels, use bike-sharing services, enrol children in private schools, and even access faster internet connections—depending on one’s score or inclusion on a blacklist. Central authorities, including the Supreme People’s Court and the People’s Bank of China, have set national guidelines for managing exclusion lists targeting those who fail to comply with civil rulings, tax obligations, or administrative regulations.

Meanwhile, private operators such as Ant Financial—with its Zhima Credit service—have experimented with numeric scoring models based on spending behaviour, payment history, and social network connections, offering perks such as streamlined bureaucracy and fast-track loans for users with high ratings. To feed these assessments, the system draws on multiple data sources: mobile payment apps (Alipay, WeChat Pay) that track every transaction and repayment punctuality, public court records that list defaulters, real-time transport data to monitor movements and penalise cancellations or misuse, surveillance cameras and facial recognition in urban spaces, reports of civic violations—from failing to recycle to breaking traffic rules—and even quizzes and civic participation on government apps.
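To make the logic of this kind of scoring concrete, here is a minimal, purely hypothetical sketch in Python of how weighted signals might be combined into a single rating with a blacklist trigger. Every field name, weight, and cut-off below is invented for illustration; the actual criteria used by government pilots or by services such as Zhima Credit have never been published in this form.

```python
# Toy illustration only: aggregates a few hypothetical "trust" signals into one
# score and flags a blacklist condition. All fields, weights, and thresholds are
# invented for this sketch and do not reflect any real scoring criteria.
from dataclasses import dataclass


@dataclass
class CitizenRecord:
    on_time_payments: int   # bills and loan instalments paid punctually
    missed_payments: int    # defaults or late payments
    court_judgments: int    # unenforced civil rulings (a real-world blacklist trigger)
    civic_violations: int   # minor reported infractions (traffic, waste sorting, etc.)


def trust_score(r: CitizenRecord, base: int = 600) -> int:
    """Combine the signals into a single number, clamped to a 350-950 range."""
    score = (
        base
        + 5 * r.on_time_payments
        - 20 * r.missed_payments
        - 100 * r.court_judgments
        - 10 * r.civic_violations
    )
    return max(350, min(950, score))


def is_blacklisted(r: CitizenRecord) -> bool:
    """In this toy model, any unenforced judgment or a very low score triggers the list."""
    return r.court_judgments > 0 or trust_score(r) < 400


if __name__ == "__main__":
    record = CitizenRecord(on_time_payments=24, missed_payments=1,
                           court_judgments=0, civic_violations=2)
    print(trust_score(record), is_blacklisted(record))
```

In practice, as the article notes, the government side of the system leans mainly on exclusion lists tied to court, tax, and administrative records, while numeric ratings of this kind belong chiefly to private experiments such as Zhima Credit.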


On the one hand, facial recognition has made daily tasks easier—like buying train tickets without needing to pull out your wallet. On the other hand, it has raised growing concerns about privacy and control. The prospect that every action—even the most trivial—can be tracked, evaluated, and potentially penalised introduces a reward-punishment logic that risks stifling individual autonomy and dissent. It fosters a society in which human relationships are filtered through scores and algorithms, and where spontaneity gives way to constant performance.

In the West, although there is no unified social credit system like the one imagined by Black Mirror creator Charlie Brooker, many digital platforms employ similar reputation algorithms: think of e-commerce reviews, expert user badges on forums, or “likes” on social media. We choose restaurants with the best ratings, book hotels with the most positive reviews, and trust influencers with the highest number of followers and views. It’s a system we’ve long been familiar with—though it hasn’t (yet) taken the dystopian turn described in Black Mirror.

Camilla Fatticcioni

China scholar and photographer. After graduating in Chinese language from Ca’ Foscari University in Venice, Camilla lived in China from 2016 to 2020. In 2017, she began a master’s degree in Art History at the China Academy of Art in Hangzhou, taking an interest in archaeology and graduating in 2021 with a thesis on the Buddhist iconography of the Mogao caves in Dunhuang. Combining her passion for art and photography with the study of contemporary Chinese society, Camilla collaborates with several magazines and edits the Chinoiserie column for China Files.
