China is using AI to stalk its citizens and predict future crimes. What could possibly go wrong?

By Malavika Pradeep

Published Jun 27, 2022 at 10:37 AM

Reading time: 3 minutes

Back in December 2021, COVID-19 deaths in South Korea hit a record high, and then-Prime Minister Kim Boo-kyum admitted that the country could be forced to take “extraordinary measures” to tackle the surge. The plans included using AI and facial recognition, leveraging thousands of closed-circuit video cameras to track citizens infected with the virus.

At the time, the public raised several concerns about the technology’s attack on privacy and consent. Is the exchange of personal data for convenience, order and safety a fair trade-off for citizens? Or are governments using the pandemic as an excuse to normalise surveillance?

Now, reports are surfacing that the police in China are buying technology that harnesses vast surveillance data to predict crimes and protests before they happen. What’s worse is that the systems in question target people flagged as potential troublemakers by an algorithm and the Chinese authorities—including not only citizens with a criminal past but also vulnerable groups like ethnic minorities, migrant workers, people with a history of mental illness and those diagnosed with HIV.

According to a New York Times (NYT) report, more than 1.4 billion people living in China are being recorded by police cameras that are installed everywhere from street corners and subway ceilings to hotel lobbies and apartment buildings. Heck, even their phones are being tracked, their purchases monitored and their online chats censored. “Now, even their future is under surveillance,” the publication noted.

The latest generation of the technology is capable of warning the police if a drug user makes too many calls to the same number or a fraud victim travels to Beijing to petition the government for payment. “They can signal officers each time a person with a history of mental illness gets near a school,” NYT added.

Procurement details and other documents reviewed by the publication also highlighted how the technology extends the boundaries of social and political control and embeds them ever deeper into people’s lives. “At their most basic, they justify suffocating surveillance and violate privacy, while in the extreme they risk automating systemic discrimination and political repression,” the report mentioned.

In 2020, authorities in southern China allegedly denied a woman’s request to move to Hong Kong to be with her husband after software warned them that the marriage was suspicious. An investigation later revealed that the two were “not often in the same place at the same time and had not spent the Spring Festival holiday together.” The police then concluded that the marriage had been faked to obtain a migration permit.

So, given that Chinese authorities don’t require warrants to collect personal information, how can we know the future has been accurately predicted if the police intervene before it even happens? According to experts, even if the software fails to deduce human behaviour, it can be considered ‘successful’ since the surveillance itself helps curb unrest and crime to a certain extent.

“This is an invisible cage of technology imposed on society,” said Maya Wang, a senior China researcher with Human Rights Watch, “the disproportionate brunt of it being felt by groups of people that are already severely discriminated against in Chinese society.”

In 2017, entrepreneur Yin Qi, who founded the artificial intelligence start-up Megvii, first introduced a computer system capable of predicting crimes. At the time, he told Chinese state media that if cameras detected a person spending hours at a stretch at a train station, the system could flag a possible pickpocket.

Fast forward to 2022, and the police in Tianjin have reportedly bought software made by Hikvision, a Megvii competitor, that aims to predict protests. At its core, the system collects data on Chinese petitioners—a general term for people who try to file complaints about local officials with higher authorities in the country. The model then analyses each citizen’s likelihood of petitioning based on their social and family relationships, past trips and personal situation, helping authorities create individual profiles—with fields for officers to describe the temperament of the protester, including “paranoid,” “meticulous” and “short tempered.”

“It would be scary if there were actually people watching behind the camera, but behind it is a system,” Qi told state media back in 2017. “It’s like the search engine we use every day to surf the internet—it’s very neutral. It’s supposed to be a benevolent thing.” He also went on to add that with such surveillance, “the bad guys have nowhere to hide.”
