Back in December 2021, when COVID-19 deaths in South Korea hit a record high, then-Prime Minister Kim Boo-kyum admitted that the country could be forced to take “extraordinary measures” to tackle the surge. The plans included using AI and facial recognition, leveraging thousands of closed-circuit video cameras to track citizens infected with the virus.
At the time, the public raised several concerns about the technology’s threat to privacy and consent. Is the exchange of personal data for convenience, order and safety a fair trade-off for citizens? Or are governments using the pandemic as an excuse to normalise surveillance?
Now, reports are surfacing that police in China are buying technology that harnesses vast surveillance data to predict crime and protests before they happen. What’s worse is that the systems in question target people whom an algorithm and the Chinese authorities deem potential troublemakers: not only citizens with a criminal past but also vulnerable groups like ethnic minorities, migrant workers, people with a history of mental illness and those diagnosed with HIV.
According to a New York Times (NYT) report, more than 1.4 billion people living in China are being recorded by police cameras that are installed everywhere from street corners and subway ceilings to hotel lobbies and apartment buildings. Heck, even their phones are being tracked, their purchases monitored and their online chats censored. “Now, even their future is under surveillance,” the publication noted.
The latest generation of the technology is capable of warning the police if a drug user makes too many calls to the same number or if a victim of fraud travels to Beijing to petition the government for payment. “They can signal officers each time a person with a history of mental illness gets near a school,” NYT added.
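The report doesn’t detail how these alerts are implemented, but the behaviour it describes reads like simple threshold rules applied over aggregated records. Purely as an illustration, a rule of the “too many calls to the same number” kind might look like the Python sketch below; the field names, tags and threshold are all hypothetical, not drawn from any real system.

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical threshold; the real systems' rules are not public.
MAX_CALLS_TO_SAME_NUMBER = 20

@dataclass
class Record:
    person_id: str
    tags: set = field(default_factory=set)       # e.g. {"drug_history"}
    call_log: list = field(default_factory=list) # dialled numbers

def flag_repeated_calls(record: Record) -> bool:
    """Flag a tagged person whose calls to any single number exceed the threshold."""
    if "drug_history" not in record.tags:
        return False
    counts = Counter(record.call_log)
    return any(n > MAX_CALLS_TO_SAME_NUMBER for n in counts.values())

# Example: 25 calls to the same number trips the rule.
suspect = Record("p-001", {"drug_history"}, ["555-0100"] * 25)
print(flag_repeated_calls(suspect))  # True
```

The point of the sketch is only that such “prediction” can be as crude as a counter and a threshold.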
Procurement details and other documents reviewed by the publication also highlighted how the technology extends the boundaries of social and political control and embeds them ever deeper into people’s lives. “At their most basic, they justify suffocating surveillance and violate privacy, while in the extreme they risk automating systemic discrimination and political repression,” the report warned.
In 2020, authorities in southern China allegedly denied a woman’s request to move to Hong Kong to be with her husband after software warned them that the marriage was suspicious. An investigation later revealed that the two were “not often in the same place at the same time and had not spent the Spring Festival holiday together.” The police then concluded that the marriage had been faked to obtain a migration permit.
So, given that Chinese authorities don’t require warrants to collect personal information, how can anyone know whether the future was accurately predicted if the police intervene before it even happens? According to experts, even if the software fails to accurately predict human behaviour, it can be considered ‘successful’ since the surveillance itself helps curb unrest and crime to a certain extent.
“This is an invisible cage of technology imposed on society,” said Maya Wang, a senior China researcher with Human Rights Watch. “The disproportionate brunt of it being felt by groups of people that are already severely discriminated against in Chinese society.”
In 2017, entrepreneur Yin Qi, who founded the artificial intelligence start-up Megvii, first introduced a computer system capable of predicting crimes. At the time, he told Chinese state media that if cameras detected a person spending hours at a stretch at a train station, the system could flag them as a possible pickpocket.
Fast forward to 2022, and the police in Tianjin have reportedly bought protest-prediction software made by Hikvision, a Megvii competitor. At its core, the system collects data on Chinese petitioners, a general term used to describe people who try to file complaints about local officials with higher authorities in the country. The model then analyses each citizen’s likelihood of petitioning based on their social and family relationships, past trips and personal situations, helping authorities create individual profiles with fields for officers to describe the petitioner’s temperament, including “paranoid,” “meticulous” and “short-tempered.”
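Neither the report nor the procurement documents specify how that likelihood is computed, but the description (relationships, trips and personal circumstances feeding a per-person score) suggests something like a weighted feature score. The sketch below is speculative; every feature name and weight is invented for illustration and should not be read as Hikvision’s actual model.

```python
# Speculative illustration of a per-person "likelihood to petition" score.
# Feature names and weights are invented; the actual model is not public.
HYPOTHETICAL_WEIGHTS = {
    "past_petitions": 0.5,            # normalised count of prior petitions
    "recent_trips_to_beijing": 0.3,   # normalised count of recent trips
    "relatives_who_petitioned": 0.2,  # normalised count among family ties
}

def petition_risk_score(features: dict) -> float:
    """Weighted sum of normalised (0-1) features, clamped to [0, 1]."""
    score = sum(w * features.get(name, 0.0)
                for name, w in HYPOTHETICAL_WEIGHTS.items())
    return min(max(score, 0.0), 1.0)

# Example profile: some prior petitions, a recent trip, no petitioning relatives.
print(petition_risk_score({"past_petitions": 0.2,
                           "recent_trips_to_beijing": 1.0}))  # 0.4
```

However it is actually built, a linear score like this is trivially cheap to run at population scale, which is part of what makes the report’s claims plausible.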
“It would be scary if there were actually people watching behind the camera, but behind it is a system,” Qi told state media back in 2017. “It’s like the search engine we use every day to surf the internet—it’s very neutral. It’s supposed to be a benevolent thing.” He went on to add that with such surveillance, “the bad guys have nowhere to hide.”