It’s no secret that China has a surveillance problem. I get spooked living in London (the 9th most surveilled city in the world, according to online rankings), so I’m not sure I’d be able to handle one of China’s mega metropolises. A beautiful country with a rich history, and a relentless hardline authoritarian government. From strict zero-COVID policies to forcing its bus drivers to wear emotion-tracking wristbands, China keeps a tight grip on its citizens. Perhaps this is what encouraged a team of promising students to invent an ‘invisibility’ coat.
The aptly named InvisDefense coat is the brainchild of four Wuhan University graduates, who recently took home first prize at the China Postgraduate Innovation and Practice Competitions, partly sponsored by Huawei.
Wang Zheng, the professor at Wuhan University’s school of computer science who oversaw the mysterious cloak project, told the South China Morning Post that the technology “allows the camera to capture you, but it cannot tell if you are human.” China’s surveillance systems are incredibly thorough when it comes to distinguishing people from inanimate objects, often with a scary level of accuracy.
That’s where the ‘invisibility’ coat comes in: its primary function is to inhibit the cameras’ ability to accurately detect a human through motion and contour recognition. The garment’s surface carries a specially designed camouflage pattern that interferes with the camera’s AI. When put to the test, the pattern reduced the accuracy of pedestrian detection by 57 per cent.
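To make that “57 per cent reduction” figure concrete, here is a minimal sketch, not the team’s actual code, of how such a drop could be measured: run an off-the-shelf pedestrian detector (torchvision’s Faster R-CNN is used purely as an assumed stand-in for whatever the real CCTV systems use) over frames of the same walk filmed with and without the printed pattern, then compare hit rates. The frame lists at the bottom are hypothetical.

```python
# Minimal sketch (not InvisDefense's method): estimate how much a worn pattern
# lowers a pedestrian detector's hit rate, using a generic pretrained detector.
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()
categories = weights.meta["categories"]

def person_detected(image_path: str, threshold: float = 0.5) -> bool:
    """True if the detector finds at least one 'person' above the score threshold."""
    img = read_image(image_path)
    with torch.no_grad():
        pred = model([preprocess(img)])[0]
    return any(
        categories[label] == "person" and score >= threshold
        for label, score in zip(pred["labels"].tolist(), pred["scores"].tolist())
    )

def detection_rate(image_paths: list[str]) -> float:
    """Fraction of frames in which any person was detected at all."""
    return sum(person_detected(p) for p in image_paths) / len(image_paths)

# Hypothetical frame lists: the same walk filmed with and without the coat.
# plain = detection_rate(plain_frames)
# coat = detection_rate(coat_frames)
# print(f"Detection reduced by {1 - coat / plain:.0%}")  # e.g. ~57% in the team's tests
```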
What about at night-time, though? Surveillance cameras in China can also use infrared thermal imaging to detect heat signatures. To counter this sneaky measure, the cloak is fitted with irregularly shaped temperature modules under its high-tech surface, which confuse the infrared camera.
One of the students’ main objectives when constructing the jacket was to keep the wearer inconspicuous to the human eye as well. Wei Hui, who designed the core algorithm, explained that “traditionally, researchers used bright images to interfere with machine vision.” While this approach worked, it often made the wearer stand out to those around them, rendering them invisible to the mechanical eyes of surveillance but not to the organic ones of humans.
This partnership between AI and fashion has been explored before by clothing brand UNLABELED, whose garments do a good job of challenging the machine but a poor one of going unnoticed by intrigued passersby, as its pieces often feature bright and garish colours.
The Chinese academic team essentially tried to create as inconspicuous a design as possible, using algorithms to help keep the jacket wearable on an everyday basis—while still effective at deterring AI surveillance. To achieve this, they carried out hundreds of preliminary tests over a three-month period to formulate the best pattern.
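The article doesn’t say how those hundreds of tests were run, but the balancing act they describe can be sketched as a simple search loop: propose candidate patterns, score each on how well it suppresses a detector and on how visually loud it looks, and keep the best trade-off. Everything below is illustrative only; `detector_confidence` and `visual_saliency` are hypothetical placeholders standing in for a real detector call and a real measure of how eye-catching a pattern is.

```python
# Toy illustration of the trade-off, not the team's algorithm: search over
# candidate patterns for one that fools a detector while staying inconspicuous.
import numpy as np

rng = np.random.default_rng(0)

def detector_confidence(pattern: np.ndarray) -> float:
    """Placeholder for 'how confidently a detector still finds the wearer' (0..1).
    A real version would render the pattern onto a person image and run a detector."""
    return float(np.clip(1.0 - pattern.std() * 2.0, 0.0, 1.0))

def visual_saliency(pattern: np.ndarray) -> float:
    """Placeholder for 'how eye-catching the pattern looks to people' (0..1).
    A real version might measure colour contrast against ordinary clothing."""
    return float(np.clip(np.abs(pattern - 0.5).mean() * 2.0, 0.0, 1.0))

def score(pattern: np.ndarray, saliency_weight: float = 0.5) -> float:
    """Lower is better: being easy to detect or too loud both cost points."""
    return detector_confidence(pattern) + saliency_weight * visual_saliency(pattern)

best_pattern, best_score = None, float("inf")
for _ in range(500):                  # stand-in for the team's 'hundreds of tests'
    candidate = rng.random((32, 32))  # a candidate camouflage texture
    s = score(candidate)
    if s < best_score:
        best_pattern, best_score = candidate, s

print(f"Best combined score found: {best_score:.3f}")
```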
The team have since shared how excited they are about the coat’s future potential, saying it is the first of its kind on the market and that its 57 per cent reduction in detection accuracy could grow exponentially in the near future.
The coat boasts a surprisingly low manufacturing cost, meaning the tech’s retail price will sit at a reasonable 500 yuan (roughly £58). The pattern is cheap to print, and only four heat modules are needed for the camouflage to be effective. But is this a price tag aimed at the everyday consumer hoping to regain a little privacy in their day-to-day life? Unfortunately, that’s an unlikely scenario.
The team, who all reside in China, are aware of the power this tech holds, stating that “privacy is exposed under machine vision.” The ethical quandaries these systems raise aside, the students were quick to clarify their position on their invention, stating that the coat has both military and defence applications which could be used to help strengthen security, not weaken it. The project has primarily served to highlight “loopholes” in the Chinese system, said Wei Hui, while also showing the country, and the world, just how cost-effective a personal AI system can be.
AI and surveillance in general are prevalent across the entirety of China and are used in a variety of ways, from identifying waves of citizens as they commute to their jobs to tracking children who’ve stayed up late playing games. It’s a tool the Chinese government favours heavily. Although the InvisDefense coat has shown promise as a countermeasure to the country’s oppressive controls, those responsible attest that its purpose is purely beneficial to the nation, not a means to stoke the flames of social upheaval.
Back in December 2021, when COVID-19 deaths in South Korea hit a record high, former Prime Minister Kim Boo-kyum admitted that the country could be forced to take “extraordinary measures” to tackle the surge. The plans included the use of AI and facial recognition, leveraging thousands of closed-circuit video cameras to track citizens infected with the virus.
At the time, the public raised several concerns about the technology’s attack on privacy and consent. Is the exchange of personal data for convenience, order and safety a fair trade-off for citizens? Or are governments using the pandemic as an excuse to normalise surveillance?
Now, reports are surfacing that the police in China are buying technology that harnesses vast surveillance data to predict crime and protests before they happen. What’s worse is that the systems in question are targeting potential troublemakers in the eyes of an algorithm and the Chinese authorities—including not only citizens with a criminal past but also vulnerable groups like ethnic minorities, migrant workers, people with a history of mental illness and those diagnosed with HIV.
According to a New York Times (NYT) report, more than 1.4 billion people living in China are being recorded by police cameras that are installed everywhere from street corners and subway ceilings to hotel lobbies and apartment buildings. Heck, even their phones are being tracked, their purchases monitored and their online chats censored. “Now, even their future is under surveillance,” the publication noted.
The latest generation of technology is capable of warning the police if a drug user makes too many calls to the same number or a victim of fraud travels to Beijing to petition the government for payment. “They can signal officers each time a person with a history of mental illness gets near a school,” NYT added.
Procurement details and other documents reviewed by the publication also highlighted how the technology extends the boundaries of social and political controls and incorporates them ever deeper into people’s lives. “At their most basic, they justify suffocating surveillance and violate privacy, while in the extreme they risk automating systemic discrimination and political repression,” the report mentioned.
In 2020, authorities in southern China allegedly denied a woman’s request to move to Hong Kong to be with her husband after software flagged the marriage as suspicious. An investigation later revealed that the two were “not often in the same place at the same time and had not spent the Spring Festival holiday together.” The police then concluded that the marriage had been faked to obtain a migration permit.
So, given that Chinese authorities don’t require warrants to collect personal information, how can we know the future has been accurately predicted if the police intervene before it even happens? According to experts, even if the software fails to predict human behaviour, it can be considered ‘successful’, since the surveillance itself helps curb unrest and crime to a certain extent.
“This is an invisible cage of technology imposed on society,” said Maya Wang, a senior China researcher with Human Rights Watch. “The disproportionate brunt of it being felt by groups of people that are already severely discriminated against in Chinese society.”
In 2017, entrepreneur Yin Qi, who founded an artificial intelligence start-up called Megvii, first introduced a computer system capable of predicting crimes. At the time, he told Chinese state media that if cameras detected a person spending hours at a stretch at a train station, the system could flag a possible pickpocket.
Fast forward to 2022, and the police in Tianjin have reportedly bought software made by Hikvision, a Megvii competitor, that aims to predict protests. At its core, the system collects data on Chinese petitioners, a general term for people who try to file complaints about local officials with higher authorities in the country. The model then analyses each of these citizens’ likelihood of petitioning based on their social and family relationships, past trips and personal situations, helping authorities create individual profiles, with fields for officers to describe the temperament of the protester, including “paranoid,” “meticulous” and “short-tempered.”
“It would be scary if there were actually people watching behind the camera, but behind it is a system,” Qi told state media back in 2017. “It’s like the search engine we use every day to surf the internet—it’s very neutral. It’s supposed to be a benevolent thing.” He also went on to add that with such surveillance, “the bad guys have nowhere to hide.”