On Sunday 13 March, Clearview AI’s Chief Executive Officer Hoan Ton-That told Reuters that Ukraine’s defence ministry began using his company’s facial recognition technology on Saturday 12 March, after the US startup offered to uncover Russian assailants, combat misinformation and identify dead soldiers and civilians.
Ukraine is receiving free access to Clearview AI’s powerful search engine “for faces, letting authorities potentially vet people of interest at checkpoints, among other uses,” Lee Wolosky, an adviser to the company and former diplomat under US presidents Barack Obama and Joe Biden, told the publication.
The move came swiftly after Russia’s invasion of Ukraine began: according to a copy of a letter obtained by Reuters, Ton-That wrote to Kyiv offering the company’s assistance. Clearview AI further revealed that it has not made the same offer to Russia.
Ton-That said that his startup had more than 2 billion images from the Russian social media service VKontakte (VK) at its disposal, out of a database of over 10 billion photos in total. In the right hands, such a database could help Ukraine identify the deceased more easily than trying to match fingerprints and works “even if there is facial damage,” the CEO wrote. Although research for the US Department of Energy found decomposition reduced the technology’s effectiveness, a paper from a 2021 conference showed promising results.
Ton-That’s letter also explained that Clearview AI’s technology could be used to reunite refugees separated from their families, identify Russian operatives and help the government debunk false social media posts related to the war.
It is not yet clear exactly what Ukraine’s defence ministry is using the technology for. Other parts of Ukraine’s government are expected to deploy Clearview AI in the coming days, Ton-That and Wolosky added.
But the technology also poses risks. First, facial recognition is not always accurate: critics say it could misidentify people at checkpoints as well as in battle. Such a mismatch could have catastrophic consequences, including civilian deaths, much like the unjust arrests that have previously arisen from police use of the technology.
“We’re going to see well-intentioned technology backfiring and harming the very people it’s supposed to help,” Albert Fox Cahn, Executive Director of the Surveillance Technology Oversight Project (S.T.O.P.) in New York, told Reuters.
Following that same line of thought, Ton-That said that Clearview AI should never be wielded as the sole source of identification and that he would not want the technology to be used in violation of the Geneva Conventions, which created legal standards for humanitarian treatment during war.
This approach to using the technology is clear in how Ukrainians have been introduced to it so far—like other users, those in Ukraine are receiving training and have to input a case number and reason for a search before queries, the CEO told Reuters.
It’s important to keep in mind that Clearview AI, which primarily supplies US law enforcement, is currently facing a number of lawsuits in the country for violating privacy rights by taking images from the internet. But the company asserts that its data-gathering process is similar to how Google search works. Still, several countries including the UK and Australia have deemed its practices illegal.
Cahn described identifying the deceased as probably the least dangerous way to deploy the technology in war, but he said that “once you introduce these systems and the associated databases to a war zone, you have no control over how it will be used and misused.”
Numerous studies over the past half-decade have consistently documented the relative inaccuracy of facial recognition systems in identifying Black, Indigenous, and people of colour (BIPOC), women and children. In the US, faulty facial recognition technology has already resulted in cases of misidentification. In January 2020, Detroit police arrested Robert Williams after facial recognition technology falsely matched his driver’s licence photo with surveillance footage from a local robbery.
Williams was detained for 30 hours and then released on bail until a court hearing on the case. At the hearing, a Wayne County prosecutor announced that the charges against Williams would be dropped due to insufficient evidence. At the time, civil rights experts said Williams was the first documented example in the US of someone being wrongfully arrested specifically based on a false hit produced by facial recognition technology.
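The misidentification risk described above ultimately stems from how such systems decide what counts as a “match”: each face is reduced to a numeric embedding and compared against others via a similarity score and a cut-off threshold. Clearview AI’s internals are not public, so the following Python sketch is purely illustrative, with random vectors standing in for learned face embeddings:

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def is_match(probe: np.ndarray, gallery: np.ndarray, threshold: float = 0.6) -> bool:
    """Declare a match only if the similarity score clears the threshold.

    The threshold is a trade-off: lowering it catches more true matches
    but also flags more strangers as false matches, and vice versa.
    """
    return cosine_similarity(probe, gallery) >= threshold


rng = np.random.default_rng(0)
person_a = rng.normal(size=128)                             # enrolled face embedding
person_a_new = person_a + rng.normal(scale=0.4, size=128)   # same person, noisier photo
stranger = rng.normal(size=128)                             # unrelated face

print(is_match(person_a, person_a_new))  # same person, high similarity
print(is_match(person_a, stranger))      # different person, low similarity
```

Wherever the threshold is set, some genuine matches fall below it and some look-alikes score above it, which is why even Ton-That says the tool should never be the sole source of identification.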
Commonly known as Wali (not his real name, which is withheld over concerns for his security), the former sniper from the famed Royal 22nd Regiment of the Canadian Armed Forces has joined the ranks of a foreign legion in Ukraine, responding to President Volodymyr Zelensky’s call for foreign fighters to help combat the Russian invasion.
The 40-year-old soldier, who now works as a computer programmer, previously served in several wars, including Afghanistan from 2009 to 2011, and Iraq in 2015—where he travelled as a volunteer foreign fighter embedded with Kurdish forces to fight the Islamic State of Iraq and Syria (ISIS).
In June 2017, an unidentified Canadian special forces sniper fired a McMillan Tac-50 rifle to fatally shoot an ISIS militant in Mosul from more than two miles away, one of the longest recorded kills in history. To put it in perspective, San Francisco’s iconic Golden Gate Bridge is 1.7 miles long. The shot, which took about ten seconds to reach its target, was independently verified by video camera footage and other data, USA Today reported at the time.
Although it has never been confirmed exactly who fired the shot, Wali was part of the anonymous sniper’s unit. Canada is also renowned for its world-class sniper training, with soldiers working in pairs to account for wind speed as well as the bullet’s increasing drop as it loses velocity over such long distances.
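Back-of-envelope arithmetic shows the scale of compensation such a shot demands. Using the widely reported 3,540-metre range for the 2017 Mosul shot and the roughly ten-second flight time, a crude model that ignores air resistance entirely looks like this:

```python
G = 9.81            # gravitational acceleration, m/s^2
distance_m = 3_540  # widely reported range of the 2017 Mosul shot, metres
time_s = 10.0       # approximate flight time reported at the time, seconds

avg_speed = distance_m / time_s          # mean speed over the whole flight
drop_no_drag = 0.5 * G * time_s ** 2     # vertical drop, ignoring air resistance

print(f"average speed: {avg_speed} m/s")        # about 354 m/s
print(f"vacuum-model drop: {drop_no_drag} m")   # about 490 m
```

Even this simplified model implies aiming hundreds of metres above the target; real ballistic calculations must also account for drag and wind, which is part of why snipers work in pairs with a spotter.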
According to the CBC, Wali entered Ukraine from Poland with a group of British and Canadian veterans during the first week of March, sheltering in a renovated home before joining Ukrainian forces and a growing battalion of volunteer citizen soldiers.
Speaking to the publication, the man said he missed his son’s first birthday, the “hardest part” of his decision to travel to Ukraine. “A week ago I was still programming stuff,” Wali said. “Now I’m grabbing anti-tank missiles in a warehouse to kill real people… That’s my reality right now.”
Speaking to Canada’s French-language La Presse newspaper, Wali shared that he refuses to watch an “all-out invasion” before his eyes. “What I’m doing is short-circuiting Canadian politics,” he told the newspaper in French. “Yes, of course governments don’t like it, but [in Ukraine], I really feel that there is strong support, and not just moral support.”
Although Canada recommends that citizens avoid all travel to Ukraine, the government will not oppose Canadian nationals joining the ranks of the country’s International Legion of Territorial Defense. Canadian Foreign Minister Mélanie Joly stated that it was an “individual choice” and that Canada “supports any form of aid to Ukraine at this time.”
According to Ukrainian Foreign Minister Dmytro Kuleba, more than 20,000 people from 52 countries have volunteered to assist Ukraine against Russian forces following President Zelensky’s call for “anyone who wants to join the defence of Ukraine, Europe and the world” to “fight side by side with the Ukrainians against the Russian war criminals.”
Kuleba announced on Twitter that anyone interested in joining the ranks should contact Ukraine’s diplomatic missions in their respective countries. “Together we defeated Hitler, and we will defeat Putin, too,” he said.
“I want to help them because they want to be free, basically. It’s as simple as that,” Wali told the CBC. “I have to help because there are people here being bombarded just because they want to be European and not Russian.”