‘Unlawful and unethical’: UK police urged to ban facial recognition in all public places

By Malavika Pradeep

Published Nov 2, 2022 at 09:00 AM

Reading time: 4 minutes

In 2021, several reports outlined the Metropolitan (Met) Police’s plans to supercharge surveillance following the purchase of a new facial recognition system. Dubbed Retrospective Facial Recognition (RFR), the technology essentially allowed police to process historic images from CCTV feeds, social media, and other sources in a bid to track down suspects.

“Those deploying it can, in effect, turn back the clock to see who you are, where you’ve been, what you have done and with whom, over many months or even years,” Ella Jakubowska, policy advisor at European Digital Rights, told WIRED at the time—adding that the tech can “suppress people’s free expression, assembly, and ability to live without fear.”

In March 2021, a report found that RFR was being used by six police forces in England and Wales. Despite this, the technology has largely evaded legal scrutiny, as has the deployment of Live Facial Recognition (LFR) systems in public spaces. LFR scans people’s faces as they walk down streets and compares them to a suspect watchlist in real time, true Minority Report style.
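For readers curious about the mechanics, the sketch below shows, in broad strokes, how a real-time watchlist check of this kind can work: each scanned face is reduced to a numerical embedding and compared against the embeddings of people on a watchlist, with anything above a similarity threshold raising an alert. It is a minimal, hypothetical illustration only; the embedding size, threshold, and names are placeholders, not details of the Met’s or any other force’s system.

```python
# Illustrative sketch only: a generic watchlist-matching loop, NOT any police
# force's actual system. Real deployments use trained face-embedding models;
# here the embeddings are random placeholders so the script runs standalone.
import numpy as np

EMBEDDING_DIM = 128      # hypothetical embedding size
MATCH_THRESHOLD = 0.80   # hypothetical cut-off; a lower value means more (false) alerts

rng = np.random.default_rng(0)
# Pretend watchlist: one reference embedding per person of interest.
watchlist = {f"suspect_{i}": rng.normal(size=EMBEDDING_DIM) for i in range(5)}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embeddings, in [-1, 1]."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face_embedding: np.ndarray):
    """Return the best watchlist match above the threshold, else None."""
    best_name, best_score = None, -1.0
    for name, reference in watchlist.items():
        score = cosine_similarity(face_embedding, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= MATCH_THRESHOLD else None

# Simulate one passer-by scanned from a camera feed.
passerby = rng.normal(size=EMBEDDING_DIM)
# Prints None here (the placeholder embeddings are random); in a real system
# the threshold choice is exactly where false positives creep in.
print(check_against_watchlist(passerby))
```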

The UK’s current policy on facial recognition

Over the years, several critics have argued that the use of both RFR and LFR encroaches on individual privacy and social justice. Given the technology’s controversial history of misidentifying people of colour, leading to wrongful arrests and even putting LGBTQ+ lives at risk, experts have warned against invasive surveillance tools capable of tracking the public on a massive scale.

“In the US, we have seen people being wrongly jailed thanks to RFR,” Silkie Carlo, director of civil liberties group Big Brother Watch, told WIRED. “A wider public conversation and strict safeguards are vital before even contemplating an extreme technology like this, but the Mayor of London has continued to support expensive, pointless, and rights-abusive police technologies.”

In July 2019, the House of Commons Science and Technology Committee recommended restrictions on the use of LFR until concerns regarding the technology’s bias and efficacy were resolved. Then, in August 2020, the UK’s Court of Appeal found that South Wales Police’s use of LFR was unlawful and, in September 2021, the United Nations High Commissioner for Human Rights called for a moratorium on the use of LFR.

However, the Met has stated that it will continue to deploy both systems as and when it deems appropriate. “Each police force is responsible for their own use of LFR technologies,” a spokesperson said. “The Met is committed to give prior notification for its overt use of LFR to locate people on a watchlist. We will continue to do this where a policing purpose to deploy [it] justifies the use of LFR.” Meanwhile, Scotland has reported its intention to introduce the technology by 2026.

A history of unchecked biases and errors

Between 2016 and 2019, the Met reportedly deployed LFR 12 times across London. The first deployment was at the Notting Hill Carnival in 2016, the UK’s biggest African-Caribbean celebration, where one attendee was falsely matched against a real-time database of criminal records. At the same carnival in 2017, two people were falsely flagged as matches, while another individual was correctly matched but was no longer wanted.

“Face recognition software has been proven to misidentify ethnic minorities, young people, and women at higher rates,” Electronic Frontier Foundation (EFF), a leading nonprofit defending civil liberties in the digital world, noted in an open letter in September 2022. “And reports of deployments in spaces like Notting Hill Carnival—where the majority of attendees are black—exacerbate concerns about the inherent bias of face recognition technologies and the ways that government use amplifies police powers and aggravates racial disparities.”

After a temporary suspension during the COVID-19 pandemic, the police force resumed its deployment of LFR across central London. “On 28 January 2022, one day after the UK Government relaxed mask-wearing requirements, the Met deployed LFR with a watchlist of 9,756 people. Four people were arrested, including one who was misidentified and another who was flagged on outdated information,” EFF highlighted, adding that the tech was once again deployed outside Oxford Street tube station, where it reportedly scanned around 15,600 people’s data and resulted in four “true alerts” and three arrests.

“The Met has previously admitted to deploying LFR in busy areas to scan as many people as possible, despite face recognition data being prone to error. This can implicate people for crimes they haven’t committed.”

According to a 2020 report, London was the third most-surveilled city in the world, with over 620,000 cameras. As of 2022, however, it holds eighth position on the list, with 127,373 cameras in total. Another report claimed that, between 2011 and 2022, the number of CCTV cameras more than doubled across the London boroughs.

UK police and the ‘unlawful and unethical’ use of LFR

After analysing LFR use by the Met and South Wales police, a new study by the University of Cambridge has now concluded that the controversial technology should be banned in “all public spaces.”

The report, authored by the University’s Minderoo Centre for Technology and Democracy, examined three deployments of LFR, one by the Met police and two by South Wales police, with an audit tool based on current legal guidelines. Based on the findings, the experts joined an increasing list of calls to ban the use of facial recognition in streets, airports, and any public spaces—the very areas where police believe it would be most valuable.

In all three cases, the team noted that important information about police use of the technology is “kept from view,” with scant demographic data published on arrests or other outcomes, in turn making it difficult to evaluate whether the tools “perpetuate racial profiling.”

Beyond this lack of transparency, the researchers also found that police forces take little accountability for the technology’s harms, with no clear recourse for people or communities negatively affected by police use, or rather misuse, of the tech. “Police forces are not necessarily answerable or held responsible for harms caused by facial recognition technology,” stated the report’s lead author, Evani Radiya-Dixit. “We find that all three of these deployments fail to meet the minimum ethical and legal standards based on our research on police use of facial recognition. To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology.”

According to the report, at least ten police forces in England and Wales have trialled facial recognition for operational policing purposes to date. While facial recognition is seen as a fast, efficient, and cheap way to track down persons of interest, the report also noted that officers are increasingly under-resourced and overburdened, leading to a host of other policing concerns.
