‘Unlawful and unethical’: UK police urged to ban facial recognition in all public places

By Malavika Pradeep

Published Nov 2, 2022 at 09:00 AM

Reading time: 4 minutes

In 2021, several reports outlined the Metropolitan (Met) Police’s plans to supercharge surveillance following the purchase of a new facial recognition system. Dubbed Retrospective Facial Recognition (RFR), the technology essentially allowed police to process historic images from CCTV feeds, social media, and other sources in a bid to track down suspects.

“Those deploying it can, in effect, turn back the clock to see who you are, where you’ve been, what you have done and with whom, over many months or even years,” Ella Jakubowska, policy advisor at European Digital Rights, told WIRED at the time—adding that the tech can “suppress people’s free expression, assembly, and ability to live without fear.”

In March 2021, a report found that RFR was being used by six police forces in England and Wales. Despite this, the technology has largely evaded legal scrutiny, as has the deployment of Live Facial Recognition (LFR) systems in public spaces. LFR scans people’s faces as they walk down the street and compares them to a watchlist of suspects in real time, true Minority Report style.

The UK’s current policy on facial recognition

Over the years, several critics have argued that the use of both RFR and LFR encroaches on individual privacy and undermines social justice. Given the technology’s controversial history of misidentifying people of colour, leading to wrongful arrests, and even putting LGBTQ+ lives at risk, experts have warned against these invasive surveillance tools capable of tracking the public on a massive scale.

“In the US, we have seen people being wrongly jailed thanks to RFR,” Silkie Carlo, director of civil liberties group Big Brother Watch, told WIRED. “A wider public conversation and strict safeguards are vital before even contemplating an extreme technology like this, but the Mayor of London has continued to support expensive, pointless, and rights-abusive police technologies.”

In July 2019, the House of Commons Science and Technology Committee recommended restrictions on the use of LFR until concerns regarding the technology’s bias and efficacy were resolved. Then, in August 2020, the UK’s Court of Appeal found that South Wales Police’s use of LFR was unlawful and, in September 2021, the United Nations High Commissioner for Human Rights called for a moratorium on the use of LFR.

However, the Met has stated that it will continue to deploy both systems as and when it deems appropriate. “Each police force is responsible for their own use of LFR technologies,” a spokesperson said. “The Met is committed to give prior notification for its overt use of LFR to locate people on a watchlist. We will continue to do this where a policing purpose to deploy [it] justifies the use of LFR.” Meanwhile, Police Scotland has stated its intention to introduce the technology by 2026.

A history of unchecked biases and errors

Between 2016 and 2019, the Met reportedly deployed LFR 12 times across London. The first deployment was at the Notting Hill Carnival in 2016, the UK’s biggest African-Caribbean celebration, where one attendee was falsely matched against a real-time database of criminal records. At Notting Hill Carnival in 2017, two people were falsely matched, while another individual was correctly identified but no longer wanted.

“Face recognition software has been proven to misidentify ethnic minorities, young people, and women at higher rates,” Electronic Frontier Foundation (EFF), a leading nonprofit defending civil liberties in the digital world, noted in an open letter in September 2022. “And reports of deployments in spaces like Notting Hill Carnival—where the majority of attendees are black—exacerbate concerns about the inherent bias of face recognition technologies and the ways that government use amplifies police powers and aggravates racial disparities.”

After a temporary suspension during the COVID-19 pandemic, the police force resumed its deployment of LFR across central London. “On 28 January 2022, one day after the UK Government relaxed mask-wearing requirements, the Met deployed LFR with a watchlist of 9,756 people. Four people were arrested, including one who was misidentified and another who was flagged on outdated information,” EFF highlighted, adding that the tech was once again deployed outside Oxford Circus tube station, where it reportedly scanned the data of around 15,600 people and resulted in four “true alerts” and three arrests.

“The Met has previously admitted to deploying LFR in busy areas to scan as many people as possible, despite face recognition data being prone to error. This can implicate people for crimes they haven’t committed.”

According to a 2020 report, London was the third most-surveilled city in the world, with over 620,000 cameras. As of 2022, it ranks eighth on the list, with 127,373 cameras in total. Another report claimed that the number of CCTV cameras across the London boroughs more than doubled between 2011 and 2022.

UK police and the ‘unlawful and unethical’ use of LFR

After analysing LFR use by the Met and South Wales police, a new study by the University of Cambridge has now concluded that the controversial technology should be banned in “all public spaces.”

The report, authored by the university’s Minderoo Centre for Technology and Democracy, examined three deployments of LFR, one by the Met Police and two by South Wales Police, using an audit tool based on current legal guidelines. Based on the findings, the experts joined a growing list of calls to ban the use of facial recognition in streets, airports, and any public spaces—the very areas where police believe it would be most valuable.

In all three cases, the team noted that important information about police use of the technology is “kept from view,” with scant demographic data published on arrests or other outcomes—in turn making it difficult to evaluate whether the tools “perpetuate racial profiling.”

Beyond this lack of transparency, the researchers also found that police forces take little accountability for how the technology is used, with no clear recourse for people or communities negatively affected by police use, or rather misuse, of the tech. “Police forces are not necessarily answerable or held responsible for harms caused by facial recognition technology,” stated the report’s lead author, Evani Radiya-Dixit. “We find that all three of these deployments fail to meet the minimum ethical and legal standards based on our research on police use of facial recognition. To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology.”

According to the report, at least ten police forces in England and Wales have trialled facial recognition for operational policing purposes to date. While facial recognition is seen as a fast, efficient, and cheap way to track down persons of interest, the report also noted that officers are increasingly under-resourced and overburdened, leading to a plethora of other policing concerns.
