‘Unlawful and unethical’: UK police urged to ban facial recognition in all public places

By Malavika Pradeep

Published Nov 2, 2022 at 09:00 AM

Reading time: 4 minutes


In 2021, several reports outlined the Metropolitan (Met) Police’s plans to supercharge surveillance following the purchase of a new facial recognition system. Dubbed Retrospective Facial Recognition (RFR), the technology essentially allowed police to process historic images from CCTV feeds, social media, and other sources in a bid to track down suspects.

“Those deploying it can, in effect, turn back the clock to see who you are, where you’ve been, what you have done and with whom, over many months or even years,” Ella Jakubowska, policy advisor at European Digital Rights, told WIRED at the time—adding that the tech can “suppress people’s free expression, assembly, and ability to live without fear.”

In March 2021, a report found that RFR was being used by six police forces in England and Wales. Despite this, the technology has largely evaded legal scrutiny, as has the parallel deployment of Live Facial Recognition (LFR) systems in public spaces. LFR scans people’s faces as they walk down the street and compares them against a suspect watchlist in real time, true Minority Report style.
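For readers curious about the mechanics, here is a minimal, hypothetical sketch of the matching step an LFR pipeline performs: a numeric “embedding” extracted from a camera frame is compared against a watchlist of reference embeddings and flagged when the similarity crosses a threshold. Everything below (the WATCHLIST entries, the match_face function, the 0.8 threshold) is an illustrative assumption, not a detail of the Met’s or any real system, which rely on dedicated face-encoder models and far larger watchlists.

```python
import numpy as np

# Hypothetical watchlist: suspect IDs mapped to face embeddings
# (vectors produced by some face-encoder model; the values here are fake).
WATCHLIST = {
    "suspect_001": np.array([0.12, 0.85, -0.33, 0.47]),
    "suspect_002": np.array([-0.61, 0.22, 0.74, -0.10]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe: np.ndarray, threshold: float = 0.8):
    """Compare an embedding captured from a live camera feed against every
    watchlist entry and return the best match above the threshold, or None.
    A threshold set too permissively produces false positives, the kind of
    misidentification described in this article."""
    best_id, best_score = None, threshold
    for suspect_id, reference in WATCHLIST.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_id, best_score = suspect_id, score
    if best_id is None:
        return None
    return best_id, best_score

# Example: a passer-by whose embedding happens to sit close to suspect_001
# is flagged as a match, even though the resemblance may be coincidental.
passerby = np.array([0.10, 0.80, -0.30, 0.50])
print(match_face(passerby))
```

The threshold is where the error concerns discussed below enter: set it too loosely and innocent passers-by are flagged; set it too strictly and the system flags almost no one.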

The UK’s current policy on facial recognition

Over the years, critics have argued that the use of both RFR and LFR encroaches on individual privacy and social justice. Given the technology’s record of misidentifying people of colour, leading to wrongful arrests and even putting LGBTQ+ lives at risk, experts have warned against invasive surveillance tools capable of tracking the public on a massive scale.

“In the US, we have seen people being wrongly jailed thanks to RFR,” Silkie Carlo, director of civil liberties group Big Brother Watch, told WIRED. “A wider public conversation and strict safeguards are vital before even contemplating an extreme technology like this, but the Mayor of London has continued to support expensive, pointless, and rights-abusive police technologies.”

In July 2019, the House of Commons Science and Technology Committee recommended restrictions on the use of LFR until concerns regarding the technology’s bias and efficacy were resolved. Then, in August 2020, the UK’s Court of Appeal found that South Wales Police’s use of LFR was unlawful and, in September 2021, the United Nations High Commissioner for Human Rights called for a moratorium on the use of LFR.

However, the Met has stated that it will continue to deploy both systems as and when it deems appropriate. “Each police force is responsible for their own use of LFR technologies,” a spokesperson said. “The Met is committed to give prior notification for its overt use of LFR to locate people on a watchlist. We will continue to do this where a policing purpose to deploy [it] justifies the use of LFR.” Meanwhile, Scotland reportedly intends to introduce the technology by 2026.

A history of unchecked biases and errors

Between 2016 and 2019, the Met reportedly deployed LFR 12 times across London. The first deployment was at the Notting Hill Carnival in 2016, the UK’s biggest African-Caribbean celebration, where one attendee was falsely matched against a real-time database of criminal records. At the 2017 Notting Hill Carnival, the system produced two false positives, while another individual was correctly matched but was no longer wanted by police.

“Face recognition software has been proven to misidentify ethnic minorities, young people, and women at higher rates,” Electronic Frontier Foundation (EFF), a leading nonprofit defending civil liberties in the digital world, noted in an open letter in September 2022. “And reports of deployments in spaces like Notting Hill Carnival—where the majority of attendees are black—exacerbate concerns about the inherent bias of face recognition technologies and the ways that government use amplifies police powers and aggravates racial disparities.”

After a temporary suspension during the COVID-19 pandemic, the police force resumed its deployment of LFR across central London. “On 28 January 2022, one day after the UK Government relaxed mask-wearing requirements, the Met deployed LFR with a watchlist of 9,756 people. Four people were arrested, including one who was misidentified and another who was flagged on outdated information,” EFF highlighted, adding that the tech was later deployed again outside Oxford Street tube station, where it reportedly scanned around 15,600 people’s data and resulted in four “true alerts” and three arrests.

“The Met has previously admitted to deploying LFR in busy areas to scan as many people as possible, despite face recognition data being prone to error. This can implicate people for crimes they haven’t committed.”

According to a 2020 report, London was the third most-surveilled city in the world, with over 620,000 cameras. As of 2022, it ranks eighth on the list, with 127,373 cameras in total. Another report claimed that, between 2011 and 2022, the number of CCTV cameras more than doubled across London’s boroughs.

UK police and the ‘unlawful and unethical’ use of LFR

After analysing LFR use by the Met and South Wales police, a new study by the University of Cambridge has now concluded that the controversial technology should be banned in “all public spaces.”

The report, authored by the University’s Minderoo Centre for Technology and Democracy, examined three deployments of LFR, one by the Met police and two by South Wales Police, using an audit tool based on current legal guidelines. Based on the findings, the experts joined a growing number of calls to ban the use of facial recognition in streets, airports, and any public spaces, the very areas where police believe it would be most valuable.

In all three cases, the team noted that important information about police use of the technology is “kept from view,” with only scant demographic data published on arrests or other outcomes, making it difficult to evaluate whether the tools “perpetuate racial profiling.”

Beyond this lack of transparency, the researchers found that little accountability is taken for the technology’s harms, with no clear recourse for people or communities negatively affected by police use, or misuse, of the tech. “Police forces are not necessarily answerable or held responsible for harms caused by facial recognition technology,” stated the report’s lead author, Evani Radiya-Dixit. “We find that all three of these deployments fail to meet the minimum ethical and legal standards based on our research on police use of facial recognition. To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology.”

According to the report, at least ten police forces in England and Wales have trialled facial recognition for operational policing purposes to date. While the technology is seen as a fast, efficient, and cheap way to track down persons of interest, the report also noted that officers are increasingly under-resourced and overburdened, which raises a host of further policing concerns.
