Hardly a week goes by now without a new high-profile person being targeted by deepfake pornography. Whether it’s actors, musicians, influencers, or politicians, no one seems able to escape this rapidly growing trend of synthetic pornography created without the consent of the subject. So it came as no surprise when it was revealed that some of the UK’s top female politicians had fallen victim to deepfake pornography in the lead-up to the general election.
An investigation by the British television channel Channel 4 found 400 digitally altered pictures of more than 30 high-profile UK politicians on an unnamed, sexually explicit website “dedicated to the abuse and degradation of women.”
The women were either “nudified,” meaning that users applied tools to make the subjects appear nude or semi-nude, or Photoshopped, with their heads superimposed onto another person’s naked body.
Channel 4 decided not to name the website in question, given that it encourages anonymous users to share thousands of photos and post pictures of men masturbating. Nevertheless, its investigation found that the site had received more than 12.6 million visitors in the previous three months, 9 per cent of them from the UK.
Disturbingly, many of the images have been online for several years.
Prominent politicians targeted included Labour deputy leader Angela Rayner, education secretary Gillian Keegan, Commons leader Penny Mordaunt, the former Home Secretary Priti Patel, and Labour backbencher Stella Creasy, according to Channel 4 News.
While many of the politicians involved chose not to comment for fear of exacerbating the abuse, Conservative MP Dehenna Davison agreed to speak to Channel 4.
She commented that she found it “really strange” that people would bother to take the time to target women like her, but added she found it “quite violating.” She warned that unless governments around the world put in place a proper AI regulatory framework, “major problems” loomed.
The Ministry of Justice (MoJ) has announced plans to criminalise the creation of sexually explicit deepfake images through upcoming legislation. In April 2024, it announced amendments to the Criminal Justice Bill, including new offences for creating this kind of content. However, the amendment was criticised for hinging on an intent to cause harm rather than on the absence of consent.
When the general election was called, the government’s plan to outlaw the creation of deepfake porn entirely was dropped as a result. The Conservatives, Labour, the Liberal Democrats, and Plaid Cymru all told Channel 4 News they would ban the creation of deepfake porn.
Several of the victims are now planning to pursue legal action by reporting the abuse to the police. The site in question claims to allow only lawful content for people aged over 18. However, illegal activity on it has led to criminal convictions in the UK and elsewhere.