With a general election around the corner, women's rights, particularly in the online realm, have taken centre stage for many young voters. And the primary issue on everyone's lips is nudify apps. A recent investigation by 404 Media revealed that Instagram is profiting from advertisements for apps that allow users to create nonconsensual deepfakes, meaning more people than ever are at risk of being targeted by these "undress applications." Yet, comprehensive image-based abuse laws to address this danger seem out of reach at the moment. Here is SCREENSHOT's take on why the UK government needs to zoom in on this issue ASAP.
Nudify apps are applications or features within image-editing apps that allow users to upload photos and apply tools to make the subjects appear nude or semi-nude. This manipulation can involve digitally removing clothing or altering the image to create the illusion of nudity.
The existence of these apps caused substantial uproar in March 2024 when it was discovered that Instagram and Facebook were distributing ads for Perky AI, a nudify application that marketed itself with AI-generated nudes of an underage Jenna Ortega.
"Some of the most harmful applications of AI tools are not hidden on the dark corners of the internet, but are actively promoted to users by social media companies," Emanuel Maiberg concluded in his investigation for 404 Media.
Following his report, Apple and Google pulled multiple nonconsensual AI nude apps from their respective app stores.
Of course, AI-generated nudes aren't a new issue demanding legislative attention. In the past, prominent celebrities like Taylor Swift, Bobbi Althoff, Jacob Elordi and Sydney Sweeney have been targeted by nefarious players online, intent on exploiting their celebrity to distribute pornographic content without their consent.
In the UK, the Ministry of Justice (MoJ) has announced plans to criminalise the creation of sexually explicit deepfake images through upcoming legislation. It was announced in April that there would be amendments to the Criminal Justice Bill, including new offences for creating this kind of content. However, the amendment was criticised for hinging on an intent to cause harm rather than on the simple absence of consent.
When Prime Minister Rishi Sunak dissolved the UK parliament on 24 May 2024, all efforts towards the regulation of sexually explicit synthetic content in the UK came to a halt. It is therefore essential that the next government specifically target image-based abuse, place limitations on the permitted features of AI apps, and regulate AI advertisements on social media.
Research from Home Security Heroes revealed a 550 per cent surge in deepfake videos online in 2023. With increasing accounts of adolescents being subjected to this practice, it is not only a necessity but a duty for the UK government to protect its citizens' safety online.
As the capabilities of AI rapidly accelerate, governments are confronted with a multitude of new dangers that demand more sophisticated regulation. So, as we continue to campaign for women's rights, we must keep an eye on how new advancements in technology are affecting marginalised genders, and how well governments are catching up with these issues.