In a recent interview with The New York Times, Wednesday star Jenna Ortega revealed that she deleted her X (formerly Twitter) account after encountering disturbing deepfake pornography of herself, created using pictures taken when she was still underage.
The confession comes on the heels of an NBC report released in March this year, which revealed that Instagram and Facebook had distributed ads featuring AI-generated nude images of Ortega. The explicit pictures used in the ads were allegedly taken when she was 16 years old.
The app behind this deeply problematic advertising campaign is Perky AI. Available to download for $7.99, it markets itself as being able to create sexually explicit images of anyone using artificial intelligence. Despite the outrage the NBC report generated, Perky AI still offers a feature that allows users to fully undress women.
So, when New York Times interviewer and reporter Lulu Garcia-Navarro asked the young actor about her opinion on artificial intelligence, Ortega replied: “I hate AI. I mean, here’s the thing: AI could be used for incredible things. I think I saw something the other day where they were saying that artificial intelligence was able to detect breast cancer four years before it progressed. That’s beautiful.”
Yet she also recounted multiple disturbing encounters she has had with AI technology: “Did I like being 14 and making a Twitter account because I was supposed to and seeing dirty edited content of me as a child? No. It’s terrifying. It’s corrupt. It’s wrong.”
“You saw AI-generated images of you as a child? Like pornographic ones?” the reporter sought to clarify.
“Yes, of course,” Ortega replied, seemingly without hesitation. “One of the first — actually the first DM that I ever opened myself when I was 12 was an unsolicited photo of a man’s genitals, and that was just the beginning of what was to come,” the Beetlejuice Beetlejuice actor reflected.
“I used to have that Twitter account and I was told that, ‘Oh, you got to do it, you got to build your image’. I ended up deleting it about two, three years ago because the influx after the show had come out — these absurd images and photos, and I already was in a confused state that I just deleted it,” Ortega confessed.
She continued: “It was disgusting, and it made me feel bad. It made me feel uncomfortable. Anyway, that’s why I deleted it, because I couldn’t say anything without seeing something like that. So one day I just woke up, and I thought, Oh, I don’t need this anymore. So I dropped it.”
Ortega’s experience with deepfake pornography on X is certainly not unique. Adult material has always been allowed on Twitter, even in its pre-Elon Musk era. However, in June 2024, the tech billionaire made things official by enshrining the right to share “consensually produced and distributed adult nudity or sexual behaviour” in X’s content policy.
Reaffirming this already questionable policy without first addressing the platform’s mounting content moderation problems and the rising threat of deepfake pornography has resulted in extensive backlash. SCREENSHOT recently found that the amount of sexually explicit content featuring female celebrities such as Sydney Sweeney, Taylor Swift and Margot Robbie has increased significantly since Musk introduced these changes.
In the UK, the Ministry of Justice (MoJ) announced plans to criminalise the creation of sexually explicit deepfake images through upcoming legislation. In April 2024, it announced amendments to the Criminal Justice Bill, including new offences for creating this kind of content. However, the amendment was criticised for hinging on an intent to cause harm rather than on the absence of consent.
When the general election was called, however, the Tory government’s plan to completely outlaw the creation of deepfake porn was dropped.
The current Labour government has yet to announce plans to completely outlaw the creation and distribution of nonconsensual AI pornography.