Over the last couple of months, artificial intelligence (AI) and its seemingly endless possibilities have exploded into public consciousness, with innovations like chatbot ChatGPT and all-in-one image editing tool Lensa dominating the internet and forcing us to rethink the future of technology altogether.
AI, meaning technology that imitates the cognitive capabilities of a human, has been around for years (hey, Siri!), but ever-improving synthetic media tools are now throwing up all manner of ethical issues, not least because their developers often train them on whatever data is freely available to them.
If a stranger asked for a copy of your passport, would you hand it over without asking who they were, and why they needed it? What about your bank details? Or your fingerprint?
Chances are, the answer is no. But when it comes to giving away our personal data online, the boundaries have become blurred. We’ve all clicked ‘accept cookies’ in a fit of impatience (or because it sounded kind of cute, like a snack from grandma).
And then there’s our faces and bodies. Most of us don’t think twice before posting pictures of ourselves online. We do this knowing anyone can access public Instagram and TikTok accounts, and that even private profiles can get hacked. Why? Well, because everyone’s doing it.
Tons of people now make an excellent living out of sharing content. But what if our information is used against our will, or out of context? What if we’ve lost online autonomy over our own bodies?
Twitch stars QTCinderella and Pokimane just found out the answer to this frightening question. The two famous streamers make money through video content in which they’re often seen decorating cakes or playing games. What they don’t do is make sex tapes. But, thanks to sophisticated AI technology, there are dark corners of the internet where it now looks like they do. And anyone can pay to watch.
QTCinderella’s likeness was overlaid onto a pre-existing adult video without her permission. What’s worse, her fellow high-profile streamer (and so-called friend) Brandon Ewing, aka Atrioc, was caught viewing it: fans spotted the deepfake website open on his computer screen in a video he himself had uploaded, and traffic to the site skyrocketed as word spread. He issued a tearful apology, but the damage was already done.
QTCinderella live streamed her response to the creators responsible, and to Atrioc himself. “I’m so exhausted and I think you guys need to know what pain looks like, because this is it. This is what it looks like to feel violated. This is what it feels like to be taken advantage of, this is what it looks like to see yourself naked against your will, being spread all over the internet. This is what it looks like,” she stated.
She continued: “Fuck the fucking internet. Fuck Atrioc for showing it to thousands of people. Fuck the people DMing me pictures of myself from that website. Fuck you all.”
I want to scream.
— QTCinderella (@qtcinderella) January 30, 2023
Stop.
Everybody fucking stop. Stop spreading it. Stop advertising it. Stop.
Being seen “naked” against your will should NOT BE A PART OF THIS JOB.
Thank you to all the male internet “journalists” reporting on this issue. Fucking losers @HUN2R
Technology has moved fast since Jordan Peele first made waves with a deepfake video of Barack Obama back in 2018. The filmmaker teamed up with BuzzFeed to make it look like the former US President was saying things he never said, using nothing more than the simple-to-use FakeApp. That spoof alone proved how easily fake images and videos could spread around the globe.
And now, as technology advances, so do the opportunities for the sexual exploitation of women through deepnudes. In October 2020, visual threat intelligence company Sensity AI published a report stating that over 680,000 women had had fake images created of them, in some cases from as little as a single photograph.
The effects of such violations can be far-reaching, causing mental health issues like depression and disordered thinking. QTCinderella shared via Twitter that the events had negatively affected her self-perception:
The amount of body dysmorphia I’ve experienced since seeing those photos has ruined me.
— QTCinderella (@qtcinderella) January 31, 2023
It’s not as simple as “just” being violated. It’s so much more than that.
The streamer vowed to sue the people who made the website, but because AI technology is still relatively new, that won’t be easy. Federal revenge porn law in the US does not explicitly address deepfakes, so gaining justice may prove impossible for her, or at the very least extremely complicated.
What’s worse is that the videos have opened the door to internet trolls weighing in on the events, many jibing that QTCinderella has nothing to be upset about and claiming that such abuse “comes with the territory” of being famous. Spoiler alert: it doesn’t.
SCREENSHOT spoke with psychologist Zoe Mallett on internet security as well as how emotionally exhausting and draining online abuse can be. The expert explained: “Firstly, it’s okay to feel heightened emotions if you are being abused online, it’s the same if you’re being abused in real life. It hurts, we feel rejected, we feel sad, and it can make us feel isolated.”
Mallett continued: “Often, we can feel ashamed or embarrassed so we don’t talk to our friends or family about it. Especially if we feel that telling our family may run the risk of them policing us with where we spend our time online. We also have to take into consideration that online bullying and trolling is still quite new territory, and it’s hard to know who is behind the comments, and their purpose.”
Online abuse is gaining greater traction within academic research and, as Mallett explained, theorists have found that “trolls possess dark personality traits, including psychopathy, narcissism and sadism. This can help us better understand that, statistically, this abuse is coming from those experiencing very serious disorders. We can look at this to help us try and take away feelings of the comments being a personal attack.”
Of course, this reassurance feels lacklustre when you realise that individuals are now charging minimal amounts of money to make deepfakes. For the price of lunch, anyone can request an embarrassing or explicit fake video be made of an ex-partner, family member, supposed friend or classmate. It’s an incredibly unsettling reality.
It’s impossible to control the behaviour of anyone but ourselves. But there are things we can do to safeguard our personal information. For starters, we can set our personal accounts to ‘private’, make sure our passwords are secure, and block any suspicious accounts we don’t recognise.
Put aside some time to conduct an online audit. Have a think about how you look to the outside world, and whether you’re okay with that. Delete or archive any images or captions you wouldn’t feel comfortable being shared.
Mallett also recommends speaking to trusted individuals about your online activity and taking time away from your screen to consider other perspectives and distance yourself from the intensity of social media and internet culture.
Think twice before jumping on the latest social media trend, too. Ask yourself why the website or app needs your details, and if you’re comfortable sharing them. Would you want your information or photographs to be stored (and possibly sold on) by people you don’t know?
Something to bear in mind next time you share a photo with 3,000 people you’ve never met, or upload ten selfies to a brand new app in exchange for a sexy AI avatar—the results aren’t always as glamorous as you may think. As the expert noted, “You can get the same dopamine hit that the online world gives you with human touch, human connection and getting out into nature. Keep reminding yourself that there is a world outside of the online one.”
Props to QTCinderella for speaking out on her experience, drawing attention to this complicated issue, and reminding us about the sometimes dangerous consequences of sharing personal pictures online.
Remember the website This Person Does Not Exist which, when visited, greeted browsers with the face of a stranger, one that would completely change into a new stranger’s face whenever the page was refreshed? Those faces were all deepfakes developed by algorithms, hence the name of the website, and since it appeared there has been an influx of computer-generated images.

The site’s creator, Phil Wang, built it using a development in machine learning called the Generative Adversarial Network (GAN). A GAN pits two neural networks against each other in a game, encouraging each one to learn from the other’s mistakes.
To train a GAN, you feed it images of the thing you would like to generate more of. One network, the generator, then tries to produce convincing new images, while the other, the discriminator, learns to distinguish those fakes from the real ones (so the generated images stay true to the original form, in a sense).
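For the technically curious, here is roughly what that adversarial loop looks like in practice. This is a minimal sketch in PyTorch (our choice of library, not one named by the site’s creator), and rather than faces it learns a toy cloud of 2D points, but the core idea is exactly the one described above: the generator makes fakes, the discriminator judges them, and each improves by exploiting the other’s mistakes.

```python
import torch
import torch.nn as nn

def real_batch(n):
    # "Real" data: points sampled from a Gaussian blob centred at (2, 2).
    return torch.randn(n, 2) * 0.5 + torch.tensor([2.0, 2.0])

# Generator: turns random noise into candidate 2D points.
generator = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
# Discriminator: outputs one logit per point, scoring its "realness".
discriminator = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    # 1) Train the discriminator: real points should score high, fakes low.
    real = real_batch(64)
    fake = generator(torch.randn(64, 8)).detach()  # detach: don't update G here
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1))
              + loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # 2) Train the generator: make fakes the discriminator scores as real.
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# After training, generated points should cluster near (2, 2).
print(generator(torch.randn(5, 8)))
```

Swap the toy points for millions of photographs of faces and you get something like This Person Does Not Exist; swap them for scraped images of real women and you get the abuse described below.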
As with many AI models, GANs require a large amount of data: the more data, the more variation. As a concept, this is both interesting and potentially useful, depending on what the GAN is used for. However, collections of images of naked women scraped from porn production companies are now being used to generate deepfake porn. Some of the companies involved have also previously been accused of coercing women into having sex on camera without their consent, just so you can get an idea of the characters we’re dealing with here.
The dataset circulating in deepfake porn creation communities online includes images scraped in particular from the porn production company Czech Casting, which local police have accused of human trafficking and rape, as well as still images taken from other porn sites. Czech Casting’s founder is currently a fugitive on the FBI’s most-wanted list.
Much like This Person Does Not Exist, the dataset is being used to create photorealistic images of nude women who aren’t real and who don’t resemble any one person. Essentially, it’s porn generated entirely by AI: deepfake porn, or deepnudes, as the concept has previously been dubbed. On the face of it, these algorithmically generated images aren’t technically doing much harm, since the women in them are not ‘real people’. However, legal experts, technologists and the women included in the datasets describe these creations as uniquely dehumanising.
Honza Červenka, a lawyer at the McAllister Olivarius law firm who specialises in revenge porn and technology, has been following Czech Casting’s case. He told Vice that the idea that running these images through an algorithm ‘anonymises’ them, and therefore makes them less harmful, is a red herring. “It’s mad science really, and completely and utterly re-victimizing to the victims of the Czech Casting perpetrators,” he said.
The Czech Casting case does not stand alone, and since technology cannot simply be reversed in its tracks, these tools will undoubtedly keep developing. For now, a GAN is incapable of generating videos comparable to real porn footage; the best it can do is produce still images. In the long run, though, the difference between real and computer-generated videos will shrink until AI-generated porn potentially becomes mainstream, and that shift raises several issues.
Studio-made pornography may still exist, but it might become more of a niche interest for people who want to watch ‘real people have real sex’ rather than computer-generated videos. Recruitment agencies and production companies might even start aligning themselves with the firms behind this AI-generated pornography, possibly supplying them with data.
A huge issue with the algorithms used to create this kind of porn is that, because they can be trained on whatever data they are fed, the technology has the worrying potential to be used to target specific people, or anyone fitting particular criteria. For now, a GAN can only produce variations of what it has already seen, since that is how it generates data, but the potential remains.
Computer scientist and co-founder of SketchDeck, David Mack, attempted to build an AI that generated porn and wrote an article on his experience. In it, he reflected on the fact that on a macro scale, if his project were a success, it had the potential to change the world. “Many people are harmed in the production of pornography, and this project (given the very low cost of producing new images) would supplant them. Pornography would harm fewer people.”
On the other side, however, he wrote that “this would displace workers and reshape an industry.” He concluded that he had struck a corner of hypocrisy within our society and technology: pornography accounts for a major share of internet usage and many people’s daily lives, and this new technology could potentially mitigate the darker and more dangerous sides of the porn industry, such as sexual abuse and trafficking.
All in all, there are benefits to the idea, but as long as the porn industry abuses the women within it, as in the Czech Casting case among others, the negatives outweigh the benefits. Without a fully ethical system for generating the data behind such a project, its future remains unclear. One thing we know for sure is that this technology, and the industries it circles, will move fast.