
Opinion

Revenge porn spikes during COVID-19 lockdown. Here’s everything you need to know

By Sofia Gallarate

May 21, 2020



Video and phone sex, sexting, sending nude pictures, having Zoom orgies, streaming porn, you name it, we’ve probably all done it (or some of it). Months spent in forced isolation have taken what we call ‘internet sex’ to a whole new level. But as you should know by now, the internet is not all good. While most of us were enjoying sexy talk safely, since 23 March the number of people contacting the Revenge Porn Helpline—a UK government-funded service that helps victims of intimate image abuse—has doubled. The quarantine hasn’t only affected the number of domestic violence cases; it has also seen a surge in online sex-related harassment.

Quarantine saw a spike in revenge porn

According to Clare McGlynn, a law professor at Durham University interviewed by the BBC for the article Coronavirus: ‘Revenge porn’ surge hits helpline, the overuse of social media combined with the psychological stress of the COVID-19 pandemic may have played an important role in triggering abusive behaviours in individuals already at risk. The result has been a rise in the circulation of revenge porn.

Revenge porn is a way for partners or ex-partners to impose control over someone; to threaten and shame their victims without physical involvement. The consequences of the non-consensual sharing of intimate pictures or videos online can be overwhelming for the victim—in a split second, thousands of people could watch and comment on them. Beyond the emotional and psychological toll of such public exposure of an otherwise intimate image or video, the victim is then left to battle to have the content removed from the internet as quickly as possible, which isn’t always easy, as we learnt from the Fappening scandal.

During the first month of lockdown, over 200 cases were opened by the Revenge Porn Helpline—a disturbing record since revenge porn became a criminal offence in the UK in 2015. Today, perpetrators face a maximum sentence of two years in prison, and on 6 May it was announced that, starting this summer, threatening to publish intimate visual content might also be made a criminal offence. This would mark a crucial step in the fight against online sexual abuse.

While lawmakers continue to adapt the justice system in response to this relatively recent form of abuse, the online platforms that host revenge porn are also introducing stricter rules to help contain this toxic phenomenon.

A balance between human empathy and AI technology

Revenge nudes circulate widely on Facebook, 4chan, Telegram channels and other websites solely dedicated to the sharing of non-consensual intimate imagery. In response, victims and activists are calling on the platforms’ civic responsibility to make sure the issue is confronted from all sides. For instance, after facing heavy pressure from its users, Facebook formed a team of approximately 25 people who work on stopping the non-consensual distribution of intimate images.

With around half a million reports filed each month, the team has two main goals: actively removing the content reported by users and finding potentially harmful images the second they are uploaded to the platform.

AI has been used by social media platforms to aid in the identification of hate speech, violent content, fake news and harmful propaganda, and it’s no different with revenge porn. Some believe that AI could recognise revenge porn if it is first trained on a large collection of data containing nude pictures paired with captions such as “look at this” and denigrating emojis, in order to refine its recognition process.

But many remain sceptical about AI’s ability to identify and understand the vengeful context behind the sharing of an intimate image—an attribute that has so far been considered intrinsic to human empathy. Speaking to NBC News about Facebook’s attempts to ban revenge porn from its platform, Katelyn Bowden, founder of BADASS (Battling Against Demeaning and Abusive Selfie Sharing), a Facebook victim advocacy group she launched after being a victim of revenge porn herself, said: “I believe they are taking it seriously, but they are looking for a technical solution to a human problem.”

Bowden was invited by Facebook as a consultant to help the social media platform tackle its growing problem. The truth is, a team of 25 reviewers cannot do the job alone, and neither can AI without the support of human moderators—who, according to Bowden, would need to grow into a much larger team to truly have the capacity to respond to the surge in revenge porn on the platform.

The breach of sexual privacy and the non-consensual circulation of intimate content create an unbearable sense of emotional distress and shame for victims. Responsiveness and better-functioning support from both law enforcement and the platforms hosting this sort of content are strongly needed, and could at least help victims regain control over something that feels out of their hands. Furthermore, the spread of revenge porn should not be tackled exclusively through filtering strategies; sex education and conversations around consent should be at the top of the prevention agenda.