Charities including the National Society for the Prevention of Cruelty to Children (NSPCC) and Meic—a helpline for young people in Wales—are concerned that more young people have been sharing naked images of themselves during lockdown. Both said staff have seen a marked increase since the beginning of lockdown in March. The significant rise in the time teenagers spend online, combined with a lack of face-to-face interaction, is seen as a contributing factor to this new spike in sextortion.
Speaking to the BBC, Sabiha Azad, who works on Meic’s helpline for children and young people, explained that “Many young people want to send them because it has been normalised in terms of social media.” But there’s more to it than peer pressure. People—teenagers included—are at home alone and craving intimacy. Young people are less likely to know how to explore it healthily and can easily be pressured into sending things they otherwise wouldn’t share or do.
The charities said most cases were believed to involve 14 to 16-year-olds, with many cropping their heads out of photographs for ‘safety’. Often, they forget about other identifying details such as birthmarks, piercings or their bedroom’s decor.
Sextortion is defined as a type of revenge porn that employs non-physical forms of coercion to extort sexual favours from the victim. In other words, it is when someone tries to extort either money or sexual favours from someone else by threatening to reveal evidence of their sexual activity.
In this case, sexting blackmail is a specific form of sextortion in which people share nudes with someone who then blackmails them or shares their pictures without their consent. During lockdown, sexting blackmail saw a spike in the UK.
While sharing nudes between consenting adults is nothing to worry about, being pressured to do so at a young age can often be a sign of a controlling relationship. It is illegal for under-18s to send or receive nudes. And, despite campaigns created to tackle this, many risks to younger people have been overlooked during lockdown.
Azad added, “You can even get girls sending pictures on to their friends first to check if they look OK, or boys sharing the photos they get sent with others to compare them.” She further explained that during the pandemic, there’s a danger of forgetting young people—especially girls in this case.
And girls are the ones who bear the repercussions. After their private pictures are shared, they are much more likely to be referred to specialist services for support or to develop eating disorders because of the negative comments people make. “It’s a very intimate image being shared and it may be shared to your family members. It often goes through schools, so everyone in that year group will probably see it, if not more,” added Azad.
One 13-year-old girl, who remained anonymous, was duped into sending sexual photographs to someone she met online, whom she has since found out was an adult posing as someone else. She contacted ChildLine and explained that she met her blackmailer on Instagram and developed an online relationship with him.
“He convinced me to send pictures of myself which were sexual. Now he’s threatened to share those pictures with my friends unless I send him more,” she told the BBC. She added that she was too scared to tell her mum in case she got into trouble.
Another victim who is 14 met “a good-looking boy” on a teenage dating app who made her feel special while she was having a tough time at home. When he started asking for nude photos, she said she “agreed as a joke to talk dirty instead.” After a while, she became uncomfortable and she blocked him, only for him to get in touch through another app, threatening to publish her profile picture next to the dirty messages.
Because it is illegal for under-18s to send or receive nudes, this kind of situation can result in a vicious circle: victims face further blackmail, send further images, and can be bullied by other young people as a result.
In order to tackle this problem, young people are urged to seek support from an adult. Children and young people can speak with a ChildLine counsellor online or on the phone between 09:00 and midnight on 0800 11 11 or can also contact Meic in an online chat.
Video and phone sex, sexting, sending nude pictures, having Zoom orgies, streaming porn, you name it, we’ve probably all done it (or some of it). Months spent in forced isolation have taken what we call ‘internet sex’ to a whole new level. But as you should know by now, the internet is not all good. While most of us were enjoying sexy talk safely, since 23 March the number of people contacting the Revenge Porn Helpline—a service funded by the UK government to help victims of intimate image abuse—has doubled. The quarantine hasn’t only affected the number of domestic violence cases; it has also seen a surge in online sex-related harassment.
According to Clare McGlynn, a law professor at Durham University interviewed by the BBC for the article Coronavirus: ‘Revenge porn’ surge hits helpline, the overuse of social media combined with the psychological stress of the COVID-19 pandemic may have played an important role in triggering abusive behaviours in people already at risk. This, in turn, has contributed to the rise in the circulation of revenge porn.
Revenge porn is a way for partners or ex-partners to impose control over someone; to threaten and shame their victims without physical involvement. The consequences of non-consensual sharing of intimate pictures or videos online can be overwhelming for the victim—in a split second, thousands of people could watch it and comment on it. Beyond the emotional and psychological effects of having to face such public exposure of an otherwise intimate image or video, the victim is then left to battle the removal of the content from the internet as soon as possible, which isn’t always easy, as we’ve learnt from the Fappening scandal.
During the first month of lockdown, over 200 cases were opened by the Revenge Porn Helpline, a disturbing record since revenge porn became a criminal offence in the UK in 2015. Today, perpetrators face a maximum punishment of up to two years in prison, and on 6 May it was announced that, starting this summer, threatening to publish intimate visual content may also become a criminal offence. This would mark a crucial step in the fight against online sexual abuse.
While law enforcement continuously adapts the justice system in response to this somewhat recent form of abuse, online platforms that host revenge porn are equally creating stricter regulations to help contain this toxic phenomenon.
Revenge nudes circulate widely on Facebook, 4chan, Telegram channels and other websites dedicated solely to the sharing of non-consensual intimate imagery. In response, victims and activists are calling on the platforms to take civic responsibility and ensure the issue is confronted from all sides. For instance, after facing heavy pressure from its users, Facebook formed a team of approximately 25 people who work on stopping the non-consensual distribution of intimate images.
With around half a million reports filed each month, the team has two main goals: actively removing the content reported by users and finding potentially harmful images the second they are uploaded onto the platform.
AI has been used by social media platforms to help identify hate speech, violent content, fake news and harmful propaganda, and it’s no different with revenge porn. Some believe that AI could recognise revenge porn if it is first exposed to a wide collection of data containing nude pictures accompanied by sentences such as “look at this” and denigrating emojis, in order to perfect its recognition process.
But many remain sceptical about AI’s ability to identify and understand the revengeful context behind the sharing of an intimate image—an attribute that has been classified so far as intrinsic to human empathy. Speaking to NBC News about Facebook’s attempts to ban revenge porn from its platform, Katelyn Bowden, founder of BADASS (Battling Against Demeaning and Abusive Selfie Sharing), a Facebook victim advocacy group she launched after being a victim of revenge porn herself, said: “I believe they are taking it seriously, but they are looking for a technical solution to a human problem.”
Bowden was invited by Facebook as a consultant in order to help the social media platform tackle its growing problem. The truth is, a team of 25 reviewers is not able to do the job alone, and neither can AI without the support of human moderators, who, according to Bowden, would have to become a much larger team in order to truly have the capacity to respond to the surge in revenge porn on the platform.
The breach of sexual privacy and the non-consensual circulation of intimate content create an unbearable sense of emotional distress and shame for victims. Responsiveness and better-functioning support from both law enforcement and the platforms hosting this sort of content are strongly needed, and could at least help victims regain control over something that feels out of their hands. Furthermore, the spread of revenge porn should not be tackled exclusively through filtering strategies; sex education and conversations around consent should be at the top of the prevention agenda.