You may remember the Fappening, also known as TheFappening or Celebgate: the 2014 hacking and leaking of hundreds of nude photos of over 100 celebrities, most notoriously Jennifer Lawrence and Rihanna. But do you actually know the man who was behind it?
It was assumed at first that hackers had worked out the celebrities’ Apple iCloud login details and sent phishing emails to their Gmail accounts to access their pictures. More than 400 images were leaked on 4chan and Reddit, quickly followed by stars threatening legal action. But in March 2016, a man admitted being behind the hack. Who is he?
Ryan Collins, now 49, is from the state of Pennsylvania in the US. While there aren’t that many pictures of him left on the internet, the image above was associated with his name. Ironically, it appears that he’s deleted his Facebook and other social profiles, which makes it harder for people to find pictures of him.
Collins is married with two children and studied science and technology at James Madison University in Virginia. It is also believed he worked for an e-commerce business and previously as a personal chef.
The US attorney’s office previously said that Collins ran a phishing scheme: he sent celebrities emails that looked like they came from Apple or Google, asking victims to provide their usernames and passwords, which then granted him access to their private accounts. Collins apparently used software to download pictures and other private content from their Apple iCloud accounts, and obtained more photos by pretending they were for modelling assignments.
Collins took a plea deal and admitted “gaining unauthorised access to protected computers to obtain information.” As part of the deal, prosecutors recommended that he should serve 18 months in federal prison, which Collins ended up doing, avoiding the maximum sentence of five years in jail.
That’s where things remain unclear. Collins was never charged with posting the images online, which means we may never know whether he is responsible for the Fappening leak itself or whether someone else did it.
Video and phone sex, sexting, sending nude pictures, having Zoom orgies, streaming porn, you name it, we’ve probably all done it (or some of it). Months spent in forced isolation have brought what we call ‘internet sex’ to a whole new level. But as you should know by now, the internet is not all good. While most of us were enjoying sexy talk safely, the number of people contacting the Revenge Porn Helpline—a service funded by the UK government to help victims of intimate image abuse—has doubled since 23 March. The quarantine hasn’t only affected the number of domestic violence cases; it has also brought a surge in online sex-related harassment.
According to Clare McGlynn, a law professor at Durham University interviewed by the BBC for the article Coronavirus: ‘Revenge porn’ surge hits helpline, the overuse of social media combined with the psychological stress of the COVID-19 pandemic may have played an important role in triggering abusive behaviour in people already at risk. The result has been a rise in the circulation of revenge porn.
Revenge porn is a way for partners or ex-partners to impose control over someone; to threaten and shame their victims without physical involvement. The consequences of non-consensual sharing of intimate pictures or videos online can be overwhelming for the victim—in a split second, thousands of people could watch and comment on the content. Beyond the emotional and psychological effects of such public exposure of an otherwise intimate image or video, the victim is then left to fight to get the content removed from the internet as quickly as possible, which isn’t always easy, as we’ve learnt from the Fappening scandal.
During the first month of lockdown, over 200 cases were opened by the Revenge Porn Helpline—a disturbing new record since revenge porn finally became a criminal offence in the UK in 2015. Today, perpetrators risk a maximum punishment of up to two years in prison, and on 6 May it was announced that, starting this summer, threatening to publish intimate visual content could also become a criminal offence. This would mark a crucial step in the fight against online sexual abuse.
While lawmakers continue to adapt the justice system in response to this relatively recent form of abuse, the online platforms that host revenge porn are also creating stricter rules to help contain this toxic phenomenon.
Revenge nudes circulate widely on Facebook, 4chan, Telegram channels and other websites solely dedicated to the sharing of non-consensual intimate imagery. In response, victims and activists are calling on the platforms’ civic responsibility to make sure the issue is confronted from all sides. For instance, after facing heavy pressure from its users, Facebook formed a team of approximately 25 people who work on stopping the non-consensual distribution of intimate images.
With around half a million reports filed each month, the team has two main goals: actively removing the content reported by users, and catching potentially harmful images the moment they are uploaded to the platform.
AI has been used by social media platforms to help identify hate speech, violent content, fake news and harmful propaganda, and revenge porn is no different. Some believe that AI could learn to recognise revenge porn if it is first trained on a large collection of data pairing nude pictures with captions such as “look at this” and denigrating emojis, refining its recognition process over time.
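To make the idea above concrete, here is a deliberately simplified sketch (not Facebook’s actual system) of how caption signals like gloating phrases and denigrating emojis might be scored to route a post to human moderators. All phrase lists, emoji lists, thresholds and function names here are hypothetical illustrations, not real platform rules:

```python
# Toy illustration of caption-signal scoring for abusive-image sharing.
# Everything below (signal lists, weights, threshold) is a hypothetical
# example of the approach described in the text, not a real system.

GLOATING_PHRASES = ["look at this", "check out my ex", "everyone share this"]
DENIGRATING_EMOJIS = ["😂", "🤮", "👀"]


def context_score(caption: str) -> float:
    """Return a crude 0..1 score for abusive-sharing context in a caption."""
    text = caption.lower()
    hits = sum(phrase in text for phrase in GLOATING_PHRASES)
    hits += sum(emoji in caption for emoji in DENIGRATING_EMOJIS)
    # Each matched signal contributes 0.34; cap the total at 1.0.
    return min(1.0, hits * 0.34)


def flag_for_review(caption: str, threshold: float = 0.5) -> bool:
    """Flag a post for human moderator review when enough signals accumulate."""
    return context_score(caption) >= threshold
```

In practice a deployed system would rely on a trained classifier over both image and text features, plus hash-matching of known abusive images to catch re-uploads; the point of the sketch is only that such signals escalate content to human reviewers rather than deciding the outcome on their own.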
But many remain sceptical about AI’s ability to identify and understand the revengeful context behind the sharing of an intimate image—something so far considered intrinsic to human empathy. Speaking to NBC News about Facebook’s attempts to ban revenge porn from its platform, Katelyn Bowden, founder of BADASS (Battling Against Demeaning and Abusive Selfie Sharing), a Facebook victim advocacy group she launched after being a victim of revenge porn herself, said: “I believe they are taking it seriously, but they are looking for a technical solution to a human problem.”
Bowden was invited by Facebook as a consultant to help the platform tackle its growing problem. The truth is, a team of 25 reviewers cannot do the job alone, and neither can AI without the support of human moderators—who, according to Bowden, would have to become a much larger team to truly keep up with the surge in revenge porn on the platform.
The breach of sexual privacy and the non-consensual circulation of intimate content create an unbearable sense of emotional distress and shame for victims. Responsiveness and better-functioning support from both law enforcement and the platforms hosting this sort of content are sorely needed, and could at least help victims regain control over something that feels out of their hands. Furthermore, the spread of revenge porn should not be tackled through filtering strategies alone; sex education and conversations around consent should be at the top of the prevention agenda.