A woman has recently spoken out about being sexually harassed on Meta’s virtual reality (VR) social media platform. She’s not the first… and won’t be the last. Nina Jane Patel, a psychotherapist who conducts research on the metaverse, said she was left “shocked” after three to four male avatars sexually assaulted her on the VR platform of the umbrella company formerly known as Facebook.
“Within 60 seconds of joining — I was verbally and sexually harassed — 3-4 male avatars, with male voices, essentially, but virtually gang-raped my avatar and took photos — as I tried to get away they yelled — ‘don’t pretend you didn’t love it’ and ‘go rub yourself off to the photo’,” Jane Patel wrote in a Medium post on 21 December 2021.
The 43-year-old mother said it was such a “horrible experience that happened so fast” that she “froze” before she even had a chance to think about using “the safety barrier.” She went on to say that her “physiological and psychological” reaction was similar to what it would have been had the assault happened in real life. “Virtual reality has essentially been designed so the mind and body can’t differentiate virtual/digital experiences from real,” Jane Patel wrote.
While the whole concept of the metaverse is still in its early stages, Meta opened up access to its virtual reality social media platform, Horizon Worlds, back in early December 2021. Those lucky enough to get a hold of the futuristic universe described it as fun and wholesome, with many drawing comparisons to Minecraft and Roblox.
In Horizon Worlds, up to 20 avatars can get together at a time to explore, hang out, and build within the virtual space. But not everyone’s experiences have been this pleasant. As first reported by the MIT Technology Review, “According to Meta, on November 26, a beta tester reported something deeply troubling: she had been groped by a stranger on Horizon Worlds. On December 1, Meta revealed that she’d posted her experience in the Horizon Worlds beta testing group on Facebook.”
Meta’s response was to review the incident and declare that the beta tester should have used a tool called ‘Safe Zone’, part of a suite of safety features built into Horizon Worlds. The feature acts as a protective bubble users can activate when feeling threatened; within it, no one can touch them, talk to them, or interact with them in any way until they choose to switch it off.
Speaking to The Verge just after news of the incident started circulating, Vivek Sharma, Meta’s Vice President of Horizon Worlds and a man, called the incident “absolutely unfortunate” and added, “That’s good feedback still for us because I want to make [the blocking feature] trivially easy and findable.”
After first reporting her assault on the metaverse, Jane Patel shared that most of the comments she received on her post tried to put the blame on her rather than on her aggressors: “The comments were a plethora of opinions from — ‘don’t choose a female avatar, it’s a simple fix’, to ‘don’t be stupid, it wasn’t real’, ‘a pathetic cry for attention’, ‘avatars don’t have lower bodies to assault’, ‘you’ve obviously never played Fortnite’, ‘I’m truly sorry you had to experience this’ and ‘this must stop’.”
While it hasn’t been confirmed whether the incident reported to Meta was Patel’s, one thing is obvious: it’s not the first time a user has been groped in VR, which further proves that until companies work out how to protect participants, the metaverse cannot be a safe place.
In October 2016, gamer Jordan Belamire penned an open letter on Medium describing being groped in QuiVr, a game in which players—equipped with bows and arrows—shoot zombies. Belamire described entering a multiplayer mode, “In between a wave of zombies and demons to shoot down, I was hanging out next to BigBro442, waiting for our next attack. Suddenly, BigBro442’s disembodied helmet faced me dead-on. His floating hand approached my body, and he started to virtually rub my chest. ‘Stop!’ I cried … This goaded him on, and even when I turned away from him, he chased me around, making grabbing and pinching motions near my chest. Emboldened, he even shoved his hand toward my virtual crotch and began rubbing.”
“There I was, being virtually groped in a snowy fortress with my brother-in-law and husband watching,” she continued. At the time, QuiVr developer Aaron Stanton and co-founder Jonathan Schenker immediately responded with an apology and an in-game fix—avatars would be able to stretch their arms into a V gesture, which would automatically push any offenders away.
A recent review of the events around Belamire’s experience, published in the journal of the Digital Games Research Association (DiGRA), found that “many online responses to this incident were dismissive of Belamire’s experience and, at times, abusive and misogynistic … readers from all perspectives grappled with understanding this act given the virtual and playful context it occurred in.”
It’s important for people to understand that sexual harassment does not have to be physical. It can be verbal and, more recently, it can be virtual as well. Virtual reality spaces are designed to trick users into feeling physically present in them. That is part of the reason why emotional reactions can be stronger in those spaces, and why VR triggers the same psychological responses as real-life experiences.
In the end, the burning question is: whose responsibility is it to make sure users are comfortable? Meta hands off safeguarding responsibility to its users, giving them access to tools to ‘keep themselves safe’, effectively shifting the blame onto them. And that’s not right.
The US Army has faced a growing problem in its ranks for decades now. According to Pentagon data, between 2014 and 2019, the suicide rate for active-duty troops rose from 20.4 to 25.9 suicides per 100,000. In 2019, there were 7,825 reports of sexual assault involving service members as victims, according to The New York Times. In the last three months of 2020, suicides among National Guard troops nearly tripled to 39, from 14 over the same period the prior year.
But how do you prevent service members from committing suicide or sexually assaulting one another? You train them to intervene whenever they see alarming signs. And which tools do you use for this very specific type of training? Virtual reality (VR) headsets. At least, that’s what the Air Force is testing at the moment.
Years of prevention training—often in the form of somnolence-inducing PowerPoint presentations—have done little to reduce the rates of either problem. Whether the VR model can ultimately do better remains an open question, but military officials are encouraged by the early self-reported responses to the training.
So far, over 1,000 Air Force personnel have participated in the training, reports The New York Times. 97 per cent of those who tried it would recommend it, and trainees reported an increased likelihood of intervening with a person in crisis, Air Force officials added. Among those aged 18 to 25—a generation more used to interactive virtual experiences, and one that makes up the majority of new recruits—the impact increased sevenfold. Officials intend to train at least 10,000 airmen with the programme this year.
The VR programme is built on the fact that bystander intervention has been endorsed by experts as one of the few effective tactics against both problems. If trainees witness harassment in a bar, for instance, or alarming messages on social media suggesting a suicide threat, they’ll know how to approach the situation tactfully.
The programme places airmen in related scenarios with photo-realistic actors and coaches them on which responses would be constructive and which would not. “You are an active participant. You have to be ready. I think that it is going to help airmen retain and remember knowledge,” said Carmen Schott, the sexual assault prevention and response programme manager for the Air Force’s Air Mobility Command. She continued, telling The New York Times: “We don’t want people to feel judged. They may not make perfect decisions, but they will learn skills.”
In the military, many barriers can get in the way of people trying to intervene—especially against someone of a higher rank. As of now, airmen going through the programme have only been interacting with suicidal virtual colleagues via their headsets. Another bystander programme, which will roll out in July, will place the users in a bar, watching a scene of sexual harassment unfold.
But this VR programme is not the only educational tool taking a more hands-on approach. Gamification also seems to play a crucial part in successful and impactful teaching. With the growing use of computer-based therapy in mental health, and promising results from gamification in psychotherapy, the push to gamify sex education is part of a broader movement: deploying video games to target health issues ranging from depression to tobacco use.