It took 60 seconds for Nina Jane Patel to be sexually assaulted in the metaverse. “A group of male avatars surrounded me and started to grope my avatar while taking selfies,” she recalls. “I tried to move away but they followed me, laughing and shouting. They were relentless.” Patel, a psychotherapist and metaverse researcher, swiftly took her headset off. “As I moved away, they yelled: ‘Don’t pretend you didn’t love it; this is why you came here.’”
Explaining the metaverse is like explaining Google to someone in the 1950s. Often referred to as “an embodied online world”, it is a 3D extension of the internet, comprising a series of virtual spaces where, by strapping on a VR headset or a pair of AR glasses, you can move, communicate and consume everything as you would in real life.
Currently, there is no fully formed, interoperable metaverse to speak of. But there are a number of independent virtual environments bringing us closer to this brave new immersive world. Among them are Horizon Worlds (not yet available in the UK) and Horizon Venues, both launched by Meta last year; Patel was assaulted in the latter.
Made by Oculus, which is also owned by Meta, the Horizon apps offer users access to a vast range of virtual activities by way of a Facebook login and a £299 headset. There are gigs, escape rooms and even “intergalactic trains”, whatever that means. After creating an avatar from the hips up (they don’t yet have legs), users move through this space in real time, mingling among other headset-wearing folk, who could be sobbing next to you in a virtual cinema while sitting in their living room on the other side of the world. The potential benefits of this technology are enormous. But, as is becoming increasingly clear, so are the dangers.
Research conducted by the Centre for Countering Digital Hate (CCDH) in December found there had been 100 potential violations of Meta’s policies for virtual reality in the space of 11 hours and 30 minutes. In addition to sexual harassment and assault, abusive behaviour highlighted in the report included racism, bullying, threats of violence, and “content mocking the 9/11 terror attacks”.
“It was clear from the outset of our research that extreme sexual content is common in the metaverse, and that manifests as sexual violence, too,” says Callum Hood, head of research at the CCDH. “We witnessed a number of users carrying out virtual sexual harassment of other users and recorded evidence of users being targeted with rape threats.” Among those targeted is immersive media specialist Catherine Allen, who was in the Horizon Venues lobby trying to work out which event to go to when she was approached by a fellow female avatar.
“She told me she was seven years old,” says Allen, noting her concern given that Horizon’s age limit is 13. After a few minutes of chatting, they were approached by a group of men. “They surrounded us and started making jokes about how they could gang rape us,” she recalls. After Allen informed the men that there was a child present (all avatars look like adults), they insisted they were “just messing about”.
Chanelle Siggens’s perpetrator was just as laissez-faire when he approached her in Horizon Venues, also in the lobby space, and simulated groping and ejaculating onto her avatar. “He shrugged as if to say: ‘I don’t know what to tell you. It’s the metaverse – I’ll do what I want,’” Siggens told The New York Times.
It’s not just women reporting sexual misconduct in the metaverse, either. In December, the journalist Hugo Rifkind reported being sexually molested on his second visit to Horizon Worlds. Writing in The Times, he recalled making his way to a virtual Billie Eilish concert with his friend Jeremy when “this fairly creepy-looking bald guy runs up, bends double and starts pumping his cartoon hands at our cartoon crotches”.
All of this is obviously alarming. But the thing that has perhaps been most disconcerting to metaverse researchers is how it has been perceived. When Patel first wrote about her experience on a Facebook group, she was met with a barrage of opprobrium, with comments including: “don’t choose a female avatar” and “don’t be stupid, it wasn’t real”. Similarly, when claims of sexual misconduct in the metaverse have been made on social media, users have responded with jokes and laughter emojis. The general consensus? Because these incidents happened virtually, they would have had little impact, if any at all. In short: get a grip. Of course, it’s not that simple.
“It took me a few days to process the event in its entirety,” says Patel. “Part of me wanted to shrug it off as ‘weirdos on the internet’ as I have after other incidents of online harassment. But this was different.”
Verisimilitude is fundamental to the success of the metaverse. The whole point, researchers say, is to create a fully functioning virtual environment that is as close to reality as possible. “Meta deliberately designed its Horizon apps so that there were almost no limits on how users interact with each other,” claims Hood. “The idea was that this would allow people to approach and interact with each other as easily as walking up to someone in the street.”
Psychologically speaking, there are clear consequences to this, particularly with regards to instances of sexual violence. “The intention of VR is to ultimately ‘trick’ the human nervous system into experiencing perceptual and bodily reactions within this different 3D space,” explains consultant psychologist Heather Sequeira. “Therefore, in a ‘virtual assault’, a person’s physical body might remain untouched, but the psychological, neurological and emotional experience can be very similar because the nervous system can’t tell the difference.”
Contextualise all of this within the victim-blaming culture we live in – where fault is consistently placed on sexual assault survivors as opposed to perpetrators, both in the court of law and of public opinion – and we have a big problem. One that, if this continues, will likely exacerbate existing issues surrounding violence against women and how seriously it’s taken by society and lawmakers.
“Without consent, any kind of sexual activity is sexual violence – that is true for activities that take place online or offline,” says Jayne Butler, CEO of the charity Rape Crisis. “There is a tendency to minimise online abuse or think of it as less damaging than that committed offline, but there is no sliding scale for sexual violence, and any form of it can be traumatising for victims and survivors. As with sexual violence that happens in person, online sexual violence can impact their sense of safety, leaving them feeling upset, scared and vulnerable.”
One of the biggest questions asked by those outside the metaverse research community is how it was even possible for sexual misconduct to happen there in the first place. Rather than a deliberate move, it seems to be an unfortunate byproduct of the way these systems have been designed: minimal restrictions mean maximum parallels to real life. “The online world is a microcosm of our society,” Butler points out. “And so the misogyny and harassment women and girls face on a daily basis can be reflected in the metaverse.”
Patel believes that as the metaverse develops, it will come to include all of the criminal and predatory activity we experience in real life. “Much like the internet, there will be dark metaverses, niche metaverses and a wide range of digital spaces for humans to engage,” she adds. “There is a pressing need to reconsider the design of the immersive world; my own experience is just the tip of the iceberg.”
So what is being done to combat this? Meta has responded to reports such as Patel’s by launching various new safety measures. These include a feature that makes an avatar’s hands vanish when it gets too close to another, and a “personal boundary” feature, which puts an automatic ring around avatars to “prevent anyone from invading their personal space”. Additionally, as on Meta’s other social platforms, Facebook and Instagram, users can mute, report and block people.
However, just how much of a difference any of this will make is not clear. While personal boundaries may prevent unwanted touching and kissing, they would not necessarily prevent verbal abuse. And even though you could block someone who makes inappropriate sexual comments towards you, by the time you’ve done that, hasn’t the damage already been done? “I don’t think in that specific moment the boundaries would have made much difference in my situation,” says Allen. “The issue was not about touching, but the intimidation and the fact they were surrounding us and talking about gang rape.”
What’s more, Meta’s own safety features only apply in its first-party social apps, like Horizon, which Hood says “are much less popular than some of the third-party social apps on its platform”. He adds: “Our research suggests that these features, which effectively prevent users from getting too close to each other, will only help prevent the minority of abuse and harassment in VR that involves close contact between users’ avatars.”
Nonetheless, Patel argues that Meta’s new safety features are a “step in the right direction”, one that could inspire other companies investing in the metaverse to enforce similar measures. “Everyone in this industry needs to accept the ethical responsibility that is incumbent upon us as gatekeepers of the future metaverse,” she adds.
The truth is, whether we like it or not, the metaverse is growing day by day and will, in all likelihood, soon come to define our online lives. “Virtually all of the tech giants are investing in VR technology, which is a sign that they view this as the future of the internet and of social media,” says Hood. The technology is developing rapidly, too: in an upcoming report, Allen predicts an increase in VR users wearing haptic clothing that would enable them to actually feel sensations in metaverse spaces – scent is already said to be in development.
Despite her experience, Patel insists that the metaverse is still “exciting” and “full of potential”. “At this moment, we have the choice to demand a metaverse that prioritises safety and accountability over anonymity,” she adds. “We have the information and principles to build future metaverses that do more good than harm and the opportunity is now; the safety and mental health of future generations depend on it.”