The metaverse already has a groping problem

Katherine Cross, a researcher on online harassment at the University of Washington, says that when virtual reality is immersive and real, toxic behavior that occurs in that environment is real too. “Ultimately, the nature of virtual-reality spaces is such that they are designed to trick the user into believing they are physically in a certain space, that their every bodily action is occurring in a 3D environment,” she says. “That’s part of the reason why emotional reactions can be stronger in that space, and why VR triggers the same internal nervous-system and psychological responses.”

That was true in the case of the woman who was groped in Horizon Worlds. According to The Verge, her post read: “Sexual harassment is no joke on the regular internet, but being in VR adds another layer that makes the event more intense. Not only was I groped last night, but there were other people there who supported this behavior, which made me feel isolated at the Plaza [the virtual environment’s central gathering space].”

Sexual assault and harassment in virtual worlds are not new, nor is it realistic to expect them to disappear entirely. As long as there are people who hide behind their computer screens to evade moral responsibility, they will keep happening.

The real problem may lie in the notion that when you play a game or participate in a virtual world, there is what Stanton describes as a “developer-player contract.” “As a player, I agree that I can do what I want in the developer’s world according to their rules,” he says. “But as soon as that contract is broken and I no longer feel comfortable, the company’s obligation is to return the player to where they want to be and back to feeling comfortable.”

The question is: whose responsibility is it to ensure that users are safe and comfortable? Meta, for example, says it gives users access to tools to keep themselves safe, effectively shifting the burden onto them.

“We want everyone in Horizon Worlds to have a positive experience with safety tools that are easy to find – and it’s never a user’s fault if they don’t use all the features we offer,” said Meta spokesperson Kristina Milian. “We will continue to improve our user interface and to better understand how people use our tools so that users are able to report things easily and reliably. Our goal is to make Horizon Worlds safe, and we are committed to doing that work.”

Milian said users must go through an onboarding process before joining Horizon Worlds that teaches them how to launch Safe Zone. She also said that regular reminders are shown on screens and posters within Horizon Worlds.

[Screenshots of the Safe Zone interface, courtesy of Meta]


But the fact that the victim of the groping in Horizon Worlds either did not think to use Safe Zone or could not access it is precisely the problem, Cross says. “The structural question is the big issue for me,” she says. “Generally speaking, when companies address online abuse, their solution is to outsource it to the user and say, ‘Here, we give you the power to take care of yourselves.’”

And that is unfair, and it does not work. Safety should be easy and accessible, and there are plenty of ideas for making it so. For Stanton, all it would take is some kind of universal signal in virtual reality – perhaps Quivr’s V gesture – that could convey to moderators that something is wrong. Fox wonders whether an automatic personal distance, unless two people mutually agreed to be closer, would help. And Cross believes it would be useful for training sessions to explicitly set norms that mirror those of ordinary life: “In the real world, you would not randomly grope someone, and you should carry that over to the virtual world.”

Until we figure out whose job it is to protect users, one important step toward a safer virtual world is disciplining aggressors, who often go free and remain eligible to participate online even after their behavior becomes known. “We need deterrents,” Fox says. That means making sure bad actors are found and suspended or banned. (Asked what happened to the alleged groper, Milian said Meta “[doesn’t] share details about individual cases.”)

Stanton regrets not pushing for industry-wide adoption of the V gesture and not speaking out more about Belamire’s groping incident. “That was a lost opportunity,” he says. “We could have avoided that incident at Meta.”

If anything is clear, it is this: there is no body that is clearly responsible for the rights and security of those who participate anywhere online, let alone in virtual worlds. Until something changes, the metaverse will remain a dangerous, problematic space.
