A security guard in a public place has plenty of clear grounds to ask a guest to leave the premises or to ban them permanently: fighting, damaging property, disturbing other visitors, harassing the staff, and so on.
A bouncer's job in the metaverse, on the other hand, works differently. You don't need a solid physique to stand up to unruly visitors. It's enough to press a button that mutes a user's microphone or revokes their access to the system.
But understanding the behavior of "avatars" in digital space is harder than in real life. And controlling it is harder still.
The question of where users should be given full autonomy and where they should be monitored has been debated for almost two decades. There is still no answer.
The problem may seem insignificant now, but as investors and tech companies pour billions of dollars into developing new virtual spaces, they will increasingly need an answer.
"It's like inviting a whole lot of people over to your house, but you don't have any furniture." — Lorelle VanFossen
The company Educators in VR holds dozens of virtual events each month and assigns a team of moderators to oversee each one. In four years on the job, they've seen it all.
“The trolls and the disruptive people think that they’re new. They think it’s a new thing to stand in front of a stage and turn around and jerk off in front of everyone with hand gestures, and they think they’re inspired. Well, after you’ve seen it 85,000 times, you’re done.” – Lorelle VanFossen
Stories of users insulting and harassing other players are commonplace online. Some guests come to digital events purely to annoy others: running around the room, shouting racist remarks, or even threatening to "cut someone's throat."
It can be hard to identify an offender alone, so moderators coordinate through a voice service such as Discord. They warn each other about suspicious users and decide together whether an avatar is actually breaking the rules. After all, someone may wander from side to side not because they want to annoy anyone, but simply because their system is glitching.
Moderation in the metaverse is a constant waltz between trying to understand users' true motives and having to make quick decisions.
Moderators mute avatars with dogs barking in the background, block the path of those trying to climb on stage, break up a mass dance to the Macarena, and so on.
Some users are not technically savvy and do not realize that even their simplest real-world movements are mirrored in the virtual world. Sitting down, a character may "punch" the digital floor; standing up, it may shoot into the air. Someone clumsy with a headset can accidentally send their avatar scurrying around the room.
There is another problem: the limits of acceptable behavior differ between the real and virtual worlds. A person who showed up at an offline medical conference with purple skin would probably be asked to leave. In the metaverse, no one gives it a second glance, because it's just a cosmetic trait of the character.
Likewise, patting a stranger on the head in the real world would be clearly inappropriate. In the metaverse, it's a gesture from the so-called "gray" zone: unpleasant, but not a big deal.
There are too many nuances, so Educators in VR adjusts the severity of moderation to the event. At entertainment parties, moderators intervene only in extreme cases, while group meditation sessions have a zero-tolerance policy: there, an avatar can be banned simply for pacing around too much.
Lorelle VanFossen herself initially opposed such strict restrictions but later came to see them as justified. For some, meditation is daily therapy; for others, it's a way to unwind, relieve stress after work, and "patch up" their mental health.
The moderation policy also depends on the platform itself. AltspaceVR, for example, focuses on professional communities and polices behavior strictly. The multiplayer game VRChat, by contrast, is famous for its anarchy.
"It's the most disgusting, frustrating, stress-inducing, headache-inducing, mental health–depleting job on the planet." — Lorelle VanFossen
How moderation will work when millions of people gather in metaverses is unknown. AltspaceVR employs full-time moderators but leaves most of the work to users. Meta likewise lets avatars block offenders on their own. But in the future, users are unlikely to have the time and energy to police the behavior of countless people.
The way out is automation. For now, however, algorithms cannot even handle toxicity on text-based social networks. It is unclear how they would cope in a metaverse, where every action can be interpreted differently.