Trolling, Griefing, and Harassment in Virtual Worlds: What the Newer Social VR Platforms Are Doing to Combat It

[Image: How do you deal with a troll? (image by Anaterate on Pixabay)]

There was a particularly irritating troll at Alfy’s Voices of Sansar competition this past Saturday. Trying to find and mute her (muting is currently the only tool available to us in Sansar) was an exercise in frustration: I had to hover my cursor over each avatar in the crowd watching the show until I found her. Gindipple has released some software that might help us the next time we get hit by a troll at an event:

[Screenshot: Gindipple’s “Eject or Ban” tool, 14 May 2018]
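
I haven’t examined how Gindipple’s tool works under the hood, but conceptually, a tool like this only needs to do one thing the built-in interface doesn’t: enumerate every avatar present in the scene, and let the event host eject or ban any of them by name from a list, instead of forcing you to hunt for the offender with your cursor. Here’s a rough sketch of that pattern in Python (Sansar scripts are actually written in C#, and every function name below is invented purely for illustration):

```python
# Conceptual sketch only: the scene API used here is hypothetical.
# The point is the pattern -- list everyone present, then act by name.

def list_avatars(scene):
    """Return a {display_name: agent_handle} map of everyone in the scene."""
    return {agent.name: agent for agent in scene.get_agents()}

def moderate(scene, name, action):
    """Eject or ban a named avatar instead of hunting for them by cursor."""
    agent = list_avatars(scene).get(name)
    if agent is None:
        print(f"No avatar named {name!r} is present.")
    elif action == "eject":
        scene.eject(agent)   # remove them from this instance
    elif action == "ban":
        scene.ban(agent)     # eject them and prevent re-entry
```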

We’ve been pretty lucky in Sansar so far; we haven’t seen anything like the levels of trolling and harassment that occur in the more popular social VR spaces like VRChat and AltspaceVR. (VRChat, in particular, is infamous for its griefing.) But we Sansarians all know the onslaught of trolls is coming, and every social VR platform is going to have to come up with its own technical solutions to the problem.

So, how are the other social VR platforms dealing with this issue?

 

Sinespace

Sinespace has pretty limited options as well. You can basically report and ignore other avatars around you:

[Screenshot: Sinespace’s Ignore and Report options, 14 May 2018]

 

VRChat

VRChat is taking the most controversial step: banning new users from uploading avatars or worlds until certain (unspecified) conditions are met, and taking those privileges away from existing users who misbehave:

Hello, VRChat! We’ve been working on some new “Trust” systems to help make VRChat a friendlier place. These systems will be used to help gate various features until users have proven themselves to be a friendly member of the community. One of the first parts of the Trust system is called “Content Gating”. This system is designed to reduce abusive or annoying behavior involving avatars or other content.

Here’s generally how it works. When a user first creates a new VRChat account, they will be unable to upload custom content like worlds or avatars. After spending some time in the app and having positive interactions with other users, they will eventually receive in-app and email notifications that their account has access to world and avatar creation capability. This time may vary from user to user depending on various factors.

If the new user chooses to spend time in VRChat behaving badly or maliciously against other users, they may lose the capability to upload content. They will receive a notification in-app and via email that they have lost access to content uploading. If they spend more time in the app and follow the Community Guidelines, then they will eventually regain access to these systems. Again, this time may vary depending on various factors.
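
Reading between the lines of the announcement, this amounts to a per-account trust score that rises with time spent behaving well and falls on misbehavior, with the upload privilege gated on a threshold. Here’s a minimal sketch of that logic in Python; the score, threshold, and penalty values are all my assumptions, since VRChat hasn’t published the actual factors it uses:

```python
# Hypothetical sketch of "Content Gating": VRChat has not published its
# actual trust factors or thresholds, so the values below are invented.

UPLOAD_THRESHOLD = 100  # trust needed to upload worlds/avatars (assumed)

class Account:
    def __init__(self):
        self.trust_score = 0
        self.can_upload = False

    def record_positive_interaction(self, points=1):
        """Time in-app and friendly interactions slowly raise trust."""
        self.trust_score += points
        self._update_gate()

    def record_violation(self, penalty=50):
        """Abusive behavior drops trust and can revoke upload access."""
        self.trust_score = max(0, self.trust_score - penalty)
        self._update_gate()

    def _update_gate(self):
        had_access = self.can_upload
        self.can_upload = self.trust_score >= UPLOAD_THRESHOLD
        if had_access != self.can_upload:
            # Per the announcement, the user is told in-app and by email.
            state = "gained" if self.can_upload else "lost"
            print(f"You have {state} access to content uploading.")
```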

The CEO of at least one competing metaverse company has said that he doubts this step will actually work as intended. In addition to these new sanctions, VRChat also has the ability to mute (so you can’t hear them) and block (so you can’t see them) other avatars in its pop-up user interface, as well as a “safe mode”, a sort of “nuclear option” that mutes and blocks all avatars that are not on your friends list.

VRChat is also temp-banning people who troll, although sometimes innocent users get caught in the crossfire. If I remember correctly, there is also a feature where you can ask the other avatars sharing your world to vote “yes” or “no” on ejecting a misbehaving user from that instance.

So, all in all, VRChat has developed the most fully evolved set of tools for dealing with trolling. But then again, they’ve been forced to.

 

AltspaceVR

Back in 2016, AltspaceVR introduced a “space bubble” to keep other avatars from invading your personal space. I do know that you can also mute other avatars who are annoying you. You don’t have an option to block offensive avatars in AltspaceVR, but then again, the avatar choices there are so very limited that you can’t really make an offensive one anyway!
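
A space bubble is conceptually simple: it’s a purely client-side check that stops rendering (or fades out) any avatar that comes within a set radius of you; the same idea applies to High Fidelity’s Bubble icon, discussed below. A minimal sketch, with the radius and data structures assumed rather than taken from AltspaceVR’s actual code:

```python
# Client-side "space bubble" sketch. The 1.5 m radius is an assumption;
# only YOUR client hides the intruding avatar -- nobody else is affected.

import math

BUBBLE_RADIUS = 1.5  # metres (hypothetical default)

def avatars_to_hide(my_position, other_avatars, bubble_enabled=True):
    """Return the IDs of avatars this client should stop rendering."""
    if not bubble_enabled:
        return set()
    return {
        avatar_id
        for avatar_id, position in other_avatars.items()
        if math.dist(my_position, position) < BUBBLE_RADIUS
    }

# Example: one avatar has invaded the bubble, one is a polite distance away.
print(avatars_to_hide(
    (0.0, 0.0, 0.0),
    {"troll": (0.5, 0.0, 0.5), "friend": (4.0, 0.0, 1.0)},
))  # prints {'troll'}
```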

I would load and run AltspaceVR to check all of these features out, but the latest version of the client software (where you get to choose your new “home” location) has completely locked up my high-end PC THREE. TIMES. tonight, and I am not going to risk trying it again! AltspaceVR seems to be experiencing some major growing pains. Seriously not impressed.

 

High Fidelity

High Fidelity has a Bubble icon on its tablet user interface that works similarly to the AltspaceVR space bubble:

[Screenshot: the Bubble icon on High Fidelity’s tablet user interface, 14 May 2018]

You can also mute nearby avatars, or set them to “ignore” so they can’t message you in-world. These are pretty much the same features the other social VR platforms offer; across the board, the tools in all of the newer social VR spaces are still pretty limited.

 

General Issues in Dealing with Trolling and Griefing

So, let’s move from specific technical solutions to a broader discussion: what’s the best way to deal with griefing, trolling, and harassment in online communities?

Dr. Mark Dombeck, in an article on the website MentalHelp.net, neatly outlines some of the issues in community and game design that affect trolling:

In my experience, manipulating perpetrator anonymity is an important factor in controlling griefers’/trolls’ antisocial behavior. The more easily identifiable and able to be held accountable for their actions community members are, the fewer instances of bad behavior you tend to see.

Allied with the idea of altering perpetrator anonymity is the idea of altering expectation of punishment. Accountability enables easier punishment. There are several ways that punishment can take place, however. Punishment can be very informal, where community members heap scorn on other members who violate the social contract or simply ignore them (by using filters within the community to literally make their presence invisible). This sort of informal punishment is what makes accountability effective all by itself. Accountability can also enable more formal varieties of punishment such as entry bans. In my experience, bans are the most useful way to discourage the really hardcore antisocial behavior that happens on communities. Punishment can never hope to eradicate all griefer/troll behavior, however, because the really hardcore griefers will thrive on punishment, seeing attempts by the management to eject them as high praise for their work.

Here are a few other elements of the community or game that can be manipulated and which might have an impact on reducing griefing/trolling behavior.

Setting up Initiation Barriers probably would affect griefing behavior. The easier it is to get into a community, the more likely that community is to become a target for griefers. In part this has to do with helping people to identify with and value the community and not take it for granted. When you have to do a lot of work to get into a community you are more likely to care for that community and not want to harm it. The problem here is that the same barriers that might keep out griefers also keep out legitimate members. It is difficult to set a barrier high enough to keep out one group without also keeping out the other group.

I’d expect that the more opportunity there is to act out griefer behaviors with a group of other griefers, the more often the behavior would happen. People tend to take less responsibility for individual actions when they are acting as part of a group or mob. This social psychological principle goes by several names including the bystander effect, and diffusion of responsibility. The solution here would be to limit people’s ability to socialize, but as that utterly defeats the purpose of the community it isn’t really much of a solution.

I would expect that manipulating the frame of the community or game can increase or decrease the chance that griefer behavior will occur. The frame of a game or community has to do with its identity – how members think of what they are doing when engaged in the game or community. If an interaction is thought of as a game and therefore not something that is real or important it is easier to self-justify doing mayhem. If an interaction is thought of as a more serious behavior such as part of a support group interaction, the urge to do mayhem is maybe less strong (for some at least). The Wired article talks about this issue somewhat indirectly, noting that Second Life members don’t think of what they do in Second Life as being part of a game but rather view it as a more serious community. The “non-game” frame of Second Life participants makes such participants more likely to view griefing behavior taking place within Second Life in non-game ways, such as considering it to be actual theft or terrorism.

Second Life has often been an arena for trolling because it’s very easy to create a free, throwaway account to be offensive with. If one account gets banned, the griefer can simply create another free one. All the newer social VR spaces share this problem, since they don’t want to discourage people from signing up and (hopefully) sticking around and generating income for the company.

There are no easy answers here. The best we can do is try various solutions and see if they prove effective or not. In these early days of the metaverse, we’re all still learning the best ways to design our communities to chain the trolls.

