Lawful Neutral: Defining toxicity in the MMO industry

    

One of my earliest memories from World of Warcraft isn’t really a positive one. It was back in the Wrath days. My guild had just recruited a new member, who jumped into Ventrilo (heh) to hear us chat. One of my guildies was a woman. When this new member heard her speak over Vent, he immediately sent her a message and asked, “Are you really a girl who plays WoW?” She was. He blundered on and asked whether she had a boyfriend (she did) and then some far more inappropriate questions I won’t repeat here.

Being an officer in the guild, she immediately kicked this person. His response was to post in general chat in Dalaran and suggest that the server’s denizens all call my guildie a slut. General chat obliged. She got thousands of tells from people calling her all sorts of names. She couldn’t keep up with the reporting and was forced to log over to an alt. Her mailbox was completely full of harassing messages from players she didn’t even know, and they kept coming. It was weeks before the harassment from strangers in WoW stopped, all at the command of one random toxic misogynist gamer.

That was my first experience with really toxic behavior in an MMORPG. It’s the kind of thing that stuck with me. But nowadays, it’s almost the cost of playing games online. Toxicity has become such an issue that games have died, websites have been shut down, and yes, lives have been ruined and lost. Just on MassivelyOP, we have hundreds of articles about toxic behavior. That’s a lot of ink.

But gamers often can’t agree on what toxicity is. What causes toxicity? How are companies combating toxicity, and is it working? And where can game companies do better? In this special miniseries in our Lawful Neutral column, we’re going to dig into gaming toxicity, our response to it, and what we can do better. Today, we are going to start with the basics: Is toxicity really that much of an issue, and what spawns it in the first place?

Yes, toxicity is a big deal in our hobby

The stereotype of a tween screaming racist, bigoted, homophobic slurs into the mic in Call of Duty exists for a reason. We all have games that we avoid because of the known toxicity of the community. But in the last decade, toxicity and trolling behavior in gaming have reached beyond our hobby and received mainstream attention, as The Washington Post and The New York Times covered back in 2019 and Wired covered in 2020. Some streaming celebrities in gaming are harassed and swatted daily and have had to effectively go into hiding.

And the perception is that toxicity is actually getting worse. According to the Anti-Defamation League, 81% of adults aged 21-45 who played online games say they’ve experienced some form of harassment. About 64% of those who experienced harassment said it impacted their gameplay experience, and about 68% of online multiplayer gamers experienced severe abuse, including physical threats, stalking, and sustained harassment. These numbers are higher across the board than in the same report from 2019.

So what constitutes toxicity?

If there’s one thing that we as gamers are stellar at, it’s disagreeing about things. This is no different when we try to define what we consider toxic behavior. In fact, even the people who study toxicity in online spaces can’t agree: The definition of toxicity varies from theorist to theorist, as everyone defines his or her own framework. I’m going to focus on a single theorist’s approach. Dr. Rachel Kowert, psychologist and Research Director for Take This, takes a multi-step approach focusing on behavior, outcomes, and intent. I think her framework makes sense, accounts for the all-important context, and sets us up to explain a lot of the behavior that we see in the MMO gaming space.

Starting with behavior, Dr. Kowert first defines “dark participation” as “all deviant verbal and behavioral actions that take place on the internet.” Dark participation includes things like trash talking, using exploits, loot stealing, griefing, doxxing, harassment, or even posting comments on an article explicitly intended to break the commenting code but doing so in such a way that the poster can claim innocence and indignation at being moderated, something we’ve literally seen on MOP just in the last week (ahem). Dark participation, by Dr. Kowert’s definition, isn’t inherently bad. Instead, she says that dark participation can be considered toxic only in light of context and the cultural acceptance of a behavior.

She defines toxicity itself as “any outcome of these behaviors that cause harm to another [person’s] health or well-being.” So we can say that all toxicity results from dark participation, but not all dark participation results in toxicity, because toxicity adds the additional requirement of harm. Notice that being toxic doesn’t require the intent to be toxic, only that the behavior causes harm to another individual. Harm is also defined broadly: It includes harm to one’s well-being, not just demonstrable harm.

The final piece is the intent, which is how Kowert defines trolling. “[T]rolling refers to the intent of the perpetrator. In internet slang, a ‘troll’ is someone who sows discord on the internet with the deliberate intent of eliciting an emotional response or otherwise disrupting on-topic discussions and actions among other players. Deliberate intent being the key phrase in this definition.” So behavior (dark participation) with the intent (trolling) to cause harm (toxicity) makes up this framework.

For example, imagine playing a game with friends you know very well, where you all trash talk each other and laugh about it. It’s friendly and fun and built on the context of those relationships. In that case, you are engaging in dark participation in the game because you’re trash talking (the deviant behavior), but it’s not toxic in this instance because the context of the situation is that you and your friends are fine with this interaction. The outcome here wasn’t negative, so it wasn’t toxic.

Now imagine the exact same situation, but you are playing the same game with people whom you don’t know at all. You engage in the same trash talking you did with your friends because that’s how you’re used to interacting, but those random people don’t take it as friendly and fun because they don’t have the context to understand your intent. Those random players walk away from the game feeling hurt and insulted. In this case, you are still engaging in dark participation, but this time the outcome was negative, so the behavior was toxic. It wasn’t trolling, though, because you weren’t intending to cause harm.

Finally, imagine that you are running a dungeon and you see someone who’s putting out low DPS. You start trash talking that player, berating and insulting them, hoping that they’ll leave so you can get a player with better DPS. In this case, you are trolling because you engaged in dark participation with the intent to cause harm.
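If it helps to see all the moving parts at once, here’s a quick sketch in Python of how I read the framework. To be clear, this is my own toy illustration, not anything from Kowert’s published work; the Interaction fields and the classify function are entirely made up for this example.

```python
from dataclasses import dataclass

# A toy model of the framework: "dark participation" is the deviant
# behavior itself, "toxicity" is a harmful outcome of that behavior,
# and "trolling" adds deliberate intent to cause harm.

@dataclass
class Interaction:
    deviant_behavior: bool  # e.g., trash talk, griefing, loot stealing
    caused_harm: bool       # did the outcome hurt someone's well-being?
    intended_harm: bool     # was the harm deliberate?

def classify(i: Interaction) -> str:
    if not i.deviant_behavior:
        return "ordinary participation"
    if i.intended_harm:
        return "trolling"           # dark participation + intent to harm
    if i.caused_harm:
        return "toxic"              # dark participation + harmful outcome
    return "dark participation"     # deviant, but benign in this context

# The three scenarios above:
print(classify(Interaction(True, False, False)))  # friends trash talking -> dark participation
print(classify(Interaction(True, True, False)))   # strangers walk away hurt -> toxic
print(classify(Interaction(True, True, True)))    # berating the low-DPS player -> trolling
```

Note that the intent check comes before the outcome check: Per Kowert, trolling is defined by the perpetrator’s intent, so it applies whether or not the target actually ends up harmed.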

There are two important takeaways here. The first is that context is incredibly important: A behavior in one context might not be toxic, but the exact same behavior expressed in the same way in a different situation could be. The second is that we can unintentionally contribute to a toxic environment. Trolling, or intentionally causing harm, is not the only way to create one.


Why toxicity?

So what actually creates toxic environments? The answer is… lots of things. A combination of personal and social influences creates toxicity. (There are also technical influences, but I’ll cover those in a different post.)

Kowert points to Dr. John Suler and the Online Disinhibition Effect: Factors unique to online environments, like anonymity and feeling “invisible,” can cause people to do things they otherwise wouldn’t. This expresses itself as “benign disinhibition” (or as I like to think of it, oversharing on the internet) and “toxic disinhibition,” whereby people are more likely to be toxic and mean to each other.

She also points to social cognitive theory, which is based on social learning. Kowert says that toxic and trolling behavior is learned from communities that are already toxic. This squares with what we know in gaming: Some games, like Rust, have a persistent reputation for toxicity. WoW is big enough to have subcultures, but many of those subcultures have reputations for being toxic, and the people who join them also tend to turn toxic.

Finally, Kowert references the Theory of Planned Behavior, which states that the intent to engage in toxic behavior varies based on specific context. This means that someone is more likely to engage in toxic behavior if the group is accepting of that behavior. So someone might be toxic in a toxic guild because it’s accepted behavior but be a decent and respectful human being in another guild because it’s not accepted behavior there.

Other researchers have argued that it’s competitive games, rather than violent games, that tend to create aggressive behavior. (I do note that the study I’ve linked here only loosely correlates aggression with toxicity, which is certainly debatable.) But even anecdotally, we’ve seen that PvP-focused games are more prone to toxic communities than PvE games, though of course there are always exceptions.

I think this last one will be a surprise to literally no one: Trolling behavior has a strong correlation with sadism and psychopathy. The more someone trolls, the more likely he is to have sadistic and psychopathic tendencies in the rest of his life as well. Many people troll because they legitimately enjoy harming others.

Our takeaways from the psychology behind toxicity are, again, not earth-shattering. Toxic environments are self-perpetuating: People are more likely to engage in toxic and trolling behavior in communities where that behavior is expected. And because of the Online Disinhibition Effect, gamers are more prone to both benign and toxic disinhibition online than they would be offline.

In our next piece in this series, I’ll take a look at ways that some game companies are trying to combat toxicity – and some of the problems in those approaches.

Every other week, Andy McAdams braves the swarms of buzzwords and esoteric legalese of the genre to bring you Massively OP’s Lawful Neutral column, an in-depth analysis of the legal and business issues facing MMOs. Have a topic you want to see covered? Shoot him an email!