This week’s Massively Overthinking comes to us from Kickstarter donor Dahui, who asks,
“What do you think MMO developers can do to try to minimize the toxic behaviors that are so prevalent in some of the bigger name MMOs?”
I posed Dahui’s question to the writers, and now I pose it to you.
Brendan Drain (@nyphur): The reality is that the internet allows people to evade the consequences of their actions. When there are people out there actively bullying others over the internet while remaining unpunished or even completely anonymous, we can hardly expect everyone to be friendly in our online games. There have been cases in which the media and Twitter mobs have ruined people’s lives and nobody was punished, to say nothing of those who have sent SWAT teams after Twitch streamers or called in bomb threats against them. In Northern Ireland, we used to call it “10p terrorism” when bomb threats were called in from public phone boxes, but the internet has made this free and given it global reach. Law enforcement worldwide has some catching up to do in tackling these most serious of cases.
That said, there are a few ways game developers might be able to eliminate abusive players and behaviour from their games. Using a buy-to-play business model rather than free-to-play would help put a price on consistently breaking the rules, and gameplay with long progression systems may make people value their game accounts more so that they’d be less willing to risk a permanent ban. I think Guild Wars 2 also stumbled on a nice recipe for making the community more friendly when they eliminated competition from normal gameplay by giving every player their own separate full reward for every monster or event they fight.
The vast majority of toxic behaviour in MMOs stems from competitive gameplay or difficult group gameplay, both of which place your success partly in the hands of other players. There’s always a temptation in team games to blame your teammates for your losses rather than admitting that you made a mistake or were simply outplayed, and that naturally leads to toxic behaviour. League of Legends has seen some success in battling this by handing out automated temporary chat bans for infractions and giving the players report cards showing exactly what they did wrong and how to avoid escalating punishment. The vast majority of those who receive such a ban reportedly reform their behaviour, and that means fewer cases for the GMs to handle.
Brianna Royce (@nbrianna, blog): To a degree, they can’t. Some people create toxicity wherever they go, and once they’re inside a game, there’s nothing developers can do but weed them out, one by one, an extremely expensive process. It’d be nice if every MMO studio could throw bajillions of dollars at the community problem the way Riot Games is doing, but instead we’ll have to be content with learning from Riot’s research.
In the meantime, I’d like to see MMO studios work on prevention. Some games actively seek miscreants as their core constituency, and once your community has been tainted by or overrun by toxic players, it’s almost too late to salvage it. We need to start in the beginning, in the alphas and betas of video games. Devs need to stop the swarming behavior on their forums and streams and social media early on, demonstrate that asshattery is unwelcome, and prevent that culture of toxicity from ever forming in the first place. That can come from within the game, with game mechanics designed to reward good behavior and cooperation, and it can come from without, too: Studios like Daybreak and Trion have gone so far as to block players from their games based on bad behavior outside of them.
Eliot Lefebvre (@Eliot_Lefebvre, blog): One of the things that I’ve been thinking about over time is that a lot of it comes from treating other players as fundamentally resources. This isn’t a mindset unique to any specific game style; it’s just that relying on other players more directly and visibly narrows your focus. If no one has crafted the sword you want to buy and started selling it on the auction block, you’re annoyed with crafters in general; if you’re in a dungeon with a specific player who is not making a run easy, you’re annoyed with Craig.
There’s no easy solution, but there are tools to minimize toxic behaviors, and one of those is to not make the only possible route to advance in the endgame amount to “get locked in a room with seven to 39 other players repeatedly every week so that your success relies upon them even if you know what you’re doing.” It’s also exceedingly good form to have systems for commending other players, offering praise for those who do things well and keeping players motivated to be kind to others. Encouraging positivity and allowing players to choose their own means of advancement both make the game more pleasant for the people playing it.
A lot also depends on how a game’s community is moderated, just the same. EVE Online has a contentious and nasty environment not just because of the mechanics that encourage self-serving competition but also because the studio behind the game actively encourages players to be cutthroat and rewards members of the community who do so. The punishments given to offending players tell players what the designers want the game to be, so it falls on the side of the studio to determine what sort of environment they ultimately want to encourage.
Jef Reahard: Toxic behavior is present wherever humans are present. There is no effective (or cost-effective) way of stopping undesirable behavior, and that’s assuming we can all agree on what “undesirable” behavior actually is, which is unlikely in today’s politically correct America.
Someone will probably say that we should do away with internet anonymity, but no, we shouldn’t, because there are no data showing that to be an effective deterrent, especially given the ease of IP spoofing, acquiring unlimited numbers of F2P accounts, etc. And of course, like any other draconian big brother remedy, removing anonymity can and does cause more harm to the rule followers than the rule breakers.
TLDR, /addignore the toxic people and move on with your gaming.
MJ Guthrie (@MJ_Guthrie, blog): Egads, that’s not an easy order to fill, is it? Unless you want to go so far as to eradicate in-game chat and all forums so people cannot interact at all, there is always going to be some level of toxicity thanks to the nature of humans and the internet. However, I do think there are steps that can mitigate it to a fair degree. These include an easy report system (click on the name, don’t make me have to hunt through a slew of UI windows!) with real people handling complaints who can actually investigate problems, an account-wide block ability, and easy-to-use chat filters. There needs to be a strictly enforced zero-tolerance policy for harassment, along with serious repercussions for false reporting. Forum behavior should affect access privileges to the game and vice versa: Be a troll in one spot, and you lose access to the other.
Oh, and devs need to never exhibit a “gamers will be gamers” attitude that excuses vile behavior as just a part of gaming. That just sets the whole tone for a game.
Beyond that, the ability to totally turn off other players (their chat, their visual presence, everything) as a block feature would be pretty snazzy! I am talking about levels ranging from hiding just the specifically named folks on a block list, to showing only guild members, all the way up to the point that no one even shows up on your screen unless he or she is on your friends list!
Tina Lauro (@purpletinabeans): Developers can only do so much about toxic players without nannying us to the point of ruining the social experience that’s supposed to be a big part of MMOs. They most definitely should monitor player interactions and have a robust reporting system in place, but I think the real answer might lie within the community. Players can deal with issues more directly since they’re the ones who are a part of the social experience, and every time someone pipes up or kicks someone from a group to quell any toxicity, they make that social experience better. As I’ve mentioned before, I liked rating party members anonymously in World of Warcraft and would like to see further iteration on that kind of community-led rating system going on in MMOs. That way, we’d have a good idea of how someone has previously conducted themselves in our favourite game before deciding to party up or guild invite them.
Patreon Donor Roger: To me it’s a simple answer: community managers and reachable GMs. They are the front line of defense against toxic behavior. The next step would be community tools like reporting and blocking, backed by people who can investigate the reports. In my experience, these work well in lowering toxicity as long as the company is willing and able to dedicate resources to the community team.