Toxicity. It’s plagued online spaces since long before there were MMORPGs and other online video games, as any old-school BBS user can attest. But it’s a problem we deal with all the same, maybe more so now that the lines between genres have blurred and gaming has evolved into an accepted mainstream activity.
And solving the MMO toxicity problem is the topic MOP Patron and The Exiled developer Alexander Zacherl has posed for us today.
“How can devs and players together make sure that MMO communities do not turn toxic? Would love to hear some best practice from other games.”
Let’s tackle toxicity in this week’s Massively Overthinking. Which studios are dealing with it best right now — and how? What actually works to keep communities clean and friendly? What works for preventing the problem vs. curing the problem? What would you like to see done that no one’s trying at all?
Andrew Ross (@dengarsw): Let’s start with something positive. Players are capable of policing themselves a bit, even if it means losing a game. That being said, that’s generally an exception to the rule that people on the internet can do horrible, mean-spirited things to strangers.
Blizzard, oddly enough, may have dealt with toxicity the best, in the same way I felt the MOBA Solstice Arena (RIP!) did: by severely limiting chat with strangers. People still find ways to be jerks to their fellow players, but it’s a decent start. The problem, of course, is that it only minimizes the problem rather than solving it. In fact, I generally feel that Blizzard harbors some very toxic players across multiple IPs. You’d think Battle.net would record how frequently someone’s reported or added to ignore lists across multiple games and force them to group with others of their ilk, but it doesn’t seem to be so.
It’s why I like Riot in theory. League of Legends just has too bad a reputation at this point for me to get friends to give it a shot, which is why I haven’t returned to the game in about a year, but Riot uses good theory and academic research to try to effect change, and it publishes the results so other companies can, you know, maybe try to address the issue too. In fact, LoL may be the game most famous for toxicity, but it’s not alone, and I’d argue that certain other big companies with multiple IPs and single-account-style launchers prevent Riot’s attempts from having long-lasting effects on casual players by giving them a refuge where they can return to their trollish behavior. Think of the bullies you knew as a kid: Maybe they got punished at school, but I felt the cruelest ones were learning how to do it better at home from their own family members. Had their families worked on redemption and self-improvement, maybe they could have grown. Truthfully, it worked for me.
What does Riot do well? For one, it uses statistics and “player tips” to remind players that, statistically, their attitude can change the outcome of their battles, helping to prevent the problem before it arises. For players who miss those tips (or willfully enjoy being trolls), Riot not only gives a “time out” but tells them why they’re toxic and pushes them to reform, at least temporarily. What’s best to me is that it’s even reinstated griefers who show real growth and remorse. While we all want justice, redemption should matter, and by doing this with popular or recognizable players, Riot gives small-time trolls who lose their way an example to follow if they really want to reform.
For any of this to work, though, players need to report toxic behavior. Even if you’re not personally offended, if you notice another player is understandably offended, stand with them. It doesn’t matter whether someone’s bad at a game; in the immediate moment, it’s much more helpful to give constructive criticism than to type out an insult.
As I strongly hinted at, though, we can do better. Companies like Valve (via Steam) and Blizzard could make sure that players on your ignore list don’t pop up in your small group encounters. Ideally, they’d work with more of Riot’s research, if not Riot itself, to find industry-wide solutions. In addition, if big, multi-game companies start keeping tabs on who’s being a mean little troll and force those players to play together, normal players can stay free of them. That, combined with Riot’s troll-feedback system, could ensure problem players see the difference between playing with people who act the way they do and, well, normal-functioning members of society who don’t shake off social norms when their faces are hidden behind a screen.
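Just to make that idea concrete, here’s a minimal sketch of what ignore-list- and reputation-aware matchmaking could look like. To be clear, this is my own illustration: the account fields, the report threshold, and every name in it are assumptions, not anything Battle.net or Steam actually exposes.

```python
# Hypothetical sketch only: no real Battle.net/Steam API is modeled here.
from dataclasses import dataclass, field

@dataclass
class Player:
    player_id: str
    reports_across_games: int = 0  # reports accumulated account-wide
    ignore_list: set = field(default_factory=set)  # player_ids this player ignored

REPORT_THRESHOLD = 25  # tunable: how many account-wide reports flag a troll

def is_flagged(player: Player) -> bool:
    return player.reports_across_games >= REPORT_THRESHOLD

def compatible(a: Player, b: Player) -> bool:
    """Two players may share a group only if neither has ignored the other
    and both sit in the same reputation tier (so trolls queue with trolls)."""
    if b.player_id in a.ignore_list or a.player_id in b.ignore_list:
        return False
    return is_flagged(a) == is_flagged(b)

def build_group(queue: list, size: int) -> list:
    """Greedily assemble the first mutually compatible group from the queue."""
    group = []
    for candidate in queue:
        if all(compatible(candidate, member) for member in group):
            group.append(candidate)
            if len(group) == size:
                break
    return group
```

The point isn’t the specific threshold; it’s that both signals already exist on a multi-game account and could feed the group finder instead of sitting unused.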
Going back to Blizzard and Hearthstone: Co-op modes in particular can help get players to act decently toward one another when combined with other strategies, and I say this as a PvP fan. For example, Tavern Brawls that have the players taking down an NPC trying to destroy them both may help to foster bonds in a game that tends to pit us against each other. It’s the MMO equivalent of a bonus for grouping, something MMOs have done well in the past (and something even some shooters use). Applied to PvP games, it might be interesting to let players declare “rivals”: players or even player organizations you get a bonus for engaging. This would cut down on the killing of randoms, and should one side bully the other, removing that status could curb harassment, as the other side would (hopefully) be more motivated to battle players who offer more rewards.
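Here’s a rough sketch of how that rivalry mechanic might be wired up, with the harassment-revocation rule attached; every name and number here is invented for illustration:

```python
# Hypothetical "rivals" mechanic: declared rivalries pay a reward bonus,
# and confirmed harassment strips the status (and the bonus) away.
RIVAL_BONUS = 1.5  # reward multiplier for fighting a declared rival

class Rivalries:
    def __init__(self):
        self._pairs = set()  # unordered pairs stored as frozensets

    def declare(self, a: str, b: str) -> None:
        self._pairs.add(frozenset((a, b)))

    def revoke_for_harassment(self, a: str, b: str) -> None:
        # Bullying costs you the rivalry, and with it the extra rewards.
        self._pairs.discard(frozenset((a, b)))

    def reward_multiplier(self, attacker: str, target: str) -> float:
        # Killing randoms pays 1.0x; killing a declared rival pays more.
        return RIVAL_BONUS if frozenset((attacker, target)) in self._pairs else 1.0

rivals = Rivalries()
rivals.declare("GuildA", "GuildB")
assert rivals.reward_multiplier("GuildA", "GuildB") == 1.5
rivals.revoke_for_harassment("GuildA", "GuildB")
assert rivals.reward_multiplier("GuildA", "GuildB") == 1.0  # back to baseline
```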
Brendan Drain (@nyphur): Toxicity and trolling in online games and forums is nothing new, but it’s definitely peaked in competitive online team games like MOBAs and MMOs that force you to rely on other players to succeed. You can play perfectly well and still lose a game because someone on your team underperformed, or conversely, you can play badly and be tempted to blame the loss on your teammates. You’re practically encouraged to complain at your teammates and accuse them of being bad at the game, and the natural response is to get defensive and accuse them right back. How do we expect that to turn out?
League of Legends has seen a very high rate of players reforming and not reoffending when they’re punished for an infraction quickly and the punishment is mild. Automated punishments like chat restrictions for swearing and matchmaking cooldowns for abandoning a game are practically essential now, and I kind of like what Blizzard did in replacing “gg ez” with a funny phrase. What I’d like to see in the future is some kind of feedback system to let you know that when you report a player, it actually leads to something. When you report a player for misconduct in most games, the report just flies off into the ether, and you have no idea whether that player ever gets punished.
We need quick punishment decisions made on reported players, a series of mild but escalating punishments to give them a chance to reform, and developers willing to start taking verbal abuse in chat seriously as harassment. For some games I’d even pay for a premium opt-in matchmaking service where you only get matched against other people who have opted in and your reports are answered with feedback. I’d be very happy to be subjected to stricter rules around things like chat if it meant having a nicer community to play with.
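To sketch what that mild-but-escalating ladder plus report feedback might look like in practice (the rungs, durations, and notify hook are all assumptions on my part, not any shipping game’s system):

```python
# Hypothetical escalating-punishment ladder with reporter feedback.
PUNISHMENT_LADDER = [
    ("warning", 0),               # first offense: just state the rule
    ("chat_restriction", 24),     # hours of restricted chat
    ("matchmaking_cooldown", 72),
    ("temporary_ban", 168),
]

def punish(offense_count: int) -> tuple:
    """Return the sanction for the nth confirmed offense, capped at the top rung."""
    rung = min(offense_count - 1, len(PUNISHMENT_LADDER) - 1)
    return PUNISHMENT_LADDER[rung]

def resolve_report(reporter_id: str, offender_offense_count: int, notify) -> None:
    """Apply the next rung and, crucially, close the loop with the reporter."""
    sanction, hours = punish(offender_offense_count)
    notify(reporter_id, f"Action was taken on your report: {sanction} ({hours}h)")

# A third confirmed offense triggers a 72-hour matchmaking cooldown, and the
# reporter gets a confirmation instead of silence.
resolve_report("reporter_42", 3, notify=lambda uid, msg: print(uid, msg))
```

Mild first rungs keep the door open for reform; the feedback call is what turns reporting from shouting into the void into something players can trust.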
Brianna Royce (@nbrianna, blog): Riot Games has been throwing money at the toxicity problem for years and has tried everything from player judgment tribunals to de facto votekicks and personalized attention. What sticks with me from Riot’s many years of discussing its findings is that a lot of players exhibiting toxic behavior have no idea that what they’re doing is toxic. As a microcosm of toxicity in the real world and online, competitive gaming culture has been laced with bullying, name-calling, and cruelty for decades. Folks who grow up on that think it’s normal and acceptable, and they bring it into games with them until someone in authority tells them, point-blank: Dude, it’s not normal, it’s not OK, and there are repercussions for acting this way.
But let’s not pretend MMORPGs are immune just because we have non-competitive content too. We’ve seen a spate of developers from Daybreak to Trion attempt to clean up their communities in drastic ways, including threatening to ban people for their social media shenanigans. Sometimes it goes way too far into naming-and-shaming scenarios, such as the famous (and unsettling) example of Guild Wars 2 last year. While I understand the principle there, I’d much rather see developers set positive examples with their own staff, their own community management teams — praising good behavior, laying out clear policies, enforcing them justly, and being willing to spend the time and effort (read: money) sorting out redeemable offenders from the lost.
Beyond that? Communities are seeded by a game’s design long before they ever form. Games foster communities based on the activities they allow and incentivize. When you, for example, design a dog-eat-dog game and market it to all the hardcore wannabe badasses in the audience, don’t act surprised when that’s who you attract and when everyone else stays away. When you design a cooperative game where player success depends on working together, that’s exactly what you’ll get.
Justin Olivetti (@Sypster, blog): Social engineering in MMOs isn’t exactly my forte, so I’ll limit myself to saying that we need ways and tools to encourage each other toward good and helpful behavior. I liked how FFXIV let you nominate a particularly great teammate at the end for an extra reward, and I wish all games would give you the option to add helpful players to your friends list following a good run. Giving players more tools to run and promote fun events for the community is a must as well. And creating opportunities for large-scale cooperative projects that inspire and excite everyone involved has a bonding effect.
So basically, look for as many ways as possible to promote positivity, social bonding, networking, and good team behavior through game mechanics, and don’t hamper players who are looking to do the same.
Larry Everett (@Shaddoe, blog): As some of you know, I have run large MMO communities in the past — thousands of people on one website. It’s extremely difficult to host that many people hitting your site on a daily basis without some kind of toxicity settling in and ruining parts of it. But it can be combatted. When I set up a new community, or help others set one up, I don’t focus on moderators, and I don’t concern myself or the people running the site with how moderation will be handled — not at first. I encourage moderators not to be the police of the community; rather, I like the moderators in my online communities to engage in community discussions. I believe the job of the moderators is to facilitate civil discussion on the topics of the site, praise those who are doing it right (even if they disagree), and encourage others to do the same.
If I might praise Massively OP here a little bit… we have writers who engage with the community civilly and often. The writers aren’t just touting their opinions but starting discussions that need to be had about the genre of gaming we enjoy. We also have editors and moderators who actively take part in the discussions on the site. This allows them to see immediately whether a poster is personally attacking another commenter or one of the writers.
It’s a tough job and certainly one that is not appreciated enough. The best way to weed out toxicity in a community is to show the community how things should be done. Far too many community managers play the role of marketing specialist rather than that of community manager. Community managers shouldn’t engage the community only when the game has something new to sell or when something goes horribly wrong. Of course, they should put out fires and promote, but they should also be there when a player talks about reaching a new goal in the game. They should start topics about managing a guild or improving your keybinds. I think an old adage applies here: They won’t care how much you know until they know how much you care.
Your turn!