Anyone who’s been on the internet for more than a few minutes knows how quickly forums and comment threads can descend into toxicity — not that such a thing would ever happen around here — and community managers and moderators are constantly trying to figure out how to combat that problem. But according to a new report published on GamesIndustry.biz, the solution may be simpler than you’d think.
The article cites Creative Assembly’s Grace Carroll, who spoke on the subject at Develop:Brighton, as saying that on the Steam forums for the studio’s Total War series, the mere “visible presence of moderation” was enough to tone down the toxicity.
“If someone posts a really awful comment, and I reply . . . they’re like, ‘oh my god, I didn’t think you’d read it, I didn’t think you’d reply to it,'” Carroll says. “The attitude can turn from horrible to apologetic straight away.”
Take notes, game devs and community managers. You can check out the full summary of Carroll’s talk over at GamesIndustry.biz.
There’s some interesting stuff to be unpacked in a recent analysis of Conan Exiles that characterizes it as replete with griefing, racism, sexism, and general unmoderated player garbage. Equally interesting is the official response from Funcom, which is essentially “this isn’t an MMO so we’re under no obligations to moderate this stuff.” You can read that as any mixture of “we don’t want to hire moderation staff” and “we want money more than we want players to be happy” as you desire.
It’s true that Conan Exiles isn’t a full MMORPG. It’s also true that there are official servers with Funcom’s name on them, which means that there’s a legitimacy there. And it raises the interesting question of what obligations studios have to the players in this particular environment.
What qualifies as “griefing” covers a wide scope, and some of that behavior is part of the game at its core; after all, there’s plenty of griefing beyond PvP that makes a game like EVE Online what it is. And that’s not even counting servers that aren’t officially run by the development team. So what obligations do studios have to provide a griefing-free MMO environment? Does it apply only to official servers? Only to MMORPGs? Only to sufficiently large servers? When is moderation no longer the problem of the game’s owners?
This past week’s announcement by Valve stating that the company would not moderate or vet games that it sold except for ones that were “trolling” or straight-up illegal stirred no small measure of controversy among the industry and games community. While some have praised the company for creating an open store model that is free of moderation, many others see it as opening up the platform for titles full of offensive or hateful content, not to mention continuing to flood the store with poor quality games.
Valve hasn’t changed its stance on this, but the company did attempt to elaborate on what types of games it considers “trolling.” Long story short: It’s being pretty vague on the definition.
One example that it gave was the removal of a game that was made to generate outrage: “We rejected Active Shooter because it was a troll, designed to do nothing but generate outrage and cause conflict through its existence. In addition, the developer had been involved in numerous misrepresentations, copyright violations, and customer abuses. There are no second chances for Active Shooter, or its developers. And to be explicit, while the developer behind it was also a troll, we’d reject Active Shooter if it had been submitted by any other developer.”
If you’ve learned nothing else about Steam, you’ve probably noticed that Valve is actively disinterested in moderating, curating, or really interacting with the platform beyond collecting money from games sold there. So it’s surprising that Valve’s latest policy update somehow manages to be even less involved in moderation, stating that there will be absolutely no moderation except for games that are outright illegal or “blatantly trolling.” So adult-only games may be visible on the platform, and there will be no rules against hate speech, gambling scams, or any of the other problems people have asked the multi-billion-dollar company to deal with.
This has already attracted quite a bit of commentary, ranging from observations that permitting anything on Steam is itself a statement of Valve’s values, to arguments that the company is abdicating responsibility to keep its costs down, to questions about what this means for games that have had to cut content. But, hey, nudity is allowed now, because the only way to permit that is to say that absolutely everything is permitted. After all, if you allow bare breasts, how can you then say it’s not all right to have a group called “Kill All Trans [HIDEOUS SLUR] Now”? Obviously they’re exactly the same thing.
It’s fair to say at this point that Steam is an enormous part of the PC gaming market. It’s also fair to say that Valve has demonstrated very little interest in moderating the platform in any way, preferring algorithms to actually walking in and stopping review-bombing efforts (among other abuses). There’s no real algorithmic way to prevent hate groups from forming, but it does appear that Valve has gone through the Steam Groups and done one of its most aggressive banning passes to shut down hate groups.
Successful? Well, you won’t find a bunch of hate groups by searching for “school shooter.” You can, however, still find them; you just have to work a little bit harder at it. At this point, it seems that the only way these groups are really going to be removed from the platform altogether is if Valve really makes an aggressive project of moderating the platform, and that seems unlikely. But it’s a step in the right direction.
Streaming stuff can be pretty darn cool. We’re fond of our own streams, just by way of example. But no one likes a stream chat that’s an impenetrable wall of profanity, nor does anyone like spending so much time moderating chat that you lose an hour before getting half a step into the stream. Thus, Twitch’s new AutoMod tool makes life that much easier for moderators and prevents nastiness from seeping into the chat.
Streamers can configure AutoMod at several different levels; once in place, the system will flag certain comments as offensive ahead of time and require moderator approval before they go through. That means that if you love going into Twitch streams and saying the most offensively immature things you can think of, no one is going to see them but you and a thoroughly unimpressed moderator. And you can look forward to a nigh-inevitable ban, too.
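For readers curious what a flag-and-hold filter like this looks like under the hood, here’s a minimal sketch. The term lists, strictness levels, and function names are illustrative assumptions for this example, not Twitch’s actual AutoMod implementation:

```python
# Minimal sketch of a flag-and-hold chat filter, in the spirit of AutoMod.
# Flagged messages are held for moderator approval instead of being shown.
# The term lists and levels below are hypothetical, not Twitch's real ones.

FLAGGED_TERMS = {
    1: {"jerk"},            # mild filtering
    2: {"jerk", "idiot"},   # stricter level catches more terms
}

def filter_message(message: str, level: int) -> tuple[str, str]:
    """Return ("held", message) if the message contains a flagged term
    at this strictness level, otherwise ("shown", message)."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    terms = FLAGGED_TERMS.get(level, set())
    if words & terms:
        return ("held", message)   # routed to the moderator approval queue
    return ("shown", message)

print(filter_message("you absolute idiot", 2))  # held for review
print(filter_message("great stream!", 2))       # shown immediately
```

The real tool almost certainly uses machine learning rather than a simple word list, but the core workflow — intercept, hold, and wait for a human decision — is the same idea.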
Back in 2011, our former corporate overlords at Massively-of-old noticed that games like League of Legends were getting pretty damn popular and asked us to work them into the site. In order to incorporate them into an MMO blog without disrupting the existing MMO news coverage, we decided to put all of the news on games that may not fit the MMO definition into a new roundup-style column called Not So Massively. In the years that followed, the column kept track of dozens of online games in various stages of development, watched the MOBA genre mature, saw many games plod slowly into an early grave, and witnessed the e-sports explosion on a weekly basis.
It’s no secret that online gaming has been trending away from the persistent online universes of MMOs and toward the shorter session-based gameplay of MOBAs, action RPGs, and first-person shooters. With gaming preferences changing, it wasn’t long before Not So Massively became oversaturated with news each week and began drawing more traffic than some of the MMO news. Naturally, we’ve now adapted and started rolling MOBAs and other online games into our everyday news coverage. As we hit the end of 2015 and approach almost a full year since Massively was reborn independently as MassivelyOP, I’d like to look back at the past year and highlight the top ten most surprising and controversial Not So Massively stories of 2015, in no particular order.
Give another human being a creative tool with the instruction to make anything, and odds are that first creation is going to be some variety of dong. It’s just human nature. This is funny when it’s a bunch of adults but a pretty huge issue when you’re making a game meant for children. The developers of LEGO Universe have recently spoken up about the challenges of making a building game in which every creation had to be very closely scrutinized for… well, you get the idea.
In a world where games like Minecraft and Landmark have both taken off, it’s relatively obvious that creative building games are welcomed. The problem was that preventing the display of wee-wees was an absolute ironclad requirement of the game’s development; it couldn’t be automated, and the branding required constant hypervigilance. As a result, there was a huge cost associated with just moderating the game, extending even to the developers playing around with the toolset. As it turns out, there’s really no way to build software to automatically detect dangly bits.
Are you ready to head down into a cellar, get some rum, and become the lord or lady of rum in EverQuest II? Because that’s not really what the Rum Cellar campaign is about, just FYI. Still, you can find out for yourself on April 28th, when the campaign goes on sale for $14.99 as a stand-alone purchase. Players who want to pick this adventure up along with the Altar of Malice expansion – a required component to play through it – will be able to pick up both at a bundle price of $49.99 ($94.99 for the Altar of Malice Collector’s Edition). And, of course, subscribers get a 10% discount. It’s all in keeping with EQII’s new plan to move away from expansions in favor of DLC.
If you’re a little more focused on the community side of things, you’ll want to know that after the latest round of forum and Reddit drama, the community management team over on the official forums asked for player feedback on moderation policies and got… well… lots of feedback. The last post on that thread explains the conclusions and the useful information taken away by the moderation team. Community Manager RadarX writes,
We are going to initiate the edit post policy at minimum on the EQ2 forums.
We cannot re-institute the forum-admin email process based on the resources we have available. We’re discussing what alternatives we can provide.
We are doing a full review of our moderation process to ensure everyone on our team is fully trained and capable of providing consistent practice across our titles.
[Source: Rum Cellar Highlights, An Open Dialogue Regarding Forum Moderation; thanks to Kinya for the tip!]