Overwatch devs consider harnessing machine learning to fight toxicity

    

According to a new Blizzard interview on Kotaku this week, Overwatch toxicity has a new enemy: AI. Jeff Kaplan says that Blizzard has “been experimenting with machine learning” in order to teach its “games what toxic language is.”

As Kotaku points out, Blizz has already touted its successes in the war on toxicity, having claimed back in January that it had boosted reporting by a fifth and reduced chat abuse by almost that much. But its next move is to teach its AI to detect abuse before it’s ever reported. That’s the real trick, of course, since teaching AI context – the genuine and abusive versions of “GG,” for example – is much harder than it sounds.
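
For a rough sense of what “teaching a game what toxic language is” involves: the usual approach is supervised text classification, i.e., train a model on chat lines that players have already reported and let it score new messages. The sketch below is purely illustrative (scikit-learn, invented example data, an arbitrary flag threshold) and is not how Blizzard has said its system works; it also hints at why context is the hard part, since “gg wp” and “gg ez” look nearly identical to a bag-of-words model.

```python
# Minimal, illustrative toxic-chat classifier. The chat lines, labels, and
# threshold are invented for this example; Blizzard has not published how
# (or on what data) its models are actually trained.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical chat lines labeled from player reports (1 = reported as abusive).
chat_lines = [
    "gg wp everyone, close match",
    "gg ez, uninstall the game",
    "nice ult, that won us the fight",
    "you are trash, stop playing support",
    "good luck have fun",
    "report this idiot healer",
]
labels = [0, 1, 0, 1, 0, 1]

# Bag-of-words (with bigrams) plus logistic regression: the simplest thing
# that could plausibly work.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(chat_lines, labels)

# Score new messages; anything over the (arbitrary) threshold would be
# flagged for human review rather than auto-punished.
for line in ["gg wp", "gg ez noobs"]:
    p_toxic = model.predict_proba([line])[0][1]
    flag = "-> flag for review" if p_toxic > 0.5 else ""
    print(f"{line!r}: p(toxic) = {p_toxic:.2f} {flag}")
```

In practice, the interesting work is in everything this sketch skips: labels derived from noisy player reports, sarcasm, leetspeak evasion, and deciding what to do with a borderline score.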

Furthermore, Kaplan tells the publication, Blizzard is eyeing a future where it focuses as much on the “positive version of reporting” as on the negative.

We’re not quite to Minority Report, but it’s getting closer. As Massively OP’s Eliot put it in work chat this morning: Leave it to video game companies to treat human beings as an engineering problem.

Source: Kotaku

Reader
Bruno Brito

IT’S HAPPENING

Reader
Kickstarter Donor
Patreon Donor
Loyal Patron
zoward

Eliot’s right – It’s the hammer-and-nail problem. If the only tools you truly understand are engineering tools, you’ll try to use them to solve non-engineering problems, and likely fail in spectacular and/or hilarious ways.

I foresee a new sport amongst trolls: sneaking their payload past the AI.

Reader
J

What a waste of time and resources. At what point did the generational rollover happen and people become so thin-skinned? I’ve been playing competitive shooters on and off since the ’90s, starting with games like Quake and Jedi Knight. I never had a problem with some smack talk. I never had a problem with a bit of cursing or abusive language. If and when I thought it had gone too far, I already had the only tools I ever needed: block, squelch, or ignore.

Why do we need more than that? It’s going to be fucking amazing when the masses start learning how to jerk the AI around. That’s already happened several times in recent years.

Reader
Utakata

Block, squelch and ignore would also be considered “thin skinned” according to your argument, though. You seem to imply we should all roll over and just accept abusive behavior, because it’s what all the cool kids did back then.

Personally, I don’t think that was ever any more acceptable then than it is now. Folks might instead be becoming aware that “smack talk” really isn’t the best way to communicate…whether it’s at your boss, over dinner with your mom or with your gaming “friends”.

Reader
J

Smack talk, as you put it, will always be a part of any sport, whether it’s a video game or an RL sport. If anyone thinks some silly algorithm is going to stop that, they are mistaken. Blizzard’s proposed tools are a costly and inferior solution.

Smack talk can even be part of the fun. One person might find the language abusive, while other players consider it par for the course and even feed off that banter.

Instead of using existing and superior tools, we are talking about using a virtual moderator. I will pass on any game with that level of babysitting.

Ultimately I have the tools that I need, and I have had those tools for years. I feel like there is a lot of false outrage around all of this. I don’t believe that a large number of players are upset, as all of the news on this particular front is Blizzard-heavy and the competitive shooter realm contains more titles than Overwatch.

I look forward to the tales of false bans, players petitioning against their unjust punishments, and groups of people manipulating the AI (a misused term).

Reader
Utakata

My pigtails senses a disturbance in the goal post moving…

That said, this is not sports, and it’s not really acceptable behavior in sports either. So let’s not draw false comparisons with other activities; instead, focus on the problematic issues of using AI filters to deal with toxicity. That’s an entirely different matter from your sissy shaming, which just gives another good reason why toxicity in online gaming should be addressed.

Reader
J

Not a false comparison. Team gaming (esports) and team sports have many parallels.

Blizzard very much wants this product to be an esport, and they have been pushing that agenda for years now. StarCraft was the original 500 lb gorilla, with entire cable networks dedicated to it. They tried to push it with WoW for a while, but that fell flat. Now the same thing is going on with Overwatch.

As such, I think they have this growing desire to control every aspect of what people see and hear. They want the narrative to be in their hands, even more so when it involves any sort of tournament play, whether that’s a small Twitch tournament or something more massive.

Most other mainstream and physically oriented sports are simply watched by us. We don’t have a mic shoved in everyone’s face. If we did, we would hear some pretty crazy and vulgar things. While rare, you can sometimes catch a glimpse of this when the mic is a bit too close to a player or a coach at the wrong time. It is also the reason many events are on a slight delay. Professional team sports will always have a head-game element.

Now me. I’ve never thought watching others play for any length of time was particularly enjoyable. Same goes for things like soccer and football. I would much rather participate.

I still think they are barking up the wrong tree here, and others in the comments below have pointed out several other games with better and much easier-to-implement systems. Ultimately, if they do something stupid, the playerbase will either -check- them on it or simply go play something else. For now Overwatch is still a hugely popular title, but it doesn’t take much for a shooter to become old news. After all, Fort and PUBG are the new big kids on the block.

Ultimately my original opinion and stance are as they were. I don’t personally need anything beyond a mute button.

Reader
Utakata

There is no and was no justification for poor behavior, toxicity or “smack talk,” as you call it. We have arrived here today because reasonable folks are sick of it. Nor can bad behavior exist in a vacuum, especially with the advent of social media and such. If you want to go back to a magical time where you could smack talk all you wanted and nobody gave a crap, go live in the woods and cuss at the trees. /deal

Reader
J

You come back and let me know once Blizzard has saved the world from the bad people that hurt your feels with words. I can see where your mind is on the big picture, and I am happy it will never come to pass.

I don’t need a magical time for anything. I still exist in the present. You are the only one off in some fantasy world.

Also, take your dying social media and stuff it. The rest of the world has found out what a vile thing it really is over the past few months, though many had seen it for what it really was all along.

You be careful now before the massivelyop “AI” comes along and sees that you are online bullying me and gives you the boot. < sarcasm.

Reader
Randy Savage

[image]

Reader
Bryan Correll

This is exactly what I was thinking. Any machine intelligence exposed to online toxicity is bound to develop a hatred of humanity.

Reader
Grave Knight

Ugh, no, you’ll end up with a bunch of false positives, and there is a good chance it won’t be accurate at all when dealing with problem players.

Reader
CMDR ShrunkenQuasar

Provide it with enough data, and the number of false positives will be negligible to the point of being less than one percent. Humans are simplistic and tend to act in a predictable fashion, which makes it insanely easy for a computer to learn from and predict not only which players have been toxic, but which ones are likely to become toxic and should be put on a watchlist for closer scrutiny in the future.

Reader
Schmidt.Capela

It doesn’t even have to be negligible, it just needs to be less than the number of false positives you would get from employing humans instead.

Reader
Jacobin GW

Toxicity will always exist in a competitive environment since winning and losing can never just be one big bland experience.

They love to use buzzwords like machine learning and AI when all they’re really doing is banning words or phrases, which leads to tons of false positives.
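
To illustrate that false-positive point (this is just a toy, with a made-up word list, not anything from an actual moderation system), a naive banned-word filter that matches substrings will happily flag perfectly innocent chat:

```python
# Toy banned-word filter; the word list and messages below are invented
# for illustration and are not from any real moderation system.
BANNED = ["ez", "trash", "kys"]

def is_flagged(message: str) -> bool:
    # Naive substring matching: the classic source of false positives.
    text = message.lower()
    return any(bad in text for bad in BANNED)

print(is_flagged("gg ez"))                    # True: the intended catch
print(is_flagged("freeze the point!"))        # True: "ez" is a substring of "freeze" (false positive)
print(is_flagged("my aim is trash tonight"))  # True: self-deprecation, not abuse (false positive)
```

A model trained on context can do better than this, but that is exactly the part that is hard to get right.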

People who agree to take part in a game that involves murdering as many people as possible within the time limit should be able to handle some salt.

Reader
Slaasher

Yeah, it’s not “some” salt they are talking about here.

Reader
tiltowait

I’m not convinced there’s an abusive form of GG. I know it’s said sarcastically sometimes (like when one team gets stomped and had 1/2 their members afk or trolling) but it’s still a far cry from ez, or calling people trash.

I’ve had my own way of combating toxicity for a long time, and it works great. I mute people early and often, at first hint they are toxic. It’s great, works 100% of the time.

Reader
camren_rooke

Best user name evar.

Old school cool.

Reader
Scotty

This just isn’t going to go away until devs embrace the Island of Misfit Toys model.

Let the community decide who’s toxic and park them in their own queues with other Misfit Toys until they learn to play nice.

Reader
Grave Knight

I assume that’s like Dota 2’s Low Priority mechanic.

Reader
Sally Bowls

Leave it to video game companies to treat human beings as an engineering problem.

Now, now, this isn’t about solving the problem of human beings; it is just trying to use engineering to mitigate some of said problems.

Most companies, or at least the ones that will survive, are embracing the technology.

https://tech.slashdot.org/story/18/04/01/180217/non-tech-businesses-are-beginning-to-use-ai-at-scale

——

Although, in a darker vein, some AI/ML/DL is about “solving” the problem of some humans:

https://news.slashdot.org/story/18/04/02/2243228/military-documents-reveal-how-the-us-army-plans-to-deploy-ai-in-future-wars

Reader
Schmidt.Capela

Worth noting that the Overwatch devs implementing anti-toxicity tech are part of a Blizzard-wide taskforce focused on fighting toxicity in all of Blizzard’s games, as well as collaborating with other companies in the Fair Play Alliance. So, if this toxicity-fighting AI achieves good results, it’s quite possible it will be implemented in WoW, with similar features showing up in other online games across the industry.

And Blizzard isn’t taking this lightly. An Overwatch pro recently got a $3K fine, a three-game suspension, and a 10-week streaming ban imposed by his team for mocking Koreans on his personal stream.

Reader
Utakata

I’d prefer they didn’t use filtering systems. Tech is not yet advanced enough to decipher and/or discern nuance and context. Just ask the dead person behind the wheel of that driverless car. Just saying.