ToxMod is a new moderation tool that recognizes and reports voice chat hate speech in real time

    
Popcorn appropriately.

Moderating a game’s text chat isn’t easy by any means, but at least there are a variety of tools and algorithms that try to make managing it a bit easier. That’s not quite the case for voice chat, which can be not only more immediately harmful but also harder to act on quickly. Modulate Inc, a technology company using artificial intelligence to improve online voice chat, hopes to change that with the release of a tool known as ToxMod.

ToxMod is described as “the world’s first voice-native moderation service,” letting game developers detect toxic, disruptive, or otherwise problematic speech in real time and respond swiftly, as they deem appropriate. ToxMod uses machine learning models to understand not just what each player is saying but how they are saying it, taking into account emotion, volume, prosody, and more. These cues help ToxMod distinguish something said in a moment of frustration from the start of genuinely disruptive behavior. On top of its core voice recognition algorithms, ToxMod also learns more about a game’s community the more it’s used.
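
As a rough illustration of the kind of “how it was said” signals described above — this is a hypothetical sketch, not Modulate’s actual models or code — a toxicity pipeline might pair a transcript-based score with simple paralinguistic cues such as loudness and pitch:

    # Hypothetical sketch only -- not ToxMod's actual implementation.
    # Pairs a transcript-based toxicity score with simple paralinguistic
    # cues (loudness, rough pitch) extracted from one mono audio frame.
    import numpy as np

    def frame_features(samples: np.ndarray, sample_rate: int) -> dict:
        """Compute toy 'how it was said' features for one audio frame."""
        rms = float(np.sqrt(np.mean(samples ** 2)))            # loudness proxy
        ac = np.correlate(samples, samples, "full")[len(samples) - 1:]  # lags 0..N-1
        lag = int(np.argmax(ac[20:])) + 20                     # skip implausibly short lags
        pitch_hz = sample_rate / lag                           # crude prosody proxy
        return {"rms": rms, "pitch_hz": pitch_hz}

    def fused_toxicity_score(transcript_score: float, feats: dict) -> float:
        """Toy fusion: shouting nudges a borderline transcript score upward."""
        loudness_boost = min(feats["rms"] * 4.0, 1.0)
        return min(transcript_score * (1.0 + loudness_boost), 1.0)

The point of the toy fusion step is simply that the same words can score differently depending on how they were delivered, which is the behavior the article attributes to ToxMod.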

For those concerned about Big Brother levels of listening in, the software’s creators insist that ToxMod processes all data on each player’s device in real time, preserving player privacy and sending audio off-device to human moderators only when toxicity has been detected.
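
In rough terms, the privacy flow described above amounts to a local gate: nothing leaves the device unless the on-device model flags it. The sketch below is an assumption about how such a gate could look, not ToxMod’s actual API; the threshold, function names, and upload callback are invented for illustration.

    # Hypothetical sketch of the on-device privacy gate described above.
    # The threshold, names, and upload callback are illustrative assumptions,
    # not part of ToxMod's real API.
    from typing import Callable

    ESCALATION_THRESHOLD = 0.85  # assumed cutoff for escalating to moderators

    def handle_voice_clip(clip: bytes, local_score: float,
                          upload: Callable[[bytes, float], None]) -> None:
        """Keep clips on-device unless the local model scores them as toxic."""
        if local_score >= ESCALATION_THRESHOLD:
            upload(clip, local_score)   # only flagged audio leaves the device
        # Below the threshold, the clip is discarded locally and never uploaded.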

source: press release