Moderating a game’s text chat isn’t easy by any means, but at least a variety of tools and algorithms exist to make managing it a bit easier. That’s not quite the case for voice chat, which can be both more immediately harmful and harder to act on quickly. Modulate Inc, a technology company using artificial intelligence to improve online voice chat, hopes to change that with the release of a tool known as ToxMod.
ToxMod is described as “the world’s first voice-native moderation service,” letting game developers detect toxic, disruptive, or otherwise problematic speech in real time and respond swiftly, in whatever way they deem appropriate. ToxMod uses machine learning models to understand not just what each player is saying, but how they are saying it, taking into account emotion, volume, prosody, and more. These signals help ToxMod distinguish something said in a moment of frustration from the start of something genuinely disruptive. On top of its core voice recognition algorithms, ToxMod also learns more about a game’s community the more it is used.
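The idea of weighing what is said against how it is said can be sketched in a few lines. Everything below is hypothetical — the feature names, weights, and threshold are illustrative stand-ins, not Modulate’s published model:

```python
# Toy sketch of combining transcript content with vocal delivery.
# All features, weights, and thresholds are hypothetical; Modulate
# has not published ToxMod's actual scoring model.

def toxicity_score(transcript_score, volume, pitch_variance, anger_prob):
    """Blend what was said (transcript_score, 0..1) with how it was said."""
    # Delivery signal: emotion, loudness, and prosody, each capped at 1.0.
    delivery = (0.4 * anger_prob
                + 0.3 * min(volume, 1.0)
                + 0.3 * min(pitch_variance, 1.0))
    # Content dominates, but delivery can push a borderline clip over.
    return 0.6 * transcript_score + 0.4 * delivery

def should_flag(score, threshold=0.7):
    # Only escalate clips whose combined score crosses the threshold.
    return score >= threshold

# A heated but harmless outburst vs. sustained verbal aggression:
frustration = toxicity_score(0.3, 0.9, 0.8, 0.6)  # mild words, loud voice
harassment = toxicity_score(0.9, 0.9, 0.8, 0.9)   # abusive words, loud voice
print(should_flag(frustration), should_flag(harassment))  # → False True
```

The point of the split weighting is the behavior the article describes: loud, emotional delivery alone isn’t enough to flag a clip, but it amplifies genuinely toxic content.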
For those concerned about Big Brother levels of listening in, the software’s creators insist that ToxMod processes all data on each player’s device in real time, preserving player privacy and only sending audio off-device to moderators once toxicity has been detected.
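That privacy flow — analyze locally, transmit only flagged clips — can be sketched as a simple gate. The function names and the stand-in scoring below are illustrative assumptions, not ToxMod’s actual API:

```python
# Sketch of the on-device gating described above: audio is analyzed
# locally, and only clips that trip the detector are ever uploaded.
# analyze_on_device and moderate are hypothetical names.

def analyze_on_device(audio_clip):
    """Stand-in for local inference; returns a toxicity score in 0..1.

    Here we just pretend louder clips score higher, purely for illustration.
    """
    if not audio_clip:
        return 0.0
    return min(sum(abs(s) for s in audio_clip) / len(audio_clip), 1.0)

def moderate(audio_clip, upload, threshold=0.7):
    """Run detection locally; call upload() only when toxicity is detected."""
    score = analyze_on_device(audio_clip)
    if score >= threshold:
        upload(audio_clip, score)  # only flagged audio leaves the device
    # Otherwise the clip is discarded locally and never transmitted.

uploaded = []
def upload(clip, score):
    uploaded.append((clip, score))

moderate([0.05, 0.10, 0.05], upload)  # quiet clip: stays on device
moderate([0.90, 0.95, 1.00], upload)  # loud clip: flagged and sent
print(len(uploaded))  # → 1
```

The design choice worth noting is that the raw audio stream never needs to reach a server: the decision happens client-side, so moderators only ever receive the evidence for clips the model already flagged.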