The MOP community has been discussing moderation in MMORPGs quite a bit this week, and now here comes Discord to add another angle to the conversation. The chat company has announced a fresh toolset, AutoMod, to help chat communities moderate themselves. It's not meant to do all of the moderation work for a server; it simply lets hosts set their own flags and automated responses.
“At launch, AutoMod comes equipped with keyword filters that can automatically detect, block, and alert you of messages containing harmful words or phrases before they’re ever posted in your #text-channels, Threads, and inside your Text Chat in Voice. You can even have users who try to post harmful words or phrases be Timed Out automatically, so they won’t be able to continue posting until you’re back. If you’ve chosen to have AutoMod send you alerts about flagged messages, you can specify it to post the alerts to a text channel of your choice. There, your moderation team can view a flagged message to determine the best course of action, whether it’s removing the message itself, timing out the user who tried to post it, or allowing it to remain.”
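For server admins curious what that looks like under the hood, Discord exposes AutoMod through its public API: a rule pairs a keyword trigger with block, alert, and timeout actions. Here's a minimal sketch in Python of what such a rule payload looks like, assuming the documented v10 REST endpoint (`POST /guilds/{guild_id}/auto-moderation/rules`); the keyword list and channel ID below are made-up placeholders, not real values:

```python
import json

# Hypothetical values for illustration -- swap in your own words and channel ID.
BLOCKED_WORDS = ["nft", "crypto giveaway"]
ALERT_CHANNEL_ID = "123456789012345678"

def build_automod_rule(name, keywords, alert_channel_id, timeout_seconds=600):
    """Build a payload for Discord's AutoMod rule endpoint (API v10).

    Per the documented enums: trigger_type 1 = KEYWORD; action types:
    1 = block the message, 2 = send an alert to a channel, 3 = time the user out.
    """
    return {
        "name": name,
        "event_type": 1,    # MESSAGE_SEND
        "trigger_type": 1,  # KEYWORD filter
        "trigger_metadata": {"keyword_filter": keywords},
        "actions": [
            {"type": 1},  # block the message before it's posted
            {"type": 2, "metadata": {"channel_id": alert_channel_id}},
            {"type": 3, "metadata": {"duration_seconds": timeout_seconds}},
        ],
        "enabled": True,
    }

rule = build_automod_rule("No crypto spam", BLOCKED_WORDS, ALERT_CHANNEL_ID)
print(json.dumps(rule, indent=2))
```

Actually creating the rule would mean POSTing that payload with a bot token that has the Manage Server permission; most folks will just use the toggles in their server's settings instead.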
Readers will know this is pretty similar to some of the built-in WordPress moderation tools that we ourselves use here on MOP, so this doesn't seem unreasonable, nor does it rely on pure automation to do all of the heavy lifting of keeping a community clean (which wouldn't work anyway). Even so, malcontents usually find ways around these tools eventually. Perhaps the first thing Discord users will want to do is pop in some crypto-blocking keywords, as the NFT spam over there is kinda out of control.