Here’s how Discord says it’s been addressing toxicity and spam

    

It’s nearly impossible to be a gamer in 2019 and not be at least familiar with Discord, the ubiquitous chat program that’s replaced, by turns, voice chat, text chat, even guild chat, and for some folks, Steam. That also means it’s a prime destination for the worst kinds of humans. And you might be wondering what exactly a company with 250M users does to address that. This is what Discord’s latest transparency blog post is all about.

The company says it received over 50,000 reports in just the first three months of 2019, as users filed complaints on everything from harassment, threats, and doxxing to self-harm, hacking, and spam.

“In the investigation phase, the Trust and Safety team acts as detectives, looking through the available evidence and gathering as much information as possible,” Discord says. “This investigation is centered around the reported messages, but can expand if the evidence shows that there’s a bigger violation — for example, if the entire server is dedicated to bad behavior, or if the behavior appears to extend historically. We spend a lot of time here because we believe the context in which something is posted is important and can change the meaning entirely (like whether something’s said in jest, or is just plain harassment).”

Punishments come in multiple tiers, starting with just removing content and issuing warnings and escalating to permabans – or reports to police. According to Discord, most spam reports check out and are summarily actioned, but others, like self-harm and “exploitative content” complaints, often result in no action, either because they’re fake, malicious, mislabeled, missing info, or insufficient for punishment. The numbers seem to show that spam is the biggest problem by kind of a lot – and the remaining toxicity is much smaller than you might guess.
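Discord doesn’t publish the internal logic behind these tiers, but purely as an illustrative sketch (every name below is hypothetical, not Discord’s), the escalation ladder described above could be modeled something like this:

```python
from enum import IntEnum

# Hypothetical tiers, ordered from mildest to most severe, mirroring the
# escalation path described in the article (content removal -> warning ->
# permaban -> police report). None of these names come from Discord.
class Action(IntEnum):
    NO_ACTION = 0
    REMOVE_CONTENT = 1
    WARN_ACCOUNT = 2
    PERMANENT_BAN = 3
    REPORT_TO_POLICE = 4

def escalate(current: Action) -> Action:
    """Step one tier up the ladder, capping at the most severe action."""
    return Action(min(current + 1, Action.REPORT_TO_POLICE))

# Example: repeated violations walk an account up the ladder.
action = Action.NO_ACTION
for _ in range(3):
    action = escalate(action)
print(action.name)  # PERMANENT_BAN
```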

“For perspective, spam accounted for 89% of all account bans and is over eight times larger than all of the other ban categories combined. […] Amongst all other categories, we banned only a few thousand users total. Relative to our 50 million monthly active users, abuse is quite small on Discord: only 0.003% of monthly active users have been banned. When adjusted for spam, the actual number is ten times smaller at 0.0003%.”
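Taking the quoted figures at face value, they check out with some quick arithmetic (the 50 million monthly active user count comes from the quote above; everything else is simple division):

```python
# Back-of-the-envelope check of the figures Discord quotes above.
mau = 50_000_000            # monthly active users, per the quote

spam_share = 0.89           # spam = 89% of all account bans
other_share = 1 - spam_share
print(spam_share / other_share)   # ~8.1, i.e. "over eight times larger"

# Headcounts implied by the quoted ban rates:
print(mau * 0.003 / 100)    # 0.003% of MAU banned -> 1,500 accounts
print(mau * 0.0003 / 100)   # spam-adjusted 0.0003% -> 150 accounts
```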

Source: Discord
Reader
MilitiaMasterV

‘Self-harm results in no action’

Uhm. Inaction on that is actually something that can lead to a person’s death. Therefore, if it comes out that you COULD have done something to help stop it and you didn’t, you can be held criminally liable. I forget the exact charge, but it’s similar to telling someone to go off themselves and them actually doing it…which is also something people are being charged for nowadays…

Reader
John Kiser

You cannot be charged if you are a private individual. You have no legal responsibility to do anything if someone is going to inflict self-harm unless you are a therapist, police officer, teacher, or someone in a position of authority over that person. A therapist needs to make a reasonable response to prevent anything like this.

You can’t think of the charge because quite frankly none exists. The only time you can be charged is if you are in a position of authority and/or you are goading the person into offing themselves, like that one teen girl did to that teen boy.

You may be thinking of negligent homicide, but it isn’t a charge that sticks or even gets filed unless you meet one of the criteria above. Even as a parent you cannot be charged if your child tells you they are going to kill themselves and you do nothing.

I’m really unsure where you are getting your legal ideas from, but it sounds like maybe you watch too many crime dramas, where this sort of thing might exist in those worlds but doesn’t really exist in real life.

https://law.stackexchange.com/questions/16670/who-within-the-united-states-would-have-a-legal-obligation-to-report-or-stop-a

Reader
MilitiaMasterV

‘Even as a parent you cannot be charged if your child tells you they are going to kill themselves and you do nothing.’

Pretty sure that would fall under one of the numerous ‘child neglect’ laws…so I don’t think you’re right in what you’re saying.

I’m not a lawyer, so I can’t say ‘for certain’, I just read a LOT about various cases IRL and yes, I have watched quite a few shows on the subject also.

I really do believe they can/would charge you if it was shown you had a chance to do something and didn’t. But proving ‘intent’ in a court of law is another matter…so even if they charged you, it may be tossed out because of lack of malicious intent. Regardless, inaction on the issue isn’t something people should be ‘for’.

Reader
John Kiser

You cannot be charged unless you are in a position of authority and fail to act. You could potentially be charged if your child is underage and you are there to do something about it; however, you cannot in most circumstances be charged simply for knowing that someone says they are going to self-harm, as it simply doesn’t work out like that. There aren’t real cases where people have been charged unless they attempted to assist in the suicide themselves.

Not acting, unless you are a teacher, police officer, or therapist/psychologist/psychiatrist, won’t really result in much of anything, particularly online. Suicide isn’t considered a crime or illegal in most states for that matter, and the places where it might be are about the only possible places you could be charged if you don’t meet the criteria of a person of authority. Basically, unless you are a caregiver of some sort, a therapist, a doctor, a police officer, or the like, you cannot be charged.

And inaction on the issue here means that someone saying they are going to hurt themselves is not a reason to BAN them from the service. You will note that Discord here is talking about action taken against people/servers etc. A report that someone claims they are going to self-harm isn’t getting that person banned, because the reports people file for it are often fake, mislabeled, missing info, or insufficient for punishment.

If they took action on every claim of self-harm, a shit ton of people would be banned unfairly from Discord. People exaggerate shit constantly and say shit. People also maliciously report, etc. You need to understand that not acting in this case isn’t the same as doing nothing about people who are genuinely about to self-harm. There is no way for Discord to know for certain what is real and what is not, they have no legal obligation to report that kind of thing to authorities, and doing so could actually be massively disruptive. Not to mention that it would be rather impossible for them to report much of anything: they have an IP address, your username, and maybe your ISP, and that’s assuming you aren’t using a VPN to connect in the first place.

The best they could do in that situation would be to report it to police based on the IP address and hope the report is accurate, though again the police have no real legal authority to attempt to stop it in most states. But I think you’re confusing inaction here: Discord receives a lot of reports of self-harm, and it is stating that action isn’t taken against the people reported because most of the reports are fake, etc.

Reader
MilitiaMasterV

Yes, I agree they shouldn’t be ‘banned’ for talking about self-harm. But this is a platform that is now becoming known as a place where mass shooters congregate…

{That’s literally how the media is currently portraying Discord to those who have never heard of it…and if you’re not a gamer, you may have zero idea what the platform is or what it’s for…and the name of the service points to a ‘lack of agreement or harmony (as between persons, things, or ideas)’ or ‘active quarreling or conflict resulting from discord among persons or factions: strife,’ which doesn’t lead people toward thinking good things…}

…and when some people go and talk about harming themselves or others on the platform, this type of stuff shouldn’t be ‘ignored’ or taken lightly either. Police do have the ability to ‘find you,’ even online…even with precautions you may have taken, VPN/etc. The internet may be giving people a false sense of anonymity nowadays, which they are using to be vicious and cruel to others or blasé in their attitude toward someone speaking about harming themselves.

I’d also point out that sending cops to a person who is talking about self-harm is liable to end with the person being harmed anyway, as most police are untrained in how to deal with people with ‘issues’ of that kind. Only in some areas do the police have ‘specially trained’ groups to deal with folks talking like that.

Also, yes, there are people filing ‘fake’ reports a lot, and those types of people often aren’t receiving punishments for their actions, hence why they continue to do it.

Still, at the end of the day, my main point is that people shouldn’t just ignore stuff like this because that apathy is not good for society as a whole. Not caring about your fellow people is a bad sign.

Reader
John Kiser

Except that if there were a credible report of a mass shooter, they’d do something. They said that reports of self-harm and “exploitative content” complaints often have no real action taken on them. When Discord says here that it isn’t taking action, it means specifically in those instances, most of the time because the reports are primarily fake or malicious.

You seem to think self-harm means harming other people or something, and I really am having trouble figuring out why you think self-harm and “exploitative content” complaints fall into the same type of scenario as hate speech or general toxicity.

The point is that Discord takes no action on that shit because it is not their place to do so, and they should not be held liable for it. Also, police could attempt to get at a VPN, but you’d have to go through a long list of proper procedure to do so, which would involve subpoenaing the VPN’s records, and good luck doing that if the VPN is a foreign company without involving federal stuff. A VPN will only lead back to the VPN unless they keep logs, in which case investigators would need to track routes for where things are ultimately delivered, and again this involves way more procedure than grabbing an ISP-provided IP does.

Not to mention Discord makes no claim of doing nothing when someone is reported for potentially going out and harming others. Discord can only respond when shit is reported. I’ve seen friends of mine banned because entire servers and their members were nuked from Discord over what some members on that server did, so they do take action if people are actually talking about harming others.

Any online community can end up a place where people talk about harming others. If they don’t use Discord, they will just switch to self-hosted Discord alternatives, many of which at this point mirror the feature set of Discord minus bots. We shouldn’t ignore it if they say they will harm others and it is reported, but that isn’t what Discord is saying they don’t do. Discord is saying that reports of self-harm and exploitative content usually aren’t acted on. They don’t just flat-out ignore it, but action isn’t taken most of the time.

Reader
___

I think 10 years ago people were less toxic. I’m starting to think that everyone who is being toxic is so bored by something that they find enjoyment in acting toxic.

Reader
Tuor of Gondolin

I think that 10 years ago, people had thicker skins.

Reader
kalech

People haven’t changed. People have the same thin or thick skin as they’ve always had; it’s just that the visibility of bullying is changing and victims are allowed to have a voice now. You just couldn’t, and didn’t want to, hear them before.

Couple that with the fact that the internet gives us a much broader platform, and toxicity/bullying becomes much more visible. People who are fine with bullying also get very defensive, because their own shitty and unsympathetic behaviour is becoming more visible as well, hence the ol’ “People used to have thicker skin” argument popping up left and right.

Reader
Xenus

I think the average internet user age is dropping. Most toxicity is actually childish. Also, people love negativity and creating or being part of dramas. Constructive criticism won’t get you enough clicks, but outrage/ranting on YouTube makes you money.

Reader
Random MMO fan

10 years ago there were fewer things to be toxic about. The US was not so divided over social issues, Russia had not invaded territory that is recognized by the majority of the world as belonging to Ukraine, and games were less toxic. For example, we did not have Overwatch – a game that breeds toxicity by design (only 6 players per team, who constantly argue over which hero each of them should choose) – and many people were still enjoying playing on 32-player TF2 servers where no one gave a fuck which class you chose or whether you were going for the objective or doing an emote dance with a player from the opposite team in the lower tunnels of the 2fort map ;-)

Reader
___

I remember similar things about a game called Alliance of Valiant Arms. People were more focused on playing and enjoying it rather than polluting the chat with insults and whatnot.

Reader
John Kiser

Oh please. We had a ton of varying FPS games that could breed toxicity. The problem is entirely that gaming is becoming more and more mainstream, which in turn attracts a larger percentage of douchebags to gaming, people who are generally toxic individuals whatever their age.

The problem with shooter games has become that we can no longer self-moderate with private servers, which in turn breeds toxicity in those communities at a much higher rate. You can’t say games like Overwatch breed it when a mostly PvE, cooperative game like WoW has one of the most toxic gaming communities out there.

The sense of entitlement among younger gamers is also a big issue, and even older gamers who only got into it in the last decade are more of a problem. A lot of these are people who used to rag on us for gaming when we were younger, and now they, with those attitudes and the like, are all over gaming.

Bad game design that lets publishers/game devs control the lifespan of a game, along with bringing in more and more mainstream people, is what leads to a huge slew of the current issues. In the past most of us were like-minded and far, far less toxic because of this, but we can’t say toxicity was never there or that games never bred it.

UO back in the day had a huge problem with pkers (player killers) that was so bad players were leaving in droves until a PvE ruleset was made. That is toxicity in and of itself, and a lot of the pker crowd are and always have been toxic. You get called a carebear if you point out that a lot of them do it just to be toxic and troll or grief/harass other people. You get called a carebear if you point out that PKing isn’t actually PvP, because most of the time pkers are targeting individuals who won’t or can’t fight back, thus removing the ‘versus’ entirely (just because you are attacking someone else doesn’t mean you are actually in a fight that merits a ‘versus’ label).

The toxicity has been there for a long, long, long time now, and it is just getting bigger and worse because way more people, and way more mainstream-type folks, are gaming now.

Reader
Arktouros

That’s about the amount of toxicity I’d expect to be reported.

Most people hanging out in communities are there because they are alike or share similar interests. If they didn’t, they’d probably leave. There’s very little reason for people who have different interests to interact or mingle with one another.

This is in contrast to something like a game, where you have multiple groups of people from multiple cultures (online and real) with different values and ways of interacting with one another.