EA’s new Positive Play Charter is basically ‘don’t be a dick’ in list form

Whether it solves toxicity is another story

    

One of the ideas I’ve worked a long time to internalize is that when people say they’ve changed, I should give them time to walk that new walk before dismissing it. I’d like to do the same for EA, which this week posted its new Positive Play Charter spanning all of its games – basically an anti-toxicity declaration.

“At EA, we believe in the power of positive play,” the company says. “Being part of a gaming community should be positive, fun, fair, and safe for all. Like with most communities, we have positive play guidelines to help make sure our games and services are an enjoyable experience for all players. Whether you’re new to gaming or have been an active player for years, we need your help to make this a community we all want to be part of.”

The list of rules is more or less Wheaton’s Law or the platinum rule, just spelled out in more detail: treat others as they would like to be treated, no racism, no sexual harassment, no homophobia, no doxxing, no impersonation, no abusing the report system, no harassing the mods, no cheating, no griefing, no gold selling, no account trading, no piracy, no illegal activities, and so on. And just to take all the fun out of everything, they also want to make sure you don’t have an excessive potty mouth.

Of course, there’s no point to having rules if there’s no punishment for breaking them, right?

“These guidelines apply to all EA games, services, and platforms. If you break any of our Community Guidelines or violate our User Agreement, we may place restrictions on your account and revoke your access to certain or all EA Services. A short-term ban or suspension is our way of preventing further disruptive behavior and asking you to reexamine how you behave. We’re not mad at you; you just need a break. […] If you have repeat or severe instances, we may permanently terminate your account. This includes all of your EA Accounts if you have more than one account.”

How will it do all this across so many games and studios? I guess we’ll see, won’t we?



Reader
memitim

“no piracy” Heh, snuck that one in there didn’t you?

Reader
Bruno Brito

We may (…)

We may. But considering we’re also a soulless corporation, we may not.

Nah, just pulling their leg. EA has a lot of issues, but in this specific area, they do seem to be trying. Let’s see if it achieves anything.

Especially considering how they tend to go all-in on profit and treat their customers like garbage, and somehow can’t see that this also breeds toxic feedback.

Reader
Zuldar

no cheating, no griefing, no goldselling

As long as they follow their own rules, then sure, sounds good.

Reader
Skor

Back in the day we had this anti-harassment command that everyone had access to. It was /ignore. Maybe they should bring that back and give the power back to the individual.

Reader
Bruno Brito

I didn’t know that back in the day, laws and rulesets for behavior didn’t exist.

Reader
Crowe

I skimmed through this — maybe a bit too fast. I didn’t see anywhere if they mentioned that “it’s ok in a three way.”

Reader
Arktouros

I guess? Just more policing action.

What matters isn’t what’s against the rules, or even whether there are consequences for breaking them, but how strictly they’re enforced. This is where companies fail almost every time, because enforcement requires people to look at things, and hiring people to look at things eats into profits.

Fundamentally, though, I don’t think this will really curb or change anything. The root of why toxicity occurs is never addressed by policing actions, which just serve to cover up the problems and pretend they don’t exist.

Reader
angrakhan

I would think with modern data analytics technology you could for the most part automate this. I’m not suggesting automate it to the point the software executes the ban. That’s where you’ll run into issues. Rather have the software generate the “naughty list” report based on player activities and have a human put eyes on the chat logs or whatever to make the final judgement call. Yes, it takes more people than zero, but you wouldn’t need an army like you would if you were just going to rely on real time GM moderation in games.
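A minimal sketch of the kind of report angrakhan describes — entirely hypothetical, with invented field names and thresholds — might simply aggregate incoming abuse reports and hand a ranked review queue to a human moderator:

```python
from collections import Counter

def build_review_queue(reports, min_reports=3):
    """Aggregate abuse reports per player and return (player, count)
    pairs over the threshold, sorted worst-first. The output is a
    queue for a human to review -- nothing here bans anyone."""
    counts = Counter(r["reported_player"] for r in reports)
    flagged = [(p, n) for p, n in counts.items() if n >= min_reports]
    return sorted(flagged, key=lambda pair: -pair[1])

# Toy data: three independent reports against one player, one against another.
reports = [
    {"reported_player": "griefer42", "reason": "harassment"},
    {"reported_player": "griefer42", "reason": "slurs"},
    {"reported_player": "griefer42", "reason": "harassment"},
    {"reported_player": "newbie7", "reason": "afk"},
]
print(build_review_queue(reports))  # [('griefer42', 3)]
```

Only the player with enough independent reports surfaces; the single complaint never reaches a moderator’s desk.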

Reader
Arktouros

We’re getting there slowly, the biggest problem is giving machines context for things that aren’t inherently obvious.

A lot of the ideas and ways we use language are constantly shifting. The classic example is the word “retard,” which was once the clinical term, alongside idiot, moron, and imbecile. After a few decades of people using it disparagingly, we now say “disabled.” Does the machine flag the guy calling someone a retard but not the person who says the other player is mentally disabled?
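The context problem is easy to demonstrate with a toy blocklist filter (the blocklist and messages here are invented for illustration): it catches the slur token but waves through an equally disparaging phrasing that avoids it:

```python
BLOCKLIST = {"retard"}

def naive_flag(message):
    """Context-blind filter: flags a message if any token is on the
    blocklist, with no understanding of intent or phrasing."""
    tokens = (w.strip(".,!?'\"").lower() for w in message.split())
    return any(t in BLOCKLIST for t in tokens)

print(naive_flag("you absolute retard"))                         # True
print(naive_flag("that guy plays like he's mentally disabled"))  # False -- equally insulting, sails through
```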

There are other scenarios as well, such as sarcasm. Does the machine flag someone for telling a player who is doing poorly how great they’re doing, in what humans would recognize as a clearly sarcastic, mocking manner?

For that matter, how does it determine the difference between someone playing badly on purpose to ruin the match for everyone else and someone who’s just having an off game?

There’s really a whole list of things. You can catch the obvious stuff, such as racial or homophobic slurs, but people will just adjust around that and find new and different ways to be toxic. Since nobody is addressing the root reasons people are toxic (nor do I think it’s functionally feasible to do so), it’s all a losing game of catch-up, and other humans remain the most effective, if also the most costly, answer.

Edit: Even this post got auto-moderated, and it’s clearly not hateful in any way or context :)

Reader
angrakhan

Well, I’m talking about data analytics beyond just passing chat through a blacklist filter and flagging anything that matches a word on the list. That’s what got you automodded, and that kind of thing is very simplistic. Facebook, for example, knows a shocking amount about you: it can tell not only where you live, but whether you’re a stay-at-home mom or a career dad, whether you’re liberal or conservative, and what religion you are, without ever asking you to fill out a demographic questionnaire. It does this by looking at your post history, your likes and dislikes, who your friends are, which friends you unfriend/unfollow/block, and so on. I have friends who own their own businesses and advertise on Facebook, and the specificity with which you can target ads is quite shocking.

If you wonder why Facebook is free… it’s because you’re not the customer… you’re their product. They sell you. I digress!

Back to gaming!

As far as sarcasm goes, in my opinion that’s a “sorry, that’s life” issue. People are jackasses. People are definitely jackasses in any scenario where they’re anonymous. I mean, do you really want a game to become that level of thought police? At some point people have the freedom to speak their minds. If someone is harassing you, just mute or block them. Problem solved.

You can use data analytics to track someone throwing matches. For example you can track a person’s average performance in matches and flag matches that are well below par for the player, then match that up with who they are teamed with or against at the time. Is there a correlation between the low score and the opponents? Like 90% of the time when the player’s score is below X the opponents happen to be the same group of players. Well, that starts to look like someone throwing a match for the benefit of the opponents. Again, you would need human intervention to verify.
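That correlation check can be sketched in a few lines — the data shapes, the 50% “below par” cutoff, and the 90% co-occurrence threshold are all invented for illustration, and the output is only a candidate for human review:

```python
from collections import Counter

def throw_suspects(history, par_ratio=0.5, cooccur_ratio=0.9):
    """history: list of {'score': int, 'opponents': [names]} for one player.
    Flag matches scoring well below the player's average, then return
    opponents who appear in nearly all of those low matches -- a pattern
    consistent with throwing, to be verified by a human, never auto-punished."""
    if not history:
        return {}
    avg = sum(m["score"] for m in history) / len(history)
    low = [m for m in history if m["score"] < avg * par_ratio]
    if not low:
        return {}
    counts = Counter(o for m in low for o in m["opponents"])
    return {o: n / len(low) for o, n in counts.items() if n / len(low) >= cooccur_ratio}

# Toy history: both far-below-par games happen against the same opponent.
history = [
    {"score": 100, "opponents": ["a", "b"]},
    {"score": 100, "opponents": ["c", "d"]},
    {"score": 100, "opponents": ["e", "f"]},
    {"score": 20,  "opponents": ["smurf1", "g"]},
    {"score": 15,  "opponents": ["smurf1", "h"]},
]
print(throw_suspects(history))  # {'smurf1': 1.0}
```

The single shared opponent across every dip is exactly the 90%-style correlation described above; incidental opponents in one bad game fall under the threshold.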

“Whole list of things” is rather broad and generic, but I think you might be surprised how many of them can be addressed through data analytics.

Reader
Kickstarter Donor
squid

EQ had their “Play Nice Policy” for years…until DBG admitted that they couldn’t/wouldn’t enforce it and directed players to “police yourselves.” Now the only players who get banned are the ones trying to police themselves the only way they can in a PvE game—training the botters.

Reader
Arktouros

Yeah, games like EverQuest show what happens when you don’t design some sort of conflict resolution into the game. Most PvE games later solved this by removing competition over limited resources such as monsters and making rewards shared by everyone who participates. Other games rely on PvP mechanics to do the bulk of the lifting. But going back to no solution at all is incredibly frustrating, and players will simply use whatever methods they can to ruin things for others.

Fun fact about New World, this happens there now as well, and there’s plenty of reports of people who train each other when other people come into an area they’re farming :)

Reader
Patreon Donor
Life_Isnt_Just_Dank_Memes

They’re literally gonna release their 2nd Star Wars game in a row without any microtransactions. I guess miracles are real!

Reader
Arktouros

Miracle, or was the public backlash so huge last time that Disney told them not to do it or they’d pull the license? No corp wants a bad look, least of all one running a transmedia franchise.

Reader
McGuffn

“Do you believe in Backlash? Yes!” – Al Michaels, 1980 (Updated for 2020).