Back in 2014, something happened in the gaming space, something whose impact was so damaging that even almost 10 years later, I realize that I still hesitate to use its name in print. It’s still a bit like playing Bloody Mary in the mirror, except you type this word out and you get a deluge of “actually it’s about ethics in games journalism” comments. But it isn’t, and it never has been. As a cultural zeitgeist in gaming, Gamergate was laser-focused on protecting toxicity and radicalizing gamers towards extremist positions. And gaming has never been the same.
I’ll bet most folks reading this article will likely respond to the idea we’re going to talk about today – that extremists are drawn to gaming spaces for organization and recruiting – with a “Yep, that checks out.” Extremists use games and gaming-adjacent spaces to “find like-minded individuals” and recruit for their ideologies. They blend in with the casual racism, bigotry, hatred, and homophobia already sadly present in gaming spaces, then take advantage of underfunded or laissez-faire content moderation to introduce more extreme ideas to unwitting gamers. The methodology isn’t novel, but the medium in which it’s happening is; so many people still see gaming as an unserious “kids’ activity” that it’s hard for folks who aren’t gamers to believe this is happening.
However, over the last few years, that’s been changing. The United Nations Office of Counter-Terrorism recently put out a report called Examining the Intersection Between Gaming and Violent Extremism. Researchers conducted a three-phase study of extremism in online spaces over a one-year period from May 2021 to May 2022. Phase 1 was a literature review and focus groups with experts, Phase 2 was focus groups with groups of gamers, and Phase 3 was a survey sent out to approximately 600 gamers to complete. (Before anyone comes at me with “that’s not a representative sample set,” the report itself addresses this, framing the findings as a starting point for generating more research and better insight, rather than as the final word on the topic.)
For the purposes of the report, the research was focused on right-wing extremism because of the ease of identifying those extremists in online spaces. But the researchers do mention other types of extremists as well, particularly “gaming jihadists,” and that’s not a euphemism; they literally mean those who use gaming as a tool for recruiting gamers for Islamic terrorism. The report was also very specific in reiterating (repeatedly) that there is no link between gaming and violence – and similarly that there’s no causal link between gaming and extremism. Gaming isn’t creating violent extremists; gaming is just the space being exploited.
The report frames the ways in which extremists exploit gaming spaces using the European Commission’s Radicalization Awareness Network (RAN) definitions. RAN includes six avenues extremists use to subvert gaming to spread their ideologies: creating “bespoke” games, modding existing games, using in-game chat, using gaming-adjacent spaces (like Discord, Twitch, and YouTube), using “gaming cultural references” (memes), and the one that I find most gross, gamification. That means things like extremists using gaming-esque leaderboards to track actual body counts and awarding Discord badges for recruits meeting milestones in embracing the ideology.
“In addition, gaming communities, in which misogyny, hate towards minorities, expressions of violence, toxicity and politically incorrect humor are prevalent, offer extremists the strategic benefit of being able to blend in and build on the problematic atmosphere to meet audiences where they are and then, potentially, motivate a deeper engagement with extremist ideas.”
The UN report also makes a very depressing assertion: that these recruiters are drawn to gaming in particular because of the existing casual bigotry and toxicity in the medium, which makes it easy for extremists to hide in plain sight and groom gamers for more radicalized content.
The Phase 2 focus group affirms this suspicion. The researchers note that “the places where extremism is most visible are not the places where extremism is most prominent.” Much of the worst extremist content happens in private chats within guilds, DMs, or Discord servers. The Phase 2 group also suggests that extremists will often use “soft-pill” gaming-related memes or gifs that relate to toxic gaming culture as a gateway to identify targets for radicalization or indoctrination. The generalized casual toxicity of gaming makes it hard to identify who’s just being an immature edgelord and who’s a legitimate future terrorist, as the content is functionally identical.
The report continues with how extremists use gaming to radicalize folks: appealing to those who already have an interest but need a “push” for further radicalization, cultivating interest in extremist ideologies among extremist-adjacent folks, building on the toxic masculinity identity of games, and using video game aesthetics to appeal to wider audiences. For example, I learned in the report that the shooters in both the Christchurch mosque shootings and the Oslo shooting livestreamed the attacks explicitly to make it look like Call of Duty.
The third phase, the survey of gamers, reflected what the other two phases had found. A solid 85% of gamers polled reported observing or experiencing toxic behavior in games, most of which was misogyny, racism/xenophobia, or homophobia, with religious extremism, antisemitism, and Islamophobia making up a smaller share. Gamers called out Riot Games’ League of Legends explicitly as a key game with a toxicity problem and also cited Discord and Twitch as having huge toxicity problems – unfortunately, nothing new to readers here.
The survey also asked folks to give the qualities that they thought made a game more susceptible to toxic behavior, and unsurprisingly that list contained things like “Popular,” “Highly competitive,” “PvP-focused,” “No moderation,” and “No consequences.”
It shouldn’t be a surprise to anyone that the responses to the UN survey itself also included trolling and toxicity – because of course they did.
The report ends with a series of recommendations, the biggest one being that the industry needs to do much more when it comes to moderation, though it also urged a “light touch” to avoid over-policing gaming. Given how little effort many AAA multiplayer titles put into actually fighting toxicity (as opposed to merely wanting to be seen fighting it), I’d say it’s hard to disagree with that.
I’ve written a lot about toxicity in gaming over the last few years, and to me extremism is just toxicity’s older, angrier, more dangerous sibling. Nothing in the report will be new for any of us who regularly game, but for those who aren’t familiar with our world, I can see this report being eye-opening — if anyone takes it seriously. More than anything else, I think extremists are taking advantage of the perception of gaming as something for kids that can’t possibly be subverted.
But we know toxicity is a real problem in our games. It was a little harrowing to read United Nations researchers just say flat-out that “the casual bigotry in gaming makes it easy for extremists to blend in.” I knew that to be the case because I’ve seen it in our hobby myself, but it was quite another to read it so starkly stated by concerned academics on the outside looking in.
If I take the study’s conclusions a step further, it’s hard to believe that nation states aren’t using games to groom and radicalize folks to their causes. Heck, I recently saw a job posting for a Gaming Platform Advisor role with the US Department of Defense requiring security clearance. I guess we know at least someone is taking toxicity and extremism seriously.
2014’s Gamergate was an extremist movement that was birthed by and grew out of gaming culture, and it marked the first time that extremism in gaming captured the public’s notice. It was so wildly effective at amplifying toxic ideas and radicalizing some gamers to extremist ideologies that in the years since, non-gaming extremist movements have been seen copying exactly the same tactics. Until the industry collectively starts taking it seriously, stops downplaying toxicity and extremism, and really gets to work fixing the root problem, I’m afraid it’ll keep right on happening.