New research published in the renowned journal Frontiers in Psychology has uncovered a worrying trend: extremist groups are strategically using video game environments and their adjacent platforms to recruit and radicalize young, impressionable users. These gaming-associated digital spaces, often viewed as safe havens for entertainment and social interaction, are being subverted into fertile grounds for spreading extremist ideologies.
The study, conducted by Dr. William Allchorn and Dr. Elisa Orofino of Anglia Ruskin University’s International Policing and Public Protection Research Institute (IPPPRI), reveals the sophisticated methods by which extremists exploit the relative monitoring gaps of gaming-adjacent platforms. These platforms, which blend live streaming, voice chats, and text communication during gameplay, operate under different moderation regimes compared to mainstream social media, presenting unique challenges for content regulation.
The crux of the research indicates that extremists are deliberately “funneling” users from mainstream social media to these less-regulated gaming environments. This is facilitated by the inability of conventional moderation tools and human moderators to effectively detect and counter harmful content in these spaces. Extremists leverage the informal, semi-anonymous, and highly interactive nature of gaming communities to build trust and gradually introduce radical ideas.
Far-right extremist content dominates these platforms, with the propagation of white supremacist ideologies, neo-Nazism, anti-Semitic rhetoric, and a toxic combination of misogyny, racism, homophobia, and conspiracy theories such as QAnon. Islamist extremism also surfaces, though less frequently, alongside “extremist-adjacent” materials including the glorification of school shootings and other violent acts, content specifically designed to evade detection algorithms and manual review.
A particularly alarming insight from the research is the appeal of hyper-masculine gaming genres, such as first-person shooter (FPS) titles, for extremists looking to recruit. These games foster competitive, aggressive playstyles that resonate with extremist narratives glorifying dominance, violence, and tribalism. By connecting strangers around a shared hobby, the online gaming environment offers a unique ecosystem in which extremists can blend into community dynamics almost unnoticed.
This process of “funneling” starts with initial contact during gameplay, often through matchmaking algorithms that pair players for cooperative or competitive sessions. Here, extremists build rapport through shared interests and gaming jargon before steering conversations to gaming-adjacent platforms where moderation is minimal. These satellite spaces serve as echo chambers where extremist content can be amplified without immediate risk of removal.
One participant interviewed in the study detailed how the grooming process capitalizes on in-game interactions: “That’s where you have matchmaking. It’s where you can build quick rapport with people. But that’s the stuff that very quickly moves to adjacent platforms, where there’s sort of less monitoring.” This underscores the strategic movement from heavily moderated mainstream platforms to more ambiguous, gaming-related environments.
Concerns were also raised regarding younger gamers’ increased vulnerability to extremist influencers who combine live streaming of gameplay with insidious extremist narratives. This dual engagement entices viewers with entertainment while subtly normalizing dangerous ideologies. The immersive and interactive nature of these streams lowers defenses against radicalization, making youth particularly susceptible.
Further compounding the problem, content moderators expressed frustration over inconsistent enforcement policies and the near impossibility of managing the volume and nuanced complexity of harmful content. Hidden symbols and coded language designed to circumvent keyword detection further hamper moderation efforts, creating hurdles for both human and automated moderation systems.
While AI tools are increasingly employed to assist moderation, these systems struggle with context, sarcasm, and culturally coded memes common in gaming cultures. For example, phrases like “I’m going to kill you” are frequently used in games as competitive banter without actual threat, a nuance lost on many algorithms. This disconnect results in many false positives and negatives, limiting the effectiveness of current automated safeguards.
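To illustrate the limitation described above, here is a minimal sketch (not from the study, and not any platform's actual system) of a naive keyword-based filter. The keyword list and messages are invented for illustration; the point is that context-blind matching flags competitive banter as a threat while missing hostile messages that avoid the listed phrases.

```python
import re

# Hypothetical keyword list for illustration only -- real moderation
# systems are far more complex than this.
THREAT_PATTERNS = [r"\bkill you\b", r"\bdestroy you\b"]

def naive_flag(message: str) -> bool:
    """Flag a message if any threat pattern matches, ignoring context."""
    return any(re.search(p, message, re.IGNORECASE) for p in THREAT_PATTERNS)

# Invented example messages showing both failure modes.
examples = [
    ("gg, I'm going to kill you next round", "banter -> false positive"),
    ("meet me irl, you'll regret it", "menacing -> false negative"),
]

for msg, note in examples:
    verdict = "FLAGGED" if naive_flag(msg) else "ok"
    print(f"{verdict:8} {msg}  ({note})")
```

The first message is flagged despite being harmless banter, while the second passes despite its menacing tone, mirroring the false positives and negatives the article describes.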
Dr. William Allchorn, co-author of the study, emphasized the urgent need for a deeper understanding of these unique digital environments by law enforcement and policymakers. “Social media platforms have attracted most of the attention of lawmakers and regulators over the last decade, but these platforms have largely flown under the radar, while at the same time becoming digital playgrounds for extremists to exploit.” His remarks highlight the critical gap in targeted policies and detection tools tailored specifically for gaming-adjacent platforms.
The research also highlights a widespread lack of clear reporting mechanisms. Users are often unaware of how to report extremist material, and when they do, their complaints are frequently dismissed or ignored, fostering a sense of helplessness. Strengthening both AI-driven and human moderation is essential, alongside updated platform policies that address harmful yet technically lawful content, which currently occupies a grey area in content regulation.
Ultimately, this study calls for decisive, coordinated action from tech companies, regulators, law enforcement, and the community at large to stem the insidious spread of extremism in gaming environments. The intersection between digital gaming cultures and extremist recruitment represents a new front in the battle against violent radicalization, demanding innovative, informed strategies to protect vulnerable populations.
Subject of Research: People
Article Title: Policing Extremism on Gaming Adjacent Platforms: Awful but Lawful?
News Publication Date: 31-Jul-2025
Web References: https://doi.org/10.3389/fpsyg.2025.1537460
Keywords: Social media, Propaganda, Human population, Terrorism, Written communication, Video games