Over the past two decades, gaming environments have become increasingly social and mass-scale. Multiplayer games allow players to cooperate, compete, and communicate in real time, while adjacent platforms (such as Discord or Twitch) extend interaction beyond gameplay itself. Scholars studying gaming cultures have long emphasised the importance of these spaces for community building and identity construction (see, for example, Gray, 2018).
However, research has also examined how extremist actors exploit these same environments for harmful ends. Studies on the intersection of gaming and extremism highlight several patterns: the use of gaming aesthetics in propaganda (Kingdon, 2024), recruitment within gaming communities (Newhouse & Kowert, 2024), the development of extremist game modifications and standalone games (Robinson & Whittaker, 2021), and the circulation of ideological memes tied to gaming culture.
Recent research conducted by the authors through 13 qualitative interviews with P/CVE practitioners, moderators, and researchers working on gaming platforms provides further insight into how these dynamics operate in practice (see Allchorn & Orofino, 2025). As outlined in this Insight, the findings indicate that gaming ecosystems can function as “gateway social spaces” where individuals encounter ideological narratives and gradually form identities aligned with broader extremist milieus. These observations were presented at a recent symposium for the Horizon-funded GEMS project (Gaming Ecosystem as a Multilayered Security Threat), entitled “Beyond the Clan: Identity Formation, Influence, and Extremist Milieux in Online Gaming‑Adjacent Spaces”.
Theme 1: Collective Identity Formation Through Team‑Based Gameplay
Historically, extremist organisations have relied on collective activities to cultivate group cohesion and reinforce ideological identity. These activities have included physical training exercises as well as social gatherings intended to strengthen interpersonal relationships within the movement. As one interviewee noted, “we’ve seen this historically… on the far right and with Islamist groups,” where, for example, Islamist networks organised themselves into cells that participated in activities like “paintballing and survivalist exercises… used for team bonding and group bonding and survival skills” (Interviewee 1). Such activities were not incidental, but central to cultivating trust, discipline, and a sense of belonging within the group.
Today, these dynamics are increasingly replicated in digital gaming spaces. As another interviewee explained, “technology has evolved and that same [real life] group bonding is now taking place online,” particularly in warfare-based games like Call of Duty or Fortnite, which “transfer a digital version of those survivalist and team building skills,” requiring players to “work as a team together to kill a common enemy” (Interviewee 2). Through shared, cooperative gameplay, these environments foster social connections and collective identity in ways that mirror traditional group-bonding contexts such as sports teams or social clubs.
Digital infrastructure now enables these dynamics to unfold at scale, particularly in competitive multiplayer games such as first‑person shooters and in more collaborative online sandboxes (such as Roblox and Minecraft). The structure of these games – collective strategy, shared goals, and constant communication – can reinforce group cohesion while normalising adversarial or militaristic narratives, and has prompted extremist actors to create their own mods and engage in red-teaming.
These environments often include communication systems such as voice chat or private team channels, which allow players to coordinate strategies and interact socially beyond gameplay. Repeated interaction fosters trust and familiarity among players, gradually creating cohesive online communities.
“You’ve got the team building in the game, and then you’ve got the conversations as well. Obviously, you can set up a private game [or channel] and just have private conversations yourself” (Interviewee 4).
While these dynamics are not inherently problematic, they may also create opportunities for ideological influence. Groups that form around shared gameplay experiences may subsequently develop broader discussions about politics, culture, or identity. Within these conversations, extremist narratives can sometimes be introduced gradually and normalised within the community.
Theme 2: Youth Identity Development and Susceptibility to Influence
Research on youth radicalisation consistently highlights the role of social belonging in processes of ideological mobilisation (Petrosino & Morgan, 2025). Individuals seeking identity may gravitate toward communities that offer compelling narratives of purpose, injustice, or heroism. Gaming environments intersect with these developmental processes by creating spaces where social interaction is constant, and group identities can form quickly through shared play and collaboration.
Interviewees involved in platform moderation and counter-extremism work emphasised that those seeking to shape ideological attitudes tend to focus on younger audiences, particularly within gaming spaces where they are highly active. As one interviewee explained, rather than attempting to influence older individuals in offline social settings, actors would be more likely to target teenagers, who are seen as “more impressionable and idealistic,” and therefore more susceptible to being engaged and influenced by ideological messaging (Interviewee 2).
At the same time, it is important to stress that gaming itself does not cause radicalisation. Rather, gaming platforms represent one of many social environments in which broader processes of identity development unfold. The same mechanisms that foster friendship, cooperation, and community among players can also be exploited by actors attempting to introduce ideological narratives.
Theme 3: Idolisation and Influencer‑Driven Identity Formation
A key dynamic shaping identity within gaming ecosystems is the influence of online personalities, including streamers and content creators who often act as role models for younger audiences. Interviewees noted that adolescents may internalise political or ideological messages when these are conveyed by figures they admire, making such content more persuasive and “more impactful and difficult to regulate,” particularly given the limits of controlling live or user-generated content (Interviewee 7). This reflects an “idol–imitator” dynamic in which audiences adopt the attitudes and behaviours of influential figures. In some cases – such as perpetrators of attacks – these figures can become symbolic icons whose ideas spread across platforms through memes, manifestos, and online discourse (Interviewee 3).
At the same time, influence is not limited to high-profile figures. Interviewees emphasised that more subtle, interpersonal interactions within gaming spaces can also shape beliefs, particularly among individuals experiencing isolation or marginalisation. In these contexts, relationships framed as friendship or mentorship may gradually introduce ideological narratives, often without being perceived as harmful by those involved.
These layered dynamics present challenges for moderation. As one interviewee explained, approaches increasingly focus on behavioural signals – such as whether “this conversation [is] focused on someone’s social support network” or whether there are “indications this user doesn’t have other friends they’re playing with” (Interviewee 13) – rather than solely on explicit content. This shift reflects the need to identify vulnerability and grooming patterns, including intense one-to-one interactions or efforts to build dependency.
Overall, influence in gaming ecosystems operates across multiple levels, from visible influencers shaping community norms to more hidden interpersonal dynamics. For platforms, this complicates moderation efforts, as traditional tools are often less effective in detecting subtle ideological messaging embedded within humour, gaming culture, or coded forms of communication.
Theme 4: Decentralised Ideological Communities and Hybrid Identities
A further shift shaping contemporary extremism is the movement away from hierarchical organisations toward decentralised ideological milieus where an ‘influencer plus follower’ model of organisation prevails. Historically, extremist groups (such as Al-Qaeda, ISIS and Boko Haram) often relied on formal membership structures, leadership hierarchies, and identifiable organisations.
Digital environments have transformed this landscape. Online ideological ecosystems increasingly function as loose networks of communities connected by shared narratives, symbols, and cultural references rather than formal organisational membership.
Participants described this shift as the emergence of fluid “milieux” rather than clearly defined, structured organisations. As one interviewee explained, it is now possible to “coagulate a group online around social identity cohesion markers,” yet, “comparatively to 20 years ago,” such formations would not be considered “an organisation in the same way” (Interviewee 5). Individuals may therefore move across multiple online spaces simultaneously, drawing on and combining different ideological elements without formal membership, reflecting a more diffuse and networked form of collective identity.
Decentralised anti-government movements show how shared identities can form online without formal structures. As one interviewee noted, participation is often “more of a pick and mix,” reflecting “the era of influencers,” where identity is shaped less by group membership and more through “influencer–followers relationships” (Interviewee 10). In this context, memes and symbols act as markers of belonging, complicating efforts to identify and monitor these diffuse networks.
Theme 5: Cross‑Platform Identity Performance
Another defining feature of contemporary online extremism is cross-platform identity performance, where users maintain multiple personas and adjust their behaviour according to moderation levels and audience expectations. As one interviewee noted, individuals may present themselves differently across spaces, with some explicitly acknowledging that “you should see me on this platform – I’m way more extremist there because we don’t get moderated there” (Interviewee 6). This reflects a broader pattern in which public platforms are used for outreach and visibility, while less regulated or private spaces facilitate more explicit ideological expression and interaction. Several high‑profile cases illustrate this dynamic. Investigations into the Buffalo supermarket attack in 2022 found that the perpetrator had used private Discord communities to share ideological material and discuss plans prior to the attack. Discord took action promptly following the attack to mitigate further harm and the spread of related content. Similarly, researchers analysing the 2019 Christchurch attack identified how the perpetrator posted messages on 8chan before livestreaming the attack.
These examples highlight how extremist actors strategically navigate digital ecosystems, exploiting differences in platform governance and moderation systems.
Recommendations: Implications for Technology Companies
The dynamics described in the research above highlight several challenges for technology companies seeking to counter extremist exploitation of gaming ecosystems. Gaming platforms are fundamentally social environments, and many of the features that enable community formation—voice chat, livestreaming, private groups—are also the features that can be exploited by extremist actors.
Addressing these challenges requires innovative approaches that extend beyond traditional content moderation:
- First, platforms should invest in specialised moderation teams with expertise in gaming cultures and coded online language. Extremist narratives in gaming environments often appear through humour, memes, or references that may be difficult for automated moderation systems to detect.
- Second, cross‑platform collaboration is essential. Extremist actors frequently move between platforms, using mainstream networks for outreach and smaller communities for ideological discussions. Information sharing between companies could help identify patterns of coordinated activity.
- Third, platforms should expand transparency and partnerships with academic researchers studying extremism and digital cultures. Collaborative research initiatives can improve understanding of emerging trends and enable earlier identification of harmful networks.
Conclusion
Gaming environments have become important social ecosystems where millions of users form friendships, communities, and identities through shared experiences. However, the same social dynamics that enable community formation can also create opportunities for extremist actors seeking to influence or recruit individuals online. Our research examining gaming‑adjacent platforms highlights how identity formation processes—collective gameplay experiences, youth identity development, influencer cultures, decentralised ideological networks, and cross‑platform identity performance—can intersect with broader extremist ecosystems.
Understanding these dynamics is essential for developing effective counter‑extremism strategies. Rather than treating gaming platforms solely as security risks, policymakers and technology companies should recognise them as complex social ecosystems where identity formation, community building, and ideological influence interact – both for good and for ill.
—
Dr William Allchorn is a Senior Research Fellow at the International Policing and Public Protection Research Institute (IPPPRI), Anglia Ruskin University, and an expert on radical-right extremist social movements in the UK, Europe, and beyond. He has recently advised the UK, US and Australian governments on their approaches to countering the extreme right, and has undertaken some of the first metrical testing of violent far-right extremist counter-narratives in the UK, US and Australia with governments (e.g. EU Internet Forum), civil society organisations (e.g. Counter-Extremism Project), and tech companies (e.g. Meta) across the world.
Dr Elisa Orofino is a Senior Research Fellow and Academic Lead for the Extremism and Counter-Terrorism Sub-Theme at the International Policing and Public Protection Research Institute (IPPPRI), Anglia Ruskin University, and an expert on Islamist extremist social movements in the UK, Europe, and beyond.
—
Are you a tech company interested in strengthening your capacity to counter terrorist and violent extremist activity online? Apply for GIFCT membership to join over 30 other tech platforms working together to prevent terrorists and violent extremists from exploiting online platforms by leveraging technology, expertise, and cross-sector partnerships.