A Survey of Violent Extremist and Terrorist Activities Across the Gaming Environment

28th June 2021 Aaron Tielemans

The relationship between video games and large-scale violence has been questioned since the 1999 shooting at Columbine High School. In recent years, this conversation has expanded to include specific dimensions questioning the influence of gaming on processes of radicalisation and violent extremism.

Violent extremists and terrorists are certainly attempting to exploit the gaming landscape, hoping to capitalise on the massive, youthful audience and the gaming world’s deep integration within pop culture. Therefore, in order to stay ahead of this adversarial shift, it is important that we continue to develop our understanding of how the gaming environment is, in fact, being exploited and to what effect.

This piece surveys violent extremist and terrorist activities across unique elements of the gaming environment, assessing threat severity based on prevalence, current research, and opportunity for exploitation. Case studies and research provide insight on motivations and effectiveness, but further research on violent extremist and terrorist exploitation of the gaming environment is necessary.

This piece builds on the Radicalisation Awareness Network’s framework and focuses on three aspects of the gaming environment where violent extremists and terrorist entities are active:

  1. Video Games: Creating bespoke video games or modifying existing games.
  2. Gaming Exports: Utilising gaming cultural references and transferring game design elements to non-gaming contexts.
  3. Gaming Communication: Using in-game chats and gaming adjacent communications platforms.

This Insight argues that these aspects do not currently pose as generalised or severe a threat as characterised in many discussions around the subject. Existing research and case studies do not show that video games or other gaming aspects function as causal factors in processes of radicalisation towards violent extremism. The technological and cultural elements of the gaming environment must, instead, be understood both as unique components vulnerable to exploitation and as part of broader efforts by violent extremists and terrorists to target new technologies and components of popular culture.

Video Games – Bespoke Games and ‘Mods’

Two “bespoke” games released in the last three years include Hezbollah’s First Person Shooter (FPS) game, Holy Defence, and far-right German nationalist group Ein Prozent’s 2-D, Super Mario-style Heimat Defender: Rebellion. A neo-Nazi “mod” of Doom 2, hosted on the Daily Stormer website, demonstrates how violent extremists can modify an existing video game rather than creating one from scratch.

Though distinct in geography, ideology, and aesthetics, these games are thought to function as tools for recruitment, for reinforcing existing support, or for generating media attention based on the technical competence implied by producing a video game that purports to radicalise.

However, these case studies are currently few and far between. In the few cases that do exist, user uptake appears low, and the games seem unlikely to radicalise anyone not already converted to the cause.

In addition to these games’ low prevalence, there is no strong, evidence-based indication that they act as a catalyst or impetus for radicalisation.

Gaming Exports – Gamification and Gaming Cultural References

Gaming exports offer access to audiences beyond gamers. Gamification, defined as “the use of game design elements within non-game contexts,” is used in contexts such as education and health to incentivise sustained engagement and make user experiences more enjoyable. Gaming culture is often synonymous with pop culture. In the context of violent extremism, appropriation of gaming culture allows violent extremists to speak the language of their younger target audiences.

In 2019, the white supremacist perpetrators of the terrorist attacks in Christchurch, New Zealand and Halle, Germany used livestreaming to amplify their attacks. These livestreams mimicked the phenomenon of individuals livestreaming themselves playing video games to an audience, and incorporated design elements of the FPS genre by allowing viewers to watch the attacks from the perpetrator’s perspective.

The Christchurch perpetrator also referenced a meme in his livestream connected to YouTube’s top individual creator, gamer Felix ‘PewDiePie’ Kjellberg. The white supremacist perpetrators of the 2011 Oslo and Utøya attacks and the 2019 El Paso, Texas attack mentioned the Call of Duty franchise and other video games in the manifestos released prior to their attacks.

Islamic State (IS) also utilises a comprehensive playbook of gaming exports. Scenes from IS videos mirror scenes from Call of Duty: Black Ops 2, and the terrorist organisation created FPS-style footage by attaching cameras to fighters’ helmets. IS also designed Huroof, a gamified application that teaches children the Arabic alphabet through matching games. The incorporation of game elements into an educational app for children demonstrates how pervasive such elements can become.

Compared to the limited case studies of video games designed by violent extremists and terrorists, gamification and the reappropriation of gaming aesthetics and slang are more readily available, offering easy ways to implement recruitment tactics and disseminate propaganda to younger audiences. Current research on gamification is not as extensive as research on the link between video games and violence, though some preliminary findings suggest that gamification is used to facilitate sustained engagement, potentially increasing an individual’s susceptibility to radicalisation.

Gaming Communication – In-Game Chats and Gaming Adjacent Communications Platforms

Video game communication takes place in a highly dynamic and porous environment that includes both in-game chats and gaming adjacent platforms. It is likely that any one gamer will engage with a variety of video game communication methods depending on which game they are playing and with whom. These mediums have the potential to facilitate a number of violent extremist or terrorist communication or recruitment efforts targeting gamers or wider audiences.

Both in-game communication mechanisms (often audio or text) and gaming adjacent platforms are ideal environments for “test[ing] the waters” to identify individuals for recruitment. British neo-Nazi Mark Collett has livestreamed tournaments for Call of Duty: Warzone, drawing younger audiences into online spaces where other viewers adopt the names of far-right figures and post both coded and explicit messages into live chats.

In addition to recruitment and propaganda, far-right extremists have used gaming adjacent platforms to organise and communicate about rallies and events that have turned violent, including the Unite the Right rally in Charlottesville, Virginia and the storming of the United States Capitol on 6 January 2021.

Livestreaming platforms allow users to broadcast real-time events to their audience. Violent extremists can use livestreaming to disseminate propaganda and, should a streamer garner a substantial following, capitalise on monetisation options. White nationalist Anthime Joseph Gionet, who livestreamed from inside the United States Capitol on 6 January under the name ‘Baked Alaska’, received donations as he streamed in the form of DLive’s in-app currency, Lemon. With YouTube Super Chat, viewers can “purchase chat messages that stand out and sometimes pin them to the top of a chat feed,” allowing them to send coded messages, such as a donation of $14.88, a reference to the white supremacist numeric codes for the ‘14 Words’ and ‘Heil Hitler’ (88, H being the eighth letter of the alphabet).

The lower profile of gaming communication infrastructure has attracted extremists to these platforms. Even as Discord scales up its capacity to detect and remove violent extremist communities, the anonymity and privacy it offers make it an attractive option. Servers linked to the accelerationist Boogaloo movement and the QAnon conspiracy theory network were among the more than 2,000 servers Discord removed for hosting extremist or violent content in the second half of 2020. Public-facing platforms like Twitch have attracted streamers already banned from larger platforms such as Twitter and YouTube.

Regarding threat severity, in-game chat functions are a possible tool for violent extremists attempting to make initial contact with potential recruits before moving to a more private communication space. However, these interactions are considered rare. There is not yet a complete picture of which communication methods are being used, to what extent, and to what end.

The diverse mediums of the gaming communication infrastructure could allow violent extremists to operationalise different platforms for different purposes, such as targeting and engaging large audiences, curated groups, or individuals.

As gaming platforms increase in popularity, wider safety and policy attention should be paid to the specific motivations of violent extremist platform abuse, as well as to what safety-by-design measures can be implemented in order to detect, disrupt, and remove bad actors. Given that the problem is both transnational and cross-platform, identifying lessons learned from other parts of the tech sector and multi-stakeholder collaboration will be important next steps.


While it is important to recognise violent extremist and terrorist efforts to exploit the gaming landscape for various purposes, it is critical that this exploitation be understood as part of the technological and pop culture frontiers that violent extremists and terrorists have always sought to exploit, rather than as a preeminent threat.

Though elements of the gaming landscape may function as a factor in an individual’s radicalisation process, the presumption that any of these elements can serve as an isolated vector for violent extremism is unproven. Furthermore, isolating ‘gaming’ as the primary threat reduces a complex landscape to a monolith, detracting from the important recognition that each gaming landscape element will require unique solutions to counter violent extremist and terrorist exploitation.

In addition to promoting nuance in the threat landscape, these same elements may have the potential to inform prevention efforts. Gamification could contribute to sustaining and measuring engagement for prevention initiatives, as well as offer guidance for counter-narrative construction. Competitive gaming – known as E-Sports – may offer a new institution around which P/CVE practitioners develop programs to generate positive social outcomes for individuals and communities, supporting prevention efforts in the same way that physical sports do. Finally, we have already started to see the development of games in prevention spaces, such as Wicked Saints’ World Reborn, which addresses issues such as bullying and prejudice.