
Understanding the Game: Bridging Research Gaps at the Nexus of Gaming and Extremism

9th March 2022 Galen Englund

Practitioners in preventing and countering violent extremism are increasingly concerned about pathways to radicalisation, including those leading to violent extremism, present in online gaming environments. No longer a niche group, gamers globally number over 2.81 billion, with the highest concentration in the Asia-Pacific region (over 1.5 billion). Gamers are also increasingly diverse: an estimated 59% are male and 41% female. At the same time, COVID-19 has driven a surge in online gaming, which has become a lifeline during isolation: a way to find community, bolster mental health, and find joy. With the popularity of online gaming comes increased attention from extremist groups of all persuasions. Today, these groups may be exploiting the massive pandemic-related spike in online gaming to push propaganda and conspiracy theories via gaming platforms. Far from being only a solitary place to play, popular games offer chatrooms and entire voice chat servers, while sub-communities of gamers find corners of commonality across many platforms beyond the games themselves. Numerous studies have failed to find a causal connection between violence in video games and violence offline. But connectivity on gaming platforms brings other risks of offline harm, including the documented use of platforms by extremist and terrorist actors.

In 2021, we developed a report at Love Frankie (LF) for our community of practice at the Extremism and Gaming Research Network (EGRN) to understand the state of research into linkages between video games, gaming communities, and extremism. That annotated bibliography and analysis, State of Play: Reviewing the Literature on Gaming & Extremism, provides an overview of the existing literature on the use of gaming platforms, video games, and related online fora (chatrooms, gaming-based social networks, and the like) by violent extremist organisations and radical hate groups, with the aim of informing areas of possible future study. In addition, we review and summarise the limited literature on the factors that draw individuals into online gaming spaces and gaming-related environments, and the radicalisation pathways they may encounter there. The report focuses on sources written in English and does not account for publications that may exist in other languages, though our work indicates that, beyond a select set of quality publications in German, such sources are limited. Given the limited academic literature on the niche subject of extremism, radicalisation, and gaming, we examined articles from both the academic sector and grey literature. We prioritised peer-reviewed journal articles and research publications but ultimately cast a wider net, analysing 76 papers published from the early 2000s to August 2021.

We found that existing empirical research into gaming, radicalisation, and extremism is limited. The games studied are often outdated, while lab-controlled experiments have limited relevance to the online realities of modern gamers. English-language literature focuses on Western research subjects without considering cultural differences among gamer communities worldwide. Gender and other equity concerns in the study of socialisation and possible radicalisation processes in gaming subcultures remain underexplored.

Across the research that does exist, our review indicates at least three factors that contribute to the exploitation of gaming and gaming-adjacent spaces by harmful and extremist actors, alongside a fourth, countervailing dynamic:

  1. Regulation and enforcement by the gaming industry to create safe, inclusive communities on online platforms and social spaces are nascent and often insufficient. This, in turn, enables the proliferation of social environments that harmful or dangerous organisations may exploit. Existing platform policies and automated enforcement algorithms have not entirely curbed extremism in online games and on gaming platforms. Recent efforts appear promising and echo trust and safety measures taken by Meta, Twitch, and YouTube. However, early scoping by researchers at the Institute for Strategic Dialogue shows a great degree of variability across a selection of gaming-adjacent platforms surveyed. Given the fluidity of end-user terms of service shaped by national laws and community guidelines, existing prevention or counter-hate-speech policy frameworks often fail to account for differences in gaming platforms’ approaches to exclusion and compliance. As a result, internal and external policies for this space are inconsistent and, at best, opaque, resulting in failures to deter extremist content.
  2. Extremist organisations and individuals can co-opt specific functionalities developed by games and gaming platforms, including financial incentivisation and live-stream fundraising. Those who support radical narratives promoted by live streamers can be given the option to donate to live streams or sponsor their gamer of choice by ‘gifting’ them in-game virtual items, which can be exchanged for fiat or cryptocurrency. De-platforming these individuals – from YouTube, for example – often results in them moving their content to more permissive, ‘free-speech’ platforms. In-game purchases, including virtual character skins, novel game weapons, and other perks, have been used by terrorist organisations to launder and move funds. Game key swaps have been linked to money laundering efforts, while non-fungible token (NFT) gaming could pose a further laundering risk.
  3. Gamification, or the porting of visual or cultural aspects of games to other settings, and novel recruitment activities appear to be actively pursued by extremist organisations. Ranging from propaganda videos of first-person-shooter scenes recorded through helmet cameras to broadcasts of radical right supremacists storming the United States Capitol, gamification applies game-based principles, such as ‘levelling up’ or ‘achievement unlocks’, to other contexts. Similarly, violent extremists mod games to run military-like simulations in online gaming environments. At the same time, the use of youthful imagery, such as cartoon ninjas, ice cream cones, and memes, in youth-focused mainstream video games such as Roblox, Division 2, and Minecraft is often linked to anti-Semitic, racist, or sexist tropes and is used to desensitise participants to extremist content in a form of ‘memetic warfare’. Lastly, demographics play a role in extremist use of gaming tropes: youth-heavy channels link pop media and fringe ideologies leveraged by extremist groups.
  4. Games can also build inclusive, resilient online communities. Positive interventions can build on existing pro-social endeavours, such as the Child’s Play charity and the gaming-specific mental health support provided by Take This. As discussed previously on GNET, new efforts are already afoot: the Dutch police have used videogames to interface with younger communities at a neighbourhood level, as have British police in North Yorkshire. Further engagement with gamer sub-cultures, such as partnerships with e-sports leagues, live-streaming influencers, and mental wellbeing support networks, shows promise. Additional research and practice are warranted.

Beyond the above trends, our review identified seven significant research gaps obscuring a clearer picture of gaming and gaming-adjacent areas at risk of offline and online exploitation by extremist actors:

  1. There is a lack of comprehensive research into newer gaming-related communication platforms and channels such as Steam, DLive, Odysee, Trovo, TamTam, and others. Despite red flags, especially around far-right organisations, how and to what extent extremist groups exploit platforms popular among gamers to spread harmful content and incite hate remains unclear. Channels popular with non-English-speaking and German-speaking audiences, including fringe non-English platforms, are poorly understood.
  2. The specific language and narratives employed in hate speech and radicalising propaganda, especially those targeting audiences outside the US and UK and across regional and ideological variations, must be examined further. An in-depth understanding of this phenomenon may allow practitioners to better track and prevent radicalisation and recruitment attempts and to develop positive interventions.
  3. Mechanisms for terrorism- and extremist-related financing through gaming and gaming-adjacent platforms are poorly understood. Live-streaming payments to neo-Nazi and radical right actors have been the focus of study, but detecting infringements and small-scale extremist-related financial flows poses a persistent technological difficulty. Nascent research into microfinancing through NFTs and money laundering schemes involving in-game purchases and game purchase keys should be bolstered and accelerated.
  4. The behaviour of younger gamers, including their preferences around platforms, in-game norms and behaviours, and exposure to harmful content, is insufficiently understood. Future research should be responsive to the attitudes of underage gamers and disaggregated by age and gender, at a minimum.
  5. Extremist narratives and storylines in many contemporary mainstream and fringe ‘mods’ for games have not been analysed in great depth. Additional research into first-person shooter (FPS) games in countries where violent extremist organisations are depicted as ‘bad guy’ characters within gameplay storylines may be of particular value for analysing the impact of such narratives. This could also inform narrative components for positive interventions.
  6. The potential links between online socialisation in gaming-related spaces and sub-communities, online behaviour, and offline behaviour are also under-assessed. Future studies on online socialisation leading to radicalisation across geographies, genders, and ideological spectrums are needed.
  7. Lastly, research into online and offline affinity spaces’ self-regulation guidelines within the gaming industry is limited. Understanding what policies increase online safety and reduce online harms should be prioritised. The support of organisations like GIFCT in improving responsible platform governance is invaluable and should be considered in seeking ways to improve standards.

While new initiatives are underway, research into gaming and radicalisation remains sparse. Substantially more empirical research, policy development, and cross-sectoral engagement are needed. Alongside our colleagues at the Extremism and Gaming Research Network, we intend to document how malign actors use gaming for harm and how gaming can also be used for good. We are building collaborative solutions across academia, government, gaming platforms, and private sector entities to create an evidence base for fostering more resilient and inclusive online communities of gamers globally.

The Extremism and Gaming Research Network (EGRN) brings together world-leading counter-extremism organisations to develop insights and solutions for gaming and radicalisation. The network comprises top-tier think tanks, private sector institutions, academia, and other actors collaborating internationally with a shared vision. Over 50 members are now part of the EGRN (as of Feb 2022), including GNET and GIFCT. The authors are founding members of the EGRN as part of Love Frankie, a strategic communications agency specialising in the Asia-Pacific region.