Linda Schlegel and Judith Jaskowski are members of the Extremism and Gaming Research Network (EGRN). The EGRN works together to uncover how malign actors exploit gaming, to build resilience in gaming communities to online harms, and to discover new ways to use gaming for good.
Introduction
Six years since the live-streamed attack in Christchurch, New Zealand, pushed the potential nexus between gaming and extremism to the forefront of radicalisation research and counter-extremism efforts, it has become abundantly clear that extremists are seeking to exploit gaming and the digital gaming ecosystem in various ways. From the development of propaganda games to the exploitation of commercial video games, hateful discourses on a number of gaming (-adjacent) platforms, and the instrumentalisation of gaming aesthetics to increase the appeal of propaganda output, recent research indicates that the presence of extremist influences in digital gaming spaces may be not only diverse but also fairly widespread. Our research showed that these influences include right-wing extremist, Islamist/jihadist, incel-/manosphere, antisemitic, racist, and conspiracy-related narratives, as well as other forms of identity-based hate such as xenophobia and hate against members of the LGBTQIA+ community – all of which could reach and potentially influence millions of users in digital gaming spaces.
Consequently, the need to develop and implement gaming-related measures to prevent and counter (violent) extremism (P/CVE) has become increasingly recognised over the last few years. Actors such as the United Nations, the European Commission, the Global Internet Forum to Counter Terrorism, and the Extremism and Gaming Research Network have underlined the importance of curbing extremist influences in digital gaming spaces and enhancing both prevention and counter-extremism efforts in the gaming sphere. However, while recent years have seen a stark increase in gaming-related P/CVE projects – ranging from the development of prevention games to digital youth work approaches on gaming (-adjacent) platforms – such initiatives are still few and far between. Efforts to prevent and counter extremism in digital gaming spaces are, by and large, still in their infancy.
This Insight sheds light on some of the issues that may contribute to the sparsity of gaming-related P/CVE efforts from the perspective of practitioners. It is based on work conducted in the context of the RadiGaMe project (Radicalisation on Gaming Platforms and Messenger Services), particularly on workshops as well as formal and informal interviews with German-speaking P/CVE practitioners over the course of 2024. In the context of this study, the term ‘practitioner’ refers to P/CVE professionals working for non-governmental and civil society organisations which develop and implement measures to prevent and/or counter extremism. Participants raised several concerns and reported crucial challenges that are currently impeding or, at the very least, complicating the development and implementation of gaming-related P/CVE measures. These challenges may be grouped into five categories: 1. Lack of knowledge on (extremist activities in) digital gaming spaces; 2. Challenges related to technology, game design, and equipment; 3. Practical challenges such as difficulties of access and the international character of gaming worlds; 4. Ethical, privacy, and data protection considerations, and 5. Challenges pertaining to cross-sector collaboration with other stakeholders. We briefly report on each of these five areas of concern below and close with suggestions for ways forward.
- Lack of knowledge
Although our understanding of extremist activities in digital gaming spaces has improved considerably over the last few years, systematic large-scale empirical studies that examine the extent and nature of the phenomenon across the gaming sector are rare. In addition, most existing studies focus largely on right-wing extremism and related content, whereas in-depth information on other types of gaming-related extremist activities is still lacking. Research on and evaluation of the handful of existing P/CVE efforts in digital gaming spaces are even scarcer. Consequently, practitioners lack crucial knowledge and guardrails to develop and implement gaming-related P/CVE measures. This may lead to feelings of being forced to ‘fly blind’ in this sector and to rely heavily on (potentially risky) learning-by-doing approaches, such as developing interventions without in-depth knowledge of digital gaming spaces. In addition, many practitioners may lack the confidence to explore this new area. Without clear guidelines or proven strategies, there is a fear of making mistakes or causing unintended problems. This hesitation hampers the development of P/CVE initiatives in digital gaming spaces.
These feelings are exacerbated by a general lack of insight into and personal experience of digital gaming spaces. While gaming is a mainstream phenomenon, many digital gaming spaces have subcultural attributes, including, for instance, their own memes, jokes, aesthetics, abbreviations, modes of communication, and standards for interacting with other users. The gaming ecosystem is also highly heterogeneous, spanning an immeasurable variety of gaming genres, forums, gaming (-adjacent) platforms, streaming, esports, and even offline conventions, all of which may differ in their distinct affordances and communication characteristics. Knowledge of the exact gaming space practitioners seek to work in is therefore essential. This knowledge must be acquired and constantly updated as the gaming ecosystem is always in flux. Cultivating an in-depth knowledge of gaming subcultures and keeping up with the changes in the gaming sphere, in addition to all the other responsibilities practitioners may face in their daily work routine, is a demanding task. Significant time and resources are needed to properly engage with this topic and familiarise oneself with digital gaming spaces. It may seem daunting having to dedicate many working hours to exploring this new sphere. The lack of insight into digital gaming spaces and the sheer amount of knowledge required to navigate the gaming realm can therefore contribute to a hesitancy to develop gaming-related P/CVE interventions.
- Tech challenges
Because digital gaming spaces have only recently been recognised as an important area for P/CVE efforts and still constitute largely uncharted territory, practitioners lack basic necessities such as access to suitable equipment. Funding for PCs and consoles, headsets, microphones, livestreaming tools and software, and commercial video games is often unavailable. As a consequence, practitioners may use their personal equipment, which poses not only financial but also potential security and data protection challenges.
Practitioners may also lack the required technical knowledge to develop and implement certain gaming-related P/CVE measures effectively. This includes, for example, basic game design knowledge to effectively incorporate P/CVE content into gaming formats and the fundamentals of coding and gaming knowledge necessary to create modifications (‘mods’) of existing games. Even the use of sandbox games—defined by Merriam-Webster as “a video game or part of a video game in which the player is not constrained to achieving specific goals and has a large degree of freedom to explore, interact with, or modify the game environment”—requires at least a rudimentary understanding of the game mechanics. Platforms such as Minecraft and Roblox, in particular, demand some level of experience in creating gaming content. While some practitioners may possess such knowledge because they are gamers themselves, others may need to acquire it. This raises the question of how practitioners can be supported in broadening their gaming knowledge in a professional context and how the P/CVE sector can effectively collaborate with actors who specialise in these areas, such as game development studios.
- Practical challenges
Practitioners experience a number of practical challenges when seeking to implement P/CVE efforts in digital gaming spaces. This includes issues of access: In order to take part in in-game chats, practitioners need to play the video game in question. This not only requires work time to be dedicated to gaming but may often also necessitate working outside regular office hours, such as evenings, weekends, and holidays, as games often see a lot of traffic during these times. In certain gaming spaces, it may be necessary for practitioners to engage users in real-time, for instance, because the (voice-based) chats are volatile and cannot usefully be engaged with in an asynchronous manner or, for instance, on livestreaming platforms where streams may be deleted immediately after completion. Additionally, voice-based chats may require entirely different P/CVE approaches than existing, tested intervention measures, such as digital youth work, which are largely text-based. It remains to be seen whether, and if so to what extent, engagement via (often group-based) voice chats is suitable for prevention and counter-extremism work.
Furthermore, the international character of the gaming sphere may be challenging to navigate in the current P/CVE system. Prevention and counter-extremism projects, when implemented by civil society organisations and P/CVE practitioners, are typically characterised by a national or even local focus. Funders may even demand proof of a local impact and favour projects with narrowly defined target audiences. Digital gaming spaces, however, are characterised by cross-national communication, and users often converse in English even if they are not native speakers. Often, it may be impossible to determine where a user is from unless it is specified in their username or profile description. This has important practical implications for P/CVE work in this domain.
While gaming’s international character allows practitioners to reach many users at once, it complicates integration with national and local initiatives, targeting specific audiences, and tailoring content effectively. It also raises important logistical questions: Should practitioners trained for their own national contexts engage with users from vastly different backgrounds? If not, what alternatives exist in digital gaming spaces?
- Ethical, privacy and data protection considerations
(Semi-) private communication channels such as in-game chats, Discord servers, or groups on gaming forums may pose a number of practical challenges for P/CVE practitioners, such as the question of how to access these spaces. However, there are also issues surrounding ethical engagement, the ‘intrusion’ into private chats, the importance of user privacy, and considerations surrounding data protection. Participants recognised that P/CVE efforts may require joining (semi-) private communication channels but questioned whether practitioners should be allowed to do so and under which circumstances. It could be perceived as an invasion of privacy and an undue intrusion into private leisure activities, and may stand in contrast to users’ rights to undisturbed, unmonitored personal conversations. This concern is likely to be particularly strong in our sample as the German data protection and privacy regulations are among the strictest worldwide. Nevertheless, weighing up the right to private communication versus the need to meet users where they are and where extremist content is shared is relevant in all contexts.
Participants also reported concerns about personal risk and safety. Some private groups have implemented vetting processes before admitting new members. In order to join such spaces, creating ‘fake profiles’ may be required. This raises the question of when, to what extent, and in which circumstances it is justifiable for practitioners to deceive potential recipients of their interventions, and whether the personal risk to practitioners outweighs the potential benefits. When is such a covert intervention ethically sound? Is it likely to be successful?
Another important challenge to recognise is the potentially high number of minors in digital gaming spaces. The likelihood of encountering minors differs vastly depending on the characteristics of the video game or gaming (-adjacent) platform. However, the anonymity of most digital gaming spaces means that practitioners often do not know whether they are engaging with minors or adults. Because working with minors requires a specialised skill set and also comes with much more rigorous ethical guidelines for engagement, practitioners are left in limbo. This is exacerbated by the internationalisation of the gaming sphere: A practitioner may be unknowingly communicating with a minor in an entirely different country and context who could possibly be much better supported by a local P/CVE practitioner.
- Collaboration with other stakeholders
Extremist activities in digital gaming spaces are a cross-sectoral issue that requires multi-stakeholder dialogue and cooperation between researchers, policymakers, law enforcement agencies, the gaming industry, civil society organisations, and P/CVE practitioners. However, suitable dialogue formats and collaboration efforts seem to be lacking. Participants lamented the lack of opportunity for information exchange and discussions with the gaming industry and law enforcement agencies, particularly on the national level, as well as specific collaborative formats regarding shared risk assessment and casework. Even if such dialogue takes place, challenges remain. These include:
- Different stakeholders have different definitions of extremism and the type of content or behaviour that requires action;
- Other stakeholders, such as police authorities and some tech companies, prioritise repressive and reactive measures such as legal action, banning and de-platforming, whereas P/CVE practitioners emphasise the need for preventative and proactive efforts; and
- Different paradigms clash, particularly in the tension between approaching hate and extremism solely as a security issue and taking a broader approach to building a healthy (digital) civil society.
In addition, practitioners reported practical concerns, such as being unsure how to approach other stakeholders, particularly from the gaming industry and gaming (-adjacent) platforms, and hesitancy to request support or advocate for joint P/CVE initiatives.
Conclusion and Ways Forward
It is evident that practitioners experience many hurdles regarding the development and implementation of gaming-related P/CVE measures. These obstacles must be addressed and overcome promptly. Discourses in digital gaming spaces potentially influence millions of users. If extremist narratives are normalised in such discourses, they may contribute to shifts in real-life perceptions, opinions, and, ultimately, political realities. Therefore, extremist activities in the gaming sphere cannot be left unchallenged. P/CVE measures are one of several important building blocks to curb the influence of extremist narratives and hateful discourses in the gaming realm, and consequently, gaming-related P/CVE efforts must be broadened and strengthened to counter extremist influences in the gaming sphere.
Ways forward for practitioners may include recognising the importance of the gaming sphere for digital target audiences and the necessity to adapt P/CVE measures to this space. To advance the field, someone must be willing to start despite the lack of knowledge and insights on gaming-related P/CVE. Creating and testing gaming-related P/CVE formats, even in a challenging information environment and with incomplete evidence, is paramount. Given the current threat landscape, waiting until we have a deeper understanding of the phenomenon and strong evaluations of the most effective and suitable approaches is not feasible. In addition, while acquiring knowledge about subcultural peculiarities and technical expertise is beneficial, reinventing the wheel may not always be necessary. Rather, P/CVE approaches tested in other digital environments could be adapted for implementation in digital gaming spaces. To integrate digital gaming spaces into P/CVE work, P/CVE organisations should provide their practitioners with suitable equipment and allow them to familiarise themselves with video games and the digital gaming ecosystem as part of their professional activities.
The gaming industry should support P/CVE measures implemented in their games or on their platforms, including not only efforts focused on identifying and deleting extremist content and users but also projects aiming to support the growth of positive, inclusive gaming communities. Industry actors should be open to participating in information exchange and dialogue formats, take action to reduce the barriers to asking for support that practitioners report, offer information, and engage in collaborative efforts against extremism with other stakeholders.
Funders should consider funding a variety of different approaches to gaming-related P/CVE work and recognise that it may not be useful to insist on a strict national or local focus, given the international character of digital gaming spaces. It would be beneficial if funders allowed for the development, implementation, and testing of bold, creative pioneer projects that are explicitly allowed to take risks and potentially not achieve the desired impact – because this ‘failure’ also contributes to a more thorough understanding of which approaches are effective and ineffective in digital gaming spaces. Only by doing so can practitioners push the boundaries, test gaming-related approaches, and create new ideas for potentially impactful interventions. Doing so would also help advance the extremely limited body of evaluations of gaming-related P/CVE interventions and deepen our understanding of which approaches may be particularly (in)effective.
Linda Schlegel is a Postdoctoral Research Fellow at the Peace Research Institute Frankfurt (PRIF), where she co-leads the RadiGaMe project and researches extremist activities in digital gaming spaces. She is also a Research Fellow at modusIzad, where she explores new avenues for digital P/CVE approaches, and a founding member of the Extremism and Gaming Research Network (EGRN).
Judith Jaskowski has been working as a research assistant at modus – Centre for Applied Deradicalisation Research with a focus on games since 2023. She previously worked as a game development manager for various German developer studios.