Introduction
In the entertainment industry, gaming has become the most profitable segment, surpassing both music and film production. According to estimates by the gaming market data company Newzoo, there are over 2.8 billion gamers worldwide, the majority of them aged 16-24. While the gaming industry brings its own economic and social benefits, it has also shown signs of becoming a platform for spreading extremism. The intersection of video games and violent extremism has recently gained greater traction in global counterterrorism policy circles. Gaming platforms such as Steam, Twitch and DLive aim to provide their users with the best experience online and an environment in which to connect with like-minded individuals. However, these same features are often exploited by extremist organisations.
Video games and their associated platforms are fast becoming hubs of radicalisation, extremism and recruitment for far-right organisations. The development of bespoke games and modifications, known as ‘mods’, has given extremist organisations the ability to spread their digital propaganda further. While Valve Corporation, the software company that develops and distributes video games, has provided gamers worldwide with the Steam platform, the threat of radical networks exploiting the platform remains high. According to ISD, “two Steam groups were expressly affiliated with violent extremist groups. One of these is a group affiliated with the Nordic Resistance Movement (NRM). The NRM was connected to a series of bombings in Gothenburg in 2016 and 2017.” This demonstrates how Steam can become implicated in terrorist campaigns.
The issue of far-right radicalisation on Steam presents a danger not only to the company’s reputation but also to broader public security. This Insight offers a nuanced understanding of far-right radicalisation through community groups on Steam and provides recommendations to address the issue.
Far-right radicalisation in the current gaming landscape
The debates surrounding the intersection of video games and extremism highlight three important aspects, particularly as they pertain to the far right: first, the way far-right groups use gaming platforms as traditional social networking websites to connect with like-minded users; second, the kinds of games being produced and supported by far-right organisations; and third, the themes of extremism, radicalisation, fundraising and recruitment employed by these groups.
One aspect of extremist exploitation of gaming platforms is “top-down” and “bottom-up” gamification. Top-down gamification refers to community groups created by extremist organisations that reward users for supporting and spreading digital propaganda, with members eventually earning in-group hierarchical awards. The bottom-up approach, by contrast, refers to the ‘organic’ subculture that develops among individual users who apply gaming language to real-life trolling or extremist actions. Steam hosts the largest and most diverse set of community groups related to far-right extremism.
In terms of the material available on these gaming platforms, there is evidence of far-right propaganda in large quantities, including books, videos, documents, manifestos, memes and more. On platforms beyond Steam, interviews with far-right figures, such as Andrew Anglin, are also available to users. In addition to this overt, strategic extremist activism, there is more casual material, including memes, xenophobic content, and overtly pro-Nazi and pro-Hitler content, often categorised as “shitposting.”
The gaming industry, as described earlier, has approximately 2.8 billion users, which presents a massive target audience for radicalisation, particularly the young majority, who are more susceptible to such extremist content. The anonymity of the online ecosystem, built on encrypted channels (such as links to Telegram groups) and hard-to-trace modes of communication (audio/video chat), allows far-right organisations to disseminate their propaganda openly for radicalisation and recruitment. Additionally, gaming subculture can be co-opted by far-right organisations to lure young gamers into their extremist worldviews.
Furthermore, research suggests that far-right organisations often use these platforms to build community and thereby help their members reinforce their radical views. These communities often organise events and matches for their members and for the wider gaming public, who may be exposed to radical propaganda through in-game voice chat or the other materials described above. These tournaments often impose exclusionary entry requirements, such as no women, no people of colour, and generally no minorities.
Dissecting Far-Right Entrenchment on Steam
Launched over 20 years ago, Steam has expanded its reach around the globe to become the most used gaming platform. In 2023, the platform boasted a peak of 33 million concurrent users and an annual revenue of USD 8.56bn. Steam also remains the platform on which far-right extremists are most entrenched and longest established, allowing them to use it for their propaganda. This is because Steam takes the most laissez-faire approach to content moderation of the top gaming platforms, compared with Discord, Twitch and DLive. The company’s content guidelines are very loose and do little to deter users who share extremist content. A reading of Steam’s content guidelines shows that, despite several independent third-party reports, the word “extremism” does not appear in them. Nor do the guidelines set out any measures by which moderators can ban, block or oversee in-game text or voice chats that support and propagate far-right content. Another notable feature of the guidelines is that they place the onus of reporting on users rather than on proactive content filters.
Drawing upon the data discussed above on Steam groups used by far-right extremists, it is evident that the platform has struggled to ban or remove content that promotes far-right ideologies. Several groups related to far-right organisations remain active and continue to propagate extremist material, including groups related to the Nordic Resistance Movement, the Misanthropic Division and the Atomwaffen Division. Moreover, there are community groups and boards that provide off-platform links and material through which users can reach far-right websites such as The Daily Stormer.
The fundamental policy lacuna Steam faces lies in its limited moderation capabilities and weak implementation of its policy guidelines. In community board discussions of why Steam has failed to remove so much far-right content, one user commented that they always report such content but receive no response from the platform. While Steam’s moderators can ban or block users for violating community guidelines, much of this moderation is user-flagged, and Steam rarely takes proactive measures. This reflects Steam’s desire to give users the ability to connect and generate content in pursuit of a better gaming experience. Another reason Steam has remained lax in countering such content and banning users is the fear of a backlash similar to “Gamergate.”
However, since June 2018, Steam has removed 179 games under updated content moderation policies, along with 50 instances of Nazi-related content at the request of the German government. Many of the removed games espoused far-right ideology and featured neo-Nazi content and imagery. Despite these measures, Valve has clarified that its removal of games and content was not because they were misogynistic and/or xenophobic, but because they incurred unknown costs to the corporation, its developer partners and its users. The majority of its content moderation and game removals targeted developers and studios that were not producing games in good faith. As described earlier, much of this moderation was user-flagged, and the platform largely defers to developers’ discretion in removing content.
However, the policy of allowing game developers to opt out of Steam’s moderation of their user boards and community groups reinforces the point that the platform’s policies enable extremist organisations to continue using it for digital propaganda and recruitment.
Policy Actions to Combat Far-right Radicalisation
While the issue of gaming and violent extremism is not limited to Steam, the overall policy landscape needs to catch up with the fast-changing environment of the gaming industry. To improve the overall security infrastructure of the gaming industry and mitigate the harmful effects of far-right content on users and those targeted by these groups, comprehensive updates are needed at all levels. Gaming platforms, civil society, independent regulators, and government agencies must all enhance their policies regarding the extremist use of gaming platforms by far-right organisations and their negative impacts. Understanding the necessary policy changes requires highlighting existing policy gaps and the need for further research.
To begin with, the limited research on the intersection of gaming and violent extremism prevents many gaming platforms from adopting stricter security guidelines and implementing robust moderation tools. As discussed in the literature and analysis, there is scant evidence that far-right organisations use Steam and the games available on the platform solely for the purpose of radicalisation or recruitment. Many of the games played by far-right groups on Steam are world-famous titles, including Counter-Strike: Global Offensive and Dota 2. These games, as identified in the literature, do not give far-right groups an opportunity to glorify their ideological symbols; they are played purely for enjoyment. Many far-right groups are ‘organic’ in nature, with users joining because they already espouse far-right ideologies rather than being radicalised through the platforms themselves. There is therefore a need for further research into the links between the primary motivations of far-right individuals on these gaming platforms and their ability to radicalise and recruit others.
While Steam is often dubbed the least moderated of the major gaming platforms, it could learn from its rivals in moderating extremist content, for example by commissioning open third-party regulatory reports on its guidelines and moderation activities. The following recommendations aim to uphold the best gaming experience for Steam users while mitigating the risk of the platform being used as a hub for far-right extremism:
- Rather than isolated, platform-specific policies to counter violent extremism, a consortium of the largest and most widely used platforms should collaborate on shared policy guidelines for countering far-right and other forms of violent extremism. Harmonised guidelines and moderation policies would limit the ability of far-right organisations to migrate to platforms with laxer policies.
- An independent regulatory partnership should be established with organisations studying online hate and harassment on gaming platforms. Through collaboration with independent bodies such as the Anti-Defamation League, biannual reports can be produced to track trends in hate and harassment. These reports can then inform the enhancement of moderation tools for specific games and community groups, or lead to the eventual banning of problematic groups from the platform.
- In-game voice chat remains the most elusive area of content moderation. Steam should introduce a permanent voice-chat moderation workforce, supported by AI-based tools that flag specific keywords such as “Nazi” and “Hitler” and report the community groups that use such terms most frequently. Based on these reports, the moderation team can assess offending users and, after several warnings, permanently ban them and remove the community group.
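To illustrate the keyword-flagging step in the recommendation above, the following is a minimal sketch, not Steam’s actual tooling: the keyword list, warning threshold, group identifiers, and function names are all hypothetical assumptions, and a production system would pair any such matching with speech-to-text, a vetted lexicon, and human review.

```python
from collections import Counter

# Hypothetical keyword list and threshold; a real moderation pipeline
# would use a curated, regularly updated lexicon plus human review,
# since raw string matching alone produces false positives.
FLAGGED_KEYWORDS = {"nazi", "hitler"}
WARNING_THRESHOLD = 3  # flagged hits before a group is escalated for review


def flag_messages(messages):
    """Count flagged-keyword hits per community group.

    `messages` is an iterable of (group_id, text) pairs, e.g. the
    output of a speech-to-text pass over in-game voice chat.
    """
    hits = Counter()
    for group_id, text in messages:
        tokens = (tok.strip(".,!?") for tok in text.lower().split())
        hits[group_id] += sum(tok in FLAGGED_KEYWORDS for tok in tokens)
    return hits


def groups_to_escalate(hits):
    """Return group IDs whose hit count meets the warning threshold."""
    return sorted(g for g, n in hits.items() if n >= WARNING_THRESHOLD)
```

In this sketch, flagged groups are escalated for moderator assessment rather than banned automatically, reflecting the recommendation’s sequence of reports, warnings, and only then removal.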
Shiraz Shaikh is an Assistant Research Associate at the Islamabad Policy Research Institute (IPRI). He holds an MPhil in International Relations from the National Defence University (NDU) Pakistan. His scholarly work spans terrorism, traditional and non-traditional approaches to security, extremism, and far-right and Islamic movements.
LinkedIn: linkedin.com/in/shiraz-shaikh-86653aa3 X: https://twitter.com/s_2772