
Playing with Hate: How Far-Right Extremists Use Minecraft to Gamify Radicalisation
2nd July 2025 Gagandeep

This Insight will analyse how far-right extremist networks have turned Minecraft, one of the world’s most popular online games, into a tool for ideological grooming and radicalisation. By creating immersive, gamified spaces that embed hateful narratives into familiar gameplay mechanics, extremists are turning gaming culture into recruitment infrastructure. Gamified radicalisation, in this context, refers to extremists’ use of popular and widely accessible cultural activities, such as video games, to subtly introduce and reinforce their beliefs among users. This Insight focuses on Minecraft specifically because its unique blend of accessibility, community-driven content, and low moderation thresholds makes it particularly vulnerable to abuse by ideologically motivated actors.

Minecraft as a Radicalisation Gateway

With over 200 million monthly active players, Minecraft offers near-total creative freedom, including the ability to design maps, host private servers, and create mods. While most players use these features harmlessly, far-right actors have exploited this freedom to build propaganda-filled environments, memorialise violent events, and embed hate speech into virtual architecture. This abuse is not limited to swastikas; some builds include recreated mass shooting sites or ‘shooter training’ simulators. Others are subtler, recreating architecture from fascist eras, hiding coded messages, or scripting characters that repeat extremist talking points.

Figure 1: Swastika in a ‘mock’ Holocaust world.

In one video widely circulated among far-right extremist Telegram channels, Minecraft avatars are depicted marching under the Black Sun (Sonnenrad), a symbol commonly associated with neo-Nazi groups, beneath the caption “Defend Your Land”. The screenshot below (see Figure 2) was taken from the original 33-second video circulating on a neo-Nazi Telegram channel.

Figure 2: Black Sun symbol embedded in a Minecraft build, often used by white supremacist and neo-Nazi groups

The video features edited Minecraft gameplay with messages such as “Our Patience has Limits,” “Pillagers Stay Winning,” and “Save Minecraft: Exterminate the Invaders,” phrases that echo common extreme-right rhetoric framing immigration or diversity as invasions and promoting narratives of cultural or racial conflict. These phrases, paired with familiar in-game aesthetics, reframe xenophobic or ethno-nationalist rhetoric as a call to arms inside the game world. This blend of visual propaganda and gamification makes Minecraft a gateway, not just because of what players see, but how they are invited to act within those ideological frames.

Figure 3: Screenshot of extremist Telegram Channel.

What makes Minecraft unique is its potential to blur the lines between play and ideology. A player may download a custom map simply because it is popular or edgy, but once inside, they’re exposed to carefully curated narratives. Extremist world-building often mirrors real-life radicalisation strategies, normalising white supremacist ideologies and subtly reinforcing them through repetition, humour, or aesthetic appeal.

Known Extremist Activity in Minecraft

While direct attribution remains difficult due to the short-lived nature of servers, researchers and OSINT communities have documented the circulation of Minecraft-themed extremist content within far-right Telegram groups and fringe forums. These include recreations of ideological texts, violence-themed missions, and meme-based propaganda. These channels often glorify acts of mass violence, including recreations of incidents like the Christchurch mosque shooting, complete with manifestos and missions. In this context, ‘missions’ refer to in-game tasks or objectives designed to simulate extremist violence, often mirroring real-world attacks or ideological goals. While it is difficult to attribute these maps to specific organisations, there is strong thematic overlap with groups like ZoomerWaffen, Atomwaffen, Iron March, National Socialist Order, and FRENs Chan (Far-Right Ethno Nationalist).

One example includes a widely circulated server that used Minecraft’s redstone, the game’s version of electrical wiring, to simulate ‘detonations’ at symbolic targets. Redstone can be used to trigger explosions, open doors, or create automated sequences, making it a powerful tool for storytelling or, in this case, disturbing ideological messaging. In another instance, players could explore a digital town populated with in-game books quoting The Turner Diaries and Siege, key texts in the white supremacist accelerationist world.

A related real-world case further underscores this trend: in 2022, a Russian teenager was sentenced on terrorism-related charges after authorities claimed he had plotted to blow up a virtual FSB building in Minecraft, a case widely criticised but nonetheless indicative of the symbolic power such digital spaces can carry.

Although many of these servers are short-lived, quickly banned or shut down, their blueprints are often shared and re-shared across communities, keeping the content alive and spreading. This decentralisation also makes moderation extremely difficult: once a file exists, it can be hosted and downloaded anywhere.

Digital Culture and Recruitment

Gamified radicalisation does not always look like explicit calls to violence. Instead, it can unfold through humour, aesthetics, and community participation. Minecraft appeals to younger audiences, many of whom are still forming their political identities. The platform’s openness to user-generated content creates an environment ripe for manipulation and ideological indoctrination.

Figure 4: Screenshot of a Minecraft player’s X post.

Far-right groups have learned to speak the cultural language of these gaming communities. They don’t just preach—they meme, mod, and build. These tactics lower the threshold for ideological adoption. A player does not need to agree with white nationalist ideology to find a custom Nazi zombie map entertaining, but repeated exposure to similar content can prime them to accept increasingly radical views. It is propaganda by immersion. 

A recent GNET report shows that extremist actors have been embedding real-world attack scenarios and ideological objectives into game mods and missions for over 30 years, demonstrating how sustained gameplay interactions can reinforce radical beliefs.

In one documented case, a Discord server linked to a now-banned Minecraft world was also used to distribute literature, organise voice chats on accelerationist themes, and vet users for access to more secure spaces. Discord removed the server under its terms of service once its extremist content was identified. More recently, following the January 2021 US Capitol attack, Discord deleted the pro-Trump server “The Donald” for its overt connection to the incitement and planning of violence, illustrating how the platform continues to disrupt extremist networks. While not every participant in the Minecraft-linked server may have been radicalised, the community acted as a low-risk gateway to more explicit ideological spaces.

Figure 5: Nazi War Eagle in Minecraft world.

Why Minecraft?

Minecraft’s design permits certain types of user-generated content more freely than many other games, creating vulnerabilities that extremists can exploit:

  • Modularity: While Minecraft’s Bedrock edition includes automated checks, the Java edition has a more open ecosystem, allowing users to share custom worlds, mods, and mechanics via third-party launchers or independent servers, often with minimal moderation for extremist material.

  • Anonymity: In-game chat and private servers offer the same VPN-friendly, pseudonymous environment as other multiplayer titles, but because Minecraft servers are often privately hosted, extremists can discuss and organise under the cover of anonymity with minimal risk of detection.

  • Decentralisation: Whereas many games confine user-generated content (UGC), i.e. maps, mods, skins, and other assets, within official storefronts, Minecraft’s controversial builds almost always circulate off-platform on Discord, Telegram, or forums, making them harder to trace and moderate effectively.

Unlike more tightly controlled platforms like Roblox, Minecraft has a looser content policy, particularly when it comes to third-party content. Many extremist Minecraft mods and maps are not discovered until well after they’ve been downloaded and played.

From Pixels to Propaganda: A Broader Trend

The use of Minecraft is part of a larger trend: the gamification of hate. Far-right actors have adapted elements of meme culture, select video games, and niche digital subcultures to create ideological echo chambers rather than broad “breeding grounds”. They don’t just preach; they orchestrate interactive events, embed coded messages in custom maps, and host community challenges that mirror extremist narratives. Minecraft is particularly potent because it offers an interactive narrative: a world one can live in, build in, and share with others.

This isn’t isolated. Games like Grand Theft Auto, Roblox, and even Garry’s Mod have seen similar ideological abuse, but Minecraft’s popularity with younger users and its ease of customisation make it an especially risky case. As of 2024, 43% of Minecraft’s player base is aged 15–21, compared to 58% of Roblox users who are under 16, while Grand Theft Auto carries a Mature (17+) rating that formally restricts its core audience to older players. Moreover, Minecraft supports over 257,000 community-created mods and worlds across platforms like CurseForge, whereas Roblox’s Studio tools require in-platform moderation, and GTA’s modding relies on external tools and is unsupported in its online modes.

Recommendations for Tech Platforms

To prevent the spread of extremist content within games like Minecraft, several steps can be taken:

  1. Enhance Community Reporting Mechanisms: Mojang Studios, the Microsoft-owned developer of Minecraft, should invest in making in-game reporting of mods, maps, and servers more accessible and transparent.
  2. Automated Content Scanning for Downloads: Minecraft launchers and mod hosting sites can implement AI-driven scanning tools to detect hate symbols, keywords, or references to extremist events and ideology in downloadable content (a minimal sketch of such a scanner follows this list).
  3. Platform Partnerships with Researchers: Game companies should build formal collaborations with extremism researchers and OSINT communities to stay ahead of abuse trends and share findings securely.
  4. Stronger Moderation on Linked Platforms: Much of the distribution of radical Minecraft content happens off-platform, especially on Discord and Telegram. Companies must adopt a cross-platform view of radicalisation pipelines and disrupt entry points accordingly.
  5. Digital Literacy for Parents and Youth: Tech platforms can help fund and promote educational campaigns that explain how gamified propaganda works. Awareness is a critical first step in prevention.
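To illustrate the second recommendation, the following is a minimal sketch, in Python, of a keyword-based scan over the text files bundled inside a downloadable world or mod archive. The flagged-term list, file extensions, and the scan_mod_archive helper are illustrative assumptions rather than any platform’s actual tooling; a production system would pair such checks with image and hash matching for known hate symbols, context-aware scoring, and human review of every automated hit.

```python
import json
import re
import zipfile
from pathlib import Path

# Hypothetical, illustrative term list; a real deployment would source terms
# and symbol hashes from vetted datasets and update them continuously.
FLAGGED_TERMS = [
    r"sonnenrad",
    r"black\s+sun",
    r"exterminate\s+the\s+invaders",
    r"turner\s+diaries",
    r"siege",  # high false-positive risk; weight or review accordingly
]
PATTERN = re.compile("|".join(FLAGGED_TERMS), re.IGNORECASE)

# Text-like files commonly found inside Minecraft world/mod archives.
TEXT_SUFFIXES = {".json", ".txt", ".mcfunction", ".properties", ".yml", ".toml"}


def scan_mod_archive(archive_path: str) -> list[dict]:
    """Scan text entries inside a mod/map archive and return flagged matches.

    This is a sketch only: it ignores binary formats (images, NBT world data)
    and makes no judgement about context, so every hit needs human review.
    """
    findings = []
    with zipfile.ZipFile(archive_path) as archive:
        for name in archive.namelist():
            if Path(name).suffix.lower() not in TEXT_SUFFIXES:
                continue
            text = archive.read(name).decode("utf-8", errors="ignore")
            for match in PATTERN.finditer(text):
                findings.append({
                    "file": name,
                    "term": match.group(0),
                    "offset": match.start(),
                })
    return findings


if __name__ == "__main__":
    # Hypothetical archive name, purely for illustration.
    report = scan_mod_archive("example_world.zip")
    print(json.dumps(report, indent=2))
```

Keyword matching alone produces false positives (for instance, ‘siege’ in an ordinary castle map), which is one reason the ethical considerations below, particularly transparency and appeals, matter as much as detection itself.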

Ethical and Privacy Considerations

Monitoring private servers and custom content poses privacy challenges. Not all edgy content is extremist in nature, and false positives could undermine trust in moderation systems. Any intervention must balance user privacy with the need to address abuse. Transparency about how decisions are made and avenues for appeals will be key.

Looking Ahead

The radicalisation landscape is becoming more immersive, more decentralised, and more culturally embedded. While sandbox titles like Minecraft often provide purely creative and social experiences, extremist actors have learned to conceal ideological messaging within custom maps, community challenges, and private servers. This allows them to sidestep conventional moderation, targeting interested players through cooperative gameplay or invite-only events, without alerting the broader community. By embedding propaganda into optional content rather than the core experience, these networks cultivate small, dedicated circles of participants who, over time, may be exposed to ever-more explicit ideology.

As games become increasingly social and moddable, the tech sector must treat them not just as products but as complex ecosystems. Moderation needs to expand beyond banning keywords or symbols. It requires understanding how ideologies evolve, how communities behave, and how cultural content can be co-opted for radical ends.

Minecraft is not the cause of radicalisation, but it is part of the delivery mechanism. As extremists continue adapting, tech platforms must learn to play defence as strategically as these actors play offence.

Gagandeep holds a Bachelor of Arts from Panjab University and will begin a Master’s in Terrorism and International Security at the University of Nottingham this fall. He previously interned at the United Service Institution of India, New Delhi. His research interests include the role of technology in conflict, hybrid warfare, and the crime-terror nexus.
