In January 2025, a man named James Wesley Burger used Roblox—a platform designed for creativity and youth engagement—to openly threaten ideologically motivated extremist violence. The case illustrates a disturbing trend: extremists appear to exploit Roblox to radicalise, recruit, and mobilise users, raising urgent questions about safety on one of the world’s most popular gaming platforms.
While often described as a game, Roblox is more accurately a platform—a creative ecosystem where developers and users design their own games. Regardless of terminology, it has become one of the most widely used gaming platforms, topping mobile charts and competing with Minecraft and Fortnite for desktop playtime.
Despite being an industry leader in trust and safety, investing considerable resources in platform safety at a time when many digital platforms are scaling back such efforts, Roblox is no longer just a safe creative space for its predominantly young users to play, express themselves, and develop. Recent years have shown that extremists view Roblox as an attractive platform within the broader gaming universe, likely in part due to its enormous popularity and design flexibility. That this preference has developed despite the company’s strong commitment to trust and safety highlights the complexity of building secure and safe social technology platforms.
This Insight discusses how gaming platforms have become attractive digital spaces for extremists, focusing specifically on Roblox and recent cases of extremist exploitation. This builds on a growing body of research showing that extremists across ideologies are building a presence on gaming and adjacent platforms such as Steam, Discord, and, to a lesser extent, Minecraft. While gaps in data remain, trends point to the use of these spaces for propaganda dissemination, recruitment, mobilisation, and potentially, fundraising.
In recent years, intelligence agencies have warned about extremists’ use of Roblox and reported that those being radicalised are growing younger and younger. Similar dynamics have been evident in my own work on how gaming and gaming-adjacent platforms are exploited by extremists, a theme that GIFCT’s pool of experts is currently scrutinising to give the public and gaming companies a better understanding of the challenge at hand.
Extremism on Roblox
One element that attracts extremists to gaming is the ability to create virtual worlds resembling their ideological utopias—or to reenact real-world violence, such as the Christchurch shooting or the Islamic State’s caliphate. On some platforms, extremists rely on game modifications (‘mods’) to inject ideology into gameplay; in open-design spaces like Roblox and Minecraft, however, they can build ideological experiences from the ground up. While such reenactments might be dismissed as silly incidents with little real-world impact, there are also more serious examples of extremists engaging youth in attempts to exploit, radicalise, and mobilise them for action. A few notable cases highlight this trend.
In January 2023, Singapore’s authorities discovered that a 16-year-old was using Roblox to engage with like-minded pro-Islamic State supporters. The boy was creating his own propaganda material featuring video sequences from Roblox – the ultimate example of gamification. A member of numerous Islamic State-themed servers featuring digital recreations of Islamic State governance in Syria and the Philippines, the Singaporean youth played the virtual role of spokesperson for the pro-Islamic State community and even pledged allegiance to an in-game caliph within the Roblox space. The community had, in effect, built a virtual alternative world, self-sufficient and to some extent disconnected from real-world events.
While Roblox did not comment on the Singaporean case, one of the company’s executives had addressed the growing extremist presence on the platform the year before. Remy Malan, Roblox’s VP of Trust and Safety, said at the time that the company “abhor[s] extremist ideologies and [has] zero tolerance for extremist content of any kind on Roblox. Because of the swift, proactive steps we take, extremist content is extremely rare on our platform and therefore, for the vast majority of the Roblox community who do not seek out such content, it is very unlikely they would be exposed to it…. We recognise that extremist groups are turning to a variety of tactics in an attempt to circumvent the rules on all platforms, and we are determined to stay one step ahead of them. We are deliberately agile in our efforts – we constantly strengthen the tools and filters we use to track down bad actors and expand the range of content blocked by our moderation systems.” However, the complexity of this task is highlighted by a number of additional cases.
Two documented cases from Germany highlight how Roblox can serve as an entry point in the radicalisation process—particularly for minors. In both cases, 12-year-old boys encountered far-right extremist content while playing World War II-themed Roblox games. Older players used Nazi symbolism in-game to build trust, eventually inviting the children to Discord servers filled with National Socialist propaganda, antisemitic rhetoric, and role-play mimicking Nazi hierarchies. While the ideological grooming escalated off-platform, the initial exposure and normalisation happened on Roblox—demonstrating how extremists use shared gameplay to lower psychological barriers and initiate radicalisation. These cases exemplify how extremists strategically exploit the social and thematic openness of Roblox, particularly its appeal to younger users, to seed ideological influence and guide vulnerable individuals into more tightly controlled off-platform communities.
The “Burger case” starkly illustrates how Roblox can serve not just as a venue for social interaction but as a primary channel for expressing, testing, and escalating extremist beliefs. In January 2025, U.S. authorities arrested Texas resident James Wesley Burger after another Roblox user reported his in-game threats of violence to the FBI. Burger used Roblox chat for explicit discussions about planning attacks modelled on Islamic State tactics. Roblox logs revealed that he described potential targets—including a Christian music festival—and voiced admiration for jihadist martyrdom, even referring to himself as a terrorist. According to the criminal complaint, his chats included statements such as “martyrdom or bust” and expressed intentions to cause terror. Investigators later confirmed his identity using IP and billing records from Roblox, and the company moved swiftly to assist law enforcement’s investigation before any real-world harm could occur.
It is not just Islamic extremists who appear to rely on Roblox as a medium to radicalise or exploit youth. As a number of ongoing lawsuits indicate, the nihilistic extremist network known as ‘764’ follows a similar strategy, using Roblox as a digital space to find young people to exploit. According to the Royal Canadian Mounted Police (RCMP), this applies not only to Roblox but also to a number of other gaming spaces, including Minecraft. Rapidly growing in both scale and geographical scope, 764 is infamous for targeting extremely young individuals, in part because its adherents are young themselves. For that reason alone, platforms like Roblox with a young user base are potentially attractive recruiting grounds for new followers.
Journalists have also uncovered an extensive presence of Proud Boys accounts on Roblox. In December 2023, the Australian Federal Police went as far as issuing an official warning to parents about how extremists use gaming platforms—highlighting Roblox in particular—to spread propaganda and radicalise. Previous GNET research has also shown that Roblox hosts content glorifying National Socialism and antisemitism. In addition, Roblox has been accused of facing a serious challenge with Child Sexual Abuse Material (CSAM).
Why Roblox?
Roblox isn’t the only platform facing these challenges, but it may be the most exposed—largely because of its design and user demographics.
With over 100 million daily active users (DAU), the platform stands out for how young its audience is: around 56% are under 16, and 20% are younger than 9. Further, Roblox’s appeal lies in its customisability. It allows users to build their own games, avatars, environments, and interactions, making it ideal for fostering creativity—but also ripe for abuse by actors looking to shape ideology within immersive digital spaces.
As mentioned, Roblox is not a single game but a platform that enables users to create their own games, or ‘experiences’, of which more than five million are currently active. This ‘creative space’ vision is exactly what makes Roblox stand out and what makes it attractive from both a developer and a user perspective. The platform is more dynamic and customisable than other gaming platforms, enabling extremists of various ideological orientations to develop and customise games, features, and cosmetics (skins, avatars, gear, and so on) that align with their ideological worldview.
As an engaging platform, Roblox offers a range of features – particularly in its design – that make it an appealing hub for extremists in the gaming space for the following reasons:
- It has a large number of users that extremists can attempt to recruit, socialise, radicalise, and mobilise;
- Its young user base offers a more vulnerable group, which some extremist networks prefer; and
- The dynamic and customisable offerings on the platform allow extremists to build virtual spheres resembling their ideological utopias, which strengthen socialisation processes and function as a facilitating factor in recruitment and radicalisation.
Roblox is attempting to fix the problem. But will it succeed?
The scale of Roblox’s challenge is captured by a few numbers. The platform has more than 100 million DAU, hosts millions of games, and processes more than “50,000 chat messages every second.” Every day, the Roblox community sends an average of 6.1 billion chat messages and generates 1.1 million hours of voice communication. This leaves the company in the complicated position of attempting effective trust and safety moderation at exceptional scale.
Like most other large social media and gaming platforms, Roblox struggles to enforce its community standards and terms of service effectively. Arguably, the greatest challenge lies in monitoring an enormous volume of text and voice, across languages and dialects, in real time. As the cases above show, even detecting communication and designs that violate internal policies proves difficult, calling for exceptionally clear internal definitions and classifiers of what constitutes extremist discourse and action. Further recommendations are delineated in the final section of this Insight.
By industry standards, however, Roblox appears committed to keeping its platform safe from user harm. It has easily accessible terms of use, including community standards that explicitly prohibit any “behavior that incites, condones, supports, glorifies, or promotes any terrorist or extremist organization or individual (foreign or domestic), and their ideology, or actions.” The terms also include a zero-tolerance policy on the exploitation of minors and prohibit online threats, bullying, and hate speech. They even extend to off-platform behaviour, which goes beyond the industry standard.
As part of its new obligations under the EU’s Digital Services Act, Roblox now issues a yearly Transparency Report detailing its ongoing efforts to make the platform a safer online space, covering both automated and human content moderation. The company claims that its “ML [machine learning] models detect policy-violating text, images, speech, audio, and 3D content comprehensively across Roblox; this content is reviewed by both our automated and manual systems. These models are trained on Roblox-specific language and abbreviations, and are able to consider the context of potential violations that would likely be missed by other, non-Roblox-specific, ML models.” While this sounds bold and reassuring in theory, the real-life examples above show that Roblox’s models capture some, but not all, community standard violations.
In late 2024, Roblox launched a significant overhaul of its safety systems and parental controls. Parents can now remotely manage settings, monitor screen time, and view friend lists. For users under 13, Roblox has restricted direct messaging and introduced a four-tier maturity model to better manage age-appropriate content. More than 30 safety-related features were introduced that year alone; the count has already passed 100 in 2025, including the introduction of ‘Sentinel’, a new AI system that helps detect early signals of child endangerment—a sign of the company’s commitment to doing better.
Conclusion and Recommendations
The low-prevalence but high-impact cases explored in this Insight underline the great complexity of dealing with trust and safety at large social and gaming platforms. With extremists evolving their methods and targeting increasingly younger users, safety measures must be proactive, adaptive, and deeply embedded into platform architecture. Roblox’s challenge is not only one of scale, but of design—the very features that make the platform appealing to young users and developers also create pathways for ideological exploitation. Addressing this duality will be key to safeguarding its future.
Having well-defined policies that are constantly updated is a foundational step. The real challenge lies in enforcing those policies against such a high volume of content and interactions. This necessarily involves automated systems that monitor and detect violations across all text and voice chat, preferably in real time. Such a system must be sophisticated enough to recognise the intricacies of extremist discourse and symbolism, incorporate an understanding of typical processes such as radicalisation and recruitment, and scan voice chat across numerous languages, dialects, and slang. That already amounts to an enormous challenge in technical capability and subject matter expertise, but it does not stop there: with extremism and extremist modes of online engagement constantly evolving, so too must trust and safety practices and capabilities.
The cases discussed in this Insight are not just about individuals; they expose the particular vulnerabilities of Roblox as a platform. The combination of real-time social features, young user demographics, and design flexibility creates an environment where extremist rhetoric can be introduced, normalised, and operationalised with alarming ease. As with the cases involving Islamic State sympathisers and extremist networks like ‘764’, Burger’s use of Roblox shows how the platform increasingly serves as both an ideological playground and a recruitment vector—not despite, but because of its creative openness and youthful appeal. Roblox’s trust and safety challenge, then, is not only about scale, but about preventing the very architecture of the platform from being co-opted for extremist ends.
–
Dr. Tore Refslund Hamming is Director of Refslund Analytics, a boutique consultancy specialising in trust and safety, online extremism, and counter-terrorism. For more than a decade, Dr. Hamming has worked on issues related to radicalisation and counter-terrorism as a researcher, consultant, and intelligence manager. He is a current member of GIFCT’s working group on Addressing Youth Radicalization and Mobilization and a member of the Christchurch Call Advisory Network.
–
Are you a tech company interested in strengthening your capacity to counter terrorist and violent extremist activity online? Apply for GIFCT membership to join over 30 other tech platforms working together to prevent terrorists and violent extremists from exploiting online platforms by leveraging technology, expertise, and cross-sector partnerships.