Critical World Events: Extremist Misinformation Networks and Narratives in Times of Crisis

3rd June 2025 – Dr. William Allchorn

Introduction

Global crises, such as COVID-19, produce societal instability and can create fertile ground for radicalisation. Extremist actors (both groups and individuals) exploit such crises by distributing misinformation to amplify uncertainty and distrust. In recent years, the consequences of these developments became apparent when demonstrations, stoked by various online conspiracies and misinformation campaigns, devolved into the storming of government buildings in Germany, the US and Brazil. Alongside this, online extremism – defined as online activism cleaving to an extremist ideology – has become one of the most pressing threats to public security in the UK, with online misinformation spread by extremist actors leading to widespread offline riots in the UK in the summer of 2024. While new online safety regulations have recently been introduced in the UK, more research into the intersection between extremist propaganda and misinformation on social media has become an urgent public safety priority. Most notably, the functions, activities, and sources of online misinformation spread by extremist groups across ideologies in the UK, and how these may contribute to the emergence of such insurrectionary activity, remain poorly understood.

This Insight attempts to fill these gaps in research on online extremism by exploring the crossover between extremist posting, hashtags and misinformation. More precisely, it highlights the findings of a project (conducted by the authors and funded by the UK Home Office) that investigates the narrative appeals, hashtags, and recruitment dynamics of far-right and Islamist extremist misinformation networks (i.e. loose coalitions of actors exhibiting a cogent extremist ideology) on the social media platforms X, TikTok and Odysee during the 2024 UK riots and other critical world events, such as the Russo-Ukraine War and the Israel-Gaza War. Specifically, we asked: how do extremist actors exploit ‘cognitive openings’ created by critical world events to radicalise and recruit non-aligned audiences online?

Context: Extremist Responses to Recent Critical World Events

Russia’s war of aggression in Ukraine, the Israel-Gaza War and the 2024 UK riots have exacerbated existing societal tensions, deepened distrust in democratic, scientific, and media institutions, and set in motion a radicalisation trend that can now be observed in wider parts of the UK population. Extremist groups, both Islamist and far-right, have strategically exploited the grievances generated by these crises to advance their agendas, recruit members, and destabilise societies. In doing so, far-right and Islamist extremist actors have adeptly employed misinformation, exploited cognitive openings, crafted persuasive narratives, and leveraged social media to attract and radicalise individuals.

This is evident in Islamist and far-right responses to the Israel-Gaza War, the Russo-Ukraine War, and the 2024 UK riots – the three crisis events chosen for this study. For example, Islamist extremist groups (such as Hamas and Palestinian Islamic Jihad [PIJ]) have historically used the Gaza conflict to justify violent actions against Israel, portraying themselves as defenders of the Palestinian cause. Conversely, far-right groups have exploited the conflict to fuel anti-Muslim sentiment. In the UK, the Israel-Gaza War led to a record increase in anti-Muslim incidents last year, with far-right and anti-immigration groups instigating riots based on misinformation about Islamist migrants. Moreover, some far-right groups have used the Russo-Ukraine War to promote ultranationalist ideologies, with some individuals travelling to the region to support factions aligned with their beliefs. Far-right extremists have likewise capitalised on the 2024 UK riots to propagate anti-immigrant and nationalist sentiments, blaming minority groups for societal issues and recruiting disaffected individuals. In short, extremist groups adeptly exploit critical world events to further their agendas, using misinformation and the cognitive openings that surround conflicts to attract and radicalise individuals. Understanding these tactics is crucial for developing effective counter-extremism strategies and promoting societal resilience against radicalisation.

Our Research

Between January and April 2025, we scraped over 2,500 public-facing social media posts from leading UK-based non-violent Islamist and far-right organisations on X, TikTok and Odysee. We analysed these posts in three ways: using a misinformation taxonomy created by Claire Wardle to classify misinformation types; using AI-assisted Natural Language Processing (NLP) to inductively generate key topics of conversation related to the three critical world events and to detect mismatches between the contents and titles of posts; and using a dictionary of recruitment terms from previous projects to deductively analyse entreaties to activism across the 2022-24 period analysed. We complemented this quantitative analysis of the whole corpus with a qualitative content analysis of a sample of 60 posts from each of four accounts: two Islamist extremist accounts (one on X and one on TikTok) and two far-right accounts on Odysee (one owned by the group and the other by its leader).
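To make the deductive, dictionary-based step concrete, the sketch below shows how posts could be coded against a recruitment-term dictionary. It is illustrative only, not the project’s actual code: the term lists reuse the example terms quoted in the findings below, and the input file posts.csv is a hypothetical stand-in for the scraped corpus.

```python
# Illustrative sketch of the deductive, dictionary-based coding step.
# The term lists and the input file are hypothetical stand-ins for the
# project's actual recruitment dictionary and scraped corpus.
import csv
import re
from collections import Counter

# Hypothetical recruitment dictionary, split into the "hard" (direct calls
# to action) and "soft" (social/communal entreaties) sub-types discussed
# in the findings below.
RECRUITMENT_TERMS = {
    "hard": ["protest", "fight", "attack", "boycott"],
    "soft": ["art", "film", "social", "eat"],
}

def code_post(text: str) -> Counter:
    """Count hard and soft recruitment terms in one post."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for subtype, terms in RECRUITMENT_TERMS.items():
        counts[subtype] = sum(tokens.count(term) for term in terms)
    return counts

# Aggregate counts over the whole (hypothetical) corpus.
totals = Counter()
with open("posts.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        totals += code_post(row["text"])

print(totals)
```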

Findings

What our research found was striking:

  • Topics – Overall, and perhaps unsurprisingly, the most posted-about topic for the Islamist portion of the dataset (2022-24) was the Israel-Gaza conflict, while the Russo-Ukraine war and the UK riots were the most posted-about topics for the far-right portion. In relation to specific narratives, Islamist groups chose to connect the Israel-Gaza conflict to a ‘permanent war’ conducted by ‘white supremacist’ Israel and the West on the Muslim Ummah that has been ‘livestreamed’ around the world. In contrast, far-right groups adopted pro-Russian narratives on the Russo-Ukraine war, anti-Zionist narratives on the Israel-Gaza conflict, and anti-government narratives on the UK riots.

Figure 1: Islamist Most Frequent Words
(Top: Twitter/X & Bottom: TikTok)

Figure 2: Far-Right Odysee Topic Word Scores
(Top: Leader & Bottom: Group)

  • Hashtags – Overall, the far-right posts surveyed tended to be more pugnacious, deploying racist and prejudicial tropes in their hashtags and tying them explicitly to their ideology, whereas the Islamist accounts adopted more mainstream hashtags concerning the critical world events surveyed. For the far right, hashtags were used not so much to blend in (as on the Islamist side of the dataset) but to stand out.

Figure 3: Islamist Most Frequent Hashtags
(Top: Twitter/X & Bottom: TikTok)

Figure 4: Far-Right Most Frequent Hashtags
(Top: Leader & Bottom: Group)

  • Recruitment – An AI-based recruitment classifier model was run on the data. Overall, the classifier revealed that the Islamist groups were the most prolific when it came to recruitment during the three critical world events surveyed, with almost ten times the number of recruitment terms overall compared to the far-right group surveyed. In particular, we see spikes in recruitment terms in March 2023 (TikTok), March 2024 (X) and October 2024 (TikTok and X), around the anniversary of the Israel-Gaza War but also at key points of protest activity by pro-Palestinian groups in the UK.

 

Figure 5: Islamist Recruitment Term Frequencies (Top: Twitter/X & Bottom: TikTok)

On the far-right side of the dataset, we found a spike in recruitment terms in the autumn of each year, coinciding with the group’s annual conference and other offline activities (e.g. banner drops, leafletting and flyering). Moreover, the modal month-by-month category across the three sub-types of recruitment tended to be soft recruitment for the far-right groups and hard recruitment for the Islamist groups, with the highest absolute number of posts using more direct recruitment terms (like “protest”, “fight”, “attack”, “boycott”) on the Islamist side and more indirect recruitment terms (like “art”, “film”, “social”, “eat”) on the far-right side; a minimal sketch of this month-by-month aggregation follows Figure 6. This coincides with a larger shift within the wider UK far-right extremist milieu – using social events, camps, gaming and movie nights as the first step towards creating a ‘whites only’ community.

Figure 6: Far-Right Recruitment Term Frequencies (Top: Group & Bottom: Leader)
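As flagged above, here is a minimal sketch of how the month-by-month recruitment analysis could be reproduced. It assumes a hypothetical coded_posts.csv with one row per post and per-post counts of hard and soft recruitment terms (for instance, produced by the dictionary sketch above); the column names and the spike threshold are illustrative assumptions, not the project’s actual pipeline.

```python
# Illustrative sketch of the month-by-month recruitment aggregation
# summarised in Figures 5 and 6. The file and column names ("date",
# "hard", "soft") are hypothetical: assume one row per post carrying the
# per-post term counts produced by a coding step like the one sketched above.
import pandas as pd

posts = pd.read_csv("coded_posts.csv", parse_dates=["date"])

# Sum hard vs soft recruitment-term counts per calendar month.
monthly = posts.set_index("date")[["hard", "soft"]].resample("MS").sum()

# Modal sub-type per month: whichever sub-type has the larger total.
monthly["modal_subtype"] = monthly[["hard", "soft"]].idxmax(axis=1)

# Flag spike months: totals more than two standard deviations above the
# mean (an arbitrary illustrative threshold, not the project's criterion).
total = monthly["hard"] + monthly["soft"]
monthly["spike"] = total > total.mean() + 2 * total.std()

print(monthly)
```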

  • Misinformation – Overall, the study showed a high degree of mismatch between post titles and the content of posts (an instance of ‘false connection’ in Wardle’s taxonomy), with some exceptions within the Islamist side of the dataset.

Figure 7: Islamist Mismatch vs Match Counts (Top: Twitter/X & Bottom: TikTok)

Figure 8: Far-Right Mismatch Counts (Top: Group & Bottom: Leader)

Misleading Content (when genuine information or imagery is manipulated to deceive), False Context (when genuine content is shared with false contextual information) and Propaganda (when content is used to manage attitudes, values and knowledge) were found at similar levels of prevalence across both ideological groups.
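The Insight does not detail how the title/content mismatches were detected, so the sketch below shows one plausible approach under stated assumptions: embed each title and body with a general-purpose sentence-embedding model and flag pairs whose cosine similarity falls below a cut-off. The model choice and the 0.35 threshold are illustrative assumptions, not the project’s method.

```python
# Illustrative sketch of title/content mismatch ("false connection")
# detection. The embedding model and the similarity threshold are
# assumptions; the Insight does not specify the method actually used.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")
THRESHOLD = 0.35  # arbitrary illustrative cut-off

def is_mismatch(title: str, body: str) -> bool:
    """Flag a post whose title and body are semantically unrelated."""
    embeddings = model.encode([title, body], convert_to_tensor=True)
    similarity = util.cos_sim(embeddings[0], embeddings[1]).item()
    return similarity < THRESHOLD

# Hypothetical example post.
print(is_mismatch(
    "Community bake sale this weekend",
    "They are hiding the truth about who was behind the attack.",
))
```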

Conclusions

Results from this project shed light on online, networked extremist recruitment and how it intersects with misinformation behaviour and language. The project has explored the similarities and differences in the most common hashtags, recruitment appeals, and tactics used by such networks to spread misinformation on mainstream social media. On the whole, far-right networks commented on more varied topics, sought to stand out, and showed more innovation in the misinformation tactics they used, while Islamist extremists tried to blend in. Moreover, the modal month-by-month category across the three sub-types of recruitment tended to be soft recruitment for the far-right groups and hard recruitment for the Islamist groups – suggesting a key shift in the way that far-right groups conducted their recruitment activities, towards non-ideologically branded activities as a form of recruitment. These results suggest that the key threat vectors of posting behaviour lie in both the normalisation and radicalisation of content in the online space, and in the overwhelming use of misinformation within extremist online networks to sway public conversations in a particular ideological or conspiratorial direction. With the added factor of foreign state actor interference, this is a very worrying trend indeed.

Recommendations

Based on the findings of this research, responses to extremist misinformation and the strategic exploitation of critical events need to evolve across several key sectors:

1. Disrupting Extremist Misinformation Networks

  • Create a UK Digital Threat Observatory: Modelled on the EU’s EUvsDisinfo or the U.S. State Department’s Global Engagement Center, this body would monitor and rapidly respond to extremist-driven misinformation, especially during geopolitical crises. It would coordinate across government, academia, and civil society.
  • Mandate cross-platform misinformation monitoring: Platforms should be required by Ofcom to report coordinated manipulation attempts related to extremist actors – particularly when false narratives trend rapidly across X, Telegram, TikTok, and YouTube.
  • Community misinformation rapid response units: Fund trusted community groups to run WhatsApp and Facebook groups that identify and challenge misinformation in real time, especially in minoritised or youth-heavy online spaces.

2. Countering and Replacing Extremist Narratives

  • Fund scalable narrative inoculation strategies: Building on psychological “inoculation theory,” support initiatives like the University of Cambridge’s Harmony Square or Bad News Game that teach users to spot manipulation by simulating how extremist narratives are constructed.
  • Invest in community-led counter-narrative campaigns: Fund grassroots organisations (e.g. Maslaha, Tell MAMA, Exit UK, Hope Not Hate) to create authentic narratives that reflect local voices, cultural identity, and lived experience – especially among young people from vulnerable demographics.
  • Platform-level elevation of credible content: Require platforms to elevate and recommend verified counter-narrative content when extremist hashtags trend or incidents go viral (e.g. during the 2024 UK riots or Gaza escalations).

3. Preventing and Interrupting Online Extremist Recruitment

  • Real-time recruitment disruption via content redirection: Build on the Redirect Method (used by Moonshot CVE) to identify individuals searching for extremist content and guide them to positive alternatives or exit pathways (e.g. helplines or community mentors).
  • Mandate risk detection integration in platform UX: Require platforms to flag when users exhibit known behavioural patterns associated with radicalisation (e.g. joining multiple hate groups, binge-watching hate content) and offer resources or interrupts (e.g. “Are you okay?” prompts).
  • Enhance funding for deradicalisation and disengagement NGOs: Support organisations like Connected Futures and Exit UK to intervene earlier by using digital monitoring tools and referrals from schools, youth services, or peers.

Dr. William Allchorn is a Senior Research Fellow at the International Policing and Public Protection Research Institute (IPPPRI), Anglia Ruskin University, and an expert on radical-right extremist social movements in the UK, Europe, and beyond. He has recently advised the UK, US and Australian governments on their approaches to countering the extreme right wing, and has undertaken some of the first metrical testing of violent far-right extremist counter-narratives in the UK, US and Australia with governments (e.g. EU Internet Forum), civil society organisations (e.g. Counter-Extremism Project), and tech companies (e.g. Meta) across the world.

Dr. Elisa Orofino is a Senior Research Fellow and Academic Lead for the Extremism and Counter-Terrorism Sub-Theme at the International Policing and Public Protection Research Institute (IPPPRI), Anglia Ruskin University, and an expert on Islamist extremist social movements in the UK, Europe, and beyond.

Dr. Lakshmi Babu Saheer is a Senior Lecturer in Computing and Director of the Applied AI Research Group at Anglia Ruskin University. Her work focuses on machine learning, deep learning and artificial intelligence, mainly in the domains of natural language processing, climate change, net zero, air quality, vegetation and terrain identification, sustainable food supply chains, audio, speech, the Internet of Things, and biomedical and healthcare applications. She has more than 20 years of applied industrial and academic research experience at prestigious international organisations.