
The UK Riots, Misinformation and Foreign Interference: A Smoking Gun or Something Else?
18th September 2024 Dr. William Allchorn

Introduction

The UK Government is reported to be examining the role of foreign interference in the wake of the country’s recent far-right riots. This Insight explores the recent history of foreign interference in protest movements and how it intersects with extremist misinformation online.

While some forms of misinformation fall outside of mainstream social media sites’ terms of service, this Insight argues that these ‘weak signals’ – the more banal and everyday forms of misinformation spread by extremist actors – provide evidence of how liberal democratic political culture is being eroded in a more iterative and piecemeal fashion, pointing to the need for greater examination of the narratives, news sources and effects of news sharing by extremist groups.

Context: The UK Riots & Misinformation

In early to mid-August 2024, protests by anti-Islam and anti-immigration groups spread across the UK in response to rumours circulating online that the murderer of three children in Southport was an “Islamist migrant”. In towns and cities across the country, riots ensued, with mosques and hotels housing migrants targeted, leading to violent clashes with police.

Beyond a coterie of online far-right influencers (such as Tommy Robinson, Andrew Tate and Nigel Farage) and tycoons allegedly promoting online disinformation, one aspect that has received relatively scant attention is the role of foreign states in promoting these false messages.

In particular, the role of a (previously Russian-owned) online news channel, Channel 3 Now, has come under scrutiny for circulating a false name for the 17-year-old charged over the Southport attack and for suggesting that the attacker was an asylum seeker who had arrived in the UK by boat last year.

This, combined with untrue claims from other sources that the attacker was a Muslim, has been widely blamed for contributing to riots across the UK.

Content Amplification over Generation: Foreign Interference & the UK Riots

Whilst the extent of evidence for foreign interference is minimal, such campaigns have received increasing political attention over the past ten years – with the FBI setting up its own Foreign Influence Task Force in 2017, the UK including a specific section outlawing foreign interference in its 2023 National Security Act, and European Commission President Ursula von der Leyen promising to set up “a European Democracy Shield” against hybrid warfare campaigns from Russia, China, and Iran. This has come in tandem with offline efforts to stop state-led infiltration of key institutions and cyber-attacks.

In the case of the UK riots, Russia was likely the most prevalent threat actor in amplifying hostile narratives online. Reports emerged early on that pro-Kremlin Telegram channels reshared and amplified Channel 3 Now’s false posts. In this sense, the Kremlin was parasitic rather than proactive – amplifying existing inflammatory content from genuine users rather than producing its own.

As shown in a recent piece by the UK’s Royal United Services Institute (RUSI), a review of the provocative terms associated with the protests and riots on X (#twotierkier, #twotierpolicing, ‘UK has fallen’ and so on) reveals telltale signs of Kremlin interference. As the RUSI piece notes, some profiles reposting content were established in 2022 – the year of Russia’s full-scale invasion of Ukraine – or in the run-up to the UK’s 2024 General and Local Elections. Some appear to have shifted from attacking the UK’s support for Ukraine to lambasting the Labour government and the alleged double standards of UK police, reposting content from figures like Yaxley-Lennon (Tommy Robinson), Elon Musk and provocative anonymous accounts on X and Telegram at a rate that would require a person to go without sleep. Most of the content was produced domestically; the Kremlin simply lent a ‘helpful megaphone’.

The Signal Amidst the Noise? Extremist News-Sharing Activities & Behaviours

Our research suggests that online extremist communities are no exception to the broader trend of news sharing on social media, with a barrage of embedded news stories populating the feeds of extremist groups, albeit with their own conspiratorial and accelerationist framing. This has profound implications for liberal democracy, especially given the propensity for extremist communities to spread misinformation online via ‘fake news’ that can become normalised and accepted. Whilst there are established studies on terroristic and extremist ‘outlinking’ for recruitment purposes, we do not know enough about the functions, activities, and sources of online news sharing by extremist groups cross-ideologically.

One study that has done a great deal to shed light on this is Törnberg and Nissen’s recent study of 17 anti-Islam group Facebook pages, which explored the purposes and communicative functions of far-right groups’ social media hyperlinking activities. They found that far-right groups predominantly link to mainstream media, far-right media, and far-right non-institutional groups. Moreover, they found that the main functions associated with extremist hyperlinking were 1) Promoting Political Issues, 2) Opponent Dismissal, 3) Collective Action Promotion and Organisation, and 4) Extremist Political Networking and Promotion. Another instructive study is Dowling’s study this year of Australian white nationalist far-right communities on Gab and Telegram. She found that news sharing in far-right online circles may legitimise and reify far-right ideology by juxtaposing it with mainstream news media that appears to validate far-right grievances, and she introduces a prototype model of news-sharing legitimisation that begins with the overt or tacit framing of a news item and ends with the mainstreaming and legitimation of ideology.

These results closely relate to two studies we have conducted over the last two years examining the online posting behaviour of vocal extremist groups across three ideologies (far-right, Islamist and eco-radical) on Facebook, X and Telegram. While the news-sharing patterns of the groups under analysis will be the focus of our upcoming follow-on project, in previous research we were able to observe common cross-ideological behaviours replicating the functions identified by Törnberg and Nissen above, as well as efforts to legitimise each group’s ideological claims through mainstream news media. We also noticed a variety of posts targeting specific countries and/or politicians as “enemies”, thereby fostering hostile public opinion towards them.

Conclusion and Recommendations

In conclusion, we would recommend that tech platform Trust and Safety teams sensitise themselves to these ‘weak signals’: the more banal and everyday forms of misinformation spread by extremist outfits that need greater attention and focus. While some forms of extremist misinformation fall outside of platforms’ terms of service, news sharing on social media facilitates a reconfiguration of how we perceive socio-political realities – one in which liberal democratic values, such as pluralism, equality, and tolerance, have no place. Social media news sharing in extremist communities may, therefore, give us a clue as to how liberal democratic political culture is being eroded in a more iterative and piecemeal fashion beyond the fringes, through the quality of the information and media diet being served to users. Work by organisations like NewsGuard has attempted to stem this tide, but greater action is needed.

Moreover, from a research standpoint, we do not know enough about the narratives, news sources and effects of news sharing by extremist groups. Whilst there are established studies on terroristic and extremist ‘outlinking’ for recruitment purposes (McDonald, 2022; McDonald et al., 2022), we do not know enough about the functions, activities, and sources of online news sharing by extremist groups cross-ideologically in the UK – with many other studies taking US and European groups as their point of reference.

We’re hoping to correct this. In a forthcoming project, we’ll be looking at the functions, activities, and sources of online news sharing by extremist groups, cross-ideologically, and getting under the hood of how and where such misinformation behaviours take place – all in an attempt to make sure that events like the 2024 UK riots, the 2023 Dublin riots and earlier anti-immigration riots (such as Chemnitz in 2018) do not happen again.