Conspiracy Extremism and Digital Complexity – Where to Start?
5th October 2020 Elise Thomas

As the global COVID-19 pandemic rolls on, conspiracy theories are fuelling protests, altercations between citizens and law enforcement, property damage and violent attacks around the world. As tensions continue to rise, and as conspiracy communities continue to spread and radicalise, the risk of violent extremist acts fuelled by conspiratorial beliefs is also increasing.

There is an urgent need for policymakers, law enforcement, traditional media and social media platforms to develop greater awareness and understanding of conspiracy extremism and to grapple with the difficult question of how to respond.

The link between extremist acts and conspiratorial beliefs is, of course, not new. Extremist ideologies throughout history have incorporated conspiratorial elements – for example, the longstanding and still prominent anti-Semitic conspiracy theory of a secret Jewish world order.

However, the rise of digital technology and the modern information ecosystem has fundamentally altered the nature of conspiracy movements and, therefore, of conspiracy-linked extremism. Conspiracies spread differently now; communities form differently now; individuals radicalise, plan and commit acts of extremism differently now. Understanding how these dynamics operate is essential to mitigating and responding to the risks they pose.

This is, of course, much more easily said than done. The chaotic and frequently incoherent nature of conspiracy narratives, the wide variations between different conspiracies and the groups and communities which form around them, and the speed with which those narratives and communities can shapeshift make it difficult to analyse conspiracies via the same frameworks applied to other forms of extremism.

With some exceptions, these are not organised groups. There is no clear dividing line between who is a ‘member’ and who is not. It’s not even clear how they should be described – for example, is QAnon a cult? A movement? A meme, a hobby, an ideology, a belief system, a political faction? It has elements of all of these but fits none of them exactly. There are influential figures but no real leader or organising structure. The short-term tactics and targets shift on a weekly, sometimes daily basis; there is no clear long-term strategy or goal.

This difficulty has been exacerbated by the massive spike in conspiracy-related online activity since mid-March 2020. This has both expanded the reach of established conspiracy communities and disrupted them.

This disruption was evident in, for example, the ‘Save the Children’ protests held in the US in late August 2020. Although attendees waved QAnon signs and symbols, the protests were disavowed by many figures who had been influential in the QAnon movement prior to the COVID-19 pandemic. They claimed the events were a “false flag” (figuratively, but probably also literally, given the many QAnon flags waved at the rallies). Their objections appear to have mattered little, however. This highlights how fluid the dynamics of conspiracy movements are and how swiftly power and influence shift; it also underscores the decentralised and leaderless nature of these movements.

The narratives and ideologies underpinning conspiracy theories can shift just as unpredictably. In late June 2020, a single influential QAnon conspiracist sowed the seeds for a conspiracy narrative connecting the online homeware store Wayfair with human trafficking. The narrative incubated on Reddit’s r/conspiracy board, before spilling out onto mainstream social media platforms with tens of thousands of posts and tweets, some of which received over a hundred thousand engagements. One Instagram account dedicated to the conspiracy accrued at least 21,000 followers in a matter of days. Wayfair’s belated efforts to tamp down the claims over a week later did nothing but add fuel to the fire.

This completely unfounded conspiracy narrative led to widespread reputational damage on social media; harassment and personal targeting of Wayfair employees including CEO Niraj Shah; and the clogging up of hotlines, which hampered investigations into real child trafficking.

Fortunately, no violent acts have (yet) occurred in connection with this particular conspiracy narrative, but the potential is clearly there. The Wayfair episode has distinct parallels with the Pizzagate shooting in 2016, in which a gunman stormed a pizza restaurant in the belief that trafficked children were being held in its (non-existent) basement. The speed and baffling incoherence with which conspiratorial movements identify and go after targets – who may not even have been aware the conspiracy existed until they became its victims – present a real challenge for countering the risk of violent acts.

Attempting a rigid analysis of conspiracy extremism is like trying to trace a steady path through quicksand. Rather than sinking in the mire, we need a clear-eyed focus on what is knowable in the short-term, and what lessons we can adopt from other forms of digital extremist activity.

Firstly, for the foreseeable future, the greatest risk is likely to come from lone actors who have either self-radicalised or been mutually radicalised – that is, been active members of increasingly radicalised conspiracy communities whose members support and encourage one another in escalating their beliefs and actions. This is the pattern we have seen in conspiracy-linked violent incidents to date, such as the conspiracy-fuelled Hanau shooting, the derailing of a train in Los Angeles, and the recent QAnon-linked assault on the Canadian Prime Minister’s residence.

This may change at some point in the future – other digital-first movements such as the Boogaloo have spun off small cells of would-be terrorists, and there are examples of QAnon supporters colluding to commit other crimes, such as kidnapping. At this stage, however, a lone-actor attack remains the most probable form of conspiracy-related violent extremism, and that understanding should guide efforts to respond to and mitigate potential threats.

Secondly, when it comes to the risk of extremist violence, not all conspiracies are created equal. A conspiracy theory like QAnon, for example, which is fundamentally political and frequently demonises and dehumanises real-world targets, is more likely to lead to violence than, say, flat Earth or Bigfoot conspiracy theories.

Even within conspiracies, there are variations among different sub-communities. For example, while the possibility cannot of course be ruled out, it seems reasonable to assume that ‘Pastel QAnon’ poses less of an extremist violence risk than QAnon communities on 8kun (the successor to the 8chan imageboard, which has already been linked to multiple acts of extremist violence). Developing an understanding of which sub-communities of conspiracy movements are likely to pose a more significant threat could help inform decisions about how to prioritise resources and monitoring.

Conspiracy theories are a complex, multi-layered social problem that will require an equally multi-layered response from policymakers, social media platforms, media and civil society. Addressing the risks of violent extremism is a small but crucial part of that larger picture. While the turbulent, amorphous and rapidly shifting nature of many conspiracies poses a significant challenge for analysis, particularly in the current moment of disruption driven by the global pandemic, we can begin by learning the lessons from recent conspiracy-fuelled violent attacks, and prioritising analysis of the sub-communities most at risk of sparking the next act of conspiracy extremism.