
Online White Supremacy: Looking for a Place to Spread Hate in the Age of Multiple Communication Platforms

14th October 2020 Dr. Ariel Koch

Introduction

A study published by the Anti-Defamation League in early 2020 found that American right-wing extremists were responsible “for 330 deaths over the course of the last decade, accounting for 76 percent of all domestic extremist-related murders in that time.” A study by the Institute for Strategic Dialogue found that right-wing terrorism is carried out by individuals linked to loose online networks and communities of like-minded extremists. The Christchurch mosque attacks, according to an article by Ana-Maria Bliuc et al., serve as an example of “the role of online communities in both empowering violent extremists to perpetrate acts of terrorism (by providing ideological belonging and inspiration) and widening the potential influence of isolated acts (by enabling the broadcast of their actions to wider audiences).”

Another study, by the Soufan Group, found that white supremacist attacks in Western countries exemplify the transnationalisation of the white supremacist movement. Studying these online communities and networks is thus an inseparable part of our efforts to understand the nexus between white supremacist terrorism and Internet-based communication platforms on the one hand, and between these platforms and the transnationalisation of white supremacism on the other. Acknowledging the exploitation of their platforms by violent white supremacists, various companies began removing accounts and channels that spread racial hatred and incite violence. This has driven white supremacists to migrate from one platform to another in search of alternative “free spaces” (Pete Simi and Robert Futrell, American Swastika: Inside the White Power Movement’s Hidden Spaces of Hate (Lanham, MD: Rowman & Littlefield Publishers, 2010)). As Rita Katz put it in an article from July: “Neo-Nazis Are Running Out of Places to Hide Online.”

Nevertheless, despite the efforts to deplatform white supremacists and deny them service, they not only remain active on the platforms from which they have been banned, but also expand their activities to other platforms, whether on the clearnet or the darknet (see the article by Maura Conway, Ryan Scrivens and Logan Macnair). In a GNET Insight about chan online subcultures, Florence Keen asked: “[g]iven the overwhelmingly cynical and nihilistic attitudes found within these subcultures, in which anything and everything can be reduced to a joke, what exactly can be achieved by engaging with, and attempting to moderate chan culture?” Although there are differences between the chans and social media platforms, this question remains relevant for every company that wants to prohibit the spread of racially motivated hatred on its platform.

White Supremacist Exploitation of Online Communication Platforms

When the World Wide Web entered our lives during the 1990s, it was quickly adopted by white supremacists, who found in it a new tool to exploit for their own causes. One of the oldest white supremacist websites, Stormfront, was created in 1995 and is still online. According to the Southern Poverty Law Center, Stormfront members have been responsible for almost 100 murders since 2008, when Barack Obama entered the White House. The white supremacist online community continued to evolve, with more websites and forums, and more groups and individuals, joining in. When new social media platforms entered our lives (and changed them), they too were quickly adopted by white supremacists, who exploited these newly available tools for their own causes.

Nowadays, manifestations of white supremacist beliefs are visible in every corner of the Internet: on websites, blogs and forums; known and lesser-known social media platforms; messaging apps; gaming platforms; video and music sharing sites; and message boards from which new online subcultures have emerged. There is also white supremacist activity on the darknet, including sites, forums, and dashboards. Naturally, for spreading their messages among the masses, the most suitable platforms are the spaces that attract millions of users: an endless pool of potential recruits to extremist ideologies. This is true not only of white supremacists but of other violent extremists as well.

A study published in 2016 by J. M. Berger showed that Twitter was the central platform for both Islamic State (IS) and American white supremacists. According to Berger, “[o]n Twitter, ISIS’s preferred social platform, American white nationalist movements have seen their followers grow by more than 600% since 2012, which outperformed ISIS in nearly every social metric, from follower counts to tweets per day.” After Twitter started to remove accounts affiliated with both IS and the white supremacist community, adherents of both extremes searched for an alternative: a place without censorship (or with minimal supervision of content) where they could spread propaganda, recruit members, raise funds, threaten their enemies, and mobilise supporters.

Founded in 2013 as an encrypted communication app, Telegram became the perfect place for jihadi terrorists and white supremacists who migrated to it from Twitter and Facebook, at least until 2019, when Telegram changed its approach, deleted thousands of channels and chats, and pushed both groups to search for new alternatives. Nevertheless, recent research by the British centre HOPE not hate found that Telegram remains a cesspool of right-wing extremism despite the ban on prominent channels. The same is true of other social media platforms that have gained popularity among white supremacists, such as Parler or the Russian social network VKontakte (VK). In 2016 it was reported that American neo-Nazis had found a safe haven for their beliefs on VK. Last year, the ADL noted that VK plays a role in connecting American white supremacists to their counterparts around the world.

Conclusion

White supremacists point to various alternatives to Google, YouTube, Facebook, and Twitter. Some of these alternatives, such as Gab, VK, Parler and Bitchute, have gained more popularity within the white supremacist milieu than others. That does not mean they will replace Google, Facebook, Twitter, or YouTube; it simply means there are more spaces for white supremacists to exploit and abuse. In many cases, white supremacist groups and individuals choose to establish a presence on the popular social media platforms first, as it is easier to find like-minded people and potential recruits there. A recent example is The British Hand, which used Instagram and Telegram to recruit and to incite violence. Because there are now more ways than ever before to communicate with others, to watch content produced by extremists and terrorists, and to contact extremists and terrorists directly, the struggle against online hate has become an endless cat-and-mouse game that cannot end with a knockout.