Although individuals on the far right, fascists, and white supremacists had a prior presence on Telegram, the aftermath of the 6 January 2021 Capitol Insurrection significantly reshaped their online ecosystem. Deplatforming from mainstream sites drove a surge of new traffic to Telegram, where channels and groups saw marked increases in subscribership and membership, and toxic narratives began reaching ever wider audiences. Numerous groups and ideologies, such as anti-government militias, the Proud Boys, QAnon, and other conspiracy theorists, had an on-the-ground presence at the Capitol, but this study primarily focuses on content distributed by white supremacist and fascist Telegram channels.
The article first describes the methodology used for data collection, followed by an examination of primary themes and a consideration of the wider impact of these narratives. Finally, the conclusion offers policy suggestions for social media platforms and identifies areas for further research.
We selected a total of 23 far-right and white supremacist-themed Telegram channels, each with at least 1,000 subscribers. We established a minimum subscriber count to ensure that the data being analysed drew higher viewership within the white supremacist Telegram ecosystem. The smallest channel had 1,000 subscribers while the largest had 47,000. To apply a consistent criterion, we only selected channels that shared overtly white supremacist and fascist propaganda and/or channels whose names explicitly indicated their stance, such as having ‘white’ or ‘fascist’ in their titles. We set a short time frame for data collection (6 January – 13 January) to capture a snapshot of the primary themes and narratives circulating on Telegram over the week beginning 6 January 2021 – the day the insurrection took place. One factor impacted data collection: during the archiving process, Telegram banned numerous channels, and we adapted by selecting replacement channels that fit our criteria within the designated time frame.
While reading through content from our list of channels, we independently created a set of codes. To avoid double-archiving identical content, we counted only original posts from each channel, excluding content forwarded/cross-posted from other channels. Upon finishing the collection process, we consolidated our themes into 12 coded categories. After creating a master code list, we uploaded our screenshots into NVivo and independently coded our portions of the data. To gauge intercoder reliability, we compared a portion of our independently coded screenshots and found that we agreed on a majority of code assignments. Lastly, we calculated the number of data points per category to derive percentages, arriving at a total of 763 data points. Depending on the content, data points were frequently coded under multiple categories.
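The percentage derivation described above can be sketched in a few lines of code. The category counts below are hypothetical placeholders (only the 763 total and a few reported percentages appear in the article); note that because a single post may carry multiple codes, the category percentages can sum to more than 100%.

```python
# Sketch of the percentage calculation described above. Example posts
# and their codes are hypothetical, not actual study data.
from collections import Counter

TOTAL_DATA_POINTS = 763  # total reported in the study

# Each inner list holds the codes assigned to one archived post;
# a post can be coded under multiple categories.
coded_posts = [
    ["anti-establishment", "redpill normies"],
    ["general news updates"],
    ["victimhood", "martyrdom"],
]

# Tally how many posts carry each code.
counts = Counter(code for codes in coded_posts for code in codes)

def category_percentage(category: str, total: int = TOTAL_DATA_POINTS) -> float:
    """Share of all data points carrying this code, as a percentage."""
    return 100.0 * counts[category] / total
```

Because cross-coded posts are counted once per category they carry, the per-category percentages are each computed against the same 763-point denominator rather than against the total number of code assignments.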
The following 12 categories were used to organise the data: anti-establishment, general news updates, humour, victimhood, bigotry, martyrdom, accelerationism, juxtaposition, safety protocols, redpill normies, historical references, and other.
This theme composed the largest percentage of any single category. Narratives contained a wide range of sentiments: anti-Trump, anti-Pence, anti-GOP, anti-political, and anti-law enforcement. Such a diverse array reflects far-right and white supremacist efforts to push the overarching idea that ‘there is no political solution’. In all cases, these posts appeared to have a single goal in mind: to magnify and direct resentment towards the President, the GOP as a whole, democracy, and law enforcement. This all-encompassing anti-establishment narrative takes a ‘big tent’ approach that exploits individuals’ perceived grievances against a wide array of people and institutions. As discussed further in the ‘redpill normies’ section, heightening resentment against the political system is intertwined with the wider strategy of encouraging target audiences to completely reject Trump and the GOP and, in turn, opening them to fringe white supremacist anti-establishment propaganda that advocates the use of violence.
General News Updates 21%
This content was primarily derived from mainstream media outlets. In this category, channels posted updates from mainstream sources, such as CNN’s reporting on the insurrection, to keep subscribers informed. These posts typically contained the article’s title and a link, without further commentary from the channel admin(s). As the insurrection came to an end, the volume of news updates decreased sharply, although updates concerning the aftermath continued to be posted. This suggests that the channels’ audiences wanted to remain aware of moment-by-moment developments regardless of the news source. It would also be fair to conclude that staying up to date and informing other subscribers of incoming news provided subscribers a certain sense of virtual involvement and excitement.
Despite the serious nature of the insurrection, humour composed 11% of the dataset. Content ranged from mocking the GOP, to celebrating and praising the insurrectionists, to more generic sarcastic ‘shitposting’ about the situation. As the extremism researcher Chelsea Daymon observes, humour serves multiple functions in extremist online ecosystems: it resonates with the intended audience through irony and carries deeper, sinister meanings understood amongst the in-group. Humour also makes it possible to promote violence with plausible deniability via the “it’s ironic” excuse, and to package propaganda narratives “masked” as shitposting. Additionally, the use of humour may have a mainstreaming effect. Cynthia Miller-Idriss, Director of the Polarization and Extremism Research and Innovation Lab (PERIL) at American University, notes that extremists shift away from negative emotions such as anger and instead “use humor, wit, and clever codes that convey exclusionary and dehumanizing messages” to “weaponize humor” and funnel “extremist ideas into the mainstream.”
Data in this category was frequently cross-coded with ‘martyrdom’, and a majority of the content incorporated white supremacist narratives about the decline and/or oppression of the white race. Although others who died during the insurrection were also mentioned, there was a particular focus on Ashli Babbitt. Posts focusing on Babbitt carried a heavily gendered dynamic, highlighting her status as both a mother and a wife. These posts purposefully framed Babbitt as a pure, innocent white female victim, with direct appeals to the reader designed to elicit anger at an injustice: “Her name was Ashli Babbitt, she was murdered inside of the US Capitol building by police…. your government hates you. They want to kill your children.” The narrative links the necessity of protecting white women with the concept of motherhood: white women are viewed as the producers of the white race through “their reproductive capacity” and must therefore be defended to ensure its future. Victimhood narratives also attempted to relate intimately to Trump supporters and ‘redpill’ them by giving them a sense of collective victimhood: “She (Babbitt) was a Trump supporter like many of you are…” Posts in this category also attempted to portray insurrectionists as victims of a government crackdown and of false accusations of terrorism: “They (the GOP) betrayed you live on T.V. while calling you terrorists.”
This theme included antisemitic, racist, and homophobic content. Bigotry posts incorporated memes as well as general commentary dehumanising minority groups. Many of the channels repeatedly noted that President Joe Biden had Jewish cabinet members. The channels’ decision to weave bigotry into their posts suggests efforts to direct and feed anger towards minorities, push antisemitic conspiracies, and dehumanise selected demographics. A few posts were purposely directed at the Black security guards at the Capitol, calling them the n-word. This bigotry reflected the well-known attitudes of the far right and white supremacists, but it also demonstrates how bigoted narratives are frequently woven into these extremists’ everyday commentary.
Fascists and white supremacists eulogise terrorists, such as Dylann Roof and Anders Breivik, whom they extol as ‘martyrs’. However, there is another category of ‘martyr’: individuals whom they view as victims of injustice at the hands of their perceived enemies. Martyrdom “serves as a symbol for a movement or ideology, as a source of encouragement for action and unity due to the sacrifice of the individual.” Demonstrating this point, one post stated, “They’re [individuals killed during the insurrection] people you can recognise, relate to…they’re people you can look into the eyes of and say “they didn’t deserve this, they died for this.”” Ashli Babbitt was referenced most often in this category. As previously mentioned, her story provides the most appealing angle because of her gender, race, and status as a mother, making her what Ashley Mattheis terms a “shield maiden” who embodies a “mythic figuration of white womanhood.” Babbitt’s mythologised martyrdom narrative did not escape gendered criticism. One post declared that “men shouldn’t allow women to be in this position…she’s (Babbitt) still a hero though.” Notably, many of the channels that highlighted her as a symbol of racial purity and white womanhood posted anti-QAnon content, even though Babbitt herself was a staunch believer in QAnon conspiracies. This speaks to these channels’ compulsion to create sanitised, revisionist narratives that erase the complexities of the individual in order to present an easily digestible martyrdom narrative. In turn, per the propaganda, Babbitt became a martyr not for QAnon “but for ‘white America’” and “a rallying cry for retaliation…”
The “accelerationism” theme appeared throughout numerous channels: it encouraged individuals to take whatever actions necessary to hasten violence and societal/political collapse. These posts promoted violent imagery, such as individuals holding weapons, and included a ‘no political solution’ narrative via the phrase “voting will not replace them.” In short, they framed violence as the only solution. This slogan has circulated in the Telegram extremist ecosystem before, but the current circumstances may offer fertile ground for its positive reception. There was large overlap with the ‘anti-establishment’ code because ‘no political solution’ is integral to this propaganda narrative.
On many occasions, channel admins posted messages contrasting the insurrectionists with Black Lives Matter (BLM) and Antifa. These posts frequently compared the damage caused during the insurrection to the 2020 BLM summer protests and insisted that authorities’ and media outlets’ responses to BLM and Antifa had been disproportionately lax. They believed they were not being treated fairly compared to others who chose to carry out similar actions: “I have never seen the federal government act so quickly. Anti-fascists and blacks torched entire cities…but when whites took over the Capitol, it put fear in them. Our unity is powerful.” A number of these posts were cross-coded with ‘anti-establishment’ – they advocated a “no political solution” stance and used juxtapositions with BLM and Antifa to further their narrative that whites are under attack by politicians, the government, media, and law enforcement.
Safety Protocols 3%
The law enforcement response to the insurrection prompted Telegram users to seek ways to secure their accounts and prevent their identities from being detected. Subscribers and channel administrators spread messages about enabling two-factor authentication and tactics for concealing one’s identity on other social media platforms. Many posts urged users to take all necessary precautions to avoid law enforcement; others emphasised the importance of using a burner phone to prevent the number from being traced. This theme reflected heightened security concerns following the insurrection and the effort channel admins invested in instructing others on maintaining online privacy. Notably, such posts would be especially valuable to users new to Telegram and unfamiliar with its landscape.
Redpill Normies 3%
Posts in this category focused on sharing strategies to radicalise Trump supporters into white supremacist and fascist ideologies. Although this approach is not new, the current circumstances created an opportune recruitment moment for white supremacists and fascists. Following the removal of Parler from the Apple and Google Play app stores as well as Amazon’s cloud hosting service, Telegram experienced an influx of deplatformed users, becoming the 5th most downloaded app in the US. In response, extremist Telegram channels and user accounts crafted recruitment messages and dialogue designed to appeal to demoralised Trump supporters. One channel with the word ‘fascist’ in its title proposed, “Go in here (chats with Trump supporters) and play nicely with these magatards. They could be brought to our side using some tact…treat them like family. If we alienate them, we die.” This example highlights the manipulative yet intensely personalised approach recruiters are advised to employ when engaging potential targets. Other approaches included exploiting Trump supporters’ disillusionment and frustration to push a ‘no political solution’ narrative, stoking fear and resentment by informing them of the domestic terrorist label, and taking a slow ‘redpilling’ approach to radicalise the individual over time.
Historical References 1%
Drawing on historical legacies and celebrating a shared history further cements in-group cohesion by evoking a sense of patriotism. By referencing the American Revolution and, to a lesser degree, the Confederacy, these posts centred on inspiring and encouraging the target audience by comparing current events to the past. They also positioned the insurrectionists as carrying the torch of the American Revolutionary spirit, in turn sending the message that the American Revolutionaries of 1776 and the insurrectionists are, in essence, one and the same.
A portion of the content fell outside the 12 primary themes, and we grouped it under ‘Other’ because, listed individually, each category would constitute under 1% of the data. Posts included images showing specific organisations at the Capitol on 6 January, criticism of the insurrectionists, a range of conspiracy theories, survivalist advice, and calls on veterans to fulfil their duty at home by “fighting the real enemy here on the home front.”
When examined as a whole, the ways in which these various narrative themes complement each other to form a largely cohesive message become apparent. Each component serves a purpose, ranging from, but not limited to, emotional manipulation, recruitment, mythologising historical narratives, creating martyrs, avoiding law enforcement detection, protecting privacy, and strengthening in-group identification through victimhood narratives. These manipulative narratives are designed to elicit strong emotions in users – fear, anger, camaraderie, righteousness, pride, and hope, among an array of other sentiments – while simultaneously offering a sense of community belonging. Although many organisations and individuals were not physically present at the insurrection, Telegram offered them a way to experience it virtually, as they received continuous commentary, propaganda, video footage, and general news updates from channel networks in real time, alongside persistent extremist messaging.
More information concerning the varying degrees of coordination that took place on various platforms continues to surface; however, there is no doubt that individuals promoted the concept of storming the Capitol and that a number of individuals coordinated their movements during the insurrection. While extremist content may appear to move away from mainstream platforms, tech platforms must note the potential for cross-platform pollination and attempts by deplatformed users to regroup elsewhere and/or strategise a return to the original platform under a less conspicuous name to avoid another ban. Other policy suggestions include incorporating insights from former far-right extremists in online intervention work, tracking attempts by extremists to hijack hashtags, and relying on researchers who observe these spaces to report on the detection-avoidance strategies extremists employ.
This article resulted from a Polarization and Extremism and Innovation Lab (PERIL) study analysing Telegram content from white supremacist channels. See footnote 17 in “Uniting for Total Collapse: The January 6 Boost to Accelerationism” by Brian Hughes and Cynthia Miller-Idriss.