On 6 January, insurrectionists stormed the US Capitol and disrupted the confirmation of election results in the House and Senate, a day that resulted in five fatalities. Shortly after, Twitter, Facebook, Instagram and other social media sites announced the removal of President Donald Trump’s accounts over concerns of further incitement to violence. Within the same week, Amazon, Apple and Google severed business ties with Parler – a self-described Conservative free speech platform – to prevent the continued spread of “dangerous and illegal content.”
Amidst these major deplatforming efforts, an influx of deplatformed Parler users is moving to Telegram as well as other apps. Telegram downloads reportedly increased by 146% between 5 and 10 January, making it the fifth most downloaded app in the US by 12 January – and Telegram users in extremist spaces are taking notice. In white supremacist and neo-Nazi Telegram groups, discussions have centred on strategies to infiltrate pro-Trump chats and redpill what they have termed “Parler refugees” with carefully constructed narratives and propaganda. The idea of recruiting among a pro-Trump demographic is not new, but hardcore white supremacists view the current platform migration as an opportune moment. It is important not to further spread such messages by duplicating them word for word; however, a general overview of their approach can provide helpful insights and raise awareness of these nefarious efforts targeting new Telegram users. The following are bullet point overviews of redpilling strategies being circulated:
- Share specific propaganda (videos, texts, Telegram channels, etc.) with pro-Trump users in hopes that they will react positively to the messages
- Spam pro-Trump chats with redpilling content
- Convince them to reorient their priorities and goals away from “mainstream conservatism”, the GOP, and Donald Trump; make them feel demoralised about the current circumstances of their party and President
- Enter pro-Trump chats and search for individuals who may be recruitment material
- Focus on bringing into the fold an older demographic (Trump “Boomers”) who may feel disenchanted with the President
- Draw users from other platforms to Telegram where much of the content is uncensored
- Employ tailored redpilling approaches depending on which demographic you’re dealing with – for example: appeal to young men with radical and rebellious propaganda, speak to college-educated men and women with academic-sounding messages, and for middle-aged people, elicit feelings of fear by feeding them disinformation.
- Make Telegram so appealing that new users will not want to return to “controlled opposition platforms” such as Parler
- Use Trump supporters’ emotions of disappointment, anger, and concern to ease the recruitment process and open them up to “education”
- Take a subtle redpilling approach and introduce Trump supporters to white supremacist ideas gradually as opposed to appearing overly forceful by spamming them with hardcore content right away. The most important hurdle to overcome is getting them to embrace white supremacism through the watered-down optics approach of “it’s ok to be white” and “it’s ok to work for the white well-being.”
One extremist channel with a large number of subscribers shared a message that laid out a detailed redpilling guide. In short, it proposes mapping “normie Trump” Telegram chats to identify key members in the network, striking up friendly conversations with these individuals followed by a pre-constructed message (which I will avoid providing here) containing links to propaganda, and finally, providing a list of extremist Telegram channels for them to explore further. Once the target spends time with the suggested links and propaganda content, the instructions encourage the recruiter to engage in follow-up conversation to gauge how the Trump supporter is responding. Two important aspects of the “guide” are: 1) the recruiter must claim to be a former Trump supporter (whether that is true or not) and 2) the process must take place in direct messages to ensure that the target is paying full attention without any distractions, pushback, or complaints from admins or other members in chat groups.
These manipulative messages are designed to elicit strong emotions in their targets, ranging from fear and anger to hope, while simultaneously offering a sense of belonging in a ‘new’ community, i.e. hardcore white supremacist groups and ideologies. Perhaps most importantly, the critical threshold they want Trump supporters to cross is the “work for the white well-being” redpill outlined in the final point of the list above, and any counter-messaging efforts should centre on this aspect in particular. Recruiters are encouraged to use everything at their disposal, including lying about being former Trump supporters to appear more legitimate to their targets. It is crucial to identify recruitment organising as early as possible so that counter-narrative responses can be deployed against these strategies. Kurt Braddock’s work on inoculation holds that inoculation messages should contain three key elements:
- Highlight an “impending threat” that will attempt to challenge the individual’s (the target of the extremist messages) viewpoint
- Expose the individual to weakened versions of the most potentially “potent” extremist content
- Refute the previously presented watered down extremist content from step 2
An inoculation message structured along Braddock’s outline could reduce the harm of recruitment efforts by white supremacists on apps such as Telegram. It would ideally warn Trump supporters that dangerous hardcore white supremacist/neo-Nazi groups and recruiters see them as easy prey, and inoculate them against any individuals seeking to recruit them. The very fact that these recruiters are coming from a place of condescension, mockery, and even disdain towards Trump supporters also needs to be emphasised.
More widely, the effects of this deplatforming process continue to play out, and questions remain about extremist recruitment and the amplification of extremist content on apps with encryption capabilities. Researchers have cited the online trajectory of pro-Islamic State (IS) supporters, who have faced significant suspension efforts on many platforms, as a case study with important ‘lessons learned.’ However, they have also cautioned against drawing too many parallels with the extreme right and white supremacism. Maura Conway outlines a number of distinguishing factors, such as the financial profitability of extreme right/white supremacist content, the movement’s ability to build its own social media platforms, and a continuously shifting “scene” composed of an often amorphous online ecosystem not dominated by a single group or sub-ideology. As CNN Security Editor Nick Walsh states, this is not to suggest that the US extreme right and/or white supremacists should be compared to IS. Instead, let us use the example of deplatforming IS as a potentially helpful framework for better understanding extremist online milieus and their general sustainability, while equally taking their “distinguishing features” into account.