
Upvoting Extremism: Collective Identity Formation and the Extreme Right on Reddit

25th November 2020
Tiana Gaudette

Researchers who have explored right-wing extremists’ use of the Internet have typically focused their attention on dedicated hate sites and forums. As a result, right-wing extremists’ use of online platforms beyond the mainstream – that is, beyond Facebook, Twitter, and YouTube – has gone mostly unexplored, despite the fact that lesser-researched platforms (e.g., 4chan, 8chan, and Reddit) have provided adherents with spaces to anonymously discuss and develop ‘taboo’ or ‘anti-moral’ ideologies. Scholars who have recognised this lacuna have called for further exploration of extremists’ use of under-researched platforms. Conway (2017), for example, noted that “different social media platforms have different functionalities” and asked how extremists exploit those different functions. In response, the current study begins to bridge this gap by exploring one platform that has received relatively little research attention, Reddit, and the functional role of its voting algorithm in facilitating collective identity formation among members of a notoriously hateful subreddit community, r/The_Donald.

This research is guided by the following question: how does Reddit’s unique voting algorithm (i.e., its upvoting and downvoting functions) facilitate ‘othering’ discourse and, by extension, collective identity formation on r/The_Donald following Trump’s presidential election victory? To answer this question, data were collected from a website that made Reddit data publicly available for research and analysis. We extracted all of the data posted to the r/The_Donald subreddit in 2017, as it marked the first year of Donald Trump’s presidency and a time when his far-right political views encouraged his supporters to preach and practise racist hate against out-groups, both on- and offline. Research has likewise identified a link between Trump’s election victory and a subsequent spike in hatred, both online and offline, in the year following his victory.
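For readers interested in replicating this kind of collection, the sketch below shows how the 2017 comment data might be pulled from a public Reddit archive. It assumes the Pushshift archive – a common source for Reddit research, though the study does not name its data source – whose public API has since been retired; the endpoint and parameters shown are the historically documented ones.

```python
# Illustrative only: pulls r/The_Donald comments for 2017 from the
# Pushshift archive. Pushshift is an assumption here (the study only
# describes a public Reddit-data website), and its public API is now retired.
import time

import requests

URL = "https://api.pushshift.io/reddit/search/comment/"

def fetch_comments(subreddit, after, before):
    """Page through archived comments between two Unix timestamps."""
    comments = []
    while True:
        resp = requests.get(URL, params={
            "subreddit": subreddit,
            "after": after,        # only comments created after this time
            "before": before,
            "size": 500,           # historical per-request maximum
            "sort": "asc",
            "sort_type": "created_utc",
        })
        batch = resp.json().get("data", [])
        if not batch:
            break
        comments.extend(batch)
        after = batch[-1]["created_utc"]  # resume from the last comment seen
        time.sleep(1)  # be polite to the archive
    return comments

# 1 January 2017 to 1 January 2018 (UTC)
comments_2017 = fetch_comments("The_Donald", 1483228800, 1514764800)
```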

We then identified the 1,000 most highly-upvoted user-submitted comments in the data. A second sample of 1,000 user-submitted comments was drawn at random from the data to serve as a comparison, the purpose being to explore what makes highly-upvoted content unique relative to non-highly-upvoted comments. The data were then analysed using thematic analysis.
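A minimal sketch of this two-sample design, assuming the archived comments sit in a local JSONL dump (the filename and the ‘score’ field follow Pushshift’s conventions and are illustrative):

```python
import json
import random

# Load the archived comments (hypothetical local dump, one JSON object
# per line, each carrying Pushshift's 'score' field for net votes).
with open("the_donald_comments_2017.jsonl") as f:
    comments = [json.loads(line) for line in f]

# Sample 1: the 1,000 most highly-upvoted comments.
by_score = sorted(comments, key=lambda c: c["score"], reverse=True)
top_sample = by_score[:1000]

# Sample 2: a 1,000-comment random comparison sample drawn from the
# remaining, non-highly-upvoted comments; seeding keeps the draw
# reproducible.
random.seed(2017)
random_sample = random.sample(by_score[1000:], 1000)
```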

Several conclusions can be drawn from this study. First, a particularly effective form of collective identity building was prevalent throughout the r/The_Donald community – one riddled with hateful sentiment against members’ perceived enemies: Muslims and the Left. The thematic analysis of the highly-upvoted content indicates that members most often agreed with (i.e., upvoted) extreme views toward these two key adversaries, mobilising their social movement around the so-called threat.

In particular, r/The_Donald’s rules specifically condoned anti-Muslim content, which in large part explains why such content was so prevalent among the most highly-upvoted messages. Additionally, Trump’s own anti-Muslim rhetoric, which has emboldened right-wing extremists to commit hateful acts against Muslims, may explain why his supporters on r/The_Donald were so eager to vilify this external threat. Consistent with what has been dubbed the ‘Trump Effect’, the results of the current study suggest that Trump’s anti-Muslim rhetoric emboldened his fervent supporters on r/The_Donald to spread anti-Muslim content via Reddit’s upvoting algorithm. Indeed, anti-Muslim hate speech is increasingly becoming an “accepted” form of racism across the globe, and extreme right movements have capitalised on this trend to help mobilise an international audience of right-wing extremists. The othering of Muslims, specifically, is commonly used to strengthen in-group ties between right-wing extremists against a common enemy. By describing Muslims as violent perpetrators and themselves as those who must defend against “them”, members of r/The_Donald framed themselves as victims rather than perpetrators – an othering tactic commonly used by the extreme right, among other extremist movements. This suggests that, by endorsing anti-Muslim sentiment, r/The_Donald offered extremists a virtual community wherein they could bond with likeminded peers around a common enemy. Worth highlighting, though, is that while a large proportion of the highly-upvoted content framed Muslims as a serious and imminent threat, users tended not to provide specific strategies for responding to the so-called threat. Such behavioural monitoring may have been an effort to safeguard the community from being banned for breaking Reddit’s sitewide policy against inciting violence. Regardless of these efforts, recent amendments to Reddit’s content policy that explicitly target hate speech led to the ban of r/The_Donald.

There are a number of reasons that might explain why the Left was the target of othering discourse in the highly-upvoted content on r/The_Donald. For instance, such sentiment most likely reflects the growing political divide in the U.S. and other parts of the Western world, where the ‘Right’ often accuses the ‘Left’ of threatening the wellbeing of the nation. However, the results of the current study suggest that some users on r/The_Donald took this narrative to the extreme. To illustrate, the anti-Left discourse uncovered in the highly-upvoted content mirrors the ‘paradoxical’ identity politics expressed by the alt-right movement, which accuses the Left of authoritarian censorship yet positions the far-right as the harbinger of ‘peaceful social change.’ On r/The_Donald, the Left (i.e., the out-group) was construed as the opposite of peaceful and even as a violent, physical threat to members of the in-group. Framing the out-group as a physical threat served to further delineate the borders between ‘them’ and ‘us’ and to solidify bonds between members of the in-group who, together, face a common enemy – a community-building tactic commonly discussed in the social movement literature. Notably, however, in the face of this so-called physical threat, users within the highly-upvoted comments cautioned others against retaliating against ‘Leftist-perpetrated violence.’ Members likely wanted to avoid discussions of offline violence so as not to be banned from Reddit for overstepping its content policy against inciting violence. Similar tactics have been reported in empirical studies, which have found that extreme right adherents in online discussion forums will deliberately present their radical views in a subtler manner for fear of being banned from the virtual community.

Second, comparing the highly-upvoted sample with the random sample makes clear that dissenting views were entirely absent from the community’s most visible content. This suggests that Reddit’s voting algorithm facilitated an ‘echo chamber’ effect in r/The_Donald – one which promoted anti-Muslim and anti-Left sentiment. Rather than encouraging a variety of perspectives within discussions, Reddit’s voting algorithm allowed members to shape the discourse within their community in ways that reflected a specific set of values, norms, and attitudes about the in-group; not a single comment among the most highly-upvoted content offered an alternative perspective on, or even a dissenting view of, the community’s dominant extremist narrative. In some cases, members even used humour and sarcasm – common among commenters on Reddit and related sites like Imgur regardless of political ideology – to express otherwise ‘taboo’ anti-Muslim and anti-Left views, reflecting a longstanding tactic used by the extreme right to ‘say the unsayable.’

Overall, our study’s findings suggest that Reddit’s upvoting and downvoting features played a central role in facilitating collective identity formation among those who posted extreme right-wing content on r/The_Donald. The upvoting feature promoted and normalised otherwise unacceptable views of the out-groups, producing a one-sided narrative that reinforced members’ extremist views and thereby strengthened bonds between members of the in-group. The downvoting feature, on the other hand, ensured that members were not exposed to content that challenged their extreme right-wing beliefs, creating an echo chamber for hate; it may also have served to correct the behaviour of dissenting members – a possibility gleaned from previous research on the effect of Imgur’s bidirectional voting features on social identification. Seen through the lens of social movement theory, the extreme views against Muslims and the Left that characterised the ‘othering’ discourses among the most highly-upvoted comments may have been more likely to produce a strong reaction from r/The_Donald’s extreme right-wing in-group and, as a result, more likely to mobilise the community around these two threats.
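To make the mechanism concrete, the sketch below reproduces the Wilson lower-bound ‘best’ comment sort from Reddit’s formerly open-sourced codebase. Whether this exact formula ordered comments on r/The_Donald during the study period is an assumption, but it illustrates how votes translate into visibility: contested or downvoted comments sink, while reliably upvoted ones stay at the top of the thread.

```python
from math import sqrt

def confidence(ups: int, downs: int, z: float = 1.281551565545) -> float:
    """Lower bound of the Wilson score interval for a comment's upvote ratio.

    Mirrors the 'best' comment sort in Reddit's formerly open-sourced
    codebase (z is the ~80%-confidence value used there): comments the
    community reliably upvotes rank highly, while contested or downvoted
    comments fall toward the bottom of the thread.
    """
    n = ups + downs
    if n == 0:
        return 0.0
    p = ups / n  # observed upvote ratio
    return ((p + z * z / (2 * n)
             - z * sqrt((p * (1 - p) + z * z / (4 * n)) / n))
            / (1 + z * z / n))

# Two comments with identical upvote counts rank very differently once
# downvotes are taken into account.
print(round(confidence(500, 10), 2))   # ~0.97 -- stays visible
print(round(confidence(500, 400), 2))  # ~0.53 -- sinks down the thread
```

Under an ordering like this, an echo chamber needs no moderator: a handful of downvotes is enough to push a dissenting comment below the fold, where few members will ever see it.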

For much more on these findings and the nature of the study in general, we encourage you to read the full article, which was recently published in New Media & Society.

Tiana Gaudette is a Research Associate at the International CyberCrime Research Centre (ICCRC) at Simon Fraser University (SFU).

Ryan Scrivens is an Assistant Professor in the School of Criminal Justice at Michigan State University (MSU). He is also an Associate Director at the ICCRC and a Fellow at VOX-Pol and GNET.

Garth Davies is an Associate Professor in the School of Criminology at SFU.

Richard Frank is an Associate Professor in the School of Criminology at SFU and Director of the ICCRC.