
Extremism in the Manosphere During the Presidential Transition

22nd January 2021
Dr. Alexis Henshaw
In Insights

In recent months, the use of technology and social media to advance conspiracies and far-right ideologies has been the subject of much commentary. A recent GNET Insight on the online fallout from the storming of the US Capitol on 6 January 2021 took as its focal point reactions to the death of Ashli Babbitt, a protester who was killed by police after entering the building illegally. While that analysis focused on efforts to convert Babbitt into an iconic figure embraced by various far-right groups, responses elsewhere in online extremist networks were less sympathetic. They included reactions on incel forums, where a video of her death was shared, her political beliefs were mocked, and the moment was depicted in crude cartoons. These reactions offer a window into one extreme corner of the larger online environment referred to as the “manosphere.”

The term “manosphere” collectively refers to an array of groups scattered across a range of websites, united mostly by the celebration of “traditional” (i.e. male as provider/breadwinner) masculinity and hostility toward feminist views. Horta Ribeiro et al. point to an array of outlets encompassed by the term, including groups on mainstream social media, on purpose-built sites that claim to be a refuge from social media censorship, and sites on the dark web. Ideologically, groups associated with the manosphere range from older movements like men’s rights activists (who formed offline in the late 20th century, advocating for reforms to laws on divorce and child custody, etc.) to more extreme communities like incels, who have been banned from platforms like Reddit for advocating violence.

While some have pointed out that most discussions in these communities are toothless, bordering on “pathetic,” analysts have expressed concern about the rising potential for violence motivated by extreme misogyny. Hoffman and Ware point out that nearly 50 people in the US and Canada have been killed in attacks linked to online incel activity. Others express concern that online misogyny can serve as a kind of “gateway drug,” with anti-feminist discourse co-opted in recent years by neo-Nazi, white nationalist, and Identitarian groups, among others.

Efforts to map the manosphere lend further evidence to these concerns, suggesting that despite nominal divisions between more and less moderate groups, users easily and frequently migrate to newer and more extreme spaces for discourse. Such moves are facilitated in part by algorithms and content moderation policies on mainstream social media platforms. A recent study focused on YouTube pointed out the ease with which visitors to the site can access content expressing extreme misogyny, even when they are not directly seeking out related content. Similarly, content moderation policies have continued to allow space for users to funnel conversations to private channels and offsite locations that are less closely monitored. In some of these communities, especially in the wake of the US elections and the events of 6 January, conspiracies and violent discourse proliferate. One such site, which has hosted a conspiracy-fueled discussion thread on the 2020 elections where posters shared information from QAnon, described the US as being in a state of “war,” and called the election of Joe Biden a “coup,” continues to have an “official” subreddit and an affiliated YouTube channel. Some incel and “red pill” forums also openly advertise chats and discussions held over Discord.

Reactions to the events of 6 January by US law enforcement and technology companies have not dampened these discussions; rather, they have emboldened some users in new ways. Reddit’s bans on well-known incel communities in 2017, and again in 2019, spurred a new wave of dedicated manosphere sites operating under the banner of free speech. Similarly, moves by social media providers to suppress content that supports violence or alleges voter fraud have led some groups to launch online campaigns to fund new repositories for multimedia content like videos and podcasts, leveraging crowdfunding models and cryptocurrency donations. The advertisement of these campaigns on established social media platforms further underscores the shortcomings of current efforts to de-platform accounts associated with election conspiracies and violence.

The regulation of the manosphere raises important questions about the trade-offs between free speech and minimising the spread of extremist views. Current approaches used by some sites—like the quarantining of communities on Reddit, a measure that leaves communities easily accessible by clicking through a content warning—leave much to be desired. Evidence of links between some manosphere communities and other hate groups and conspiracies (especially QAnon, which various social media companies have already pledged to de-platform) should call into question where providers draw the line on what they consider “extremism.”