
The Tate Storm: Why Banning Andrew Tate from Social Media Will Not Stop the ‘King of Toxic Masculinity’

14th September 2022 Abhinaya Murthy

“This War has Just Begun” 

In August 2022, UK-based advocacy group Hope Not Hate made waves when it successfully campaigned to ban social media content creator and former professional kickboxer Andrew Tate from all major platforms (TikTok, Instagram, Twitter, YouTube and Facebook). The group’s briefing cited his extremist views, including misogyny, Islamophobia, homophobia, and racism, which have the potential to harm the boys and men who make up the bulk of his enormous follower base. Tate made headlines in 2016 for his violent behaviour on the reality show Big Brother, when he was caught on camera beating a woman with a belt. While Tate had been making content since then, it is this campaign that brought his acts of violence and hateful tendencies back into international focus. Most recently, Tate announced a move to Rumble, a platform that recently exploded in popularity among conservative audiences, particularly during the pandemic, for tolerating or even promoting medical misinformation. Rumble claims to be the new home for right-leaning social media personalities censored by Big Tech. Since this move, Tate has called two “emergency meetings” on the platform, which garnered a combined total of 1,183,111 views. He emphatically stated, “I have my soldiers…this war has just begun…I ain’t going nowhere”, followed by a call to share his content everywhere so that he stays relevant despite being deplatformed by Big Tech. Considering that Tate’s content continues to reach the masses, this Insight shifts focus from him to his supporters and discusses the role of virality in sustaining violence by unpacking the neoliberal capitalism-mis/disinformation-hate speech nexus.

Algorithms and Visibility 

The subject of algorithms engineering inequality has occupied the minds of media studies and science and technology scholars for decades. In her pathbreaking book Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Noble exposes how the online world is laden with offline inequalities. Virginia Eubanks, in Automating Inequality, further discusses how automated programs are biased against the poor and are grossly harming public service systems. In the context of social media, spaces are subjected to different levels of visibility. Bucher explains how there are vertical arrangements of visibility and invisibility in her analysis of Facebook’s ranking algorithm, EdgeRank. Algorithms can decide what is visible, to whom, and to what extent.

An algorithm is a series of steps that performs a task. Machine learning algorithms are taught to learn over time through repetitive, programmed tasks. For example, YouTube recommends a certain type of content (say, food vlogs) to a user if it learns that the user repeatedly clicks on food videos. On a superficial level, recommendation algorithms on platforms like Instagram or TikTok are based on learning users’ content consumption patterns and repeating them. Thus, algorithms have no agency and are ‘unintelligent’ on their own. However, it is imperative to discuss technical systems as socio-technical systems and unpack the social shaping of technology. Exploring political economy explanations of social media networks reveals that the value of content, content producers, and consumers is determined by several factors, including the commodification of audiences.
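The pattern-repeating logic described above can be illustrated with a toy sketch. This is a deliberately minimal, hypothetical example (the category labels, the frequency-counting heuristic, and the `recommend` function are all illustrative assumptions, not any platform’s actual system), but it shows the basic feedback loop: what a user clicks most is what they are shown more of.

```python
from collections import Counter

def recommend(watch_history: list[str], catalogue: dict[str, list[str]], k: int = 3) -> list[str]:
    """Toy recommender: count which category a user clicks most,
    then surface up to k videos from that dominant category."""
    category_counts = Counter(watch_history)  # e.g. {"food": 3, "news": 1}
    if not category_counts:
        return []
    top_category, _ = category_counts.most_common(1)[0]
    return catalogue.get(top_category, [])[:k]

history = ["food", "food", "news", "food"]
catalogue = {
    "food": ["ramen vlog", "street food tour", "baking basics"],
    "news": ["daily briefing"],
}
print(recommend(history, catalogue))  # ['ramen vlog', 'street food tour', 'baking basics']
```

Even this crude sketch makes the self-reinforcing dynamic visible: every additional click on one category further narrows what the system serves back, which is precisely why virality, once seeded, tends to compound.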

Further, it is futile to approach any analysis of virality, especially in relation to extremist content and hate speech, without examining the dialectic relation that audiences share with platforms. It is therefore useful to unpack Tate’s virality using an affordance-based approach applied by researchers of hateful content on social media, which emphasises users, the narratives those users subscribe to, and how these narratives leak into everyday interactions on social media platforms. Applying such an affordance-based approach while remaining cognisant of political economy explanations allows for a critical examination of the symbolic and material violence stemming from online misogyny promoted by an ‘influencer’ like Andrew Tate, particularly in the case of his ‘Hustler’s University’.

Subscribe now: Neoliberal Capitalism, Radicalisation and Narratives of Oppression

‘Hustler’s University’ takes full advantage of social media usage in the current neoliberal economic setting. The Hustler’s University website states that students can immediately start making money through freelancing, which requires no initial investment on their end. Tate has publicly disclosed his agenda of using his students to manipulate social media platforms by having them post his content from around 500 accounts across platforms. The Observer finds that Tate makes this happen by incentivising his students to post clips of him across social media platforms, thus making his content go viral. In a podcast, Tate boasted that a 16-year-old student earns at least £45,000 per month posting his content on TikTok. His content, as has been widely reported and acknowledged, is dangerous, ranging from messages about violently attacking a woman with a machete to put her in her place, to teaching men that their value can only be derived from fast cars, incredible wealth, and being a confident womaniser.

Zuboff, Couldry, and Mejias make it abundantly clear that social media platforms operate on profit-making agendas. Gillespie argues that social media companies emerged at a time when there were no obligations for content moderation, with conflict driving user engagement by attracting more comments and clicks. Thus, Tate does not merely sell a business course enabled by digital platforms; he preaches a lifestyle underpinned by a “[white] supremacist, capitalist patriarchy”, to borrow bell hooks’ illustrative term. These narratives of male empowerment and success normalise and maintain structures of heteropatriarchy, racism, able-bodiedness, toxic masculinity, and exploitative, neoliberal capitalism.

Personal abuse, hate speech and violent extremism often exist in the same ecosystem, “with one feeding into the other, not necessarily in a linear way” (Mirchandani, 2018). Relatedly, Hope Not Hate found that Tate has long been developing and sustaining relations with far-right leaders (like Stephen Lennon) and communities (like the manosphere). Indeed, in his recent interview with Tucker Carlson on Fox News, Tate explicitly shared misinformation about the severity of the pandemic, calling the lethal virus a “common cold”. Online misogyny, misinformation, and disinformation have material consequences, including long-lasting psychological impact on their victims and, in severe cases such as the Rohingya refugee crisis, mob lynchings and ethnic cleansing. When approaching online misogyny and violent vitriol through the lens of digital disinformation, it becomes clear that persuasive hateful communication is rooted in powerful social narratives that the public already subscribes to. This is made clear in Tate’s most recent message, uploaded on 6 September 2022 on Rumble, in which he outlines his ‘41 Tenets of Tateism’, most of which emphasise toxic masculine ways of life:

“If you stick to these 41 tenets of life and make sure that your actions, your intentions, your will and your thoughts are true to these tenets, the world is absolutely and utterly open to your complete conquest…it’s easy to become rich, it’s easy to have beautiful females, it’s easy to have any car you want, go anywhere you want, nobody is going to mess with you, you will become ultimately powerful.” 

In another Rumble video, Tate states that he has “inspired the masculine youth of the day by speaking to them in a language they understood and resonated with”. Tate presents himself as a revered hypermasculine hero and has consequently built a loyal following by capitalising on narratives that men hold on to. Thus, even as Tate was banned from mainstream social media, his move to Rumble was met with warmth and fanfare by his followers. Research suggests that such migration from well-moderated platforms to niche platforms more tolerant of radicalising narratives often follows Big Tech bans of prominent hateful figures.

“Top G Certified”: On Veneration, Victimisation, and Virality 

Existing literature on political communication reveals that politicians need to appeal to the “imaginative dimension” of digital disinformation and connect to people’s “deep stories”. Thus, as outlined in the previous section, to understand how online misogyny and violence are sustained in digital spaces, public sentiment must be analysed. Extremists tend to frame subjects who traditionally hold symbolic power in society as a suffering majority under threat of oppression, and to frame critics as hatemongers. Tate sells a similar, convincing narrative in which he frames men today as in need of saving from the ‘matrix’, a system that prevents men from rising to power economically and socially. He preaches to young men that intersectional feminism is a cog in the system of male oppression from which they should break free.

Hate against one minority often leads to discrimination against others: in building a narrative of emasculation, Tate’s supporters also subscribe to extremist Islamophobic ideologies. Tate ‘fears’ that “the future of Western Europe is Islamic” and that the “guardians of British society” are “refusing to inspire people to have children…inspire marriage, and they want to demonise men every chance they get and let in millions of third worlders who don’t play by the game”. An ethnographic study by Hochschild with White, Trump-leaning, working-class Louisiana residents explains how these majority communities often subscribe to a narrative that they are actively and systemically disempowered. Tate is idolised among young men who share the narratives he propagates. Such veneration creates an influence that is far-reaching and polarising. In Tate’s case, his extremist views and actions normalise online and offline abuse among the masses of young men under his influence.

What’s Next for Online Safety? Moving Beyond Deplatforming

This year, the UK Government introduced the Online Safety Bill, which sets out a regulatory framework under which social media companies can be held responsible for the content shared on their platforms. The Bill places particular duties on the highest-risk platforms to address harassment, terrorism and abuse with urgency and care, with fines of up to ten per cent of revenue for non-compliance. In a comprehensive report, Vidgen, Margetts, and Harris review evidence of the prevalence of abuse that UK users are exposed to from five distinct sources, including the 2019 Oxford Internet Survey, which showed that 30-40% of UK residents have encountered online abuse. In this light, deplatforming Andrew Tate comes as a welcome response, particularly for users in the UK and USA, where Tate is most popular.

However, Media Matters for America, a left-leaning watchdog organisation, notes that TikTok had previously failed to remove duplicate accounts made by followers of right-wing extremist and notorious conspiracy theorist Alex Jones. With regard to Tate, a TikTok spokesperson shared that duplicate content is actively being investigated and removed from the platform. As discussed above, abusive online content travels. Researchers urge governments to use their position to “understand the dynamics of online harms and how they span and migrate between platforms and communities”. This Insight further calls for an affordance-based approach that centres users and their practices and accounts for entrenched emotions and narratives. This will allow for an understanding of viral vitriol championed by people in a unique position of influence and power.

In a digital space rife with contested meanings and relations, the only way to address online misogyny and extremist speech is to chip away at the gnarly roots of structural inequalities. Entangled within such inequalities are shifting public imaginations bearing equally conflicting identities, which must be challenged to unravel the causes of, and solutions for, online hate. This will require nuanced, context-driven research and policy action which identifies the varying intensities of extremism, personal abuse and hate speech, how they bleed into one another, and the intersecting nature of online and offline behaviour.

Abhinaya holds an MSc in Media, Communication, and Development from The London School of Economics and Political Science. She focuses on issues surrounding technology, human rights, and online harms. She applies feminist and decolonial theories to better understand online practices, technological systems, and affordances.