The Dark Side of TikTok
The growing presence of extremist groups on social media has become increasingly prominent. Yet while most academic attention focuses on leading platforms like Twitter, Facebook, or Instagram, the extremist migration to newer platforms like TikTok has gone virtually unnoticed. TikTok is a Chinese video-sharing platform owned by ByteDance, a Beijing-based company, and is mostly used to create short dance, lip-sync, comedy, and talent videos. It is the fastest-growing application today, attracting a huge audience of 1.5 billion active users, mostly children and teenagers. Its fresh, innovative, and fast-moving content has hooked young audiences around the world, resulting in over 2 billion downloads globally since its launch in 2017.
This platform, so popular among teenagers, has a dark side too. TikTok has been the subject of troubling reports about its content: nude images of children, child predators, devious algorithms, weak privacy protections, and teens bullying and harassing one another. Moreover, the seemingly innocent video-sharing platform hides a far more sinister side, allowing a steady stream of drugs, predatory messages, and animal cruelty. Its lax security and moderation have made it a magnet for pedophiles, profanity, crime, violence, and extremism.
TikTok’s Extremist Content
While collecting data for our ongoing project on online extremism, we were surprised to find troubling extremist content on TikTok. As a result, we conducted a dedicated study, a first attempt to identify extremist use of TikTok, with a focus on various far-right groups. To scan TikTok for far-right content, we applied a systematic content analysis. The first stage involved identifying TikTok accounts of known far-right groups: neo-fascists, neo-Nazis, anti-Semites, white supremacists, and other extremist groups and organisations that espouse ultranationalist, chauvinist, xenophobic, racist, homophobic, anti-communist, or reactionary views. We then examined these accounts as well as the accounts that showed interest in the topic by liking or commenting on their posts, or simply by following them. Far-right users were identified by their avatars, profile photos, and usernames: many use flags, images, or symbols of known far-right groups as their avatars or profile photos, along with usernames alluding to the ideology. The scan of TikTok videos was conducted over four months (February–May 2020).
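The first-stage username screening can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical example of flagging usernames against a watchlist of far-right terms and codes; the watchlist, function name, and sample usernames are our own illustrative assumptions, not the study's actual tooling, and any flagged account would still require manual review.

```python
# Minimal, hypothetical sketch of first-stage username screening against a
# watchlist of far-right terms and numeric codes. This illustrates the
# approach, not the study's actual tooling; a match only shortlists an
# account for manual review, since terms like "88" can be innocuous.
WATCHLIST = ["1488", "sonnenrad", "swastika", "waffen", "88"]

def flag_username(username: str) -> list[str]:
    """Return the watchlist terms found in a username (case-insensitive)."""
    name = username.lower()
    return [term for term in WATCHLIST if term in name]

if __name__ == "__main__":
    sample_users = ["example_user1488", "ordinary_dancer", "sonnenrad_clips"]
    for user in sample_users:
        hits = flag_username(user)
        if hits:
            print(f"{user}: flagged for manual review ({', '.join(hits)})")
```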
Our systematic scan revealed hundreds of postings related to far-right extremism, spanning the ideologies of fascism, racism, anti-Semitism, anti-immigration, chauvinism, nativism, and xenophobia. These postings ranged from espousing violence and promoting conspiracy theories to glorifying terrorists. The majority of far-right content on TikTok relates to anti-Semitism and Holocaust denial; we also found numerous postings of Hitler’s speeches and of scenes from life in the Third Reich. We found homophobic and racist postings as well, including a young boy preaching to his followers that “white people need to start owning some black people, so take your black friends and sell them.” Another trend was posting videos encouraging viewers to fight and take up arms, featuring far-right symbols such as the swastika and the “sonnenrad.” The sonnenrad, or black sun, is an ancient symbol which was used by the SA and SS in Nazi Germany, and has since been hijacked by neo-Nazis and white supremacists. One video from the Atomwaffen Division calls for a “race war now.”
Far-right extremists and killers are also glorified on TikTok. In one posting, Brenton Tarrant, responsible for the 2019 Christchurch shootings in New Zealand that killed 51 people, was turned into the intro of a mock video game entitled “Brenton Tarrant’s Extreme Mosque Shooter,” with “start” and “load game” options alongside a picture of Tarrant. Other postings expressed support for Anders Breivik, the far-right terrorist responsible for the 2011 Norway attacks that killed 77 people. Breivik is depicted as a hero, with a halo, a GoPro camera, a copy of his manifesto, a sonnenrad, the rifle used to commit the attack, a military helmet, and the far-right “sun cross” version of the Celtic Cross. Other postings are dedicated to Dylann Roof, an American white supremacist responsible for the 2015 Charleston church shooting that killed nine people; 16 such videos gained over 240 likes. Elliot Rodger, also referred to as “the Supreme Gentleman,” who carried out the 2014 Isla Vista killings as an act of retribution against the women who had rejected him, was described as an “incel hero.” We found several postings of Rodger’s “Retribution” video edited with effects and music.
TikTok uses a recommendation algorithm to serve videos to users. According to TikTok’s listing in the iOS App Store, the app offers a “personalized video feed based on what you watch, like, and share.” When users open the app, videos auto-play, making it hard not to start watching them. In our study, we found that after we had viewed videos of far-right material, the “For You” page began recommending similar videos based on what we had watched, even though we had not interacted with or followed any of these users. TikTok’s algorithm could thus expose young users who start watching such content, accidentally or otherwise, to more of the same disturbing videos.
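TikTok’s actual ranking system is proprietary, but the feedback loop we observed can be illustrated with a toy content-based recommender. The sketch below, with entirely hypothetical video names and tags, ranks candidate videos by how much their tags overlap with a user’s watch history, showing how passive watching alone can narrow a feed toward one theme.

```python
# Toy content-based recommender illustrating the feedback loop described
# above: watching videos with certain tags raises the score of similar
# candidates. TikTok's real ranking system is proprietary; this is only
# a conceptual sketch using hypothetical data.
from collections import Counter

def recommend(watch_history: list[set[str]],
              candidates: dict[str, set[str]],
              k: int = 3) -> list[str]:
    """Rank candidate videos by tag overlap with the user's watch history."""
    # Build an interest profile: how often each tag appears in watched videos.
    profile = Counter(tag for tags in watch_history for tag in tags)
    # Score each candidate by the summed weight of its matching tags.
    scores = {vid: sum(profile[tag] for tag in tags)
              for vid, tags in candidates.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

if __name__ == "__main__":
    history = [{"history", "speeches"}, {"speeches", "rally"}]
    pool = {"dance_clip": {"dance", "music"},
            "speech_clip": {"speeches", "rally"},
            "comedy_clip": {"comedy"}}
    # The clip sharing tags with the watch history ranks first, without any
    # likes, comments, or follows ever being recorded.
    print(recommend(history, pool))
```

Even in this toy version, a feed seeded with a few views of one theme quickly converges on that theme, which is the dynamic our observations suggest.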
Conclusions
These findings come amid growing calls for tighter social media regulation. TikTok claims it is “raw, real, and without boundaries.” Yet this lack of boundaries, combined with the platform’s growing popularity, makes it an ideal hotbed for violent and extremist content. While similar concerns have been voiced about other social media platforms, TikTok has unique features that make it more troublesome. First, TikTok’s users are almost all young children and teenagers, who are generally more naïve and gullible when confronted with malicious content. Second, TikTok is newer than other platforms and thus lags far behind rivals who have had more time to grapple with how to protect their users from disturbing and harmful content. Yet TikTok should have learned from these platforms’ experience and enforced its own Terms of Service, which prohibit content deliberately designed to provoke or harass users.
Gabriel Weimann is a Full Professor of Communication and a Senior Researcher at the Institute for Counter Terrorism (ICT), a Professor Emeritus at the University of Haifa, and a Guest Professor at the University of Maryland. Natalie Masri is a Research Assistant and a graduate student at ICT.
This insight is based on the academic paper “Research Note: Spreading Hate on TikTok,” published in Studies in Conflict & Terrorism.