
Islamic State Supporters on Twitter: How is ‘New’ Twitter Handling an Old Problem? 

18th November 2022 Moustafa Ayad

The ‘Impersonation’ Insurgency 

Elon Musk has launched a war on “impersonation.” Most of those in his sights are parodying him, the new owner of one of the world’s most popular microblogging sites: famous comedians like Kathy Griffin and Sarah Silverman, the journalist and former Texas Monthly editor Christopher Hooks, and the former Minnesota Vikings punter and writer Chris Kluwe. But there are others on the platform we should be far more worried about. 

In the three weeks since Musk became Twitter’s owner and sole board member, he has engaged in hyperbole about “freeing the bird” and floated a $20, later revised to $8, subscription model for verified accounts, which was immediately leveraged by trolls and activists to parody former presidents and mock multinational corporations and their leadership. Lost in what can only be described as one of the most tumultuous periods in Twitter’s history was the fact that extremists have been waging an impersonation insurgency on the same platform. 

Following Musk’s purchase, a Twitter account appeared and began retweeting Islamic State content while impersonating a fitness and OnlyFans model with more than 304,000 followers on Twitter, 3.23 million subscribers on YouTube and more than 6 million followers on Instagram. The account is one of a pair; the other uses the avatar of a lingerie model, and both exist solely to amplify tweets, Twitter Spaces and other content posted by Islamic State supporters on the platform. 

If this seems wildly unimaginable, even the events of the past three weeks at Twitter pale in comparison. Since the changing of the guard, fears that Musk is unprepared for the challenge of running a social media platform have run high. Much of this can be attributed to decisions such as halting internal access to content moderation tools, firing 50 per cent of the company’s staff, and then asking some of them to come back to work.

Adding to the anxiety has been research documenting a resurgence of hate speech at unprecedented levels. The Network Contagion Research Institute noted that slurs targeting Black people rose 5,000 per cent after Musk closed the deal, and Montclair State University researchers documented a pronounced rise in hate speech on the platform in the same period, a trend confirmed by Twitter’s own Head of Trust and Safety. Musk’s own comments then fueled an advertiser exodus that coincided with hundreds of thousands of Twitter users migrating to the open-source social network Mastodon, in the wake of what has been by all accounts a chaotic transition. 

The Islamic State: Resurgence or More of the Same?

Under new management, with a laissez-faire approach to content moderation, the concern is that Twitter will become not only a haven for racist trolls but also a den of iniquity rife with extremists. Institute for Strategic Dialogue (ISD) research teams have been tracking and monitoring terrorist groups and their supporters across various platforms, and have observed an emboldened set of extremists take to Twitter once again and succeed in carving out footholds, however small. In the first 12 days of the takeover, ISD tracked 450 new Islamic State Twitter accounts – a 69 per cent increase over the previous 12 days. 

Twitter has long had an Islamic State problem; a census of supporters conducted by J.M. Berger and Jonathon Morgan in 2015 found a minimum of 30,000 Islamic State accounts active during October and November of 2014. An outcry and a concerted effort by governments globally to disrupt and degrade the narratives and appeal of the Islamic State led to the creation of the Global Internet Forum to Counter Terrorism (GIFCT), of which Twitter is a founding member. Hashing technology, which allows platforms to automatically detect known terrorist content, has since become standard practice at most major social media platforms. In the years since the census, supporters of the group and its ideology have become a shell of their former selves on the platform. 

However, Islamic State supporters’ tenacity for the ‘media jihad’ has not waned on Twitter. In 2019, ISD found that 590 Islamic State accounts flooded the platform in a single week in the wake of Abu Bakr al-Baghdadi’s death, using a new set of moderation evasion and content manipulation tactics. Now, ISD researchers have found a new group of Islamic State supporters heralding this new Twitter era, retweeting Musk in the process and harking back to their heyday on the platform in 2014 and 2015. 450 new Islamic State accounts were found in the first 12 days of Musk’s takeover, compared to 267 accounts over the 12 days prior. During that time, Twitter’s takedown rate for these accounts remained essentially unchanged, at 27 and 28 per cent respectively. Researchers collated these figures from accounts that flag Islamic State accounts on the platform for takedown, and then independently verified their existence through social media intelligence gathered from the accounts themselves. 

An Emboldened Few: Shifting Strategies 

What does this tell us about the direction of the platform and its ability to moderate the most noxious content? First, takedown rates appear to have been unaffected, but the surge in new accounts suggests a potential shift in dynamics. While current moderation seems to be preventing an all-out scrum of sockpuppet accounts from appearing, the data on new account creation points to a concerted effort to gain footholds. 

Similarly, during the past 12 days the Islamic State released a new 20-minute video covering attacks in Central Africa. Islamic State accounts tend to multiply in the wake of a new release, uploading and ‘fanning’ out content to new or existing networks. Fanning is the process by which accounts share new content with their networks, which in turn share it with theirs, creating an amplifier effect. This is a well-documented phenomenon and involves both ‘real’ and sockpuppet accounts sharing new material. 

True to form, 48 hours after the Islamic State Central Africa release, researchers found 160 new accounts on Twitter, more than in any two-day period in the dataset prior to the Musk acquisition. By monitoring new content amplification, ISD researchers know that Islamic State supporters use sockpuppet accounts impersonating other users as the first wave of amplification. This results in a ‘content blitz’: supporters on popular social media platforms share newly released content through throwaway or hacked accounts to draw the attention of moderation tools and moderators, while other accounts share outgoing links to Telegram, WhatsApp and other third-party sites that can host the new video or audio content. 

Rather than deploying phalanxes of sockpuppet accounts, a norm in Arabic-speaking Twitter spaces, these accounts are more comfortable using hacked, influencer-like profiles with large followings as a ‘fanning’ mechanism for terrorist content uploaded by smaller accounts. Two influencer Twitter accounts with avatars stolen from real-world influencers have been doing most of the heavy lifting for Islamic State Twitter. Elaborate catfish-style accounts claiming to be Puerto Rican and Scottish have shared hundreds of tweets with more than 20,000 followers, calling for Americans and Europeans to join the Islamic State and fight, targeting France and Germany specifically. The accounts have similarly retweeted Twitter Spaces listening sessions launched by supporters of the Islamic State, a tactic that has become more popular over the past 12 days; ISD researchers found three Twitter Spaces launched in a 24-hour period which garnered 648 listens. 

ISD similarly found accounts that celebrated the new moderation standards on the platform, specifically with regard to Islamic State content. These accounts applauded Musk and this new era of Twitter, likening it to 2014. When Musk tweeted out “being attacked by both right & left simultaneously is a good sign,” one account retweeted it and compared it to attacks faced by Islamic State supporters globally. 

Over a four-week period, ISD researchers found that 717 new accounts were created; of those, only 195 were taken down, demonstrating no discernible change in takedown rates under Twitter’s new management. So while there might be a resurgence afoot, emboldened in the wake of the erratic acquisition and the culling of Twitter’s global teams, the platform’s takedown rates have stayed relatively the same. 

There is, however, another battle in the works. 

This fight is focused on the ‘impersonation’ of public figures, specifically Musk himself, and it is overshadowing the Islamic State’s ability to impersonate users to spread propaganda, pointing to deeper concerns for the platform and its global user base. 

It is a cliché by now to say that the Islamic State thrives in ungoverned spaces, but the adage holds even when those spaces are online. Following a series of missteps by Twitter’s leadership, the platform risks becoming just another example of an exploited, ungoverned space.

Moustafa Ayad is Executive Director for Africa, the Middle East and Asia (AMEA) at the Institute for Strategic Dialogue (ISD).