Research on online radicalisation and recruitment has become increasingly sophisticated, with scholars examining the processes, pathways, and triggers that lead an individual to engage in anti-social behaviour. Platforms have responded to the research and to specific events with some action. And yet, ideologically motivated and religiously motivated violent extremist groups continue to pose a threat as they adapt and respond to these countermeasures, seeking new ways to spread their message and recruit.
In The Nicomachean Ethics, Aristotle argued that the purpose of human existence is the pursuit of happiness. He also identified three types of happiness: pleasure, passion, and purpose, leading him to argue that an individual could spend a lifetime searching for happiness. In more recent times, people have come to see happiness as a tangible commodity rather than an internal matter.
The fixation with happiness, or the lack thereof, has become a central feature of contemporary society. There is a desperate need for tangible happiness, be it good health, a good job, wealth, and so on, but when we fail to attain such happiness, we look for external explanations for our unhappiness. In their need to seek, pursue, and attain happiness, individuals increasingly turn to social media.
The growing sophistication of algorithms, machine learning, and artificial intelligence could drastically change the recruitment strategies of nefarious actors, who could rely on tools such as recommender algorithms and the personalisation of content to push specific narratives at vulnerable individuals under the guise of offering them purpose and happiness. In the words of one radicalised individual, once they began to watch religious videos, “It was like an enlightenment. I thought that I knew the ultimate truth now, I was addicted to those videos. Finally, I found a sense in life.” Moreover, every time they watched the videos they felt a sense of gratification, encouraging them to rewatch.
One of the appeals of groups such as Islamic State and al-Qaeda is that they construct a single narrative that provides adherents with an emotionally satisfying explanation resonating with the individual’s (and their community’s) outlook. This simplified narrative draws on negative emotions and identifies individuals, systems, and communities as the cause of one’s misery, while at the same time offering the positive goal of a world free of the horrors that the individual faces regularly.
Social media platforms seek to engage users with content that will make them happy. This depends, first, on platforms collecting copious amounts of personal data, with many users not appreciating how that data is used or traded. Second, social media platforms are predicated on humans being predictable: by tracking actions, activities, and behaviours, it becomes possible to facilitate desired outcomes. It is here that the seeds of the threat lie, because the purpose is to provide users with content that meets their interests, without determining whether that content is truthful or safe, or whether it will enhance or harm the user’s well-being.
To understand the role algorithms could play in the automated distribution of anti-social content, it is useful to recognise that over the last few years, social, political, economic, and cultural changes, coupled with technological innovations, have led to higher levels of despondency, which the pandemic has exacerbated. Increased disillusionment has encouraged people to find new ways to look for happiness, leading to greater use of the Internet (to search for answers and community) and of social media (to provide the answers and the community).
Research on radicalisation indicates that the path towards radicalisation begins with a personal crisis, such as the loss of a job or some form of discrimination, that causes a cognitive opening. By trawling open platforms, recruiters can identify potential targets, examining a user’s social, political, economic, cultural, religious, and psychological posts to determine whether the person is open to engaging in actions aimed at changing mainstream society. Once they identify a target, the recruiter surreptitiously forms a relationship with the potential recruitee by finding common interests. Once a relationship exists, recruiters move to private messages and/or encrypted platforms, allowing the recruiter to connect with the individual on a more personal level. In other words, recruiters engage in a process that feeds the psychological and sociological appetite of the target with the promise that, if they embrace the narrative the recruiter offers, the individual will attain happiness (a state of pleasurable contentment of mind) and reach transcendence.
Islamic State recruiters and propagandists seem to have understood people’s need for happiness: their strategy centred on creating and projecting an image of Islamic State that resonated with a specific audience (such as men or women), amplified by a homophilic community that also serves as an echo chamber and a filter bubble.
With the collapse of the territorial Islamic State, there has been a growth in supporter-cum-volunteer media, changing recruitment and propaganda efforts. These actors create new content, preserve old content, and devise new ways to attract recruits. Consequently, an enormous amount of data remains online that the platforms have not removed.
Many technology algorithms are effectively popularity contests. Algorithms such as PageRank not only index content but also rank it by how popular the ‘page’ is, meaning that more popular pages appear higher in search results. It is therefore unsurprising that bland information is pushed lower in a Google search, which also means fewer people are likely to see it, even if the content is highly pertinent. The online space is designed for individuals to engage with content: algorithms direct content that will keep the individual online and create transitory transcendence (i.e., the euphoria ends as soon as the individual logs off and faces the real world). It is worth recalling the words of communication professor Jay David Bolter, who captured the addictiveness of social media in his admission that he sometimes accesses YouTube to watch something only to find himself spending hours on the platform because of its ability to direct content that provides him with pleasure. Bolter, in other words, accessed the platform for a specific purpose, but the platform, by understanding his psychological needs, found ways to keep him engaged by showing him glimpses of content of interest to him, enabling him to reach a form of transcendence as he got lost in a world of short movie trailers and other videos that fed his curiosity and interests.
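The popularity dynamic described above can be illustrated with a toy version of PageRank. This is a hedged sketch, not Google’s production system: the link graph, page names, and parameters are invented for illustration. It shows how a page with many incoming links outranks a pertinent but little-linked one, regardless of the content’s truthfulness.

```python
# Toy PageRank sketch (illustrative only, not Google's actual system).
# Pages are ranked by link "popularity": rank flows along hyperlinks.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical link graph: "viral_post" is heavily linked;
# "pertinent_but_bland" receives no incoming links at all.
links = {
    "viral_post": ["home"],
    "home": ["viral_post"],
    "pertinent_but_bland": ["home"],
}
ranks = pagerank(links)
assert ranks["viral_post"] > ranks["pertinent_but_bland"]
```

Note that nothing in the computation inspects what the pages actually say: pertinence and truthfulness play no role, only the structure of who links to whom.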
Recruiters and propagandists could exploit and manipulate social media algorithms by getting their supporters and sympathisers to ‘like’ content that sustains the distinctive single narrative; because of the way recommender algorithms work, increased interaction with specific content ensures that the platform will promote it, as was evident with Shahid Buttar’s hashtag #PelosiMustGo.
The prospect of nefarious actors using algorithms, specifically recommender algorithms, means their recruiters could create generalist, borderline content that their sympathisers would ‘like’. Once that happens, the system would direct the content to potential recruitees if the platform’s algorithm determines that the individual could be interested in it, which is why greater attention needs to be paid to social media platforms’ persistent drive to get users to engage with content.
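The manipulation pathway described above can be sketched with a toy engagement-based recommender. Real platform recommenders are proprietary and far more complex; the scoring rule, post names, and tags here are invented assumptions, chosen only to show how coordinated ‘likes’ combined with interest-matching can surface borderline content to a susceptible user.

```python
# Toy engagement-weighted recommender (illustrative sketch only).
# Score = raw likes, boosted by overlap with the user's inferred interests.

def score(post, user_interests):
    """Rank a post by engagement, weighted by interest overlap."""
    overlap = len(set(post["tags"]) & set(user_interests))
    return post["likes"] * (1 + overlap)

# Hypothetical feed: a mainstream item and a borderline narrative post.
posts = [
    {"id": "mainstream_news", "tags": ["news"], "likes": 120},
    {"id": "borderline_narrative", "tags": ["purpose", "community"], "likes": 40},
]

# A coordinated campaign of sympathiser 'likes' inflates the borderline post.
posts[1]["likes"] += 200

# A user whose history signals a search for purpose now sees it first.
user_interests = ["purpose", "happiness"]
feed = sorted(posts, key=lambda p: score(p, user_interests), reverse=True)
assert feed[0]["id"] == "borderline_narrative"
```

The sketch makes the structural point of the section concrete: the ranking function optimises engagement and interest match, and has no term for whether the content is truthful, safe, or good for the user’s well-being.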