Part 1: Algorithmic Deconstruction in the Context of Online Extremism

15th September 2020, by Dr. Yvonne Jazz Rowa

Threats to cyberspace typically arise from intentional agents as well as from systemic vulnerabilities rooted in the innate unpredictability of computer systems and networks. For the most part, the design of Internet protocols did not envisage a future of exploitation, and most hardware and software for Internet-enabled devices have weak built-in security measures. The agency exercised by Internet users, together with varying levels of cyberthreat literacy and complacency, further shapes the Internet ecosystem. In addition, social media companies face significant challenges in countering online threats, while the global reach of the Internet adds further complexity to questions of responsibility and the enforcement of regulations. The relationship, responsibilities and delimitation of power between the public-private technological sphere and governmental authority are therefore central.

The interplay between speech-acts and media technology has shaped world views and established self-identity in relation to others. Accordingly, speech-acts in the digital sphere help us understand the dialectic between actors' social relations and their collective experiences in time and space. Speech-acts also occur within a socio-political framework suffused with values, beliefs and categories. They concern meaningfully employed forms of expression, including monologues, stories, remarks, speeches, arguments and conversations. These constitute perceptible forms of communication, yet extremism in the digital space is symptomatic of the interaction between latent algorithmic design, digital agency and broader societal structures. A neglected domain of expression is therefore the computer programming languages whose algorithms act as interlocutors of language and human interaction.

While the context of messaging is key, the significance of performatives lies neither in the context of an utterance nor in the speaker's intention but in their reproducibility (cascade). Reproducibility is nonetheless inconceivable in the absence of context and intention. Further, in underlining the power of technology, advances in the media sphere in particular have increased the speed and impact with which extremist content is reproduced. Technology has also mediated active online interactions and recalibrated the structure and delivery of language. Passive consumption of, and engagement with, online content have been just as impactful. A pictorial meme may signify and convey a powerful message that cascades across cyberspace in record time. A retweet or like of the same meme, or the lack thereof, may also convey its significance and impact on its audience in the absence of engagement, rendering it a voiceless yet consequential utterance. Might it, perhaps, convey a thought, value or belief?

The subjects of extremist discourse intersect at the nucleus of technology. They typically commune on social networking platforms that pose risks and opportunities alike. The discourse on social media as a driver of extremism has been skewed towards the visible dynamics of online platforms. Yet lurking beneath the visible user interface that enables the operation and control of social media applications are powerful sets of programmed instructions: algorithms. This algorithmic conception merits an exploration of the burden of responsible technology at the design-consumption interface. To what extent, then, do adversarial actors exercise individual agency, or proxy agency by design, in their engagement with popular social media applications? And what are the attendant risks of radicalisation in polarised contexts?

In social media networks, algorithms shape the content displayed on feeds based on users' digital footprints and therefore command significant influence. Algorithms are designed by rational humans with systems of values, beliefs, goals and visions, which renders algorithms intrinsically value-laden and goal-oriented. Correspondingly, the ideology that informs a technology company's vision, which in turn instructs the design of its algorithms, is embedded within the algorithmic architecture. Algorithms therefore act as interlocutors between front-end actors (users) and back-end actors (engineers and visionaries); significantly, both realms consist of human actors. Company visions are not static, particularly in the fast-evolving technological terrain: they typically evolve as technology breaks new ground, with companies such as Facebook revolutionising technological, social, political and economic spaces. While algorithms are not sentient, they are clearly value-laden and interactive, and they represent and convey a system of values and goals. As such, they condition human affect, thinking and behaviour, and are primarily “modelled on visions of the social world.” The operationalisation of institutionalised values and goals exerts significant influence on the digital behaviour with which it interacts. As part of this ecosystem, algorithms wield significant invisible, sophisticated power that has affected democracy and social cohesion; they exert and confer power by proxy. Social media algorithms have likewise compelled traditional media to adapt their antiquated business models to perpetual technological reordering. Algorithms consequently carry practical and functional as well as utilitarian and instrumental value.
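To make this concrete, consider a minimal, purely illustrative sketch of an engagement-driven feed ranker. It is not any platform's actual algorithm; the signal names and weights below are assumptions chosen for illustration. The point is that deciding which signals to reward, and how heavily, is exactly where designers' values and goals become embedded in code.

```python
# Illustrative sketch only: a toy engagement-based feed ranker.
# The signals and weights are hypothetical; real ranking systems
# are far more complex and are not public in this form.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_click_rate: float   # estimated probability the user clicks
    predicted_share_rate: float   # estimated probability the user shares
    author_followed: bool         # whether the user follows the author

# These weights are a design decision: privileging shares over clicks
# rewards content that spreads -- a value judgement encoded in code.
WEIGHTS = {"click": 1.0, "share": 3.0, "followed_author": 0.5}

def score(post: Post) -> float:
    """Score a post for a user's feed; higher scores are shown first."""
    return (WEIGHTS["click"] * post.predicted_click_rate
            + WEIGHTS["share"] * post.predicted_share_rate
            + WEIGHTS["followed_author"] * float(post.author_followed))

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order candidate posts by descending score."""
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("calm_update", 0.10, 0.01, True),
        Post("outrage_meme", 0.30, 0.25, False),
    ])
    print([p.post_id for p in feed])  # the high-engagement post ranks first
```

The arithmetic here is trivial; what matters is that the weights are editorial choices made at the back end that condition what the front end sees.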

Algorithms are inherently discursive and therefore an important aspect of language. The interplay between language and the invisible, sophisticated power of the algorithm imbues visible forms of online human interaction which, along with other contextual factors, constitute viable conditions for extremism. Invisible power is the most insidious form of power: it configures the ‘psychological and ideological boundaries of participation’ and amounts to ‘the internalisation of powerlessness’ through dominant ideologies, values and behaviour. The impact of invisible power in technology is therefore made visible through behavioural patterns that may include Internet addiction and constructive or adversarial engagements. At the same time, algorithms convey and execute actor goals and thus serve as virtual interlocutors across online-offline environments.

Algorithmic capabilities in decision-making processes play out along the spectrum from social cohesion to the radicalisation of public discourse. Algorithms are not only predictive but also deterministic, and they carry the potential for unintended effects. In recent times, the convergence of online and offline extremism has necessitated the deconstruction of the interactive elements within the extremism ecosystem. It is therefore just as important to examine the other actors with whom extremists interact, particularly when polarising political figures act as mobilising forces for online supporters. Political figures likewise constitute a target audience for extremist messaging, which in turn informs the positions and mobilising force of the political class. This dynamic typifies the shifting roles and symbiotic relations among online actors as agitators and audiences. Taking down the online extremist content of the ‘average Joe’ while retaining cascading content from like-minded political figures is effectively a piecemeal approach.
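As a hedged illustration of how deterministic, engagement-optimising rules can produce unintended effects, the toy simulation below assumes that exposure in each round follows engagement in the previous one, and that polarising content enjoys a modest engagement edge. All parameters are invented for the sketch; it models no real platform.

```python
# Toy feedback-loop simulation -- not a model of any real platform.
# Assumption: engagement is proportional to prior exposure, and polarising
# content starts with a modest engagement edge.
def simulate(rounds: int = 10, edge: float = 1.2) -> dict[str, float]:
    exposure = {"neutral": 0.5, "polarising": 0.5}  # initial share of the feed
    for _ in range(rounds):
        engagement = {
            "neutral": exposure["neutral"],
            "polarising": exposure["polarising"] * edge,  # the assumed edge
        }
        total = engagement["neutral"] + engagement["polarising"]
        # Deterministic rule: next round's exposure mirrors this round's engagement.
        exposure = {k: v / total for k, v in engagement.items()}
    return exposure

if __name__ == "__main__":
    print(simulate())  # polarising content's share of exposure grows each round
```

Even under these simplified assumptions, a small initial advantage compounds round after round, which is the sense in which a rule that merely optimises engagement can nonetheless skew public discourse without anyone intending it to.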

The discourse on online extremism likewise underscores the relationship between social, mainstream and alternative media. Beyond algorithmic deployment, the online presence of media companies that champion hyper-partisan positions has permeated the online sphere, making these forms of media an essential component of the online extremism ecosystem. Mainstream media has long attuned the masses to competing political ideologies and dominant narratives that have engendered both conformity and antagonism towards prevailing political structures. The media has also contributed to the proliferation of online extremist subcultures that continue to consume and dispute mainstream content. Articulations of, and struggles over, power have traditionally been displayed in the media, which has constructed power relationships through the manipulation of perceptions within the frames of (re)narrativisation and (mis)representation. Meanwhile, the latitudinarian character of social media, buttressed by algorithms, has promoted the quest for online Messiahs and for access to alternative sources of information perceived to counter the influence of mainstream media.

A system of algorithms can be conceived of as an ideology that fashions human behaviour and reconstructs world views in a conflicting campaign of reward and harm. Algorithms can also be characterised as envoys that expose latent societal schisms, thereby presenting opportunities for action. Media technology amplifies, but does not singularly cause, radicalisation or conspiracy theories, both of which pre-date the Internet. However, the convergence and interaction of design, the plurality of user agency and other contextual conditions have incubated online extremism. Evidently, counter-terrorism efforts will remain hollow as long as the broader contextual factors, and the actors in other sub-systems, that drive radicalisation persist. Ironically, this includes the polarising actions of the policing political class.