“Online, we do not yet have a tradition of the rule of law. We are not certain how to apply laws when the internet is used to attack or to repress.” Marietje Schaake, Director of Policy, Stanford Cyber Policy Center and former Member of the European Parliament, writing in 2017
Navigating the tension between safeguarding online spaces from violent extremist content and adhering to the foundational principles of the liberal order can seem like a Catch-22. Technology can only be democratic if it is tempered by a multi-faceted and accountable political process, one that includes citizens and their political representatives rather than just “consumers and Silicon Valley thought leaders”. Debates about how to implement the rule of law in the digital sphere should therefore be grounded in the question of who gets to set the rules and norms defining online public life. Such ‘normative power’ – and especially its operationalisation in online infrastructure – is crucial to responsible democratic citizenship.
At the time of writing, Australia and New Zealand were coming to terms with the sentencing of the Christchurch shooter, bringing home the painful reality of terrorism and its far-reaching online networks. Beyond the violence itself, the case is a sobering reminder that digital technologies and online platforms have given tech-savvy authoritarians, revisionist powers and violent extremist groups opportunities to weaken Western democratic societies and their institutions.
But as the debate around online regulation illustrates, democratic legitimacy is not only being undermined from the outside. More systemic, internal fault lines are also at play: it remains a challenge to keep online platforms free of terrorist and violent content while preserving freedom of speech and expression. Addressing this challenge raises fundamental questions about the nature of power, as well as the norms underpinning political communities in the digital sphere.
In response to the horrific livestreamed terror attack that took 51 lives, the Christchurch Call was launched as a multi-stakeholder initiative uniting governments (and their affiliated agencies), tech platforms and civil society organisations to prevent the exploitation of the Internet for terrorism. In parallel, the Australian Parliament passed the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act. The ensuing criticism from civil rights organisations, academics and the tech industry – that such measures suppress freedom of expression online – shows how politically contentious debates about collective security and individual privacy have become.
The collective aim is to create a democratic information environment in which the rule of law applies online just as it does in ‘real life’. But points of friction emerge from the conflicting logics of governments and tech platforms: market imperatives of metrics-driven growth and consumer rights clash with governments’ obligation to confront security threats and attacks on state sovereignty.
As part of a notable effort towards greater visibility, the Director-General of the Australian Security Intelligence Organisation (ASIO) recently remarked:
Yes, privacy is paramount, but privacy is not total because there’s a balance between privacy and security, and under the rule of law when appropriate warrants are in place, law enforcement or ASIO should be able to get access to bug someone’s car or house … As a society, whether we know it or not, we’ve accepted the fact. Why should cyberspace be any different?
This brings the tension between collective security and individual privacy into focus: government agencies’ requests to monitor, decrypt or remove content on social media platforms are not a simple matter of achieving security through the rule of law. Since the Internet has become a handy enabler of surveillance, control and coercion, it is debatable whether a genuine balance between privacy and security can in fact be struck. This has to do with the very architecture of cyberspace and requires reflection on its fundamental dynamics.
As politics plays out in more diffuse systems, states exercise power alongside corporate, civic and criminal actors in interlocking webs of networks. Examples like Pegasus, the surveillance software the Israeli firm NSO Group has reportedly sold to many governments, reveal the blurred lines between the power of corporations, machines and the state. This confirms that traditional realist conceptions of power do not apply in cyberspace as they do in ‘real life’. To ascertain where authority lies, it is necessary to reconceive the meaning of power in cyberspace.
The evolution of any system in society – be it military, political, socio-economic or cultural – is driven not by strategic calculus alone but also by beliefs and values. Any definition of power should reflect these elements. The challenge of cyber security therefore “is (or should be) as much to do with the technology of detection and interdiction as it is with social norms and attitudes.” Termed ‘normative power’, this capacity can be understood as the ability to shape what is considered normal in international life. Reflecting the present networked diffusion of power, more actors in cyberspace are now involved in shaping and constituting norms. Intelligence agencies, for example, have been identified as important ‘norm-setters’ whose ‘actorship’ can help clarify existing rules.
This is all the more important as we enter a ‘post-liberal world’, characterised by contestation of existing power dynamics, institutions and values. COVID-19 seems only to have galvanised extremists of various typologies. Through a cross-fertilisation of ideologies, different groups opportunistically unite around resistance to governments’ public health measures. Violent extremist messaging in this new climate aims not only to polarise but also to expose the alleged failings and inherent decay of democratic systems as a justification for violently overturning them.
In recent years, the erosion of Western dominance, even ‘Westlessness’ (evident, for instance, in the theme and speeches of the 2020 Munich Security Conference), has become a frequent talking point. Yet rhetorical recourse to the shared foundation of Western values, ethnocentrically reiterating their normative superiority, is not enough to tackle problems resulting from structural crises, entrenched power relations and biases.
Responsive policy results from grappling with social and normative complexity rather than seeking to reduce it. Oversimplified discourse used to legitimise or attack measures by any actor – advocates, tech companies or governments – stands in the way of recognising the fundamental complexities behind these issues. This places responsibility on all actors in the crucial process of enabling democratic citizenship in an online public sphere.