
Balancing Online Content Moderation and the Rule of Law

4th September 2020 | Devorah Margolin

Heated debates continue to spring up around the world over the growing role of technology companies in moderating online content. Just this week, Facebook made public its own internal disputes over content moderation. To give a platform to this debate, the Program on Extremism at George Washington University recently hosted an event bringing together different voices to share their views on existing efforts to moderate online content and on the future direction of moderation on platforms big and small.

The virtual event explored the balancing act of weighing online counter-terrorism practices and policies against commitments to transparency and free expression. Specifically, it focused on the work of the Global Internet Forum to Counter Terrorism (GIFCT). Founded in 2017 as a collaborative forum between Facebook, Microsoft, Twitter, and YouTube to disrupt terrorist and violent extremist exploitation of their platforms, the GIFCT has recently become an independent, standalone organisation. Since its inception, it has faced concerns from civil society and digital freedom groups regarding its impact on freedom of expression, its transparency, and its partnerships with government and industry.

This panel, moderated by Program on Extremism senior research fellow Bennett Clifford, brought together four distinct voices: GIFCT Executive Director Nicholas J. Rasmussen; Audrey Alexander, researcher and instructor at the Combating Terrorism Center at West Point; Dina Hussein, Counter Terrorism Policy Manager at Facebook; and Evelyn Douek, Lecturer on Law and S.J.D. candidate at Harvard Law School and affiliate at the Berkman Klein Center for Internet & Society. Together, these speakers highlighted the diverse perspectives in the field.

The GIFCT Perspective

Newly-announced GIFCT Executive Director Nicholas J. Rasmussen opened the event with a review of recent significant changes to how the GIFCT is organised and how it hopes to proceed. He stressed that few issues cut across departmental lines like harms in the online environment, whether terrorism or other harmful activity. Mr. Rasmussen highlighted the GIFCT's renewed role and opportunity in bringing together stakeholders from private technology companies, academic research, and government practice. He argued that the GIFCT represents a unique platform for these stakeholders to jointly address the hard challenges of content moderation, and noted that one goal of the newly-independent organisation is to expand GIFCT membership to include more platforms, some of which do not possess the same resources as their larger peers. He closed his remarks by stressing that a core challenge ahead lies in accurately and completely defining the boundaries of terrorism and extremism, as an increasingly wide range of extremist actors do not operate in support of a UN-designated terrorist organisation.

The Tech Company Perspective

Counter Terrorism Policy Manager at Facebook Dina Hussein represented the tech company perspective. Ms. Hussein highlighted the major discrepancies in what governments and private companies classify as terrorism, noting that as a result tech companies are often left to make their own determinations. Larger platforms with users spread across many countries must contend with significant variation in the ways terrorism and violent extremism manifest geographically. On this issue, Ms. Hussein stressed that there is no one-size-fits-all solution. Instead, she pointed to the importance of knowledge sharing through partnerships like “Tech Against Terrorism,” the core channel between the United Nations and tech companies around the globe and a key partner of the GIFCT. She also addressed tools used by the consortium of companies that participate in the GIFCT, such as the hash-sharing database, which pools digital fingerprints (“hashes”) of harmful images identified by each partner organisation into a centralised database, allowing members to detect matching content on their own platforms. Ms. Hussein closed by addressing the recent successes of the hash-sharing database, as well as the need for tech companies to support one another going forward.
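To make the mechanism concrete, the sketch below shows the basic pattern of hash sharing: a platform that flags a piece of content contributes only its fingerprint, and other members screen uploads against the pooled fingerprints. This is a minimal illustration, not the GIFCT's implementation; it uses an exact SHA-256 digest where the real database uses perceptual hashes (which also match visually similar variants), and all function and variable names are invented for the example.

```python
import hashlib

# Illustrative shared hash database. In practice this is a centralised
# service, and the fingerprints are perceptual hashes, so re-encoded or
# resized copies of an image still match.
shared_hash_db: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint; a real system would use a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def contribute(image_bytes: bytes) -> None:
    """A member platform flags content: only the hash is shared, never the image."""
    shared_hash_db.add(fingerprint(image_bytes))

def is_known_harmful(image_bytes: bytes) -> bool:
    """Any member platform can screen uploads against the pooled hashes."""
    return fingerprint(image_bytes) in shared_hash_db

# Usage: platform A flags an image; platform B later detects a re-upload.
flagged = b"...image bytes flagged by platform A..."
contribute(flagged)
print(is_known_harmful(flagged))  # True: matched without exchanging the image
```

The property the sketch preserves is the one Ms. Hussein emphasised: members pool identifications, not the underlying content itself.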

The Third-Party Perspective

Audrey Alexander, a researcher and instructor at the Combating Terrorism Center at West Point speaking in her personal capacity, opened her remarks by discussing how terrorists and violent extremists exploit information and communications technologies. Ms. Alexander explained that terrorist and violent extremist actors are often dynamic and opportunistic, and create complex online ecosystems. She also explained that not all content is equally problematic: much of it is irrelevant and does not necessarily meet the criteria for terrorist content, especially given the differing definitions in use. Ms. Alexander offered a memorable and important guiding question for those tasked with content moderation: “How can we check ourselves before we wreck ourselves?” She further argued that it is imperative to address problems as they are rather than as we envision them. Specifically, she highlighted the dangers of building content moderation policies on the assumptions that terrorist content is necessarily connected to terrorist activity, and that terrorist and violent extremist content can ever be completely removed from platforms. Instead, Ms. Alexander offered a paradigm of content marginalisation: recalibrating content removal instruments to marginalise and mitigate threats while acknowledging that there will always be terrorist content online. The marginalisation paradigm provides a more adaptable and resilient posture for facing content moderation challenges, and relies on a range of stakeholders working together to take proportional and pragmatic steps to keep harmful actors at bay. Ms. Alexander concluded her remarks with hopes that the GIFCT can set important precedents for online content moderation, and that doing so requires a democratic process. The path to success, she argued, may not be the fastest, but it involves bringing technology companies together and giving them a banner, a toolkit, and a functional set of norms.

A Lecturer on Law and S.J.D. candidate at Harvard Law School and affiliate at the Berkman Klein Center for Internet & Society, Evelyn Douek has been a vocal advocate for information transparency. Ms. Douek reflected on the history of the data-sharing model used by the GIFCT, which evolved out of policies implemented to tackle child sexual abuse material. However, while the abuse of children is clearly identifiable, the same cannot be said of terrorism. Countries around the world define terrorism differently, and technology companies could find themselves in situations where such definitions are wielded for political reasons. Ms. Douek questioned the GIFCT's emergence as a so-called “content cartel” as its member companies move beyond the realm of violence and into regulating hate speech and other problematic content. The result, she argued, is a lack of transparency among technology companies and between those companies and the public, and a growing accountability deficit across all actors. For example, when hashes are added to the hash-sharing database, it is not clear which partner company added them. That opacity can create an imbalanced power dynamic between big and small platforms, in which larger companies dictate content removal. Ms. Douek proposed meaningful and ongoing discussions that ask hard questions: what does good transparency look like, what constitutes meaningful oversight, and what is the best approach to accountability and remediation? By tackling these questions, Ms. Douek believes, we can begin to take steps in the right direction on online content moderation.
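Ms. Douek's point about attribution can be illustrated with a hypothetical database record. Nothing below reflects the GIFCT's actual schema; the field names are invented to show what provenance metadata would add to each pooled hash.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical hash-database entry; field names are illustrative only.
@dataclass(frozen=True)
class HashEntry:
    digest: str                 # the shared fingerprint of the flagged content
    label: str                  # e.g. "terrorism" -- under whose definition?
    contributor: Optional[str]  # which member company added it; None models
                                # the opacity Ms. Douek describes
    added_at: Optional[datetime] = None

# Without provenance, downstream platforms (and the public) cannot audit
# who classified the content or under which policy it was flagged:
opaque = HashEntry(digest="9f2c...", label="terrorism", contributor=None)

# With provenance, removals become attributable and therefore contestable:
auditable = HashEntry(
    digest="9f2c...",
    label="terrorism",
    contributor="PlatformA",
    added_at=datetime.now(timezone.utc),
)
```

The design point is that accountability is a schema decision as much as a policy one: a database that records who added each hash, and when, makes meaningful oversight possible in a way an anonymous pool does not.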

What more can be done?

As this event demonstrated, many difficult questions remain unanswered, and these issues will not dissipate as companies are forced to contend with the range of extremist content proliferating across their platforms. For this reason, as the GIFCT emerges as an independent organisation, it is imperative that a wide range of voices be deeply involved in the content moderation process, and that voices critical of technology companies have a seat at the table, furthering the GIFCT's good-faith efforts at transparent, proportional, and pragmatic content moderation.