Catfishing to Counter Extremism: Lessons from an Indonesian Experiment

14th August 2020 Cameron Sumpter

From anxious captive audiences to fresh subversive narratives, COVID-19 has been a boon for those who peddle their extremism in online spaces. Authorities warn that manipulative recruiters are preying on bored young people, many of whom have been stuck indoors, frustrated, and worried about their future.

Governments, civil society and donor organisations across the world will likely renew efforts to counter online radicalisation and divisive rhetoric, particularly as they adapt their prevention programming to the pandemic’s restrictions and practitioners continue to work from home.

But a surge in demand and potential enthusiasm for online ‘counter-messaging’ or ‘alternative-narrative’ campaigns does not acquit these strategies of their shortcomings. Positive online propaganda and attempted reasoning may be popular for their relatively low cost and ease of execution, but convincing critiques highlight the difficulty of appealing to the same emotions and aspirations that extremist groups seem to capture.

Instead of simply advertising ideas through ‘credible voices’ and slick design, researchers have argued that initiatives must involve a significant element of individual engagement and pathways to concrete action, ideally leading to personal interactions in the real world.

With this in mind, activists behind a pop culture website in Indonesia began an 18-month project to connect with pro-IS Facebook users, initially under the cover of pseudonyms, with a view to revealing their identities down the road. While the experiment was educational and worth further effort, they found that online ‘counter-engagement’ is easier said than done.

The website was founded by early pioneers of civil society efforts to counter extremism in Indonesia. They sought to create content that would counter extremist narratives and provoke constructive discussion around issues related to terrorism and its prevention, from an Indonesian youth perspective.

Those behind the initiative are involved in a separate project linking former prisoners convicted of terrorism offences and suitable mentors in two provinces with histories of militancy. In early 2019, a plan emerged to identify relevant candidates online, slowly build their trust through shared interests and friendly interaction, and eventually introduce them to a mentor offline.

Facebook is still hugely popular in Indonesia, and while the company’s mechanisms have become better at spotting and removing extremist accounts, the platform remains a productive environment for identifying potential ‘clients’. Ruangobrol practitioners look at a combination of post content, friend lists, and the way users engage with discussions under other posts. Referencing an IS magazine or video production is considered solid evidence.

Relevant users are then divided into four categories: Red (potentially dangerous); Orange (already quite extreme); Yellow (becoming more involved); and Green (mostly just curious). Between green and a darkish yellow is where they attempt connections. An initial opening might involve praising someone’s argument or playing ignorant and seeking to learn.

Female avatars tend to receive the most attention, so the activists learned to enter the fray with an attractive profile photo before soon switching to something a little more jihadi. Some conversations then progressed to WhatsApp chats, but on occasion the ‘young women’ were invited to join a religious study circle. Unprepared to meet on such terms, they ghosted their way to safety.

The idea was for things to move more slowly. After gradually developing an online friendship they might point the client towards positive endeavours, such as an opportunity for an education scholarship abroad, which ruangobrol staff collate and post on the website. But after months of engaging several individuals, they never managed to set up offline meetings with mentors, nor even reveal who they really were – the timing never seemed right, and they were unsure how it would play out.

Editor-in-Chief Hakiim has experienced how difficult it is to build trust among people with extremist views in person, but stressed that it’s even trickier online. “People don’t know who’s behind an account, so the levels of suspicion are very high,” he said. In a world of catfishing CVE activists, they are probably right to be cautious.

While Facebook has supported ruangobrol’s broader endeavours, the site’s undercover accounts have been removed by Facebook content moderators on two occasions – a distinction they’ve actually worn as a badge of honour when re-entering the target community with a fresh identity.

Unfortunately, the resource-intensive experiment has come to the end of its funding allocation. Hakiim believes they were making headway, but due to the nature of the work, progress is piecemeal – especially when you’re learning as you go.

To be sure, the initiative was educational, and the website is using the experience to inform and adapt its primary activities, by increasing general user engagement and encouraging debate that may draw people with extremist views into more open discussions.

If this were an academic project there would be glaring ethical concerns, but for a private company the lines are less clear. Efforts to counter violent extremism now have several years of trial-and-error programming in the back catalogue, but the field remains fundamentally experimental. Initiatives that push the boundaries should be encouraged.