
The Jakarta Bombing: Youth Digital Radicalisation and the Urgent Need for Adaptive PCVE Responses

7th January 2026 Chevy Atha

This Insight analyses the digital influences, memetic behaviours, and visual-symbolic cues that underpinned the explosion at Jakarta’s State Senior High School 72 on 7 November 2025. The attack was more than a school-related act of violence; it highlights a growing reality of decentralised, global online ecosystems that can enable radicalisation irrespective of local context. The suspect, a student, drew inspiration from foreign extremist symbols and visual templates rather than from local ideological networks. An investigation was launched following the incident, which injured 96 people at two blast sites. The attack underscores the startling pathways of youth radicalisation in Indonesia: how a 17-year-old engaged with, processed, and acted upon the violent narratives embedded in his digital ecosystem.

A Chronology That Reveals Something Deeper

CCTV footage shows the assailant arriving at the school at 6:28 AM. He entered like any other student, interacted with teachers, and exhibited no anomalous behaviour. At midday, he brandished a toy weapon inscribed with extremist ideological texts. The initial explosion occurred at 12:02 PM at the school mosque, followed by another in the canteen area. Seven bombs were discovered; four detonated, while three failed to go off.

Figure 1: The Perpetrator’s Toy Gun (Photo Credit: HUMAS POLDA METRO JAYA)

The visual evidence provides further context and insight into the attack. The weapon depicted in Figure 1 illustrates how the attacker sought to emulate global shooter culture. Its surface bears the inscriptions “14 Words,” “Natural Selection,” “1189,” and “For Agartha,” alongside the names of right-wing terrorists such as Brenton Tarrant, Alexandre Bissonnette, and Luca Traini. These writings do not stem from the Indonesian setting; they are symbols historically employed in white supremacist movements, neo-Nazism, and mass murders in the West. The assailant’s inscriptions on the barrel and stock mirror the approach employed by Tarrant during the 2019 Christchurch attack, transforming the firearm into a “canvas” for displaying ideological sympathies.

The phrase “Natural Selection” references the t-shirt worn by Columbine assailant Eric Harris, suggesting that the offender was not articulating a deeply understood ideology but imitating the aesthetics and visual rituals of prior perpetrators. The inscription “1189” refers to the Crusades, whereas “For Agartha” originates in far-right esoteric mythology. The appearance of Legionnaire and Latvian SS symbols underscores the trend of incorporating Eastern European iconography. Together, these illustrate a process of copy-paste symbolism, wherein the perpetrator appropriates visual features observed in acts of violence overseas and replicates them, devoid of local context.

This overarching pattern indicates a form of radicalisation grounded not in theology but in aesthetics. The attacker used a toy weapon as a performative object, a vehicle for expressing the identity he had assumed online. By inscribing those symbols, the offender was not arguing but posing, proclaiming himself part of a lineage of global violence imitators. This is the clearest indication that the phenomenon at hand is memetic radicalisation: the emulation of style rather than the comprehension of doctrine.

The police verified that the assailant had no affiliations with any terrorist organisation; he operated alone. This case, like those the attacker admired, underscores that youth can radicalise without robust networks or recruiters – internet access alone can suffice.

Aesthetic-Based Radicalisation: Visuals Replacing Doctrine

The attack at State Senior High School 72 Jakarta exemplifies the concept of memetic radicalisation. Adolescents may no longer come to violence through ideological conviction; they learn it through mimicry.

According to Askanius and Keller (2021, p. 12), visual belonging refers to the construction of radical identities through imitation of a group’s aesthetics, irrespective of understanding its philosophy. This corresponds with the behaviour of the Jakarta attacker, who inscribed white nationalist emblems on his toy weapon, employed gestures that function as cyphers for the far-right community, and captured images in a global-shooter aesthetic.

Martinez Pandiani et al. (2025, p. 4) assert that visual entities such as memes, poses, or symbols disseminate more rapidly and efficiently than textual content and operate as cultural artefacts that can be replicated without any comprehension of their original context. The Jakarta attacker did not need to be an ideologue, only an emulator; the internet provided numerous examples to imitate.

Moderation and Visual Mimicry 

The use of aesthetics and visuals is key to memetic radicalisation, as the SMA 72 (State Senior High School 72) offender clearly demonstrates. Examination of his TikTok activity reveals that he disseminated content featuring distinct extremist themes, including visuals inspired by white supremacist and neo-Nazi ideologies. The Global Project Against Hate and Extremism and the Soufan Center have confirmed that the account was his. Archived footage from his suspended account showed him posing with a black firearm while “Highway to Hell” played. The platform’s moderation protocols did not detect this extremist content until after the attack. Still, platforms acted promptly to delete his accounts, protecting the integrity of the investigation while also removing digital material that could have furthered the circulation of these images and encouraged others to follow his path from mimicry into violence.

Figure 2: Solomon & Rupnow “OK” hand gesture (Photo Credit: School Shooting Data Analysis and Reports)

The sign depicted in Figure 2 is intentional. The specific “OK” hand gesture, photographed between the assailant’s boots, has previously been used by other violent attackers in separate assaults in the United States. The sign has evolved into a memetic signal: a visual motif replicated by offenders to reference and emulate one another across nations. Although it appears innocuous in isolation, its significance becomes evident when considered alongside these prior instances.

According to Gomez et al. (2024, pp. 16-17), dangerous memetic symbols cannot be understood from a single image alone; they must be read in context and through recurring patterns. Consequently, single-item moderation is highly susceptible to misinterpretation. Symbols spread and mutate globally at a pace human moderators cannot match. Existing platform moderation systems, designed primarily to identify explicit text, are ill-equipped to read the nuanced and evolving visual language of radicalisation, particularly where mimicry is involved.

Digital Platforms Accelerate Imitation

Social media short-video feeds have been shown to recommend unsuitable and emotionally distressing material to children and teenagers from the very first session of use, exposing them to depictions of violence and suffering. This algorithmic exposure heightens the psychological susceptibility of children and adolescents, since the recommended content can normalise harmful narratives and increase the risk of internalising negative messages before adequate evaluative capacities have developed (Xue et al., 2025, pp. 1-2, 7-9). Burton (2023, p. 3) calls this algorithmic extremism: not because algorithms support extremism, but because they promote attention-grabbing content, and the aesthetics of violence often meet that criterion. When a user engages with three shooter-style videos, likes one, and follows another account, the algorithm promptly curates their feed to include analogous content.
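To make that dynamic concrete, the sketch below is a deliberately simplified, hypothetical model of engagement-weighted ranking in Python; it is not any platform’s actual recommender. The catalogue, tags, and weights are invented for illustration, and the point is only that a handful of interactions with shooter-style content is enough to tilt every subsequent recommendation toward similar material.

```python
from collections import Counter

# Hypothetical catalogue: each short video carries content tags.
CATALOGUE = {
    "v1": {"tags": {"gaming", "humour"}},
    "v2": {"tags": {"shooter-aesthetic", "weapons"}},
    "v3": {"tags": {"music", "humour"}},
    "v4": {"tags": {"shooter-aesthetic", "tactical-gear"}},
    "v5": {"tags": {"sports"}},
    "v6": {"tags": {"shooter-aesthetic", "music"}},
    "v7": {"tags": {"weapons", "tactical-gear"}},
}

# Illustrative engagement weights: stronger signals count for more.
WEIGHTS = {"view": 1.0, "like": 3.0, "follow_creator": 5.0}

def interest_profile(events):
    """Aggregate tag-level interest from raw engagement events."""
    profile = Counter()
    for video_id, action in events:
        for tag in CATALOGUE[video_id]["tags"]:
            profile[tag] += WEIGHTS[action]
    return profile

def rank_feed(events):
    """Score every unseen video by overlap with the user's interest profile."""
    profile = interest_profile(events)
    seen = {video_id for video_id, _ in events}
    scores = {
        vid: sum(profile[tag] for tag in meta["tags"])
        for vid, meta in CATALOGUE.items()
        if vid not in seen
    }
    return sorted(scores, key=scores.get, reverse=True)

# A few small interactions with shooter-style content dominate the profile,
# so the remaining shooter-adjacent videos (v6, v7) now rank first.
events = [("v2", "view"), ("v2", "like"), ("v4", "view"), ("v4", "follow_creator")]
print(rank_feed(events))  # ['v6', 'v7', 'v1', 'v3', 'v5']
```

Even in a toy model this small, the feedback loop is visible: the more the feed is tilted, the more of the same content the user engages with, and the stronger the tilt becomes.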

Alexander Sabar, Director General of Digital Space Supervision at Indonesia’s Ministry of Communication and Digital Affairs, reported that the perpetrator operated multiple accounts and channels across various social media and messaging applications. The Ministry subsequently engaged with platforms to take them down; by that time, however, the algorithm had already increased the visibility of content in which he had expressed even slight interest.

A Transnational Violent Fandom

The symbols inscribed on the perpetrator’s weapon are not a phenomenon confined to the Indonesian context; they are the product of transnational symbol transfer. The pattern we observe is as follows:

  • Breivik served as an inspiration to Tarrant.
  • Tarrant served as an influence on the Buffalo gunman.
  • The attacker at State Senior High School 72 was influenced by the Buffalo shooter and Tarrant.

This pattern indicates that modern violence is increasingly independent of organisations or manifestos; it relies heavily on a visual script that youth can easily replicate. In this context, the True Crime Community (TCC) is not itself an extremist group; however, certain segments may function as a nihilistic environment in which violence is revered rather than denounced. Within these environments, perpetrators are idolised, identities are constructed around violence, and adolescents may adopt the symbols and behaviours of previous aggressors. Imitation can become a means of attaining visibility or a sense of significance, turning parts of the TCC into a conduit for Nihilistic Violent Extremism: not through ideological belief, but through the intrinsic allure of replication.

How Artificial Intelligence Could Support PCVE Practitioners in Their Response 

  1. AI Can Detect Symbols That Are Difficult for Human Eyes to Perceive

Computer vision systems can be developed to identify:

  • Short inscriptions on weapons,
  • Particular hand gestures,
  • Numerical codes commonly employed by extremists,
  • Standardised shooter-style self-portraits,
  • Pre-attack photograph composition.

Abi-Nader et al. (2025, p. 294) demonstrated that contemporary visual detection systems are capable of identifying high-risk weapons and visual patterns even in low-quality video footage. Symbols that were formerly overlooked, such as “1189” or “14 Words”, can be re-evaluated within the context of a user’s broader behaviour.
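As a purely illustrative sketch of the lexicon-matching stage such a system might include, the snippet below assumes a hypothetical upstream OCR or vision model has already extracted text strings from an image (that step is not shown) and checks them against a small, invented lexicon of known markers. The terms, weights, and function name are assumptions for illustration, not a description of any deployed detector.

```python
# Hypothetical lexicon of visual-symbolic markers seen in prior attacks.
# Weights are illustrative only; a real system would calibrate them on data
# and combine this signal with behavioural context, not use it in isolation.
SYMBOL_LEXICON = {
    "14 words": 0.9,
    "1189": 0.6,
    "for agartha": 0.7,
    "natural selection": 0.6,
    "brenton tarrant": 0.9,
}

def score_image_text(extracted_tokens):
    """Match strings read off an image by an upstream OCR/vision model
    against the symbol lexicon and return a naive aggregate risk score."""
    text = " ".join(extracted_tokens).lower()
    hits = {term: weight for term, weight in SYMBOL_LEXICON.items() if term in text}
    score = min(1.0, sum(hits.values()))  # cap the combined score at 1.0
    return {"matched_symbols": sorted(hits), "risk_score": round(score, 2)}

# Example: tokens an OCR pass might plausibly return from an inscribed weapon.
tokens = ["NATURAL SELECTION", "1189", "For Agartha"]
print(score_image_text(tokens))
# {'matched_symbols': ['1189', 'for agartha', 'natural selection'], 'risk_score': 1.0}
```

The hard part, as the next point argues, is not matching known strings but recognising when individually ambiguous signals form a pattern.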

  2. AI Recognises Patterns Rather Than Isolated Images

Memetic radicalisation operates cumulatively. One post is innocuous. Five posts within a fortnight, combined with specific numerical symbols, begin to establish a pattern. Gomez et al. (2024, p. 8) assert that pattern-based evaluation is more accurate than the evaluation of single items. AI can issue alerts indicating that users are beginning to upload visual styles reminiscent of pre-attack patterns seen in international cases: not for punishment, but for early warning.
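A minimal sketch of what such pattern-based evaluation could look like in practice is given below. The window length, thresholds, and signal labels are invented for illustration; the point is only that no single post triggers anything, while a cluster of different signals inside one window does.

```python
from datetime import datetime, timedelta

# Hypothetical per-post signals produced by upstream detectors.
posts = [
    (datetime(2025, 10, 20), "shooter_aesthetic_video"),
    (datetime(2025, 10, 25), "numeric_code_1189"),
    (datetime(2025, 10, 28), "ok_gesture_pose"),
    (datetime(2025, 11, 1), "weapon_inscription"),
]

WINDOW = timedelta(days=14)   # assumed observation window
MIN_POSTS = 3                 # assumed threshold: flagged posts in the window
MIN_DISTINCT = 2              # assumed threshold: distinct signal types

def pattern_alert(posts, now):
    """Raise an early-warning flag only when several distinct signals cluster."""
    recent = [(ts, signal) for ts, signal in posts if now - ts <= WINDOW]
    distinct = {signal for _, signal in recent}
    if len(recent) >= MIN_POSTS and len(distinct) >= MIN_DISTINCT:
        return {"alert": True,
                "recent_posts": len(recent),
                "signal_types": sorted(distinct)}
    return {"alert": False}

# Any one of these posts alone would pass unnoticed; together they form a pattern.
print(pattern_alert(posts, now=datetime(2025, 11, 3)))
```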

  3. PCVE and XAI Can Detect the Spread of Extremism Through Symbol-Reading Methods

Explainable AI (XAI) can demonstrate why a pattern is deemed dangerous. Explainable models facilitate human comprehension of risk, as demonstrated by Hedhili and Bouallagui (2024, p. 253), and human judgements made in conjunction with AI are of higher quality when accompanied by counterfactual explanations (Ibrahim et al., 2023, p. 331). With the help of XAI, PCVE practitioners can see small behavioural changes that were previously imperceptible.
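To illustrate what a counterfactual explanation might look like, the sketch below fits a small logistic model on synthetic data (no real case data) and reports how the risk score for one hypothetical profile would change if a single behavioural feature were removed. The feature names, threshold, and data are all assumptions, and this is far simpler than a full XAI framework; it only shows the kind of “what would change the score” output that makes a flag legible to a human reviewer.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical binary behavioural features for a user profile.
FEATURES = ["shooter_aesthetic_posts", "extremist_symbol_use",
            "ok_gesture_pose", "gaming_content"]

# Synthetic training data: the label loosely follows the first two features.
X = rng.integers(0, 2, size=(400, len(FEATURES)))
y = ((X[:, 0] + X[:, 1] + rng.normal(0, 0.5, 400)) > 1.2).astype(int)

model = LogisticRegression().fit(X, y)

def explain_counterfactually(profile, threshold=0.5):
    """Report how removing each present feature would change the risk score."""
    base = model.predict_proba([profile])[0, 1]
    report = {"risk_score": round(float(base), 2),
              "flagged": bool(base >= threshold),
              "counterfactuals": {}}
    for i, name in enumerate(FEATURES):
        if profile[i] == 1:
            alternative = list(profile)
            alternative[i] = 0
            alt_score = model.predict_proba([alternative])[0, 1]
            report["counterfactuals"][f"without_{name}"] = round(float(alt_score), 2)
    return report

# Which behaviours actually drive the score for this hypothetical flagged profile?
print(explain_counterfactually([1, 1, 1, 0]))
```

The value for a practitioner lies in the counterfactual lines: they show which specific behaviours push the profile over the threshold, rather than presenting an opaque score.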

  4. AI Can Track the Movement of Symbols Across Borders

Ajala et al. (2022, p. 2) showed that extremist groups on social media can be accurately identified using network analysis that combines AI with human evaluation.

PCVE practitioners can trace how:

  • A symbol originating in European extremist communities later began circulating on US social media platforms,
  • Then it showed up on TikTok Indonesia,
  • And was finally used by a Jakartan teen.

Transnational trends like these are very hard to track without AI mapping.
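As an illustration of what such mapping might involve, the sketch below encodes a hypothetical propagation chain as a small directed graph: nodes are platform/region communities, edges record the earliest observed hop of a symbol between them, and betweenness centrality points to the communities that act as bridges. The nodes, dates, and symbol are invented for illustration and are not findings from this case.

```python
import networkx as nx

# Directed graph: an edge (u, v) means the symbol was observed on u before v.
# All nodes and "first seen" dates below are hypothetical.
G = nx.DiGraph()
hops = [
    ("imageboard/EU", "forum/US", {"symbol": "1189", "first_seen": "2019-04"}),
    ("forum/US", "short-video/US", {"symbol": "1189", "first_seen": "2022-06"}),
    ("short-video/US", "short-video/ID", {"symbol": "1189", "first_seen": "2024-11"}),
    ("short-video/ID", "local-account/Jakarta", {"symbol": "1189", "first_seen": "2025-09"}),
]
for src, dst, attrs in hops:
    G.add_edge(src, dst, **attrs)

# Trace the propagation chain from the assumed origin to the local account.
path = nx.shortest_path(G, "imageboard/EU", "local-account/Jakarta")
print(" -> ".join(path))

# Bridge communities: nodes that sit on many of the cross-border paths.
centrality = nx.betweenness_centrality(G)
bridges = sorted(centrality, key=centrality.get, reverse=True)[:2]
print("likely bridge communities:", bridges)
```

In any real deployment, such a graph would be built from observed data and reviewed by analysts, subject to the privacy and human-verification safeguards discussed below.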

The Need to Act

The offender’s journal entry in the SMA 72 case reveals that he felt isolated but had discovered online a new world, complete with identity and belonging.

Hogan et al. (2021, p. 331) demonstrated that young people can be harmed by negligently designed risk systems. While AI has the potential to transform the way we understand radicalisation, it must be implemented with great care: protecting individuals’ privacy, anonymising data, and incorporating human verification so that the technology does not enable new injustices.

Conclusion

The attack at Jakarta’s State Senior High School 72 exposes a new dimension of youth radicalisation. It was not constructed by a recruiter. It did not progress through doctrine. The attacker barely argued in words at all; he communicated through visuals and style, displaying small emblems that appeared trivial to the casual observer yet held profound significance for the global community that produced them.

The next iteration of terrorism may not always announce itself verbally; it will present itself visually. In such a world, PCVE must adapt. Practitioners must develop the ability to read visual materials and better understand aesthetics. They must work with AI not to supplant human judgement, but to raise early alerts when a teenager begins to drift toward violence. Artificial intelligence will not resolve every issue, but it is currently the tool best placed to perceive the visual patterns that now constitute the principal language of digital radicalisation.

Failure to comprehend this language will mean failing to read the emerging generation of violent threats, and State Senior High School 72 will not be the last instance.

Chevy Atha Khairan is a researcher at the Think First Institute (TFI) and Marketing Research Indonesia (MRI). He is a recent bachelor’s graduate of Syarif Hidayatullah State Islamic University Jakarta. His research focuses on environmental psychology, youth digital behaviour, and the psychological dynamics shaped by AI, social media ecosystems, and emerging risks of online extremism.
