Beyond Extremism: Platform Responses to Online Subcultures of Nihilistic Violence

17 February 2026 | Institute for Strategic Dialogue

Subcultures of nihilistic violence have emerged as a central online threat, targeting and manipulating young people. The Institute for Strategic Dialogue (ISD) defines nihilistic violence as violent acts that lack an ideological motivation and are driven instead by a misanthropic worldview. These communities form a decentralised web of chats, forums and channels characterised by support for violence for violence's sake, with no specific political, ideological or religious goal.

This policy paper provides an overview of the specific online threat landscape of nihilistic violence subcultures and outlines the implications for platform measures to protect users. The first section sets out the networks that comprise the ecosystem, the ways in which they use platform functions to conduct harmful activities, and a taxonomy of the resulting harms. The second section considers how existing platform terms of service relate to these harms. The third section offers an overview of intervention opportunities for platforms and considers additional innovative approaches to ecosystem disruption.

Key Findings

  • While occupying parallel digital spaces and producing similar types of harm, online subcultures of nihilistic violence are distinct from ideologically motivated extremism. This unique threat requires bespoke platform interventions rather than expansions and adaptations of existing terrorism- and violent extremism-focused frameworks.
  • Nihilistic violence ecosystems are decentralised, cross-platform and highly agile, leveraging mainstream and fringe platforms for grooming, propaganda and operational coordination. Platform strategies should not treat the threat as a set of new dangerous organisations, but should instead understand it as a more dynamic threat from nihilistic violent subcultures, of which 'groups' like 764 and the True Crime Community are only the latest manifestation.
  • Nihilistic violent communities produce a much broader range of harms than ideologically motivated extremist networks, spanning sexual exploitation, cybercrime and various forms of real-world targeted violence, including self-harm, animal abuse, interpersonal violence and mass casualty attacks such as school shootings.
  • New platform policies are not necessarily required to mitigate the threat, as many of these harms are already covered by platform community guidelines. However, these guidelines should be knitted together into a cohesive platform strategy: enforcement against ecosystems of nihilistic violence is currently fragmented and reactive, enabling ban evasion and rapid regrouping.

Key Recommendations

  • Adopt an ideology-agnostic, behavioural approach to threat assessment: Shift from group-focused frameworks to models addressing behavioural indicators, pro-violence content, aesthetics and more diverse harm matrices.
  • Implement a spectrum of platform violence-prevention interventions, informed by a public health approach: Focus on upstream prevention. Early intervention should seek to build resilience through education and employ inoculation approaches.
    • Enhance platform-level safeguards: Consider opportunities for impactful platform-facilitated safety interventions – such as providing expert resources and developing community education campaigns around evolving nihilistic violence threats.
    • Empower community-level interventions: Equip moderators in fandom-driven spaces with bystander intervention tools and off-ramping resources.
    • Build bridges to support services: Provide a wider range of safeguarding support within relevant communities and ensure relevance to specific subcultures.
    • Innovate counter-communications: Use authentic, grassroots content that engages subcultural humour and aesthetics, while avoiding ideological deradicalisation messaging ill-suited to this threat.
    • Develop dynamic ecosystem disruption strategies: Coordinate cross-platform takedowns informed by intelligence-led mapping, leveraging GIFCT-style collaborative frameworks for wholesale network disruption.
  • Strengthen moderation and enforcement: Integrate ban-evasion markers and regrouping codes into moderation practices and consider IP/device fingerprinting to address the proliferation of burner account activity within these communities.
  • Invest in research and cross-sector collaboration: Establish an information-sharing hub to track evolving codes, platform usage and threat dynamics. As part of this, provide researchers with meaningful access to platform data to enable a joined-up, sectoral approach to this rapidly evolving threat.

The proposal for this Institute for Strategic Dialogue (ISD) paper came to the Global Network on Extremism and Technology (GNET) from GIFCT's 2025 Working Group, Addressing Youth Radicalisation and Mobilisation. The Working Group sought to identify current trends in youth radicalisation and mobilisation online, alongside lessons learned from prevention and positive intervention strategies, in order to address these dynamics.
