
Group Dynamics in Far-Right Telegram Communities: A (Worthy) Challenge for Radicalisation Research

18 February 2026 | Ulrike Schwertberger

This Insight draws on data from the RadiGaMe project and is funded by the German Federal Ministry of Research, Technology, and Space.

Research on the rise of far-right movements across Western democracies has largely focused on what these actors say. Accordingly, researchers have investigated their grievances, fears, enemies, and how they legitimise violence. While this work has generated important insights, it often treats online platforms as static billboards for hate. Our recent study of 569 interaction fragments on Telegram suggests we should be asking a different question: How do these groups function as social systems?

Radicalisation is not merely a cognitive shift driven by exposure to content; it is a longitudinal social process. By focusing on how radical groups negotiate norms, enforce “us vs. them” worldviews, but also hold and validate shared identities, we can better understand why these communities remain resilient even after deplatforming. 

Encrypted messaging platforms—most notably Telegram—have become central places for this process. Their combination of perceived anonymity, limited moderation, and group-based affordances has made them particularly attractive to far-right and violent extremist actors seeking to sustain communities beyond the reach of traditional gatekeepers. This Insight shifts the analytical focus from individual messages to the social dynamics unfolding within these groups, exploring how they operate as social systems and what this means for the future of P/CVE.

From Ideological Arenas to Social Systems

Far-right Telegram groups are far more than static repositories of propaganda. They are interactive environments in which members do not just consume content, but “perform” belonging. Participants join to take part in ongoing collective processes that provide meaning, recognition, and orientation. In this sense, far-right Telegram groups function like every other social group: they develop shared norms, informal hierarchies, and recurring interaction patterns.

Group and organisational psychology has long demonstrated that group dynamics, such as processes of cooperation, conflict, leadership, and regulation, shape group stability. Yet radicalisation research rarely applies these insights to extremist groups, tending instead to treat the “group” as a secondary backdrop or a mere container for radicalising content. Communication research, for its part, recognises that groups are functional contexts for radicalisation, providing norms, identity, and a sense of significance. Both perspectives, however, overlook a critical reality: extremist groups are dynamic systems whose internal functioning can change substantially over time. Their internal chemistry shifts as membership turns over, external pressures mount (such as law enforcement action), and the group matures.

Understanding these “lifecycle dynamics” is essential. It explains how far-right communities sustain high levels of engagement over years, rather than months, and how they bridge the gap between individual curiosity and collective radicalisation.

The Anatomy of Interaction: Findings from the RadiGaMe Project

To better describe patterns and dynamics of group interactions, we analysed discussion fragments from 155 far-right Telegram groups. All groups are German-speaking, with group sizes ranging from 34 to roughly 10,000 members. The sample provides a heterogeneous perspective on Telegram groups, encompassing radical actors who reject liberal democratic values and extremist actors who reject democracy at its core. This includes actors who support violence against certain social groups as well as against the state as a whole. Covering a period of six years, we semi-randomly selected discussion fragments from the most active points in time of each group.
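To make the sampling logic concrete, the sketch below shows one plausible way to implement "semi-random selection from the most active points in time": bucket a group's messages by calendar week, locate the busiest weeks, and draw a contiguous fragment from each. This is an illustrative reconstruction under assumed data shapes (timestamped message tuples), not the RadiGaMe project's actual pipeline; the function name and parameters are hypothetical.

```python
import random
from collections import Counter
from datetime import datetime, timedelta

def sample_fragments(messages, n_peaks=3, fragment_size=10, seed=42):
    """Illustrative sketch: find a group's most active weeks and draw
    one contiguous message fragment from each.

    `messages` is a list of (timestamp, text) tuples sorted by time --
    a simplified stand-in for a real Telegram export.
    """
    rng = random.Random(seed)
    # Bucket messages by ISO (year, week) to locate activity peaks.
    weeks = Counter(ts.isocalendar()[:2] for ts, _ in messages)
    peak_weeks = [w for w, _ in weeks.most_common(n_peaks)]
    fragments = []
    for week in peak_weeks:
        idx = [i for i, (ts, _) in enumerate(messages)
               if ts.isocalendar()[:2] == week]
        if not idx:
            continue
        # Draw a random contiguous window starting inside the peak week.
        start = rng.choice(idx)
        fragments.append(messages[start:start + fragment_size])
    return fragments
```

Sampling contiguous windows (rather than isolated messages) matters here: the unit of analysis is the interaction fragment, so the surrounding conversational turns must be preserved.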

Our systematic analysis of 569 discussion fragments reveals that far-right Telegram groups function less like loose message streams and more like structured social ecosystems. We identified three core pillars that sustain these communities:

  1. Collective Sense-Making: A central feature is “informational cooperation.” Members don’t just share links; they collectively interpret news events to reinforce a shared worldview. This acts as a primary cohesive force, sometimes turning the group into an “alternative newsroom” that validates their bias in real-time.
  2. Emotionalisation: Communication is fuelled by a volatile mix of affiliative support for the “ingroup” and intense contempt for the “outgroup.” This emotional climate, built on such binaries, reinforces moral boundaries and potentially increases the cost of leaving the group.
  3. Stabilising Narratives: Tying stories to identities, particularly those framing the group as “victimised” or “under threat,” serves to normalise grievances. This turns individual anger into a collective political project.

From Contestation to Consolidation

Perhaps our most significant finding is how these dynamics evolve over time. 

In the early phases, groups experience higher levels of contestation as members negotiate norms and “test” one another. However, as the group matures, overt conflict declines, replaced by intensified identity reinforcement and emotional investment. This suggests a process of social consolidation: the group becomes more stable, more insular, and potentially more resilient to external counter-messages. These findings underline that radicalisation on Telegram is not just about what people see (extremist content); it is about how they live within these everyday group processes that provide sustained meaning and cohesion.

Why Group Dynamics Matter for Radicalisation Research

Focusing on group dynamics challenges the “lone wolf” or individual-centric models of radicalisation that over-emphasise personal vulnerability in isolation. Instead, it highlights that radicalisation often unfolds within relational contexts that provide feedback, validation, and social rewards.

Participation in a far-right Telegram group is thus not just about absorbing ideology; it is about entering a social marketplace that offers three radicalisation-critical currencies: validation, meaning, and recognition. Seen as a ‘social glue’, group dynamics explain why extremist narratives resonate deeply: they are not just ideas, they are the basis for belonging. This is why members remain engaged over long periods, and why leaving such communities can be difficult. Identity reinforcement, emotional synchronisation, and informational cooperation create a social environment in which extremist worldviews are normalised and reinforced through everyday interaction. This can contribute to the mainstreaming of increasingly extremist viewpoints into the middle of society. Understanding these processes is therefore crucial for both explanatory and preventative efforts.

The Researcher’s Dilemma: Technical and Ethical Barriers

While these findings underline the analytical value of a group-dynamic perspective, they also expose the substantial barriers researchers face when attempting to study such processes systematically. These technical, legal and methodological challenges do not merely complicate data collection; they raise fundamental questions about the limits of scientific inquiry into semi-encrypted communication spaces and about the responsibilities of researchers when investigating politically sensitive and potentially harmful environments.

  1. The Privacy Paradox: Ethics vs. Early Warning
    Access to Telegram data is shaped not only by technical limitations but also by legal and ethical boundaries. From a European—and particularly a German—legal perspective, researchers cannot simply intrude into private or semi-private communication spaces. Even when groups are technically “public,” their participants may not reasonably expect systematic academic observation or long-term data storage. Data protection law, constitutional guarantees of privacy, and principles of proportionality impose clear limits on what can be collected, stored, and analysed.
    This creates a fundamental tension for extremism research: How far should researchers go to understand potentially dangerous group processes early enough to inform prevention and policy—and where must clear red lines be drawn to protect individual rights and democratic norms? This tension is unlikely to be resolved through technical solutions alone but requires explicit normative and legal guidance. While the societal interest in understanding radicalisation processes is substantial, it does not automatically legitimise unrestricted access to digital communication spaces. As a result, much research relies on cautious ethical self-regulation, selective sampling, and conservative data handling practices, often at the expense of completeness or representativeness.
    Beyond legal constraints, Telegram itself offers very limited support for systematic discovery. The platform’s weak search functionality makes it nearly impossible to identify groups in a comprehensive or unbiased manner. Keyword-based searches privilege highly visible, explicitly ideological, or already well-known communities, while smaller, more insular, or strategically coded groups remain largely invisible. This introduces substantial sampling bias and makes it difficult to assess how representative observed group dynamics are for the broader far-right Telegram ecosystem.


  2. Platform Volatility: The “Ephemeral” and Fragmented Extremist Space
    Telegram is a highly volatile platform. Groups frequently disappear, are deleted by administrators, migrate to new spaces, or are removed following platform enforcement actions. In practice, this volatility severely constrains longitudinal research. In our own data collection, groups captured in 2024 were, in many cases, already deleted or banned only a few months later. Consequently, long-term observation is only feasible if data are collected continuously and at relatively short intervals.
    Such repeated data collection, however, requires substantial computational infrastructure. Storing large volumes of chat data over extended periods demands significant server capacity, secure storage solutions, and sustained funding—resources that are often beyond the reach of individual research projects. Even with regular collection, data loss remains unavoidable. Messages are frequently deleted by users or administrators, and once removed, they cannot be reconstructed retroactively. This results in large amounts of missing data and fragmented interaction histories, limiting the reliability of longitudinal reconstruction.
    Algorithmic curation further complicates matters. Although Telegram lacks the overt recommendation systems of mainstream platforms, visibility within and across groups is still shaped by forwarding practices, cross-posting, administrator decisions, and informal network hierarchies. These mechanisms influence which messages gain traction and which group interactions become observable, yet they remain largely inaccessible to researchers. As a result, it is difficult to disentangle whether observed patterns reflect genuine group processes or artefacts of platform-specific visibility structures.

Furthermore, extremist actors rarely use only one platform; instead, they use different platforms for different purposes. Cross-platform research is therefore becoming increasingly important for identifying communication purposes, radicalisation dynamics, and group objectives. However, such research endeavours are complex and consequently rare.


  3. The Measurement Gap: LLMs vs. Latent Context
    Measuring group dynamics at the content level is conceptually and empirically demanding. Group dynamics are inherently relational, processual, and contextual, yet most available data consist of isolated messages stripped of non-verbal cues, interactional histories, and situational context. Translating complex social-psychological constructs—such as cooperation, norm enforcement, or identity work—into content-based indicators inevitably involves simplification.
    The scale of available data further intensifies this problem. Telegram datasets often comprise millions of messages, making manual coding infeasible. While large language models offer promising tools for automation, their ability to validly capture subtle social-psychological constructs remains uncertain. Group dynamics are not merely semantic features; they depend on timing, reciprocity, power relations, and shared histories—dimensions that current automated approaches struggle to represent reliably.
    Moreover, researchers rarely have access to complete discussions. Data typically consists of interaction fragments rather than coherent conversational units. Identifying which messages respond to one another is often difficult due to Telegram’s limited reply structure, overlapping conversations, and extensive use of forwarded content. Reconstructing full interaction sequences would require highly complex models together with enormous human coding efforts. Consequently, most studies must work with partial representations of group processes, acknowledging that they capture tendencies rather than exhaustive accounts of collective interaction.
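The thread-reconstruction problem described above can be sketched in a few lines. The toy function below groups messages into interaction threads by following explicit reply links where they exist and falling back to temporal proximity where they do not; this heuristic fallback is exactly why reconstructions remain partial. The field names (`id`, `ts`, `reply_to`) and the gap threshold are illustrative assumptions, not the structure of any specific Telegram export.

```python
def reconstruct_threads(messages, gap_seconds=300):
    """Minimal sketch: group messages into interaction threads.

    Each message is a dict with 'id', 'ts' (unix seconds), and an
    optional 'reply_to' id -- a simplified stand-in for exported chat
    metadata. Explicit replies join the parent's thread; otherwise a
    message posted within `gap_seconds` of the previous one is
    heuristically treated as part of the same conversation.
    """
    thread_of = {}   # message id -> thread id
    threads = {}     # thread id -> ordered list of member messages
    last_ts, last_thread = None, None
    for msg in sorted(messages, key=lambda m: m["ts"]):
        parent = msg.get("reply_to")
        if parent in thread_of:
            tid = thread_of[parent]            # explicit reply chain
        elif last_ts is not None and msg["ts"] - last_ts <= gap_seconds:
            tid = last_thread                  # temporal-proximity fallback
        else:
            tid = msg["id"]                    # start a new thread
        thread_of[msg["id"]] = tid
        threads.setdefault(tid, []).append(msg)
        last_ts, last_thread = msg["ts"], tid
    return threads
```

Note how fragile the fallback is: overlapping conversations, forwarded content, and deleted parent messages all break the proximity assumption, which is why studies on such data capture tendencies rather than exhaustive interaction histories.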

A Roadmap for Resilient Research and Prevention 

To meaningfully integrate group dynamics into the study of radicalisation—while respecting legal, ethical, and democratic boundaries—coordinated action across research, legislation, executive institutions, and prevention practice is required:

  1. Clear legal research frameworks that specify under which conditions the observation of extremist online groups is lawful, proportionate, and ethically justified, thereby reducing legal uncertainty and encouraging responsible research.
  2. Methodological investment in theory-driven, mixed-method approaches that combine social-psychological expertise with carefully validated computational tools for capturing group-level processes at scale.
  3. Greater platform accountability and transparency, including regulated research access and clearer documentation of platform features that shape visibility, interaction, and moderation practices.
  4. Institutionalised exchange between research and security actors, ensuring that insights into group dynamics inform early-warning and risk-assessment mechanisms without undermining civil liberties.
  5. System-aware prevention strategies at the centre of deradicalisation efforts. If radicalisation is a collective process, “exit” must also be addressed collectively. We need interventions that disrupt emotional climates and group norms, moving beyond individual-focused counter-narratives toward strategies that dismantle the social ecosystems of extremist communities.

Taken together, these steps highlight that understanding—and countering—far-right radicalisation requires not only better data or tools, but a sustained commitment to studying and addressing the social dynamics that make extremist communities durable over time.

Ulrike Schwertberger is a research associate and PhD student at LMU Munich, Germany. Her research focuses on group dynamics in far-right communities, i.e., interpersonal interactions and negotiations of identity and conflict. Using longitudinal and multilevel data, she explores the relations between group dynamics and radicalisation processes. 

Simon Greipl is a research associate in Prof. Dr. Rieger’s teaching and research department (IfKW, LMU Munich). As part of the MOTRA project funded by the BMFTR, he focuses on identifying radicalisation dynamics in online environments with an emphasis on group phenomena. His particular research interest lies in investigating radicalisation phenomena in the context of gaming and its communities.

Diana Rieger is Professor for Communication Science at the Department for Media and Communication at LMU Munich. Her research focuses on extremist online communication, investigating hate speech, fear speech, conspiracy content, potential counter messages and their effects.
