Digital sex crimes such as South Korea’s molka (hidden camera) phenomenon and the Nth Room case are frequently described as criminal acts or cultural aberrations. However, these practices warrant deeper recognition as manifestations of gender-based violent extremism that exploit digital infrastructures to scale, monetise, and normalise misogynistic harm. These crimes are technologically mediated, transnational in scope, and ideologically coherent, rooted in dynamics of power, punishment, and control. While these practices are not typically framed as extremism, they exhibit many of its core characteristics: shared ideology, decentralised networks, and the strategic use of digital tools to inflict harm. This gendered extremism is neither fringe nor isolated. It is embedded in the everyday architectures and affordances of digital life.
This Insight analyses how digital infrastructures facilitate this form of everyday extremism, with a particular focus on South Korean cases, and explores how platforms, perpetrators, and design features co-produce online environments that reward and sustain misogynistic abuse.
Molka as Distributed Extremism
Molka refers to the covert filming of women’s bodies without consent. In South Korea, these crimes have become a defining feature of the country’s digital landscape of gender-based violence. According to media sources, more than 30,000 molka cases were reported between 2015 and 2018, with the real number suspected to be far higher. Recent analyses suggest the problem persists despite legal reforms and remains deeply pressing.
The threat operates on two levels. In public settings, micro-cameras are hidden in subway bathrooms, hotel rooms, cafes, gyms, and libraries. Police investigations in Seoul have uncovered sophisticated caches of devices embedded in fire alarms, ceiling lights, hair dryers, and electrical sockets. In private settings, perpetrators include boyfriends, husbands, or acquaintances who plant car key-shaped or USB-style cameras in bedrooms or install spyware on personal devices. In both cases, women are rendered vulnerable in the very spaces where they are meant to feel secure.
Digital infrastructure enables the rapid dissemination of this footage through encrypted channels and decentralised platforms. Messaging apps like Telegram and Discord, along with niche pornography sites, serve as hosts for curated content, often accompanied by guides on how to evade detection and monetise recordings. Despite the closure of notorious platforms like Soranet, molka footage continues to circulate via cloud services and closed forums, revealing the adaptability and durability of these networks.
Far from being isolated offences, molka operates within a distributed ecosystem of abuse. Offenders frequently catalogue content by victim name, body part, or filming angle, suggesting intentionality, coordination, and a shared ideological investment in the degradation of women. In one international case, a doctor working in Melbourne was found to have covertly filmed hundreds of female colleagues, sorting footage into folders by name and anatomical focus. This logic of classification mirrors the tactics of extremist movements. It is systematic, repeatable, and purposefully dehumanising.
Molka has also become increasingly integrated into Nth Room-style operations. The original Nth Room case, exposed in 2020, involved the blackmail and sexual exploitation of women and minors through encrypted Telegram channels. Access to this content was tiered, with higher-paying users granted entrance to more violent material. A 2025 conviction revealed the persistence of this model. A man had created a Telegram channel titled Geu Beonbang (“That Room”), where molka-style recordings were exchanged for online gambling referrals and gift-card PINs. These cases underscore a continuity of logic, if not formal structure, across networks.
These operations exhibit characteristics commonly associated with extremist movements. They rely on technical sophistication, decentralised dissemination, ideological reinforcement, and economic incentive. Molka operates within this logic. It functions as a form of everyday, distributed gendered extremism rather than a collection of isolated voyeuristic acts.
Technology as Infrastructure of Abuse
The Nth Room case offers a powerful example of how digital infrastructure can facilitate, escalate, and monetise misogynistic abuse. The network used encrypted messaging, cryptocurrency payments, and categorised distribution channels to coerce victims and reward perpetrators. Even after the core network was dismantled, similar operations emerged elsewhere, adapting their technological and logistical strategies to evade detection.
These cases demonstrate that platforms operate not as passive hosts but as infrastructures that can facilitate extremism. Telegram’s encryption enables anonymity and persistence. Cloud storage systems archive large volumes of illicit material. Pornography platforms monetise voyeuristic content, often through categories that reward non-consensual imagery. These technical affordances, including minimal moderation, frictionless file sharing, and anonymous uploads, are far from neutral. They shape user behaviour and enable harm.
The same applies to molka. Here, technology is integral. Hidden camera footage is often stored in indexed folders organised by age, body type, or relationship to the perpetrator, and distributed via encrypted platforms. Pornography sites categorise and circulate this material, sustaining its visibility and profitability. These systems resemble extremist propaganda networks, featuring searchable archives, contributor hierarchies and coded language designed to avoid detection. The feedback loop between technological features and ideological incentives intensifies both harm and participation.
Molka offences are thus not isolated incidents but part of a decentralised and durable abuse infrastructure. The logic underpinning them is consistent: women’s pain becomes content, their vulnerability becomes currency, and digital platforms provide the architecture to facilitate, conceal, and reward participation. As seen in the Nth Room and its successors, the more invasive the footage, the greater the user’s status in the network; the more private the platform, the lower the risk of exposure. This feedback loop between technological affordances and ideological desire mirrors other forms of online extremism, where visibility, virality, and cruelty generate social capital.
In these ways, digital technologies do not merely enable abuse; they constitute the infrastructure of everyday extremism. Their features are embedded with value-laden decisions that reward misogynistic behaviour, conceal perpetrators, and entrench harm. Recognising molka and Nth Room-like operations as digitally mediated gender-based extremism enables more accurate and actionable responses, including platform accountability, algorithmic interventions, and transnational legal cooperation to target not only the users but the architectures that sustain them.
Misogyny as Ideological Extremism
While the men behind molka and Nth Room-style crimes are not affiliated with formal extremist organisations, their actions are deeply ideological. The logic is consistent, whereby women’s bodies are positioned as objects of surveillance, control, and punishment. In molka, this ideology is implied through the act of covert filming: stripping women of privacy, consent, and subjecthood. In Nth Room, it is explicit: perpetrators used blackmail and doxxing threats to coerce victims into submission, referring to them as ‘slaves’ and gamifying abuse through tiered access and status competition.
This represents a phenomenon that is both affective and collective. It binds participants through a shared worldview in which women’s autonomy is threatening and degradation is a form of dominance. These practices function as gendered extremism, as they radicalise participants, foster group cohesion, and encourage escalation. Content is not consumed in isolation but through communities via likes, rankings, comments, and payment-based access. These are rituals of radicalisation, beginning with initiation through viewing, followed by reinforcement through interaction, and culminating in advancement through cruelty.
Participation is also structured by hierarchy. New users may begin as passive consumers, progressing to commenters, contributors, and eventually high-status members who produce, moderate, or monetise content. Platform architectures such as Telegram’s allow admins, uploaders, and subscribers to operate within informal economies of abuse. Channels develop internal jargon, coded tags, and enforcement norms, resembling extremist cells that decentralise for resilience while replicating tactics across platforms.
The ideological structure mirrors other forms of extremism. It establishes in-groups and out-groups, rewards loyalty and brutality, and thrives on performance and visibility. Misogyny becomes the unifying principle here. These are closed communities that validate grievances, amplify entitlement, and celebrate transgression. The spectacle of suffering becomes both propaganda and currency, as in other extremist subcultures.
Crucially, this extends beyond individual misogyny. It reflects a coherent worldview in which women’s sexual agency, public presence, or perceived defiance is framed as a provocation deserving punishment. Recruitment occurs not through manifestos but through habitual practices of surveillance and sharing. It is banal and intimate, yet radicalising. Its power lies in repetition and diffusion.
These harms are also international in scope. The logic of molka – covert recording, categorisation, and voyeuristic redistribution – has travelled. Spycam crimes are now replicated internationally, and Nth Room-style networks have emerged in China, adopting similar architectures of coercion and anonymity. This transnational spread reveals misogynistic extremism as a modular, scalable, and global subculture embedded in everyday digital life, rather than an isolated phenomenon.
Molka and Nth Room-style activities draw strength from everyday behaviours like surveillance, sharing, and participation, rather than relying on manifestos or formal doctrines. The model is scalable: what begins as a hidden camera on a Seoul subway or a Telegram blackmail ring in Busan can quickly become part of a global economy of misogynistic harm. The infrastructures are interoperable, the ideology portable. In this sense, digital sex crimes go beyond being illegal or immoral acts. They function as everyday forms of gendered extremism, embedded in the infrastructures of digital life.
Intervening in the Digital Ecosystem
To counter this form of everyday extremism, digital platforms must recognise their own infrastructural complicity. While platforms such as Telegram and Discord have policies prohibiting the distribution of non-consensual intimate imagery and related harms, enforcement remains inconsistent and reactive. For instance, Discord’s Trust and Safety team has expanded efforts to detect grooming and exploitative content, yet continues to rely heavily on user reporting and limited metadata analysis. In the absence of proactive detection measures, subscription-based channels and encrypted folders that host molka-style content often persist undetected.
A more systemic response is needed. Algorithmic moderation should be trained to detect metadata clusters associated with voyeurism, hidden cameras, or blackmail. Upload frictions such as content scanning, auto-flagging of repeat offenders, and the banning of encrypted archive formats should be implemented where non-consensual imagery is likely. Addressing infrastructural risk means going beyond takedowns and embedding safety into platform architecture itself.
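As a rough illustration of how such upload frictions might fit together, the sketch below combines hash-based content scanning against a shared blocklist, auto-flagging of repeat offenders, and rejection of opaque archive formats at upload time. It is a minimal sketch under stated assumptions: every name and threshold here (screen_upload, KNOWN_ABUSE_HASHES, the three-strike limit) is a hypothetical placeholder, not any platform’s actual API.

```python
# Illustrative sketch only: names, thresholds, and data sources are
# invented placeholders, not a real platform's moderation API.
import hashlib
from collections import defaultdict
from dataclasses import dataclass

# Archive formats that commonly ship password-protected and defeat scanning.
BLOCKED_EXTENSIONS = {".7z", ".rar", ".zip.aes"}
STRIKE_THRESHOLD = 3  # hypothetical: flags before an account is escalated

# Would be populated from a shared, cross-platform hash blocklist.
KNOWN_ABUSE_HASHES: set[str] = set()
strikes: dict[str, int] = defaultdict(int)  # uploader id -> flag count

@dataclass
class UploadDecision:
    allowed: bool
    reason: str

def screen_upload(uploader_id: str, filename: str, payload: bytes) -> UploadDecision:
    # 1. Friction: refuse opaque archive formats that cannot be scanned.
    if any(filename.lower().endswith(ext) for ext in BLOCKED_EXTENSIONS):
        return UploadDecision(False, "encrypted archive format not accepted")

    # 2. Content scanning: compare against the shared blocklist of known material.
    digest = hashlib.sha256(payload).hexdigest()
    if digest in KNOWN_ABUSE_HASHES:
        strikes[uploader_id] += 1
        return UploadDecision(False, "matches known non-consensual material")

    # 3. Auto-flag repeat offenders: block further uploads pending human review.
    if strikes[uploader_id] >= STRIKE_THRESHOLD:
        return UploadDecision(False, "account escalated for trust and safety review")

    return UploadDecision(True, "accepted")
```

The design choice worth noting is that friction is imposed before distribution rather than after: blocklist matches also increment the uploader’s strike count, so accounts that repeatedly attempt to post known material are escalated even when individual files would otherwise slip through.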
Furthermore, cloud storage services should develop red-flag systems for suspicious folder structures organised by gender, camera type, or location. AI-assisted moderation should be deployed to identify non-consensual patterns. Platforms must collaborate with civil society to develop hash databases of molka material (similar to child abuse image detection protocols) and share blocklists across platforms.
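A red-flag system of the kind described above might begin with simple naming and volume heuristics over folder metadata. The following sketch is purely illustrative: the keyword patterns, threshold, and scoring formula are invented for this example, and a deployed system would rely on vetted multilingual lexicons, richer behavioural signals, and human review.

```python
# Illustrative heuristic for flagging suspicious cloud-storage folders.
# Patterns and weights are invented stand-ins, not production rules.
import re

# Naming conventions reported in molka archives: camera types, locations.
RED_FLAG_PATTERNS = [
    re.compile(r"(hidden|spy|ip)[ _-]?cam", re.IGNORECASE),
    re.compile(r"(subway|toilet|changing|motel)", re.IGNORECASE),
]
MIN_FILES_PER_FOLDER = 20  # large, systematically named caches are the signal

def folder_risk_score(folder_name: str, file_names: list[str]) -> float:
    """Return a 0-1 score; high scores queue the folder for human review."""
    names = [folder_name] + file_names
    hits = sum(1 for n in names for p in RED_FLAG_PATTERNS if p.search(n))
    volume_factor = min(len(file_names) / MIN_FILES_PER_FOLDER, 1.0)
    return min(hits / max(len(names), 1) + 0.5 * volume_factor, 1.0)
```

Scoring rather than hard-blocking matters here: folder names alone are weak evidence, so the output should route suspicious archives to human reviewers, not trigger automatic deletion.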
More broadly, tech companies should adopt safety-by-design frameworks that treat gender-based extremism as a systemic risk embedded in platform architecture. These frameworks must account for how digital infrastructures facilitate the circulation of misogynistic harm through features such as anonymity, frictionless sharing, and monetisation pathways. These features amplify gendered power imbalances and entrench patterns of abuse at scale.
Platforms must also shift from assuming neutrality to recognising that certain technical affordances (e.g. anonymity, unmoderated sharing, viral recommendation algorithms) create low-cost environments for extremist mobilisation.
Conclusion
Molka and Nth Room-style crimes are not isolated anomalies. They are the everyday manifestations of a larger ideological project that uses technology to radicalise misogyny, distribute harm, and evade accountability. These practices reflect core elements of violent extremism, including an ideology of dominance, collective mobilisation, and the strategic use of digital systems for coercion, reward, and spectacle. Recognising these crimes as forms of gender-based extremism rather than mere digital deviance allows us to develop stronger, more systemic interventions.
At the same time, South Korea’s experience illustrates both possibilities and the limitations of national intervention. Regulatory reforms have increased content monitoring and encouraged greater cooperation from platforms such as Telegram, but the response remains fragmented and reactive. What this case makes clear is that regulation alone is insufficient to dismantle the infrastructures of misogynistic harm. Sustained progress requires platform-level redesign, cross-border enforcement, and governance attuned to the ideological nature of these crimes.
These crimes are everywhere, all at once. The response must be just as global, just as integrated, and just as ideologically aware. The tools are available. What remains is the political will to use them.
–
Dr Se Youn Park is Director of Research at Women in International Security (WIIS) – Australia Inc. and holds a PhD in International Relations from the University of Queensland. Her work investigates how gendered institutions and digital infrastructures produce and perpetuate insecurity across different domains of security policy and practice, with regional expertise in Australia, South Korea, and the UK.