A CNN investigation uncovered an online ‘Rape Academy’: a global network of men drugging, raping, and filming their unconscious partners—often spouses—and sharing the content online. While Gisèle Pelicot’s case shocked France and the world, it is not unique: spousal rape networks exist and continue to thrive, with new cases uncovered worldwide, including in Australia, Canada, Germany, Ireland, Italy, New Zealand, Poland, Singapore, West Africa, the United Kingdom, and the United States of America. There is a cross-platform connection in how online spaces normalise and amplify misogyny, turning acts of gender-based violence into coordinated, violently misogynistic networks; this reflects a broader ecosystem in which Technology-Facilitated Gender-Based Violence (TFGBV) serves as both a tool and a gateway to violent extremism. These communities, often part of the “manosphere” or incel forums, foster extreme dehumanisation of women and reward rape-supportive content, while algorithms rapidly supply individuals with radicalising material.
This digital misogyny does not exist in isolation: it mirrors and fuels real-world violence, as seen in the Pelicot case, where online coordination enabled mass sexual violence. Research shows that roughly “70 percent of young men have been exposed to extreme misogyny online, including normalising violent rhetoric against women”, and that incel forums post about rape every 29 minutes, with most posts receiving support. Platforms that prioritise engagement, combined with uneven and evolving moderation practices, enable these ideologies to persist and spread, linking TFGBV directly to radicalisation and extremist violence across ideological lines. Understanding how spousal rape networks manoeuvre across platforms is therefore crucial. This Insight analyses how digital infrastructure can provide safe havens for coordinated spousal rape networks, algorithmically normalise violent behaviour, and commodify TFGBV into a transactional ecosystem of betrayal.
Platform Architectures as Safe Havens for Organised Sexual Abuse
Digital platforms are not neutral conduits: their design choices—encryption, anonymity, decentralised hosting, and lax moderation—create environments where coordinated sexual violence can flourish with minimal risk of detection or accountability. This compounding cycle allows spousal rape networks to grow behind self-sustaining digital anonymity, using different platforms for separate functions. Fringe platforms, for example, enable illicit activity and monetise TFGBV, while mainstream forums act as the first point of contact and amplify the rhetoric in publicly available videos. Both fringe and mainstream architectures are exploited, to varying degrees, to evade culpability and detection while still benefiting from the reach these platforms provide.
For instance, Telegram’s encrypted messaging infrastructure has become a critical tool for creating safe havens for coordinated spousal rape networks and advancing misogyny in real time. The “Zzz” group (since removed from the platform) was a private Telegram channel with nearly a thousand members that served as a hub where men exchanged tips on drugging partners, shared videos of unconscious women, and livestreamed assaults. Telegram’s encryption (end-to-end in its secret chats) and its resistance to government takedown requests make it a digital fortress for coordinated sexual abuse. Unlike platforms that cooperate with law enforcement, Telegram’s architecture prioritises anonymity over victim safety, effectively shielding perpetrators from accountability.
As noted in Misogyny and Violent Extremism: Can Big Tech Fix the Glitch?, digital platforms that enable “borderline—‘awful, but lawful’—content” benefit from high engagement while evading legal liability. Telegram’s refusal to implement proactive content moderation or meaningful subscriber verification allows abusive communities to operate with impunity. Acknowledging that this is a deliberate design choice rather than an unintended consequence provides a better understanding of why organised TFGBV and sexual violence do not simply disappear but instead flourish in these digital ecosystems.
Similarly, the website Motherless functions as a “moral-free file host” where “anything legal is hosted forever.” The site hosts over 20,000 videos tagged with phrases used to verify that a woman is unconscious during sexual acts. With an estimated 62 million visits in February 2026 alone, the platform has become a central repository for sexually explicit and non-consensual content. Despite repeated warnings and public exposure, Motherless remains operational, protected by Section 230 of the US Communications Decency Act, which shields platforms from liability for third-party content.
However, this legal protection does not absolve platforms of ethical responsibility. Motherless’ hands-off approach to moderation—the site claims to remove illegal content only after it is reported—is reactive rather than preventive, allowing abusive sexual content to remain online for extended periods and normalising and amplifying its reach. Violence against women is increasingly normalised as sexually abusive acts are presented as entertainment, while algorithms on mainstream platforms amplify extreme and sexually explicit content, embedding such behaviour into digital culture.
Sexually explicit content that glorifies real-world abuse reflects a broader societal failure to take such violence seriously. Online communities centred on spousal rape networks exacerbate these harms by fostering a sense of community among participants, reinforcing sexually violent impulses and turning exploitation into a shared, celebrated act. As one survivor in the CNN investigation, Valentina, said after discovering that her husband had drugged and raped her for years: “I can’t conceive of the fact that a woman could be treated like slaughterhouse meat. Because in the end, that’s what I was.” Platforms cannot claim neutrality while hosting content that destroys lives.
Algorithmic Normalisation of Sexually Violent Behaviour
Beyond infrastructure, algorithmic systems play a critical role in normalising and spreading abusive content. Platforms like Motherless use tagging systems to categorise and recommend videos, effectively creating a taxonomy of violation. Hashtags that explicitly describe non-consensual sex acts are used to verify and validate the unconscious state of victims. These hashtags function as algorithmic gateways, directing individuals to increasingly extreme content. Once an individual searches for or views such a video, the platform’s recommendation engine funnels similar content, creating an echo chamber of abuse with both online and offline ramifications. This process mirrors the online radicalisation pathways seen in extremist ideologies: just as certain platform algorithms push individuals from mainstream to fringe and then extremist content, these systems can pivot individuals from consensual adult content to non-consensual, violent material. While the mechanism is the same, platforms respond to gender-based violence with indifference, prioritising engagement and profit over the safety and dignity of women.
Moreover, the gamification of abuse is evident in how content is ranked and rewarded. Videos with high view counts, likes, and comments are promoted, incentivising individuals to produce more extreme material. One Motherless subscriber claimed to run a business selling “sleeping liquids” globally, advertising the product within the platform’s ecosystem. Another subscriber livestreamed the assault of his unconscious wife for cryptocurrency, with viewers directing the abuse in real time. While these behaviours may seem to be isolated incidents, they are systemic outcomes of platform design. When engagement is the primary metric of success, abuse becomes content, and violence becomes entertainment.
The Commodification of TFGBV in a Transactional Ecosystem
The most disturbing aspect of spousal rape networks is the complete disregard and erasure of consent, replaced by a transactional economy of betrayal. Videos of unconscious women are shared, traded, sold, and monetised, turning sexual violence into a digital commodity. On Motherless, subscribers exchange videos as currency, building reputations and social capital within the community. Some individuals advertise livestreams of drugged partners for $20 per viewer, with cryptocurrency as the preferred payment method. This transforms private betrayal into a public spectacle where the act of rape is no longer solely about domination, but about profit and entertainment.
The commodification of TFGBV is not new; it echoes broader trends in the pornography industry, where AI-generated deepfakes and synthetic media are increasingly used to create and distribute non-consensual content. As observers suggest, “Pornographic deepfakes have become the new sites for gender-based violence against women and technology-facilitated abuse.” The difference here is that the victims are neither celebrities nor public figures—they are private individuals, often spouses, whose trust is weaponised.
The economic model is clear: the more extreme the content, the higher the engagement, and the greater the reward. This creates a perverse incentive structure in which abusers often face no consequences but are instead celebrated and financially rewarded for escalating their crimes.
Shifting from Perpetrators to Ecosystems of Facilitation
Traditional responses to gender-based violence focus on individual perpetrators and legal prosecution. While necessary, this approach fails to address the structural enablers of abuse. Spousal rape networks are only possible because of a digital ecosystem that allows the organisation, coordination, and sharing of violent sexual abuse online.
In the Pelicot case, a chatroom called “Without Her Knowledge” was used to connect with other men, demonstrating how online communities facilitate real-world violence. The site, like Motherless, operated under a “moral-free” ethos, treating rape as a form of entertainment rather than a crime. It was only shut down after the trial, after years of unfettered and documented abuse. This pattern reveals a critical gap in platform governance: the failure to recognise TFGBV as a form of violent extremism.
A Call for Coordinated Action
To combat this crisis, it must be acknowledged that complicity has consequences. Platform architectures, algorithmic systems, economic models, and legal structures all require a major rewrite to redefine platform accountability. The following recommendations address this issue:
- Extend legal precedent: apply the growing body of rulings establishing that failure-to-warn and negligent-design claims apply equally to sexual violence, rape, and non-consensual content.
- Mandate AI detection: Platforms must be required to detect and remove non-consensual content, including known hashtagged images/videos, using proactive tools, rather than relying on reactive reporting.
- Integrate TFGBV into Preventing Violent Extremism (PVE) frameworks: P/CVE efforts must recognise misogyny as a driver of extremism. Recent verdicts have strengthened the case for treating spousal rape networks as security threats.
- Empower survivors: Survivors of spousal rape networks must be able to sue platforms for enabling their abuse, with legal support and trauma-informed processes.
- Partner with law enforcement: governments should raise awareness through public service announcements and train law enforcement on TFGBV and how to operate in this space.
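The proactive detection called for above rests on a well-established technique: matching uploads against shared databases of hashes of known abusive material before publication, rather than waiting for reports. A minimal sketch in Python, with a hypothetical hash list and function names (real deployments such as Microsoft’s PhotoDNA or Meta’s PDQ use perceptual rather than exact hashes, so re-encoded or resized copies still match):

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Hex digest identifying a file's exact byte content."""
    return hashlib.sha256(data).hexdigest()

# Placeholder stand-in for an industry- or NGO-maintained list of
# hashes of known abusive files (illustrative only).
KNOWN_ABUSE_HASHES = {sha256_digest(b"known-bad-example")}

def should_block(upload: bytes, known_hashes: set = KNOWN_ABUSE_HASHES) -> bool:
    """Return True if an upload matches a known-abuse hash and should be
    blocked before it is published, rather than after a user report."""
    return sha256_digest(upload) in known_hashes
```

The design point is the ordering: the check runs at upload time, so known material never reaches an audience, whereas report-driven moderation leaves it online until a viewer flags it.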
Conclusion: From Complicity to Accountability
The global network of men drugging and raping their spouses may seem a fringe phenomenon, but it is a symptom of a broken digital ecosystem. Platforms are architects of abuse: their design choices enable and amplify sexual and non-consensual violence. To stop this, we must shift the focus from blaming individuals to holding systems accountable. Digital infrastructure must no longer be allowed to function as an accomplice. The commodification of TFGBV, the algorithmic normalisation of rape, and the platform-enabled betrayal of intimate partners must be treated as security threats.
As stated, “Digital technology can enable TFGBV and SGBV, but it is not the cause of these behaviours; it is rather an accomplice permitting these harms.” The question is no longer whether Big Tech can fix the glitch, but whether it will be forced to.
–
Gazbiah Sans is a Counter-Terrorism and Preventing Violent Extremism expert, specialising in sexual/gender-based violence, specifically in fragile, conflict and violent contexts. She has over 15 years of experience, notably with USAID in Cameroon on the Boko Haram-affected Lake Chad Basin Region and with the World Bank in Afghanistan. Gazbiah is the Director of PVE Works—a leading organisation advancing rights-based, innovative approaches to counter online and offline radicalisation and empower youth and women as agents of prevention. She served on the Global Internet Forum to Counter Terrorism’s working group on Addressing Youth Radicalisation and Mobilisation, contributes to the Christchurch Call Advisory Network, and advises the Global Community Engagement and Resilience Fund (GCERF).
–
Are you a tech company interested in strengthening your capacity to counter terrorist and violent extremist activity online? Apply for GIFCT membership to join over 30 other tech platforms working together to prevent terrorists and violent extremists from exploiting online platforms by leveraging technology, expertise, and cross-sector partnerships.