Content Warning: This Insight contains mentions of rape and violence against women.
Sexual and Gender-Based Violence (SGBV) is preventable, yet remains pervasive. Unfortunately, SGBV encountered in real life has migrated online as Technology-Facilitated Gender-Based Violence (TFGBV). While expressed differently, both forms are rooted in extreme misogyny and reinforce each other, with digital technology exacerbating both online and offline violence. TFGBV mirrors and accelerates SGBV by violently weaponising digital technology and creating enabling conditions for violent extremism to thrive. Moreover, TFGBV has become a precursor for recruitment and is linked to (youth) radicalisation, leading to incidents of real-world abuse and violent extremism. This ebb and flow between online and offline violence perpetrated against women, girls and LGBTQI+ individuals, coupled with digital technology, demands urgent action from tech platforms. Approaching SGBV and TFGBV as separate acts, rather than as an intertwined phenomenon, neglects the dual nature of the problem. Addressing TFGBV also means tackling SGBV and violent extremism simultaneously. This Insight examines these dynamics and offers recommendations to prevent TFGBV, SGBV, and violent extremism – enabling a safe(r) digital environment.
How Digital Technology Can Fuel TFGBV, SGBV and Violent Extremism
TFGBV is rampant on mainstream social media platforms, and global and regional estimates of TFGBV rates remain high. For example, a survey conducted across 45 countries by the Economist Intelligence Unit found that 38% of women "report[ed] personal experiences with online violence," while 85% of women "reported witnessing online violence against other women (including from outside their networks)." The harms of TFGBV are underestimated and its prevalence is under-scrutinised, particularly in the context of misogynistic behaviours, violent extremism and terrorism. This is further compounded by the lack of a standardised definition of TFGBV, a narrow understanding of the harms linked to SGBV and terrorism, inadequate policy responses, and limited knowledge and understanding of TFGBV overall.
Digital technology enables the spread of TFGBV, misogyny and violent extremism, but it is not the cause of these behaviours; it is rather an accomplice permitting these harms. For instance, constant exposure to TFGBV and misogynistic content provokes strong reactions, and algorithms interpret these reactions (likes, shares, comments) as popularity. The amplified toxic content becomes normalised as viewers are desensitised to it or drawn to radical ideologies that feed off grievances and/or biases – in effect, allowing TFGBV to go unchecked and making violent extremism possible.
This dynamic can appear intentional, as digital platforms prioritise profit over well-being by coding for engagement bias. When polarising or hateful content is rewarded with engagement, it can degrade online interactions and contribute to the algorithmic proliferation of violent extremist material, reinforcing echo chambers of harmful ideologies.
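To make the engagement-bias dynamic concrete, here is a deliberately simplified, hypothetical sketch of reaction-based ranking. The field names, weights and example numbers are illustrative assumptions for this Insight, not any platform's actual code; the point is only that a scorer blind to *why* users react will surface provocative content.

```python
# Hypothetical, simplified sketch of engagement-based ranking ("engagement bias").
# All names and weights are illustrative assumptions, not a real platform's code.

def engagement_score(post: dict) -> float:
    """Score a post purely by reactions, blind to whether they signal outrage or approval."""
    return (post["likes"] * 1.0
            + post["shares"] * 3.0      # shares spread content furthest, so weigh them most
            + post["comments"] * 2.0)   # heated arguments count the same as support

posts = [
    {"id": "benign",  "likes": 120, "shares": 5,  "comments": 10},
    {"id": "hateful", "likes": 40,  "shares": 60, "comments": 200},  # provokes outrage
]

# Ranked purely by engagement, the provocative post surfaces first.
ranked = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in ranked])  # → ['hateful', 'benign']
```

Because outraged comments and shares count the same as approving ones, the scorer rewards exactly the content that provokes the strongest reactions.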
According to UN Women, the "manosphere is moving misogyny to the mainstream." Recent reports indicate the face of terrorism is getting younger, with children as young as eight years old being heavily exposed to extremist and incel culture. The current threat landscape indicates SGBV is not limited to a specific type of ideology; violent extremists and terrorists across the ideological spectrum perpetrate SGBV. This is further compounded by the impunity afforded by anonymity and pseudonymity on platforms, enabling misogynistic narratives to flourish with limited accountability.
This spill-over from online to offline violence (from TFGBV to SGBV and, in some cases, violent extremism) broadens the violence spectrum, as seen in Gisèle Pelicot's defining rape case. Her then husband, Dominique Pelicot, used digital technology to amplify his call: he drugged her, shared explicit images and videos of her while unconscious (TFGBV), and groomed and recruited over 70 men living within 50 kilometres to rape her (SGBV) more than 200 times over a ten-year period. This scale and proliferation of abuse was only made possible via digital technology, which allowed him to conceal his crimes, while the website/forum's apathetic disregard for duty of care and social responsibility resulted in extreme harm. The case undeniably evidences the interplay between TFGBV and SGBV, illustrating that content posted online fuels both technology-facilitated and real-life violence. These digital risks exacerbate misogynistic radicalisation, create conditions conducive to terrorism, and reinforce pathways to real-world violence, as the next section explains.
Digital Tools as Catalysts for TFGBV, SGBV, and Radicalisation
Acknowledging that digital technology can enable TFGBV and SGBV is a first step in understanding the misogynistic conditions that lead to violent extremism and, in some cases, terrorism. The link between TFGBV, SGBV, violent extremism and terrorism becomes even clearer, as the United Nations Counter-Terrorism Committee Executive Directorate (CTED) indicates: "Da'esh members traded and purchased unmarried women and girls in online slave auctions, using an encrypted application which circulated photographs of the captives, as well as details of their age, marital status, current location and price." Another report suggests misogynistic men were more likely to engage in mass violence, evidencing that "83% of the 18 mass murders in the US in 2018 were perpetrated by someone who engaged in gender-based violence prior to their attack."
When coupled with digital technology, misogyny is a vector fuelling TFGBV and SGBV. Yet TFGBV is trivialised and remains an underestimated driver of radicalisation that accelerates violent extremism. Shifting the Shame: How a Preventing Violent Extremism (PVE) Approach Addresses the Invisibility of Gender-Based Violence underscores that SGBV should be treated as an early warning signal of a visible uptick in violence. This aligns with evidence linking TFGBV and SGBV to pathways of radicalisation, highlighting misogyny as a common thread across various forms of extremism, including incel-related, far-right, and religiously motivated terrorism. This is consistent with:
- Gender-based terrorism from self-declared incels;
- Misogyny and female subordination as central components of far-right ideologies;
- The Taliban’s online and offline misogynistic extremism and gender apartheid towards Afghan women and girls.
Against this backdrop, it is important to note that perpetrators of TFGBV and SGBV enjoy impunity, as their violence is shielded behind screens and often anonymity and pseudonymity. Perpetrators of TFGBV can exploit engagement bias on platforms that rely heavily on borderline – "awful, but lawful" – content, and benefit from platforms' limited responsiveness and capacity to prevent harm, contributing to a culture of dismissal around TFGBV. This is starkly demonstrated by the website Dominique Pelicot used, which was linked to more than 23,000 crimes including rape, murder, paedophilia, sexual assault and homophobic attacks. Despite numerous reported incidents of online and offline violence and associated judicial proceedings since its founding in 2003, European authorities only shut the website down in 2024 (21 years later), owing to the mounting evidence brought to light by the case.
While the case stands out, neither Dominique Pelicot nor the site should be viewed as outliers. Instead, the case should galvanise public demand for corrective and collective action from digital platforms, and shift perceptions of the nexus between TFGBV, SGBV and violent extremism towards both online and offline prevention. A study from the Centre for Countering Digital Hate highlights worrisome data, stating that "incel forum members post about rape once every 29 minutes and that 89 percent of these posts are support[ed]. Another 5 percent of posters were not morally against rape but instead found it 'unimaginative' and expressed their desire to 'reduce a woman to nothing.'" According to the same study, about 70 percent of young men have been exposed to extreme misogyny online, including rhetoric normalising violence against women, contributing to increased recruitment for far-right groups and incel terrorism.
While some scholars opine that most incels may not commit acts of violence, the trend is alarming, especially since TFGBV and SGBV can be entry points to violent extremism. Furthermore, engagement biases have accelerated the rate of online radicalisation: in 2002, radicalisation took sixteen months, whereas today it can take as little as a few weeks. Given that up to 80 percent of youth and children worldwide are exposed to violence online daily, a recent spike in intentional misogynistic activity on incel forums makes this even more troubling. There is a dire need to tackle TFGBV, SGBV and violent extremism in tandem.
Preventing TFGBV, SGBV, and Violent Extremism
TFGBV can act as a precursor to online radicalisation, which has accelerated at unprecedented rates and can lead to violent (gendered) extremism and terrorism. Addressing TFGBV, SGBV, and violent extremism therefore requires a deliberate and strategic approach that pinpoints their shared root causes, including misogyny, harmful ideologies, social exclusion, and gender-related grievances. Meaningful action means cultivating a collective culture of resilience by embedding PVE mechanisms to prevent TFGBV, SGBV, and violent extremism. Building on previous PVE Works – such as advocating a counter-narrative approach to SGBV and removing gender biases by mainstreaming gender in PVE policies that incentivise equality and equity – these solutions must also be integrated into digital spaces.
What does this look like in practice?
Holding perpetrators accountable would require digital platforms to activate tools for corrective and collective action. It also means developing practical solutions to address the overlooked prevalence of TFGBV, especially as online radicalisation targets younger users and algorithms can drive them toward dangerous ideologies and actions. Since misogyny, TFGBV and SGBV can be initial indicators of violent extremism, digital platforms, given their complicity, have a duty of care, which should involve:
- Gender mainstreaming to promote algorithmic fairness: coding software with a gender lens to remove gender biases, aiming for course correction and ethical design. This would enable early detection of gendered hate speech, treating it as a serious threat in order to disrupt radicalisation;
- Confronting and de-amplifying harmful content: reducing or halting algorithmic content progression and requiring subscribers' approval before showing any extreme material. This would make red flags easier to activate, serving as an early warning system for radicalisation and a possible indicator of violent extremism or terrorism. It would also give survivors/victims of TFGBV recourse, and could be paired with a reporting function as part of a TFGBV grievance mechanism; and
- Fostering a proactive counter-narrative approach: programming algorithms to suggest more empathetic content that promotes equality and equity would ensure positive exposure to balanced content and interrupt radicalisation patterns. This could also take the form of subsidising subscriptions to trustworthy news outlets, (re)instating fact-checking, and/or publishing high-quality, credibly sourced materials as counter-narratives.
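The de-amplification and early-warning ideas above can be sketched in a few lines. Everything here – the flag counts, penalty factor, review threshold and scoring weights – is a hypothetical illustration of the principle, not a description of any existing platform mechanism.

```python
# Hypothetical sketch: downrank content flagged as gendered hate instead of
# rewarding its engagement, and escalate repeat offenders for review.
# All names, weights and thresholds are illustrative assumptions.

HATE_PENALTY = 0.05     # flagged content keeps only 5% of its engagement score
REVIEW_THRESHOLD = 3    # repeated flags trigger an early-warning human review

def adjusted_score(post: dict) -> float:
    """Engagement score with a de-amplification penalty for flagged content."""
    base = post["likes"] + 3.0 * post["shares"] + 2.0 * post["comments"]
    if post.get("gendered_hate_flags", 0) > 0:
        return base * HATE_PENALTY   # de-amplify instead of amplify
    return base

def needs_review(post: dict) -> bool:
    """Early-warning hook: escalate repeatedly flagged content to human review."""
    return post.get("gendered_hate_flags", 0) >= REVIEW_THRESHOLD

posts = [
    {"id": "benign",  "likes": 120, "shares": 5,  "comments": 10,  "gendered_hate_flags": 0},
    {"id": "hateful", "likes": 40,  "shares": 60, "comments": 200, "gendered_hate_flags": 4},
]

ranked = sorted(posts, key=adjusted_score, reverse=True)
print([p["id"] for p in ranked])                     # → ['benign', 'hateful']
print([p["id"] for p in posts if needs_review(p)])   # → ['hateful']
```

The design choice is deliberate: instead of measuring success by take-downs, the scorer inverts the engagement incentive for flagged material while a separate threshold feeds an early-warning queue, mirroring the grievance-mechanism idea above.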
These proposed solutions do not simply ask digital platforms to remove content; the number of take-downs cannot measure success. The aim is sustainable digital resilience – enabling platforms to prevent TFGBV, SGBV, violent extremism and terrorism by supporting PVE initiatives, especially in high-risk contexts. This could include committing a portion of their revenue to improving subscriber safety and reducing harm.
Scholars agree that "technology companies have a responsibility to counter the use of their services for illegal purposes, and it is right that they should be subjected to the scrutiny and accountability requirements they often are constrained by." For example, if any other business were found responsible for enabling SGBV, rape, paedophilia, murder, mass shootings, or terrorism, it would be shut down immediately – not only by governments, but by the private sector and communities alike. Digital platforms cannot operate with impunity; they must be subject to agreed-upon governance mechanisms that are legally binding and afford equity to all individuals. The current standard operating procedure therefore needs a major overhaul, with ethical and moral guidelines that advocate behavioural change and inculcate in digital technology companies a duty of care and social responsibility to their subscribers.
—
Gazbiah Sans is a Counter-Terrorism and Preventing Violent Extremism expert, specializing in sexual/gender-based violence, specifically in fragile, conflict and violent contexts. She has over 15 years of experience, notably with USAID in Cameroon on the Boko Haram affected Lake Chad Basin Region and with the World Bank in Afghanistan. She serves as a member of the Internal Review Panel for the Global Community Engagement and Resilience Fund and concurrently serves as a Co-Chair of the Christchurch Call Advisory Network. She is the Director of PVE Works.
—
Are you a tech company interested in strengthening your capacity to counter terrorist and violent extremist activity online? Apply for GIFCT membership to join over 30 other tech platforms working together to prevent terrorists and violent extremists from exploiting online platforms by leveraging technology, expertise, and cross-sector partnerships.