
Unsafe Systems: Fighting TFGBV in an Era of Heightened Gender Backlash
“Your power is relative, but it is real. And if you do not learn to use it, it will be used against you, and me, and our children. Change did not begin with you and it did not end with you, but what you do with your life is an absolutely vital part of that chain.” – Audre Lorde.
In 2018, ICRW released one of the first comprehensive frameworks for understanding (and naming) technology-facilitated gender-based violence (TFGBV). It offered clarity to a field only beginning to recognize the harms emerging at the intersection of gender inequality and digital technologies. The framework emphasized that online and offline violence exist on a continuum shaped by power, inequitable norms, and social permission structures, not as separate or siloed phenomena.
Since that time, our understanding of TFGBV has deepened in important ways. The United Nations Population Fund has launched programs like the Bodyright Campaign and publications on measuring TFGBV. Additionally, Save the Children’s 2025 framework applied a child-centered lens, an essential evolution because children engage with technology differently than adults. Their evolving capacities, reliance on digital tools for learning and identity formation, and the power differentials between adults and children create distinct vulnerabilities and patterns of harm. Together, these frameworks help map how TFGBV manifests across the life course.
But frameworks alone do not keep people safe. As we look back on the evolution of this field, and mark 16 Days of Activism, we are also reflecting on our own roles in shaping it. If the past several years have been about naming and understanding TFGBV, the current moment demands that we move from understanding to action. Making that shift requires that we answer a deeper question: who is responsible for creating the conditions for online safety?
Systems Change Has Advanced, But Unevenly, and in the Shadow of COVID-19
To understand how responsibility for safety online can be best envisioned, we can learn from how TFGBV has unfolded over the last five-plus years. In 2020, two years after ICRW’s first publication on TFGBV, the world experienced the COVID-19 pandemic. Children and adolescents became reliant on digital platforms for schooling, social connection, and identity formation. Adults shifted into remote work, online collaboration, and near-constant digital engagement. The pandemic dramatically intensified our collective dependence on technology, expanding exposure to online risks while protections failed to keep pace.
Our collective experience of the COVID-19 pandemic, unfolding alongside rapid advances in digital technology, reshaped how we live and lent urgency and direction to research, advocacy, and policy efforts to understand and respond to online violence.
In the years since, the global response to TFGBV has matured. Feminist activists and youth organizers have raised visibility, and some governments and multilaterals have begun developing policies and strategies to address it.
But these shifts have unfolded alongside powerful countercurrents: backlash against, and shrinking funding for, gender equality and human rights; the consolidation of political and economic power in the hands of technology companies; and the rapid expansion of AI without adequate regulation and guardrails. This has created a profound tension in the ecosystem of prevention, mitigation, and response. Powerful actors, such as technology companies and the governments responsible for regulating them, hold disproportionate influence over the conditions of digital safety, while the communities most affected by TFGBV bear much of the burden of response.
This is not a neutral distribution of power. These powerful actors are determining whose safety matters, whose harm is ignored, and who gets to shape the rules of the game. As those with the greatest power evade accountability, responsibility is pushed onto those with the least, such as civil society, parents, teachers, activists, and victim-survivors themselves. This displacement of duty is a form of tragic responsibility, in which those harmed by systemic neglect are forced to shoulder obligations that properly belong to those with real power.
Narrative as Contested Terrain: The Landscape that Shapes Violence
This downward drift in responsibility is reinforced by the narratives that digital systems amplify, which normalize harm and frame safety as an individual burden rather than a structural one. Young people now encounter toxic digital narratives routinely throughout their daily lives. So do women and gender-diverse people. The manosphere, “trad wife” aesthetics, pronatalist backlash, hostility toward women who choose not to marry or have children (and, ironically, those who do), transphobia, homophobia, racism, and algorithmic amplification of misogynistic content all deepen exposure to harm—online and off.
These narratives are not benign ambient noise. They are deliberate attempts to curtail women’s autonomy, punish those who step outside prescribed roles, and target LGBTQIA+ people whose identities challenge rigid, inequitable gender norms. They are intentional efforts to define the boundaries of the public square, and to explicitly exclude the voices they would rather keep silent.
As ICRW’s framework emphasizes, and as we have argued elsewhere, narrative is the contested terrain on which protection and harm are negotiated. Digital violence thrives where narratives have already prepared the ground—where misogyny and anti-rights rhetoric circulate increasingly unchecked, where harmful myths about girls’ and women’s “deservedness” of harm persist, and where trans, intersex, and queer people are dehumanized for sport.
Online safety is conditioned by the prevailing social narratives of our times, and those narratives, and the influence they enjoy, are disproportionately shaped by those who hold power. In this way, narrative harms are inseparable from the structural decisions of platform design and regulation, which determine what ideas are elevated, ignored, or allowed to metastasize.
A Call for Shared Responsibility
The power dynamics shaping digital life are not neutral. Technology companies and Western governments hold disproportionate control over platform design, regulation, investment, and access. They must take on far more responsibility for online safety, not as corporate philanthropy, but as enforceable, accountable policy and practice obligations.
Yet we cannot be naïve. Powerful actors rarely act simply because they are asked to, and they almost never act simply because it is the right thing to do.
Throughout history, feminist movements—especially Black, Indigenous, queer, and global-majority feminists—have shifted systems by mobilizing relative power, narrative power, and collective power (sometimes referred to as power with and power to) even when formal authority was absent. That lineage is not just inspiration but also instruction.
Many gender, child-protection, and rights professionals have been displaced by funding cuts and political headwinds, moving into new sectors or narrower roles. This displacement is real, but it does not eliminate agency.
As Lorde reminds us, power is always relative, but it is always present. And in moments like this, it must be used. We call on colleagues across sectors—tech, philanthropy, AI ethics, corporate responsibility, education, government, and beyond—to bring gender equity, child-centered insights, and TFGBV expertise into the spaces you now inhabit.
This moment also asks for honesty about our own positionality. The privilege that some of us enjoy, including whiteness, shapes whose voices are heard and whose ideas are adopted. Using that privilege to center Black, Indigenous, queer, disabled, and global-majority leaders is part of the work of building safer digital futures.
“Power with” and “power to” are not about taking on the work of institutions, but about building the collective leverage needed to make those institutions act on the responsibilities they resist.
About the Authors
Jess Ogden is the Deputy Executive Director, ICRW-Americas.
Connor Roth is a feminist researcher and advocate with expertise in technology-facilitated gender-based violence, youth, and displacement. She worked at ICRW as a Gender and Violence Research Specialist (2021-2023), at Bixal as a Monitoring and Evaluation Specialist, and is currently a research and evaluation consultant, including leading research for Save the Children on a child-centered framework for technology-facilitated gender-based violence against children.
Erin Leasure has a Master of Public Health from the Milken Institute School of Public Health at George Washington University and a Bachelor of Arts from the University of North Carolina at Chapel Hill. She previously worked at ICRW (2021-2023) and Save the Children on gender-related topics, including co-leading the research and development for a global child-centered TFGBV framework.
Laura Hinson is a researcher and strategist who uses evidence to inform philanthropic decision-making and strengthen gender equity initiatives globally. She served as a Senior Research Scientist at ICRW (2013-2022), where she led major studies on technology-facilitated gender-based violence and helped shape the field’s understanding of other complex issues like adolescent agency and reproductive empowerment. Most recently, she directed grantmaking and evidence strategy at Wellspring Philanthropic Fund with a focus on girls’ education.
