Digital Media and Child Sexual Abuse: Foe, Friend, or an Unfinished Fight?

Mar 4, 2026

Our Social Science Director, Dr. Suruchi Sood, Outlines the Facts of CSAM

Digital media is now one of the most influential forces shaping childhood. Young people move
fluidly between online and offline spaces, socializing, learning, and creating in ways unthinkable
a decade ago. Yet the same platforms that foster connection and creativity also open new
avenues for harm: online grooming, sexual extortion, and the rapid spread of child sexual abuse
material (CSAM).

At CHILD USA, where our work centers on strengthening legal frameworks and evidence-based
prevention, the question is not simply whether digital media is a foe or a friend. The more urgent
question is: How can we harness digital infrastructure to protect children while addressing and
reducing the risks it creates?

The answer lies in research, and the data reveal a complex, actionable story.

What We Know: The Scope of Online Child Sexual Abuse

A 2025 global meta-analysis published in The Lancet Child & Adolescent Health provides one
of the clearest pictures yet of online child sexual exploitation and abuse (OCSEA). Analyzing
123 studies across 57 countries, researchers found that approximately 1 in 12 children
worldwide (8.1%) have experienced some form of online sexual exploitation or abuse. Breaking
this down:

• Online sexual solicitation: 12.5%
• Non-consensual exposure or sharing of sexual images/videos: 12.6%
• Online sexual exploitation: 4.7%
• Sexual extortion: 3.5%

These figures come from pooled self-reports across dozens of national studies, not models or
projections. Crucially, the authors caution that these numbers still likely undercount the true
scope of abuse: many national surveys remain focused on pre-digital forms of sexual violence;
image-based abuse and AI-generated content are rarely measured; and disclosure barriers,
particularly around extortion, remain high.

For policymakers and child-protection advocates, this means online abuse must be treated not as
a marginal concern but as a primary child protection priority worldwide.

Digital Media as a Foe: How Online Environments Increase Risk

Digital platforms have fundamentally changed the landscape of abuse. Offenders can now
connect with many young people quickly, anonymously, and without geographic or social
boundaries. Features like encrypted messaging, ephemeral content, recommendation algorithms,
and livestreaming create new vectors for exploitation and complicate detection.

The scale of this problem is starkly visible in data from the National Center for Missing &
Exploited Children (NCMEC). Through its CyberTipline, the primary U.S. reporting system for
suspected online child sexual exploitation, NCMEC received 36.2 million reports of suspected
child sexual exploitation in 2023 alone, a 12% increase from the prior year, encompassing more
than 100 million files. From this volume, NCMEC staff escalated 63,892 reports to law
enforcement as urgent or involving a child in imminent danger. It is essential to note that
CyberTipline reports represent suspected, not confirmed, abuse. Even so, the scale of reporting
reflects two undeniable realities: digital platforms dramatically increase the reach and volume of
harmful content, and mandatory reporting systems are surfacing material that would otherwise
remain hidden.

Emerging technologies are accelerating these risks. The U.S. Department of Homeland
Security’s 2024 Subcommittee Report warns of the growing misuse of generative AI to create
synthetic sexual images of children, the expanding availability of anonymizing tools, and the
difficulty of identifying offenders who operate across multiple platforms. In 2023, the
CyberTipline received 4,700 reports of CSAM linked to generative AI, a number that surged to
67,000 in 2024, a 1,325% increase in a single year.

These trends underscore why traditional prevention strategies, such as warning children not to
talk to strangers online, are no longer sufficient. Abuse frequently involves peers or
acquaintances, and technology makes escalation faster and concealment easier.

Digital Media as a Friend: Detection, Transparency, and Prevention

At the same time, digital media is an indispensable tool for identifying, interrupting, and
preventing child sexual abuse, provided platforms and governments design and regulate it
properly.

1. Centralized detection systems
The CyberTipline functions as a kind of public health surveillance system for online exploitation.
Because electronic service providers are legally required to report suspected CSAM, the system
enables faster victim identification, supports cross-platform investigations, reduces the
recirculation of known material, and generates data to guide policymaking. These capabilities
were unimaginable in a pre-digital era; they represent technology actively working in children’s
favor.

2. International transparency and accountability frameworks
The OECD’s 2023 Transparency Reporting on Online CSEA highlights the critical role
governments and platforms must play in preventing abuse. The report calls for clear, enforceable
standards on what platforms are required to report, how frequently, what data must be publicly
disclosed, and how responses should be coordinated across borders. Transparency reporting
doesn’t just measure harm; it compels platforms to take accountability for reducing it.

3. Prevention and awareness at scale
Digital media also enables public education to reach communities at a scale and speed
impossible through traditional channels. Global campaigns like UNICEF’s #ENDViolence
initiative and national public service announcements use social platforms to connect millions of
caregivers and young people with research-backed guidance. These efforts represent digital
media at its best: a tool for raising awareness, shifting norms, and supporting survivors.

The Unfinished Fight: What Still Needs to Happen

Despite meaningful progress in detection and reporting, critical gaps remain, and closing them is
the defining child protection challenge of our era.

Reporting is inconsistent and incomplete. In 2023, only 245 of more than 1,600 registered
electronic service providers actually submitted CyberTipline reports, and just five platforms
accounted for over 91% of all submissions. The majority of platforms did not report at all,
meaning vast quantities of suspected abuse go undetected and undocumented.

Legal frameworks lag behind technology. The 2024 REPORT Act expanded mandatory
reporting to include online enticement and child sex trafficking, and early results are striking:
reports of online enticement rose 192% in 2024. But legislative coverage remains uneven, and
enforcement is inconsistent across jurisdictions.

AI-generated abuse is outpacing policy. The 1,325% surge in generative AI-related
CyberTipline reports between 2023 and 2024 signals a threat that is evolving faster than current
regulatory frameworks can contain. Legislation like the TAKE IT DOWN Act represents a step
forward, but the policy response must scale alongside the technology.

Measurement gaps distort the true picture. The Lancet meta-analysis found that image-based
abuse, AI-generated content, and livestreamed exploitation are rarely captured in national
surveys. Without better data, prevention efforts will remain reactive rather than proactive.

Moving Beyond “Foe or Friend”: Digital Media Is Infrastructure

The research makes one thing clear: digital media is neither inherently harmful nor inherently
protective. It is infrastructure, shaped by the safeguards we implement, the regulations we
establish, and the norms we cultivate.

Technology becomes a foe when reporting systems are fragile, platforms evade accountability,
privacy features are abused, and children lack access to help or information.

Technology becomes a friend when safety-by-design principles are enforced, platforms
implement robust detection systems, governments adopt clear and coordinated legal frameworks,
and prevention campaigns reach every community.

The path forward is not to demonize technology or celebrate it uncritically, but to invest in
governance, accountability, and prevention that make digital spaces safer for every child. That
fight is underway. It is not finished.

References
Fry, D., Steele, B., Lolié, L., et al. (2025). Prevalence estimates and nature of online child sexual
exploitation and abuse: A systematic review and meta-analysis. The Lancet Child & Adolescent
Health. https://pubmed.ncbi.nlm.nih.gov/39855238/

National Center for Missing & Exploited Children (NCMEC). (2024). 2023 CyberTipline
Report. https://www.ncmec.org/content/dam/missingkids/pdfs/2023-CyberTipline-Report.pdf

National Center for Missing & Exploited Children (NCMEC). (2025). 2024 CyberTipline Data.
https://www.missingkids.org/cybertiplinedata

U.S. Department of Homeland Security. (2024). Combatting Online Child Sexual Exploitation
and Abuse: Subcommittee Final Report. https://www.dhs.gov/archive/publication/hsapc-csea-final-report

Organisation for Economic Co-operation and Development (OECD). (2023). Transparency
Reporting on Child Sexual Exploitation and Abuse Online.
https://www.oecd.org/content/dam/oecd/en/publications/reports/2023/09/transparency-reporting-on-child-sexual-exploitation-and-abuse-online_98fc37bb/554ad91f-en.pdf