Deepfake nudes are a harmful reality for youth: New research from Thorn

March 3, 2025

4 Minute Read

Children and teens today live much of their lives online—where new risks are emerging at an unprecedented rate. One of the latest threats? Deepfake nudes.

Our latest research at Thorn, Deepfake Nudes & Young People: Navigating a New Frontier in Technology-Facilitated Nonconsensual Sexual Abuse and Exploitation, reveals that 31% of teens are already familiar with deepfake nudes, and 1 in 8 personally knows someone who has been targeted.

Over the past few years, deepfake technology has developed rapidly, making it possible to create hyper-realistic explicit images of anyone in seconds—with no technical expertise required.

While nonconsensual image abuse isn’t new, deepfake technology represents a dangerous evolution in this form of child sexual exploitation. Unlike earlier photo manipulation methods, AI-generated images are designed to be indistinguishable from real ones, making them an especially powerful tool for abuse, harassment, blackmail, and reputational harm.

As deepfake technology grows more accessible, we have a critical window of opportunity to understand and combat this devastating form of digital exploitation—before it becomes normalized in young people’s lives.

The emerging prevalence of deepfake nudes

The study, which surveyed 1,200 young people (ages 13-20), found that deepfake nudes already represent real experiences that young people are having to navigate. 

What young people told us about deepfake nudes:

  • 1 in 17 teens reported they had deepfake nudes created of them by someone else (i.e., have been the victim of deepfake nudes).
  • 84% of teens believe deepfake nudes are harmful, citing emotional distress (30%), reputational damage (29%), and deception (26%) as top reasons.
  • Misconceptions persist. While most recognize the harm, 16% of teens still believe these images are “not real” and, therefore, not a serious issue.
  • The tools are alarmingly easy to access. Among the 2% of young people who admitted to creating deepfake nudes, most learned about the tools through app stores, search engines, and social media platforms.
  • Victims often stay silent. Nearly two-thirds (62%) of young people say they would tell a parent if it happened to them, but in reality only 34% of victims did.

Why this matters

Our VP of Research and Insights, Melissa Stroebel, put it best: “No child should wake up to find their face attached to an explicit image circulating online—but for too many young people, this is now a reality.”

This research confirms the important role tech companies play in designing and deploying technology conscious of the risks of misuse, while also underscoring the need to educate young people and their communities on how to address this kind of digital abuse and exploitation.

What you can do

Everyone can play a role in responding to emerging threats like deepfake nudes and other harms.

Parents can talk to their kids early and often:
Many parents and caregivers haven’t even heard of deepfake nudes—but young people have, and they need guidance on how to navigate this new threat.

  • Start the conversation about this early. Even if your child hasn’t encountered deepfake nudes yet, discussing them now can help them recognize the risks before they become a target.
  • Reinforce that deepfake nudes are not a joke. Some young people see these images as harmless or even humorous, but the reality is that they can have devastating consequences for victims.
  • Teach kids what to do if they’re targeted. Make sure they know where to report deepfake nudes, seek support, and understand that they are not alone in navigating online threats.

Platforms must prioritize safety:
The spread of deepfake nudes underscores the urgent need for platforms to take responsibility in designing safer digital spaces. Platforms should:

  • Adopt a Safety by Design approach to detect and prevent deepfake image creation and distribution before harm occurs.
  • Commit to transparency and accountability by sharing how they address emerging threats like deepfake nudes and implementing solutions that prioritize child safety.

Learn more and support Thorn:

Together, we can continue to defend children from sexual abuse and exploitation.
