Thorn supports the Take It Down Act

March 3, 2025

One of the latest threats targeting teens and children is deepfake nudes. These AI-generated images depict real people in sexually suggestive or explicit situations or activities, and can be nearly indistinguishable from real photos.

Policymakers play a key role in protecting children, including from the dissemination of deepfake nudes. Among many other interventions, policymakers must draft and pass legislation designed to protect children from threats like deepfake nudes.

That’s why Thorn supports the Take It Down Act, a critical piece of legislation that closes a key legal gap by criminalizing the knowing distribution of intimate visual depictions of minors—whether authentic or AI-generated—when shared with intent to harm, harass, or exploit. 

The bill also strengthens protections against threats of disclosure used for intimidation or coercion, ensuring that those who target children online are held accountable.

Why Thorn Supports the Take It Down Act

We support the Senate’s recent passage of the Take It Down Act and encourage the House of Representatives to prioritize this critical piece of legislation as a step toward protecting kids from deepfake nudes.

Our latest research at Thorn found that 31% of teens are already familiar with deepfake nudes, and 1 in 8 personally knows someone who has been targeted. These manipulated images can be used for harassment, blackmail, and reputational harm, causing significant emotional distress for victims.

As deepfake technology grows more accessible, we have a critical window of opportunity to understand and combat this form of digital exploitation before it becomes normalized in young people’s lives, and to act on their behalf to defend them from these threats.

The bill criminalizes the knowing distribution of intimate visual depictions of minors, whether authentic or AI-generated, when shared with intent to harm, harass, or exploit. Importantly, it also extends criminal penalties to threats of disclosure used for intimidation or coercion, providing stronger protections for child victims.

The Take It Down Act represents a crucial step toward our collective ability to keep pace with evolving threats and ensure that those who exploit children online are held accountable.

About the Take It Down Act: What You Need to Know

What are the key components of the Take It Down Act?

  1. The Take It Down Act would introduce criminal penalties for any person who knowingly publishes intimate visual depictions of an identifiable adult without consent and with intent to cause harm. This includes both authentic imagery and AI-generated imagery (digital forgeries).
  2. The Take It Down Act would introduce criminal penalties for any person who knowingly publishes intimate visual depictions of minors with intent to humiliate, harass, or degrade the minor, or to sexually arouse any person. This includes both authentic imagery and AI-generated imagery (digital forgeries). These criminal penalties do not apply to imagery that meets the legal definition of child pornography, which is already criminalized under 18 U.S. Code § 2256 and 18 U.S. Code § 1466A.
  3. The Take It Down Act would introduce criminal penalties for any person who intentionally threatens to distribute intimate visual depictions of minors or adults, as described above, for the purpose of intimidation, coercion, extortion, or to create mental distress.
  4. The Take It Down Act would require covered platforms to establish a “notice and removal process” for non-consensual intimate visual depictions, including AI-generated digital forgeries, and any copies of them, within 48 hours of receiving notice.

What could this bill mean for combating child sexual exploitation and abuse?

If the Take It Down Act passes the House of Representatives and becomes law, the bill would have several implications for combating online child sexual exploitation and abuse. 

First and foremost, the Take It Down Act’s introduction of criminal penalties for the knowing publication of intimate visual depictions of minors would fill an important legal gap around nude and exploitative images of children. These are images that would widely be considered offensive but do not meet the legal definition of child pornography, and thus are not criminalized in the same way that child sexual abuse material (CSAM) is. This gap presents a barrier to prosecution in some cases. By closing it and criminalizing both authentic and AI-generated nude and exploitative images of children, the bill would enable prosecutors to better pursue offenders in child exploitation cases and ensure justice for all child victims.

Secondly, the Take It Down Act’s addition of criminal penalties for threatening to disclose intimate visual depictions of minors, for the purpose of intimidation or extortion, is a critical step toward addressing the growing crisis of sextortion in this country. Our recent sextortion research indicates that the National Center for Missing and Exploited Children (NCMEC) receives 812 reports of sextortion weekly, with more than two-thirds involving financial demands.

Lastly, the Take It Down Act would require covered platforms to remove intimate visual depictions, both authentic and AI-generated, within 48 hours of being notified by the victim. This importantly provides another avenue of remedy for both child and adult victims. 

If the Take It Down Act passes quickly through the House of Representatives, as it did in the Senate, it would signal that preventing the online sexual exploitation and abuse of children is a serious priority of this Congress. We appreciate the many legislators who have supported this bill to date and are hopeful for a quick passage through the House of Representatives, to the President’s desk, and into law.

Together, we can continue to defend children from sexual abuse and exploitation.
