TAKE IT DOWN Act: What You Need to Know
May 27, 2025
Since the Digital Millennium Copyright Act (DMCA) became law nearly thirty years ago, individuals have been able to demand that websites, social media platforms, and media hosts remove images they created themselves. Copyright vests in a work the moment it is fixed in a tangible form, such as when a photo is taken, without any need for fees or registration. The DMCA allows copyright owners to request the removal of their content from third-party hosts like Facebook, Dropbox, or TikTok.
However, the DMCA does not protect individuals depicted in content they didn’t create, which means it cannot be used to remove non-consensual intimate images or videos unless the subject also owns the copyright.
In response to this critical gap, Congress passed the TAKE IT DOWN Act, a bipartisan initiative led by Senators Amy Klobuchar (D-Minn.) and Ted Cruz (R-Texas), which President Trump signed into law in May 2025. The Act makes it a federal crime to knowingly publish non-consensual intimate visual depictions, including AI-generated deepfakes, and it penalizes threats to publish such images. It also requires covered online platforms to remove intimate visual depictions within 48 hours of receiving a valid notification, with noncompliance subject to enforcement by the Federal Trade Commission.
By May 19, 2026, all websites, platforms, and apps must establish clear and accessible procedures for users to report intimate visual depictions and request their removal. These requirements extend beyond public-facing platforms to password-protected sites and internal systems used by employees and contractors. Early implementation is strongly encouraged to ensure compliance.
During the bill's legislative review, Rep. August Pfluger (R-Texas) expressed concern over the rise of deepfake pornographic content, which devastates lives in the absence of consistent state-level protections. Rep. Debbie Dingell (D-Mich.) highlighted that AI-driven exploitation is part of a larger pattern of abuse and coercive control.
The Act does not specify a statute of limitations, so businesses should treat all non-consensual intimate images as subject to takedown, regardless of when the content was uploaded. Exceptions include law enforcement, educational, and scientific purposes, as well as legal proceedings. It remains unclear whether retaining images after a removal request violates the law.
Adding to the legal complexity, H.R. 1, the budget bill passed by the House of Representatives, includes a moratorium that would prevent states, counties, and cities from regulating artificial intelligence. If enacted in its current form, the moratorium would limit victims' ability to pursue remedies under potentially stronger state laws, forcing them to seek justice in federal court.
Berger Singerman’s attorneys are well-versed in the requirements of the TAKE IT DOWN Act and related privacy and content regulations. Please reach out to Heidi Tandy, Geoff Lottenberg, or any other member of our Intellectual Property group for guidance on updating your policies, ensuring compliance, and protecting your business from emerging risks tied to non-consensual content and AI-generated imagery.
