🛡️ The TAKE IT DOWN Act: A Landmark Law Against Deepfake Abuse and Non-Consensual Intimate Content

By Daniel Hoffman, CISSP
June 2025


📌 Overview

On May 19, 2025, the United States enacted the TAKE IT DOWN Act, the first federal legislation to criminalize both the sharing and hosting of non-consensual intimate images, including AI-generated deepfakes. This bipartisan law aims to give victims a reliable takedown path and to hold tech platforms accountable for prompt removal.

This blog post explores:

  • What the law actually says
  • How enforcement works (FTC, criminal charges)
  • What schools, workplaces, and platforms need to do to comply

⚖️ What the Law Covers: Definitions That Matter

Understanding the legal definitions is essential for enforcement and compliance:

1. “Intimate Visual Depiction”

Defined broadly to include:

  • Photos, videos, or digital content showing private parts or sexual activity.
  • Content that was created, altered, or distributed without consent.
  • Applies regardless of whether the imagery was actually photographed or digitally generated.

2. “Digital Forgery”

Refers to deepfake images and videos made with AI tools to portray someone in a sexual or nude context, even if the person never participated in such imagery.

3. “Covered Platform”

Any public website, app, or online service that:

  • Hosts user-generated visual content,
  • Is publicly accessible, and
  • Exceeds a minimum user threshold (e.g., 10,000+ users).

Exemptions may apply to encrypted platforms, closed research environments, and certain enterprise tools.


🚨 Enforcement: How the FTC Gets Involved

The Federal Trade Commission (FTC) is the primary enforcer. It will treat failure to remove flagged content as an “unfair or deceptive act or practice” under Section 5 of the FTC Act.

Here’s how enforcement will likely play out:

  1. Complaint Filed → Victim notifies the platform of violating content.
  2. 48-Hour Rule → Platform must take it down within 48 hours or face action.
  3. FTC Review → Noncompliance may lead to:
    • Civil penalties and fines,
    • Injunctions or compliance mandates,
    • Public shaming through FTC press releases or reports.

Platforms have until May 2026 to deploy compliant removal systems.
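To make that timeline concrete, here is a minimal sketch of how a platform might track the 48-hour removal window for each complaint. The `TakedownComplaint` record and its field names are illustrative assumptions, not terms defined by the statute.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# The Act's removal window, counted from receipt of a valid request.
REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class TakedownComplaint:
    """Illustrative complaint record; not a statutory schema."""
    complaint_id: str
    received_at: datetime                 # when the platform received the valid request
    removed_at: datetime | None = None    # set once the content is taken down

    @property
    def deadline(self) -> datetime:
        return self.received_at + REMOVAL_WINDOW

    def is_overdue(self, now: datetime | None = None) -> bool:
        """True if the content is still up past the 48-hour window."""
        now = now or datetime.now(timezone.utc)
        return self.removed_at is None and now > self.deadline

# Example: a complaint received 50 hours ago with no removal is out of compliance.
complaint = TakedownComplaint(
    complaint_id="c-001",
    received_at=datetime.now(timezone.utc) - timedelta(hours=50),
)
print(complaint.is_overdue())  # True
```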


🏫 Compliance Guidance for Schools

While most K–12 and university systems are not “covered platforms,” they operate digital platforms and serve vulnerable populations.

Suggested Actions for Schools:

  • Update Acceptable Use Policies (AUPs)
    Add clauses banning creation, use, or distribution of deepfakes and explicit imagery.
  • Create Takedown Protocols
    Define how to respond if images are found on school systems or reported by students.
  • Implement Monitoring Tools
    Scan school-managed devices and platforms like Canvas or Google Classroom.
  • Provide Counseling and Legal Resources
    Have designated response staff for student victims.
  • Run Digital Safety Programs
    Educate students about the legal and emotional impacts of deepfake abuse.

💼 What Employers Should Know

Workplaces can become vectors for harassment, retaliation, or reputation damage if they don’t address digital misconduct proactively.

Suggested Actions for Employers:

  • Update HR Policies & Code of Conduct
    Clearly define that non-consensual content—real or AI—is grounds for discipline or termination.
  • Train HR and Legal Departments
    Teach staff how to respond to internal reports, preserve evidence, and protect victims.
  • Review Internal Tools
    Ensure moderation or audit capabilities exist for platforms like Slack, Teams, or Confluence.
  • Encourage Whistleblower Reporting
    Create anonymous channels to report content or threats related to deepfakes.

🔧 What Platforms Must Do

If your platform falls under the “covered” definition, here’s your 3-part checklist:

  1. Build a Takedown Interface
    Victims need a simple, verified form to submit removal requests (email + image verification + consent claim).
  2. Automate Removal Within 48 Hours
    Use hash-matching (PhotoDNA-like tech) to identify and remove duplicates quickly (a minimal sketch follows this checklist).
  3. Maintain Transparency
    Include data in transparency reports: number of takedown requests, average response time, and appeal processes.
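As a rough illustration of items 1 and 2, here is a minimal sketch of an intake record plus hash-based duplicate matching. It uses a plain SHA-256 digest as a stand-in for a perceptual hash such as PhotoDNA, and the class and function names are hypothetical, not part of any required interface.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class RemovalRequest:
    """Illustrative intake record (item 1): contact, consent claim, and the reported image."""
    requester_email: str
    consent_claim: str          # e.g. "I am the depicted person and did not consent"
    image_bytes: bytes

@dataclass
class TakedownRegistry:
    """Tracks digests of flagged content so duplicate uploads can be removed automatically (item 2)."""
    flagged_digests: set[str] = field(default_factory=set)

    @staticmethod
    def digest(image_bytes: bytes) -> str:
        # Exact-match digest only; a production system would use a perceptual hash
        # (PhotoDNA-like) so re-encoded or cropped copies still match.
        return hashlib.sha256(image_bytes).hexdigest()

    def register(self, request: RemovalRequest) -> str:
        """Record a verified request so future uploads of the same content are blocked."""
        d = self.digest(request.image_bytes)
        self.flagged_digests.add(d)
        return d

    def should_remove(self, uploaded_bytes: bytes) -> bool:
        """Check a new upload against everything already flagged."""
        return self.digest(uploaded_bytes) in self.flagged_digests

# Example: once a request is registered, an identical re-upload is caught immediately.
registry = TakedownRegistry()
registry.register(RemovalRequest(
    requester_email="victim@example.com",
    consent_claim="Depicted without consent",
    image_bytes=b"...reported image bytes...",
))
print(registry.should_remove(b"...reported image bytes..."))  # True
```

Counts kept by a registry like this (requests received, removals completed, time to removal) are also the raw material for the transparency reporting described in item 3.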

Platforms failing to comply after May 2026 risk hefty FTC fines and reputation damage.


🤔 Criticisms and Concerns

Digital rights organizations have raised legitimate worries:

  • Overreach: Vague definitions could chill legitimate speech or satire.
  • Abuse of takedowns: Without strong verification, malicious users might game the system.
  • Strain on small platforms: Indie apps and forums may lack the budget or tech to comply.

The law does not override Section 230, but it limits immunity for failure to remove unlawful content once notified.


🔚 Final Thoughts

The TAKE IT DOWN Act marks a historic shift in federal digital rights law. It gives victims of image-based abuse a clear path to justice—especially in an age where AI-generated content can destroy reputations in seconds.

For educators, employers, and developers, the challenge now is not just compliance—but leadership. Creating respectful digital environments is no longer optional.


📎 Resources

#CyberLaw #Deepfakes #PrivacyRights #DigitalSafety #TAKEITDOWN #AIAwareness #OnlineHarassment #ContentModeration #TechPolicy #FTC #CybersecurityCompliance #StudentSafety #WorkplaceEthics #AIandLaw #DigitalRights #RevengePornLaw #PlatformAccountability
