In a rare show of bipartisan unity, Congress has passed the Take It Down Act, a sweeping bill aimed at curbing the spread of non-consensual intimate images — including AI-generated “deepfakes” — across the internet. The legislation, introduced by Sen. Ted Cruz (R-Texas) and Sen. Amy Klobuchar (D-Minn.), now heads to President Donald Trump’s desk for his signature, with strong expectations that it will become law.

The Take It Down Act criminalizes the knowing publication or threat to publish intimate images without a subject’s consent. Significantly, the bill covers both real photos and those created by artificial intelligence, a growing concern amid the rise of deepfake technology. Once notified by a victim, websites and social media platforms will be required to remove the offending material within 48 hours and take steps to eliminate duplicates.
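The Act does not prescribe how platforms should find duplicates, so any mechanism is speculative. As a purely illustrative sketch, a platform might fingerprint a reported image with a perceptual hash and flag near-matches in its catalog; the snippet below shows a minimal average-hash approach in Python (the Pillow library, the function names, and the distance threshold are all assumptions for illustration, not anything specified by the bill or used by a particular platform).

```python
# Minimal sketch: flagging near-duplicates of a reported image with an
# average-hash fingerprint. Illustrative only; real platforms use far more
# robust matching systems and industry hash-sharing programs.
from PIL import Image  # third-party Pillow library assumed installed


def average_hash(path: str, hash_size: int = 8) -> int:
    """Return a 64-bit perceptual fingerprint of the image at `path`."""
    # Shrink to a small grayscale grid so the hash reflects overall structure
    # rather than resolution, file format, or minor edits.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the image's average.
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > avg else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


def find_duplicates(reported: str, catalog: list[str], max_distance: int = 5) -> list[str]:
    """Return catalog paths whose fingerprints are within `max_distance` bits."""
    target = average_hash(reported)
    return [p for p in catalog if hamming_distance(target, average_hash(p)) <= max_distance]
```

The threshold choice matters: a stricter `max_distance` misses re-encoded or cropped copies, while a looser one sweeps in merely similar images, which is essentially the trade-off critics of automated enforcement point to below.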

The legislation has earned public support from First Lady Melania Trump, who personally lobbied on Capitol Hill, calling the spread of such content among teens, especially girls, “heartbreaking.” Tech giant Meta, the parent company of Facebook and Instagram, also backed the bill. “Having an intimate image — real or AI-generated — shared without consent can be devastating,” Meta spokesman Andy Stone said.

Advocates like the Information Technology and Innovation Foundation praised the bill as “an important step forward” in giving victims new tools for justice in the digital age.

However, not everyone is on board. Digital rights groups, including the Electronic Frontier Foundation (EFF), argue the bill’s language is too broad and could threaten legitimate speech. Critics warn it could force platforms to rely on blunt automated filters that may inadvertently censor lawful content — from LGBTQ expression to newsworthy images.

“The bill applies to a much broader category of content than the narrow definitions of non-consensual imagery it claims to target,” EFF stated, raising concerns about frivolous takedown requests and rushed content moderation decisions. Despite the controversy, supporters maintain that the legislation fills a crucial gap in federal protections against digital abuse, especially as AI-generated imagery presents new threats to privacy and safety.

With President Trump expected to sign the Take It Down Act imminently, the U.S. moves one step closer to federally regulating the intersection of AI, privacy, and intimate image abuse online.

Deepfakes, Privacy, and Censorship: What the ‘Take It Down Act’ Could Mean for the Internet

Why It Matters: The bill criminalizes the knowing distribution or threat to distribute intimate images without consent — whether those images are real or AI-fabricated. In addition, it forces platforms to remove reported content within 48 hours and proactively scrub duplicates. In the age of generative AI, the potential for abuse is staggering. Victims, especially women and minors, often face harassment and reputational damage from deepfakes posted to social media and adult sites. According to the bill’s supporters, current legal frameworks have struggled to keep pace.

Who’s Backing It: The act has earned high-profile support. First Lady Melania Trump publicly lobbied for its passage, calling attention to the damage such images cause among teenagers. Tech heavyweight Meta also endorsed the bill, with spokesperson Andy Stone calling non-consensual intimate content “devastating.”

The Debate: However, free speech advocates warn that the Take It Down Act could swing too far. Groups like the Electronic Frontier Foundation argue that the bill’s broad language might lead to the takedown of legitimate content — from protest images to consensual adult content wrongly flagged as abusive. They are particularly concerned about platforms relying on AI moderation tools, which have a well-documented history of false positives, and about the law’s tight 48-hour removal deadline, which leaves little room for due diligence.
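To make the false-positive concern concrete, here is a toy sketch of how a threshold-based automated filter behaves; the items and confidence scores are invented for illustration and do not come from the bill, the EFF, or any real moderation system.

```python
# Toy illustration of the trade-off critics describe: an automated filter
# that removes anything scoring above a confidence threshold. Scores here
# are invented; real classifiers assign similar probability-style scores.

# (item, hypothetical "abusive content" score from a classifier)
REPORTED_ITEMS = [
    ("non-consensual deepfake", 0.97),
    ("consensual adult content", 0.74),
    ("news photo from a protest", 0.41),
    ("LGBTQ health resource", 0.38),
]


def auto_moderate(items, threshold: float) -> list[str]:
    """Return the items an automated filter would take down at this threshold."""
    return [name for name, score in items if score >= threshold]


if __name__ == "__main__":
    # A cautious platform racing a 48-hour deadline may lower its threshold,
    # removing lawful content along with the genuinely abusive item.
    for threshold in (0.9, 0.6, 0.3):
        print(threshold, auto_moderate(REPORTED_ITEMS, threshold))
```

At a high threshold only the abusive item is removed; at the lower thresholds a deadline-pressured platform would also pull the lawful posts, which is the over-removal scenario the EFF warns about.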

What Happens Next: If President Trump signs the bill — and signs point to yes — it will become one of the most aggressive federal laws addressing online intimate image abuse in U.S. history. It also signals Washington’s growing willingness to regulate AI-generated content and the companies that host it. For now, the Take It Down Act represents both a significant step toward protecting digital privacy and a fresh chapter in the ongoing fight over free expression and content moderation in the AI era.

