A new bipartisan bill introduced in Congress seeks to combat the alarming rise of AI-generated deepfake pornography by holding social media platforms responsible for policing and removing such content.
The proposed legislation, known as the Take It Down Act, comes in response to a staggering 464% increase in deepfake porn production in 2023 compared to the previous year.
Spearheaded by Senator Ted Cruz (R-Texas), the bill would criminalize publishing, or threatening to publish, nonconsensual deepfake pornographic images.
It would require social media companies to develop a process for removing these images within 48 hours of receiving a valid request from a victim.
Platforms would be obligated to make reasonable efforts to remove any copies of the images, including those shared in private groups.
The Federal Trade Commission would be tasked with enforcing these new rules, adding another layer of consumer protection to its existing responsibilities.
Cruz emphasized the importance of creating a level playing field at the federal level, stating, “By putting the responsibility on websites to have procedures in place to remove these images, our bill will protect and empower all victims of this heinous crime.”
The impact of nonconsensual AI-generated images has been far-reaching, affecting celebrities like Taylor Swift, politicians such as Rep. Alexandria Ocasio-Cortez, and even high school students whose classmates have misused their photos to create fake nude or pornographic images using AI tools.
However, the legislative landscape surrounding this issue is complex, with two competing bills in the Senate.
Earlier this year, Senator Dick Durbin (D-Illinois) introduced a bipartisan bill that would allow victims of nonconsensual deepfakes to sue individuals who created, possessed, or distributed the images.
In contrast, Cruz’s bill treats deepfake AI porn as a content-moderation problem, placing the onus on social media companies to remove such material rather than on victims to pursue individual lawsuits.
The divergence in approaches has led to some friction in the Senate. When Durbin attempted to secure a floor vote for his bill last week, Senator Cynthia Lummis (R-Wyoming) blocked it, arguing that it was “overly broad in scope” and could potentially “stifle American technological innovation.” Lummis is, notably, one of the original co-sponsors of Cruz’s bill.
As the debate continues, Senate Majority Leader Chuck Schumer (D-New York) is pushing for broader AI legislation.
A recent AI task force roadmap highlighted the need to address the “nonconsensual distribution of intimate images and other harmful deepfakes” as a key issue.
The introduction of the Take It Down Act marks a significant step in addressing the growing threat of AI-generated deepfake pornography.
As lawmakers grapple with the complexities of regulating emerging technologies, the bill’s progress will be closely watched by victims, tech companies, and privacy advocates alike.
The coming weeks will likely see intense discussions as Congress works to find a balance between protecting individuals from exploitation and preserving technological innovation in the AI space.