San Francisco Sues AI Websites Over Non-Consensual Deepfake Nudes
The San Francisco City Attorney’s office has launched a lawsuit against 16 AI-powered websites accused of creating non-consensual nude deepfakes of women and girls. City Attorney David Chiu announced the legal action, targeting sites that have collectively garnered over 200 million visits in the first half of 2024 alone.
These websites employ artificial intelligence tools to digitally “undress” images of fully clothed individuals, simulating nudity without consent. One site even advertises its service as a means to bypass dating and obtain nude images directly.
The lawsuit alleges violations of state and federal laws prohibiting revenge pornography, deepfake pornography, and child pornography. Additionally, the sites are accused of breaching California’s unfair competition law. The City Attorney’s office argues that the harm inflicted on consumers far outweighs any potential benefits from these practices.
The suit seeks civil penalties, shutdown of the websites, and measures to prevent future creation of deepfake pornography. The legal action comes amid growing concern over non-consensual nude imagery fueled by advances in generative AI. Recent months have seen an uptick in reports of “sextortion” cases, with victims ranging from celebrities like Taylor Swift to schoolchildren; some students have faced expulsion or arrest for circulating AI-generated nude photos of classmates.
Chiu expressed his dismay at the situation, stating, “The exploitation of women and girls through these websites is horrifying.” He emphasized the need for comprehensive societal solutions to address this complex issue.
As the case unfolds, it highlights the urgent need for legal and ethical frameworks to govern the rapidly evolving landscape of AI-generated content and its potential for misuse.