San Francisco has filed a lawsuit against 16 websites that allegedly offer or sell “nudify” services, which use artificial intelligence to create explicit images of women and girls without their consent.
According to The San Francisco Chronicle, the complaint states that all of the defendant websites let users create A.I.-generated images of real people by swapping their faces onto nude bodies. City Attorney David Chiu claims that these services, which have few built-in protections, violate California laws against nonconsensual deepfake pornography, revenge pornography, and—in some cases—child pornography.
“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” Chiu said in a statement. “Generative A.I. has enormous promise, but as with all new technologies, there are unintended consequences and criminals seeking to exploit the new technology.”
The New York Times reports that one of the websites named in the lawsuit promotes its services by asking visitors whether they have “someone to undress.” Another advises users not to “[waste] time taking” a woman out on dates when they can “get her nudes” using artificial intelligence.
Chiu’s office noted that many large A.I. companies—like OpenAI and Anthropic—filter and restrict content that is overtly violent or sexual. But many smaller online services are less regulated and train their models on pornographic material.
San Francisco Deputy City Attorney Karun Tilak said that it is not clear which programs are being used to create the allegedly unlawful images, but suggested that technologies like Stable Diffusion could be involved.
Tilak added that the City Attorney’s Office has identified the owners of some of the defendant websites, but that others “have hidden in the shadows.”
By suing, Tilak said, the city hopes “to obtain their identities.”
Chiu said in a statement that, once deepfake images begin circulating, it is virtually impossible to determine which website created them—making it difficult for women to take action against negligent companies. The lawsuit therefore seeks to shut down some of the websites and to prevent their owners from creating similar deepfake services in the future.
“We have to be very clear that this is not innovation—this is sexual abuse,” said Chiu. “This is a big, multifaceted problem that we, as a society, need to solve as soon as possible. We all need to do our part to crack down on bad actors using A.I. to exploit and abuse real people, including children.”
Sources
San Francisco Moves to Lead Fight Against Deepfake Nudes
S.F. sues websites over explicit, nonconsensual AI-generated nude images