📣 San Francisco Initiates Crackdown on Deepfake Nudes

posted 16 Aug 2024
San Francisco's City Attorney's office has filed a lawsuit aimed at shutting down 16 websites that use artificial intelligence to turn photos of real women into nonconsensual nude images. Such services have recently gained popularity among both teenagers and adults.

The lawsuit was spearheaded by San Francisco's Deputy City Attorney, Yvonne Mere, who prepared the legal action together with her team. Outside legal experts view it as a pioneering case: the first time a government has tried to shut down websites engaged in digital “nudification.”
“No one has tried to hold these companies accountable,” City Attorney David Chiu said.
If the suit succeeds, it would lead to a continuously updated list of such platforms so they can be blocked before they gain popularity. Over the last six months, the sixteen websites named in the suit have drawn some 200 million visits worldwide, yet their owners have not responded to any inquiries.

The lawsuit also aims to have an impact at the federal level, as the plaintiffs argue that the sites violate both state and federal law. David Chiu remarked that while artificial intelligence undoubtedly offers societal benefits, some of its applications have an inherently darker side.

Several states have already passed laws criminalizing the creation and distribution of nonconsensual deepfake nudes of minors and adults alike. However, those measures focus on punishing individual offenders after the fact rather than preventing the deepfakes from being created in the first place.
