
AI ‘Undressing’ Websites Sued by San Francisco for Non-Consensual Nude Images 😮

San Francisco City Attorney Takes Legal Action Against Websites Generating Non-Consensual Nude Images Using AI

San Francisco City Attorney David Chiu has filed a lawsuit against 16 websites that use artificial intelligence (AI) to create non-consensual nude images of women and girls.

The lawsuit, filed in San Francisco Superior Court, targets sites that allow users to “undress” or “nudify” people in photos without their consent.

According to Chiu’s office, these websites have received over 200 million visits in just the first six months of 2024. The lawsuit claims the site owners include individuals and companies from Los Angeles, New Mexico, the United Kingdom, and Estonia.

The Dark Side of AI: Exploitation and Abuse

  • AI models trained on pornographic images and child sexual abuse material
  • AI used to generate realistic, pornographic images of individuals
  • Images used to extort, bully, threaten, and humiliate victims

The AI models used by these sites are reportedly trained on pornographic images and child sexual abuse material. Users can upload a picture of their target, and the AI generates a realistic, pornographic version. While some sites claim to limit their service to adults only, others allow images of children to be created as well.

“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” Chiu stated.

He emphasized that while AI has “enormous promise,” criminals are exploiting the technology for abusive purposes.

Legal Action and Consequences

  • Lawsuit seeks financial penalties and shutdown of sites
  • Call for domain name registrars, web hosts, and payment processors to stop supporting these sites
  • Efforts to combat non-consensual intimate imagery and child sexual abuse material

The lawsuit alleges that these AI-generated images are “virtually indistinguishable” from real photographs. They have been used to “extort, bully, threaten, and humiliate women and girls,” many of whom have no way to control or remove the fake images once they’ve been created and shared online.

In one troubling incident highlighted by Chiu’s office, AI-generated nude images of 16 eighth-grade students were shared among students at a California middle school in February. The students targeted were 13 and 14 years old.

The legal action seeks civil penalties of $2,500 for each violation and an order requiring the sites to cease operations. It also calls on domain name registrars, web hosts, and payment processors to stop providing services to companies that create these AI-generated deepfakes.

Addressing the Growing Threat

  • Efforts to combat the spread of non-consensual intimate imagery
  • Legislative action and tech company pledges to prioritize child safety
  • Challenges of identifying and protecting real victims in the digital age

The rapid spread of what experts call non-consensual intimate imagery (NCII) has prompted efforts by governments and organizations worldwide to address the issue. The use of AI to generate child sexual abuse material (CSAM) is particularly concerning, as it complicates efforts to identify and protect real victims.

The Internet Watch Foundation, which tracks online child exploitation, has warned that known pedophile groups are already embracing this technology. There are fears that AI-generated CSAM could overwhelm the internet, making it harder to find and remove genuine abuse material.

In response to these growing concerns, some jurisdictions are taking legislative action. For example, a Louisiana state law specifically banning AI-created CSAM went into effect this month.

Major tech companies have pledged to prioritize child safety as they develop AI technologies. However, researchers at Stanford University have found that some AI-generated CSAM has already made its way into datasets used to train AI models, highlighting the complexity of the problem.

