Former Vice President Departs Stability AI Due to Copyright Issues



Stability AI VP Leaves Company Over Copyright Stance

Stability AI's head of audio has left the company over the AI developer's stance on training its generative models with copyrighted works. Ed Newton-Rex, the company's now-former vice president of audio, said he disagreed with Stability AI's position and resigned from his role.

Newton-Rex disagreed with Stability AI’s official stance on using copyrighted material to train its models. He pointed to a 22-page comment on generative AI that his former employer submitted to the U.S. Copyright Office, which argued that training generative AI is “an acceptable, transformative, and socially-beneficial use of existing content that is protected by fair use.”

He argued that today’s generative AI models can be used to create works that compete with the copyrighted works they are trained on, which in his view places such training outside fair use. Stability AI CEO Emad Mostaque replied to Newton-Rex’s Twitter thread with a direct link to the submitted comment.

Generative AI and Copyright Issues

Generative AI refers to AI models that create text, images, music, and video using prompts and drawing from a massive corpus of training material. Copyright has become a central part of the discussion around this technology.

Newton-Rex said fair use laws were not designed with generative AI models in mind and that training models under the fair use doctrine was wrong. He said he supports generative AI built by companies that do not exploit creators by training models on their work without permission.

Lawsuit Against Stability AI

Since July, Stability AI, Midjourney, and DeviantArt have been defendants in a lawsuit accusing their AI image generators of copyright infringement. A federal judge dismissed most of the claims against Midjourney and DeviantArt but allowed the case against Stability AI to move forward.

Newton-Rex reiterated his concern that companies are exploiting creators by training generative AI models on their works without permission, in a society where creators rely on copyright.

A Call for Change

Newton-Rex concluded by expressing hope that others inside generative AI companies would speak up against the exploitation of creators, so the industry recognizes that this approach cannot be a long-term solution.

Hot Take: The Impact of Generative AI on Copyright Laws


This article highlights an important debate within the tech industry regarding how generative AI models are trained using copyrighted works. The departure of Stability AI’s head of audio over this issue underscores the ethical and legal challenges posed by this emerging technology. As generative AI continues to evolve, it’s crucial for companies to consider the rights of creators and ensure fair compensation for their work.

