Former VP Leaves Stability AI Over Copyright Concerns – Decrypt

The head of audio at Stability AI is leaving the company over how the AI developer justifies training its generative AI models on copyrighted works.

“I’ve resigned from my role leading the Audio team at Stability AI, because I don’t agree with the company’s opinion that training generative AI models on copyrighted works is ‘fair use,’” wrote Ed Newton-Rex, former vice president of audio at the company.

Newton-Rex expressed gratitude to his former colleague, Stability AI founder Emad Mostaque, and for the work they had done together, but said he was unable to change the company’s official stance on using copyrighted material to train its models. He pointed to a 22-page comment on generative AI that his former employer submitted to the U.S. Copyright Office, which called the emerging technology “an acceptable, transformative, and socially-beneficial use of existing content that is protected by fair use.”

“I disagree because one of the factors affecting whether the act of copying is fair use, according to Congress, is ‘the effect of the use upon the potential market for or value of the copyrighted work,’” Newton-Rex said. “Today’s generative AI models can clearly be used to create works that compete with the copyrighted works they are trained on. So I don’t see how using copyrighted works to train generative AI models of this nature can be considered fair use.”

Generative AI refers to AI models that create text, images, music, and video using prompts, drawing from a massive corpus of training material—material that is, more often than not, harvested wholesale from the open internet. As a result, copyright has become a central part of the discussion around the technology.

Mostaque replied to Newton-Rex’s Twitter thread, providing a direct link to the submitted comment.

“[It] was great working with you [and] this is an important discussion,” Mostaque replied.

Newton-Rex said fair use laws were not designed with generative AI models in mind, and that training models under the fair use doctrine was wrong. He said he could only support generative AI companies that don’t exploit creators by training their models on artists’ work without permission.

Since July, Stability AI, Midjourney, and DeviantArt have faced a lawsuit over their AI image generators based on claims of copyright infringement. In October, a federal judge dismissed most of the claims brought by a group of artists, including illustrator Sarah Andersen, against Midjourney and DeviantArt, but said the lawsuit against Stability AI could move forward.

“Companies worth billions of dollars are, without permission, training generative AI models on creators’ works, which are then being used to create new content that in many cases can compete with the original works,” Newton-Rex reiterated. “I don’t see how this can be acceptable in a society that has set up the economics of the creative arts such that creators rely on copyright.”

Earlier this year, as the now-resolved WGA strike was heating up, actress and computer scientist Justine Bateman sounded the alarm on how generative AI could disrupt the entertainment industry, a concern that became a key factor in the historic WGA and SAG-AFTRA strikes.

“I’m sure I’m not the only person inside these generative AI companies who doesn’t think the claim of ‘fair use’ is fair to creators,” Newton-Rex concluded. “I hope others will speak up, either internally or in public, so that companies realize that exploiting creators can’t be the long-term solution in generative AI.”

Edited by Ryan Ozawa.
