Regulation of Generative AI Models and Its Impact on Creative Workers

At a virtual Federal Trade Commission (FTC) roundtable, representatives from various creative fields expressed their concerns about the use of generative AI models and tools. They emphasized the need for AI regulation to protect their artistic output, brands, voices, and likenesses from AI-generated deepfakes and unauthorized use. The roundtable, titled “Creative Economy and Generative AI,” aimed to explore the impact of generative artificial intelligence on the creative industry.

FTC chair Lina Khan highlighted the agency’s commitment to ensuring fair competition and understanding the potential effects of automated systems, including artificial intelligence, on individuals. The discussion encompassed not only well-known impacts on authors, actors, and visual artists but also showcased lesser-known perspectives.

Challenges in the Fashion Industry

Sara Ziff, founder and executive director of the Model Alliance, drew attention to the fashion industry’s unique challenges. She emphasized the need to consider the context of an industry with fewer worker protections. Ziff expressed concerns about the use of 3D body scans in connection with generative AI and the creation of AI-generated models, particularly those representing people of color. Fashion models have reported instances where they were not adequately informed about the usage of their scans and were not fairly compensated for the use of their images.

Musicians Struggling with Expansive AI Models

Jen Jacobsen, executive director at the Artist Rights Alliance (ARA), discussed the experiences of musicians in dealing with AI-driven tools. While musicians have long used AI for tasks such as auto-tuning vocals and generating beats, they are now facing challenges posed by expansive AI models that mimic their voices without obtaining proper consent or compensating the creators. Jacobsen highlighted that this leads to copyright infringement and unfair competition within the music marketplace, with AI-generated tracks directly competing against musicians’ original work. Additionally, she pointed out the increased use of AI to create deepfakes that deceive fans and harm artists’ public image.

Emphasizing the Human-Centered Approach

Duncan Crabtree-Ireland, national executive director and chief negotiator for SAG-AFTRA, the actors’ union, clarified the union’s stance on AI technologies. While not opposing their existence or use, Crabtree-Ireland stressed that AI-generated content always stems from human creative sources. He highlighted the importance of protecting actors’ brands, voices, and likenesses, ensuring that companies cannot appropriate them without permission. The union’s proposal emphasizes a human-centered approach, in which AI and its algorithms serve the creative workforce rather than override it.

The roundtable also included two plaintiffs involved in lawsuits against prominent AI companies. Author Douglas Preston, alongside several other authors, filed a class-action lawsuit against OpenAI, accusing the company of illegally using books to train its AI without consent or compensation. Preston asserted that books provide the valuable literary content necessary for high-quality language models and highlighted the emotional investment authors put into their work. Similarly, concept artist Karla Ortiz is a plaintiff in a lawsuit against Stability AI, the maker of Stable Diffusion, and Midjourney for infringing on the rights of millions of artists. These companies trained their AI tools on images scraped from the web without the consent of the original artists.

A Call for Consent, Credit, Compensation, and Control

The arguments put forth by these creative workers and labor leaders call for AI regulation that focuses on consent, credit, compensation, and control. They emphasize the need to protect the integrity and livelihoods of those working in the creative industry. These discussions are crucial in ensuring that AI technology respects the rights and contributions of artists and creators.
