Image from theguardian.com

Big Content, AI, and the Battle Over Artistic Control and Compensation

Posted 15th Nov 2025


Major labels such as Universal Music Group (UMG) have recently sued AI music startups for allegedly using their recordings to train text-to-music models without permission. In a notable development, UMG later struck a deal with the defendant Udio to build an AI music platform. However, advocacy groups like the Music Artists Coalition caution that such partnerships are often framed as collaboration while delivering little compensation or control to artists themselves.

US courts are actively handling multiple AI-related copyright cases, and judges are struggling to apply traditional copyright doctrine, built for older intellectual property disputes, to the use of copyrighted works as AI training data. Cases such as Andersen v. Stability AI illustrate the ongoing tension in this area.

Generative AI is already displacing creative labor; a 2024 Society of Authors survey found that over a third of illustrators have lost income due to AI, and projections for 2028 estimate a 21% revenue loss for audiovisual creators. In response, the Human Artistry Campaign has united industry professionals and artists to push for legislation that protects artists from the impacts of AI and big tech, underscoring the stance that AI cannot replace human artistry.

There are concerns that alliances between big-content companies and technology firms could lead to exclusive licensing deals that benefit corporations while sidelining the rights and livelihoods of artists. This dynamic has been likened to “the enemy of my enemy” scenarios. In practice, licensing deals such as the one between Runway and Lionsgate have often provided little to no compensation or opt-out options for artists, raising questions about the sufficiency and fairness of such arrangements.

Legislative efforts include the NO FAKES Act, which proposes a federal digital replication right aimed at regulating AI deepfakes. However, civil liberties groups have expressed concern about the potential vagueness of the law and risks to free speech, especially given that rights could extend to licensing or transferring digital replicas for up to 10 years (five years for minors). This ongoing debate highlights the complex balance between technological innovation, copyright enforcement, artists' rights, and civil liberties.

Sources
The Guardian
https://www.theguardian.com/commentisfree/2025/nov/15/big-content-ai-entertainment-media-conglomerates-tech
* This article has been summarised using Artificial Intelligence and may contain inaccuracies. Please fact-check details with the sources provided.