AI and Copyright Infringement in 2024 – What’s Next for 2025

Anyone following news stories about the rapid pace of AI development and creative industries’ ongoing battle over rampant copyright infringement will know that 2024 was a contentious year.

AI Copyright Infringement Lawsuits – The Year in Review

As the year started, authors sued OpenAI and Microsoft for copyright infringement, claiming the companies used their copyrighted works without permission to train the ChatGPT AI chatbot. By March, a group of authors had filed a similar lawsuit against Nvidia.

In April, lawmakers proposed the Generative AI Copyright Disclosure Act to require tech companies to reveal copyrighted works used in their AI systems. No action has been taken to date. Also in April, more than 200 prominent artists, including Nicki Minaj and Billie Eilish, added their signatures to a letter demanding action on what they see as the music industry’s “predatory use of AI.” In May, OpenAI was sued again for using copyrighted works in generative AI system training.

In June, major labels sued Udio and Suno, AI music generator startups, for massive copyright infringement. The US Copyright Office released its report on AI deepfakes in July, while senators introduced the NO FAKES Act (Nurture Originals, Foster Art, and Keep Entertainment Safe). That bill, which addresses the dangers of AI digital replicas, is in committee.

In mid-December, the Writers Guild of America (WGA) called on Hollywood studios to aggressively pursue legal action against tech firms for using copyrighted material without permission. At the same time, the UK government proposed letting celebrities take legal action against AI companies for unauthorized use of their voice or likeness.  

The struggle to find a balance between AI innovation and compensating creators for the use of their copyrighted work (and granting celebrities control over their image and voice) continues into the new year.

UK Studies AI Development and Copyright Concerns

In December, the UK government launched a 10-week consultation to address AI development and the copyright concerns of creative industries. The initiative aims to establish clear guidelines for using copyrighted material in AI model training while ensuring fair compensation for rights holders.

The consultation focuses on three main areas:

  1. Improving transparency between AI developers and rights holders about how creative content is obtained and used in AI training.
  2. Developing frameworks for licensing and compensating creators for their material.
  3. Ensuring AI developers have access to high-quality training data while respecting intellectual property rights.

A key proposal is to introduce a copyright law exception for commercial AI training purposes while allowing copyright holders to reserve their rights and control their content’s usage. The government also proposes new transparency requirements for AI developers regarding their training datasets and acquisition methods.

The initiative also looks at personality rights protection in relation to digital replicas and deepfakes, examining whether current legal frameworks adequately address these issues.

Opposition to Proposed Copyright Law Changes

Jo Twist, CEO of BPI (British Phonographic Industry), strongly opposes the copyright exception, arguing it would discourage tech companies from pursuing proper licensing agreements and effectively subsidize overseas tech corporations at the expense of UK creators. The UK’s creative industries are valued at £125 billion (US $154 billion) annually. BPI also opposes the proposed “opt-out” rule, which would let AI systems freely use copyrighted content for training unless copyright holders explicitly reserve their rights.

A broad coalition of creative industry organizations, including the Motion Picture Association, Getty Images, the Society of Authors, and the Independent Society of Musicians, joins BPI in opposing the proposed updates. The coalition believes that the current copyright laws should be more rigorously enforced. More than 38,000 people, including Kate Bush and Paul McCartney, signed a petition against AI copyright theft.

The tension between unfettered AI advancement and protecting intellectual property rights continues.

The Hollywood Reporter – Georg Szalai – December 17, 2024

City AM – Maria Ward-Brennan – December 31, 2024

The Times – Isabella Fish – December 31, 2024

AI Deepfakes US Legislative Update

The US Copyright Office’s July 2024 Report on Copyright and Artificial Intelligence, focusing on deepfakes (AI-generated digital replicas), found that current laws do not address the challenges posed by unauthorized digital replicas.

Research shows that 98% of deepfake videos online are sexually explicit, predominantly targeting women. Additional threats include financial scams, fraudulent celebrity endorsements, and the potential to undermine news integrity and political discourse through undetectable disinformation.

The report finds significant gaps in existing legal protections. Copyright law doesn’t protect individual identity, the Lanham Act requires commercial use, and the Federal Trade Commission Act is limited to commercial contexts. State laws vary widely and often have narrow applicability, with right of publicity laws being the most relevant but inconsistent across jurisdictions.

In response to these findings, the Copyright Office called for urgent new federal legislation. A bipartisan group of senators introduced the NO FAKES Act, which would require a person’s authorization before their visual likeness or voice can be used. The bill received broad support from artists, AI companies, and entertainment industries. The House of Representatives proposed its own, similar legislation, the NO AI FRAUD bill.

Federal legislation awaits the new Congress. Meanwhile, at least twelve states have enacted laws specifically addressing sexually explicit deepfakes. Local authorities are also pursuing legal action, such as San Francisco’s District Attorney filing suit against websites enabling non-consensual nude image creation.

States Expanding Publicity Rights

States are also actively expanding publicity rights. While these rights originated to prevent unauthorized commercial endorsements, they’re now being expanded to cover almost any content that “evokes” a person’s identity. Tennessee passed the ELVIS Act, extending existing celebrity rights to include voices and broadening liability for unauthorized distribution. California responded with AB 1836, creating strict rules about using deceased personalities’ likenesses, though with some exceptions that may be difficult to interpret without legal expertise.

Reuters – Paven Malhotra, Michelle Ybarra and Matan Shacham – December 18, 2024

Electronic Frontier Foundation – Corynne McSherry – December 27, 2024

What 2025 Holds for Tech Companies and AI Copyright Lawsuits

Several crucial copyright lawsuits against AI companies like OpenAI, Anthropic, and Meta are expected to reach critical stages in 2025. The central legal question revolves around whether AI companies’ use of copyrighted material for training their models constitutes “fair use” under copyright law.

Tech companies argue their AI systems make fair use of copyrighted content by learning from it to create new, transformative works. Copyright holders contend this unauthorized use generates competing content that threatens their livelihoods. Major tech companies and investors warn that requiring payment for training data could severely impact AI industry growth in the US.

While some content owners, including the Financial Times, Reddit, and News Corp, have opted to license their material to tech companies, others, like the New York Times and major record labels, continue pursuing legal action. If courts side with AI companies on the fair use question, they could avoid copyright liability entirely.

Two cases may provide early indicators of how judges will approach fair use arguments:

  1. Thomson Reuters vs. Ross Intelligence, regarding AI-powered legal research
  2. Music publishers vs. Anthropic, concerning song lyrics used to train Claude

A notable development occurred in November when a New York judge dismissed a case from AlterNet and Raw Story against OpenAI, ruling they failed to demonstrate harm from alleged copyright violations. This decision suggests other cases might conclude without addressing fair use if plaintiffs cannot prove harm from AI training use.

The outcomes of these cases could vary across jurisdictions, and multiple appeals are likely. The resolution of these lawsuits will significantly influence how AI companies operate and interact with content creators in the future.

Reuters – Blake Brittain – December 27, 2024

IFPI Warns of Stream-Ripping and Generative AI

IFPI (International Federation of the Phonographic Industry) identifies two major threats to the music industry: stream-ripping services and generative AI, with the former potentially fueling the latter’s development. They suggest these threats could impact the industry’s long-term sustainability.

According to IFPI’s senior legal advisor Catherine Lloyd, while AI can enhance creative processes, generative AI poses significant challenges. The primary concerns include the unauthorized use of music for AI model training and the creation of competing content, particularly through voice cloning technology. This led IFPI to add ‘AI Vocal Cloning’ as a new category in its annual piracy markets report.

Stream-ripping, which involves downloading music from platforms like YouTube, remains the industry’s most immediate piracy concern. While YouTube has become a significant revenue source for the music industry through licensing agreements, it is also the primary source for stream-ripping activities. Legal actions have focused on blocking stream-ripping websites through ISPs and targeting hosting providers. IFPI sees controlling access to training data through licensing as crucial for maintaining the industry’s position in an evolving market where AI-generated content could create unprecedented competition.

TorrentFreak – Andy Maxwell – December 31, 2024