Here’s What Happened:
Anthropic PBC wanted to create an AI assistant named “Claude” to compete with other chatbots like ChatGPT. Anthropic wanted Claude to be a step above other chatbots by basing its training on well-curated facts, well-organized analyses, and captivating fictional narratives. Like other chatbot developers, Anthropic needed works to train the large language models (LLMs) that would power Claude. It built a central library of copyrighted works, some purchased and some neither purchased nor licensed (pirated works), and then selected books from this central library to train its LLMs.
Among the books that Anthropic used were works by Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson. They brought a class action suit against Anthropic for copyright infringement in the U.S. District Court for the Northern District of California. The suit covered both works that Anthropic had purchased and pirated works.
The court granted Anthropic’s motion for summary judgment in part. Applying the four-factor fair use test, the court determined that, as to the works Anthropic purchased:
- Purpose and Character of the Use. The use of the books to train Claude was transformative because of the way Anthropic cleaned and tokenized copies of the works before training Claude, which weighed in favor of fair use;
- The Nature of the Original Work. The books were both fiction and non-fiction, but each contained sufficient expressive elements to weigh against fair use;
- The Amount of the Original Work Used. While Anthropic copied the works in their entirety, it did not make them accessible to the public, which weighed in favor of fair use; and
- The Effect of the Use on the Market for the Original. Anthropic did not, and will not, displace demand for the plaintiffs’ works, which weighed in favor of fair use.
The court also determined that Anthropic’s use of pirated works was not fair use. The court will hold a trial on the damages owed to the owners of the pirated works.
WHY YOU SHOULD KNOW THIS: Training LLMs strikes at the heart of copyright because LLMs need source material. This decision may apply only to the type of LLM training that Anthropic used, so those who seek to train LLMs on copyrighted works should proceed with caution.
Cited Authority: Bartz v. Anthropic PBC, No. C 24-05417 WHA, 2025 WL 1741691, at *1 (N.D. Cal. June 23, 2025).