---
title: "Millions of books died so Claude could live"
slug: "millions-of-books-died-so-claude-could-live"
date: 2026-02-03
category: tech-pub
tags: [openai, anthropic]
language: en
sources_count: 1
featured: false
publisher: AInauten News
url: https://news.ainauten.com/en/story/millions-of-books-died-so-claude-could-live
---

# Millions of books died so Claude could live

**Published**: 2026-02-03 | **Category**: tech-pub | **Sources**: 1

---

## TL;DR

Anthropic trained Claude on millions of copyrighted books – without permission from publishers or authors.

---

## Summary

Anthropic trained Claude on millions of copyrighted books – without permission from publishers or authors. Training data came from pirated e-book collections and shadow libraries, including Books3 and LibGen. Anthropic invokes fair use, while publishers and authors sue and demand licensing agreements. The Vergecast explores the ethical and legal gray zones of AI training on unauthorized content.

---

## Why it matters

The case tests whether fair use covers training AI models on pirated material. If publishers and authors prevail, AI companies could be forced into licensing agreements for the works they train on.

---

## Key Points

- Anthropic trained Claude on millions of copyrighted books – without permission from publishers or authors.
- Training data came from pirated e-book collections and shadow libraries, including Books3 and LibGen.
- Anthropic invokes fair use, while publishers and authors sue and demand licensing agreements.
- The Vergecast explores the ethical and legal gray zones of AI training on unauthorized content.

---

## Nauti's Take

Anthropic loves to position itself as the good guys in the AI race – with an ethics board and Constitutional AI. But when push comes to shove, they too reach into the piracy jar. Interpreting fair use as a blank check for billion-dollar companies is brazen. Publishers and authors are right: if you profit from others' work, you should pay. That Claude, the poster child for "safe AI," was trained on stolen goods is the irony of the year.

---

## FAQ

**Q:** What is "Millions of books died so Claude could live" about?

**A:** Anthropic trained Claude on millions of copyrighted books – without permission from publishers or authors.

**Q:** Why does it matter?

**A:** The dispute tests whether fair use covers AI training on pirated content; its outcome could force AI companies into licensing agreements with publishers and authors.

**Q:** What are the key takeaways?

**A:** Anthropic trained Claude on millions of copyrighted books without permission from publishers or authors. The training data came from pirated e-book collections and shadow libraries, including Books3 and LibGen. Anthropic invokes fair use, while publishers and authors sue and demand licensing agreements.

---

## Related Topics

- [openai](https://news.ainauten.com/en/tag/openai)
- [anthropic](https://news.ainauten.com/en/tag/anthropic)

---

## Sources

- [Millions of books died so Claude could live](https://www.theverge.com/podcast/872998/anthropic-claude-books-netflix-theaters-vergecast) - The Verge AI

---

## About This Article

This article is a synthesis of 1 sources, curated and summarized by AInauten News. We aggregate AI news from trusted sources and provide bilingual (German/English) coverage.

**Publisher**: [AInauten](https://www.ainauten.com) | **Site**: [news.ainauten.com](https://news.ainauten.com)

---

*Last Updated: 2026-03-20*
