jaccola | 18 days ago
"The distilled LLM isn't stealing the content from the 'parent' LLM, it is learning from the content just as a human would, surely that can't be illegal!"...
budududuroiu | 18 days ago
> The court’s decision in Thaler v. Perlmutter, on March 18, 2025, supports the position adopted by the United States Copyright Office and is the latest chapter in the long-running saga of an attempt by a computer scientist to challenge that fundamental principle.
I, like many others, believe the only way AI won't immediately get enshittified is by fighting tooth and nail for LLM output to never be copyrightable
https://www.skadden.com/insights/publications/2025/03/appell...
amenhotep | 18 days ago
I think it's a pretty weak distinction. By separating the concerns (having one company collect a corpus and then "illegally" sell it for training), you can pretty much exactly reproduce the acquire-books-and-train-on-them scenario. But in the simplest case, the EULA does make it slightly different.
Like, if a publisher pays an author to write a book, with the contract specifically saying they're not allowed to train on that text, and then they train on it anyway, that's clearly worse than someone just buying a book and training on it, right?
BeetleB | 18 days ago
Nice phrasing, using "pirate".
Violating the TOS of an LLM is the equivalent of pirating a book.
creamyhorror | 18 days ago
Ultimately it's up to legislation to formalize the rules, ideally based on principles of fairness. Is it fair, in a non-legalistic sense, for all old books to be trainable on, but not LLM outputs?
TZubiri | 18 days ago
American models train on public data without a "do not use this without permission" clause.
Chinese models train on models that have a "you will not reverse engineer" clause.
WSSP | 18 days ago
This is going through various courts right now, but likely not.