mdahardy | 2 months ago
That said, LLMs are still trained on significantly more data pretty much no matter how you look at it. E.g. a blind child might hear 10-15 million words by age 6 vs. trillions for LLMs.
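A quick order-of-magnitude check on those figures (the 10 trillion token count is an assumed representative value for illustration, not a number from the comment, and words and tokens are treated as roughly comparable):

```python
# Back-of-the-envelope ratio of LLM training data to a child's word exposure.
child_words = 15e6   # upper estimate: ~10-15 million words heard by age 6
llm_tokens = 10e12   # assumed ~10 trillion training tokens, for illustration

ratio = llm_tokens / child_words
print(f"LLMs see roughly {ratio:,.0f}x more text")  # roughly 666,667x
```

Even with generous estimates for the child and conservative ones for the model, the gap is five to six orders of magnitude.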
JohnFen | 2 months ago
A camera hooked up to the baby's head is absolutely not getting all the input data the baby gets. It's not even getting most of it.
ForceBru | 2 months ago
I don't know how to count the number of words a human encounters in their life, but it does seem plausible that LLMs deal with orders of magnitude more words. My point is that words aren't the whole picture.
Humans get continuous streams of video, audio, smell, location and other sensory data. Plus, you get data about your impact on the world and the world's impact on you: what happens when you move this thing? What happens when you touch some fire? LLMs don't have this yet, they only have abstract symbols (words, tokens).
So when I look at it from this "sensory" perspective, LLMs don't seem to be getting any data at all here.
omneity | 2 months ago
The acquired knowledge is a lot less uniform than you’re proposing, and in fact it is full of gaps a human would never have. More critically, the model is not able to peer into all of its vast knowledge at once, so with every prompt what you get is closer to an “instance of a human” than the “all of humanity” you might imagine LLMs to be.
(I train and dissect LLMs for a living and for fun)
minraws | 2 months ago
They mentioned that the amount of training data is much higher for an LLM; LLMs' recall not being uniform was never in question.
No one expects compression to be lossless when you compress below the entropy of the knowledge in your training set.
I am not saying LLMs do simple compression, just pointing out a mathematical certainty.
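That certainty is just the pigeonhole principle. A toy sketch (the fact count and capacity are made-up numbers for illustration, and this says nothing about how LLMs actually store knowledge):

```python
# Pigeonhole sketch: a store with C bits of capacity has only 2**C distinct
# internal states, so it cannot losslessly distinguish more than 2**C facts.
n_facts = 10                   # hypothetical distinct items in the training set
capacity_bits = 3              # hypothetical storage capacity
n_codes = 2 ** capacity_bits   # only 8 distinct internal states

# Any fixed mapping of facts to codes must reuse at least n_facts - n_codes codes.
codes = [fact % n_codes for fact in range(n_facts)]
collisions = len(codes) - len(set(codes))
print(collisions)  # 2 collisions are unavoidable: 10 facts, 8 codes
```

Scaling the numbers up doesn't change the conclusion: once the training set's entropy exceeds the model's capacity, some information is necessarily lost.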
(And I think you don't need to be an expert in building LLMs to understand them; many people here have hands-on experience as well, so I find the extra emphasis on credentials moot.)