
totorovirus | 2 years ago

Proves my point that LLMs are simply next token predictors. There are many interesting properties where we see "emergence" of intelligence, but I think that's just humans' inability to hold so much knowledge in active memory.


cornel_io | 2 years ago

"Next token predictor" isn't quite the burn that it seems like, because perfect next token prediction would require actual understanding. That's because you can almost always cast any question about understanding into a form where it depends solely on the next token (there are a couple nitpicky exceptions and caveats but not many).

GPT-4 is at a high enough level of performance that mere simple statistics aren't what's helping it do better; it really is developing structures, especially in the middle layers, that perform some amount of high-level understanding.

I don't think that pure next token prediction will always be the optimal way to train and enhance these behaviors, but it's not fair to say that it's unrelated; if this really were just stochastic parroting, LLMs would have topped out well below the level they're at now.

SunlitCat | 2 years ago

That's the thing. Even though the source of their knowledge is pure condensed wisdom, which is some sort of artificial intelligence, they lack the ability to "think", which is crucial for solving problems.

namaria | 2 years ago

Mapping of language patterns in vector space is most definitely not "pure condensed wisdom"

potatoman22 | 2 years ago

LLMs literally are next token predictors, so I'm not understanding your broader point.
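For reference, the loop behind "next token predictor" really is just: score every possible next token, pick one, append it, repeat. A minimal sketch in Python, where `model` and `tokenizer` are hypothetical stand-ins (`model(tokens)` returns a list of logits over the vocabulary; `tokenizer` has encode/decode and an eos_token_id), not any specific library's API:

    # Greedy autoregressive generation, token by token.
    def generate(model, tokenizer, prompt, max_new_tokens=50):
        tokens = tokenizer.encode(prompt)
        for _ in range(max_new_tokens):
            logits = model(tokens)  # score every candidate next token
            next_token = max(range(len(logits)), key=lambda i: logits[i])  # greedy pick
            tokens.append(next_token)  # feed the choice back in and predict again
            if next_token == tokenizer.eos_token_id:
                break
        return tokenizer.decode(tokens)

Everything people argue about ("understanding" or not) happens inside that single model(tokens) call; the outer loop really is that simple.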

deadbabe | 2 years ago

I think this has always been pretty obvious, but the AI faithful have a vested interest in insisting that LLMs can actually think and solve problems.

freejazz | 2 years ago

More shocking are those who insist that the human brain must then also work by just guessing the next missing thing. As if the thought process behind "I'm hungry" starts with "I" and then tries to figure out what fits next best... it's absurd.