js8 | 3 days ago

Dennett also came to my mind reading the title, but in a different sense. When the theory of evolution was first proposed, many people found it hard to conceive how "subtly selecting among random changes" could build a mechanism as complex as a human. I think Dennett offers a nice analogy with a skyscraper: how can it be built if cranes are only so tall?

In a similar way, LLMs build small abstractions: first over words (how to subtly rearrange them without changing meaning), then over logic patterns such as "if B follows from A, and we're given A, then B", and eventually they learn to reason in various ways.
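The logic pattern named here is modus ponens, and it can be sketched as a tiny inference step. The fact and rule names below are illustrative, not anything from the comment:

```python
# Modus ponens as a minimal forward-inference step:
# from a rule "B follows from A" and the fact "A", derive "B".
facts = {"A"}
rules = [("A", "B")]  # (premise, conclusion) pairs

for premise, conclusion in rules:
    if premise in facts:
        facts.add(conclusion)

print(sorted(facts))  # ['A', 'B']
```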

It's the scale of the whole process that defies human understanding.

(Also, modern LLMs are not just next-word predictors anymore; there is a reinforcement learning component as well.)
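The "plain next-word predictor" part can be illustrated with a toy sketch: pick the most frequent continuation from counts of word pairs. The bigram table here is made up for illustration; real models learn distributions over tokens rather than looking up counts:

```python
# Toy greedy next-word prediction from hypothetical bigram counts.
bigram_counts = {
    "the": {"cat": 3, "dog": 1},
    "cat": {"sat": 2},
}

def next_word(word):
    """Return the most frequent continuation, or None if unseen."""
    choices = bigram_counts.get(word, {})
    return max(choices, key=choices.get) if choices else None

print(next_word("the"))  # cat
```

Reinforcement learning then adjusts such a predictor toward whole outputs that score well under a reward signal, rather than just matching the next word.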
