item 45165437


matt3D | 5 months ago

Watching my children learn how to talk, I have come to the conclusion that the current LLM concept is one part of a two-part problem.

Kids learn to speak before they learn to think about what they're saying. A 2- or 3-year-old can start regurgitating sentences and forming new ones that sound an awful lot like real speech, but it often seems like the child is just trying to fit in; they don't really understand what they're saying.

I used to joke that my kids' talking was sometimes just like typing a word on my phone and then repeatedly hitting the next predictive word that shows up. Since then it's evolved in a way that seems similar to LLMs.

The actual process of thought seems slightly divorced from the ability to pattern-match words, but the pattern matching serves as a way to communicate it. I think we need a thinking machine to spit out vectors that the LLM can convert into language. So I don't think LLMs are a dead end; I think they are just missing the other half of the puzzle.
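The split described here can be caricatured in a few lines: a "reasoner" that only emits concept vectors, never words, and a "verbalizer" that pattern-matches those vectors to the nearest word. Everything below (the tiny vocabulary, the 3-d embeddings, the function names) is invented purely for illustration, not how any real system works:

```python
import numpy as np

# Hypothetical toy vocabulary with hand-made 3-d "embeddings".
vocab = {
    "hungry": np.array([1.0, 0.0, 0.0]),
    "tired":  np.array([0.0, 1.0, 0.0]),
    "happy":  np.array([0.0, 0.0, 1.0]),
}

def reasoner(internal_state):
    """Stand-in 'thinking' module: turns a raw internal state into a
    direction in concept space, without ever touching words."""
    v = np.asarray(internal_state, dtype=float)
    return v / np.linalg.norm(v)

def verbalizer(concept):
    """Stand-in 'language' module: pattern-matches the concept vector
    to the closest word by dot-product similarity."""
    return max(vocab, key=lambda w: float(np.dot(vocab[w], concept)))

print(verbalizer(reasoner([0.9, 0.1, 0.0])))  # → hungry
```

The point of the sketch is only that the two halves have separate jobs: the reasoner never sees the vocabulary, and the verbalizer never sees the internal state, just the vector.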


kuekacang | 5 months ago

Another part is malleable memory, something like what I imagine we humans do: accumulating context daily and doing reinforcement training while we sleep.