top | item 47162951


mzhaase | 5 days ago

It has always seemed to me that LLMs may be like the language center of the brain, and that there should be a "whole damn rest of the brain" behind it to steer it.

LLMs miss very important concepts, like the concept of a fact. There is no "true", just consensus text on the internet given a certain context. Like that recent study where LLMs gave wrong info when the biography of a poor person was in the context.


steve1977 | 5 days ago

I think much along the same lines. LLMs are probably even just a part of the language center.

And of course they also miss things like embodiment, mirror neurons, etc.

If an LLM makes a mistake, it will tell you it is sorry. But does it really feel sorry?

red75prime | 5 days ago

> But does it really feel sorry?

And what does it mean to feel sorry? Beyond the fallible and imprecise human introspective notion of "sorry", that is. A definition that can span species and computing substrates. A deanthropomorphized definition of "sorry", so to speak.

joquarky | 4 days ago

Ever practiced meditation of the form where you just witness your thoughts? It seems just like LLM-generated words, both factual statements and confabulated nonsense.

dnautics | 5 days ago

That's unlikely. But they are an awful lot like Turing machines (K/V cache ~ Turing tape), so their architecture is strongly predisposed to be able to find any algorithm, possibly including reasoning.
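The K/V-cache-as-tape analogy in the comment above can be illustrated with a toy sketch (the class name, vectors, and numbers are my own illustration, not anything from the thread, and this is not a real transformer): during decoding, the cache is written append-only, one entry per step, and attention acts as a read head that can revisit every past position.

```python
import math

class KVCache:
    """Append-only store of (key, value) pairs, one per generated token."""
    def __init__(self):
        self.keys = []    # one vector per past position
        self.values = []

    def append(self, k, v):
        # Like writing a fresh tape cell: past entries are never mutated.
        self.keys.append(k)
        self.values.append(v)

    def attend(self, query):
        # Read head: softmax over dot products with all stored keys,
        # then a weighted sum of the corresponding values.
        scores = [sum(qi * ki for qi, ki in zip(query, k)) for k in self.keys]
        m = max(scores)
        weights = [math.exp(s - m) for s in scores]
        z = sum(weights)
        dim = len(self.values[0])
        return [sum(w * v[d] for w, v in zip(weights, self.values)) / z
                for d in range(dim)]

cache = KVCache()
cache.append([1.0, 0.0], [2.0, 2.0])   # step 1 writes a cell
cache.append([0.0, 1.0], [4.0, 0.0])   # step 2 writes another
out = cache.attend([10.0, 0.0])        # query matches the first key, so the
                                       # read lands almost entirely on [2.0, 2.0]
```

The tape analogy is only partial, of course: a Turing machine can overwrite cells, while a K/V cache is write-once; the "any algorithm" claim in the comment is about expressive predisposition, not a formal equivalence.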