top | item 37099563

gradys | 2 years ago

What does understanding and reasoning mean?


NateEag | 2 years ago

We don't know. The nature of consciousness is an unsolved problem.

We do know that LLMs fall into a certain category of mistake that most educated humans look at and go "HA! What was it thinking??"

It's not that humans don't also make those types of errors - it's that we recognize them quickly when they're pointed out to us, and we usually describe the error as a "stupid mistake," a "brain fart," or some similar name meant to signal "gosh, I totally failed to actually think before I did that."

The LLMs show no sign of such self-awareness or, well, "intelligence," loose and squishy as those words are.

Maybe GPT-5 will fix that, but so far it doesn't look that way.