item 40938837


SI_Rob | 1 year ago

Do humans?

Are these terms well defined or just subjective "I know it when I feel it" echoes of an unresolved debate over residual beliefs in a dualistic mind/body dichotomy?

What if it turns out that a confluent sequence of sensory inputs amounts to a unique neurophysical vector that initiates a particular activation cascade in another cluster of nerves, some outputs of which have no images in the conscious domain (are not phenomenological) despite strongly informing it, resulting in what we call 'creativity'; and that all of this together defines a path back through our sensory encoding/decoding apparatus which we recognize as 'thought'?

I am not convinced that we are looking at this question through the right end of the telescope here.

discuss


orbital-decay | 1 year ago

That's always the main issue with any piece that relies on ill-defined terms like intelligence, consciousness, self-consciousness, thinking, and understanding. Nobody has come close to defining them in a practical manner in decades, if not centuries, yet when LLMs arrived, lots of people were suddenly absolutely sure that LLMs don't do any of this while humans and animals do.

beardyw | 1 year ago

> What if it turns out that a confluent sequence of ...

That is a useful way of looking at it. But the problem remains: how do you train on sight, sound, and touch in any sort of useful way? It takes 20 years to make an adult using its non-digital hardware (which I would consider superior for the task). If it could happen quicker, do you not think it would?

MattPalmer1086 | 1 year ago

Thinking and understanding are related to logic and linking information together in useful patterns.

It would be perfectly possible, and uncontroversial, for a machine to do that without any consciousness.

The point being made is that LLMs don't even do that.