top | item 34732351


canniballectern | 3 years ago

Of course, today's LLMs only appear to have theory of mind at first glance and fall apart under closer scrutiny. But if they can continue to become more and more accurate replicas of the real thing, I don't think it matters at all.

There's no way to know for sure that anyone other than yourself experiences consciousness. All you can do is judge for yourself that what they're describing matches closely enough with your own experiences that they're probably experiencing the same thing you are.


sgt101|3 years ago

I think it does matter because it legitimizes a view of humans (and animals) that undervalues them. The causality of meaning arising from patterns of language rather than patterns of language arising from meaning follows the same inversion as society being more valuable than the humans in it. Bad things have happened when that belief becomes dominant.

pdonis|3 years ago

> There's no way to know for sure that anyone other than yourself experiences consciousness. All you can do is judge for yourself that what they're describing matches closely enough with your own experiences that they're probably experiencing the same thing you are.

That judgment is not just based on the words other people use. It is based on knowing that other people's brains and minds have the same sort of semantic relationships to the rest of the world that yours do. And those relationships can be tested by checking to see if, for example, the other person uses the same words to refer to particular objects in the real world that you do, or if they react to particular real-world events in the same way that you do.

You can't even test any of this with an LLM because the LLM simply does not have the same kind of semantic relationships with the rest of the world that you do. It has no such relationships at all.

s0ulphire|3 years ago

I'll dig up a source in a bit, but there is a critical period of development in which a child must be exposed to language, or they will fail to develop the very core skills you're suggesting are innate in a person regardless of upbringing. This is exactly how you learned everything you know: your parents talked to you. Language grants you the ability to define concepts in the first place; without it, you can't recognise them, because you have no language with which to think about them. So what specifically differentiates the way your brain learned to classify objects and words from the way a NN does? And what stops a NN from developing concepts based on the relationships between those definitions in the same way you do? IMO it's arguably just a matter of processing power and configuration of the network.

mietek|3 years ago

I don’t think you know whether I have a mind or not.

yamrzou|3 years ago

> There's no way to know for sure that anyone other than yourself experiences consciousness.

- Do you see how the fish are coming to the surface and swimming around as they please? That's what fish really enjoy.

- You're not a fish, replied Hui Tzu, so how can you say you know what fish really enjoy?

- You are not me, said Zhuangzi, so how can you know I don't know what fish enjoy?

robertlagrant|3 years ago

> But if they can continue to become more and more accurate replicas of the real thing, I don't think it matters at all.

So, I suppose I'd ask: what does "matter" mean here? If you knew that everyone you loved had been destroyed and replaced by exact replicas, would that matter?

canniballectern|3 years ago

If the replicas were truly exact, I guess not ¯\_(ツ)_/¯

bigbluedots|3 years ago

That already happens frequently enough at a cellular level.

sgt101|3 years ago

Star trek universe!