top | item 38282518

qd011 | 2 years ago

I genuinely think that this whole argument is a waste of time.

What matters is whether the outputs are useful, and the outputs don't change based on whether you call it "thought", "AGI", or "probabilistic word selection".

verdverm | 2 years ago

At some point, they will get some basic rights, considering both that animal abuse is a crime and that we will anthropomorphize LLMs more and more.

Current tech is not there yet, but we should not wait to discuss it either.

shrimp_emoji | 2 years ago

They might, but they might not.

Trees have been given rights in some places. Some people believe dogs and cats have rights.

Humans have not been given rights in some places. Some people believe some humans don't have rights.

It's not about "sentience" or "consciousness" -- in reality, these concepts are religious ones, like the soul, and don't map to anything objectively meaningful.

A better way to think about rights, I think, is that they stem from the barrel of a gun. Can a thing reciprocate a social contract with me usefully? Can it help me if I'm nice to it? Can it hurt me if I'm mean to it? Alternatively, will entities that can help me do so if they see me help this thing, or harm me if they see me harm this thing?

That's all that matters; I'll be game-theoretically forced to grant it personhood. I'm programmed by my brain to show empathy to you, and, when that fails, you or others will harm me if I hurt you. Or, if I become a powerful dictator, I might execute you for being inconvenient to me. For all I know, you're a p-zombie that I could otherwise hurt with a clean conscience. None of that really matters; it's all about what I'm forced to do, from within or without.

qd011 | 2 years ago

> At some point, they will get some basic rights

I don't know how to express how deeply stupid this sounds while still being polite and constructive.

brucethemoose2 | 2 years ago

Severance (the TV show) is a pretty entertaining exploration of this issue.

Still... maybe it's not a good analogy. LLMs are infinitely replicable and editable. The "conscious experience," if you will, is discontinuous even if you assume the architecture will advance massively. We definitely don't need to be talking about rights yet.