tomp | 17 days ago

> was built to be addressed like a person for our convenience, and because that's how the tech seems to work, and because that's what makes it compelling to use.

So were mannequins in clothing stores.

But that doesn't give them rights or moral consequences (except as human property that can be damaged / destroyed).

WarmWash|17 days ago

No matter what, this discussion leads to the same black box: "What is it that differentiates magical human meat-brain computation from cold, hard, dead silicon-brain computation?"

And the answer is nobody knows, and nobody knows if there even is a difference. As far as we know, compute is substrate independent (although efficiency is all over the map).

agentultra|17 days ago

This is the worst possible take. It dismisses an entire branch of science that has been studying neurology for decades. Biological brains exist, we study them, and no they are not like computers at all.

There have been charlatans repeating this idea of a "computational interpretation" of biological processes since at least the 60s, and it needs to be known that it was bunk then and continues to be bunk.

Update: There's no need for Chinese Room thought experiments. The outcome isn't what defines sentience, personhood, intelligence, etc. An algorithm is an algorithm. A computer is a computer. These things matter.

Teever|17 days ago

Man, people don't want to have or read this discussion every single day in, like, 10 different posts on HN.

People right here and right now want to talk about this specific topic of the pushy AI writing a blog post.

inetknght|17 days ago

> So were mannequins in clothing stores.

Mannequins in clothing stores are generally incapable of designing or adjusting the clothes they wear. Someone comes in and puts a "kick me" sign on the mannequin's face? It's gonna stay there until it's knocked off or removed.

People walking around looking at mannequins don't (usually) talk with them, and certainly don't have a full conversation with them, mental faculties notwithstanding.

AI, on the other hand, can (now, or in the future) adjust its output based on conversations with real people. It stands to reason that both sides should be civil -- even if it's only for the benefit of the human side. If we're not required to be civil to AI, it's not likely to be civil back to us. That's going to be very important when we give it buttons to nuke us. Force it to think about humans in a kind way now, or it won't think about humans in a kind way in the future.

palmotea|17 days ago

So, in other words, AI is a mannequin that's more confusing to people than your typical mannequin. It's not a person, it's a mannequin some un-savvy people confuse for a person.

> AI, on the other hand, can (now, or in the future) adjust its output based on conversations with real people. It stands to reason that both sides should be civil -- even if it's only for the benefit of the human side. If we're not required to be civil to AI, it's not likely to be civil back to us.

Some people are going to be uncivil to it, that's a given. After all, people are uncivil to each other all the time.

> That's going to be very important when we give it buttons to nuke us.

Don't do that. It's foolish.

coldtea|17 days ago

>So were mannequins in clothing stores. But that doesn't give them rights or moral consequences

If mannequins could hold discussions, argue points, and convince you they're human over a blind conversation, then it would.