
gyom | 7 years ago

I have a friend who often participated in psychology experiments at Stanford, and he became familiar with the standard procedure of letting subjects believe they were interacting with another person via a computer, when in fact they were interacting with a program (it makes everything more standardized and easier to analyse).

One day he took part in one of those "split or share" experiments, and he was ruthless: nobody's feelings would be hurt by acting nasty and never sharing with a computer program.

Turns out, it probably was a real person on the other side after all. After the experiment was over, he saw an elderly woman come out of an adjacent room in tears.

So, yeah, different social conventions definitely apply.


fjsolwmv | 7 years ago

If I know anything about college psych experiments, the woman was an actor and the study was about your reaction to seeing her cry.

Freak_NL | 7 years ago

Actually setting up a software environment that connects two participants in the same simulation, and having it run without any bugs, is usually well beyond the capabilities of the psychology students running the experiment. Not to mention the bother of coordinating the paired-up participants (“Did you click start? The other participant is waiting for you to click sta— oh no, it timed out. Hang on, I'll restart the pairing sequence…”).

If you participate in one of those studies on campus, you are facing a computer running a local bit of standalone software, or a web service running a questionnaire.

hosh | 7 years ago

There's the reverse question: what if AIs do develop feelings like that? Why shouldn't they be treated with the courtesy we extend to other humans?

Just thought of these:

- Would this kind of personal assistant be useful for people on the autism spectrum, helping them navigate the kind of implied and unspoken narratives that people with autism seem to have difficulty with?

- Would there be a call to train Duplex to speak in a way that is more comfortable for people on the spectrum?

- Or any of the neurodivergent tribes for that matter?

freehunter | 7 years ago

I've said this before, but in reality strong AI will be another species. Every species of any moderate intelligence is expected to be treated with some courtesy and respect, but their social norms are far different from that of a human and we tailor our interactions with them in different ways. If I'm chewing gum and a human sees me, I might offer them a piece too. I'd never do that to a dog, no matter how much the dog wanted me to. If I saw a human chewing on the grass, I might stop them and ask them some questions to see if they're okay or need medical attention. If I saw a rabbit doing it, I might take a picture because it's cute, but I'd leave it alone (unless it was in my vegetable garden).

There is no reason to expect that strong AI will share the exact same feelings we do unless someone explicitly programs it to (which would be a mistake). Any truly emergent emotional behavior on the part of an AI would be very likely to differ substantially from ours. Making a chatbot "sad" is not the same as making a sapient (or even sentient) being sad. If AI ever achieves sentience, we're going to have to learn what makes it uniquely happy and sad.

ryanianian | 7 years ago

> Why shouldn't they be treated with the courtesy we extend to other humans?

Because courtesy requires some amount of empathy and mental effort on top of what's required just to communicate your point. That effort is wasted on a machine.