gyom | 7 years ago
One day he participated in one of those "split or share" experiments, and he was ruthless. Nobody's feelings would be hurt by playing nasty and never sharing with a computer program.
It turned out there probably was a real person on the other side after all: when the experiment was over, he saw an old woman come out of an adjacent room, crying.
So, yeah, different social conventions definitely apply.
Freak_NL|7 years ago
If you participate in one of those studies on campus, you are facing a computer running either a standalone piece of local software or a web service presenting a questionnaire.
salgernon|7 years ago
https://en.wikipedia.org/wiki/Susan_Bennett
hosh|7 years ago
Just thought of these:
- Would this kind of personal assistant be useful for people on the autism spectrum, to help navigate the kind of implied and unspoken narratives that people with autism seem to have difficulty with?
- Would there be a call to train Duplex to speak in a way that is more comfortable for people on the spectrum?
- Or, for that matter, for any of the neurodivergent tribes?
freehunter|7 years ago
There is no reason to expect that strong AI will share the exact same feelings we do unless someone explicitly programs it to (which would be a mistake). Any truly emergent emotional behavior on the part of an AI would be very likely to differ substantially from ours. Making a chatbot "sad" is not the same as making a sapient (or even sentient) being sad. If AI ever achieves sentience, we're going to have to learn what makes it uniquely happy and sad.
ryanianian|7 years ago
Because courtesy requires some amount of empathy and mental effort on top of what's required just to communicate your point. That effort is wasted on a machine.