
Chinese room argument

11 points | jermaink | 8 years ago | en.wikipedia.org

9 comments


cousin_it|8 years ago

For everyone who considers the Chinese Room argument obviously wrong (like me), here are two versions that are much stronger and might still make you uneasy:

In Greg Egan's "Jewelhead" stories, every person gets a computer implanted in their brain at birth, which gradually learns to imitate the input-output behavior of the biological brain. At some point they switch to the jewel full time and throw away the biological brain, becoming immortal. That's seen as a fact of life and people don't question it much.

In one of Wei Dai's nightmare scenarios, we ask an AI to upload humans in an efficient way. Unfortunately, since humans can't introspect into the idea of "I'm conscious" very deeply, the resulting resource-optimized uploads just have a handful of hardcoded responses to questions about consciousness, and aren't in fact conscious. Nobody notices.

Of course, both cases are problematic only if you can "optimize" a human brain into something else that mimics the same input-output behavior without being conscious. The trouble is that we can't rule out that possibility today. Humans certainly have a lot of neural circuitry that's a side effect of something else. Some of it might get optimized out, the way a human in a sealed room can be optimized to nothing at all. To rule out a "Disneyland without children" scenario, wishful thinking isn't enough; we need to properly figure out consciousness and qualia.

dTal|8 years ago

In Wei Dai's "nightmare scenario", what of value is lost?

If "consciousness", whatever that is, is completely undetectable by external means and has no effect on human behavior, I think we can safely ignore it.

dTal|8 years ago

The most famous example of an entire category of fallacious arguments about consciousness:

  1) construct a hypothetical brain on an implausible substrate
  2) note the implausibility of the substrate
  3) therefore consciousness is magic QED
See also: "China brain"

bwasti|8 years ago

From the article, in "Replies":

The fact that the man does not understand Chinese is irrelevant, because it is only the system as a whole that matters. Searle notes that (in this simple version of the reply) the "system" is nothing more than a collection of ordinary physical objects; it grants the power of understanding and consciousness to "the conjunction of that person and bits of paper" without making any effort to explain how this pile of objects has become a conscious, thinking being. Searle argues that no reasonable person should be satisfied with the reply, unless they are "under the grip of an ideology".

dTal|8 years ago

Searle's note applies equally well to an "ordinary" brain, which is likewise nothing more than a collection of ordinary physical objects (atoms).