Eliza and Consciousness


Joseph Weizenbaum just passed away. He was the author of Eliza, the psychotherapist program that echoes the user's own thoughts:

USER: I feel listless and unsettled.
ELIZA: Why do you think you feel listless and unsettled?
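
To see how little machinery the trick requires, here is a minimal sketch of the reflection pattern in Python (hypothetical rules of my own, not Weizenbaum's original DOCTOR script):

import re

# Words to swap so the user's statement can be echoed back at them.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

def reflect(fragment):
    # Swap first-person words for second-person ones, word by word.
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(statement):
    # Strip trailing punctuation, then look for a pattern we have a rule for.
    statement = statement.strip().rstrip(".!?")
    match = re.match(r"i feel (.*)", statement, re.IGNORECASE)
    if match:
        return "Why do you think you feel " + reflect(match.group(1)) + "?"
    # When nothing matches, fall back to a content-free prompt.
    return "Please go on."

print(respond("I feel listless and unsettled."))
# Why do you think you feel listless and unsettled?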

Weizenbaum was horrified to realize that some people were actually taken in by such a simple trick, so much so that some of them would spend hours "conversing" with the machine. That convinced him that the idea of a Turing test was fatally flawed: we don't have some magical ability to sense when something else is conscious and when it is just a trick designed to make us think it is. As a matter of fact, we're pretty easy to fool.

It's easy to see that we can design two systems with the same behavior, one of which has qualia as part of the process while the other clearly doesn't.
First, put a man in a box and tell him to push the button marked "green" (which makes a dial point to the word "green") when he sees green, and the button marked "red" when he sees red.
Next, make a system that has the same effect using a photocell and a switch. This is clearly too simple a system to have qualia (anyone who claimed that it did would soon be forced to admit that every blade of grass, every atom, has the same level of functionality and is just as "conscious," which is an absurd position). Yet it behaves the same as the other system, which does have the experience of qualia as one of its steps. From outside the box, there's no way to tell which is which.
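
As a toy illustration of that point (my sketch, not part of the original argument), both boxes can be modeled as callables with identical input/output behavior; a tester confined to the outside can never tell them apart:

def photocell_box(color):
    # A bare lookup wired from photocell to dial: no experience anywhere.
    return {"green": "green", "red": "red"}[color]

def man_in_box(color):
    # Stand-in for the man pressing the matching button; whatever he
    # experiences while doing so is invisible from out here.
    return {"green": "green", "red": "red"}[color]

# Black-box testing sees the same behavior from both.
for box in (photocell_box, man_in_box):
    assert [box(c) for c in ("green", "red")] == ["green", "red"]
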
Despite this inability to tell the difference, we have a different moral obligation to the man in the box, who is capable of being hurt, than we do toward the gadget. Turing's position was that because we can't tell the difference, we ought to treat both boxes as if they were humans. But this is only the case if we really can't look inside the boxes.
We don't know how consciousness happens in the human mind. All we know is that it does happen. It doesn't matter so much what the information flow is as how it is carried out: does it go through a system that experiences qualia as it processes information, or not? It's a fantasy of computer scientists to think that by making something that imitates people's behavior, it will by some magic turn from something that doesn't have subjective awareness into something that does.
