Searle's Chinese Room, anyone?
Searle requests that his reader imagine that, many years from now, people have constructed a computer that behaves as if it understands Chinese. It takes Chinese characters as input and, using a computer program, produces other Chinese characters, which it presents as output. Suppose, says Searle, that this computer performs its task so convincingly that it comfortably passes the Turing test: it convinces a human Chinese speaker that the program is itself a human Chinese speaker. All of the questions that the human asks it receive appropriate responses, such that the Chinese speaker is convinced that he or she is talking to another Chinese-speaking human being. Most proponents of artificial intelligence would draw the conclusion that the computer understands Chinese, just as the Chinese-speaking human does.
Searle then asks the reader to suppose that he is in a room in which he receives Chinese characters, consults a book containing an English version of the aforementioned computer program, and processes the Chinese characters according to its instructions. He does not understand a word of Chinese; he simply manipulates what, to him, are meaningless symbols, using the book and whatever other equipment, like paper, pencils, erasers and filing cabinets, is available to him. After manipulating the symbols, he responds to a given Chinese question in the same language. Since the computer passed the Turing test this way, it is fair, says Searle, to deduce that he has done so too, simply by running the program manually. "Nobody just looking at my answers can tell that I don't speak a word of Chinese," he writes.
This lack of understanding, according to Searle, proves that computers do not understand Chinese either, because they are in the same position as he is: nothing but mindless manipulators of symbols. They do not have conscious mental states like an "understanding" of what they are saying, so they cannot fairly and properly be said to have minds.
http://en.wikipedia.org/wiki/Searle's_Chinese_room
One of my favorite philosophical arguments. Even neuro-psychological buffs like Steven Pinker can only bumble their way around it. The central point is that consciousness entails something more than immediacy, something more than response. The old-school existentialists had this in mind when they referred to consciousness as, or as entailing, negation -- specifically the negation between input and output, which in practice is called reflection, which entails understanding.