MeatballWiki | RecentChanges | Random Page | Indices | Categories

Quoting Alan Turing's 1950 paper "Computing Machinery and Intelligence", reprinted in "Understanding Intelligence" by Rolf Pfeifer and Christian Scheier, ISBN 0262161818.

"It (the imitation game) is played by three people, a man (A), a woman (B), and an interrogator (C) who may be of either sex. The interrogator stays in a room apart from the other two. The object of the game for the interrogator is to determine which of the other two is the man and which is the woman. He knows them by labels X and Y, and at the end of the game he says either "X is A and Y is B" or "X is B and Y is A." The interrogator is allowed to put questions to A and B thus:

C: Will X please tell me the length of his or her hair?

Now suppose X is actually A, then A must answer. It is A's object in the game to try and cause C to make the wrong identification. His answer might therefore be:

"My hair is shingled, and the longest strands are about nine inches long."

Furthermore, the object of the game for the third player (B) is to help the interrogator.

We now ask the question, "What will happen when a machine takes the part of A in this game?" Will the interrogator decide wrongly as often when the game is played like this as he does when the game is played between a man and a woman? These questions replace our original, "Can machines think?"

This approach to understanding machine intelligence is criticized by John R. Searle in the famous Chinese Room thought experiment, published in "Minds, Brains, and Programs", 1980 [1], ISBN 0674576330.

Suppose that I'm locked in a room and given a large batch of Chinese writing. Suppose furthermore (as is indeed the case) that I know no Chinese, either written or spoken, and that I'm not even confident that I could recognize Chinese writing as Chinese writing distinct from, say, Japanese writing or meaningless squiggles. To me, Chinese writing is just so many meaningless squiggles.

Now suppose further that after this first batch of Chinese writing I am given a second batch of Chinese script together with a set of rules for correlating the second batch with the first batch. The rules are in English, and I understand these rules as well as any other native speaker of English. They enable me to correlate one set of formal symbols with another set of formal symbols, and all that 'formal' means here is that I can identify the symbols entirely by their shapes. Now suppose also that I am given a third batch of Chinese symbols together with some instructions, again in English, that enable me to correlate elements of this third batch with the first two batches, and these rules instruct me how to give back certain Chinese symbols with certain sorts of shapes in response to certain sorts of shapes given me in the third batch. Unknown to me, the people who are giving me all of these symbols call the first batch "a script," they call the second batch a "story," and they call the third batch "questions." Furthermore, they call the symbols I give them back in response to the third batch "answers to the questions," and the set of rules in English that they gave me, they call "the program."

[...] it seems to me quite obvious in the example that I do not understand a word of the Chinese stories. I have inputs and outputs that are indistinguishable from those of the native Chinese speaker, and I can have any formal program you like, but I still understand nothing. [...]
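The procedure Searle describes — matching symbols entirely by their shapes and handing back whatever the rule book prescribes — can be caricatured as a lookup table. The following is a toy sketch, not Searle's actual setup: the rule table and its entries are invented for illustration, and the point is only that the operator function manipulates forms without any access to their meanings.

```python
# Hypothetical "program": a rule table pairing each incoming symbol string
# (a "question") with the symbol string the rules say to hand back (an
# "answer"). The operator matches entries purely by shape.
RULES = {
    "符号甲": "符号乙",  # arbitrary made-up symbol pairs
    "符号丙": "符号丁",
}

def follow_rules(question: str) -> str:
    """Return whatever shapes the rule book prescribes; understand nothing."""
    return RULES.get(question, "无")  # a default shape for unmatched input

print(follow_rules("符号甲"))
```

From the outside, such a system's input-output behavior could in principle match a competent speaker's; Searle's claim is that no amount of this formal shuffling amounts to understanding.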

The usual reply, known as the "systems reply," is that although Searle himself may not understand Chinese, the room as a whole — Searle plus the rule books and symbol batches — does.

See the [Chinese Room] entry of [The Internet Encyclopedia of Philosophy] for more discussion of this problem, including Searle's responses to common objections.

Several years ago I thought a lot about the Chinese Room argument and wrote several objections. Now I think that Searle's real objection is that the room does not have a human experience of "understanding written Chinese". I don't think that abstract/symbolic systems will have experiences in exactly the same way that biological systems do. Perhaps the machines will someday wonder how the human brain (an unstable fuzzy electro-chemical system) could possibly experience the quantum-fractally-dynamic-vortex-mesh of Reality-3 (or something even stranger than that). --CliffordAdams

