totally true

Warning: grumping about something really abstruse incoming.

Now, I totally agree that the Turing test, as it’s conventionally depicted in media, is humanocentric and not a great way to assess self-awareness. What does the ability to impersonate a human actually tell us?

The trouble is, the way the Turing test is conventionally depicted in media is nothing like how it was originally formulated.

The way it’s typically shown, a human talks to an unknown third party and tries to guess whether that party is a human or a computer.

As it was originally formulated, however, the test is more like a game.

In his 1950 paper “Computing Machinery and Intelligence”, Turing outlines what he calls “the Imitation Game”. The game involves two parties, A and B, and an interrogator with whom both parties can communicate via teletype (or, in modern terms, by text-based IM).

A and B cannot communicate directly with each other, but each can ask the interrogator questions about the other, or request that the interrogator relay questions on their behalf.

(The interrogator is, of course, under no obligation to relay such questions accurately, nor to honestly report the other party’s responses.)

The interrogator’s role is to determine some specific fact about the identities of A and B. A’s goal is to assist the interrogator in coming to the correct conclusion, while B’s goal is to trick the interrogator into guessing wrongly.

The interrogator does not know which party is the helpful one and which party is the deceptive one.

In the basic form of the Imitation Game as described by Turing, one of the parties is a man and the other is a woman, and the interrogator is tasked with correctly determining their respective genders. In the modified version known as the Turing test, one party is human and the other is a computer. It’s typically further stipulated that the computer must take the deceptive role.

(The arguments for why the computer must be the deceptive one are long and complicated and not worth getting into here - just go with it for now.)
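If it helps to see the communication rules laid out explicitly, here’s a minimal sketch in Python. To be clear, everything in it - the Party and Interrogator classes, the ask/relay/verdict names - is my own illustrative framing, not anything from Turing’s paper; the only rules it encodes are that A and B can only reach each other through the interrogator, and that the interrogator is free to relay things inaccurately.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the Imitation Game's communication structure.
# All names here are hypothetical; the rules encoded are just:
#  - A and B talk only to the interrogator, never to each other directly,
#  - the interrogator may relay messages between them, accurately or not.

@dataclass
class Party:
    name: str              # "A" or "B"
    is_deceptive: bool     # A tries to help the interrogator, B tries to mislead

@dataclass
class Interrogator:
    transcript: list = field(default_factory=list)

    def ask(self, party: Party, question: str) -> str:
        """Question a party directly over the text-only channel."""
        self.transcript.append((party.name, question))
        return f"{party.name}'s reply to: {question!r}"   # stand-in for a real answer

    def relay(self, frm: Party, to: Party, message: str, faithfully: bool = True) -> str:
        """Pass a message from one party to the other -- optionally distorting it,
        since the interrogator is under no obligation to relay accurately."""
        delivered = message if faithfully else "(something else entirely)"
        self.transcript.append((f"{frm.name}->{to.name}", delivered))
        return self.ask(to, delivered)

    def verdict(self) -> str:
        """Guess the relevant fact about A's and B's identities."""
        return "A is the human"   # placeholder for the interrogator's actual judgement

# Usage: A is helpful, B is deceptive; the two never talk directly.
a, b = Party("A", is_deceptive=False), Party("B", is_deceptive=True)
judge = Interrogator()
judge.ask(a, "What does B claim to be?")
judge.relay(b, a, "Ask A whether it dreams.", faithfully=False)
print(judge.verdict())
```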

Basically, in its proper form, the Turing test isn’t merely testing the computer’s ability to impersonate a human. It’s also testing for empathy, theory of mind, and the capacity for social manipulation.

Successfully impersonating a human would, at most, let the computer win 50% of the time, since the interrogator would be reduced to blind guessing; to win consistently, the computer must additionally work out what the interrogator thinks a computer would act like, then steer the interrogator toward interpreting the human party’s responses as fitting that profile.
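That 50% figure is just the blind-guessing baseline. A quick back-of-the-envelope simulation (entirely my own construction, with made-up function names) shows why impersonation alone can’t push past it: if the interrogator learns nothing and guesses uniformly at random, the deceptive party wins about half the time, no more.

```python
import random

def blind_guess_win_rate(trials: int = 100_000) -> float:
    """If the interrogator can't tell A from B and guesses at random,
    how often does the deceptive party (the computer) win?"""
    wins = 0
    for _ in range(trials):
        computer_is = random.choice(["A", "B"])   # where the computer actually sits
        guess = random.choice(["A", "B"])         # interrogator's blind guess
        if guess != computer_is:                  # computer wins when the guess is wrong
            wins += 1
    return wins / trials

print(f"deceptive party's win rate under blind guessing: {blind_guess_win_rate():.3f}")
# prints something close to 0.500 -- to do better, the computer has to actively
# shape how the interrogator reads the *human* party's answers.
```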

I don’t know about you, but I’d be pretty impressed at a computer that could pull all that off.