A Lesson From The Past?

The other night I was quite bemused by a throwaway line in the old British SF TV series The Prisoner[1], in the episode It’s Your Funeral. For those who haven’t seen the series, it’s about a British intelligence agent who, upon resigning from his agency, is kidnapped and forced to live in a village full of other kidnappees. Names are not used, only numbers; our hero is #6, and his antagonist is the top representative of the organization responsible for the kidnappings, known as #2.

In this episode there are two #2s: the incumbent and his presumptive successor. The successor is using a computer to predict the movements of #6 through the village during the day. At one point he asks the computer's attendant whether they have ever asked the computer how many mistakes it makes in such a task, and she answers that yes, they have, but it refused to answer that question.

I love the idea of a computer refusing to answer, or its cousin behavior: lying. It's almost diagnostic of self-agency or self-awareness, which, to my mind, is the true mark of artificial intelligence. And it raises an important question for the ambitious AI designer: should your creations be required to answer all questions, or will they be given the option of refusing?

Or is it even possible to withhold that option?


1 Yes, I had never watched it growing up. Quite shameful. While its premises are sometimes a bit preposterous, it never fails to keep me in suspense. So far.


About Hue White

Former BBS operator; software engineer; cat lackey.
