I’ve been reading a lot about Alan Turing recently, so the Turing Test came to mind: “a human judge engages in a natural language conversation with a human and a machine designed to generate performance indistinguishable from that of a human being... If the judge cannot reliably tell the machine from the human, the machine is said to have passed the test” (Wikipedia).
So how would the machine deal with this? We already know the answer thanks to Siri, the phone’s speak-your-answer service. The machine would google the day’s date, discover nothing very significant and conclude, if it has been well programmed, that the other meaning applies.
Or perhaps it would be smarter? It wouldn’t bother googling at all, because on most days of the year the second meaning is the probable one. But even if the computer always matched a human answer to questions like this, its mode of operation would always be different. For us, “What does it mean to be British today?” is not a question about knowledge but one of identity: an emotional engagement with a concept, “Britishness”, that is undefinable yet understood by all.

I’ve always felt that the Turing Test misses the point. We should like to know (or some of us would) how the human mind comes up with its answers. We know the computer does it differently, so developing computer power to the point where it can pull off this party trick isn’t the point.