Saturday, May 24, 2008

Commonsense morality?

I was playing 20 questions with an internet program that learns from
previous games (www.20q.net) and tried to see if it could guess that I was
thinking of morality. It didn't, but two things were interesting: how my
answers contradicted those of previous players, and the list of similar objects it produced:


"Does it have a spine? You said No, 20Q was taught by other players that the answer is Yes.
Can you control it? You said No, 20Q was taught by other players that the answer is Yes.
Do you know any songs about it? You said No, 20Q was taught by other players that the answer is Yes.
Does it deal with imagination? You said No, 20Q was taught by other players that the answer is Yes.
Does it come in many varieties? You said No, 20Q was taught by other players that the answer is Yes.
Is it a part of something larger? You said No, 20Q was taught by other players that the answer is Yes.
Contradictions Detected
The opinions of the 20Q A.I. are its own, and are based on the input of players. 20Q's answers reflect common knowledge. If you feel that 20Q is in error, the only way to correct it is to play again!


Similar Objects
an hour (time), sympathy, a yawn, patience, logic, honesty, truth, anorexia, procrastination, time, a sneeze, boredom."


1 comment:

Anonymous said...

I tried "oatmeal" and it guessed right.