The problem of incorporating human commonsense knowledge into a computer program and using that knowledge for intelligent tasks has been recognized as a central problem in artificial intelligence (AI) since the inception of the field. Solving it has turned out to be extremely difficult — so difficult that, in practice, AI applications succeed only insofar as they can avoid it. Nonetheless, it remains a holy grail for the field.
Three general approaches have been proposed for collecting commonsense knowledge: hand coding by experts, web mining, and crowdsourcing. The project reported in this paper combines web mining with crowdsourcing, specifically gamification. Candidate facts are automatically extracted from texts and presented to human subjects in the form of a game. Players are shown one candidate fact at a time and asked whether it is meaningful. To ensure participation, players earn points for answering; to ensure they answer honestly, the game mixes in facts whose labels are already known and assesses penalties when players get these wrong. The players' labels are then fed back into the web mining system as training data. This approach has a measurable positive impact on the quality of the web mining extraction. As of September 2011, the game on Facebook had attracted 1145 players, of whom 67 percent came back a second day and 36 percent came back for a third day or more.
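The scoring scheme described above — points for every answer, penalties for wrong answers on already-labeled control facts, and the remaining answers collected as training labels — can be sketched as follows. This is a minimal illustration under assumed point values; the fact strings, function names, and the choice of a 3-point penalty are all hypothetical, not the authors' actual implementation.

```python
# Hypothetical sketch of the game's scoring and label-collection loop.
# GOLD_FACTS holds control facts whose labels are already known; all
# other candidate facts come from the web mining extractor.
GOLD_FACTS = {
    "dogs can bark": True,    # illustrative gold facts, not from the paper
    "stones can fly": False,
}

def score_answer(fact, answer, points=1, penalty=3):
    """Return the point delta for a single answer.

    Any answer earns `points` (ensuring participation), but a wrong
    answer on a gold fact loses `penalty` (ensuring honesty).
    """
    if fact in GOLD_FACTS and answer != GOLD_FACTS[fact]:
        return -penalty
    return points

def collect_labels(facts, answers):
    """Pair each non-gold candidate fact with the player's label,
    to be fed back to the web mining system as training data."""
    return [(f, a) for f, a in zip(facts, answers) if f not in GOLD_FACTS]
```

The gold facts serve only as a quality filter; only answers on genuinely unlabeled candidates become training data for the extractor.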