AI and Common Sense

Common-Sense Reasoning or Common-Sense Experiences?


There has been a lot of discussion about AI and “common sense reasoning.” These discussions typically make the point that common sense reasoning is the key difference between AI and AGI. But I believe they are misleading in one important way: they imply that common sense reasoning requires specialized algorithms or processing strategies (such as various types of inferencing logic), and they propel the AI community to seek such processing strategies.

A more pragmatic view is that the brain (and any electronic AGI system) probably has a limited number of processing strategies, and that common sense reasoning emerges from a handful of standard techniques, such as inferencing mechanisms, operating on a vast store of knowledge. The store of knowledge provides the real foundation for common sense reasoning, not the algorithms. For example, common sense for a dolphin, living in the ocean, is vastly different from common sense for people. The effects of gravity are totally different; air and sea currents are different; visibility is different; the daily threats to life are different; and then there is echolocation, which we do not possess. It seems to me that dolphin brains probably use the very same processing strategies that we use, but apply those strategies to different stored experiences. So when we talk about common sense reasoning, we should really think about common sense experiences.
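To make the point concrete, here is a minimal toy sketch in Python, not from the original discussion: a single generic forward-chaining routine applied to two hypothetical experience stores. Every rule and fact name below is invented purely for illustration; the sketch only shows how the same small processing strategy can yield different “common sense” conclusions when the stored experiences differ.

# Toy illustration (hypothetical names throughout): the same inference loop,
# two different stores of experience.

def infer(facts, rules):
    """Repeatedly apply simple if-then rules until no new facts can be derived."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

# One shared "processing strategy": the rule format plus the infer() loop above.
RULES = [
    ({"dropped object", "gravity dominates"}, "object falls straight down"),
    ({"dropped object", "buoyancy dominates"}, "object drifts and sinks slowly"),
    ({"low visibility", "echolocation"}, "navigate by sound"),
    ({"low visibility", "vision only"}, "navigate by light and touch"),
]

# Two different stores of lived experience (entirely made-up facts).
human_experience = {"dropped object", "gravity dominates", "low visibility", "vision only"}
dolphin_experience = {"dropped object", "buoyancy dominates", "low visibility", "echolocation"}

print("human common sense:  ", sorted(infer(human_experience, RULES)))
print("dolphin common sense:", sorted(infer(dolphin_experience, RULES)))

The algorithm never changes between the two runs; only the stored experiences do, and that alone is enough to produce two very different bodies of “common sense.”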
