After playing with it for a while, it's clear that it's a natural language interface to a classic database of common sense info. If there's no exact match in the database, you get nothing.
There's no inference, just connections. I've been trying variations on "PersonX gets in car, PersonY gets in car, PersonX drives to mall." to see if I can get it to deduce that PersonY is at the mall, but it can't take that leap. I tried "PersonX puts apple in box, PersonX takes apple out of box, PersonX looks in box.", which is rejected. So it doesn't understand conservation.
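To make the distinction concrete, here's a toy sketch (not the system under discussion, and the fact/rule encoding is invented) of the kind of inference the car test is probing for. A pure lookup table has no way to derive "PersonY is at the mall"; a single location-transfer rule does:

```python
# Facts as (subject, relation, object) triples -- a made-up encoding.
facts = {("PersonX", "in", "car"), ("PersonY", "in", "car")}

def drive(facts, vehicle, place):
    """Location-transfer rule: everyone in the vehicle ends up at the destination."""
    riders = {s for (s, r, o) in facts if r == "in" and o == vehicle}
    return facts | {(p, "at", place) for p in riders}

facts = drive(facts, "car", "mall")
print(("PersonY", "at", "mall") in facts)  # True -- the leap the demo can't take
```

A database lookup would return a hit only if "PersonY at mall" were already stored; the rule derives it from what is.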
Whenever this topic comes up, I simply have to point to the paper with the best title in history: Lieberman's "How to Wreck a Nice Beach You Sing Calm Incense".
In the early days of Cyc, Doug Lenat used to give talks where he would put up a slide titled "How to Wreck a Nice Beach" and then he'd remain silent for a few seconds. It was always amusing to watch the audience try to puzzle out the title, and finally there'd be this wash of amusement on their faces when it clicked.
Is there an AI out there that asks questions yet? You know, like a child? It seems like creating a massive database of all the "common sense" could be done fairly easily if the AI could just ask for help.
Cyc has taken a few different swings at this kind of thing throughout its history. Its current long-term plan, not mentioned in the article, is to reach a "critical mass" of common sense to where it can "question" the open-web and assimilate new information without help, which is along the same lines.
Isn't this what John McCarthy called "non-monotonic reasoning"? I wonder if they still call it that in the literature. It always seemed to me that semantic networks would solve this problem: "match" and "wood" are not that far from "fire" in the network. Anyway, I'm sure there's a reason this kind of GOFAI never worked.
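The semantic-network intuition can be sketched in a few lines: treat concepts as nodes, associations as edges, and measure commonsense "closeness" as graph distance. The edges below are invented for illustration, not drawn from any real knowledge base:

```python
from collections import deque

# Toy semantic network; edges are hand-picked examples.
edges = {
    "match": {"fire", "sulfur"},
    "wood": {"fire", "tree"},
    "fire": {"heat", "smoke"},
}

def distance(graph, start, goal):
    """Breadth-first search over an undirected view of the network."""
    adj = {}
    for a, neighbors in graph.items():
        for b in neighbors:
            adj.setdefault(a, set()).add(b)
            adj.setdefault(b, set()).add(a)
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, d = queue.popleft()
        if node == goal:
            return d
        for nxt in adj.get(node, ()) - seen:
            seen.add(nxt)
            queue.append((nxt, d + 1))
    return None  # no path

print(distance(edges, "match", "wood"))  # 2: match -> fire -> wood
```

Whether a hop count like this captures anything resembling common sense is, of course, exactly the open question.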
> Some researchers argue that in order to build real common sense into computers, we will need to make use of phenomena outside language itself, like visual perceptions or embodied sensations.
As in Merleau-Ponty's lived experience, which I understand has been taken up more recently under the label of embodied cognition.
> Some researchers argue that in order to build real common sense into computers, we will need to make use of phenomena outside language itself, like visual perceptions or embodied sensations.
ding ding ding
Of course, once we start creating systems that perceive and sense and reason about their perceiving and sensing experiences, ethics starts to become a greater concern. An entirely synthetic living creature is still a living creature.
This is way too much like Cyc.