An AI with 30 Years’ Worth of Knowledge Finally Goes to Work Source: Will Knight
Having spent the past 31 years memorizing an astonishing collection of general knowledge, the artificial-intelligence engine created by Doug Lenat is finally ready to go to work.
Lenat’s creation is Cyc, a knowledge base of semantic information designed to give computers some understanding of how things work in the real world.
Cyc has been given many thousands of facts, including lots of information that you wouldn’t find in an encyclopedia because it seems self-evident. It knows, for example, that Sir Isaac Newton is a famous historical figure who is no longer alive. But more important, Cyc also understands that if you let go of an apple it will fall to the ground; that an apple is not bigger than a person; and that a person cannot throw an apple into space.
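The flavor of such assertions can be sketched in miniature. The snippet below is a hypothetical illustration in Python, not Cyc’s actual CycL representation; the predicate names and the `ask` helper are invented for the example. The key design point it preserves is that an unknown query returns "unknown" rather than "false," since a knowledge base is always incomplete.

```python
# Hypothetical sketch of common-sense facts as logical assertions,
# in the spirit of (but not the syntax of) a knowledge base like Cyc.
facts = {
    ("isa", "IsaacNewton", "FamousHistoricalFigure"),
    ("alive", "IsaacNewton", False),
    ("bigger_than", "Person", "Apple"),
    ("falls_when_released", "Apple", True),
}

def ask(predicate, *args):
    """Answer a simple ground query against the fact store."""
    for fact in facts:
        if fact[0] == predicate and fact[1:len(args) + 1] == args:
            # Facts may carry a truth value as their last element.
            return fact[-1] if len(fact) > len(args) + 1 else True
    return None  # unknown, not false: the knowledge base is incomplete

print(ask("alive", "IsaacNewton"))            # False
print(ask("bigger_than", "Person", "Apple"))  # True
print(ask("alive", "AdaLovelace"))            # None (not asserted)
```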
And now, after years of work, Lenat’s system is being commercialized by a company called Lucid.
“Part of the reason is the doneness of Cyc,” explains Lenat, who left his post as a professor at Stanford to start the project in late 1984. “Not that there’s nothing else to do,” he says. But he notes that most of what is left to be added is relevant to a specific area of expertise, such as finance or oncology.
Among other projects, the company is developing a personal assistant equipped with Cyc’s general knowledge. This could perhaps lead to something similar to Siri but less predisposed to foolish misunderstandings.
Michael Stewart, a longtime collaborator of Lenat’s and the CEO of Lucid, says the new company is in talks with various others interested in using the Cyc knowledge base. Lucid has been working with the Cleveland Clinic, for example, to help automate the process of finding patients for clinical studies. This involved adding new information to the Cyc knowledge base and building a new front-end interface that allows doctors to input natural-language queries such as “Find patients with bacteria after a pericardial window.” The system should not only find the right candidate patients but also provide a clear chain of logical reasoning for why it selected them.
Stewart says the company is also working with banks and financial firms to develop similar solutions for uncovering investment insights and insider trading. The system spotted one potential case of insider trading when it learned, from an organizational chart, that two people had sat next to each other several years earlier, because it understood that people who sit next to each other know each other.
In each case, the AI faces a small learning curve. “We interview subject-matter experts, and also peruse documentation of the company, or medical-history documents,” Stewart says. “We ingest that knowledge into Cyc much like you would with a human.”
The fact that Cyc is now being commercialized might raise a few eyebrows. The project has spent so long in gestation that it has often seemed as though it might never reach the market (see “The Cost of Common Sense”).
Besides, hard-coding rules and logic into an AI is quite an old-fashioned approach. In recent years, machine learning and especially neural networks have come to dominate the field, thanks to sudden leaps in performance made possible by better algorithms, more powerful hardware, and huge amounts of training data (see “10 Breakthrough Technologies 2013: Deep Learning”). Google’s impressive Go-playing program AlphaGo, for example, mastered the impossibly complex and abstract game using various machine-learning tricks (see “Google’s AI Is Battering One of the World’s Top Go Players in Style”).
But deep learning is not good at imbuing machines with anything like common sense, which many see as an important shortcoming. Lenat certainly believes that advances in machine learning and deep learning will be flawed without some hand-coded knowledge. “It’s fine to say we’ll have programs that excel at checkers and chess and Go,” he says. “But that’s very different from saying those programs will be able to have prolonged conversations that cause you to make decisions involving human life.”
Gary Marcus, a professor of psychology and neural science at New York University and the cofounder of an AI company called Geometric Intelligence, says Lucid is interesting because it aims to address some of the shortcomings of popular approaches. “Cyc has a reputation for being unwieldy, and for the last decade hardly anything has been said about it publicly,” Marcus says. “At the same time, it represents an approach that is very different from all the deep-learning stuff that has been in the news.”
Marcus agrees that recent advances, which have enabled computers to process images and audio with human-like skills, are somewhat limited. “Deep learning is mainly about perception,” he says, “but there is a lot of inference involved in everyday human reasoning, and Cyc represents a serious effort to grapple with the subtlety of that inference. I don’t know what will emerge, but I am eager to see.”