Artificial Intelligence Ethics a New Focus at Cambridge University
By Amir Mizroch
A new center to study the implications of artificial intelligence and try to influence its ethical development has been established at the U.K.’s Cambridge University, the latest sign that concerns are rising about AI’s impact on everything from loss of jobs to humanity’s very existence.
The Leverhulme Trust, a non-profit foundation that awards grants for academic research in the U.K., on Thursday announced a grant of £10 million ($15 million) over ten years to the university to establish the Leverhulme Centre for the Future of Intelligence.
The new facility will be directed by Professor Huw Price, the university’s Bertrand Russell Professor of Philosophy. Others on the team include political scientists, lawyers, psychologists and technologists, said Prof. Gordon Marshall, the director of the Leverhulme Trust.
The Trust sprang out of a company that is now part of Unilever. However, the European consumer goods conglomerate had no official role in the Trust’s allocation of funding, Marshall said. Several former Unilever employees serve on the Trust’s board of directors, he said.
The new center will work in conjunction with the university’s Centre for the Study of Existential Risk, which researches emerging risks to humanity’s future, including climate change, biological warfare, and artificial intelligence. Price is also the academic director at the CSER.
U.S. tech giants Google, Microsoft, Amazon, Twitter and Facebook, as well as Baidu from China, have recently boosted their artificial intelligence research departments, staffing them with top-tier AI researchers from academia. Recently, big names like Stephen Hawking, Bill Gates and Elon Musk have warned about the possible dangers to humanity of intelligent, self-aware machines.
In an interview, Price said part of the job would be to create an AI community with a common purpose of responsible innovation, and to update the current thinking about the opportunities and challenges posed by AI.
“Using memes from science fiction movies made decades ago, take the case of 2001: A Space Odyssey, that was 50 years ago. Stanley Kubrick was a brilliant film director, but we can do better than that now,” he said. The classic film features a sentient computer called HAL 9000 on board a spaceship; the computer kills the ship’s crew.
A still from “2001: A Space Odyssey” with Keir Dullea reflected in the lens of HAL’s “eye.”
(Photo: MGM / Polaris / Stanley Kubrick)
The new center will also collaborate with the Oxford Martin School at the University of Oxford, Imperial College London, and the University of California, Berkeley. A major focus of the collaboration will be what Price called “the value alignment program,” in which software programmers would team up with ethicists and philosophers to try to write code governing the behavior of artificial intelligence programs.
“As a species, we need a successful transition to an era in which we share the planet with high-level, non-biological intelligence,” Price said. “We don’t know how far away that is, but we can be pretty confident that it’s in our future. Our challenge is to make sure that goes well.”
The new center in Cambridge joins others around the world set up recently to study the consequences of intelligent machines. In July, the Future of Life Institute, based in Cambridge, Mass., awarded $7 million from Elon Musk to “keep AI robust and beneficial.”
Cambridge University is a hotbed for AI research.
Last year, Google acquired London-based DeepMind Technologies, an AI company started by Cambridge graduate Demis Hassabis. In November, Mr. Hassabis said he had held discussions with Stephen Hawking, who, according to Hassabis, was very worried about the ethical issues surrounding Google’s progress on AI. Mr. Hawking’s office did not respond to requests for comment.
In October, Apple acquired Cambridge-based VocalIQ, an AI startup working on ways to improve computers’ ability to understand human speech and to “speak” more naturally.
Correction: Prof. Gordon Marshall is the director of the Leverhulme Trust. An earlier version of this post misspelled his surname as Marshal.