Teaching a Computer Not to Forget
Source: Adrienne LaFrance



Imagine if every time you learned something new, you completely forgot how to do a thing you'd already learned.

Finally figured out that taxi-hailing whistle? Now you can't tie your shoes anymore. Learn how to moonwalk; forget how to play the violin. Humans do forget skills, of course, but it usually happens gradually.

Computers forget what they know more dramatically. Learning cannibalizes knowledge. As soon as a new skill is learned, old skills are crowded out. It's a problem computer scientists call "catastrophic forgetting." And it happens because computer brains often rewire themselves, forging new and different connections across neural pathways, every time they learn. This makes it hard for a computer not only to retain old lessons, but also to learn tasks that require a sequence of steps.
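To make the idea concrete, here is a minimal sketch of catastrophic forgetting in a toy setting: a single tiny classifier is trained on one task, then on a second, unrelated task, and its skill on the first task erodes. The tasks, the model, and the learning rate are illustrative assumptions, not the systems discussed in this article.

```python
# A minimal sketch of catastrophic forgetting, assuming a toy setup: one tiny
# linear classifier trained with gradient descent, first on Task A (label =
# sign of feature 0), then on Task B (label = sign of feature 1). The tasks,
# model, and learning rate are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def make_task(n, axis):
    """Synthetic binary task: the label depends only on the sign of one feature."""
    X = rng.normal(size=(n, 2))
    y = (X[:, axis] > 0).astype(float)
    return X, y

def train(w, b, X, y, lr=0.5, epochs=500):
    """Plain full-batch logistic-regression updates; returns updated parameters."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def accuracy(w, b, X, y):
    return np.mean(((X @ w + b) > 0) == (y > 0.5))

Xa, ya = make_task(500, axis=0)                  # Task A
Xb, yb = make_task(500, axis=1)                  # Task B

w, b = np.zeros(2), 0.0
w, b = train(w, b, Xa, ya)
print("Task A accuracy after learning A:", accuracy(w, b, Xa, ya))

w, b = train(w, b, Xb, yb)                       # keep training, but only on Task B
print("Task A accuracy after learning B:", accuracy(w, b, Xa, ya))  # typically drops
print("Task B accuracy after learning B:", accuracy(w, b, Xb, yb))
```

Run as written, accuracy on Task A is typically near perfect after the first phase and falls noticeably after the second, even though nothing ever told the model to discard what it had learned.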

"Researchers will need to solve this problem of catastrophic forgetting for us to get anywhere in terms of producing artificially intelligent computers and robots," said Jeff Clune, an assistant professor of computer science at the University of Wyoming. "Until we do, machines will be mostly one-trick ponies."

Catastrophic forgetting also stands in the way of one of the long-standing goals for artificial intelligence: to create computers that can compartmentalize different skills in order to solve diverse problems.

So what would it take for a computer brain to retain what it knows, even as it learns new things? That was the question Clune had when he and his colleagues set out to make an artificial brain act more like a human one. Their central idea: see if you can get a computer to organize, and preserve, what it knows within distinct modules of its brain, rather than overwriting that knowledge every time it learns something new.

"Biological brains exhibit a high degree of modularity, meaning they contain clusters of neurons with high degrees of connectivity within clusters, but low degrees of connectivity between clusters," the team explained in a video about their research, which was published last week in the journal PLOS ONE.

In humans and animals, brain modularity evolved as the optimal way to organize neural connections. That's because natural selection arranges the brain to minimize the costs associated with building, maintaining, and housing broader connections.
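One way to see how a pressure toward cheap wiring can favor sparse, clustered networks is to fold a "connection cost" into the objective a network is optimized against. The sketch below is a deliberately simplified, gradient-style analogue of that idea; the study itself used evolutionary optimization, and the penalty term and its weight here are assumptions for illustration.

```python
# A simplified sketch of the connection-cost idea, assuming a gradient-style
# objective rather than the evolutionary setup used in the study: the loss is
# the task error plus a penalty on total connection strength, so links that do
# not earn their keep are pushed toward zero.
import numpy as np

def loss_with_connection_cost(w, X, y, cost_per_connection=0.01):
    """Logistic task loss plus an L1 'connection cost' on the weights."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))                      # predicted probabilities
    task_loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    wiring_cost = cost_per_connection * np.sum(np.abs(w))   # price per unit of connection
    return task_loss + wiring_cost

# Illustrative usage on synthetic data where only the first feature matters.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
y = (X[:, 0] > 0).astype(float)
w = rng.normal(size=5) * 0.1
print(loss_with_connection_cost(w, X, y))
```

Because every connection now carries a price, weights that do not help the task are pushed toward zero when this loss is minimized, a rough analogue of the wiring-cost pressure the researchers describe.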

"It is an interesting question as to how evolution solved this problem," Clune told me. "How did it figure out how to allow animals, including us, to learn a new skill without overwriting the knowledge of a previously learned skill?"

