Bringing deep learning to life
Source: Kim Martineau


Gaby Ecanow loves listening to music, but never considered writing her own until taking 6.S191 (Introduction to Deep Learning). By her second class, the second-year MIT student had composed an original Irish folk song with the help of a recurrent neural network, and was considering how to adapt the model to create her own Louis the Child-inspired dance beats.

“It was cool,” she says. “It didn’t sound at all like a machine had made it.”

This year, 6.S191 kicked off as usual, with students spilling into the aisles of Stata Center’s Kirsch Auditorium during Independent Activities Period (IAP). But the opening lecture featured a twist: a recorded welcome from former President Barack Obama. The video was quickly revealed to be an AI-generated fabrication, one of many surprises that Alexander Amini ’17 and Ava Soleimany ’16 introduce throughout their for-credit course to make the equations and code come alive.

As hundreds of their peers look on, Amini and Soleimany take turns at the podium. If they appear at ease, it’s because they know the material cold; they designed the curriculum themselves, and have taught it for the past three years. The course covers the technical foundations of deep learning and its societal implications through lectures and software labs focused on real-world applications. On the final day, students compete for prizes by pitching their own ideas for research projects. In the weeks leading up to class, Amini and Soleimany spend hours updating the labs, refreshing their lectures, and honing their presentations.

A branch of machine learning, deep learning harnesses massive data and algorithms modeled loosely on how the brain processes information to make predictions. The class has been credited with helping to spread machine-learning tools into research labs across MIT. That’s by design, say Amini, a graduate student in MIT’s Department of Electrical Engineering and Computer Science (EECS), and Soleimany, a graduate student at MIT and Harvard University.

Both are using machine learning in their own research — Amini in engineering robots, and Soleimany in developing diagnostic tools for cancer — and they wanted to make sure the curriculum would prepare students to do the same. In addition to the lab on developing a music-generating AI, they offer labs on building a face-recognition model with convolutional neural networks and a bot that uses reinforcement learning to play the vintage Atari video game Pong. After students master the basics, those taking the class for credit go on to create applications of their own.
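
The face-recognition lab builds on convolutional neural networks, which stack convolution and pooling layers to distill an image into features that a final layer can classify. The outline below is a minimal illustrative sketch of that kind of model in TensorFlow/Keras, not the course’s actual lab code; the input size and layer widths are assumptions.

```python
# Minimal sketch (not the 6.S191 lab code) of a small convolutional
# network for binary face / not-face classification in TensorFlow/Keras.
# The 64x64 input size and layer widths are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers

def build_face_classifier(input_shape=(64, 64, 3)):
    inputs = tf.keras.Input(shape=input_shape)
    # Stacked convolution + pooling blocks extract visual features.
    x = layers.Conv2D(32, 3, activation="relu")(inputs)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(128, 3, activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    # Flatten to a feature vector and classify face vs. not-face.
    x = layers.Flatten()(x)
    x = layers.Dense(128, activation="relu")(x)
    outputs = layers.Dense(1, activation="sigmoid")(x)

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_face_classifier()
model.summary()
# Training would use a labeled dataset of face / non-face images, e.g.:
# model.fit(train_images, train_labels, epochs=10, validation_split=0.1)
```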

This year, 23 teams presented projects. Among the prize winners was Carmen Martin, a graduate student in the Harvard-MIT Program in Health Sciences and Technology (HST), who proposed using a type of neural net called a graph convolutional network to predict the spread of coronavirus. She combined several data streams: airline ticketing data to measure population fluxes, real-time confirmation of new infections, and a ranking of how well countries are equipped to prevent and respond to a pandemic.

“The goal is to train the model to predict cases to guide national governments and the World Health Organization in their recommendations to limit new cases and save lives,” she says.
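
A graph convolutional network updates each node’s features by mixing in the features of its neighbors through the graph’s adjacency structure, which is what makes it a natural fit for data linked by travel routes. As a rough, generic illustration of that core operation, the sketch below implements a single graph convolution layer in TensorFlow; it is not Martin’s model, and the node count, features, and adjacency values are made up, with nodes standing in for countries and edges for travel-based links.

```python
# Generic sketch of one graph convolution layer (not the prize-winning
# model). Nodes could represent countries and the adjacency matrix
# travel-based links; all shapes and values here are invented.
import numpy as np
import tensorflow as tf

class GraphConv(tf.keras.layers.Layer):
    """One GCN layer: H' = relu(A_hat @ H @ W)."""
    def __init__(self, units):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        # input_shape is [shape of H (N, F), shape of A_hat (N, N)].
        feature_dim = input_shape[0][-1]
        self.w = self.add_weight(shape=(feature_dim, self.units),
                                 initializer="glorot_uniform",
                                 trainable=True)

    def call(self, inputs):
        features, a_hat = inputs          # H: (N, F), A_hat: (N, N)
        return tf.nn.relu(a_hat @ features @ self.w)

# Toy graph: 5 nodes (e.g. countries) with 3 features each.
num_nodes, num_features = 5, 3
features = tf.random.normal((num_nodes, num_features))
adjacency = np.random.rand(num_nodes, num_nodes).astype("float32")
adjacency = (adjacency + adjacency.T) / 2 + np.eye(num_nodes, dtype="float32")
# Symmetric normalization: D^(-1/2) A D^(-1/2).
d_inv_sqrt = np.diag(adjacency.sum(axis=1) ** -0.5)
a_hat = tf.constant(d_inv_sqrt @ adjacency @ d_inv_sqrt)

layer = GraphConv(units=8)
print(layer([features, a_hat]).shape)     # (5, 8)
```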

A second prize winner, EECS graduate student Samuel Sledzieski, proposed building a model to predict protein interactions using only their amino acid sequences. Predicting protein behavior is key to designing drug targets, among other clinical applications, and Sledzieski wondered if deep learning could speed up the search for viable protein pairs.

“There’s still work to be done, but I’m excited by how far I was able to get in three days,” he says. “Having easy-to-follow examples in TensorFlow and Keras helped me understand how to actually build and train these models myself.” He plans to continue the work in his current lab rotation with Bonnie Berger, the Simons Professor of Mathematics in EECS and the Computer Science and Artificial Intelligence Laboratory (CSAIL).
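
His point about the TensorFlow and Keras examples is easy to see in miniature: a sequence-pair model can encode each protein’s amino acid sequence with a shared encoder and then score the pair. The sketch below is a generic illustration along those lines, not Sledzieski’s project; the maximum sequence length, vocabulary size, and layer sizes are all assumptions.

```python
# Rough sketch of a sequence-pair model that scores whether two proteins
# interact from amino acid sequences alone. This is a generic illustration,
# not Sledzieski's project; lengths and layer sizes are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model

MAX_LEN = 500   # assumed maximum (padded) sequence length
VOCAB = 21      # 20 amino acids plus a padding token

def sequence_encoder():
    """Shared encoder mapping an integer-encoded sequence to a vector."""
    seq = layers.Input(shape=(MAX_LEN,), dtype="int32")
    x = layers.Embedding(VOCAB, 32)(seq)
    x = layers.Conv1D(64, 9, activation="relu", padding="same")(x)
    x = layers.GlobalMaxPooling1D()(x)
    return Model(seq, x)

encoder = sequence_encoder()
protein_a = layers.Input(shape=(MAX_LEN,), dtype="int32")
protein_b = layers.Input(shape=(MAX_LEN,), dtype="int32")

# Encode both proteins with the same weights, then combine and classify.
combined = layers.Concatenate()([encoder(protein_a), encoder(protein_b)])
hidden = layers.Dense(64, activation="relu")(combined)
interacts = layers.Dense(1, activation="sigmoid")(hidden)

model = Model([protein_a, protein_b], interacts)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
# Training would need integer-encoded sequence pairs and 0/1 labels:
# model.fit([seqs_a, seqs_b], labels, epochs=5)
```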

Each year, students also hear about emerging deep-learning applications from companies sponsoring the course. David Cox, co-director of the MIT-IBM Watson AI Lab, covered neuro-symbolic AI, a hybrid approach that combines symbolic programs with deep learning’s expert pattern-matching ability. Alex Wiltschko, a senior researcher at Google Brain, spoke about using a network analysis tool to predict the scent of small molecules. Chuan Li, chief scientific officer at Lambda Labs, discussed neural rendering, a tool for reconstructing and generating graphics scenes. Animesh Garg, a senior researcher at NVIDIA, covered strategies for developing robots that perceive and act in more human-like ways.

With 350 students taking the live course each year, and more than a million people who have watched the lectures online, Amini and Soleimany have become prominent ambassadors for deep learning. Yet, it was tennis that first brought them together.

Amini competed nationally as a high school student in Ireland and built an award-winning AI model to help amateur and pro tennis players improve their strokes; Soleimany was a two-time captain of the MIT women’s tennis team. They met on the court as undergraduates and discovered they shared a passion for machine learning.

After finishing their undergraduate degrees, they decided to challenge themselves and fill what they saw as an increasing need at MIT for a foundational course in deep learning. 6.S191 was launched in 2017 by two grad students, Nick Locascio and Harini Suresh, but Amini and Soleimany had a vision for transforming the course into something more. They created a series of software labs, introduced new cutting-edge topics like robust and ethical AI, and added content to appeal to a broad range of students, from computer scientists to aerospace engineers and MBAs.

“Alexander and I are constantly brainstorming, and those discussions are key to how 6.S191 and some of our own collaborative research projects have developed,” says Soleimany.

They cover one of those research collaborations in class. During the computer vision lab, students learn about algorithmic bias and how to test for and address racial and gender bias in face-recognition tools. The lab is based on an algorithm that Amini and Soleimany developed with their respective advisors, Daniela Rus, director of CSAIL, and Sangeeta Bhatia, the John J. and Dorothy Wilson Professor of HST and EECS. This year they also covered hot topics in robotics, including recent work of Amini’s on driverless cars.
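
One simple way to surface the kind of bias that lab addresses is to report a face classifier’s accuracy per demographic subgroup rather than as a single aggregate number. The sketch below illustrates that audit on made-up predictions and group labels; it is not the debiasing algorithm Amini and Soleimany developed.

```python
# Hypothetical illustration of auditing a classifier for accuracy gaps
# across demographic groups (not the lab's debiasing algorithm).
import numpy as np

def accuracy_by_group(y_true, y_pred, groups):
    """Return classification accuracy for each demographic subgroup."""
    return {str(g): float(np.mean(y_true[groups == g] == y_pred[groups == g]))
            for g in np.unique(groups)}

# Toy data: true labels, model predictions, and a group tag per example.
y_true = np.array([1, 1, 0, 1, 0, 1, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 0, 0, 0])
groups = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])

print(accuracy_by_group(y_true, y_pred, groups))  # {'A': 0.75, 'B': 0.5}
# A large gap between groups signals the model needs rebalanced training
# data or an algorithmic fix before it is deployed.
```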

But they don’t plan to stop there. “We’re committed to making 6.S191 the best that it can be, each year we teach it,” says Amini, “and that means moving the course forward as deep learning continues to evolve.”

