


Artificial Intelligence Has a ‘Sea of Dudes’ Problem
Source: Jack Clark



Fei-Fei Li.
Photographer: Jeff Chiu/AP Photo

Earlier this month, Bill Gates took the stage at the Recode conference to talk about philanthropy with his wife, Melinda. They discussed mobile payments, contraception, and billionaires giving away their fortunes. Then the conversation turned to artificial intelligence, and Gates grinned and swiveled in his giant red leather chair. "Certainly, it's the most exciting thing going on," he said. "It's the Holy Grail. It's the big dream that anybody who's ever been in computer science has been thinking about."

Melinda patiently waited for her husband to finish extolling the virtues of machines that can solve problems scientists haven't programmed them to know. Then it was her turn. "The thing I want to say to everybody in the room is: We ought to care about women being in computer science," she said. "You want women participating in all of these things because you want a diverse environment creating AI and tech tools and everything we're going to use." She noted that just 17 percent of computer science graduates today are women, down from a peak of 37 percent.

The figures are actually worse in AI. At one of 2015's biggest artificial intelligence conferences—NIPS, held in Montreal—just 13.7 percent of attendees were women, according to data the conference organizers shared with Bloomberg.

That's not so surprising, given how few women there are in the field, said Fei-Fei Li, who runs the computer vision lab at Stanford University. Among the Stanford AI lab's 15 researchers, Li is the only woman. She's also one of only five women professors of computer science at the university. "If you were a computer and read all the AI articles and extracted out the names that are quoted, I guarantee you that women rarely show up," she said. "For every woman who has been quoted about AI technology, there are a hundred more times men were quoted."

Much has been made of the tech industry's lack of women engineers and executives. But there's a unique problem with homogeneity in AI. To teach computers about the world, researchers have to gather massive data sets of almost everything. To learn to identify flowers, you need to feed a computer tens of thousands of photos of flowers so that when it sees a photograph of a daffodil in poor light, it can draw on its experience and work out what it's seeing.

If these data sets aren't sufficiently broad, then companies can create AIs with biases. Speech recognition software with a data set that only contains people speaking in proper, stilted British English will have a hard time understanding the slang and diction of someone from an inner city in America. If everyone teaching computers to act like humans is a man, then the machines will have a view of the world that's narrow by default and, through the curation of data sets, possibly biased.
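To make the mechanism concrete, here is a minimal, hypothetical Python sketch (illustrative only; the groups, numbers, and model are invented for this example, not drawn from the article or any company's system) of how a training set dominated by one group pulls a simple model toward that group:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical speakers: group A dominates the training data, 950 samples to 50.
    # Each sample is a 2-D feature vector drawn from a group-specific distribution.
    group_a = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(950, 2))
    group_b = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(50, 2))

    # A nearest-centroid "model" fit on the pooled data: the centroid lands
    # almost on top of the majority group.
    centroid = np.vstack([group_a, group_b]).mean(axis=0)

    def mean_distance(samples):
        # Average distance from each sample to the model's centroid:
        # a rough proxy for how well the model "fits" that group.
        return np.linalg.norm(samples - centroid, axis=1).mean()

    print("fit to group A (majority):", round(mean_distance(group_a), 2))
    print("fit to group B (minority):", round(mean_distance(group_b), 2))

Run it and the model sits far closer to the majority group's samples; swap the abstract groups for accents or dialects and you get the speech recognition failure described above.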

"I call it a sea of dudes," said Margaret Mitchell, a researcher at Microsoft. Mitchell works on computer vision and language problems, and is a founding member—and only female researcher—of Microsoft's "cognition" group. She estimates she's worked with around 10 or so women over the past five years, and hundreds of men. "I do absolutely believe that gender has an effect on the types of questions that we ask," she said. "You're putting yourself in a position of myopia."

There have already been embarrassing incidents based on incomplete or flawed data sets. Google developed an application that mistakenly tagged black people as gorillas, and Microsoft built a chatbot that ultimately reflected the worst the internet had to offer.

"From a machine learning perspective, if you don't think about gender inclusiveness, then oftentimes the inferences that get made are biased towards the majority group—in this case, affluent white males," said Margaret Burnett, a professor at Oregon State University's School of Electrical Engineering and Computer Science. Burnett developed GenderMag, which helps software developers build systems that account for the differences in gender of their users. Microsoft has been experimenting with the software, she said. She's also investigated how machine learning systems suffer if the designers don't properly account for gender. "If un-diverse stuff goes in, then closed-minded, inside-the-box, not-very-good results come out," she said.

Lili Cheng.
Photographer: J. Countess/Getty Images

Tay, Microsoft's chatbot released earlier this year, had a lot of un-diverse stuff going in. Within 24 hours of being exposed to the public, Tay took on a racist, sexist, homophobic personality. It did that because internet users realized that Tay would learn from its interactions, so they tweeted insulting, racist, nasty things at it. Tay incorporated that language into its mental model and started spewing out more of the same.

Companies like Microsoft are grappling with how to assemble better, more diverse data sets. "How do we make sure that the data sets we're training with think about gender?" asked Lili Cheng, who led the Microsoft team that developed Tay. "The industry as a whole, ourselves included, need to do a better job of classifying gender and other diversity signals in training data sets."

There's already evidence that gender disparity has crept into AI job listings. Textio is a startup that helps companies change job-posting language to increase the number and diversity of people who apply. It analyzed 1,700 AI employment ads and compared them with over 70,000 listings spread across six other typical IT roles. The analysis found that AI job ads tend to be written in a highly masculine way, relative to other jobs.

An ad from Amazon.com for a software development engineer merited a high masculine score because it uses language commonly associated with men, like "coding ninja," "relentlessly" and "fearlessly," said Textio CEO Kieran Snyder. Those words tend to lead to fewer women applying, and the ad also lacked any kind of equal opportunity statement, she said. "It's amazing how many companies are, on the one hand, disappointed with the representation of women in these roles and, on the other hand, happily pushing out hiring content like this," said Snyder, who is a former Amazon employee. Amazon declined to comment.

"Everybody has a bit of their own bias," said Katherine Heller, an executive director of Women in Machine Learning, a decade-old group dedicated to improving the gender diversity in AI. The organization hosts talks and presentations by female researchers, and also has a public directory of several hundred women working in machine learning, giving people a way to reach out to women in the community. "Some of the cultural issues that play into women not being involved in the field could also lead to important questions not being asked in terms of someone's research agenda."


Some women in AI are focused on the next generation. Stanford's Li started a group for 10th grade girls called SAILORS that pairs intensive study with company field trips and mentoring. Not wanting to miss an opportunity to conduct research, Li did a study of the program and found that girls who attended it showed a statistically significant increase in technical knowledge, confidence, and interest in pursuing careers in AI. Chelsea Finn, a Ph.D. student at the University of California at Berkeley, said the most helpful thing for her is to see visible female role models.

That's something women working in AI are acutely aware of. "By increasing my own visibility, hopefully I get more high schoolers and undergraduates interested in this," Mitchell said.


