Teaching algorithms not to discriminate
Source: Claire Cain Miller
Algorithms have become one of the most powerful arbiters in our lives. They make decisions about the news we read, the jobs we get, the people we meet, the schools we attend and the ads we see. Yet there is growing evidence that algorithms and other types of software can discriminate.
The people who write them incorporate their biases, and algorithms often learn from human behavior, so they reflect the biases we hold. For instance, research has shown that ad-targeting algorithms have displayed ads for high-paying jobs to men but not women, and ads for high-interest loans to people in low-income neighborhoods.
Cynthia Dwork, a computer scientist at Microsoft Research who has studied privacy and algorithm design, discussed these issues in an email interview.
Some people have argued that algorithms eliminate discrimination because they make decisions based on data, free of human bias. Others say algorithms reflect and perpetuate human biases. What do you think?
Algorithms do not automatically eliminate bias. Suppose a university, with admission and rejection records dating back for decades and faced with growing numbers of applicants, decides to use a machine learning algorithm that, trained on the historical records, identifies candidates who are likely to be admitted. Historical biases in the training data will be learned by the algorithm, and past discrimination will lead to future discrimination.
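To make that mechanism concrete, here is a minimal sketch in Python, not taken from the interview: the data is synthetic, the feature names and thresholds are invented, and the point is only that a model fit to decisions made under a biased standard reproduces that standard.

```python
# Hypothetical sketch: a classifier trained on historically biased admission
# decisions learns to reproduce the bias. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups with identical ability distributions.
group = rng.integers(0, 2, n)             # 0 or 1
test_score = rng.normal(0.0, 1.0, n)      # same distribution for both groups

# The historical committee admitted on test score, but held group 1 to a
# higher bar -- this is the bias baked into the training labels.
admitted = (test_score > np.where(group == 1, 0.5, 0.0)).astype(int)

# Train on the historical decisions.
X = np.column_stack([test_score, group])
model = LogisticRegression().fit(X, admitted)

# Two applicants with the same borderline score, differing only in group:
borderline = np.column_stack([np.full(2, 0.25), [0, 1]])
print(model.predict_proba(borderline)[:, 1])  # group 0 prob > group 1 prob
```

At the same test score, the learned model gives the group that was historically held to a higher bar a lower admission probability, because that is exactly what the training labels taught it.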
Are there examples of that happening?
A famous example of a system that has wrestled with bias is the resident matching program that matches graduating medical students with residency programs at hospitals. The matching could be slanted to maximize the happiness of the residency programs, or to maximize the happiness of the medical students. Prior to 1997, the match was mostly about the happiness of the programs.
This changed in 1997 in response to "a crisis of confidence concerning whether the matching algorithm was unreasonably favorable to employers at the expense of applicants, and whether applicants could 'game the system,' " according to a paper by Alvin Roth and Elliott Peranson published in the American Economic Review.
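The mechanics behind that shift can be seen in a toy version of deferred acceptance. This is a simplified sketch, not the actual Roth-Peranson matching code; the student and program names and preferences are invented. The point it illustrates is that whichever side proposes gets its best stable outcome.

```python
# Minimal deferred-acceptance (Gale-Shapley) sketch: the proposing side
# receives its most favorable stable matching.
def deferred_acceptance(proposer_prefs, reviewer_prefs):
    """Return a stable matching as {proposer: reviewer}, favoring proposers."""
    free = list(proposer_prefs)                 # proposers with no tentative match
    next_choice = {p: 0 for p in proposer_prefs}
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    engaged = {}                                # reviewer -> proposer

    while free:
        p = free.pop(0)
        r = proposer_prefs[p][next_choice[p]]   # best reviewer p has not yet tried
        next_choice[p] += 1
        if r not in engaged:
            engaged[r] = p                      # reviewer tentatively accepts
        elif rank[r][p] < rank[r][engaged[r]]:  # reviewer prefers the new proposer
            free.append(engaged[r])
            engaged[r] = p
        else:
            free.append(p)                      # rejected; p will try its next choice
    return {p: r for r, p in engaged.items()}


students = {"ann": ["mercy", "city", "state"],
            "bob": ["city", "state", "mercy"],
            "cal": ["state", "mercy", "city"]}
programs = {"mercy": ["bob", "cal", "ann"],
            "city":  ["cal", "ann", "bob"],
            "state": ["ann", "bob", "cal"]}

# Student-proposing match: every student gets a first choice here.
print(deferred_acceptance(students, programs))
# Program-proposing match: same preferences, but now every program wins.
print(deferred_acceptance(programs, students))
```

With these preferences both runs produce stable matchings, yet the student-proposing run gives every student a first choice while the program-proposing run gives every program its first choice; the pre-1997 design was, in spirit, closer to the latter.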
Another recent example of the problem came from Carnegie Mellon University, where researchers found that Google's advertising system showed an ad for a career coaching service for "$200k+" executive jobs to men much more often than to women.
The paper is very thought-provoking. I am currently collaborating with the authors and others to consider the differing legal implications of several ways in which an advertising system could give rise to these behaviors.
The law protects certain groups from discrimination. Is it possible to teach an algorithm to do the same?
This is a relatively new problem area in computer science, and there are grounds for optimism: see, for example, the Fairness, Accountability and Transparency in Machine Learning workshop, which considers the role that machines play in consequential decisions in areas like employment, health care and policing. This is an exciting and valuable area for research.
You have written that ideally a regulatory body or civil rights organization would impose rules governing these issues. The tech world is notoriously resistant to regulation, but do you believe it might be necessary to ensure fairness in algorithms?
Yes, just as regulation currently plays a role in certain contexts, such as advertising jobs and extending credit.
Should computer science education include lessons on how to be aware of these issues and the various approaches to addressing them?
Absolutely! First, students should learn that design choices in algorithms embody value judgments and therefore bias the way systems operate. They should also learn that these things are subtle: For example, designing an algorithm for targeted advertising that is gender-neutral is more complicated than simply ensuring that gender is ignored. Techniques for addressing these kinds of issues should be quickly incorporated into curriculums as they are developed.
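Her point that ignoring gender is harder than it sounds can be illustrated with a small synthetic sketch (the feature, click rates and numbers below are all invented for illustration): a model trained without the gender column still scores men and women differently whenever another feature acts as a proxy for gender.

```python
# Hedged sketch with synthetic data: dropping the gender column does not make
# an ad-targeting model gender-neutral when another feature proxies for gender.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000
gender = rng.integers(0, 2, n)                 # 0 = men, 1 = women (synthetic)

# A behavioral feature that happens to correlate strongly with gender.
proxy = gender + rng.normal(0.0, 0.3, n)

# Historical clicks on a high-paying-job ad were skewed toward men.
clicked = (rng.random(n) < np.where(gender == 0, 0.30, 0.05)).astype(int)

# Train WITHOUT the gender column ("fairness through unawareness").
model = LogisticRegression().fit(proxy.reshape(-1, 1), clicked)

# Predicted click probability, split by the attribute the model never saw:
scores = model.predict_proba(proxy.reshape(-1, 1))[:, 1]
print("mean score, men:  ", scores[gender == 0].mean())
print("mean score, women:", scores[gender == 1].mean())   # still much lower
```

The model never sees gender, yet its scores split along gender lines because the proxy carries the same information; genuinely gender-neutral targeting has to account for such correlations rather than simply omitting the column.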