Racist algorithms: how Big Data makes bias seem objective
Source: Cory Doctorow


The Ford Foundation's Michael Brennan discusses the many studies showing how algorithms can magnify bias -- like the prevalence of ads for criminal background checks shown against searches for black-identifying names.

What's worse is the way that machine learning magnifies these problems. If an employer only ever hires young applicants, a machine learning algorithm trained on those hiring decisions will learn to screen out older applicants without anyone having to tell it to do so.
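As a rough illustration of that dynamic (not drawn from the article itself), here is a minimal sketch using synthetic data and scikit-learn: the historical "hired" labels encode nothing but age bias, and a classifier trained on them picks the bias up on its own. The feature names and the cutoff are invented for the example.

    # Illustrative sketch only: synthetic data, hypothetical features.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000
    age = rng.integers(22, 65, size=n)
    skill = rng.normal(0, 1, size=n)        # a genuinely job-relevant feature
    hired = (age < 35).astype(int)          # biased historical decisions: youth only

    X = np.column_stack([age, skill])
    model = LogisticRegression(max_iter=1000).fit(X, hired)

    print("age coefficient:   %+.3f" % model.coef_[0][0])  # strongly negative
    print("skill coefficient: %+.3f" % model.coef_[0][1])  # near zero
    # No one told the model to discriminate by age; it learned that rule
    # because the training labels already embodied it.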

Worst of all is that the use of algorithms to accomplish this discrimination provides a veneer of objective respectability to racism, sexism and other forms of discrimination.

I recently attended a meeting about some preliminary research on "predictive policing," which uses these machine learning algorithms to allocate police resources to likely crime hotspots. The researchers at the Human Rights Data Analysis Group discussed how these systems are often deployed in response to complaints about racial bias in policing, but the data used to train the algorithm comes from the outcomes of the biased police activity. If the police are stop-and-frisking brown people, then all the weapons and drugs they find will come from brown people. Feed that to an algorithm and ask it where the police should concentrate their energies, and it will dispatch those cops to the same neighborhoods where they've always focused their energy, but this time with a computer-generated racist facewash that lets them argue that they're free from bias.
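The feedback loop the HRDAG researchers describe can be sketched in a few lines of toy simulation (my own illustration, not their model). Two neighborhoods have identical underlying crime rates, but patrols start concentrated in one; because crime is only recorded where officers are present, the "predictions" keep sending patrols back to the same place.

    # Illustrative sketch only: a toy feedback loop, not the HRDAG analysis.
    import random

    random.seed(1)
    TRUE_CRIME_RATE = {"neighborhood_A": 0.10, "neighborhood_B": 0.10}  # identical
    patrols = {"neighborhood_A": 90, "neighborhood_B": 10}              # biased start

    for year in range(5):
        # Crime is only recorded where officers are deployed to observe it.
        recorded = {
            area: sum(random.random() < TRUE_CRIME_RATE[area] for _ in range(n))
            for area, n in patrols.items()
        }
        # "Predictive" allocation: next year's patrols follow this year's records.
        total = sum(recorded.values()) or 1
        patrols = {area: round(100 * recorded[area] / total) for area in recorded}
        print(f"year {year}: recorded={recorded} next_patrols={patrols}")

    # Despite identical underlying crime rates, the allocation never recovers
    # from the initial bias -- the data keeps "confirming" the original deployment.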

        There is no easy fix. Instead, a broad coalition of civil society organizations must push for change in a number of directions at the same time. Sweeney and Bedoya outline a number of strategies, including:

        * Investing in the technical capacity of public interest lawyers, and developing a greater cohort of public interest technologists. With more engineers participating in policy debates and more policymakers who understand algorithms and big data, both government and civil society organizations will be stronger.

        * Pressing for “algorithmic transparency.” By ensuring that the algorithms underpinning critical systems like public education and criminal justice are open and transparent, we can better understand their biases and fight for change.

        * Exploring effective regulation of personal data. Current laws and regulations are outdated and provide relatively little guidance on how our data is used in the technologies we rely on every day. We can do better.

