Tech-savvy investigators are ready to put algorithms under the microscope
Source: Matthew Braga


Given the growing role algorithms play in so many parts of our lives — such as those used by Facebook, one of its data centres pictured here — we know incredibly little about how these systems work.

Decades before you could buy a plane ticket on your phone, there were computerized reservation systems (CRS). These were rudimentary information systems used by travel agents to book customers' flights. And they had one devious flaw.

By the early 1980s, 80 per cent of travel agencies used the systems operated by American and United airlines. And it didn't take long before the two airlines realized they could use that dominance to their advantage — namely, by writing code designed to prioritize their own flights on CRS screens over those of their competitors.
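To see how subtle the trick could be, consider a toy ranking function of the kind regulators objected to. This is an illustrative sketch in Python, not the airlines' actual code; the flight data, carrier codes and penalty weights here are invented for the example.

    # Illustrative only: a "screen bias" ranking of the sort the CRS
    # operators were accused of. All values below are made up.
    from dataclasses import dataclass

    @dataclass
    class Flight:
        carrier: str
        departure_minutes: int  # minutes past midnight
        stops: int

    def biased_sort_key(flight: Flight, host_carrier: str = "AA") -> int:
        # A neutral system would rank purely on service quality,
        # e.g. fewer stops and an earlier departure.
        quality = flight.stops * 1000 + flight.departure_minutes
        # The trick: quietly penalize every competitor so the host
        # airline's flights float to the top of the agent's screen.
        penalty = 0 if flight.carrier == host_carrier else 10_000
        return quality + penalty

    flights = [
        Flight("UA", departure_minutes=540, stops=0),
        Flight("AA", departure_minutes=600, stops=1),
    ]
    # The nonstop United flight loses to American's one-stop itinerary.
    for f in sorted(flights, key=biased_sort_key):
        print(f)

Nothing on the agent's screen reveals the penalty term; only varying the inputs and watching the ordering change would expose it.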

Naturally, U.S. aviation regulators weren't pleased, and the companies were ordered to cut it out. But the case — described in a 2014 paper from researcher Christian Sandvig — lives on today as one of the earliest examples of algorithmic bias.

It's a reminder that algorithms aren't always as neutral or well-intentioned as their creators might think — or want us to believe — a reality that's more evident today than it's ever been.




Facebook founder and CEO Mark Zuckerberg said last month he doesn't want anyone to use his company's platform — which serves content according to complex, unseen algorithms — 'to undermine democracy.' (Justin Sullivan/Getty Images)

In U.S. courts, reports generated by proprietary algorithms are already being factored into sentencing decisions — and some have cast doubt on the accuracy of the results. Sexist training sets have taught image recognition software to associate photos of kitchens with women more than men.
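Researchers quantify that kind of dataset skew with simple co-occurrence counts: for each activity label, compare how often it appears alongside images of women versus men. The sketch below uses made-up annotation data and a hypothetical gender_skew function to show the idea; it is not the method of any specific study.

    # A hedged sketch of measuring label/gender skew in a hypothetical
    # image-annotation dataset. The counts are invented for illustration.
    from collections import Counter

    annotations = [
        ("cooking", "woman"), ("cooking", "woman"), ("cooking", "man"),
        ("driving", "man"), ("driving", "man"), ("driving", "woman"),
    ]
    counts = Counter(annotations)

    def gender_skew(label: str) -> float:
        """Fraction of a label's examples annotated as women (0.5 = balanced)."""
        w = counts[(label, "woman")]
        m = counts[(label, "man")]
        return w / (w + m)

    for label in ("cooking", "driving"):
        print(label, round(gender_skew(label), 2))
    # A model trained on skewed data can end up amplifying the skew,
    # associating kitchens with women even more strongly than the
    # training set itself does.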

And perhaps most famously, Facebook has been the target of repeated accusations that its platform, which serves content according to complex algorithms, helped amplify the spread of fake news and disinformation, potentially influencing the outcome of the 2016 U.S. presidential election.

Yet, given the important role algorithms play in so many parts of our lives, we know incredibly little about how these systems work. It's why a growing number of academics are building the nascent field of algorithmic auditing. Much like companies already have outsiders review their finances and the security of their computer systems, they might soon do the same with their decision-making code.

Algorithmic auditors

For now, it's mostly researchers operating on their own, devising ways to poke and prod at popular software and services from the outside — varying the inputs in an effort to find evidence of discrimination, bias or other flaws in what comes out.
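In practice, that outside-in approach often amounts to paired testing: querying a system as a black box with inputs that differ only in one sensitive attribute, then comparing what comes back. Here is a minimal, hypothetical version of such an audit; score_applicant is a stand-in for whatever opaque service is being probed, and every name and value is an assumption for the sake of the sketch.

    # Minimal black-box audit sketch: matched pairs differing only in
    # one attribute. `score_applicant` is a placeholder for the system
    # under audit (normally a remote service, not local code).
    import random

    def score_applicant(profile: dict) -> float:
        # Deterministic stand-in so the example runs on its own.
        random.seed(hash(frozenset(profile.items())))
        return random.random()

    def paired_audit(base_profiles, attribute, value_a, value_b) -> float:
        """Return the average score gap between the two attribute values."""
        gaps = []
        for profile in base_profiles:
            a = score_applicant({**profile, attribute: value_a})
            b = score_applicant({**profile, attribute: value_b})
            gaps.append(a - b)
        return sum(gaps) / len(gaps)

    profiles = [{"income": 40_000 + 5_000 * i, "tenure": i} for i in range(20)]
    gap = paired_audit(profiles, attribute="gender",
                       value_a="woman", value_b="man")
    print(f"mean score gap (woman - man): {gap:+.3f}")

A gap persistently far from zero on otherwise-identical inputs is the kind of evidence auditors look for before digging deeper.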

Some of the field's experts envision a future where crack teams of researchers are called in by companies — or perhaps on the order of a regulator or judge — to more thoroughly evaluate how a particular algorithm behaves.

There are signs that day is fast approaching.

Last year, the White House called on companies to evaluate their algorithms for bias and fairness through audits and external tests. In Europe, algorithmic decisions believed to have been made in error or unfairly may soon be subject to a "right to explanation" — though how exactly this will work in practice is not yet clear.

A Harvard project called VerifAI is in the early stages of defining "the technical and legal foundations necessary to establish a due process framework for auditing and improving decisions made by artificial intelligence systems as they evolve over time."




Mathematician Cathy O'Neil, author of the book "Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy," founded an algorithm consulting company last year to help data-savvy companies manage risk and use algorithms fairly. (Cathy O'Neil)

Harvard is one of a handful of schools — including Oxford and Northwestern — with researchers studying algorithmic audits, and a new conference devoted to the subject will kick off in New York next year.

Outside academia, consulting giant Deloitte now has a team that advises clients on how they can manage "algorithmic risks." And mathematician Cathy O'Neil launched an independent algorithm consultancy of her own last year, pledging "to set rigorous standards for the new field of algorithmic auditing."

Scrutinizing secret code

All of this is happening amidst rising political backlash against some of the most powerful tech companies in the world, whose opaque algorithms increasingly shape what we read and how we communicate online with little external scrutiny.

One of the challenges, says Solon Barocas, who researches accountability in automated decision-making at Cornell University, will be determining what, exactly, to scrutinize and how. Tech companies aren't regulated the same way as other industries, and the mechanisms that are already used to evaluate discrimination and bias in areas such as hiring or credit may not easily apply to the decisions that, say, a personalization or recommendation engine makes.
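One of those existing mechanisms is the "four-fifths rule" used in U.S. employment-discrimination analysis, which compares selection rates across groups. The sketch below, with invented numbers, shows how mechanical the test is — and hints at why it may map poorly onto something like a news-feed ranking, where there is no clean notion of who was "selected."

    # The four-fifths (adverse impact) guideline, sketched with
    # invented numbers purely for illustration.
    def selection_rate(selected: int, applicants: int) -> float:
        return selected / applicants

    def adverse_impact_ratio(rate_group: float, rate_reference: float) -> float:
        """Ratio of a group's selection rate to the most-favoured group's rate."""
        return rate_group / rate_reference

    rate_a = selection_rate(selected=48, applicants=100)  # reference group
    rate_b = selection_rate(selected=30, applicants=100)  # comparison group

    ratio = adverse_impact_ratio(rate_b, rate_a)
    print(f"adverse impact ratio: {ratio:.2f}")
    # Under the conventional guideline, a ratio below 0.8 flags the
    # outcome for closer scrutiny.
    print("flagged" if ratio < 0.8 else "within guideline")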

And in the absence of oversight, there's also the challenge of convincing companies there's value in letting in algorithmic auditors. O'Neil, the mathematician and a well-known figure in the field, says her consulting firm has no signed clients — "yet."


Barocas thinks companies "actually fear putting themselves in greater risk by doing these kinds of tests." He suggests some companies may actually prefer to keep themselves — and their users — in the dark by not auditing their systems, rather than discover a bias they don't know how to fix.

But whether companies choose to embrace external audits or not, greater scrutiny may be inevitable. Secret and unknowable code governs more parts of our lives with each passing day. When Facebook has the power to potentially influence an election, it's not surprising that a growing number of outside observers want to better understand how these systems work, and why they make the decisions they do.

