Is your AI racist? This lawmaker wants to know
By Lalita Clozel


[Photo: Rep. Emanuel Cleaver II, D-Mo. Bloomberg News]

The question of whether an algorithm making underwriting decisions can be racially biased has long gnawed at lenders fearful of inadvertently running afoul of consumer protection laws. Now, the question has sparked interest from Congress.

“Fintech has been accepted pretty much as a positive impact even by” the Consumer Financial Protection Bureau, Rep. Emanuel Cleaver II, D-Mo., said in an interview. But “if we're not careful, we will develop a system that almost institutionally ignores and maybe even hurts small and minority businesses.”

Cleaver, a member of the House Financial Services Committee, launched an investigation last week into the issue, specifically into the way artificial intelligence is used in small-business loans such as merchant cash advances. He asked five online lenders, including Lending Club and Prosper, to answer questions about their loan products, disclosure methods and “potentially discriminatory practices.”

Like other Democratic lawmakers, Cleaver expressed concern over the potentially discriminatory impact of online lenders’ decision-making process, while also praising their potential to extend loans to underserved groups.

“We're not saying that fintech or alternative financing systems are innately evil,” Cleaver said. “They can have a positive impact.”

But the lawmaker also brought to the fore a cutting-edge question that online lenders are increasingly grappling with, as the algorithms behind underwriting decisions become more sophisticated.

“People tend to develop things from their own way of thinking and their own world,” Cleaver said. “We would be wise ... to make certain that they don't build into their systems exclusionary practices.”

A complicated regulatory challenge facing advanced underwriting models is known as the “black box” dilemma. Because artificial intelligence programs can make decisions independently of human beings, it can be hard to determine whether their formulas comply with fair-lending laws.

“Traditional underwriting is using logistic regression, which is one kind of math, or decision trees,” said Douglas Merrill, the CEO of ZestFinance. “You can look at them and know what they do. When you leave that kind of clear math behind and do machine learning, the algorithms are more powerful. But machine-learning algorithms are by their own nature black boxes.”

His company has created a product designed to reverse-engineer AI underwriting decisions.
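As a rough sketch of the contrast Merrill draws, consider the following Python snippet, which uses scikit-learn on invented data (the feature names, counts and model choices are illustrative assumptions, not any lender's actual system). A logistic regression exposes one readable coefficient per input; a gradient-boosted model spreads the same decision across hundreds of trees with no single formula to read off.

    # A minimal sketch, not any lender's actual model: the data, feature
    # names and model choices below are assumptions for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 3))            # hypothetical applicant features
    y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

    # Logistic regression: one coefficient per input, directly inspectable.
    logit = LogisticRegression().fit(X, y)
    for name, coef in zip(["income", "utilization", "tenure"], logit.coef_[0]):
        print(f"{name}: {coef:+.2f}")         # sign and size show each input's pull

    # Gradient boosting: the same decision spread across hundreds of trees,
    # with no per-input formula to read off (the "black box" Merrill describes).
    gbm = GradientBoostingClassifier(n_estimators=300).fit(X, y)
    print(len(gbm.estimators_), "trees and no single equation to inspect")

Explainability tools of the kind Merrill's company sells try to recover per-input attributions from the second sort of model after the fact.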

The opaque nature of algorithms has caused some financial institutions, particularly traditional banks, to be wary of using artificial intelligence in their underwriting decisions. The concern is that if they were to face a lawsuit or enforcement action, they would be unable to defend themselves.

“You can't have people claim that they're unfairly discriminated against in the lending process and have the lender say, ‘Oh, we don't know how our scoring model works,’” said Richard Fischer, a senior partner at Morrison & Foerster. “That's just not going to fly.”

There is another “black box” that lenders have to contend with: which discrimination standards they are required to meet.

“Society has made different choices on what are protected classes and attributes for different financial services,” said Aaron Klein, a fellow at the Brookings Institution in Washington. “Their application to modern-day financial services is murky and unclear.”

For example, a lender cannot use gender in determining an interest rate for most loans. But it appears that's what happens in auto lending. “Data show that teenage boys are higher risk drivers than teenage girls,” Klein said.

Beyond questions about what data can be used in an algorithm’s underwriting formula, companies also have to be wary of disparate impact claims. They could run into trouble if the machine makes individual decisions that, in the aggregate, favor certain groups over others.
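One common screening heuristic for that kind of aggregate skew is the “four-fifths” ratio borrowed from employment law: compare approval rates across groups and flag a gap when the lower rate falls below 80% of the higher one. A back-of-the-envelope sketch with invented numbers (an illustration, not the legal test any of these lenders actually faces):

    # Invented approval counts for two hypothetical applicant groups.
    approved = {"group_a": 480, "group_b": 310}
    applied  = {"group_a": 600, "group_b": 500}

    rates = {g: approved[g] / applied[g] for g in applied}   # 0.80 vs. 0.62
    impact_ratio = min(rates.values()) / max(rates.values())
    print(f"adverse impact ratio: {impact_ratio:.3f}")       # about 0.775, under the 0.80 line

A model that never sees race can still produce numbers like these, which is why the aggregate view matters as much as any individual decision.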

“The variables that are used in the artificial intelligence process, those variables could be highly correlated to race,” said Lisa Rice, the executive vice president of the National Fair Housing Alliance. “When you put three or four variables together, that could increase the likelihood of that data serving as a proxy for race.”
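Rice's point is easy to reproduce on synthetic data. In the sketch below, every variable is invented: none of the three inputs names race, and each correlates only modestly with a hypothetical protected attribute, but their sum tracks it noticeably better than any one of them.

    # Synthetic illustration only: three facially neutral variables, each
    # modestly correlated with an invented protected attribute.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000
    protected = rng.integers(0, 2, size=n)          # hypothetical protected class
    zip_income = -1.0 * protected + rng.normal(size=n)
    school     = -0.8 * protected + rng.normal(size=n)
    retailer   = -0.6 * protected + rng.normal(size=n)

    for name, v in [("zip_income", zip_income), ("school", school),
                    ("retailer", retailer)]:
        print(name, f"{np.corrcoef(v, protected)[0, 1]:+.2f}")

    combined = zip_income + school + retailer       # three inputs together
    print("combined", f"{np.corrcoef(combined, protected)[0, 1]:+.2f}")
    # The combination correlates with the protected class more strongly
    # than any single input: a proxy assembled from "neutral" variables.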

There needs to be a debate over these issues, Klein said. But it could become tricky if regulators need to determine which data points can be considered discriminatory and which ones are fair game.

“I don't want somebody going through and going, ‘Apple yes, magazine subscription no, beauty product no, but do they wear shirts or long pants? Yes,’ ” he said.

Cleaver has given the five companies until Aug. 10 to respond to his inquiry.

This article originally appeared in American Banker.

