
Combatting AI Biases in the Financial Sector

By Eric P. Wolf
April 14, 2022

The use of explainable AI models is key to avoiding bias for businesses in most sectors of the economy, but especially in finance.

In the United States, credit unions and banks that deny consumers credit cards, car loans, or mortgages without a reasonable explanation may be subject to fines under the Fair Credit Reporting Act. However, AI bias is still pervasive in the financial industry.

It’s a problem some government agencies are trying to address, but there’s no easy solution, said Gartner analyst Moutusi Sau.

“Without the existence of common standards across the financial services industry, it becomes difficult to measure what is being treated as bias,” Sau said. “The solution to the bias problem comes down to modeling and should start at the pre-modeling level, taking it to modeling and then post-modeling measurements of deviations.”

Pre-modeling explainability can eliminate bias in the dataset, while explainable models allow users to interpret complex patterns during modeling. Post-modeling explainability provides explanations for previously developed models, Sau wrote in a 2021 research paper.
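For a concrete sense of what post-modeling explainability can look like in practice, here is a minimal sketch using scikit-learn's permutation importance on a hypothetical credit-approval model. The feature names and synthetic data are illustrative assumptions, not examples from Sau's paper.

```python
# Minimal sketch of post-modeling explainability on a hypothetical
# credit-approval model. Feature names and data are illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
# Synthetic applicant features (stand-ins for real underwriting data).
X = np.column_stack([
    rng.normal(680, 50, n),         # credit_score
    rng.normal(55_000, 15_000, n),  # income
    rng.uniform(0.0, 0.6, n),       # debt_to_income
])
# Synthetic approval labels driven mostly by score and debt-to-income ratio.
y = ((X[:, 0] > 650) & (X[:, 2] < 0.4)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Post-hoc explanation: how much does shuffling each feature hurt accuracy?
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, importance in zip(["credit_score", "income", "debt_to_income"],
                            result.importances_mean):
    print(f"{name}: {importance:.3f}")
```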

With no consensus among government agencies, the financial industry and IT professionals on how to create fair models, companies are approaching the problem in different ways.

Zest AI

“Financial services are particularly problematic due to a history of biased practices,” said Jay Budzik, CTO at Zest AI, during an equity roundtable at the ScaleUp:AI conference on April 7.

Zest AI is a financial services provider that develops machine learning software for credit underwriting.

“We believe that credit is broken – that the math that was invented in the 1950s and that really popularized FICO [the credit reporting score] was great at the time, but it also reflected a certain set of social values and norms,” Budzik said in an interview.

The Burbank, Calif.-based vendor provides software and services that let banks leverage the predictive power of a machine learning model to create a rating model that is less racially biased and more accurate.

Its platform uses game theory, an applied mathematical method that analyzes situations where players make interdependent decisions. Zest AI uses this method to analyze how machine learning models make fair loan decisions.
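The article does not say which game-theoretic tool Zest AI applies, but Shapley values are the standard game-theoretic way to attribute a model's decision to individual features, treating each feature as a "player" whose payoff is its marginal contribution to the score. The sketch below estimates Shapley values by Monte Carlo sampling for a hypothetical loan model; it illustrates the general idea only and is not a description of Zest AI's product.

```python
# Monte Carlo estimate of Shapley values for one loan decision: each feature
# is a "player", and its value is its average marginal contribution to the
# model's approval score across random orderings. Generic sketch with
# hypothetical features; not Zest AI's implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
features = ["credit_score", "debt_to_income", "income"]
X = np.column_stack([
    rng.normal(680, 50, 2_000),
    rng.uniform(0.0, 0.6, 2_000),
    rng.normal(55_000, 15_000, 2_000),
])
y = ((X[:, 0] > 660) & (X[:, 1] < 0.35)).astype(int)
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)

background = X.mean(axis=0)  # "absent" features fall back to the average applicant
applicant = X[0]             # the single decision we want to explain

def approval_prob(x):
    return model.predict_proba(x.reshape(1, -1))[0, 1]

def shapley_estimate(n_orderings=1_000):
    contrib = np.zeros(len(features))
    for _ in range(n_orderings):
        order = rng.permutation(len(features))
        x = background.copy()
        prev = approval_prob(x)
        for j in order:
            x[j] = applicant[j]       # feature j "joins the coalition"
            cur = approval_prob(x)
            contrib[j] += cur - prev  # marginal contribution of feature j
            prev = cur
    return contrib / n_orderings

for name, value in zip(features, shapley_estimate()):
    print(f"{name}: {value:+.3f}")
```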


“For fair lending and racial discrimination, it’s also really important because you want to make sure your model isn’t penalizing people…based on something inappropriate,” Budzik said in the interview.

In addition to using game theory, the vendor trains models to focus not only on accuracy, but also on fairness – a method it calls “adversarial debiasing”.

This allows Zest AI to inject the notion of fairness into its model training process so that each cycle of data examined by the model is evaluated not only on accuracy, but also on fairness for protected groups, including Blacks and Hispanics, immigrants and others. The model then receives feedback from a second model, or "helper," which tells it whether or not it is fair.
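The description above is high level. One common way to implement the idea, sketched below in PyTorch on synthetic data, is to train the main scoring model alongside an adversary that tries to recover the protected attribute from the scoring model's output, and to penalize the scoring model whenever the adversary succeeds. This is a generic illustration under that assumption, not Zest AI's code.

```python
# Generic adversarial-debiasing training loop (not Zest AI's code):
# a predictor scores applications while an adversary tries to infer the
# protected group from the predictor's output; the predictor is rewarded
# for accuracy and penalized when the adversary succeeds.
import torch
import torch.nn as nn

torch.manual_seed(0)
n, d = 4_000, 6
X = torch.randn(n, d)                        # synthetic, anonymized applicant features
group = torch.randint(0, 2, (n, 1)).float()  # protected attribute (seen only by the adversary)
y = ((X[:, :1] + 0.3 * group) > 0).float()   # labels with a built-in group skew

predictor = nn.Sequential(nn.Linear(d, 16), nn.ReLU(), nn.Linear(16, 1))
adversary = nn.Sequential(nn.Linear(1, 8), nn.ReLU(), nn.Linear(8, 1))
opt_p = torch.optim.Adam(predictor.parameters(), lr=1e-3)
opt_a = torch.optim.Adam(adversary.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()
lam = 1.0                                    # fairness weight: higher = stronger debiasing

for step in range(2_000):
    logits = predictor(X)

    # 1) Train the adversary to predict the protected group from the score.
    opt_a.zero_grad()
    adv_loss = bce(adversary(logits.detach()), group)
    adv_loss.backward()
    opt_a.step()

    # 2) Train the predictor for accuracy *minus* the adversary's success,
    #    i.e. it learns to score well while hiding the protected group.
    opt_p.zero_grad()
    pred_loss = bce(logits, y) - lam * bce(adversary(logits), group)
    pred_loss.backward()
    opt_p.step()
```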

“This method … uses the full power of machine learning and the fact that it can explore billions of alternatives to find the one that achieves a fair outcome, but still provides this high level of accuracy,” Budzik said.

But adversarial debiasing is not foolproof, he noted.

“Sometimes we’re not able to find a fairer and equally accurate model,” he said. That leads to a trade-off approach in which some amount of accuracy, small or sometimes large, is traded for fairness.
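One simple way to operationalize such a trade-off, shown here as an illustration rather than the vendor's actual procedure, is to train candidate models at several fairness weights and then keep the fairest one whose accuracy stays within a chosen tolerance of the best.

```python
# Illustrative model selection under a fairness/accuracy trade-off:
# among candidate models (e.g., trained at different fairness weights),
# keep the one with the smallest fairness gap whose accuracy is within
# a tolerance of the best accuracy. The numbers below are made up.
def pick_model(candidates, accuracy_tolerance=0.01):
    """candidates: list of (name, accuracy, fairness_gap) tuples."""
    best_accuracy = max(accuracy for _, accuracy, _ in candidates)
    acceptable = [c for c in candidates if c[1] >= best_accuracy - accuracy_tolerance]
    return min(acceptable, key=lambda c: c[2])  # smallest fairness gap wins

candidates = [
    ("lambda=0.0", 0.912, 0.080),
    ("lambda=0.5", 0.908, 0.031),
    ("lambda=1.0", 0.899, 0.012),
]
print(pick_model(candidates))  # -> ('lambda=0.5', 0.908, 0.031)
```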

Another approach to avoiding AI bias in finance

Credit Karma, an Intuit brand, tries to eliminate bias by not using personally identifiable information (PII), said Supriya Gupta, who heads recommendations at the personal finance company.

Credit Karma partners with financial institutions that adhere to fair lending practices, Gupta said. Instead of using personal identifiers such as gender and race, the company uses other attributes to provide financial recommendations to the more than 120 million consumers it works with.

Attributes include a person’s credit score, personal transactions, assets, liabilities, loans, income, and how the person pays bills.

Credit Karma runs deep learning models with these attributes to create 35 billion model predictions per day, according to Gupta. These predictions drive the AI engine to predict whether members will be approved for any of the offers they see on Credit Karma. The recommendations also provide insight into ways members could improve their personal finances.

“That’s really the power of AI,” Gupta said.
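Credit Karma has not published its models, so the sketch below only illustrates the general pattern described above: a classifier trained on non-PII financial attributes such as credit score, income, debt and payment history, used to estimate a member's approval odds for an offer. The data, feature names and model choice are hypothetical.

```python
# Illustrative approval-odds model built only from non-PII financial
# attributes, in the spirit described above. Hypothetical data and names;
# this is not Credit Karma's system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n = 10_000
X = np.column_stack([
    rng.normal(690, 60, n),                 # credit_score
    rng.normal(60_000, 20_000, n),          # income
    rng.normal(20_000, 10_000, n).clip(0),  # total_debt
    rng.uniform(0.5, 1.0, n),               # on_time_payment_rate
])
# Synthetic "approved" labels loosely tied to score and payment history.
y = ((X[:, 0] > 660) & (X[:, 3] > 0.8)).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1_000)).fit(X, y)

# Estimated approval odds for one hypothetical member and offer.
member = np.array([[705, 58_000, 12_000, 0.93]])
print(f"Estimated approval probability: {model.predict_proba(member)[0, 1]:.2f}")
```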
