Lawmakers warn against AI-powered financial tools
As fintech companies graduate from startups to staples of the financial industry, their already heavy use of algorithm-based artificial intelligence (AI) is only increasing.
And it’s not just finance. In a January 2021 report, auditing and professional services firm PwC found that 86% of senior executives said AI would become mainstream technology in their business within the year. A quarter said AI was already widely used in their businesses.
The promise of artificial intelligence and machine learning is great: using complex algorithms, machines can be taught to mimic human intelligence, performing tasks that range from handling and resolving client complaints to assessing a mortgage loan application for approval in real time.
However, a letter sent by House Financial Services Committee Chair Maxine Waters (D-CA) and Representative Bill Foster (D-IL), chair of the committee's Artificial Intelligence Task Force, to the heads of five US agencies with financial supervisory responsibilities – including the Federal Reserve, the Consumer Financial Protection Bureau (CFPB), and the FDIC – suggests the reality may be more complicated.
Highlighting “the possible risks and benefits of emerging technologies in the financial services and housing industry,” Representatives Waters and Foster urged the agencies to take action to ensure that AI and machine learning tools are “used ethically and help moderate-income communities of color that have been underserved for too long.”
Specifically, they want government agencies to ensure that algorithmic biases are excluded from AI-based financial tools. “Financial institutions must fully understand that they are bound by these civil rights protections when creating and manipulating datasets,” Waters and Foster wrote. “Regulators should subject financial institutions using AI to full audits of their algorithmic decision-making processes.”
They further clarify that this means having the expertise and staffing budgets to monitor protected classes – including color, religion, national origin, gender, marital status, age, and use of public assistance programs – “even when these attributes are not explicitly taken into account by the AI.”
Learning from bad examples
At issue is the “mimic human intelligence” part. Humans – AI developers among them – are prone to bias. Specifically, the fear is that programmers rely on huge data sets that may be tainted with historical biases when teaching AIs the domains in which they will provide insights and recommendations.
A 2019 study of two million home loan applications found that lenders – who typically use AI to recommend whether or not to grant a loan – were 40% more likely to turn down Latino applicants than white applicants in similar financial situations. Black applicants were 80% more likely to be turned down.
The whys are complex, but mortgage-application AI is trained on historical data. Among other things, that data reflects a long history of lenders refusing to make loans in minority neighborhoods, and credit scores shaped by the generational wealth – such as homeownership – of white applicants.
Beyond the financial sector, in 2015 Amazon overhauled its recruiting algorithm after realizing it was biased against women. Why? Because it had been trained on resumes submitted over the previous decade, mostly by men. In 2019, an algorithm widely used by hospitals was found to be less likely to refer African American patients for treatment than equally ill white patients, because it was trained on historical records reflecting white patients' greater trust in, and access to, medical care.
Noting that “FinTechs and others using these technologies should play their part in building a fairer and more equitable financial system in the 21st century,” the letter from Waters and Foster urged the agencies to “put first the principles of transparency, enforceability, confidentiality, impartiality and fairness. This will ensure that the regulation and rule-making of AI can meaningfully address the governance, risk management and appropriate controls of AI.”