JPMorgan Chase, Nyca invest in AI-testing firm FairPlay


FairPlay, a provider of fairness testing software for banks and fintech lenders, will receive $10 million in funding on Friday from JPMorgan Chase, Infinity Ventures and Nyca Partners.

"Fundamentally, what we're doing is testing the models for biases," Kareem Saleh, founder and CEO, told American Banker. "You can be biased in many different directions — against particular groups, against people from particular geographies, against people with particular incomes. Apart from a legal obligation, I think that there's a feeling in the zeitgeist that these AI systems may not treat us humans particularly well."

FairPlay has raised about $23.5 million since it was founded in 2020. A big change for the company in 2024, according to Saleh, is that it graduated from working mostly with fintechs to working with some of the largest banks. (The banks declined to be named.)

In February, Pathward, a national bank based in Sioux Falls, South Dakota, signed a three-year contract with FairPlay. Charles Ingram, chief technology and product officer at Pathward, said FairPlay's technology will help the bank refine its analytical capabilities related to fair lending and create efficiency through new tools and automation. 

"We're optimistic the technology will contribute to making the financial network accessible to more consumers and businesses," Ingram said.

Shuman Chakrabarty, head of the impact finance and advisory group at JPMorgan Chase, became interested in the way FairPlay helps lenders evaluate their models and broaden access to credit, and in the fact that it works with institutions of different sizes, "which is critical if we're going to strengthen the ecosystem."

"As we're thinking about the rapid adoption of AI across financial services, whether that's fintech lenders, whether that's smaller lending institutions, we are seeing many of them seeking some sort of help in evaluating their models," Chakrabarty said. 

An example is lenders looking at borrowers who have incomplete credit files. "Solutions like FairPlay ensure that borrowers like that are getting a thorough look in a way that could hopefully foster increased access," Chakrabarty said. "We were looking for ways to support the adoption of trustworthy AI, because ultimately, what we want to do is strengthen the financial services ecosystem during a period of rapid evolution, and see FairPlay as a potentially important way to do that."

JPMorgan Chase has internal teams dedicated to model evaluation, something not all banks can afford to do. 

New York-based fintech investor Nyca has invested in many AI startups.

"We have been viewing AI as important in many, many ways in financial services and in our view the most powerful near-term application of AI in financial services is making expensive people more efficient; think of some of the very well-paid staff inside a bank, like people doing model risk management, building credit models, managing corporate banking relationships, or managing money for clients," said Hans Morris, managing partner of Nyca Partners. "How can you make the loan officers and relationship managers more efficient?"

Lending is a complicated and sensitive subject — not only because it's regulated and subject to laws like the Fair Credit Reporting Act, but because there is a history of decades of well-documented discrimination, Morris pointed out.

"There's all kinds of biases that have existed for a long time," Morris said. "Banks are constantly asking themselves, 'how sure are we that our model is accurate?'" Not only is there regulatory risk to perpetuating bias, for instance, by failing a Community Reinvestment Act review and therefore having a merger application denied, but banks have a vested interest in making loans to people who are going to repay, and bias gets in the way of that, he pointed out.

Companies like FairPlay help banks consider borrowers who have a low or no FICO score, perhaps because they grew up in a community where there were no bank branches.

Many banks have recently been dropping inclusion efforts because of signals coming from Washington, D.C. And one of the biggest critics of AI-based lending, the Consumer Financial Protection Bureau, is being dismantled.

Yet fair lending laws remain on the books, and state bank regulators and attorneys general care about whom banks lend to and whether they're fair. And, according to Saleh, some banks have made a practice of asking: Does our AI have blind spots?

"That is a question that is relevant to a bank for risk management purposes, completely apart from what you think about DEI and redlining in the financial services industry," Saleh said. "AI has this well-known tendency to overfit to populations that are well-represented in the data. 

"And if you want to keep your AI from running your business off a cliff, or causing harm in the communities you serve, you probably ought to know what populations you're approving and what populations you're missing, and why, and are you leaving money on the table? And is the model degrading when it's in production? Has something broken, either in the model or in the world?"
