As experts across the fintech industry closely monitor the ripple effects of the White House's executive order on artificial intelligence, leaders remain confident in their firms' abilities to adapt to changes in an already highly regulated environment.
The scope of
This level of scrutiny extends to the buyers of the models, who are required to report their purchase as well as "the existence and location of these clusters and the amount of total computing power available in each cluster." Users will also need to adhere to
Lawmakers remain divided on how to
But in addition to the
Yolanda McGill, vice president of policy and government affairs for Zest AI in Burbank, California, said conversations with legislators are often driven by fears that AI models could go off the rails, or that they could begin replacing human employees.
"In our [industry], there are concerns toward having a really good understanding about what an algorithm is actually doing. … I was concerned that [those fears] would mean we were not going to be able to have the conversations about practical use cases that are happening right now and are impacting people's lives every day for the good or for the ill," McGill said.
Regulators with the
The CFPB's focus on
"The financial services industry has been grappling with these questions for a long time," McGill said. "Algorithms are not new to financial services, AI is not new to financial services and companies have been innovating within the guardrails of mandatory consumer disclosures, mandatory versions of explainability, prohibitions against discrimination and other requirements for some time now."
Beyond the requirements to report training and cybersecurity standards to the government, some fintech experts are unsure whether federal agencies are capable of distinguishing productive models from those that could threaten national security.
The order "immediately posed the question of how do you identify these models without some sort of
"From the outside, it's going to be hard for the U.S. government to identify it and I think eventually there is going to be some self-regulation within companies that are working on very powerful foundation models," Kalmar said. "I think it gives a direction for the regulatory bodies to focus on it, but specific areas need to be well defined."
Lendbuzz's software-as-a-service platform uses machine learning algorithms to analyze consumer financial data, such as bank transaction history, and establish a credit score for qualifying borrowers. The firm then handles underwriting of the loan at the point of sale, with the loan backed by funds provided through Lendbuzz's bank partners.
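Lendbuzz has not published the internals of its models, but a minimal sketch illustrates the general approach the article describes: summarizing bank transaction history into cash-flow features and mapping them to a credit score. Every feature name, weight and the score range below is invented for illustration, not drawn from Lendbuzz.

```python
# Hypothetical illustration of cash-flow-based credit scoring, not Lendbuzz's
# actual model. Feature names, weights and the score range are invented.
from dataclasses import dataclass
from typing import List


@dataclass
class Transaction:
    amount: float   # positive = deposit, negative = withdrawal
    category: str   # e.g. "payroll", "rent", "overdraft_fee"


def features_from_history(history: List[Transaction]) -> dict:
    """Summarize a bank transaction history into simple cash-flow features."""
    deposits = sum(t.amount for t in history if t.amount > 0)
    withdrawals = -sum(t.amount for t in history if t.amount < 0)
    overdrafts = sum(1 for t in history if t.category == "overdraft_fee")
    return {
        "net_cash_flow": deposits - withdrawals,
        "deposit_total": deposits,
        "overdraft_count": overdrafts,
    }


def score(features: dict) -> int:
    """Map features to a 300-850 style score using hand-picked illustrative weights."""
    raw = (
        0.002 * features["net_cash_flow"]
        + 0.0005 * features["deposit_total"]
        - 15 * features["overdraft_count"]
    )
    return int(max(300, min(850, 600 + raw)))


history = [Transaction(3200, "payroll"), Transaction(-1400, "rent"),
           Transaction(-35, "overdraft_fee")]
print(score(features_from_history(history)))
```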
Kalmar is working to keep Lendbuzz ahead of the executive order's guidance by strengthening cybersecurity and improving transparency in the building of the models for stakeholders and regulators alike.
"The price a company or financial institution could pay for breaches and failure there is very significant, and I think AI will present new risks in that area that we have not been exposed to in the past," he said.
The popularity of products powered by generative AI has continued to rise in recent months, but friction points have caused many organizations to
Running throughout a significant portion of the order's recommendations for regulators to increase their oversight is an emphasis on data gathering during the development phases. Agencies in Europe and the
Ed Maslaveckas, co-founder and CEO of the London-based open-banking and data intelligence firm Bud Financial, said agencies in the U.K. are better positioned to create guidelines for the use of AI in financial services by examining real-life applications.
"I'm glad that people are taking this really seriously, and that we've seen all the activity happening in the U.S., U.K. and Europe as the main concern was we would let the world run rampant for years, and then when something bad happened, we'd draft regulation. … I think we're moving in a positive direction, but outcomes" produced by the models in question are the No. 1 thing, Maslaveckas said.
"I like the way the U.K. is thinking about oversight being driven by use cases, and because it's use case driven, it's regulated by the subject matter experts," who have the technological understanding necessary to lead campaigns for change, Maslaveckas said.
Fintech leaders are closely watching policymakers' next steps as they formulate plans for future innovation.
Scienaptic AI in New York, which offers loan decision software that uses machine learning to analyze large amounts of data, vets models for fairness and accuracy long before they are put into production. This process, which has been in place for some time, helps the fintech stay ahead of regulatory shifts, the company said.
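Scienaptic has not detailed its vetting process, but one common pre-production check in fair-lending work is the "four-fifths" adverse-impact ratio. The hypothetical sketch below, with invented group labels and decisions, shows how such a gate might block deployment of a model whose approval rates diverge too widely across groups; it is not a description of Scienaptic's actual pipeline.

```python
# Hypothetical pre-production fairness gate, not Scienaptic AI's actual process.
# It applies the "four-fifths" adverse-impact check to a held-out validation run.
from collections import defaultdict


def approval_rates(decisions):
    """decisions: list of (group_label, approved_bool) pairs from a validation run."""
    approved, totals = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}


def passes_fairness_gate(decisions, threshold=0.8):
    """Block deployment if any group's approval rate is below 80% of the highest group's."""
    rates = approval_rates(decisions)
    best = max(rates.values())
    return all(rate / best >= threshold for rate in rates.values())


validation_run = [("group_a", True), ("group_a", True), ("group_a", False),
                  ("group_b", True), ("group_b", False), ("group_b", False)]
print(passes_fairness_gate(validation_run))  # False: group_b's rate is under 80% of group_a's
```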
"There is a significant need to enhance financial inclusion in the U.S.," said Vinay Bhaskar, chief operating officer and head of AI and compliance initiatives at Scienaptic AI, who added that his company tries to push that agenda through its software. "The CFPB and Federal Trade Commission continues to do a solid job in highlighting the broad objectives and principles around lending and this Executive Order underscores the value of the those objectives as it proposes a risk-based approach to manage and implement AI."
Updated 11/09/23: This story was updated to provide additional context on Scienaptic AI's operations.