Financial institutions that use complex algorithms in their lending decisions are legally required to provide a specific explanation to consumers who are denied credit, the Consumer Financial Protection Bureau said Thursday.
Creditors cannot simply skirt the requirements of the Equal Credit Opportunity Act by saying that artificial intelligence or machine learning technologies were used to evaluate credit applications, CFPB Director Rohit Chopra said in a written statement.
“Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,” Chopra said in a press release that accompanied a policy statement by the agency.
“The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand.”
Under the Equal Credit Opportunity Act, a landmark anti-discrimination statute that became law in 1974, a creditor is required to provide an applicant with the specific reasons for denying an application for credit, or for revoking or changing the terms of an existing extension of credit. The explanation is known as an adverse action notice.
The CFPB did not, however, provide an analysis of how lenders that use complex algorithms can meet ECOA's adverse action requirements.
Last year, 198 financial institutions were cited for violations of the law and its implementing regulation, Regulation B, according to the Federal Financial Institutions Examination Council.
The bureau said Thursday that there is no exception to ECOA’s requirements for creditors who are using technology “that has not been adequately designed, tested, or understood.”