Twitter exploded this month with concerns that the Apple Card algorithm might be discriminating against women.
A renowned programmer, David Heinemeier Hansson, tweeted that Apple Card had given him 20 times the credit limit his wife received, even though she has the higher credit score.
Almost instantly, the New York State Department of Financial Services announced an investigation into the card's underwriting practices.
Apple Card's partner, Goldman Sachs, responded that credit decisions are based on individual creditworthiness, not gender.
A section of the Dodd-Frank Act of 2010 (Section 1071) clearly requires that financial institutions, including small-business lenders, collect protected-class data to provide a baseline for testing for discriminatory lending practices. In response, the Consumer Financial Protection Bureau (CFPB) is now working on the rulemaking that will spell out exactly what lenders must collect and report.
Though broadly written, the still-unspecified rule requires small-business credit providers to collect data on applicants' protected-class status (gender, race, ethnicity, etc.) along with the subsequent loan approval outcomes, so that lenders and regulators can better test for disparate impact. Once the requirements are clearly established, financial institutions will be responsible for the fair-lending outcomes of their small-business credit models.
Regulators have implemented a similar data-collection regime for the mortgage industry through the Home Mortgage Disclosure Act, but much less so for consumer lending. The alleged disparities in Apple Card approvals could easily have been caught in internal testing if the issuer had been able to use applicants' demographic data for robust bias testing of its models.
All financial institutions and nonbank lenders do test for disparate impact today, as required by law. But the process relies on backward-looking data and is imprecise, difficult and expensive.
In the future, data collected and tested under the Dodd-Frank requirements will quickly surface discriminatory patterns, ones that have nothing to do with the creditworthiness of individual borrowers, so they can be expeditiously removed from the model.
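To illustrate what that testing could look like, here is a minimal sketch, assuming a lender has account-level records of each applicant's self-reported protected-class attribute and the approval decision. The column names, the sample data and the four-fifths cutoff are illustrative assumptions, not anything the rule prescribes:

    import pandas as pd

    # Hypothetical account-level data: one row per application, with the
    # applicant's self-reported protected-class attribute and the decision.
    applications = pd.DataFrame({
        "gender":   ["F", "M", "F", "M", "F", "M", "F", "M"],
        "approved": [0,    1,   1,   1,   0,   1,   1,   1],
    })

    # Approval rate for each group.
    rates = applications.groupby("gender")["approved"].mean()

    # Adverse impact ratio: each group's approval rate relative to the
    # most-favored group. The 0.8 cutoff borrows the EEOC's "four-fifths"
    # rule of thumb, used here only as an illustrative threshold.
    impact_ratio = rates / rates.max()
    flagged = impact_ratio[impact_ratio < 0.8]

    print(rates)
    print(impact_ratio)
    print("Groups flagged for further review:", list(flagged.index))

With real collected data in place of today's proxy estimates, a check like this becomes a routine part of model validation rather than a statistical guessing game.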
An algorithm returning a flawed risk assessment is an expensive problem to have, regardless of whether it was the result of historically biased data, statistical error or an unknown factor. Biased approvals may be expensive to fix, but it is even more expensive not to fix them.
Though this specific requirement in Dodd-Frank impacts small-business credit providers, the lessons learned will help the industry move holistically toward democratizing access to credit for all.
To be clear, the Dodd-Frank requirement does not allow underwriters to use protected-class data in the underwriting model. It simply provides an account-level baseline to radically improve today’s imprecise methodologies to test for discrimination.
With small businesses, only quantifiable business performance — not demographic traits — should dictate creditworthiness. Understanding protected-class demographic traits will give model validators greater visibility into model performance and equality.
In many ways, commercial lenders are fortunate that they will soon be collecting borrower demographic information.
Small-business credit is a prime sector to begin refining how lenders think about credit equality, with a focus on being transparent about the data used. Commercial data is considerably more straightforward and less fraught than financial data about consumers.
By collecting previously restricted protected-class data, small-business lenders will be able to empirically test more robust metrics for equality than current methodologies allow.
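For instance, once protected-class labels are available, a lender could go beyond raw approval-rate comparisons and ask whether its model holds groups to different effective standards, such as checking whether applicants who ultimately proved creditworthy were approved at similar rates across groups. A hypothetical sketch, again with assumed column names and sample data (in practice, outcomes for declined applicants would have to be estimated or back-tested, which is its own challenge):

    import pandas as pd

    # Hypothetical data joining loan decisions with later repayment outcomes.
    loans = pd.DataFrame({
        "gender":   ["F", "M", "F", "M", "F", "M", "F", "M"],
        "approved": [1,    1,   0,   1,   1,   1,   0,   1],
        "repaid":   [1,    1,   1,   0,   1,   1,   1,   1],  # observed or back-tested
    })

    # Equal-opportunity-style check: among applicants who were (or would
    # have been) good credits, did each group get approved at a similar rate?
    good_credits = loans[loans["repaid"] == 1]
    approval_rate_good = good_credits.groupby("gender")["approved"].mean()

    print(approval_rate_good)
    print("Largest gap between groups:",
          approval_rate_good.max() - approval_rate_good.min())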
For this specific data collection requirement to function as Dodd-Frank intended — and genuinely protect small businesses — every institution, from the largest to the smallest, must be held to the same standard for data collection, compliance and transparency.
High-emotion, social media-fueled narratives like the one around Apple Card's alleged discrimination won't create real accountability or make progress toward credit parity.
Machine learning and automation are evolving unbelievably fast, but there are still many unanswered questions about democratized access to financial services. The CFPB should move quickly to establish the necessary rules to ensure standardized, robust testing of small-business lending models.