Small businesses should be allowed to permission their data with companies that share the goal of financial inclusion, while the industry should move to measure more outcomes instead of data inputs.
The rightful prohibition of ZIP codes in underwriting is one example of how financial regulators ensure fairness by protecting against discriminatory lending. But the increasing reliance on artificial intelligence and machine learning, or "automated insights," as I think of them, is straining that input-by-input approach to oversight.
Before machine-assisted underwriting, credit determinations were made by loan officers hunched over paper applications in a bank branch. A human being, applying their personal judgment and all its baked-in bias, evaluated the loan applicant's capacity to run a business, calculate expenses and repay a debt in full, often from personal funds.
This afforded an advantage to the people most like the loan officer, and to business models and customers they understood well. Lacking the ability to see into the loan officer’s mind, regulators tracked statistical outcomes to determine fair practices.
Data-driven lending is a marked improvement over that sole loan officer. Cash flow data can remove the personal characteristics, affinities and attitudes of the neighborhood banker. Instead, decisions rest on the past and current performance of the business itself, as illustrated by dozens of holistic data touchpoints. These models are still highly supervised by humans, but the algorithms can make use of nontraditional or alternative data.
Banning the use of ZIP codes was meant to prevent discrimination by race or ethnicity. And while sometimes appropriate, an approach that always defaults to outright banning attributes can both limit customers' access to credit and stunt the development of new models trying to best serve America's small businesses.
The availability of data and the pace of change in the industry are making it incredibly hard for regulators to judge every modern-day, reliable and appropriate input under their charge to ensure fair and ethical outcomes. Customers should have the right to disclose data inputs to companies that make fair and appropriate use of them. And regulators should look to measure the actual outcomes instead of the inputs.
As customers now generate seas of valuable data, nuanced and objective decisions can be reached by including more holistic information, not less. Regulators face difficulties in anticipating what outcomes any given data point may generate.
Meanwhile, companies engaged in a continual refinement of their models are constantly experimenting with predictive inputs and parsing the resulting outcomes, which remain bound by fair-lending laws. Regulators are often only prompted to examine such data after a series of customer complaints or a suspicious volume of anecdotal evidence emerges around a company’s practices.
However, waiting for a whistleblower or complaint is hardly reliable or easily standardized. Rather than wait for problems to emerge, data scientists and stewards of machine learning are testing thousands of variables from the new deluge of inputs and measuring the resulting outcomes, often in real time.
Such fintech trials, often called sandboxes, can be compared to prescription drug trials. Regulators permit controlled experiments because the potential benefits justify carefully monitored risk.
Similarly, fintech solutions search for innovative positive externalities. Yes, risk of financial harm and unfair lending practices are concerns. But that’s why untested ideas require an opportunity to prove or disprove positive outcomes in the early stages.
Innovation suffers when the opportunity to analyze modern data points is removed through premature restrictions. Identifying inefficiencies and missteps is part of any sustainable modeling practice.
This is why the U.S. House Committee on Financial Services should take a collaborative approach with fintechs to ensure fair and inclusive access to small-business credit when it considers the use of alternative data in underwriting and credit scoring during its hearing on the topic.
Online lending platforms are very careful with the data used to determine the most critical elements of lending: a borrower's ability to repay and, separately, a willingness to repay rather than default.
Firms test across multiple trials to find variation or consistency and to ensure replicability. Data scientists diligently review real-time results for indicators of undue bias, while regulators wait years for patterns of outcomes that may never clearly emerge.
Most data-driven fintechs already lend with a culture of testing, modeling and principles-based examination of inputs, and of responsibly regulated outcomes. Regulations that arbitrarily restrict data inputs only hold back progress, particularly for the underserved small businesses these models aim to reach.
If the outcomes meet lawful standards and show that such appropriate, fair use of that data increases small businesses' access to robust banking services, rather than excluding or penalizing them, we ought to consider it as part of a broader discussion on financial inclusion.