BankThink

The cost of good data is high; the cost of bad data is ruinous

It has never been more important for banks, particularly those with more than $10 billion in assets, to establish ironclad data quality, governance and control standards. Failure to do so will have cascading negative consequences, writes Eugene Ludwig.

Last month, I discussed the evolving technology landscape and its impact on banking, outlining my intention to explore various tech-related topics in future op-eds. I begin this series with data and data governance, both because it is a critical area for successful tech-stack implementation — it must be done right for other projects to function well or at all — and because it is an area of weakness for many banking organizations.

For banks, data quality, governance, control and utilization are ever more important. A bank is, at its heart, a balance sheet — a set of assets and liabilities. Unlike retailers, wholesalers or professional services firms, banks possess few physical assets. To be without sound data is to be without a sound bank. If you eliminate a bank's ledger book, it cannot track assets and liabilities — and is essentially out of business.

Soon, data quality and governance will be critical to being a highly competitive, top-tier financial institution and to receiving favorable regulatory ratings. For larger institutions, the future of quality data management is already here. Almost every technology tool, especially those relying on artificial intelligence, depends on good data. But too many banks' data programs are seriously flawed.

For example, in 2013, the Basel Committee on Banking Supervision published its Principles for Effective Risk Data Aggregation and Risk Reporting (BCBS 239) for global systemically important banks, and it has published periodic progress reports since. The latest report, from November 2023, reveals that only two of the 31 banks assessed are fully compliant. This reflects, at least in part, the reality that as business and technology change, there will always be work required to bring new products and services into compliance. Leading banks may be working with fintech partners on innovative new business models, exploring generative AI or positioning for crypto, all of which involve exploiting new data that must be accurate, protected and integrated into their data governance frameworks. Moreover, the technology itself is ever more capable, so the table stakes keep rising.

Of course, the strategic value of data governance extends beyond regulatory compliance, but the two are intertwined. The recent Synapse bankruptcy and Silicon Valley Bank failure highlight the need for robust data management to mitigate risks arising from complex business models, rapid data growth and high data velocity. Capital markets events like the Archegos collapse have sharpened the focus on managing counterparty risk across investment banking and trading businesses.

Data quality, architecture, governance and interoperability pose significant challenges for banks. Banks collect vast amounts of data from customers, vendors, markets and internal sources, yet much of this data remains siloed and is captured in ways that limit its usefulness. Manual data entry exacerbates these issues, requiring extensive "cleaning" for the data to be useful. While data capture, storage and manipulation practices vary across institutions, the overall problem of dirty and/or idiosyncratic data is pervasive. This limitation hinders data utility within individual business segments and across the bank as a whole.

This is a hard subject to tackle and get right. Even smaller banks with limited business lines struggle with manual data entry and with inconsistent data capture and manipulation across systems and vendors. As banks grow and diversify, ensuring data consistency across business lines becomes increasingly complex. Are Gene Ludwig and Eugene Ludwig the same person? What about ABC Advisory, LLC, owned and controlled by Gene Ludwig? Even this simple entity-resolution question, sketched below, takes deliberate engineering to answer at scale.
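To make that question concrete, here is a minimal sketch of the kind of name-matching check a data-quality pipeline might run before merging customer records. The nickname map, suffix list and similarity threshold are illustrative assumptions, not any bank's actual rules; production systems rely on dedicated entity-resolution tooling and relationship data.

```python
import re
from difflib import SequenceMatcher

# Hypothetical nickname map and suffix list; real pipelines use curated
# reference data maintained by the data governance function.
NICKNAMES = {"gene": "eugene", "bill": "william", "bob": "robert"}
SUFFIXES = {"llc", "inc", "jr", "sr"}

def normalize(name: str) -> str:
    """Lowercase, strip punctuation and entity suffixes, expand nicknames."""
    tokens = re.sub(r"[^a-z\s]", " ", name.lower()).split()
    return " ".join(NICKNAMES.get(t, t) for t in tokens if t not in SUFFIXES)

def likely_same_party(a: str, b: str, threshold: float = 0.85) -> bool:
    """Flag two name strings as probably the same party for human review."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

print(likely_same_party("Gene Ludwig", "Eugene Ludwig"))      # True
print(likely_same_party("Gene Ludwig", "ABC Advisory, LLC"))  # False: linking an
# owner to a controlled entity requires relationship data, not string matching.
```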

Acquiring another institution — which may have solved problems differently or not at all — further complicates data management. While siloed operations can mask these issues, achieving enterprisewide efficiency and cost savings requires consolidated data.

Historically, some banks could function with subpar data programs because essential information was supplemented by personnel who knew their specific silos and the people and environments that mattered to them. Today, with larger banks, intensified fintech competition and growing reliance on data-driven technology, that is less and less the case.

For many banks facing margin compression, competition and the high cost of data management, prioritizing data governance and control may seem like a luxury. That view can prove penny-wise and pound-foolish. First, banks must collect and manage sensitive customer data, making thoughtful data management crucial to cybersecurity. Additionally, effective data management is essential to defending against compliance violation charges and to identifying emerging risks to the bank's health, from shifts in customer behavior to issues in balance sheet composition.

While the front-end benefits of high-quality data are often recognized, cost pressures frequently lead to suboptimal workarounds. Given the competitive environment and the size and complexity of many banks and their business models, this approach is often counterproductive. Moreover, regulators increasingly expect banks, particularly larger institutions, to have robust data management programs.

The importance of high-quality data architecture is easy to articulate, but actually building that architecture is difficult.

First, executive commitment, backed by the board and budget, is essential to attract top talent and foster business-line collaboration. Existing data definitions will need to be adapted to achieve a bankwide standard, and compromising for the good of the whole is, historically, hard to achieve. Establishing a baseline, setting goals and measuring continuously are crucial for progress; a stylized example of what baseline measurement can look like follows below.
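As a stylized illustration of baseline-setting and continuous measurement, the sketch below scores each field of a customer-data extract for completeness and validity. The file layout, field names and validation rules are assumptions made for the example, not a regulatory or industry standard.

```python
import csv
import re

# Illustrative validation rules; real programs track many more quality
# dimensions (accuracy, timeliness, consistency) per data domain.
RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"\d{8}", v)),        # assumed 8-digit ID
    "tax_id":      lambda v: bool(re.fullmatch(r"\d{2}-\d{7}", v)),  # assumed EIN format
    "open_date":   lambda v: bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", v)),
}

def baseline(path: str) -> dict:
    """Return per-field completeness and validity rates for a CSV extract."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    scores = {}
    for field, is_valid in RULES.items():
        present = [r[field] for r in rows if r.get(field, "").strip()]
        scores[field] = {
            "completeness": len(present) / len(rows) if rows else 0.0,
            "validity": sum(map(is_valid, present)) / len(present) if present else 0.0,
        }
    return scores

# Re-run on every extract and trend the scores; a sudden drop in either
# rate flags a new data-quality issue before it propagates downstream.
```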

A skilled chief data officer, capable of both vision and execution, is critical. Second, even if the road to a top-notch data program is long, developing a road map and starting the journey is essential. Delays exacerbate challenges as new programs, products and data complexities emerge.

Ultimately, a robust data management program is essential for the long-term success of banks, particularly those exceeding $10 billion in assets.
