For banks, data quality, governance, control and utilization are ever more important. A bank is, at its heart, a balance sheet — a set of assets and liabilities. Unlike retailers, wholesalers or professional services firms, banks possess few physical assets. To be without sound data is to be without a sound bank. If you eliminate a bank's ledger book, it cannot track assets and liabilities — and is essentially out of business.
Soon, data quality and governance will be critical to being a highly competitive, top-tier financial institution and to receiving favorable regulatory ratings. For larger institutions, the future of quality data management programs is now. Almost every technology tool, especially those relying on artificial intelligence, is driven by good data. But for too many banks, their data programs are seriously flawed.
For example, in 2013, the Basel Committee on Banking Supervision at the Bank for International Settlements published BCBS 239, its principles for effective risk data aggregation and risk reporting — supervisory expectations that many large banks are still working to meet.
Of course, the strategic value of data governance extends beyond regulatory compliance, but they are intertwined. The recent Synapse bankruptcy and Silicon Valley Bank insolvency highlight the need for robust data management to mitigate risks arising from complex business models, rapid data growth and data velocity. Capital markets events like the Archegos bankruptcy have increased the focus on managing counterparty risk more effectively across investment banking and trading businesses.
Data quality, architecture, governance and interoperability pose significant challenges for banks. Banks collect vast amounts of data from customers, vendors, markets and internal sources, yet much of this data remains siloed and is captured in ways that limit its usefulness. Manual data entry exacerbates these issues, requiring extensive "cleaning" for the data to be useful. While data capture, storage and manipulation practices vary across institutions, the overall problem of dirty and/or idiosyncratic data is pervasive. This limitation hinders data utility within individual business segments and across the bank as a whole.
This is a hard subject to tackle and get right. Even smaller banks with limited business lines struggle with manual data entry and with data that is captured and manipulated inconsistently across systems and vendors. As banks grow and diversify, ensuring data consistency across business lines becomes increasingly complex. Are Gene Ludwig and Eugene Ludwig the same person? What about ABC Advisory, LLC, owned and controlled by Gene Ludwig?
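To make the matching problem concrete, here is a minimal sketch of how a bank's systems might flag two customer records as likely referring to the same party. The nickname map, entity suffixes and similarity threshold are all hypothetical; a production entity-resolution program would weigh addresses, tax IDs and ownership links, not names alone.

```python
# Minimal illustration of the entity-resolution problem: deciding whether
# two customer records refer to the same party. All rules here are
# hypothetical simplifications.
from difflib import SequenceMatcher

NICKNAMES = {"gene": "eugene"}  # hypothetical nickname expansion map


def normalize(name: str) -> str:
    """Lowercase, strip punctuation and entity suffixes, expand nicknames."""
    tokens = name.lower().replace(",", " ").replace(".", " ").split()
    tokens = [t for t in tokens if t not in {"llc", "inc", "co"}]
    return " ".join(NICKNAMES.get(t, t) for t in tokens)


def likely_same_party(a: str, b: str, threshold: float = 0.85) -> bool:
    """Fuzzy comparison of normalized names against a similarity threshold."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold


print(likely_same_party("Gene Ludwig", "Eugene Ludwig"))      # True
print(likely_same_party("Gene Ludwig", "ABC Advisory, LLC"))  # False
```

Even this toy version shows why the problem compounds with scale: every new system, vendor or acquired institution brings its own naming conventions, and the normalization rules must be reconciled bankwide.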
Acquiring another institution — which may have solved problems differently or not at all — further complicates data management. While siloed operations can mask these issues, achieving enterprisewide efficiency and cost savings requires consolidated data.
Historically, some banks could function with subpar data programs because essential information was supplemented by bank personnel with knowledge of their specific silos and of the people and environments important to them. Today, with larger banks, intensified fintech competition and the growing reliance on data-driven technology, this is less and less the case.
For many banks facing margin compression, competition and the high cost of data management, prioritizing data governance and control may seem like a luxury. However, this can turn out to be penny-wise and pound-foolish. First, banks must collect and manage sensitive customer data, making thoughtful data management crucial to cybersecurity. Additionally, effective data management is essential to defending against compliance violation charges and identifying emerging trends that can threaten the bank's health, from customer trends to issues relating to balance sheet composition.
While the front-end benefits of high-quality data are often recognized, cost pressures frequently lead to suboptimal workarounds and solutions. This approach can be counterproductive due to the competitive environment, the complexity of the bank's business model and/or the size and complexity of the bank itself. Moreover, regulators increasingly expect banks, particularly larger institutions, to have robust data management programs.
The importance of high-quality data architecture is easy to articulate, but actually building that architecture is difficult.
Executive commitment, backed by the board and budget, is essential to attract top talent and foster business-line collaboration. Existing definitions will need to be adapted to achieve a bankwide standard. Compromising for the good of the whole is, historically, hard to achieve. But establishing a baseline, setting goals and continuous measurement are crucial for progress.
A skilled chief data officer, capable of both vision and execution, is critical. And even if the road to a top-notch data program is long, developing a road map and starting the journey is essential. Delays only exacerbate challenges as new programs, products and data complexities emerge.
Ultimately, a robust data management program is essential for the long-term success of banks, particularly those exceeding $10 billion in assets.