With the continued digitization of financial services through new product innovation and data recordkeeping, regulators have made some attempts to use technology to keep pace. But they have a long way to go.
As President-elect Trump forms a new government, he should consider establishing an executive-level office to inform his administration on ushering in a new era of financial regulation — one that is much savvier about new technology and big data.
The Office of Financial and Regulatory Technology would be the center of research and innovation to build links between evolving technologies and new regulations. Transparency in near-real time would be its end objective so that regulators charged with overseeing financial institutions can in fact see what’s going on behind the scenes.
Regulatory oversight would take on new meaning: financial institutions’ computers would send transaction data to regulators’ computers in real time, and the algorithms analyzing that data would get smarter each day as big-data patterns emerge.
To some extent, the regulatory agencies have tried to gain this type of capability, but to date they have failed miserably. Immediate benefits from this new agency would come from the interplay of regulators and technologists aiming to set more attainable and cost-effective goals for new regulations. Both regulators and financial institutions would gain from standard data sets for transmitting financial transaction data: uniform records that identify, in the same way everywhere, the product traded, the parties to the trade and the terms of the transaction.
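To make the idea concrete, here is a minimal sketch, in Python, of what one such uniform transaction record might look like. The field names and the JSON wire format are illustrative assumptions for this article, not an existing regulatory standard.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

# Hypothetical illustration: a uniform record every institution would
# populate the same way before transmitting it to a regulator.
@dataclass
class TransactionReport:
    reporting_entity_id: str   # standard identifier for the reporting institution
    counterparty_id: str       # standard identifier for the counterparty
    instrument_id: str         # standard identifier for the product traded
    trade_timestamp: str       # ISO 8601 timestamp, UTC
    notional: float
    currency: str
    venue: str

    def to_json(self) -> str:
        """Serialize to a wire format a regulator's systems could ingest directly."""
        return json.dumps(asdict(self))

# Example: one report, ready to stream to a supervisory endpoint in near-real time.
report = TransactionReport(
    reporting_entity_id="BANK-001",
    counterparty_id="FUND-042",
    instrument_id="US-EQUITY-XYZ",
    trade_timestamp=datetime.now(timezone.utc).isoformat(),
    notional=1_000_000.0,
    currency="USD",
    venue="NYSE",
)
print(report.to_json())
```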
Today, almost all regulation is implemented through computer code in the digital factory that underpins most of finance. Risk management and data management, the two pillars of a digitized financial system, must be understood through the technologists’ lens or we will forever be creating more of the same — meaningless data collected by yet another layer of intermediary data institutions for yet another regulatory purpose.
We see this occurring with attempts to implement well-intentioned financial reforms adopted in the immediate aftermath of the financial crisis that now appear destined to fail. A consolidated audit trail for stocks and options, meant to help prevent flash crashes, is about to be approved by the Securities and Exchange Commission, possibly costing billions. At best, it will deliver data for analysis a week after an adverse event has occurred. It will also be missing critical data from futures markets, even though there is universal agreement that these markets are interrelated, react to each other in real time and collectively caused the first flash crash, in 2010.
The global rethink of derivatives regulation is mired in data on billions of transactions that have already been sent to regulators but are neither accessible nor machine-readable. Nearly 100 new data intermediaries produce, process, store and aggregate this data and deliver it, still dysfunctional, to global regulators. It is no wonder that regulators are failing both at local oversight and at analyzing global systemic risk.
The onerous Volcker Rule, meant to prevent banks from using the capital of federally insured depository institutions to make big bets for their own trading businesses, requires legacy trading systems to be adapted to perform technology miracles they were never designed for.
Finally, there is the issue of "too big to fail."
As the regulators try to deconstruct the most complex financial institutions through the "living wills" process — in order to plan out how a bank could be resolved in failure — an agency focused on the technological aspects of regulation would do a better job of understanding banks’ technology blueprints.
Too little is still known about how these giants were assembled and how they interoperate, leading many to wonder how they could ever be broken up. A living will requires the drafter to have a full inventory of assets, liabilities and organizational components. It must also contain an inventory of internal systems and their interconnections, as well as external entanglements with outside facilities operators and infrastructure organizations.
Without such a technology blueprint for breaking up these financial behemoths, regulators may inadvertently pull the wrong brick or tug the wrong pipe and topple the whole edifice. Understanding the internal workings of these firms is made no easier by the fact that the global project to create data identifiers for products and participants is languishing.
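One piece of that identifier project does already exist for participants: the Legal Entity Identifier, a 20-character code whose last two characters are check digits under the ISO 7064 MOD 97-10 scheme. The short Python sketch below shows how mechanically simple the validation is; it illustrates the check-digit rule only, not production code.

```python
def lei_checksum_ok(lei: str) -> bool:
    """Validate an LEI's check digits per ISO 7064 MOD 97-10.

    Letters are mapped to numbers (A=10 ... Z=35), the result is read as
    one large integer, and the remainder modulo 97 must equal 1.
    """
    lei = lei.strip().upper()
    if len(lei) != 20 or not lei.isalnum():
        return False
    digits = "".join(str(int(ch, 36)) for ch in lei)  # int('A', 36) == 10 ... int('Z', 36) == 35
    return int(digits) % 97 == 1

# Example with a sample LEI from the public GLEIF registry (expected: True).
print(lei_checksum_ok("506700GE1G29325QX363"))
```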
But more to the point, no regulator today is able to process and analyze the immense and ever-growing volume of financial transaction data that institutions send every day. That data should be analyzed in a timely manner for risk, compliance, fraud and other regulatory oversight mandates. Valuable insights are being missed because no single regulator is looking for the patterns of misbehavior that big-data analysis could reveal.
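What might such pattern analysis look like in practice? As a deliberately simplified illustration, the Python sketch below flags firms whose latest daily traded notional spikes far above their own history; the three-standard-deviation threshold and the data layout are assumptions made for the example, not a supervisory rule.

```python
from statistics import mean, stdev

def flag_outliers(daily_notional_by_firm: dict[str, list[float]], z_threshold: float = 3.0) -> list[str]:
    """Return firms whose latest day's notional is more than z_threshold standard
    deviations above their historical mean -- a crude stand-in for the kind of
    pattern-of-misbehavior screen a data-savvy regulator could run."""
    flagged = []
    for firm, history in daily_notional_by_firm.items():
        if len(history) < 3:
            continue  # not enough history to judge
        past, latest = history[:-1], history[-1]
        mu, sigma = mean(past), stdev(past)
        if sigma > 0 and (latest - mu) / sigma > z_threshold:
            flagged.append(firm)
    return flagged

# Example: firm B's sudden spike gets flagged; firm A does not.
data = {
    "FIRM-A": [10.0, 11.0, 9.5, 10.5, 10.2],
    "FIRM-B": [10.0, 10.5, 9.8, 10.1, 55.0],
}
print(flag_outliers(data))  # ['FIRM-B']
```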
Early objectives for the Office of Financial and Regulatory Technology under the Trump administration would include setting uniform standards for financial transactions and uniform protocols for financial networks; embedding links between internal financial systems and external regulatory systems; eliminating the huge, duplicative infrastructure costs in multiple regulatory agencies’ data and technology budgets; and, most important, setting the vision and timetable for the use of distributed ledger technologies by these agencies and the financial institutions they oversee.
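On the distributed ledger point, the essential idea can be shown in a few lines: each record commits to a hash of the record before it, so any participant sharing the ledger, including a regulator, can detect after-the-fact tampering. The Python sketch below illustrates only that chaining mechanism, under simplifying assumptions, not any particular ledger technology.

```python
import hashlib
import json

def make_entry(prev_hash: str, payload: dict) -> dict:
    """Create a ledger entry that commits to the previous entry's hash."""
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return {"prev": prev_hash, "payload": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def chain_is_intact(entries: list[dict]) -> bool:
    """Verify every entry still hashes correctly and links to its predecessor."""
    prev = "GENESIS"
    for e in entries:
        body = json.dumps({"prev": e["prev"], "payload": e["payload"]}, sort_keys=True)
        if e["prev"] != prev or hashlib.sha256(body.encode()).hexdigest() != e["hash"]:
            return False
        prev = e["hash"]
    return True

# Example: build a two-entry ledger, then tamper with it.
ledger = [make_entry("GENESIS", {"trade": 1, "notional": 1_000_000})]
ledger.append(make_entry(ledger[-1]["hash"], {"trade": 2, "notional": 250_000}))
print(chain_is_intact(ledger))          # True
ledger[0]["payload"]["notional"] = 5    # alter history
print(chain_is_intact(ledger))          # False
```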
Like a master builder, the new president-elect has the opportunity to work alongside financial institutions to chart a vision and lay the foundation for our new financial system in the digital age.
Allan D. Grody, the president of Financial InterGroup Holdings Ltd., is a 50-year veteran of the financial industry. His work and writings focus on the intersection of risk, data and technology.