Forget big data. Just collect "smart" data.
That's the principle guiding David Saul, senior vice president and chief scientist of State Street Corp.
The challenge is to identify and pull in data that allows a financial firm to calculate the risks it faces in its securities positions and its exposures to different counterparties. This is not something solved with "technology alone," Saul said at the 2013 Americas Operations Forum of the Society for Worldwide Interbank Financial Telecommunication, or Swift.
The effort has to be combined with industry standards on formatting of the key data and clear understanding of regulatory requirements.
But it also requires "robust data governance," Saul said. This includes clear identification inside an organization of:
• Who owns the data
• Who's responsible for the data
• How it will get aggregated
• How it will get delivered
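The governance questions above have a concrete counterpart in code: exposure records that carry explicit ownership metadata, rolled up by counterparty. A minimal sketch in Python — the record fields, desk names and amounts are all hypothetical, not State Street's actual data model:

```python
from collections import defaultdict

# Hypothetical position records. The "owner" and "steward" fields capture
# the governance questions of who owns and who is responsible for the data.
positions = [
    {"counterparty": "BANK-A", "notional": 25_000_000, "owner": "rates-desk", "steward": "data-office"},
    {"counterparty": "BANK-B", "notional": 10_000_000, "owner": "fx-desk", "steward": "data-office"},
    {"counterparty": "BANK-A", "notional": 5_000_000, "owner": "credit-desk", "steward": "data-office"},
]

def aggregate_exposure(records):
    """Roll up notional exposure by counterparty (the 'how it will get aggregated' step)."""
    totals = defaultdict(int)
    for rec in records:
        totals[rec["counterparty"]] += rec["notional"]
    return dict(totals)

print(aggregate_exposure(positions))
# {'BANK-A': 30000000, 'BANK-B': 10000000}
```

The point of the metadata fields is that the aggregation step can be audited: every number in the rolled-up total traces back to a record with a named owner.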
"We shouldn't be thinking of regulation as a burden. We should be thinking of it as good business,'' said Lee Fulmer, managing director for cash management at J.P. Morgan. "The question is how do we give answers to regulators in a useful form that is also useful to us."
The need to create useful data rather than just lots of data comes as large global institutions face expenditures ranging from $150 million to $350 million each to comply with new post-credit crisis regulatory requirements in the United States, Europe and elsewhere.
That is "significantly larger" than the level of expenditures required previously for complying with Sarbanes-Oxley Act, Markets in Financial Instruments Directive and Basel II requirements, from before the crisis, said Javier Perez-Tasso, head of marketing at SWIFT.
The challenge, Perez-Tasso said, will be for firms to effectively pull together the correct "smart" data from many systems across divisions, subsidiaries and geographies.
That lack of integration has to be overcome, he said. One way of doing so, Saul said, will be Web semantics.
Web semantics promoted by the World Wide Web Consortium (W3C) are designed to create common data formats throughout the Web.
The aim is to allow data to be shared and reused across applications, different parts of a company and different organizations, as needed.
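At the core of the semantic-web model, data is expressed as self-describing subject-predicate-object triples, so any application can query it without knowing the producing system's database schema. A toy sketch in Python — the `ex:` namespace, identifiers and query helper are illustrative, not a real RDF library or vocabulary:

```python
# Toy triple store: each fact is a (subject, predicate, object) triple,
# self-describing enough that different applications can reuse it
# without sharing a database schema.
triples = [
    ("acme:Trade42", "ex:hasCounterparty", "lei:BANK-A"),
    ("acme:Trade42", "ex:hasNotionalAmount", "25000000 USD"),
    ("lei:BANK-A", "ex:isDomiciledIn", "country:GB"),
]

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the pattern; None acts as a wildcard."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Any consumer, in any part of the firm, can ask the same question the same way:
print(query(predicate="ex:hasCounterparty"))
# [('acme:Trade42', 'ex:hasCounterparty', 'lei:BANK-A')]
```

Because the predicate names travel with the data, a risk system, a compliance system and a counterparty's system can all interpret the same facts without bespoke integration work.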
For global financial institutions, a key initiative for creating the smart, reusable data is the creation of the Financial Industry Business Ontology (FIBO), Saul said.
This initiative is being headed by two industry organizations, the Object Management Group and the Enterprise Data Management Council.
The immediate aim is to capture data that will allow firms to report to regulators, in compliance with the Dodd-Frank Wall Street Reform Act in the United States and similar post-2008 rules worldwide.
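One practical expression of "smart," reusable data is tagging each reportable record with standard identifiers, such as a Legal Entity Identifier (LEI), and serializing it in a common interchange format. An illustrative sketch — the field names and the LEI value are hypothetical, not an actual regulatory schema:

```python
import json

def build_report_record(trade_id, counterparty_lei, notional, currency):
    """Package a trade as a self-describing record keyed by a standard entity identifier."""
    return {
        "trade_id": trade_id,
        "counterparty_lei": counterparty_lei,  # 20-character LEI (hypothetical value below)
        "notional": notional,
        "currency": currency,
    }

record = build_report_record("T-1001", "EXAMPLELEI0000000001", 25_000_000, "USD")
print(json.dumps(record))
```

Because the counterparty is identified by a standard code rather than a firm-internal name, the same record can serve the regulator's report and the firm's own risk aggregation.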
Technical developments such as cloud computing, standardized hardware, machine learning and the ability to hold large amounts of key data in the usable memory of a machine or network of machines are making the collection of data easier and more cost-effective, according to Ulrich Homann, chief architect of Worldwide Enterprise Services at Microsoft Corp.
Cloud computing makes a wide variety of technologies and services accessible and less expensive, he said. And the commoditization of data servers and other computing hardware means operating networks of 5,000, 10,000 or even 50,000 machines is "doable technically," he said. The result: cost savings and, at the same time, real-time risk management.
Greater use of encryption, applied automatically to e-mails and to stored data, also will help enforce risk-management policies, he said.
Swift operates a global network that allows financial institutions worldwide to send and receive information about trades and transactions.