Threats have moved from stealing card numbers and online banking credentials to more lucrative attacks on entire networks, increasing risk not only to ATMs, but also to electronic payment networks.
Case in point: the attack on Bangladesh’s central bank, in which intruders gained access to the SWIFT messaging system and issued fraudulent transfer orders, was one of the biggest digital heists on record. More recently, a hacking spree hit European ATMs, with attackers remotely accessing machines, including NCR terminals, in more than a dozen countries and forcing them to spit out cash.
Risk is nothing new to the banking industry. Operational risk, the risk of loss from inadequate or failed internal processes, controls, people and systems, or from external events, has attracted a lot of attention over the years. That is mainly because it incorporates many of the major issues facing the sector today, including systems, cybersecurity, infrastructure, organizational issues, regulatory compliance and data breaches. In recent years, banks around the world have been fined billions of dollars for compliance failures and poor oversight of their practices.
Global legislatures are putting further pressure on banks to digitize. Most recently, the U.K.’s regulators have mandated open banking, requiring banks to expose customer data and payment services through standardized APIs.
However, despite tightening regulations, there are still major technological challenges that legislators fail to address. Technology risk is obviously a big part of operational risk, yet it is less well understood by regulators. And when you get down to system risks, the most poorly understood is certainly the software layer. Today, software risk is one of the major sources of profit loss and security exposure for banks. Software is also the backbone of any payment system: if you have bad software, you are going to incur a lot of risk.
Shareholders and regulators have been so focused on operational considerations that they have largely failed to ascertain whether the IT infrastructure and software quality of banking systems are actually capable of supporting the pace of financial innovation. A much higher degree of technical excellence and software engineering rigor is needed to underpin our financial system.
A global, standardized approach, such as the software quality measures developed by the Consortium for IT Software Quality (CISQ), would give banks and regulators a common benchmark for assessing the structural quality of the systems that underpin critical financial services.
Both legacy and new banks will need to ensure checks and balances, including system-level software analysis. For legacy financial institutions focused on ‘keeping the lights on,’ modernization efforts can be costly and time-intensive. Newly established banks, which can get up and running very quickly using modern technologies and API-driven development in the cloud, will still need to prove that their applications are robust, secure and scalable. Of course, they don’t carry the cost and complexity of legacy systems. But because they deliver more discrete services, the same regulatory and operational hurdles will apply.
As part of their business operating model, new banks are adopting an approach where they buy specific services rather than build everything themselves; hence, in parallel, the rise of ‘fintech’ and ‘regtech’ providers. Big banks, which are effectively software and technology companies, need to run robust infrastructure. Because that infrastructure is largely owned in house, its complexity now accounts for a significant share of operating budgets and restricts change. The new ‘challenger’ approach is one where organizations don’t own their own technology. This presents new obstacles for oversight and risk management, because critical software is built and operated by third parties the bank does not directly control.
Successful institutions have to strike a delicate balance between regulatory requirements and their technical capacity, while continuing to expand their capabilities. By analyzing their software at the system level against the most advanced standards, banks can better measure and manage the operational risk in their complex technology infrastructures. Those that master this process will be the ultimate winners; that is the real challenge for today’s banks.