DIALOG with IBM's Shanker Ramamurthy: Moving toward smart analytics
Better tools—and better use of them—will help banks address efficiency, risk, and trust challenges
October 13, 2010
ABABJ met recently with Shanker Ramamurthy, general manager of Banking and Financial Markets for IBM at the tech giant’s research center in Hawthorne, N.Y. The writer was given a tour of the facility, then sat down with the executive to discuss key issues and IBM initiatives in the banking sector.
Widely quoted as a thought leader, Shanker was recently ranked by Euromoney magazine as one of the 50 most influential financial services consultants worldwide. He has over 20 years of consulting experience and has held leadership positions within IBM and PricewaterhouseCoopers. A qualified accountant with an undergraduate degree in mathematics, Shanker also holds an MBA in Finance from the Indian Institute of Management and a Master's in Information Science from the University of New South Wales in Australia. With these credentials he is well positioned to speak about that odd and complicated intersection of IT and finance. The following is an edited recounting of the conversation.
What, in IBM’s view, have been key trends for banks and financial market firms this year?
There are three things. First, both banks and Wall Street firms need to simplify business processes. Next, they need to address risk and regulation more efficiently, and finally, perhaps most importantly, they need to rebuild customer trust. These are not just challenges of the moment; they are areas that financial institutions will be dealing with for some time to come, whether the bank is a community bank or a global behemoth, a start-up or a mature player. And they also apply, albeit a bit differently, whether a bank operates in mature markets [G7 countries], emerging markets, or less developed countries. In mature markets, there is pressure coming from all the loss of wealth, which has put a squeeze on profit margins. The financial ecosystem is experiencing a reversion to the mean in terms of size and profitability, resulting from the financial crisis and the current economy.
Conditions are improving but they still can’t be considered entirely good.
That’s right. When times are good, you can afford complexity—duplication across channels, geographies, and lines of business; variance in business processes; inaccurate data; and so on. When times are not good—like now—you need to be leaner and more efficient. And so, we help our customers address the key areas I referenced. For instance, many banks are moving toward a shared-services model of computing, collapsing applications onto fewer large-scale computers in order to reduce their IT spend.
Two years after the financial crisis broke, books are still being published on variations of the theme, “why things went wrong.” In that spirit, then, we ask, why were such key signals in the capital markets missed given all the analytics in use?
It’s a good question. Of course, it’s not just a technology issue or the effectiveness of a particular application. Addressing risk accurately is much broader—embracing cultural and even systemic factors. The simple answer is, data and analytical models are incomplete and only as good as the model makers. Another simple answer is, the data has to be evaluated properly and acted on in a timely manner.
However, let me offer another way of responding because I’m interested in theories of market performance: Much of the sense that analytics then used somehow missed the mark goes back to the active dialogue that’s still occurring about the efficient market hypothesis versus the fat tail, unexpected-events view of the world. You can find this debated often enough in the blogosphere.
So, proper model building gets back, in some sense, to figuring out how markets work?
Sure, because your view on markets affects your [model’s] assumptions. University research in market theory these days tends to feature economists who are either mostly Fresh-water in their belief system, or mostly Salt-water. Fresh-water economists—so named in the 1970s and originally coming out of the University of Chicago near the Great Lakes—believe that, with enough data on how individuals make decisions under conditions of uncertainty, macro conditions can be identified and that many imbalances will correct themselves without undue government intervention.
The alternate view of the so-called Salt-water economists contends that the real world [and economic activity as a subset] is more discontinuous than the picture of it suggested by those who believe in the efficient market hypothesis and a normal growth curve world.
Where does the behavioral school fit in? That states that individuals within market economies aren’t at all rational about purchasing decisions, which would suggest that models are, if not useless, then, well, diminished in their predictive ability.
Behavioral ideas are gaining a lot of currency in economics and much work is being done to integrate this view into a grand overarching view of the world.
You see, the Fresh-water school is generally right. It’s just that every now and then, events happen that you just could not easily, or perhaps ever, have predicted.
IBM Research Fellow Emeritus Benoit Mandelbrot, who pioneered fractal geometry and used IBM computing power as part of his research, observed back in the early 1960s that random [financial] events happen that are not in accordance with the efficient market hypothesis. Incidentally, Mandelbrot was a mentor of Nassim Nicholas Taleb, who wrote the popular book The Black Swan, which looks at the impact of the highly improbable.
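The contrast between the normal-curve world and Mandelbrot's fat-tailed one can be made concrete with a short simulation (an illustrative sketch only, not from the interview): sample daily "returns" from a normal distribution and from a heavy-tailed Student's t distribution with the same volatility, then count how often extreme "5-sigma" moves occur under each.

```python
import random
import math

random.seed(42)
N = 100_000  # number of simulated trading days

# Normal-model "returns" with 1% daily volatility.
normal = [random.gauss(0, 0.01) for _ in range(N)]

def student_t(df):
    """Draw one Student's t variate: a standard normal divided by
    the square root of a chi-square (sum of squared normals) over df."""
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

# Fat-tailed "returns": t with 3 degrees of freedom, rescaled so the
# standard deviation also equals 1% (Var of t(df) is df/(df-2)).
df = 3
scale = 0.01 / math.sqrt(df / (df - 2))
fat = [scale * student_t(df) for _ in range(N)]

def extremes(xs, sigma=0.01, k=5):
    """Count days whose move exceeds k standard deviations."""
    return sum(1 for x in xs if abs(x) > k * sigma)

# Under the normal model 5-sigma days are essentially impossible;
# under the fat-tailed model they show up routinely.
print("5-sigma days, normal model:", extremes(normal))
print("5-sigma days, fat-tailed model:", extremes(fat))
```

With identical volatility, the heavy-tailed model produces hundreds of "impossible" days in the same sample where the normal model produces essentially none—the gap Mandelbrot pointed to.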
So, a lot of theoretical work remains. On a practical level, what needs to happen?
Well, banks need to address the simplification, risk, and trust issues I first mentioned. As global operations become increasingly interconnected, complexity will only increase and models will also need to incorporate the new learnings on that complexity, and better manage the technology-specific type of complexity that comes with interconnection.
Basically systems need to be able to handle the types of non-linear fat-tail events that will most likely happen with increasing frequency in the future. In order to manage this complexity effectively, not only must banks have sophisticated models, but management must also truly understand the limitations and assumptions that are built into these models so that judgment can be more effectively interwoven into decisions relating to risk management.
Analytic enhancements, then, are important for financial institutions this year?
Yes. To derive value from these systems, there are three classes of problems that banks need to address. First, more than ever, institutions must get an integrated view of information, first within domains of credit risk, market risk, and operational risk, then in some way that takes all of it into account and plays on inter-relationships between various types of risk. They will need to do this, in part, because the regulators will require it. But they will also need to do it to become smarter in their risk management.
Data integration alone is something banks have been struggling with for years.
Absolutely, the more organizational silos a bank has, the more products, the greater the geographic range... well, effectively coordinating all this information requires tremendous discipline in business process. Each of the discrete operations must be integrated and monitored in some way.
Bankers know this, of course. But it’s a matter of deploying systems and methods that can pick up patterns that can be acted on, without being too rigid, time consuming, and expensive. At the same time, our consulting business addresses the organizational complexity that banks have and helps them simplify and standardize.
You mentioned three IT related problem areas or challenges and we’ve touched on the first.
The second broad area is simply this: the simulations and the complex mathematics you need to work out these contingencies require substantial computing power. This need for more capacity, IBM believes, will drive the push toward the cloud computing model, which is another big theme for us this year and something our banking customers are asking us about. Bankers like the capacity and they like the cost control that this approach can potentially offer.
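A sense of why these simulations demand capacity can be had from a toy Monte Carlo value-at-risk calculation (illustrative only—the portfolio figures and parameters here are invented, and real risk engines run far larger models across thousands of positions and risk factors):

```python
import random

random.seed(7)

def simulate_var(portfolio_value, mu, sigma, horizon_days,
                 n_paths, confidence=0.99):
    """Monte Carlo VaR: simulate many profit-and-loss paths and
    report the loss exceeded in only (1 - confidence) of scenarios."""
    pnls = []
    for _ in range(n_paths):
        value = portfolio_value
        for _ in range(horizon_days):
            # Compound one simulated day's return.
            value *= 1 + random.gauss(mu, sigma)
        pnls.append(value - portfolio_value)
    pnls.sort()
    # The 1st-percentile P&L, reported as a positive loss figure.
    return -pnls[int((1 - confidence) * n_paths)]

# A hypothetical $1M portfolio, 1% daily volatility, 10-day horizon:
# even this toy needs 200,000 simulated daily returns.
var = simulate_var(1_000_000, mu=0.0, sigma=0.01,
                   horizon_days=10, n_paths=20_000)
print(f"10-day 99% VaR: ~${var:,.0f}")
```

Tightening the confidence level or adding correlated risk factors multiplies the path count quickly, which is the capacity pressure that makes on-demand cloud computing attractive.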
Cloud, of course, is a hot technology topic. But you definitely see interest from bankers in terms of having more computing power to enhance analytics?
Absolutely. The cloud approach is also an innovative use of infrastructure, because if done correctly it can be set up to be consumed as needed without many of the IT managerial issues.
And the third IT challenge?
The third issue has to do with enabling banks to make real-time decisions while coping with the ever-accelerating explosion in the volume of information that must be analyzed. Stream computing—another big theme at IBM—lets business users make split-second decisions about information coming in from all sides: Internet channels, branch, call center, and so on. Users need to figure out how to monitor all that data.
Another big issue is, how do you model it, or make decisions on it, when this has to be done within minutes—or even faster? The computing paradigm needed to do that is quite different from time series and even asynchronous traditional methods of computing.
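The difference in paradigm can be sketched in a few lines (a toy illustration only, not IBM's stream-computing technology): rather than storing data and querying it later, the program scores each event against a sliding window of recent history the moment the event arrives.

```python
from collections import deque

class StreamMonitor:
    """Toy streaming monitor: keeps a sliding window of recent values
    and flags any new value far outside the window's recent behavior."""

    def __init__(self, window=50, threshold=3.0):
        self.window = deque(maxlen=window)  # old values fall off automatically
        self.threshold = threshold

    def observe(self, value):
        """Process one incoming event; return True if it looks anomalous."""
        flagged = False
        if len(self.window) >= 10:  # wait for some history first
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = var ** 0.5 or 1e-9
            flagged = abs(value - mean) > self.threshold * std
        self.window.append(value)
        return flagged

# Feed a stream of ordinary transaction amounts with one outlier.
monitor = StreamMonitor()
stream = [100 + (i % 7) for i in range(100)] + [5000] + [103]
alerts = [i for i, v in enumerate(stream) if monitor.observe(v)]
print("alerts at positions:", alerts)  # prints: alerts at positions: [100]
```

The decision is made per event, as it arrives, with bounded memory—no batch query over a stored history—which is what distinguishes this from the time-series and traditional asynchronous approaches mentioned above.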
At the start of our talk, you mentioned simplification. Can you give me some examples?
We have been working with several banks that are members of the Operational Risk Exchange (ORX), founded in 2002. These banks share data, which is submitted anonymously and enables data mining and enhanced operational risk decisioning. [Today, as posted on the website of the ORX Association, the database contains 170,081 loss events.] Basically, you can get a more accurate risk picture and do so with a shared-cost model. IBM Research helped to create the risk models used by the group.
Is that something that will be repeated elsewhere?
Yes. We believe that shared utilities for risk and fraud will become far more common—even in the U.S. where there are so many banks—due to regulatory demand, or because it will prove to be the only way to address these significant challenges cost effectively.
You also spoke earlier about rebuilding trust in financial institutions. Your thoughts?
In developed countries, banks need to further refine their efforts at being “customer centric”—having a well-organized view of each customer and acting on what’s known—rather than using a product-based model. Making this migration requires a lot of database work and channel integration to create an environment that helps bankers “connect all the dots” and make intelligent offers.
In developing countries we’ve been helping bankers address the problem of “financial inclusion.” In places where people don’t have experience with financial institutions, it’s developing rapport with something unknown. In emerging markets, we’re doing amazingly innovative work on the back of mobile devices—which people already have familiarity with in other contexts—to provide basic banking services. It’s more about putting the right technology in the right place, and putting the right type of delivery model together, whether it’s branches or ATM or so on.
Right now, in terms of mobile financial services, the most interesting example is in Kenya. IBM has been actively involved in helping telcos, banks, and other industry participants bring banking to the unbanked in Kenya and in several other emerging markets.
[This article was posted on October 13, 2010, on the website of ABA Banking Journal, www.ababj.com, and is copyright 2010 by the American Bankers Association.]