In this era of hyper-regulation, economic ennui, and desperate revenue requirements, the potential of Big Data is something like the holy grail: extremely desirable yet tantalizingly out of reach.
It's not that bank and other business leaders doubt the advantages that advanced analysis of data derived from customer interactions and other sources could bring to the bottom line. It's just that such analysis can be very hard to achieve, at least in an economical way.
IBM's Sushil Pramanick, writing in that company's IBMbigdatahub.com blog, says: "While analytics aren't exactly new to the world of banking, plenty of banks are gearing up for their next big analytics push, propelled by a load of data and new, sophisticated tools and technologies. Why has business analytics jumped to the top of the priority list for banks? Pick a reason. Regulatory reform, managing risk, changing business models, expansion into new markets, a renewed focus on customer profitability: any one of these is reason enough for many banks to reconsider what today's analytics capabilities can offer."
He outlines succinctly what Big Data's promises and pitfalls are:
• Promises: "By applying data mining and predictive analytics to extract actionable intelligent insights and quantifiable predictions, banks can gain insights that encompass all types of customer behavior, including channel transactions, account opening and closing, default, fraud, and customer departure."
• Pitfalls: "Implementing business analytics across the bank can be a daunting and potentially expensive prospect, due to: Complex, heterogeneous technology architectures; operationally optimized but siloed processes and systems; data fragmented across multiple databases; constrained investment budgets with competing agendas; lack of skilled resources; [and] perception that the data available is of insufficient quality to support analysis."
Other analysts say much the same thing. Pitney Bowes Software recently surveyed hundreds of senior executives across industries and concluded that "the journey from Big Data to business value has just begun."
When asked about their biggest challenge in extracting value from Big Data, the most-selected response, at 35%, was that there is too much data and too few resources. In response to another question, 38% said that the lack of analytics capabilities and skills is the biggest inhibitor to gaining business value from Big Data.
Nevertheless, the intense pressures financial institutions face now and in the foreseeable future mean that such challenges will have to be overcome. As FICO says in its own recent study, the biggest challenge of the next two years will involve advancing from decisions based on big customer segments to decisions targeted to microsegments and, ultimately, to segments of one. Meeting this challenge will require new customer analysis techniques and automated decision making.
"Customer experience matters; it is how enterprises will differentiate themselves and compete in the ‘instant economy,'" says Stuart Wells, chief technology officer at FICO. "It is no longer adequate to blindly blast marketing messages to millions of consumers, or to take months to incorporate insights. Customers expect instant, relevant interactions from the companies they do business with, or else they will take their business elsewhere. This is driving a wave of investment in not only analytics but the decision management platforms needed to foster an intimate customer relationship."
This gap between the increasingly intense demand for analytic services and the stubborn difficulty of attaining them has not been overlooked. Analysts note that efforts are underway, both internally within companies and externally among third-party providers, to find ways to make Big Data work.
Rainstor, which provides database solutions for Big Data, says the focus for IT this year will be to provide "scalable, high-performance analytics at the lowest cost, as business users demand continuous access to both current and historical data."
"The increasing need to leverage Big Data for competitive advantage, along with the rise in innovative product offerings, will completely change the way customers store, manage, and analyze their most critical asset: data. Enabling enterprise customers to manage petabyte-scale data environments in a much more efficient and cost-effective way is what is now required. Over the next 12 months, Big Data will take the spotlight because it is on every company's short list."
In a similar vein, Gartner outlines three key predictions in this area:
• By 2015, 65% of packaged analytic applications with advanced analytics will come embedded with Hadoop.
[Hadoop, named after a toy elephant belonging to its inventor's son, is an open-source software framework that supports data-intensive distributed applications and is currently the leading-edge facilitator of Big Data analysis in use.]
"Technology providers will benefit by offering a more competitive product that delivers task-specific analytics directly to the intended role, and avoids a competitive situation with internally developed resources," says Bill Gassman, research director at Gartner.
• By 2016, 70% of leading business intelligence vendors will have incorporated natural-language and spoken-word capabilities. In other words, instead of relying on old-school point-and-click, or even drag-and-drop, computer interfaces, users will be able to get information simply by asking for it out loud, similar to the personal assistant available on today's iPhones.
• By 2015, more than 30% of analytics projects will deliver insights based on structured and unstructured data.
"Organizations are exploring and combining insights from their vast internal repositories of content, such as text and emails and increasingly video and audio, in addition to externally generated content such as the exploding volume of social media, video feeds, and others, into existing and new analytic processes and use cases," says Rita Sallam, research vice president at Gartner.
To be sure, technology providers already are making their mark. As one example, chosen simply because it is the most recent, Science Applications International Corp. introduced Critical Insight Solutions, what it calls "an end-to-end suite of Big Data optimization tools and expertise for large organizations looking to bring order and clarity from massive amounts of disparate data in real-time."
Targeting financial services companies, among others, it says the product "allows users to quickly test analytic hypotheses independent of data volume, speed, and structure. This fast-discovery approach provides unprecedented agility in responding to dynamic environments."
No doubt many in the banking industry will consider the appropriateness of such solutions to their own organizations. In the meantime, Pitney Bowes Software offers a few best practices as such considerations are made:
• Demonstrate the business value of every data project or exercise to senior executives.
• Focus spending on staffing with advanced analytics and reporting skills.
• Think about whether centralizing data, data management, and/or data analytics will help deliver business value.
• Create and nurture the discipline of repeating similar data analytics exercises over time, to measure changes in business results and in client or consumer behavior.
About the Author

John Ginovsky is contributing editor of ABA Banking Journal and editor of the publication's TechTopics e-newsletter. For more than two decades he has written about the commercial banking industry. In particular, he's specialized in the technological side of banking and how it relates to the actual business of banking. He previously was senior editor for Community Banker magazine (which merged with ABA Banking Journal) and was a staff writer for ABA's Bankers News. You can email him at firstname.lastname@example.org