Getting real with going virtual (February 2009)
Virtualization, a more efficient way to set up and maintain a raft of IT resources (including storage and server assets), brings a cost-cutting potential that will guarantee it a place on the must-do list of 2009.
By Lauren Bielski, senior editor
Simulating physical IT assets gives banks efficiency
Unless they are laying out expenses for do-or-die topline growth, bank technologists this year will be hunting for ways and means of efficiency, and paying for additional IT paraphernalia only when it saves money.
With these mandates, banks will “go virtual” with the various infrastructure components that buttress their operations.
While other efficiency projects may get the nod (say, adopting a software management application in order to monitor any “hiccups” in processing), the long-emerging virtualization segment is this year’s hot item in the spending-to-save category.
As one indicator of this, the number of virtualized PCs will grow from less than 5 million in 2007 to 660 million by 2011, says Philip Dawson, a Gartner vice-president and analyst.
Generally, “virtualization” refers to methods that decouple, or separate, functional logic from the underlying hardware to better utilize hardware capacity, among other positive by-products.
Alan Murphy, technical marketing manager with Seattle-based application delivery provider F5 Networks, wrote in a white paper: “Today’s use of the term [virtualization] generally refers to any type of process [simplification] where the process is somehow removed from its physical operating environment, and there are many different ways to do this.”
As a quick aside, the technique isn’t exactly new; it harks back to the ’70s, albeit with different hardware and techniques, one bank IT veteran notes.
To get your head around the concept, think of a well-virtualized server environment as one where a warren of many different types of software can co-exist on a single hardware platform without crashing, burning, or ticking off business users with application interference.
It’s all about efficient IT provisioning (setup of the IT environment) that is secure and, because it is centralized, easy to handle, and that still keeps the front office in production.
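To make the co-existence idea concrete, here is a minimal toy sketch in Python. It is not any vendor’s hypervisor API, and all the names and numbers are illustrative; it only models the core notion that one physical box can be carved into isolated guests, each with its own OS and application, with a new “virtual image” provisioned in seconds rather than weeks.

```python
# Toy model of machine virtualization; names and figures are illustrative only.

class Guest:
    """One virtual machine: its own OS and application, isolated from peers."""
    def __init__(self, name, os, app, cpu_share):
        self.name, self.os, self.app, self.cpu_share = name, os, app, cpu_share

class Host:
    """One physical server whose capacity is carved up among guests."""
    def __init__(self, name, cpu_capacity):
        self.name, self.cpu_capacity, self.guests = name, cpu_capacity, []

    def provision(self, guest):
        used = sum(g.cpu_share for g in self.guests)
        if used + guest.cpu_share > self.cpu_capacity:
            raise RuntimeError(f"{self.name}: no headroom for {guest.name}")
        self.guests.append(guest)  # a new "virtual image" in seconds, not weeks

box = Host("mainframe-1", cpu_capacity=100)
box.provision(Guest("vm-billpay", "Linux", "bill payment", cpu_share=30))
box.provision(Guest("vm-teller", "Linux", "teller system", cpu_share=25))
for g in box.guests:
    print(f"{box.name} runs {g.app} on {g.os} ({g.cpu_share}% of CPU)")
```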
Ken Kucera, senior vice-president and CIO of $20 billion-assets First National Bank of Omaha, is an early adopter of virtualization, which has helped him cut costs and upgrade bank applications, including bill payment.
First National’s data center has undergone an overhaul: “We cleaned it up by using virtualization to consolidate our servers,” he says. “We’ve also done network and storage virtualization to have the capacity we need in a simpler environment. Going forward, we’ll use this approach wherever we can.”
When Kucera took over as CIO back in October 2003, he noticed that First National’s habit of loading a single application on a single physical server had left the bank with some 600 “boxes,” as servers are often called, from a slew of manufacturers.
“There was too much complexity,” says Kucera. “I realized we had to provision the data center differently.” These days, instead of taking weeks to order, receive, and set up a physical server, the bank can “put up virtual images” of an application quickly. “We’ve gone from 600 to 80 servers using Linux Virtual Servers running on IBM System z boxes,”
Kucera says. “I’ve been able to reduce staff and operational cost.”
Be careful about security
“While there are many different types of projects you can do with virtualization, banks we spoke with seemed to be most interested in reducing hardware costs in server and storage areas and better managing their infrastructure,” says Susan Cournoyer, managing vice-president, Gartner. “That will definitely be the case for the next few months.”
She says that at a time when bank executives are preoccupied with a swirl of business concerns, in some cases working through sudden mergers or a need to rush double-time through an expense-reduction exercise, virtualization, if done properly, could help them prune their data centers without negatively impacting performance.
“We do have concerns about security connected to these types of projects,” Cournoyer explains. “Although it is complicated to explain, virtualizing without careful security planning can result in exposures. Security must be ‘baked in’ at conception, not added on later.”
Still, as early as last April, before lean operations became the mandate, Gartner predicted that the technique would be a big hit into 2012, crowning new vendors while forcing traditional providers to rethink their offerings.
The recession, of course, may put some drag on that forward motion. But based on what she’s heard from clients, Cournoyer says the technology appears to be counter-cyclical.
Virtualization has so much spend-to-save value because it lets IT staff provision software more flexibly. It’s good for re-engineering.
“This is actually a method that can support better integration of application environments in various bank divisions,” says Cournoyer. “Although, strictly speaking, integration work would likely be a grand project goal and not something anybody is willing to spend on right now, still, the technique and technologies have that sort of potential.”
What bank IT wants
So then, there are the immediate needs of the recession, and there are longer-term possibilities; over time, virtualization will take the industry into a new era. One path is machine virtualization, which decouples the hardware from the operating system (OS). Another is application virtualization, which, in effect, separates the OS from the application layer.
In a research note, Gartner wrote that while application virtualization is gaining considerable interest, it is machine virtualization, with its ability to partition hardware, that will be the bigger deal over time, allowing multiple operating system/application combinations to run on a single device.
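The distinction can be sketched in the same hedged, toy terms as the example above (illustrative Python, not any product’s API): machine virtualization stacks whole OS/application pairs on one device, while application virtualization runs applications in sandboxes that bundle their own dependencies on a single shared OS.

```python
# Toy contrast between the two types; all names are illustrative only.

# Machine virtualization: one device, several full OS/application stacks.
machine_virtualized_device = {
    "vm1": {"os": "Linux", "app": "bill payment"},
    "vm2": {"os": "Windows", "app": "teller system"},
}

# Application virtualization: one OS, apps sandboxed with their own
# dependencies so they do not interfere with one another.
application_virtualized_os = {
    "os": "Linux",
    "sandboxes": [
        {"app": "report generator", "bundled_libs": ["libfoo 1.0"]},
        {"app": "loan calculator", "bundled_libs": ["libfoo 2.0"]},  # no clash
    ],
}

for vm, stack in machine_virtualized_device.items():
    print(f"{vm}: {stack['app']} on its own {stack['os']}")
for box in application_virtualized_os["sandboxes"]:
    print(f"{box['app']} runs with {box['bundled_libs'][0]} on one shared OS")
```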
Another plus for banks is virtualization’s ability to rein in scattered legacy environments. When blended with service-oriented architecture (SOA) techniques, it supports new application features in the front, middle, and back office without a radical “rip and replace” of legacy systems.
David Zimmerman, global solutions executive for banking at IBM, says that while every region worldwide shows subtle variations in how quickly the method is being adopted and in which types of projects draw the most interest, all want better IT asset utilization than the norm of 12-20%.
He says the perspective from most U.S. tier one and tier two banks is a need to cut costs and reduce server sprawl, and yet move forward on key projects such as supporting improved disaster recovery.
Many large banks, he notes, still struggle with aging core processors and other legacy issues. “If you simplify the environment, you simplify, to some extent, all kinds of major upgrade projects.” Zimmerman says the typical CEO would look at a virtualization project not as “cool technology” but as a methodology that can quickly support a beefed-up production environment. Likewise, the typical HR head would see the approach and think of reducing headcount. Virtualization is all about what it can accomplish, in a way that previous technologies, which were hard to provision and “in your face,” just weren’t.
“If you have 600 servers, at a typical capacity utilization during non-peak, but you need excess capacity for peak performance, well then, you’re in a position where a virtualization project just might make sense,” says Zimmerman.
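The arithmetic behind that point can be sketched on the back of an envelope. In the Python sketch below, the 12-20% utilization norm comes from the article; the 60% target utilization and the assumption that each new host has twice a legacy box’s capacity are illustrative figures of our own, chosen to show how a 600-box estate can land in the neighborhood of the 80 servers First National reports.

```python
# Hedged consolidation math; target utilization and capacity ratio are assumptions.
from math import ceil

legacy_servers = 600       # physical boxes before consolidation
avg_utilization = 0.15     # midpoint of the 12-20% norm cited above
host_capacity_ratio = 2.0  # assumed: each new host = 2x a legacy box
target_utilization = 0.60  # assumed: leave 40% headroom for peaks

total_demand = legacy_servers * avg_utilization      # in legacy-box units
usable_per_host = host_capacity_ratio * target_utilization
hosts_needed = ceil(total_demand / usable_per_host)
print(f"{legacy_servers} lightly loaded servers -> ~{hosts_needed} virtualized hosts")
# 600 lightly loaded servers -> ~75 virtualized hosts
```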
IBM took its own medicine, he says, going from 3,900 servers to 33 System z machines.
Ways to subdivide work: Grid, virtual, clouds
With grid, a complex computational job, such as processing derivatives, is subdivided and processed on multiple servers. The “virtual machine,” as techies call it, is instead invoked on demand. Think of it as a kind of simulation: it isn’t a fixed physical thing (like the software sitting in your desktop unit), but it can still do computational work. As with grid, work is subdivided and passed from strained, overused devices to under-utilized ones. Unlike grid, there are fewer limitations on what type of job can be handled in this subdivided way and on how that subdivision is executed.
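As a concrete toy, here is a hedged Python sketch of grid-style subdivision. It is illustrative only; the payoff model and all numbers are invented, and local worker processes merely stand in for grid nodes. A simulation job is split into chunks, each chunk is priced independently, and the partial results are combined.

```python
# Toy grid-style subdivision: split a Monte Carlo job across workers.
import random
from concurrent.futures import ProcessPoolExecutor

def price_chunk(args):
    """Estimate the mean payoff over one chunk of simulated paths."""
    seed, n_paths = args
    rng = random.Random(seed)
    # Stand-in payoff: max(S - K, 0) with S drawn from a simple model.
    return sum(max(rng.gauss(100, 20) - 100, 0) for _ in range(n_paths)) / n_paths

if __name__ == "__main__":
    chunks = [(seed, 50_000) for seed in range(8)]  # 8 independent sub-jobs
    with ProcessPoolExecutor() as pool:             # workers stand in for grid nodes
        partials = list(pool.map(price_chunk, chunks))
    print(f"estimated payoff: {sum(partials) / len(partials):.3f}")
```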
Virtualization is also getting talked about, increasingly, in terms of the “cloud,” that vast interconnected set of resources linked, in a very general sense, by the internet.
In explaining these nuances, David Zimmerman, global solutions executive for banking, at IBM, says: “While a cloud does support grid, a cloud can also support non-grid environments such as three-tier web architecture running traditional or Web 2.0 apps.”
IBM is joined by numerous providers in the emerging space. Analyst reports at Gartner and Forrester name Microsoft, VMware, Intel, and Advanced Micro Devices (AMD) as leaders in helping to support this more efficient way of computing.
The electronic version of this article is available at: http://lb.ec2.nxtbook.com/nxtbooks/sb/ababj0209/index.php?startid=36