The Blue Cloud Is IBM’s Commercial Cloud Computing
November 19, 2007 Timothy Prickett Morgan
Like other server makers, IBM has conflicting desires both to embrace and to ignore what seems to be the inevitable advance from proprietary computing–in both senses of the word proprietary–to open, on-demand, utility-style computing. Because companies still like to own their computing infrastructure but nonetheless want to experiment with new architectures, IBM this week is rolling out a new product called the Blue Cloud, which lets companies build their own utilities–often called “clouds” in deference to the Internet’s old nickname, the Big Gray Cloud–using technology similar to that deployed by Google, Yahoo, Amazon, and other Internet powerhouses.
The conflict that IBM and other platform providers face when they deal with cloud computing is simple enough. There was no commercial Internet in the early 1990s, and companies hosted their own applications–and they usually still do. This environment generated lots of revenue and profit for companies like IBM–particularly because of the inefficient use of computing resources. Even if companies didn’t use capacity efficiently, they sure did pay as if they were. And now, more than a decade after the commercialization of the Internet and the explosion in server capacity, companies are getting sick of doing IT the same old way. Server and storage virtualization are just two means for transforming silos of servers, operating systems, and applications into a more flexible computing infrastructure–something that looks more like a utility. In modern parlance, a computing and storage utility that is available on demand, typically running so-called Web 2.0 applications but also sometimes number-crunching jobs and other workloads, is now being referred to as a compute cloud, and the style of computation done on such massively parallel machines is called cloud computing. Rather than own capacity, you log in, pay for capacity, run your jobs, get your answers, and log off.
This sounds a lot easier than it actually is, and part of the reason is that standard business applications have not been coded to run on such massive clusters of loosely coupled machines. Turning such complexes into something that, at least from a programmer’s point of view, resembles a shared memory server is the Holy Grail of computing. Moreover, companies of all sizes like to hug their servers, and they never want a job to have to wait for processing time. They might be willing to use someone else’s utility to buy some excess capacity or to run non-critical but important workloads, but in general, companies want to buy their IT and squeeze the life out of it for as long as possible.
The Blue Cloud offering from IBM aims to make it easier for companies to set up their own clouds. The offering, which is actually coming out of the company’s Software Group, will be initially available in the first quarter of 2008. The first machines supporting the Blue Cloud software stack will be BladeCenter boxes with IBM’s X64 and Power blade servers running Linux. The X64 blades will have the Xen hypervisor installed, the Power blades will use IBM’s own Virtualization Engine, and Tivoli Provisioning Manager and Tivoli Monitoring will be used to manage the virtual machines in the cloud network. Sitting atop this will be a mix of IBM’s DB2 database, its WebSphere middleware, and the Apache Hadoop open source implementation of the MapReduce parallel computing environment created by Google. Presumably, IBM will also deploy AIX on Power-based blade servers. IBM also said that it will create a highly virtualized compute cloud based on its System z mainframes–presumably running Linux–and is also going to deliver clouds based on its various rack servers. The company’s i5/OS operating system for Power servers and Windows were not mentioned, but there is no reason that these platforms cannot hook into computing clouds or be modified to run the code behind them.
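The appeal of the MapReduce model that Hadoop implements is its simplicity: programmers write a map function that emits key-value pairs and a reduce function that combines the values for each key, and the framework handles distributing the work across thousands of nodes. Here is a minimal, single-machine sketch of the canonical word-count pattern in Python; the function names and shuffle step are illustrative stand-ins for what Hadoop does across a cluster, not Hadoop's actual API.

```python
from collections import defaultdict

def map_phase(doc):
    # Mapper: emit a (word, 1) pair for every word in one document.
    for word in doc.split():
        yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group intermediate values by key, as the framework
    # does between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Reducer: combine all counts observed for one word.
    return key, sum(values)

docs = ["blue cloud", "cloud computing", "blue sky"]
intermediate = [pair for doc in docs for pair in map_phase(doc)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
print(counts)  # {'blue': 2, 'cloud': 2, 'computing': 1, 'sky': 1}
```

On a real cluster each document list would be a partition of a much larger data set, mappers and reducers would run on separate nodes, and the shuffle would move data over the network–but the programmer's contract stays this small, which is precisely why the model scales.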
Even though MapReduce is one of the key secret sauces in the Google empire, Google knows it needs to open up a bit if it wants computer science departments to acquaint students with massively parallel computing architectures and IT vendors to build the kind of hardware and systems software it needs to run and grow its business.
According to Dennis Quan, chief technology officer for the high performance computing unit in IBM’s Software Group, most universities teach students how to create and deploy applications on one or a few servers. “Students are never taught how to process large amounts of information across thousands of machines,” says Quan. “Applications being built today not only need to scale to very large numbers of users, but also need to scale to deliver very large amounts of compute resources.” Quan is one of the 200 researchers IBM has working on cloud computing.
This is why IBM and Google announced in October that they were creating computing clouds for universities to use as part of their instruction. The Blue Cloud products will be the commercialization of the clouds IBM has running at its Almaden Research Center, Google has in its own data center, and that the University of Washington has set up with the help of these two vendors. Over time, these centers will have approximately 1,000 server nodes for students to play with as they do their studies. This is not Google-class scale, of course, but it is closer than what is available today for most computer science departments.
The Blue Cloud is actually a derivative of a virtual server loaner program, called the Technology Adoption Program, that IBM started a number of years ago to allow independent software vendors to test their applications on virtualized slices of IBM’s own servers running specific software stacks from a remote location. Blue Cloud will be pitched with the obligatory service oriented architecture and Web 2.0 angle, and will initially be interesting to companies that want to create infrastructure that can be adapted quickly to handle different–and yet massive all the same–workloads. Financial services companies are among the early Blue Cloud beta testers IBM already has playing with the technology, as are government research centers, academic institutions, and one large auto manufacturer.
It remains to be seen if IBM will sell cloud computing as a service. IBM has sold HPC computing capacity to various customers through its Supercomputing on Demand service, which helped it work out the kinks in utility-style computing. But so far, the uptake of utility computing by IBM’s customers, as well as of similar offerings from Sun Microsystems, Hewlett-Packard, and Amazon, has been less than many expected.
RELATED STORIES

Red Hat to Use Automation, Virtualization to Eat the Server Space
Google, IBM Partner on Utility Computing Cloud
Ballmer Talks Up ‘Cloud Computing’
Dell Offers Large-Scale Data Center Design Service
rPath Linux Packages Up Amazon’s Grid Computing
Sun’s Grid Utility Expands Beyond the United States
ISVs Preload Applications on the Sun Grid
Sun Gives Developers Free Access to Grid Utility, Other Goodies