IBM Announces Utility Computing Service
November 15, 2007
IBM today unveiled Blue Cloud, a utility computing initiative designed to help customers run efficient grid-based platforms in enterprise data centers. The effort differs from the utility computing services offered by Amazon and other grid hosting providers: rather than leasing out its own data center infrastructure on a pay-as-you-go model, IBM is focusing on helping customers build cloud computing platforms inside their own data centers.
This approach allows enterprise customers to retain control over their resources while growing the market for IBM hardware and software to run and manage the platform. Blue Cloud was announced today in Shanghai, and the initial offering will feature IBM's BladeCenter chassis and Tivoli management software.
Built Atop Hadoop
The initiative is built around the Apache Software Foundation's Hadoop, an open source framework for distributed computing that implements elements of the infrastructure software Google has described in published papers. IBM is comparing the introduction of Blue Cloud to its decision to throw its support behind Linux in 2000, which is widely viewed as a key tipping point in the adoption of Linux in the corporate market.
"Blue Cloud will help our customers quickly establish a cloud computing environment to test and prototype Web 2.0 applications within their enterprise environment," said Rod Adkins, Senior Vice President, Development and Manufacturing for IBM Systems & Technology Group. "Over time, this approach could help IT managers dramatically reduce the complexities and costs of managing scale-out infrastructures whose demands fluctuate."
IBM's press release notes that Blue Cloud allows users to "link together computers to deliver Web 2.0 capabilities," and coverage in eWeek also notes cloud computing's applications in social media. But IBM also told the New York Times that it is working with several corporations and government agencies. "Large financial services companies are going to be among the first to be interested," IBM senior VP William Zeitler told The Times.
Hadoop as Scalability Driver
The aspect of Web 2.0 that is most central to Blue Cloud is scalability, and Hadoop is all about scalability. Hadoop is a framework for running applications on large clusters of commodity hardware. It was originally built to generate the search index for the open source Nutch project, and semantic search startup Powerset has been building its index using Hadoop on Amazon's EC2 utility computing platform. Hadoop implements concepts from Google's published infrastructure papers, specifically MapReduce and the Google File System (GFS).
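To make the MapReduce model concrete, here is a minimal sketch of the canonical word-count job written in Java against Hadoop's classic org.apache.hadoop.mapred API (the API of the 0.x releases current at the time of this announcement). The input and output paths passed on the command line are illustrative, not part of any IBM offering.

    import java.io.IOException;
    import java.util.Iterator;
    import java.util.StringTokenizer;

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.*;

    public class WordCount {

      // Map step: emit (word, 1) for every word in an input line.
      public static class Map extends MapReduceBase
          implements Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable ONE = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value,
                        OutputCollector<Text, IntWritable> output,
                        Reporter reporter) throws IOException {
          StringTokenizer tokens = new StringTokenizer(value.toString());
          while (tokens.hasMoreTokens()) {
            word.set(tokens.nextToken());
            output.collect(word, ONE);
          }
        }
      }

      // Reduce step: sum the counts emitted for each word.
      public static class Reduce extends MapReduceBase
          implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                           OutputCollector<Text, IntWritable> output,
                           Reporter reporter) throws IOException {
          int sum = 0;
          while (values.hasNext()) {
            sum += values.next().get();
          }
          output.collect(key, new IntWritable(sum));
        }
      }

      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("wordcount");
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);
        conf.setMapperClass(Map.class);
        conf.setReducerClass(Reduce.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));  // input on HDFS
        FileOutputFormat.setOutputPath(conf, new Path(args[1])); // output directory
        JobClient.runJob(conf);  // submit the job and wait for completion
      }
    }

The framework splits the input across the cluster, runs map tasks in parallel, shuffles the intermediate (word, count) pairs by key, and hands each key's values to a reduce task. That division of labor, with the framework handling distribution and failure recovery, is what lets the same job scale across thousands of commodity machines.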
Another major player closely involved in Hadoop is Yahoo, which now employs Hadoop lead developer Doug Cutting. On Tuesday Yahoo announced that it had assembled a 4,000-processor Hadoop grid for use by researchers at Carnegie Mellon University, and yesterday it launched a new blog focused on Hadoop and distributed computing.
Assembling A Hadoop-Ready Workforce
IBM's unveiling of Blue Cloud dovetails with last month's announcement that IBM and Google are teaming up to build large data centers to power a grid computing initiative for research universities. The program will allow computer science students at research universities to develop "cloud computing" applications hosted in large data centers. It will also ensure a ready workforce of developers with experience running Hadoop on IBM hardware, which will be of interest to enterprise customers weighing utility computing platforms for their own data centers.