McKinsey: Data Centers Cheaper Than Cloud
A typical corporate data center is more economical than Amazon's EC2 when measured in cost per CPU equivalent, according to a new report from McKinsey.
April 15, 2009
Is cloud computing more expensive than running your own data center? It can be for larger users, according to McKinsey & Company, which says that a typical corporate data center is more economical than Amazon's EC2 when measured in cost per CPU equivalent. McKinsey detailed its findings in a report titled "Clearing the Air on Cloud Computing," released today at the Uptime Institute's Symposium 2009 in New York.

McKinsey's conclusion runs counter to much of the messaging from the cloud computing industry, which asserts that running applications in a third-party data center with usage-based pricing offers better economics than building and operating a data center.
The cost comparison is one highlight of a broadly skeptical report, which says that cloud computing has tremendous promise, but the growing hype for the cloud could lead to a "trough of disillusionment" for large enterprise users.
"Cloud computing has shown great promise for start-ups and pet projects for large corporations," McKinsey writes. "However, it is not ready to help with the big challenges of big companies. Cloud computing can divert attention IT departments'attention from technologies that actually deliver sizeable benefits."
Current cloud computing offerings are not cost-effective compared to large enterprise data centers, McKinsey said, citing an average cost of $45 per CPU per month for a typical enterprise data center. The equivalent cost of virtual cores in large jobs on EC2 ranges from about $70 to $140 a month on Linux and $100 to $180 a month for Windows. McKinsey said the only scenario in which EC2 was competitive was when users pre-pay for Linux servers.
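The gap is easier to see scaled up to a whole fleet. Below is a minimal sketch of that comparison using the per-CPU figures McKinsey reports; the 500-CPU fleet size is a hypothetical input chosen only for illustration.

```python
# Illustrative arithmetic only. The per-CPU-equivalent prices are the
# figures McKinsey reports; the fleet size is a hypothetical input.

IN_HOUSE = 45            # $/CPU equivalent/month, typical enterprise data center
EC2_LINUX = (70, 140)    # $/virtual core/month, large Linux jobs
EC2_WINDOWS = (100, 180) # $/virtual core/month, large Windows jobs

def monthly_cost(price_per_cpu, cpu_count):
    """Total monthly cost for a fleet measured in CPU equivalents."""
    return price_per_cpu * cpu_count

cpus = 500  # hypothetical fleet size
print(f"In-house:    ${monthly_cost(IN_HOUSE, cpus):,}/month")
print(f"EC2 Linux:   ${monthly_cost(EC2_LINUX[0], cpus):,} to ${monthly_cost(EC2_LINUX[1], cpus):,}/month")
print(f"EC2 Windows: ${monthly_cost(EC2_WINDOWS[0], cpus):,} to ${monthly_cost(EC2_WINDOWS[1], cpus):,}/month")
```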
Amazon's platform makes much more sense for smaller companies, said McKinsey, which estimates that about half the use cases for smaller jobs on EC2 are cheaper than the $45 a month CPU equivalent for an in-house data center.
McKinsey cited one area where enterprises can save money in the cloud: migrating applications from Windows servers in their data centers to Linux-powered cloud platforms. The consulting firm estimated that companies could save 10 to 15 percent in staff costs by shifting their data center operations to the cloud, but those gains would be more than offset by the differential in hardware pricing, as the sketch below illustrates.
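McKinsey's offset argument is a simple trade-off, sketched here with partly hypothetical numbers: the staff-saving range and the per-CPU prices come from the report, but the staff-versus-hardware split of total cost is an assumption made up purely for illustration.

```python
# A back-of-the-envelope sketch of the offset argument. The 12.5%
# staff saving (midpoint of McKinsey's 10-15% range) and the $45 vs.
# $70 per-CPU prices are from the report; the 40/60 staff-to-hardware
# split of total cost is an assumption, not a reported figure.

STAFF_SHARE = 0.40               # assumed share of total cost that is staff
HARDWARE_SHARE = 0.60            # assumed share that is hardware

staff_saving = 0.125             # midpoint of the 10-15% range
hardware_premium = 70 / 45 - 1   # ~56% premium at EC2's cheapest Linux rate

net = HARDWARE_SHARE * hardware_premium - STAFF_SHARE * staff_saving
print(f"Hardware premium adds {HARDWARE_SHARE * hardware_premium:+.1%} to total cost")
print(f"Staff saving trims    {-STAFF_SHARE * staff_saving:+.1%}")
print(f"Net change:           {net:+.1%}")  # positive: cloud costs more overall
```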
"Rather than create unrealizable expectations for 'internal clouds,' CIOs should focus now on the immediate benefits of virtualizing server storage, network operations and other critical building blocks," McKinsey concludes. These virtualization projects can help enteprise providers realize utilization improvements that can approach the rates seen for large public clouds, according to the report.
McKinsey defines a "typical" enterprise data center as having 10 percent capacity utilization. That number is consistent with some estimates of data center inefficiency, but enterprises that have used virtualization to consolidate servers are likely to have higher utilization rates. The number matters because a low baseline implies a large available gain from investments in virtualization as an alternative to cloud deployment.
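Why the baseline matters is plain arithmetic: at a fixed monthly cost per installed CPU, the effective cost per CPU of useful work scales inversely with utilization. The sketch below uses McKinsey's $45 figure; the alternative utilization rates are hypothetical, chosen only to show the sensitivity.

```python
# Effective cost per *utilized* CPU at McKinsey's $45/month figure.
# The utilization rates other than 10% are hypothetical, included
# only to show how quickly the baseline assumption moves the result.

PER_CPU_MONTHLY = 45  # $/CPU equivalent/month (McKinsey's enterprise figure)

def cost_per_utilized_cpu(price, utilization):
    """Monthly cost per CPU equivalent of work actually performed."""
    return price / utilization

for u in (0.10, 0.20, 0.30):
    print(f"{u:.0%} utilization -> "
          f"${cost_per_utilized_cpu(PER_CPU_MONTHLY, u):,.0f} per utilized CPU/month")
```

At the report's 10 percent baseline, each utilized CPU effectively costs $450 a month; doubling utilization to 20 percent halves that, which is the gain McKinsey argues virtualization can capture without moving to the cloud.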
It's also worth noting that many smaller cloud providers are making similar comparisons with pricing on EC2, seeking to position their services as more affordable alternatives.
Finally, McKinsey couldn't resist offering its own definition of cloud computing, which it estimates is the 23rd such attempt. Here's McKinsey's definition: Clouds are hardware-based services offering compute, network and storage capacity where:
Hardware management is highly abstracted from the buyer
Buyers incur infrastructure costs as variable OPEX
Infrastructure cost is highly elastic (up or down)