The Myth of the Data Center's High-Density Future
Data Center World: Schneider exec says high-density PODs are a much smarter way to go than high-density data centers
October 20, 2014
ORLANDO, Fla. - Despite vendor predictions that data center densities would go up, that hasn't been the case so far. The misconception has led to a lot of data center design missteps, said Kevin Brown, vice president of data center strategy for Schneider Electric.
The projections overstated reality. Numerous data centers were overbuilt, and their owners ended up selling them to wash their hands of them; the third parties that took the facilities over now lease the unused capacity to multiple tenants. So the idea that building an entire data center for higher densities is a good way to future-proof the facility is a misconception.
But determining data center design needs is hard. In a Monday morning presentation at Data Center World in Orlando, Brown suggested that designing density at the POD level is a more efficient way to go than planning for a certain density across the entire facility.
A density policy at the POD level should still specify both peak and average values. "If you have to oversize something, make it a POD, not rack power distribution or the whole data center," he said. "Have contingency plans in case you miss."
Designing for lower density is easier to manage overall, according to Brown, and high densities don’t necessarily save money.
“The data center of the future will look a lot like the one today,” he said. “Average density of 3-5kW per rack and a peak of 11.4kW [can be operated] with a high degree of confidence. The days of data centers as rack-by-rack deployments are going away. We need to start thinking about the POD. Stop thinking in kW per rack; think in kW per POD.”
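To make the POD-level arithmetic concrete, here is a minimal sketch in Python. It is not from Brown's presentation: the POD size and the headroom factor are assumptions; only the 3-5kW average and 11.4kW peak figures come from the quote above.

```python
# Illustrative sketch only: compares provisioning every rack for its peak
# versus provisioning a shared POD for average load plus headroom.
# The 20-rack POD and the 20 percent headroom are assumptions, not Schneider's.

RACKS_PER_POD = 20          # assumed POD size
AVG_KW_PER_RACK = 4.0       # midpoint of the 3-5 kW average quoted above
PEAK_KW_PER_RACK = 11.4     # peak figure quoted above
POD_HEADROOM = 1.2          # assumed contingency margin at the POD level

# Rack-by-rack design: every rack's power distribution sized for the peak.
rack_level_capacity = RACKS_PER_POD * PEAK_KW_PER_RACK

# POD-level design: shared capacity sized for the average plus headroom,
# relying on the fact that not every rack peaks at once.
pod_level_capacity = RACKS_PER_POD * AVG_KW_PER_RACK * POD_HEADROOM

print(f"Rack-by-rack provisioning: {rack_level_capacity:.0f} kW per POD")
print(f"POD-level provisioning:    {pod_level_capacity:.0f} kW per POD")
# -> 228 kW vs. 96 kW: the POD absorbs individual rack peaks without
#    paying for peak capacity at every rack.
```

The design point is that individual racks can exceed the average as long as the POD as a whole stays within its provisioned capacity.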
While Brown proposes this strategy for right-sizing data centers, it's worth noting Schneider's growing investment in the prefab data center. The company acquired AST Modular earlier this year and has been rolling out modules and modular reference designs.
Modules can be adjusted as the business evolves, and prefab modules let a business move faster. That is an argument against building unnecessarily big facilities, though Schneider also sells components for the big mechanical systems installed in large data centers. The trend, however, is toward more incremental build-outs.
The high-density misconception
There are two basic reasons for misguided density forecasts:
Mixed loads in data centers bring the average down
Technology issues and improvements
“Density projections are really talking about servers,” he said. “What’s happening is that the focus is no longer on driving raw performance but on increasing performance per watt. While experts predict densities are going up, servers are going the other direction. All the data is showing this.”
(Chart: cost per watt as density increases. Brown said returns diminish, and design complexity isn't worth the potential savings when it comes to high density. Source: Schneider Electric Data Center World presentation)
Brown believes that mobile chip technology will continue to make its way up into the data center, increasing performance per watt and undercutting the supposed cost savings of higher-density data center space. This is one major trend working against high-density predictions.
At low densities, rack count drives savings; at higher densities, those savings are countered by more expensive racks and rack power distribution. Brown's assertion is that the design complexity isn't worth the savings.
“As density goes up, it means bigger, more expensive racks and greater capacity, so power distribution is more expensive. Where I save money is fewer racks, less space. That’s my trade-off. Higher costs are supposedly offset.”
When determining density, the cost curve drops quickly as you approach 5kW per rack, then continues to fall at an increasingly slower pace up to 15kW. “Beyond 15kW, there’s no savings to be had. This is part of the reason densities are stabilizing.”
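The shape of that curve can be sketched with a toy cost model. Everything numeric here is invented for illustration (the dollar figures and the density penalty are assumptions, not Schneider's); the sketch only aims to reproduce the diminishing-returns behavior Brown described:

```python
# Hypothetical cost model: fixed per-rack costs amortize over more watts as
# density rises (the saving), while power distribution cost per watt creeps
# up with density (the penalty). All dollar figures are assumed.

def cost_per_watt(kw_per_rack,
                  fixed_rack_cost=12_000,   # assumed rack + space + cabling cost
                  base_dist_cost=1.50,      # assumed $/W distribution at low density
                  density_penalty=0.05):    # assumed extra $/W per kW of density
    amortized = fixed_rack_cost / (kw_per_rack * 1000)
    distribution = base_dist_cost + density_penalty * kw_per_rack
    return amortized + distribution

for kw in (2, 5, 10, 15, 20):
    print(f"{kw:>2} kW/rack -> ${cost_per_watt(kw):.2f}/W")
```

Run as-is, the modeled cost per watt falls steeply from $7.60 at 2kW to $4.15 at 5kW, then only to $3.05 at 15kW, after which the penalty term outweighs the amortization and the cost begins to creep back up.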
Design as insurance premium
Brown believes that designing for uncertainty should be looked at as paying an insurance premium.
There are three strategies to provision for density uncertainty:
Oversize the whole data center (a bad idea)
Create oversized PODs
Create oversized racks
Design strategy should embrace POD deployment, said Brown. Oversized PODs are better than oversized racks.
Oversizing each of the three should be viewed as paying an insurance premium with a varying degree of desirability: oversizing a rack is comparable to paying a premium of less than 1 percent, oversizing a POD to a 5 percent premium, and oversizing the whole data center to a 25 percent premium. The POD is the premium Goldilocks would choose. Oversize a few PODs in the design, and the capacity guessing game becomes much easier.
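Read as back-of-the-envelope arithmetic, the premiums translate into dollars like this. The $10 million facility budget below is an assumption for illustration; only the premium percentages come from the talk.

```python
# Applies the premium percentages quoted above to an assumed $10M facility
# budget. The budget is invented; only the percentages come from the talk.

FACILITY_BUDGET = 10_000_000
premiums = {
    "oversize a rack":          0.01,   # "< 1 percent" quoted above
    "oversize a POD":           0.05,
    "oversize the data center": 0.25,
}

for strategy, rate in premiums.items():
    extra = FACILITY_BUDGET * rate
    print(f"{strategy:<26} costs up to ${extra:,.0f} in unused capacity")
# Hedging the density guess with a few oversized PODs runs ~$500,000,
# versus $2,500,000 to hedge the same risk across the entire facility.
```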