Data Center Cooling: Are We Innovating Yet?
True innovation goes beyond the obvious improvements or natural evolution of a product or technology. Because cooling accounts for such a large share of total data center energy use, cooling system innovation is a requirement for staying in the game.
November 18, 2010
Mark Germagian, President and Founder of Opengate Data Systems, has been active in technology research and new product development since 1980, developing innovative power and cooling solutions for the telecom industry and information technology environments.
Over the past 20 years, have we seen real innovation in data center cooling, or just incremental design improvement? Even 4,000 years ago, buildings were cooled with 'wind catchers.' These tall brick towers used the prevailing winds to draw air out of the building, and the evacuated air was replaced by air entering through a ground-level opening connected to an underground aqueduct. Moving across the water cooled the air before it entered the building. The system also worked in reverse. This is the first known building cooling system that required no man-made energy.
By What Measure Do We Consider Something Innovative?
In any industry, when we set out to innovate, we typically improve a product by making it faster/better, making it more reliable, or producing it at lower cost. Sometimes we succeed by meeting just two of these goals; when we achieve all three, success is all but guaranteed.
A data center cooling system needs to process air: remove heat from it (make it cooler) and maintain appropriate humidity levels. But a cooling system that simply produces the coldest air with the greatest reliability at the lowest cost isn't necessarily innovative. With cooling systems exceeding 30 percent of total data center electricity use, operational costs play a huge role, and innovating to make the cooling system better must be one of the key goals of managing the facility.
Innovation or Just Obvious Makeovers?
The U.S. Patent Office grants patents for any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. To be patentable, an invention must also be non-obvious, meaning sufficiently different from what has been used or described before, as judged by a person having ordinary skill in the area of technology related to the invention.
In 1902, only one year after Willis Carrier graduated from Cornell University with a master's degree in engineering, the first air conditioning system to control both temperature and humidity was in operation. The 'Apparatus for Treating Air' (U.S. Patent No. 808,897, granted in 1906) was the first of several patents awarded to Carrier, the 'father of air conditioning.' The term 'air conditioning' itself, however, originated with textile engineer Stuart W. Cramer, who used the phrase in a 1906 patent claim filed for a device that added water vapor to the air.
Einstein received a patent in 1930 for an absorption refrigerator that didn't require moving parts or electricity to operate. Interestingly, Electrolux bought the Einstein patent rights and his invention never made it to commercial production.
Is water cooling innovative? IBM mainframes were water-cooled years ago. Because water is 832 times denser than air, a given volume of it can remove far more heat than the same volume of air. Using water to cool is a basic application of physics, and using it to cool computers much as it has been used in the past is just a makeover of a pre-existing system. What the U.S. Patent Office would consider a 'method innovation' is adding intelligence, automation, and control to water or air systems for cooling IT equipment. Such methods can deliver results that make solutions faster/better, more reliable, and lower cost.
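To put rough numbers on that physics, here is a back-of-the-envelope sketch using textbook room-temperature values for density and specific heat. Once heat capacity is factored in alongside density, the per-volume advantage of water is even larger than the density ratio alone suggests:

```python
# Back-of-the-envelope comparison: heat a unit volume of water vs. air
# can absorb per degree of temperature rise (density x specific heat).
# Textbook values at roughly room temperature; illustrative only.

WATER_DENSITY = 1000.0  # kg/m^3
AIR_DENSITY = 1.2       # kg/m^3
WATER_CP = 4186.0       # J/(kg*K), specific heat of water
AIR_CP = 1005.0         # J/(kg*K), specific heat of air

water_capacity = WATER_DENSITY * WATER_CP  # J absorbed per m^3 per K
air_capacity = AIR_DENSITY * AIR_CP

print(f"Water: {water_capacity:,.0f} J/m^3/K")
print(f"Air:   {air_capacity:,.0f} J/m^3/K")
print(f"Advantage: ~{water_capacity / air_capacity:,.0f}x")  # roughly 3,500x
```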
ASHRAE Can Help
The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) set new environmental standards in 2008 and is still working with IT equipment suppliers to expand them further for continuous energy efficiency improvement. ASHRAE raised the recommended upper temperature limit to 80.6°F (27°C) for one reason: improved energy efficiency. What ASHRAE clearly understands is that low supply air temperatures require low water temperatures, which hurts cooling equipment efficiency and limits how many hours the equipment can run without mechanical refrigeration, the largest energy-use sub-system in the cooling circuit. Adding more cooling units to run in inefficient modes is taking a 'meat locker' approach to data center cooling. It will keep temperatures below the ASHRAE upper limits and may provide some redundancy when a cooling unit fails, but it is costly and irresponsible.
Setting Performance Standards: Your Innovation Goals
Eventually you need to address the entire system, such as data center cooling, by examining each of the sub-systems that make up the cooling circuit. A practical starting point, however, is to write your 'faster/better' goals as a set of performance standards for the entire system, which often drives innovation activity in each sub-system.
1. Supply only the required volume of air
IT equipment fans draw air to cool internal heat-generating components. Today, IT equipment consumes, on average, about 120 cubic feet per minute (CFM) of air for each kilowatt (kW) it dissipates. For a rough estimate of where you stand, multiply your UPS load in kW by 120 and compare that total IT equipment CFM to the rated air supply in your perimeter cooling unit or air handler manufacturer specs. Don't be surprised to find the industry average: 2.5 times more air supplied than the equipment requires.
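As a minimal sketch of that arithmetic (the 400 kW UPS load and 120,000 CFM supply rating below are hypothetical numbers, chosen to land on the 2.5x average):

```python
# Airflow sanity check using the article's rule of thumb of ~120 CFM
# of air consumed per kW of IT load. All input numbers are hypothetical.

def required_cfm(it_load_kw: float, cfm_per_kw: float = 120.0) -> float:
    """Airflow the IT equipment actually consumes, in CFM."""
    return it_load_kw * cfm_per_kw

def oversupply_ratio(supplied_cfm: float, it_load_kw: float) -> float:
    """How many times more air the cooling units supply than the IT gear needs."""
    return supplied_cfm / required_cfm(it_load_kw)

need = required_cfm(400)                 # 400 kW UPS load -> 48,000 CFM
ratio = oversupply_ratio(120_000, 400)   # cooling units rated 120,000 CFM total
print(f"IT equipment needs ~{need:,.0f} CFM; supply ratio = {ratio:.1f}x")
```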
2. Supply air at the upper ASHRAE limit
Achieve this temperature range while also maintaining accepted humidity levels. This is a very important consideration when looking at bringing in outdoor air for free cooling; the need to maintain humidity is often why circulating water or glycol to outdoor fan coils prevails over direct outside air. The real challenge: with control over your supply temperature across the full range of the new ASHRAE standard, can you keep rack intake temperatures within just a few degrees of the supply air temperature? With hot air recirculating in the data center and cold air bypassing the IT equipment and returning to the cooling units, this can be the one critical issue for achieving faster/better, more reliable, and lower cost innovation.
3. Select the most cooling-efficient servers, storage and network gear to meet your other requirements
Energy-efficient IT hardware that consumes less air for the same power load will produce a higher exhaust temperature. This hotter exhaust improves the efficiency of the entire cooling circuit in two ways. First, you're consuming less cool air, so less has to be generated. Second, the higher return temperature makes the perimeter or central air handler coils more efficient. But deploying efficient IT hardware without a way to manage the hotter exhaust will create hot spots, and the need to overcool to knock them down.
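The relationship at work here is the standard sensible-heat formula for air, which in mixed US units reduces to a temperature rise of roughly 3.16 × watts ÷ CFM degrees Fahrenheit. A minimal sketch, with illustrative airflow figures, shows how moving less air for the same load raises the exhaust temperature:

```python
# Standard sensible-heat relation for air (mixed US units):
# temperature rise in F ~= 3.16 * watts / CFM. Airflow values are illustrative.

def exhaust_rise_f(load_watts: float, airflow_cfm: float) -> float:
    """Air temperature rise across the equipment, in degrees Fahrenheit."""
    return 3.16 * load_watts / airflow_cfm

print(exhaust_rise_f(1000, 120))  # ~26 F rise at the ~120 CFM/kW average
print(exhaust_rise_f(1000, 90))   # ~35 F rise for a server moving 25% less air
```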
Also consider that using energy-efficient IT equipment will increase, not lower, your Power Usage Effectiveness (PUE). For the same data center infrastructure, you decrease your IT power usage, so the infrastructure becomes a larger share of the total. So why are we using PUE to measure efficiency? It's the best measure we have for now, and the metric should evolve as we start looking more closely at the IT operation.
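A quick worked example makes the arithmetic plain (the 500 kW IT load and the fixed 400 kW of infrastructure overhead are hypothetical):

```python
# PUE = total facility power / IT power. Holding infrastructure overhead
# fixed while IT load drops pushes PUE up even as total energy falls.
# All figures are hypothetical.

def pue(it_kw: float, overhead_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power."""
    return (it_kw + overhead_kw) / it_kw

before = pue(it_kw=500, overhead_kw=400)  # 900 kW total -> PUE 1.80
after = pue(it_kw=400, overhead_kw=400)   # 800 kW total -> PUE 2.00
print(f"PUE before: {before:.2f}, after more efficient IT gear: {after:.2f}")
```

Total energy fell from 900 kW to 800 kW, yet the metric got worse, which is exactly the paradox described above.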