What IT Managers Need to Know about Data Center Cooling

Making sure a data center runs efficiently is up to both IT and facilities

Yevgeniy Sverdlik, Former Editor-in-Chief

February 17, 2016

(Photo by Sean Gallup/Getty Images)

For any data center cooling system to work to its full potential, the IT managers who put servers on the data center floor have to be in contact with the facilities managers who run the cooling system, and they need some degree of understanding of data center cooling themselves.

“That’s the only way cooling works,” Adrian Jones, director of technical development at CNet Training Services, said. Every kilowatt-hour consumed by a server produces an equivalent amount of heat, which has to be removed by the cooling system. That makes the complete separation between IT and facilities functions in typical enterprise data centers simply irrational, since both teams are essentially managing a single system. “As processing power increases, so does the heat.”
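To put rough numbers on that equivalence, here is a minimal Python sketch – not from Jones’s presentation, and with hypothetical rack loads – that converts an IT load into the heat a cooling plant has to reject:

```python
# Rough sizing sketch: every watt consumed by IT gear becomes a watt of
# heat the cooling plant must remove. Rack loads below are hypothetical.

BTU_PER_HR_PER_KW = 3412.14   # standard conversion: 1 kW ≈ 3,412 BTU/hr

rack_loads_kw = {"rack-a1": 6.5, "rack-a2": 4.0, "rack-b1": 11.2}

total_it_load_kw = sum(rack_loads_kw.values())
heat_load_btu_hr = total_it_load_kw * BTU_PER_HR_PER_KW
cooling_tons = heat_load_btu_hr / 12_000   # 1 ton of cooling = 12,000 BTU/hr

print(f"IT load: {total_it_load_kw:.1f} kW")
print(f"Heat to reject: {heat_load_btu_hr:,.0f} BTU/hr (~{cooling_tons:.1f} tons)")
```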

Jones, who spent two decades designing telecoms infrastructure for the British Army and who then went on to design and manage construction of many data centers for major clients in the UK, will give a crash course in data center cooling for both IT and facilities managers at the Data Center World Global conference in Las Vegas next month. The primary Reuters data center in London and a data center for English emergency services – police and fire brigade – are two of the projects he’s been involved in that he’s at liberty to disclose.

If IT managers simply communicate the parameters of the equipment they have in the data center, or are planning to install, facilities managers should be able to determine the optimal spot for that equipment on the IT floor. Facilities managers need to know the thermal profile and power requirements of IT equipment in order to use data center cooling capacity efficiently.
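What that matching could look like in practice, as a sketch under assumed data – the zones, environmental envelopes, and power figures below are invented, not drawn from any real facility:

```python
# Hypothetical sketch: pick a floor zone whose measured conditions and
# spare power capacity fit a new piece of IT equipment.

from dataclasses import dataclass

@dataclass
class Zone:
    name: str
    inlet_temp_c: tuple[float, float]   # measured supply-air temperature range
    humidity_pct: tuple[float, float]   # measured relative-humidity range
    spare_power_kw: float

def fits(zone: Zone, temp_ok: tuple[float, float],
         rh_ok: tuple[float, float], power_kw: float) -> bool:
    """True if the zone's conditions sit inside the gear's allowable range."""
    return (temp_ok[0] <= zone.inlet_temp_c[0]
            and zone.inlet_temp_c[1] <= temp_ok[1]
            and rh_ok[0] <= zone.humidity_pct[0]
            and zone.humidity_pct[1] <= rh_ok[1]
            and zone.spare_power_kw >= power_kw)

zones = [
    Zone("cold-aisle-1", (18.0, 24.0), (40.0, 55.0), 3.0),
    Zone("cold-aisle-2", (20.0, 27.0), (35.0, 60.0), 8.0),
]

# New server: allowable inlet 15-32 C, 20-80% RH, draws 5 kW.
candidates = [z.name for z in zones if fits(z, (15.0, 32.0), (20.0, 80.0), 5.0)]
print(candidates)   # ['cold-aisle-2'] – cold-aisle-1 lacks spare power
```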

Jones’s presentation will include a quick overview of the basic concepts in data center cooling and guidance on matching operational parameters of IT equipment to the areas of the data center with the most appropriate temperature and humidity ranges. He will also go over newer cooling-efficiency concepts, such as containment and free cooling, as well as the need to continuously measure the system’s performance.

The presentation will not be basic, “but it’s not in-depth where we’ll go into cooling equations,” he said. “It would cover a good cross-section of IT professionals, as well as technicians and managers.”

Another portion of the presentation will cover the basic steps of creating a preventive maintenance program for the cooling system. It starts with measuring and monitoring, which includes gathering sensor data, doing static-pressure checks, using thermal imaging, and applying appropriate metrics to understand how the system is operating.
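One widely used metric is the temperature difference, or delta-T, across each cooling unit. A minimal monitoring sketch, assuming a simple sensor layout and an invented healthy band:

```python
# Minimal monitoring sketch: compute the return-to-supply delta-T across
# each cooling unit and flag units outside an assumed healthy band.
# Sensor readings and the expected band below are hypothetical.

readings = {   # return/supply air temperatures in Celsius
    "crac-1": {"return_c": 29.5, "supply_c": 18.0},
    "crac-2": {"return_c": 24.0, "supply_c": 17.5},
}

EXPECTED_DELTA_T = (8.0, 14.0)   # assumed band for this particular plant

for unit, r in readings.items():
    delta_t = r["return_c"] - r["supply_c"]
    status = "ok" if EXPECTED_DELTA_T[0] <= delta_t <= EXPECTED_DELTA_T[1] else "CHECK"
    print(f"{unit}: delta-T {delta_t:.1f} C [{status}]")
# crac-2's low delta-T (6.5 C) hints at bypass air or an airflow problem.
```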

The next steps are functional testing, or how to make sure the equipment is tested properly, and controlling and improving airflow management – things like making sure perforated raised-floor tiles and air vents are aligned correctly.

Jones will go through basic visual checks that can be done to better understand the capacity of the cooling system and give an overview of the affinity laws – the physics of pumps and fans. A fan, for example, doesn’t necessarily move more air if it spins faster. If it’s spinning too fast, some air “slips” off the blades, which means the fan is wasting energy. Jones will explain how to determine the optimal fan speed.
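The affinity laws themselves are standard fan and pump physics: airflow scales linearly with shaft speed, static pressure with its square, and power with its cube (assuming fixed geometry and air density). A quick sketch with hypothetical baseline figures shows why modest speed reductions pay off:

```python
# Fan affinity laws: flow ~ speed, pressure ~ speed^2, power ~ speed^3.
# Baseline figures below are hypothetical.

def affinity(flow_cfm: float, pressure_in_wg: float, power_kw: float,
             rpm_old: float, rpm_new: float):
    """Scale a fan's performance from rpm_old to rpm_new."""
    ratio = rpm_new / rpm_old
    return flow_cfm * ratio, pressure_in_wg * ratio**2, power_kw * ratio**3

# Slow a fan from 1,000 to 800 RPM (20% less speed):
flow, pressure, power = affinity(10_000, 1.0, 5.0, 1000, 800)
print(f"flow {flow:,.0f} CFM, pressure {pressure:.2f} in. w.g., power {power:.2f} kW")
# -> 8,000 CFM at 2.56 kW: 20% less air for roughly half the energy.
```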

Data center managers make a lot of mistakes that result in inefficient data center cooling. One of the biggest problems is poor understanding of airflow management, in which both IT and facilities staff play a role.

The company can spend a lot of money delivering cold air to the data center floor, but if technicians install cabling in a way that obstructs airflow, neglect to cover empty rack spaces with blanking panels, or simply don’t know the best spot in the rack for a particular piece of equipment, a lot of conditioned air never reaches the equipment or gets mixed with hot exhaust air.

Another common mistake is overcooling. A lot of modern IT equipment runs well at higher temperatures than most data centers provide. The unfortunate reality is that most data centers house a mix of old and new IT gear, so data center managers need a finer understanding of their cooling system and of the airflow on the floor to take advantage of higher operating temperatures while making sure older equipment stays sufficiently cooled.
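In a mixed environment, the most temperature-sensitive device effectively caps the supply setpoint for its whole aisle. A hypothetical sketch – the equipment list, inlet limits, and safety margin are invented:

```python
# Hedged sketch: the highest safe supply setpoint for a cold aisle is
# bounded by the device with the lowest allowable inlet temperature.
# Inlet maxima and the margin below are hypothetical.

max_inlet_c = {"legacy-san": 25.0, "new-1u-server": 35.0, "switch": 32.0}
SAFETY_MARGIN_C = 2.0

setpoint = min(max_inlet_c.values()) - SAFETY_MARGIN_C
print(f"Highest safe supply setpoint for this aisle: {setpoint:.1f} C")
# One legacy array caps the aisle at 23 C even though newer gear tolerates
# 35 C – an argument for grouping older equipment in its own zone.
```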

A better understanding of how data center cooling systems work among everyone who works in the data center can make a big difference in how efficiently the facility runs, optimizing both its energy use and the company’s resources.

Want to learn more? Join CNet’s Adrian Jones and 1,300 of your peers at Data Center World Global 2016, March 14-18, in Las Vegas, NV, for a real-world, “get it done” approach to converging efficiency, resiliency, and agility for data center leadership in the digital enterprise. More details are available on the Data Center World website.

 
