
Data Centers Must Move from Reducing Energy to Controlling Water

While we can’t change the physics of how many BTUs versus how much water is used, we can control when and how much water we use.

Industry Perspectives

August 10, 2018


Marcus Moliteus, LEED AP, is Director of Sales Engineering at Aligned Energy.

After attending a data center industry event earlier this year and speaking with leading engineers from around the country, it struck me that there was very little conversation about reducing water usage in the data center. I broached the subject with several attendees, and while everyone agrees that data centers have a drinking problem, so to speak, there are not a lot of solutions to address it, because the conventional approach usually trades water savings for higher energy use. Using more energy to solve a water problem is not sustainable.

Our industry will always grapple with the age-old problem of the power-hungry data center. By its very nature, the industry requires a massive amount of resources to deliver services. As a result, industry groups such as the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) have addressed rising energy concerns with guidelines focused on power usage and temperature standards. The industry is also constructing greener buildings and innovating cleaner designs. Altogether, data center “Power Usage Effectiveness” (PUE) has decreased significantly over the past 10 years.

While it is a positive development that data centers’ overall energy use is being reduced around the globe, a key component that has, for the most part, been washed over is water usage. One example is the continued use of open-cell cooling towers, which take advantage of evaporative cooling to cool the air with water before it enters the data center. While this approach reduces energy, its water usage is very high.

Raising the issue of water reduction is the first step in creating ways our industry can do something about it. As we experience the continued deluge of the “Internet of Things,” projected to exceed 20 billion devices by 2020, we will only be able to ride this wave if we keep energy use low and start reducing water usage.

The Heat Is On for Free Cooling

The first question becomes: how can cooling systems reject heat more efficiently?

Let’s say heat is coming off the server at 100 degrees Fahrenheit. The idea is to capture that heat efficiently and reject it to the atmosphere as close to that temperature as possible, though it all depends on the absorption system. If water is brought back at a temperature in the high 90s, the system can economize against the atmosphere for an enormous share of the year without having to use much water to cool it down any further. The air temperature outside the data center will likely be at 95 degrees F and above for only a portion of the year; for the majority of the year, it will be below that. That creates “free cooling” without having to turn on water, except when absolutely needed.
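
To make that control decision concrete, here is a minimal Python sketch of the logic described above. The temperature thresholds and the three operating modes (dry economization, evaporative assist, mechanical cooling) are illustrative assumptions for this article, not a description of any specific vendor's cooling system.

# Minimal sketch of the free-cooling decision described above.
# Thresholds and mode names are illustrative assumptions only.

RETURN_TEMP_F = 98.0   # heat captured off the servers, high 90s F
APPROACH_F = 5.0       # assumed margin needed for dry heat rejection

def cooling_mode(outdoor_drybulb_f):
    """Pick a cooling mode for one hour of operation."""
    if outdoor_drybulb_f <= RETURN_TEMP_F - APPROACH_F:
        # Outdoor air is cool enough to reject the heat directly:
        # free cooling, no water consumed.
        return "dry economization"
    if outdoor_drybulb_f <= RETURN_TEMP_F + 10.0:
        # Only the hottest hours call for evaporative (water) assist.
        return "evaporative assist"
    # Extreme hours fall back to mechanical cooling.
    return "mechanical cooling"

for t in (75, 92, 101, 118):
    print(t, "F outdoors ->", cooling_mode(t))

In most climates, the vast majority of hours fall into the first branch, which is the “free cooling” described above.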

Sacrificing Water Versus Power

Obviously, traditional data center cooling systems use water. And while we can’t change the physics of how many BTUs versus how much water is used, we can control when and how much water we use. By doing this, we can also achieve greater efficiencies. If the external air temperature gets too hot and a data center wants to keep its PUE down, water is turned on. It’s that simple.

Apply this scenario across an entire year. Outdoor temperatures can climb past 100 degrees F. It is optimal to maintain an annualized PUE of 1.15 (a PUE of 1.0 is the theoretical ideal). However, if the outdoor temperature rises to 120 degrees F in Phoenix, Arizona, the PUE will spike. Imagine if, at that point, a data center operator could choose whether to use water for a limited number of hours, keeping the PUE low while maintaining a small WUE (Water Usage Effectiveness).
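
As a rough illustration of that annualized arithmetic, the sketch below weights each operating mode's PUE and water use by hypothetical hour counts. None of the figures are measured data; they are assumptions chosen only to show how a few hundred hours of water use can hold an annualized PUE near 1.15 while keeping WUE (commonly reported in liters of water per kWh of IT energy) small.

# Hypothetical annualized PUE/WUE arithmetic. Hour counts and per-mode
# figures are illustrative assumptions, not measured data.

IT_LOAD_KW = 1000.0  # average IT load

# mode: (hours per year, PUE in that mode, liters of water per kWh of IT energy)
modes = {
    "dry economization":  (8000, 1.14, 0.0),
    "evaporative assist": (600,  1.25, 1.2),
    "mechanical cooling": (160,  1.50, 0.0),
}

it_kwh_total = 0.0
facility_kwh_total = 0.0
water_liters = 0.0

for hours, pue, liters_per_it_kwh in modes.values():
    it_kwh = IT_LOAD_KW * hours
    it_kwh_total += it_kwh
    facility_kwh_total += it_kwh * pue
    water_liters += it_kwh * liters_per_it_kwh

print("Annualized PUE:", round(facility_kwh_total / it_kwh_total, 3))  # about 1.15
print("WUE (L/kWh):", round(water_liters / it_kwh_total, 3))           # well under 1

The point of the exercise is simply that concentrating water use in the hottest hours, rather than running it most of the year, keeps both the annualized PUE and the WUE low.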

With traditional cooling solutions, high water usage is necessary because the systems require water for the majority of the year to operate at the same efficiency they deliver on hotter days. Some data centers create efficiencies at higher temperatures by allowing room temperatures to rise into the high 80s and 90s. However, that creates a whole realm of issues, including a higher risk of server failure, corrosion, mold spores, and bacterial growth on the servers. The ability to optimize the use of water versus power when needed helps mitigate these risks.

Editor's Note: Part II of this article will explore how to further optimize the use of water.

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.

 
