Assessing AI's Impact on Data Center Heating and Cooling Needs

Higher heat loads are likely to become commonplace in data centers that host AI workloads. Here, we assess the short- and long-term impacts – and potential mitigation strategies.

Christopher Tozzi, Technology Analyst

November 4, 2024


If predictions hold true, the AI boom will dramatically increase the electricity that data centers consume. Moody's predicts that AI-linked data center energy use will grow at a rate of 43% annually. By extension, AI will also likely lead to hotter conditions inside data centers, since virtually all of the electricity that IT equipment consumes is ultimately dissipated as heat.
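
To get a feel for what a 43% annual growth rate implies, here is a minimal compound-growth sketch. The 100 TWh baseline is a hypothetical illustration, not a Moody's figure:

```python
# Illustrative compound-growth sketch for AI-related energy use.
# The baseline figure is hypothetical; only the 43% rate comes from
# Moody's projection cited above.
baseline_twh = 100.0   # hypothetical starting consumption, in TWh
growth_rate = 0.43     # projected annual growth rate

for year in range(1, 6):
    projected = baseline_twh * (1 + growth_rate) ** year
    print(f"Year {year}: {projected:.0f} TWh")
```

At that rate, consumption roughly sextuples in five years (about 598 TWh from a 100 TWh start), which is why the trend gets so much attention.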

This means that, in addition to addressing the need for more power, data center operators should be thinking about how they'll handle the higher heat loads that are likely to become commonplace in data centers that host AI workloads.

Will AI Raise the Temperature in Data Centers?

To date, no one has collected hard, comprehensive data on the extent to which AI workloads are actually contributing to higher heat output in data centers. But it's reasonable to assume that they are, because AI accelerator hardware – such as field-programmable gate arrays (FPGAs), graphics processing units (GPUs) and neural processing units (NPUs) – typically consumes far more electricity, and generates far more heat, than conventional devices.
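
Because essentially all electrical power drawn by IT equipment ends up as heat, cooling capacity can be sized directly from power draw. A minimal sketch, using the standard conversion of roughly 3.412 BTU/hr per watt (the rack figures below are hypothetical):

```python
# Rough cooling-load estimate: nearly all IT power becomes heat.
# One watt of heat corresponds to about 3.412 BTU/hr of cooling load.
WATTS_TO_BTU_PER_HR = 3.412

def cooling_load_btu_hr(it_power_watts: float) -> float:
    """Approximate cooling load for a given IT power draw."""
    return it_power_watts * WATTS_TO_BTU_PER_HR

# Hypothetical comparison: a 40 kW AI rack vs. a 10 kW conventional rack.
print(cooling_load_btu_hr(40_000))  # AI rack: ~136,480 BTU/hr
print(cooling_load_btu_hr(10_000))  # conventional rack: ~34,120 BTU/hr
```

The point of the comparison: a rack densely packed with accelerators can demand several times the cooling capacity of a conventional rack occupying the same floor space.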

What's more, AI hardware designs can make it more challenging to dissipate heat. For example, research on NPUs found that they "impose serious thermal bottlenecks to on-chip systems due to their excessive power densities." In less technical terms, this means that it's harder to cool down AI chips because they generate high volumes of heat in small, concentrated areas.
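
To make the power-density point concrete: heat flux scales with power divided by die area, so a chip that concentrates its power draw into a smaller area is proportionally harder to cool. A minimal sketch with hypothetical chip figures:

```python
# Hypothetical heat-flux comparison (W/mm^2) between a conventional chip
# and a dense AI accelerator. All figures are illustrative, not measured.
def heat_flux(power_watts: float, die_area_mm2: float) -> float:
    """Average power density across the die, in W/mm^2."""
    return power_watts / die_area_mm2

cpu_flux = heat_flux(power_watts=150, die_area_mm2=600)  # 0.25 W/mm^2
npu_flux = heat_flux(power_watts=300, die_area_mm2=400)  # 0.75 W/mm^2
print(f"CPU: {cpu_flux:.2f} W/mm^2, NPU: {npu_flux:.2f} W/mm^2")
```

In this illustration, the accelerator draws only twice the power but produces three times the heat per square millimeter – the "thermal bottleneck" the research describes.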

How Data Centers Can Cool AI Hardware

Given these thermal challenges, data center operators are likely to have to invest in new cooling strategies to keep AI hardware running at safe temperatures.

The good news on this front is that the technology for cooling AI devices efficiently already exists. Liquid cooling and immersion cooling excel at dissipating heat quickly, with minimal energy use.

The bad news, though, is that these types of cooling systems are typically much more expensive than conventional data center cooling solutions. As a result, data center operators may find themselves having to invest in expensive new cooling infrastructure.

This trend could also create market pressure wherein some data centers become more capable than others of hosting AI workloads, due to differences in their cooling systems. Data centers that offer liquid and immersion cooling will enjoy a leg up in the race to corner the AI market.

The need to cool AI workloads efficiently could also create a new incentive for data centers to invest in strategies like heat reuse, since AI chips will leave them with more waste heat to repurpose.

AI, Heat and Data Centers: A Long-Term Outlook

On balance, it's worth noting that AI hardware might not always be so power-hungry or heat-intensive. It's likely that AI chip manufacturers will find ways to make AI hardware more efficient from a heat perspective over time, just as CPU designs have changed over the decades to reduce heat output without compromising on processing power. Just because today's NPUs, FPGAs, GPUs and other AI devices generate lots of heat doesn't mean that will always be the case.

It's also possible that businesses won't end up deploying as many AI devices in data centers as current projections suggest. Companies might opt for GPU-as-a-Service solutions, for example, instead of setting up their own hardware. Although offering GPU-as-a-Service still requires providers to deploy GPUs in a data center somewhere, sharing those GPUs among multiple customers and workloads is likely to yield more efficient overall GPU usage than each business running its own.

But in the short run, the heat wave that AI is poised to bring to the data center industry seems unavoidable. Now is the time for data center operators to decide whether they want to invest in cooling technologies capable of handling the increased heat loads.

About the Author

Christopher Tozzi

Technology Analyst, Fixate.IO

Christopher Tozzi is a technology analyst with subject matter expertise in cloud computing, application development, open source software, virtualization, containers and more. He also lectures at a major university in the Albany, New York, area. His book, “For Fun and Profit: A History of the Free and Open Source Software Revolution,” was published by MIT Press.
