Omdia Analysts Discuss Powering – and Cooling – the AI Revolution
At Data Center World’s Omdia Analyst Summit, industry analysts described the challenges AI workloads are creating for the data center industry – and where we go from here.
April 16, 2024
Data Center World 2024 kicked off on Monday with research firm Omdia hosting an Analyst Summit. The overarching theme was whether the AI data center would be a gradual evolution or a complete revolution in design, operations and best practices.
Vlad Galabov, head of Omdia’s data center practice, laid out the key trends shaping the market. AI software is expected to generate $100 billion in annual revenue by the end of 2024, most of it from predictive AI. Despite all the hype, generative AI currently brings in only a few billion dollars a year; by the end of 2024, that figure is expected to exceed $10 billion, rising to $60 billion by 2028.
Predictive AI is very much an established market, with five million servers expected to be installed by the end of 2024. GenAI, by contrast, is still in the laboratory research phase, according to Galabov – yet it will have amassed one million deployed servers by the end of this year.
“ChatGPT proved there is clear demand and catalyzed investment in GenAI,” he said.
Fueled by AI demand, the installed power capacity of data centers is predicted to double between 2024 and 2030, reaching nearly 170 GW – of which nearly half will serve AI workloads.
The AI Ripple Effect
What does all that mean for the data center? Galabov expects three things:
Consolidation of the IT footprint: Core counts will reach 288 per processor by the end of 2024 – a 10x rise over 2017. On-processor software optimization has improved, too, and applications are increasingly tuned to run on custom processors. Galabov expects all of this to drive 5-to-1 server consolidation within the data center.
IT utilization gains: Serverless computing will minimize overprovisioning in the cloud, so fewer servers and processors will sit idle. Meanwhile, infrastructure as a service (IaaS) is being paired with professional services to drastically shrink the IT footprint in legacy environments.
Improved PUE: Power usage effectiveness (PUE) will be driven down further by top-management decree, in tandem with the use of AI-based tools.
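For readers unfamiliar with the metric: PUE is the ratio of total facility power to the power drawn by the IT equipment alone, so a value approaching 1.0 means almost no overhead from cooling and power delivery. A minimal sketch of the calculation (the figures below are illustrative, not Omdia data):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    An ideal data center approaches 1.0; cooling and power-delivery
    overhead push the value higher.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative example: a 10 MW facility whose IT gear draws 7.7 MW
print(round(pue(10_000, 7_700), 2))  # 1.3
```

Lowering PUE means shrinking the gap between the two numbers – which is exactly where more efficient cooling comes in.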
Cooling the AI Jets
Shen Wang, principal analyst at Omdia, followed Galabov at the Analyst Summit with a discussion of the best technologies for cooling the AI data center.
“AI is here and we can’t power it,” he said. “So, let’s figure out how to cool it.”
What he means is that power constraints stand squarely in the way of AI growth in many metropolitan areas. Workarounds can be employed, but the ultimate solution is advanced cooling technologies that use far less power than traditional methods.
Analyst Shen Wang speaks at Data Center World's Omdia Analyst Summit on April 15, 2024.
Wang noted a 100x increase in CPU die size since the 1970s. Since 2000, processors have grown 7.6x larger and now require 4.6x more power. Factor in GPUs, and it becomes clear that liquid cooling is inevitable, Wang said. He believes air cooling will continue to serve CPU-based racks, but GPU racks need liquid cooling if they are to adequately support AI workloads.
“We can also bring precision air cooling to where the heat is to increase efficiency and lower PUE,” said Wang.
As rack densities grow above 50 kW, however, direct-to-chip (DtC) technologies will dominate, Omdia predicts. Higher densities will require further innovation – perhaps a refinement of immersion technologies, or a combination of DtC, air, and immersion cooling.
“We need to cool far more precisely and efficiently in the areas of greatest need,” said Wang.
Those planning new data centers are at an advantage, he added. They can optimize their designs and underlying infrastructure to accommodate liquid cooling. Legacy data centers will have to cope with the limitations of their current architecture. Some will be able to introduce a lot of liquid cooling. Others will be severely constrained.
“Right now, liquid cooling is only one seventh of the total cooling market, but by 2027 it will be worth one third,” said Wang.
Infrastructure Readiness Remains a Hurdle
In a panel following this presentation, Maurizio Frizziero of Schneider Electric added that flexibility is key when it comes to cooling existing data centers.
“There is no one size fits all in cooling,” he said.
Jason Matteson of Iceotope believes infrastructure readiness is a major hurdle for anyone wishing to deploy liquid cooling: pumps, pipes, and valves for the water, control systems, leak detection systems, and more. In addition, an existing building may be unable to accommodate immersion cooling due to space, weight, or insurance/risk constraints. Overall, though, he said fewer people fear the presence of liquid in the data center.
“The hydrophobia is gone as water is needed,” said Matteson.
Richard Bonner of Accelsius added that the fintech sector remains hesitant about introducing water, but he is bullish about the future.
“If you spend $2.5 million on a rack, you can’t have it throttle so you need liquid cooling to maximize the return on investment,” said Bonner. “There is no compelling reason to use air for high performance computing (HPC) and AI.”