Data Center Growth: Power Supply's Vital Role in Fueling HPC and AI
Today's data centers face the challenge of meeting escalating power demands to drive innovation and maintain efficiency.
February 13, 2024
High-Performance Computing (HPC) and generative AI continue to capture the imagination of the IT world. The boundaries of innovation seem limitless. However, it’s important to remember the foundation of all technology systems: the power supply.
Without sufficient power, the fastest CPUs in the world can’t reach their potential compute capacity. The same goes for AI and machine learning models. Without a power grid feeding data centers enough megawatts of electricity for their server clusters, those models can’t generate the fast results that AI promises end-users. This is a major challenge facing data centers today.
The Power to Drive Cities and Countries
To show just how big a leap HPC and AI represent in terms of power demand: companies looking to move their infrastructure to a colocation data center for basic requirements (networking and data) might have needed only 10-15kW per cabinet just a year or two ago. Today, however, AI applications and the HPC clusters that power them can require 50-100kW per cabinet, or more.
These power densities add up quickly. In the past, a colocation installation drawing 3-5MW would have been considered large. Today, a single HPC system can draw up to 13MW, exascale systems draw 25MW or more, and supercomputers (harnessing many interconnected processing cores) use up to 30MW, consuming as much power in a year as a small city.
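To see how rack-level densities translate into facility-scale demand, here is a minimal back-of-envelope sketch in Python. The 100-cabinet deployment and the specific per-cabinet figures are illustrative assumptions, not figures from any particular provider:

```python
# Back-of-envelope rack-to-facility power math (illustrative figures only).

def facility_power_mw(cabinets: int, kw_per_cabinet: float) -> float:
    """Total IT load in megawatts for a given cabinet count and density."""
    return cabinets * kw_per_cabinet / 1000.0

# The same 100-cabinet footprint at traditional vs. AI-era densities.
print(facility_power_mw(100, 12))   # ~1.2 MW at 10-15 kW per cabinet
print(facility_power_mw(100, 75))   # ~7.5 MW at 50-100 kW per cabinet
```

In other words, the same floor space that once hosted a modest deployment can now demand as much power as an entire large colocation installation did a few years ago.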
In the AI realm, one expert estimates that if Google used AI for its approximately nine billion daily searches, it would need 29.2 terawatt-hours of power each year – the equivalent of the annual electricity consumption of Ireland. By 2027, AI servers could use between 85 and 134 terawatt-hours annually. That's comparable to what Argentina, the Netherlands, or Sweden uses in a year – about 0.5% of the world's current electricity use. These are astronomical numbers that would have been inconceivable just a couple of years ago.
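For a sense of where a figure like 29.2 TWh comes from, dividing it across roughly nine billion searches a day implies on the order of 9 watt-hours per AI-assisted query. The sketch below simply reproduces that arithmetic; both inputs are the cited estimates above, not measured values:

```python
# Rough check of the AI-search energy estimate cited above.
searches_per_day = 9e9   # approximate daily Google searches (cited estimate)
annual_twh = 29.2        # estimated annual consumption if every search used AI

wh_per_search = annual_twh * 1e12 / (searches_per_day * 365)
print(f"{wh_per_search:.1f} Wh per AI-assisted search")  # ~8.9 Wh
```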
How Can Data Centers Answer the Power Challenge?
The demand for HPC resources and generative AI is sure to increase. So how will data centers answer the call? The answer lies in addressing both sides of the supply and demand equation.
Data center operators are working more closely than ever with utility partners to coordinate their expansion plans with power supply and grid expansions. That information now feeds into data center site-selection plans so that new facilities have an available and resilient power supply.
On-site power is another option, including substations that can scale as demand increases. It’s also important to explore more ways to generate on-site power, including natural gas plants, microgrids, geothermal, solar, wind, and, possibly, nuclear-powered small modular reactors (SMRs).
Power generation isn’t enough. It must also be delivered to where it’s needed. Data center providers can collaborate with local and national entities to facilitate and expedite permitting for construction of transmission lines.
Data center operators also continue to drive more efficient powering and cooling of their facilities. HPC and AI not only require a lot of electricity but also generate a lot of excess heat. That’s why leading colocation data centers are turning to liquid cooling options, using piping and fittings that carry coolant and deliver cooling closer to CPUs, GPUs, and memory chips.
A combination of the factors noted above will also limit the industry’s environmental impact, which is a moral and business imperative.
Key Data Center Attributes for Delivering the Necessary Power
Not every colocation provider is positioned to deliver the power that HPC and AI deployments demand. This is why enterprises deploying HPC and AI workloads should seek out colocation partners with the following attributes:
Financial Stability: Delivering the new power requirements is capital intensive, with little margin for error along the way. Colocation firms must identify capital partners that – literally – buy into the vision for the need for efficient power.
Operational Discipline: Enterprises need colocation partners who can anticipate their needs and hit every deadline for delivering power. This requires a proactive stocking program for critical infrastructure components and partnerships with utilities to alleviate demand on the grid, such as developing on-site power generation.
Proven Track Record: Look for technical teams with experience deploying and operating HPC clusters with dense power requirements and advanced liquid cooling approaches. The pedigree of leadership in key positions will reveal where the colocation provider has been and where it is going with power innovations.
New Opportunities
The explosion of HPC and AI represents an amazing time and massive new opportunities for enterprises. We are at the embryonic stage of understanding the opportunities of AI, but undoubtedly, the innovations that emerge will require more power. This puts the industry at an inflection point, requiring deep exploration of the best options for power generation and transmission, cost-efficiency, and scalability, while limiting environmental impact. Now is the time.
Joe Minarik is COO of DataBank, a provider of enterprise-class edge colocation, interconnection, and managed services.