DOE Report Exposes Critical Impact of AI on Data Center Power Consumption

US energy officials have presented a detailed roadmap for meeting AI’s soaring energy demands while maintaining grid reliability and data center sustainability.

Sean Michael Kerner, Contributor

September 10, 2024

5 Min Read
How do we solve the AI data center power problem? (Image: Alamy)

The US Department of Energy (DOE) has partnered with data center industry experts to address the escalating energy needs of artificial intelligence and digital infrastructure.

A new report, published by the DOE in collaboration with a wide range of industry stakeholders, offers a detailed roadmap for meeting these growing demands while maintaining grid reliability and environmental sustainability.

The report (PDF), titled ‘Recommendations on Powering Artificial Intelligence and Data Center Infrastructure’, highlights the soaring power demands from data centers, particularly those driven by AI applications.

With hyperscale facilities now requesting 300-1,000 MW of power on lead times of just one to three years, local grids are struggling to keep pace, the report states.

The expansion of large language models (LLMs) was identified as a significant factor in this surge, raising concerns about future energy consumption as AI becomes more deeply integrated into society.


Data Center Expert Advisory Board

After consulting with dozens of industry stakeholders, including Amazon, Google, Meta, OpenAI, Digital Realty, and QTS, the DOE presented a number of key findings and recommendations, among them:

  1. The creation of a data center AI testbed within the DOE to facilitate partnerships between national labs, academia, and industry in developing energy-efficient AI algorithms.

  2. Collaboration between energy utilities, data center developers and operators, and other key stakeholders to discuss how to address current electricity supply bottlenecks.

  3. A “rapid assessment” of the cost, performance, and supply chain issues facing power generation, storage, and grid technologies to support regional data center expansion.


A key focus of the DOE recommendations is the potential for data centers to transition from being “passive” power consumers to active participants in grid management.
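The report stops short of prescribing how that participation would work, but the general pattern resembles demand response: deferring flexible workloads, such as batch AI training, when the grid is under stress. Below is a minimal, hypothetical sketch of that scheduling logic; the stress threshold, job names, and grid signal are illustrative assumptions, not details from the DOE report.

```python
# Hypothetical demand-response sketch: defer flexible batch jobs when a grid
# stress signal is high. The threshold, job names, and signal source are
# illustrative assumptions, not details from the DOE report.
from dataclasses import dataclass


@dataclass
class Job:
    name: str
    flexible: bool  # can this workload tolerate being deferred?


def schedule(jobs: list[Job], grid_stress: float, threshold: float = 0.8) -> dict:
    """Split jobs into run-now vs. deferred based on a 0-1 grid stress signal."""
    run_now, deferred = [], []
    for job in jobs:
        if grid_stress >= threshold and job.flexible:
            deferred.append(job.name)  # shed flexible load during peak stress
        else:
            run_now.append(job.name)   # latency-sensitive work keeps running
    return {"run_now": run_now, "deferred": deferred}


jobs = [Job("inference-api", flexible=False), Job("llm-training-batch", flexible=True)]
print(schedule(jobs, grid_stress=0.9))
# {'run_now': ['inference-api'], 'deferred': ['llm-training-batch']}
```

In practice, such a signal would come from a utility or grid-operator program, and latency-sensitive workloads such as inference would typically be excluded from curtailment.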

The report also encourages the development and deployment of emerging clean energy technologies such as advanced nuclear, enhanced geothermal, and long-duration energy storage.

“The scale of the potential growth of both the electricity and the information technology sectors due to AI is extraordinary and represents the leading edge of projected electricity demand growth,” the report reads.

Commenting on the report, Dan Thompson, principal research analyst for data centers at S&P Global Market Intelligence, told Data Center Knowledge: “It’s interesting that we’ve reached the point where the DOE thinks it needs to get involved to help alleviate the problems data center owners/operators and utilities are experiencing in connecting all these new data center projects.


“This is a reminder that the local issues we keep hearing about for grid operators exist across the country, rather than just in one or two hotspots.”

Data Center Knowledge has approached the DOE with additional questions. This article will be updated if we hear back.

What’s Missing in the DOE Report?

While there is no shortage of recommendations to digest, analysts said the report is lacking in a few key areas.

The paper emphasizes the need for public-private collaboration to find ways to make LLMs more efficient, but Thompson noted that it does not discuss making the data centers themselves more efficient.

“In other countries around the globe, the very first thing we’ve seen regulators target is the power usage effectiveness (PUE) of data centers, in an attempt to curb excess power consumption right off the bat,” Thompson said. “Interestingly though, PUE isn’t mentioned in the DOE report.”

Of note, in a recent interview with Data Center Knowledge, Peter de Bock, program director of the US Department of Energy’s Advanced Research Projects Agency – Energy (ARPA-E), outlined his views on why PUE is not always an ideal metric.


"PUE has helped the industry focus on sustainability, and it’s been great for that,” de Bock said. “PUE also has its challenges.”

Thompson also observed that the report focuses entirely on powering AI, rather than on powering data centers more broadly.

“Perhaps this is semantics, but there are some very important topics and initiatives suggested for the DOE to take action on, but what happens if interest in AI as a technology falters?” Thompson said.

“As we’ve seen from our research, there is a lot of investment in data centers happening globally that is being tied to AI – no question – but there also continues to be large-scale data center developments happening that are definitely attributed to the cloud, which may include AI, but is not necessarily limited to AI.”


Tackling the Data Center Energy Conundrum

Though this latest DOE report is focused squarely on addressing AI data center power issues, the agency has multiple efforts underway to enable sustainable scaling of the data center industry.

Among the numerous initiatives is COOLERCHIPS – a multi-year effort to develop cooling technologies that keep data center silicon from running as hot as it does with today’s approaches.

“We have projects ongoing on single and two-phase immersion cooling and direct-to-chip cooling, as well as others and we intend to have a proof of concept by the first half of 2026,” de Bock said during this year’s Data Center World conference. “By 2030, the country with the most efficient, powerful, and lower-TCO data centers will be at a major advantage.”


More recently, in July 2024, the DOE announced the Frontiers in Artificial Intelligence for Science, Security, and Technology (FASST) initiative.

Under the FASST roadmap, the DOE and its 17 national laboratories have been tasked with building the “world’s most powerful integrated scientific AI systems for science, energy, and national security.”

The initiative will focus on addressing AI energy challenges through the development of new clean energy sources and “highly energy efficient” supercomputers.

“Artificial intelligence is an innovative technology that can help unleash breakthroughs in energy technologies and enhance our national security,” said US Secretary of Energy Jennifer Granholm.

“FASST builds on DOE’s role as the nation's steward of advanced supercomputing and research infrastructure across our 17 national labs to provide a national capability in AI and enable technological breakthroughs for decades to come.” 

About the Author

Sean Michael Kerner

Contributor

Sean Michael Kerner is an IT consultant, technology enthusiast and tinkerer. He consults to industry and media organizations on technology issues.

https://www.linkedin.com/in/seanmkerner/
