The Biggest AI Data Center Stories That Shaped 2024

Explore the biggest trends in AI data centers for 2024 – from surging capex to edge innovations. Here are the top 10 stories driving the industry forward.

Christopher Tozzi, Technology Analyst

December 23, 2024

8 Min Read

There has been no shortage of predictions over the past several years about how AI will impact the data center industry and how data centers will help shape the evolution of AI. We’ve been told, for example, that AI will lead to a surge in power consumption by data centers and spur major new data center construction efforts.

Over the course of the past year, it has started to become clearer which of these predictions and projects will play out. We now have evidence, for example, about how AI is contributing to data center company profitability, as well as some of the challenges (like chip shortages) the industry faces in deploying AI workloads on a large scale.

At the same time, however, as-yet-unproven projections about AI’s impact on the data center industry continued to abound throughout 2024. There remain plenty of opinions about what’s coming next – like new “edge AI” architectures – and little in the way of proof that those opinions are accurate.

For full details on these and other data center trends involving AI, here’s a breakdown of the top Data Center Knowledge stories in this vein from the past year.

1. Moody’s Report Reveals Surge in Data Center Demand Driven by AI Boom

A July 2024 report from Moody’s about the expansion of data center capacity in response to the AI boom was notable not because it predicted the seemingly obvious – that companies will build more data centers to house AI workloads over the coming years – but because it quantified the projected impact of that expansion on data center operations. Most notably, Moody’s predicted that AI will contribute to a 23% overall increase in data center energy consumption between 2023 and 2028, and that energy use tied to AI workloads specifically will grow at an annual rate of 43% over the same period.

Related: 2024’s Biggest Data Center Construction Stories: A Year in Review

These numbers are predictions, and they could turn out to be wrong. But if you want to know in quantitative terms exactly how AI is poised to impact data center operations, this source is as good as any.
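
To put those figures in perspective, here is a minimal back-of-the-envelope sketch in Python showing what a 43% annual growth rate compounds to over the 2023–2028 window Moody’s describes. The baseline value is purely illustrative and not a figure from the report.

```python
# Back-of-the-envelope compounding of Moody's projected growth rate.
# The 43% annual rate and the 2023-2028 window come from the article;
# the baseline of 100 units is purely illustrative.

ai_growth_rate = 0.43      # projected annual growth in AI-related energy use
years = 2028 - 2023        # five-year projection window

baseline = 100.0           # arbitrary starting value
projected = baseline * (1 + ai_growth_rate) ** years

print(f"AI-related energy use multiplier over {years} years: "
      f"{projected / baseline:.1f}x")
# A 43% annual rate compounds to roughly a 6x increase over five years,
# which helps explain why overall data center energy use can still rise
# a comparatively modest 23% even as the AI-specific slice grows sharply.
```
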

2. AI Accelerated Servers Fuel Growth in Data Center Spending

Another report, released in September by the Dell’Oro Group, offered quantitative insight into what AI means for data center spending. It found that data center capital expenditures surged by 46% in the second quarter of 2024 alone – a trend that, if it holds, suggests AI will fuel an enormous increase in data center investment over the near future. The growth reflects not just AI hardware purchases but also the power and cooling systems necessary to support AI devices in data centers.

Related: AI Data Centers Pose Regulatory Challenge, Jeopardizing Climate Goals – Study

3. AI Revolution Will Add Fuel to Data Center Boom, BlackRock Says

BlackRock also weighed in during 2024 with projections for AI’s impact on data center growth. Its numbers were less precise, but it predicted that AI data centers will expand in capacity by between 60% and 80% per year over the next several years.

That said, the company didn’t define exactly what an “AI data center” is, or the extent to which the expansion of data centers in this category will contribute to overall data center capacity. Still, as the opinion of a company whose business is to predict and capitalize on major economic trends, BlackRock’s projections about AI’s role in data center expansion are significant.

4. Equinix Data Center Expansion Continues as Hyperscale, AI Demand Persists

In another data point that speaks, at least indirectly, to AI’s role in data center growth, Equinix, which operates data centers across the world, attributed its 8% increase in revenue this year in large part to AI. Not coincidentally, the company is also in the midst of rapidly expanding its data center footprint.

Equinix didn’t provide details about how much of its revenue growth was due to AI workloads specifically, and its CEO cautioned that it would take time for the industry to feel the full weight of AI. Still, if you’re willing to go out on a limb and assume that correlation implies causation to some extent, it’s a reasonable conclusion that the AI boom – which correlates with financial success this year for Equinix – is at least starting to pay dividends for data center operators.

Related: How LLMs on the Edge Could Help Solve the AI Data Center Problem

5. Bitcoin Miners Pivot to Data Center Operations Amid AI Boom

Another sign of AI’s impact on data center business strategies was a pivot by data center operators from facilities that cater to cryptocurrency mining toward ones focused on AI. That’s the move Iris Energy described to Data Center Knowledge this year.

This shift makes good sense given that interest in cryptocurrency has generally waned in recent years and that the same types of infrastructure and devices – like GPUs – that excel at crypto-mining also work well as AI hardware. But the trend is notable all the same because it suggests that, at least to some extent, expanded data center capacity to support AI will come in the form of crypto-mining facilities repurposed for AI, rather than brand-new data centers. In that sense, the repurposing of cryptocurrency data centers for AI workloads could reduce the amount of new data center investment fueled by the AI boom.

6. Nvidia CEO Jensen Huang and Mark Zuckerberg Tout Their Vision for AI

Other notable – albeit not exactly objective – opinions about the role of AI in data centers and beyond came this year from the CEOs of Nvidia and Meta. Speaking at SIGGRAPH this summer, the executives discussed, among other topics, how their companies are using AI internally – including, in Nvidia’s case, to help manage data center operations, according to CEO Jensen Huang.

The discussion offered few technical details, so it’s challenging to draw conclusions about what the use of AI inside data centers by companies like Nvidia and Meta actually entails, or what it may portend for AI’s impact on the way data centers operate. But it’s still interesting to hear what these companies – both of which sell AI products, of course, and therefore have an incentive to advance the narrative about AI’s increasingly central role in modern business – have to say about their internal use of AI.

7. HBM Chip Shortage: A New Bottleneck in the Data Center Supply Chain

It’s one thing to expand the capacity of data centers for hosting AI workloads. It’s another to deploy the actual server infrastructure that supports those workloads – and because of the shortage of high-bandwidth memory (HBM) chips reported this year, there is a risk that the expansion of AI-friendly data center space will outpace the growth of AI-friendly servers. That’s because HBM chips are a core component of the GPUs frequently used for AI training and inference.

This is an example of one of the challenges the data center industry will need to overcome to sustain continued growth in response to the AI boom.

8. How Heat Waves and AI Challenges Are Piling Pressure on Data Centers

Keeping AI infrastructure cool is another fundamental challenge that may hinder continued data center expansion, especially given the increased frequency and intensity of heat waves. AI chips produce a lot of heat under any circumstances, but dissipating that heat becomes even harder when a heat wave drives up the ambient temperature around a data center.

This is one reason why innovative data center cooling technologies, which can dissipate heat in energy-efficient ways, are likely to become a key element in continued data center expansion in the age of AI.

9. Edge AI: Why the Future of AI Compute is at the Edge

So-called edge AI could contribute to strategies for improving data center sustainability in the age of AI. Edge AI means having AI workloads process data at the network edge instead of in centralized data centers. Doing so could reduce energy consumption and improve performance by reducing the amount of data that needs to be transmitted to run AI workloads.

It’s worth noting, though, that AI processes like training tend to require large amounts of energy no matter where they take place – whether at the edge or in a conventional data center – so edge AI is unlikely to reduce energy consumption dramatically. Still, there could be some tangible sustainability benefits from advantages like reduced heat concentration (and, by extension, reduced energy consumption by cooling systems), since edge AI infrastructure doesn’t place large numbers of AI chips in close proximity to one another.

10. How LLMs on the Edge Could Help Solve the AI Data Center Problem

To provide a deeper dive into what edge AI might look like in practice, Data Center Knowledge covered one key type of edge AI use case: large language models (LLMs) deployed at the edge. By running LLMs on edge devices like smartphones, businesses can reduce the energy and compute demands that AI places on their data centers. Currently, taking advantage of edge devices for this purpose is challenging because most edge hardware is not optimized for LLMs, but that could change as chip manufacturers design more AI-friendly processors for devices like smartphones.
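
For a concrete picture of what that pattern looks like, below is a minimal sketch, using the open source Hugging Face transformers library, of running a small open model locally instead of calling out to a centralized data center. The specific model and prompt are illustrative assumptions, not details from the story.

```python
# Minimal sketch of on-device LLM inference: the model runs locally
# instead of sending the request to a centralized data center.
# The model name, prompt, and generation settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-0.5B-Instruct"  # any small open model illustrates the idea

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

prompt = "In one sentence, why can edge inference reduce data center load?"
inputs = tokenizer(prompt, return_tensors="pt")

output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

A real edge deployment would typically go further by quantizing the weights and using a mobile-optimized runtime, but the basic pattern of keeping inference on the device rather than in a centralized facility is the same.
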

About the Author

Christopher Tozzi

Technology Analyst, Fixate.IO

Christopher Tozzi is a technology analyst with subject matter expertise in cloud computing, application development, open source software, virtualization, containers and more. He also lectures at a major university in the Albany, New York, area. His book, “For Fun and Profit: A History of the Free and Open Source Software Revolution,” was published by MIT Press.
