AI Will Change the Nature of Data Center Builds in 2025
AI-driven transformation is reshaping data center builds for 2025 – from dense fiber systems to innovative cooling solutions. Here’s what to expect.
What a difference a year makes. Last year, we noted that the exponential growth in demand for AI compute in data centers would force more efficient processes, faster builds, and more creative problem-solving to address the persistent shortage of top IT talent.
This has certainly proven to be true – in fact, truer than anyone really expected.
According to a May 2024 outlook published by Goldman Sachs, AI workloads are now expected to drive as much as a 160% increase in data center power demand, underscoring the urgency of managing that growth as the race for resources heats up.
The IEA estimated that data centers worldwide consumed 460 TWh of electricity in 2022, roughly 2% of all generated power, and that figure is expected to double by 2026. The reasons are clear: AI implementations require much greater compute power than other forms of processing, as power-hungry GPUs labor to meet growing demand.
In 2024, the need for more efficient strategies became clear. In 2025, we will see those strategies put into practice. Already there are big moves and bold plans on the table: changes in data center builds that will take cloud compute to the next level.
The AI Drivers – Big Compute Goes Small
The spread of AI’s applications into every facet of personal and professional life has been breathtaking. I could only compare it to the earliest days of the World Wide Web, our first introduction to the global internet in the late 1990s. At first a curiosity, alternately hyped and dismissed, the internet became integral to modern life in record time.
It’s said that the telephone only became a common household fixture 50 years after its invention. The internet took about 20 years. Now, AI looks poised to do the same in a fraction of that time as it quickly finds new applications in the enterprise space, and the vast majority of that work will be supported by data centers.
The number of inventive enterprise uses for AI is going parabolic – we have barely scratched the surface of AI’s impact on commerce, science, and society itself. Ironically, the biggest innovation in decades is making its influence felt in ever smaller, more pervasive ways across the enterprise.
Data Center Construction Is Booming
The biggest names in tech are building like never before, bending their 10-year CapEx averages ever higher as the gold-rush race for AI compute gains steam.
It’s not only the technology of AI that’s evolving, but also the delivery model. AI-as-a-Service is paving a smooth road for enterprise adoption of AI capabilities, particularly generative AI that can fill multiple roles from customer service to long-term financial planning.
Indeed, data centers themselves are increasingly using GenAI to address the persistent shortage of skilled IT staff, deploying it to monitor, manage, and support lean IT teams so they can be more productive. With an intuitive way to ask questions and receive recommendations, a less experienced IT team can punch above its weight and relieve some of the labor stresses data centers face.
With these buildouts, reliable access to sufficient power remains a challenge. Data centers draw a growing percentage of generated power worldwide, and the trend will continue for the foreseeable future, with data centers accounting for as much as 44% of increased electrical demand through 2028, according to Bain & Company analysis reported by Utility Dive. The scarcity of excess energy supply in most areas is driving new data center builds to new and sometimes unexpected locations, either to secure proximity to affordable power generation sources or to lease dedicated grid power to ensure supply.
And we’ve all seen the stories of data centers’ recent embrace of dedicated nuclear power generation to support their growth. We expect to see even more of this in 2025 and beyond.
The choice of nuclear is logical: the source is stable, scalable, and relatively sustainable compared to fossil fuel-driven sources. At the same time, data centers are doing what they can to reduce energy consumption – both as a matter of economics and environmental responsibility – by deploying water cooling systems in place of less efficient forced air cooling.
As the scale of GPU-powered AI compute rises, these efficiencies will become more apparent, as will the benefits of increased network uptime, since excessive heat is a prime culprit in outages and premature component failure.
Shrinking the Profile of Infrastructure
Related to both power and cooling needs, the data center’s fiber infrastructure continues to become denser in AI compute facilities. GPUs in AI arrays must be fully networked – every GPU must be able to talk to every other GPU – which increases complexity by an order of magnitude and complicates cooling. To manage the sheer bulk of the required fiber plant, data centers will use highly dense fiber systems to make those countless connections, packing more fibers and connectors into the existing footprint to power their AI networks.
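To give a rough sense of why any-to-any GPU connectivity drives fiber counts up so quickly, here is a minimal back-of-the-envelope sketch in Python. The cluster sizes are illustrative assumptions, and real AI fabrics use switched topologies rather than literal point-to-point meshes, but the underlying quadratic growth in pairwise connections is what makes highly dense fiber systems so attractive.

```python
# Illustrative only: how any-to-any GPU connectivity scales.
# A naive full mesh of n endpoints needs n*(n-1)/2 links; production AI
# fabrics use switched topologies, but cabling still grows far faster
# than the GPU count itself.

def full_mesh_links(n: int) -> int:
    """Number of point-to-point links in a full mesh of n endpoints."""
    return n * (n - 1) // 2

for gpus in (8, 256, 1024, 4096):  # assumed, illustrative cluster sizes
    print(f"{gpus:>5} GPUs -> {full_mesh_links(gpus):>9,} pairwise links")
```

Running the sketch shows 28 pairwise links for 8 GPUs but more than half a million for 1,024 GPUs – the kind of growth that quickly exhausts conventional pathway capacity.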
By forcing more compute resources into fewer racks, data centers can reduce energy use and simplify cooling needs as well. Plus, as hyperscale data centers migrate from 2x400G (aggregate 800G) to native 800G, this advanced fiber infrastructure will provide some much-needed pathway capacity to accommodate the demand yet to come.
Multi-Tenant Data Centers – Standardization and Flexibility
I’ve spent a lot of time looking at the largest hyperscale data centers and the AI-as-a-Service model they license to the enterprise. But there’s another important side of the business to consider in 2025, and that’s how multi-tenant data centers (MTDCs) will forge a way forward for their enterprise customers. Whatever their vertical, enterprise needs are changing fast, and MTDCs must remain flexible enough to accommodate them.
Here, too, a standardized approach to denser fiber infrastructure is key because it reduces demands on IT staff and simplifies configuration changes. Several top manufacturers of fiber infrastructure are launching or improving simpler, more plug-and-play technologies to help all data centers, but particularly MTDCs, flatten the skill curve required to stay agile and responsive, maintaining SLAs even with leaner IT teams.
2025 Will Be 2024 – Only More So
The fundamental changes coming to data centers in this dawn of the AI age will be truly remarkable. From location to scale, hyperscale facilities and MTDCs alike will need to scale up their fiber capabilities while scaling down that fiber’s physical profile, adopt new cooling technologies, and take a fresh look at how they buy and use electrical power. Unfortunately, there is no end in sight to the ongoing shortage of top-skilled IT expertise, but AI itself is already demonstrating how it can help operators fill those gaps with GenAI-powered monitoring and management.
As AI continues to make inroads in the enterprise space, data centers will be called upon to supply the massive compute required to turn promises into practical business benefits. Like AI, data centers will innovate and adapt to meet changing needs and deliver the optimal solutions that this fast-growing industry needs.