
Building the Backbone of AI: Why Infrastructure Matters in the Race for Adoption

AI adoption is surging, but do businesses have the infrastructure to keep up? Interconnection and smart cloud strategies are critical, says Ivo Ivanov, CEO of DE-CIX.


AI is ready for businesses, but are businesses ready for AI? That’s one of the biggest questions that analysts, experts, and executives are asking themselves as we soar into 2025.

The pace at which new AI technologies have evolved over the past year is unprecedented, with virtually every business now firmly on board the AI train. Large Language Models (LLMs) now have reasoning capabilities, decision-making algorithms can better handle uncertainty, robots can learn through imitation and reinforcement, and supercomputer initiatives have improved the speed and scalability of training AI systems – and that’s just scratching the surface.

According to a 2024 MIT Technology Review survey, a staggering 95% of businesses are already utilizing AI in some way, and more than half are aiming for full-scale integration in the next two years. The momentum behind AI is nothing short of remarkable, but as with any emerging technology, there are peaks and troughs before a state of equilibrium is reached. According to Gartner’s Hype Cycle, many of the technologies and AI use cases outlined above are now at the “Peak of Inflated Expectations.” Expectations are high, and businesses are pursuing AI doggedly to get the jump on their competitors. But before we reach the “Slope of Enlightenment” and the “Plateau of Productivity” – the ultimate goal – we must apparently first trudge through the “Trough of Disillusionment.”


This disillusionment won’t impact every business, of course. Those that are well-prepared and have the right infrastructure and phased-growth mentality will be able to weather any temporary setbacks, but the majority of businesses – with their “inflated expectations” – run the risk of trying to run before they can walk with AI.

One of the primary challenges facing businesses when it comes to AI is having the foundational infrastructure to make it work. Depending on the use case, AI can be an incredibly demanding technology. Some algorithmic AI workloads rely on real-time inference, which will grossly underperform without a direct, high-bandwidth, low-latency connection. A prime example of “running before you can walk” with AI would be deploying such a use case without first ensuring that your network infrastructure is up to the task. So how can companies build an infrastructure that lets them “win” at AI?

Elevating Cloud Access

An organization’s path to the cloud is really the central pillar of any successful AI strategy. The sheer scale at which organizations are harvesting and using data means that storing every piece of information on-premises is simply no longer viable. Instead, cloud-based data lakes and warehouses are now commonly used to store data, and having streamlined access to this data is essential.


But this shift isn’t just about scale or storage – it’s about capability. AI models, particularly those requiring intensive training, often reside in the cloud, where hyperscalers can offer the power density and GPU capabilities that on-premises data centers typically cannot support. Choosing the right cloud provider in this context is of course vital, but the real game-changer lies not in the who of connectivity, but the how.

Relying on the public internet for cloud access creates bottlenecks and risks, with unpredictable routes, variable latency, and compromised security. This is where interconnection platforms equipped with cloud and AI exchange capabilities come into play. Data center and carrier-neutral versions of these platforms – becoming increasingly popular – provide businesses with resilient, direct, and secure pathways to multiple leading cloud operators, supported by Service Level Agreements (SLAs) that ensure high performance and reliability.


What’s more, they reduce costs associated with data movement by enabling SLA-backed private connectivity solutions that minimize cloud egress fees. For organizations with multi-location operations or redundancy requirements, leveraging diverse cloud regions ensures data proximity to users, minimizing latency and boosting resilience. A multi-cloud approach goes even further, allowing businesses to avoid vendor lock-in and optimize their setups with the best available services. With advanced cloud routing solutions handling the complexity, businesses can ensure seamless interoperability and low-latency data exchange, enabling AI applications to thrive on a robust, efficient, and secure infrastructure.
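To make the latency point concrete, here is a toy sketch (not from the article, with hypothetical region names and measurements) of the kind of decision a cloud routing layer makes: given measured round-trip latencies to several cloud regions, route a latency-sensitive AI workload to the nearest one.

```python
# Toy illustration with hypothetical latency measurements: pick the cloud
# region with the lowest round-trip latency for a latency-sensitive workload.
measured_latency_ms = {
    "us-east": 12.4,
    "us-west": 68.9,
    "eu-central": 95.2,
}

def nearest_region(latencies):
    """Return the region name with the lowest round-trip latency."""
    return min(latencies, key=latencies.get)

print(nearest_region(measured_latency_ms))  # -> us-east
```

Real cloud routing platforms weigh far more than latency (cost, egress fees, data-residency rules, failover), but the core idea – steering traffic to the best available region – is the same.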

Exploring the Data Center Landscape

The data center landscape is booming. According to our research, the US now has more than 11,000 MW of data center capacity, and the number of data center operators has increased by 250% in the past decade. An organization’s choice of data center is vital to AI transformation – a well-balanced approach that combines hyperscalers, colocation data centers, and even some on-premises facilities is a good way to diversify for future AI needs.

While hyperscalers are indispensable for training large AI models due to their power density and GPU capabilities, colocation data centers, while typically smaller, will continue to play an important role in inference – the process by which trained AI models generate real-time responses.

Therein lies the challenge. Despite the data center boom in US hubs such as Dallas, Phoenix, and New York, our research shows that vacancy rates have dwindled to as low as 1-4%, driven by surging AI and LLM adoption. To pursue their AI goals, businesses must therefore expand their horizons, exploring data center opportunities in secondary markets, even those outside of urban centers. That is now possible thanks to interconnection and the use of distributed internet exchanges (IXs).

The Power of Interconnection

Internet exchanges (IXs) are physical platforms where multiple networks, including internet service providers (ISPs), data centers, and content delivery networks (CDNs), interconnect to exchange traffic directly between one another via peering.

By facilitating data exchange through the shortest and fastest network pathways, IXs can improve network performance, reduce costs, and ensure low-latency connectivity for all participants. Crucially, they allow businesses to leverage data centers outside of their traditional geographical boundaries, so a business that cannot find capacity due to low vacancy rates in New York, for example, can seamlessly tap into capacity in an adjacent region.
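In practice, “peering” at an IX means each participant establishes a BGP session over the exchange’s shared fabric, either directly with other members or with the IX’s route servers. As a rough illustration only – the ASNs, addresses, and prefixes below are hypothetical, and real deployments add filtering and security policy – a minimal FRRouting-style configuration might look like this:

```
! Hypothetical sketch: BGP session to an IX route server (FRRouting syntax)
router bgp 64512                          ! our example private ASN
 neighbor 198.51.100.1 remote-as 64496    ! example route-server IP and ASN
 neighbor 198.51.100.1 description IX-RouteServer-1
 !
 address-family ipv4 unicast
  neighbor 198.51.100.1 activate
  neighbor 198.51.100.1 prefix-list OUR-PREFIXES out  ! announce only our routes
 exit-address-family
!
ip prefix-list OUR-PREFIXES seq 5 permit 203.0.113.0/24
```

Once the session is up, traffic to any peer learned via the exchange flows directly over the IX fabric rather than across the public internet – the short, fast pathway described above.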


The number of IXs in the US has surged in recent years, with the number of large-scale IXs (connecting more than 50 networks, many of them hundreds) having increased by 350% since 2014. Notably, the majority of these newly established IXs are operated according to the distributed data center and carrier-neutral IX model. These neutral models, free from vendor lock-in and with a higher density of interconnected networks, empower businesses to build diverse, geographically distributed infrastructures tailored to AI’s demands for quick, reliable access to data. Because these IXs aren’t tied to a single carrier or data center, they provide businesses with built-in resilience and redundancy – if one data pathway is congested, another can be used. Our research reveals that, today, more than 80% of IXs in the US have a neutral or hybrid-neutral operating model, overtaking the traditional data center/carrier-operated IX model.

High-speed, low-latency connectivity is in high demand, but waiting for new data center capacity simply isn’t viable for businesses that want to get ahead in the AI race. That’s why interconnection is key; it removes the geographical chains that have traditionally held businesses back and provides a solid connectivity base that will ensure a smooth and steady path through the AI revolution.

