
How Edge Data Centers Will Save the Internet

Building data centers at the edge of the last-mile network will allow the wired and wireless worlds to operate in tandem and will enable new classes of applications.

Industry Perspectives

April 30, 2018



Cole Crawford is CEO and Founder of Vapor IO.

The wireless world and the wireline world are on a collision course. We built each of these nationwide infrastructures with different goals and constraints, using vastly different principles and geographic footprints. Without fundamental changes to converge these two communications systems, the internet itself will eventually break.

By placing data centers at the very edge of the wireless network, at the base of cell towers and other locations on the infrastructure side of the wireless last mile, we can alleviate a great deal of the pressure on the networks from applications and data that would otherwise have to traverse the wireline and wireless infrastructures.

A Tale of Two Networks

Below, on the left, is a map of the modern internet backbone in the United States. On the right is the publicly available spectrum map of a prominent US wireless carrier.

[Image: the US internet backbone (left) and a prominent US wireless carrier's spectrum map (right)]

Today these two infrastructures provide the illusion of working well together—after all, I can open my cell phone almost anywhere in the US to read my emails, browse the web or watch a movie.

But in reality the wireline and wireless infrastructures are physically and logically separate systems. They talk to each other, of course, but only through a Rube Goldberg machine of latency-inducing network hops, maze-like routings, protocol conversions, and data lakes.

Until very recently, the gap between the wireline and wireless infrastructures went largely unnoticed. Most of today's applications are just fine with human-scale latencies, with delay tolerances measured in seconds rather than milliseconds, and they don't require large amounts of data. When packets are delayed or lost in transit, the consequences are rarely serious.

In yesterday's world, the consequence of a delay or data loss was inconvenience. In today's world, it could crash a car. New applications bring many benefits, but they also raise the stakes.

At the End of the Day, It’s All About Transport

Having worked for the man who laid copious amounts of the fiber along I-70 (MAE East / MAE West), I have a keen sense of the limitations of the internet backbone. But everybody else should, too. Simple math makes it immediately clear that there is not enough fiber in the ground to handle the combined growth of the wireline and wireless internet. We are soon going to run out of the kind of connectivity we need to move the 80+ zettabytes of data we will have created and stored by 2025.
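To make that simple math concrete, here is a rough, purely illustrative calculation. It uses only the 80-zettabyte figure above plus an assumed round number (25 Tb/s) for the capacity of a modern long-haul fiber pair, so treat the result as an order-of-magnitude sketch rather than a measurement:

```python
# Back-of-the-envelope transport math (illustrative assumptions only).
ZETTABYTE = 10**21                        # bytes
TOTAL_DATA_BYTES = 80 * ZETTABYTE         # the 80+ ZB projected by 2025
SECONDS_PER_YEAR = 365 * 24 * 3600

# Assumed capacity of one modern long-haul fiber pair (round number, for illustration).
FIBER_PAIR_BPS = 25e12                    # ~25 Tb/s

# Sustained rate needed if all of that data had to cross the backbone once within a year.
sustained_bps = TOTAL_DATA_BYTES * 8 / SECONDS_PER_YEAR
print(f"{sustained_bps / 1e15:.1f} Pb/s")                                  # ~20 Pb/s
print(f"{sustained_bps / FIBER_PAIR_BPS:.0f} fiber pairs running flat out")  # ~800
```

Even under these generous assumptions, that is roughly eight hundred long-haul fiber pairs doing nothing else, which is the intuition behind keeping most of that data off the backbone entirely.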

Further, while the speed of light is fast, it's not THAT fast. The way current-generation routing works, the only way we can achieve a ubiquitous, low-latency, sub-10-millisecond real-time experience is by using local compute and leveraging dense local metro fiber and peering in ways that merge the wireless and wireline worlds. Moreover, many data streams and workloads that operate at the edge will never need to travel on the internet backbone; they can be processed and stored locally in an edge data center, one hop away from the wireless radio access network (RAN).
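As a rough illustration of the latency point, the sketch below estimates how far a packet can physically travel within a 10-millisecond round-trip budget. The fiber propagation factor and the per-hop overhead are assumed values chosen for illustration, not measured figures:

```python
# Back-of-the-envelope propagation math (illustrative assumptions only).
SPEED_OF_LIGHT_KM_S = 299_792          # vacuum, km/s
FIBER_FACTOR = 2 / 3                   # light in fiber travels at roughly 2/3 of c
PER_HOP_OVERHEAD_MS = 0.5              # assumed queuing/serialization cost per router hop

def max_one_way_distance_km(rtt_budget_ms: float, hops: int) -> float:
    """Distance reachable one way if the whole budget went to propagation plus hop overhead."""
    propagation_budget_ms = rtt_budget_ms / 2 - hops * PER_HOP_OVERHEAD_MS
    if propagation_budget_ms <= 0:
        return 0.0
    return SPEED_OF_LIGHT_KM_S * FIBER_FACTOR * (propagation_budget_ms / 1000)

# With a 10 ms round-trip budget, the reachable radius shrinks from ~1,000 km
# (propagation only) to ~200 km once eight router hops each way are counted,
# and that is before any server-side compute time is spent.
print(max_one_way_distance_km(10, 0))   # ~999 km
print(max_one_way_distance_km(10, 8))   # ~200 km
```

Under these assumptions, a sub-10-millisecond service simply cannot be served from a data center several states away; the compute has to sit within metro distance of the radio.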

Building the Local Internet Backbone

As our connected devices become more mobile and more dispersed, their data must traverse both the wireline and wireless networks, and those devices must rely on the two infrastructures working in concert, at very high levels of reliability and with very low, even machine-scale, latencies.

Again, placing data centers at the base of cell towers and other locations on the infrastructure side of the wireless last mile relieves much of this pressure.

With data centers at the edge, we can do “local breakout” in software, which lets us peel off the wireless data stream after it has been received by the antenna and processed by the baseband unit at the tower or aggregation hub, but before it passes through the wireless system’s legacy backhaul. This way, the data stream can be routed inside the edge data center or placed directly onto the internet.
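The following is a deliberately simplified sketch of the breakout decision itself, not any carrier's actual software: once the baseband unit hands off a packet, hypothetical logic either keeps it inside the edge data center or places it directly onto the internet, bypassing the legacy backhaul. The prefixes and function names here are invented for the example:

```python
# Simplified illustration of a local-breakout decision (hypothetical logic,
# not any carrier's actual implementation).
from ipaddress import ip_address, ip_network

# Assumed example prefixes served by workloads inside the edge data center.
EDGE_LOCAL_PREFIXES = [ip_network("10.20.0.0/16")]

def route_after_baseband(dst_ip: str) -> str:
    """Decide where a packet goes once the baseband unit hands it off."""
    dst = ip_address(dst_ip)
    if any(dst in prefix for prefix in EDGE_LOCAL_PREFIXES):
        return "edge-data-center"       # break out locally, never touch the backhaul
    return "direct-fiber-to-internet"   # bypass the legacy backhaul entirely

print(route_after_baseband("10.20.5.7"))      # handled one hop from the RAN
print(route_after_baseband("93.184.216.34"))  # placed directly onto the internet
```

The point of the sketch is only that the decision happens at the tower or aggregation hub, in software, rather than deep inside the carrier's core network.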

It might not be readily apparent, but many legacy wireless networks force data packets to trombone out to some central office (or, even worse, to a single point in the middle of the country) before an IP address is even assigned. Without local breakout, data cannot be IP-routed, even at the tower, until it has traversed that round trip. Local breakout makes it possible to process data at the edge or, alternatively, backhaul it over a very fast direct fiber link to a regional or centralized data center with much higher speed and reliability than if it had to return through the outdated legacy network.

Even when data must be delivered beyond the edge, such as to a central data center for cold storage or national distribution, the edge facility can pre-process it first. Workloads running on servers at the edge can evaluate the incoming data and deliver real-time decisions if necessary, while also running algorithms that reduce the quantity of data sent to a centralized data center by orders of magnitude, saving precious backhaul capacity.
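A minimal sketch of that pre-processing idea, assuming a hypothetical stream of numeric sensor readings (the article does not specify any particular data format): the edge facility reduces a large window of raw samples to a small summary record, and only that summary, plus any anomalous samples, needs to leave the edge:

```python
# Illustrative edge pre-processing: summarize raw readings locally and forward
# only the aggregate, cutting backhaul traffic by orders of magnitude.
# (Hypothetical data shape and thresholds, invented for the example.)
from statistics import mean

def summarize_window(readings: list[float], anomaly_threshold: float) -> dict:
    """Reduce a window of raw samples to one small record for the central data center."""
    anomalies = [r for r in readings if r > anomaly_threshold]
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "anomalies": anomalies,   # only the interesting samples leave the edge
    }

raw = [0.2] * 10_000 + [9.8]      # 10,001 raw samples collected at the edge
summary = summarize_window(raw, anomaly_threshold=5.0)
print(summary)                    # a handful of numbers instead of 10,001 samples
```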

The Internet Will Evolve

Building data centers at the edge of the last-mile network will allow the wired and wireless worlds to operate in tandem and will enable new classes of applications. These edge data centers will unlock the ability to deliver real-time wireless services, even those that require sub-10-millisecond latencies or transport large amounts of data. These new classes of edge applications will push much of their processing and data storage to the edge, enabling everything from tetherless virtual reality to self-driving vehicles.

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.

 
