Next-Gen Data Center Interconnect Tech Enables Explosion of Online Video and Cloud Services
Arista and Infinera demonstrate end-to-end 100 GbE connectivity over 150 km in third-party-validated test
It takes a lot of sophisticated technology to pull off something like Yahoo and CBS’s first-ever global webcast of an NFL game in October, which reportedly matched the quality of satellite TV broadcasts. One of the fundamental pieces of infrastructure high-quality internet video couldn’t exist without is high-bandwidth, low-latency data center interconnect technology: boxes that push massive amounts of data over long-distance optical networks at ultra-high speeds from one data center to another.
The explosion of internet video is changing nearly everything about how the internet is built, from its geographic layout down to the specific interconnect technologies inside data centers and the top-of-rack network switches that move packets between those optical interconnect boxes – also referred to as DCI – and the servers that store and process the data. One big change on the DCI front is the transition to 100 Gigabit Ethernet, the standard defined five years ago for pushing unprecedented amounts of data over networks.
While online video is the biggest driver for 100 GbE data center interconnection, other applications, such as cloud services and enterprise disaster recovery or business continuity, are also contributing to the shift to higher-bandwidth DCI.
“Video is probably the number-one driver, because that’s driving significant demand on the public internet,” Ihab Tarazi, CTO of Equinix, the world’s largest data center colocation and interconnection service provider, said. Equinix data centers around the world are where much of the interconnection between network carriers, digital content companies, cloud service providers, enterprises, and internet service providers happens.
DCI vendor Infinera and data center networking switch supplier Arista Networks, both of whose products Equinix deploys widely, recently published test results of a joint DCI solution that showcases the kind of capabilities modern technology for shuttling data over long distances has.
The test, overseen and validated by The Lippis Report, an independent data center networking technology testing organization, confirmed 100 GbE line-rate throughput at latency under 20 microseconds with zero loss for “any mix of traffic” end to end. That’s data traveling from a server in one data center through an Arista switch to an Infinera DCI box, over up to 150 kilometers of fiber to another Infinera DCI box at a remote data center, and through another Arista switch to another server, all in less than 20 microseconds, uncompromised. The solution was tested at 10 Gbps and at 100 Gbps, according to the test report.
The optical DCI transport platform was Infinera’s Cloud Xpress, tested with Arista’s 7280 switches.
Here’s a summary of the results, validated by The Lippis Report:
End-to-end 100 GbE line-rate throughput with zero loss for any mix of traffic
End-to-end latency under 20 microseconds between servers in different data centers
Up to 500 Gb/s of dense wavelength division multiplexing bandwidth from a single two-rack-unit Infinera Cloud Xpress
Ability to extend over 150 kilometers without any external amplification
Power consumption of less than one watt per Gb/s for the Cloud Xpress
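As a rough sketch of what those figures imply in practice, the arithmetic below derives a few numbers from the reported specs; these derived values are illustrative estimates, not figures taken from the Lippis Report itself.

```python
# Back-of-the-envelope figures derived from the results summary above.
# Inputs come from the article; outputs are illustrative estimates only.

DWDM_BANDWIDTH_GBPS = 500   # per two-rack-unit Cloud Xpress, per the summary
WATTS_PER_GBPS = 1.0        # "less than one watt per Gb/s" -- an upper bound

# Upper bound on power draw for a fully loaded unit
max_power_watts = DWDM_BANDWIDTH_GBPS * WATTS_PER_GBPS

# How many 100 GbE client links one unit could carry at line rate
client_links_100gbe = DWDM_BANDWIDTH_GBPS // 100

print(max_power_watts)     # under 500 W for the full 500 Gb/s
print(client_links_100gbe) # five 100 GbE links per 2RU box
```

In other words, a single two-rack-unit box could, by these figures, carry five line-rate 100 GbE links between data centers while drawing less than 500 watts.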
Arista, a major supplier of data center switches to web-scale data center operators, including Facebook, Morgan Stanley, and Netflix, released its 7280 switches into general availability earlier this year, according to Carl Engineer, who runs business development for the vendor. This is the first time Arista has partnered with an optical-interconnect vendor on a joint solution.
Arista 7280SE-68 data center switch (Image: Arista)
The purpose of the test was to ensure Arista’s new switch, one of the first 100 GbE top-of-rack data center switches on the market, worked with Infinera’s 100-Gig DCI, Engineer said.
“Cloud titans,” such as Netflix, eBay, Microsoft, and Google, among others, are keen to switch to 100-Gig Wide Area Networks because of skyrocketing network traffic. The combination of Infinera Cloud Xpress and Arista’s 100-Gig switches is a good way to do it with a minimal amount of equipment, he explained.
Image courtesy of Arista and Infinera
Some enterprise customers that have decided to move more of their infrastructure to the cloud can also benefit from using 100-Gig DCI technology to interconnect their on-premises data centers with the infrastructure of their cloud service providers.
Uptake for Infinera’s Cloud Xpress has been the strongest in the web-scale data center space, Jay Gill, principal product marketing manager at Infinera, said. “They’re the ones that really drove our Cloud Xpress product requirements,” he said.
Multi-tenant data center providers like Equinix are another category of customers creating demand for 100-Gig interconnect solutions.
A customer in an Equinix data center in Santa Clara, California, for example, can pay for a 100-Gig port on an Arista switch and, via Infinera, connect to an Arista switch at an Equinix data center in San Jose. The infrastructure essentially creates one virtual data center across the entire metro, Equinix’s Tarazi said.
Network cabling at an Equinix data center (Photo: Equinix)
While Equinix is familiar with the performance of the two products, the latest test results are “really good news” from both performance and distance perspectives, he said.
In addition to performance, a big benefit of Arista switches is their programmability, Tarazi added. A company like Equinix, which has a sophisticated interconnection platform it has developed in-house, can use its own Software Defined Networking technology. “That’s very significant for automation and agility,” he said.
Infinera and Arista aren’t the only vendors on the market that can provide this level of performance and programmability. Equinix also uses DCI solutions by Ciena and switches by Juniper, according to Tarazi. The data center provider uses Arista switches to connect customers to the public internet, for example, but relies on the combination of Juniper and Ciena for private cloud connectivity, he said.
“This is a new class of technology, and those guys are some of the leaders, but [they are] not the only ones,” Tarazi said.