Welcome to Your Compact, Data-Driven, Generator-Free Data Center Future
These three trends will shape data center technology south of the motherboard in 2020 and for years to come.
Given the data center industry’s cagey nature – the secrecy around critical infrastructure, the NDAs, and so on – we can’t make specific predictions without substantial risk of looking like total fools. But from conversations with vendors and analysts we can at a minimum get some idea of the directions data center technologies are moving in.
Here we’ll focus on three trends in data center tech south of the motherboard (silicon, networks, virtualization, containers, and so on being north) that we think will be significant in 2020 and beyond. First, the world of new possibilities machine learning and operational data collection open up for intelligent data center management tools. Second, the renewed focus on density in power and cooling tech, driven by machine learning and the need to shrink computing infrastructure footprint for edge deployments. Third, the spike in enthusiasm around technologies that may one day make diesel generators a thing of the past.
Next Step in Data-Driven Data Center Management
For years, big vendors have been talking about adding a predictive-analytics dimension to data center management tools, namely DCIM software. Meanwhile, smaller players, such as Nlyte and Vigilent, brought tools with predictive capabilities to market.
Two of those big vendors, Schneider Electric and Vertiv, said in December that they are now collecting enough operational data from customer devices to start rolling out viable predictive features.
“We have a very large data lake, with billions of rows of data, which we think is incredibly important,” Steve Lalla, executive VP in charge of Vertiv’s services division, told DCK. “We can start changing the way we deliver services and solutions. We can start being more predictive. We can start looking at SLAs.”
The vendor continuously collects data from customer systems – “when they allow us to” – through its monitoring software (on-prem and, increasingly, SaaS). Over time, it’s gotten better at normalizing and organizing the data to make it useful for analytics, Lalla said.
Schneider’s efforts to build out predictive data center management capabilities and deliver them in the form of Software-as-a-Service, or SaaS, started almost three years ago, Kevin Brown, senior VP of innovation and CTO of the company’s Secure Power division, told us.
Now, “we’ve got enough data in the cloud where we’re starting to roll out predictive analytics,” he said in December. “Much more sophisticated battery-aware models, machine-learning algorithms – those are all no longer theory. Those are coming out this quarter.”
Schneider is now collecting data from between 250,000 and 300,000 devices deployed in customer data centers, Brown said. The company hired a dedicated team of data scientists, and when it got to around 200,000 devices, the team started feeling confident about some of their algorithms’ accuracy, he said.
Confident enough to do things like predicting when a UPS battery might fail, for example. Schneider wants to do more, but it will need to collect even more data to do that. The more capable an algorithm, the more data it needs, Brown explained. “The bar keeps moving depending on how sophisticated you want to be with your algorithms.”
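To make the idea concrete, here’s a minimal sketch of how fleet telemetry of that kind could feed a battery-failure predictor. Everything in it – the feature names, the synthetic data, the thresholds, and the choice of a random-forest classifier – is a hypothetical illustration, not Schneider’s or Vertiv’s actual model.

```python
# Illustrative only: a minimal sketch of fleet telemetry feeding a battery-failure
# classifier. Features, data, and thresholds are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Pretend telemetry rows: [avg_temp_C, charge_cycles, internal_resistance_mohm, age_months]
X = rng.normal(loc=[25, 200, 30, 24], scale=[5, 80, 8, 10], size=(5000, 4))
# Pretend labels: 1 = battery failed within the following 90 days
y = ((X[:, 0] > 30) & (X[:, 2] > 35) | (X[:, 3] > 40)).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, y)

# Score a unit currently in the field and flag it for proactive replacement.
unit = np.array([[33.0, 310, 41.5, 38]])
risk = model.predict_proba(unit)[0, 1]
if risk > 0.7:
    print(f"Predicted failure risk {risk:.0%}: schedule battery replacement")
```

The point of the sketch is simply that once enough normalized operational data exists, flagging at-risk batteries becomes a fairly standard supervised-learning problem; the hard part, as Brown describes, is accumulating enough data for the predictions to be trustworthy.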
Andy Lawrence, executive director of research at Uptime Institute, said in a recent webinar that the emergence of machine learning has driven a resurgence in data center management software. The once-promising DCIM software market never saw the skyrocketing growth many had expected, but wide adoption has taken place, albeit slowly.
DCIM can now be considered a mainstream technology, Rhonda Ascierto, VP of research at Uptime, said. All data centers have some sort of DCIM in place, regardless of whether they call it DCIM or something else, she said. And the bottom line is, there’s been enough data center management software deployed to enable collection of data that can now be used to build ML-powered predictive analytics and automation features.
Both data availability and rapid progress in ML are giving data center management software a boost. But there’s also a third driver: edge computing. As companies lay out their plans for deploying many small compute nodes close to where data is generated, they quickly run up against the problem of operating such a distributed infrastructure in an economical way. Tools like DCIM, especially provided as cloud services (SaaS), are a natural fit here, enabling remote monitoring and management features from a centralized console.
Edge has become central to Schneider’s infrastructure management SaaS strategy. “The idea that going to the largest data center with a cloud-based management system – you know, we’ve kind of solved that by keeping the data onsite in a lot of cases,” Steven Carlini, VP of innovation and data center at Schneider, told us. “It really has more value when you’re deploying at scale. The real value is going to be at the edge.”
Denser, Smaller, and Soon, Ubiquitous
Edge computing has put ever more pressure on the engineers who design data center technologies to make things smaller and denser.
Schneider, for example, recently announced its smallest micro-data center yet: a 6U enclosure that can house servers, networking gear, and UPS and can be wall-mounted. Brown said he expects such small form factors to drive a lot of revenue for Schneider in 2020. “Less than a full-rack deployment is where the traction is,” he told us.
Vertiv in 2019 revamped its power portfolio and launched a family of UPS units that pack more power capacity per unit of space than ever before. Of all the company’s products, “that to me is the slam dunk for this coming year,” Vertiv’s Quirk said. The rackmount GXT5 UPS family, designed very much with edge computing in mind, ranges from 500VA to 10kVA (some models support 208V and some both 208V and 120V configurations).
Edge computing was also a big consideration behind Schneider’s partnership with the immersion-cooling technology firm Iceotope (Schneider’s venture capital arm is an investor in the firm) and the electronics distributor and IT integrator Avnet, announced in October.
Instead of dunking servers in a tub of liquid coolant or fitting pipes onto motherboards to deliver chilled water directly to chips, Iceotope’s approach is to flood a sealed server chassis with coolant. That means the solution can be deployed in standard data center racks, and standard servers can be retrofitted with liquid cooling.
The number-one problem immersion cooling solves is high power density. Growth in machine learning has driven growth in deployments of server GPUs, which are used to train deep-learning models. Those power-hungry chips can take rack power densities well beyond what a standard data center design is able to cool. Many users can still get away with air-based cooling, and liquid-cooled rear-door heat exchangers that cool air right at the rack have been the most popular approach to solving this problem.
But proponents of immersion cooling technologies emphasize their efficiency advantages. These solutions don’t require any fans – server fans are usually removed altogether. “You can probably get at least a 15-percent energy reduction in a lot of environments by going to liquid cooling,” Brown said.
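A back-of-the-envelope calculation shows where a saving of that size could come from once fans disappear. All figures below are hypothetical placeholders, not numbers from Schneider or Iceotope.

```python
# Back-of-the-envelope sketch of where a roughly 15-percent energy reduction can
# come from when fans are eliminated. All figures are hypothetical placeholders,
# and pump power for the coolant loop is ignored for simplicity.

it_load_kw = 100.0           # server power draw for a sample row of racks
server_fan_fraction = 0.10   # assumed share of server power consumed by internal fans
crah_fan_kw = 8.0            # assumed facility air-handler fan power serving that load

# Immersion cooling removes the server fans and the air movers that served them.
fan_energy_kw = it_load_kw * server_fan_fraction + crah_fan_kw
total_before_kw = it_load_kw + crah_fan_kw

savings = fan_energy_kw / total_before_kw
print(f"Estimated energy reduction from eliminating fans: {savings:.0%}")  # ~17%
```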
Additionally, “on the edge, it solves a lot of problems,” he said. Eliminating fans means eliminating other related parts, which means fewer components that can fail. Having high density in a small footprint makes it easier to deploy edge computing in places where there isn’t a lot of physical space available. It also solves the problem of dust that can damage IT equipment in places like manufacturing plants.
While vendors are excited about the edge, Uptime has yet to see a lot of demand for “net new” edge computing capacity, Ascierto said. To date, most of the demand for 100kW or below “micro data centers” has been driven by upgrades to server closets or remote locations where compute capacity already exists.
The market analyst said she doesn’t expect to see a spike in demand in 2020 either. The anticipated big wave of demand will likely come beyond 2020, once more IoT applications and 5G wireless infrastructure get deployed, she said.
Promise of Better Backup
Another big shift in thinking about data center design – one that’s only beginning now and may not materialize in a big way until sometime after 2020 – is the replacement of diesel generators with batteries or other technologies.
As Uptime’s Lawrence pointed out, generators are “a problem.” They are expensive to deploy and maintain, they pollute the atmosphere, and they make a lot of noise. So far, however, they’ve been an indispensable part of any data center that’s meant to keep running around the clock.
Data center operators have been exploring two alternatives to diesel generators: gas fuel cells and batteries, with lithium ion batteries being an especially promising technology.
Thanks to Bloom Energy, there are now multiple major fuel cell deployments at data center sites, but most of them use the cells to supplement grid energy. At least one, an eBay data center in Utah, uses Bloom fuel cells as its sole energy source, relying on the utility grid instead of generators for backup.
Uptime, Lawrence said, is aware of multiple “very interesting pilots” that started in 2019 to test out alternatives to diesel generators. Additionally, at least one major colocation provider has done some “significant research into this,” he said.
Thanks to the electric-vehicle industry’s strides in increasing energy density and reducing costs of lithium-ion batteries, the technology is quickly taking hold in the data center industry. For now, it’s being used to replace lead-acid batteries in UPS systems, but the runtimes it can provide are continuously expanding, and Schneider’s Brown said it’s fully possible that lithium-ion batteries will eventually be good enough to replace generators.
“I don’t think you’ll see it in 2020, but we track this pretty closely,” he said.
The key metric Schneider watches is how much runtime you can get out of a lithium-ion battery system that costs the same as a diesel generator. If two and a half years ago that runtime was 90 minutes, it’s now close to three hours, Brown said.
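That “runtime you can buy for the price of a generator” metric is easy to sketch. The function below illustrates the idea; the battery prices, generator budget, and load are hypothetical placeholders, not Schneider’s figures.

```python
# Minimal sketch of the "runtime at generator cost parity" metric described above.
# All prices, loads, and budgets are hypothetical placeholders.

def runtime_at_cost_parity(generator_cost_usd: float,
                           battery_cost_per_kwh_usd: float,
                           critical_load_kw: float) -> float:
    """Hours of backup a lithium-ion system buys for the price of a diesel generator."""
    battery_kwh = generator_cost_usd / battery_cost_per_kwh_usd
    return battery_kwh / critical_load_kw

# As batteries get cheaper per kWh, the runtime a fixed generator budget buys grows.
for price_per_kwh in (600, 400, 300):
    hours = runtime_at_cost_parity(generator_cost_usd=180_000,
                                   battery_cost_per_kwh_usd=price_per_kwh,
                                   critical_load_kw=200)
    print(f"${price_per_kwh}/kWh -> {hours:.1f} hours of runtime at cost parity")
```

In this toy example, halving the per-kWh battery price doubles the runtime available at cost parity, which is the kind of movement Brown describes watching.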
None of these trends started in 2019, and none of them, to the best of our knowledge, will reach any sort of definitive inflection point in 2020. Instead, these are some of the big developments that gained momentum in 2019, are expected to accelerate even further in 2020, and will shape data center technologies south of the motherboard for years to come.