Four Startups – Four Liquid Cooling Designs for Data Centers
These companies are innovating on the decades-old method for cooling computers.
July 30, 2018
Close to half of a data center's total energy use can go to keeping equipment properly cooled. The most common cooling method has been the simplest: refrigerating air and blowing it through rows of cabinets laden with heat-producing equipment.
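To put that energy share in perspective, here's a minimal back-of-the-envelope sketch in Python, using illustrative numbers rather than measured data, showing how the cooling fraction maps onto power usage effectiveness (PUE), the ratio of total facility energy to IT energy:

```python
def pue(it_kw: float, cooling_kw: float, other_overhead_kw: float = 0.0) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    return (it_kw + cooling_kw + other_overhead_kw) / it_kw

# Illustrative scenario (assumed figures): cooling eats close to half of the total.
it = 500.0       # kW delivered to servers
cooling = 450.0  # kW spent on cooling, about 47 percent of the 950 kW total
print(f"PUE: {pue(it, cooling):.2f}")  # prints "PUE: 1.90"
```

Under those assumed numbers, nearly a watt goes to cooling for every watt of useful compute, which is exactly the overhead liquid cooling aims to shrink.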
Easiest isn't always best, of course, and cooling with refrigerated air has many drawbacks, starting with the energy costs already mentioned.
These days, vendors have been coming up with a variety of alternative cooling solutions based on a technology that hasn't been the norm since the pre-microprocessor days, when mainframes ruled the roost. Using liquid coolants instead of forced air can, in many cases, reduce cooling-associated energy costs.
There are other advantages as well. One of the biggest is power density. Because of the efficiency of liquid cooling, data center operators can make better use of valuable floor space and pack far more processing power into each square foot than is possible with forced air. Liquid-cooled processors also run cooler, which improves their performance; that's the main reason so many supercomputers are liquid-cooled.
Big hardware vendors like Hewlett Packard Enterprise, Dell EMC, and Lenovo have been adding liquid cooling to their portfolios, often offering easy-to-deploy hybrid approaches that bring coolant into server cabinets to cool air, which fans then blow through the equipment. While these approaches can often cut cooling costs, many startups are offering more direct solutions, with efficiencies that can make cooling-related energy costs almost negligible while greatly increasing server density and CPU performance. Here are some of the more interesting examples:
Chilldyne
This five-year-old startup based in Carlsbad, California, takes a direct-to-chip approach to liquid cooling, sending liquid through a modified heat sink. The design uses a low flow rate and negative pressure, so that if a line breaks, fluid retreats from the electronics rather than spilling onto them.
Asperitas
While immersion-based cooling has been around for some time (gamers love it), it's definitely not for everybody. For starters, this isn't a solution that can be added to racks of servers that are already up and running. Asperitas's AIC24 "modular data center" system requires the IT equipment to be mounted in specialized modules, which can house any type of server board up to 12×13 inches.
The upside is not only inexpensive cooling but also high density. Each modular data center can support up to 22 kW of IT power while taking up less than 10 square feet of floor space, which might help the system find traction in small edge data centers at the base of cell towers.
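Those figures imply a striking power density. Here's a quick comparison, where the AIC24 numbers come from the paragraph above but the air-cooled baseline is an assumption for illustration, not a figure from Asperitas:

```python
def density_kw_per_sqft(it_kw: float, floor_sqft: float) -> float:
    """IT power density in kW per square foot of floor space."""
    return it_kw / floor_sqft

aic24 = density_kw_per_sqft(22.0, 10.0)      # Asperitas's stated specs
air_cooled = density_kw_per_sqft(8.0, 25.0)  # assumed typical air-cooled rack plus aisle space
print(f"AIC24: {aic24:.1f} kW/sq ft vs. air-cooled: {air_cooled:.2f} kW/sq ft")
# roughly 2.2 vs. 0.32 kW per square foot under these assumptions
```

Even granting generous assumptions to the air-cooled side, a density gap of that size is what makes tight edge-site footprints an attractive target.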
The company says it's been working with various technology partners, including Super Micro Computer, to develop mass-produced hardware to lower the cost of implementation, and in April announced a partnership with Boston Limited, a UK-based server manufacturer.
Iceotope
When UK-based Iceotope got started ten years ago, it offered an immersion-based system, Petagen, that was not unlike Asperitas's immersion technology. In recent years, however, the company has shifted gears and now offers Ku:l Sistem, a cooling method the company calls "non-submersive immersion" that requires no mechanical chilling and is easy for data center operators to implement.
"We actually don't fully immerse," Iceotope's CEO, David Craig, told Data Center Knowledge. "We use very little coolant, and we invented a whole new method of delivery that allows us to collect 100 percent of the heat with relatively small amounts of coolant."
In a nutshell, Ku:l Sistem circulates Galden, a dielectric coolant from the Belgian chemical company Solvay, direct-to-chip by means of a specially designed heat sink. Although this method is not unlike systems being marketed by Chilldyne, Ebullient, and others, there is a difference: here, the heat sink is designed to purposefully allow minute amounts of the non-corrosive, non-flammable coolant to spill onto the CPU and key electronic components, like RAM, to supply additional cooling. The heat collected by the circulating Galden is then removed using a plate heat exchanger in conjunction with the facility's water supply.
"The electronics are completely safe," Craig said. "We just deliver small amounts to the places that matter, so we work brilliantly well in hyperconverged systems and on the edge of network."
These examples illustrate several things data center operators need to consider when deciding if a liquid cooling solution fits their needs. First, solutions such as those offered by Chilldyne and Ebullient will require additional investments in infrastructure to move coolant through the premises. And immersion-based systems come with several potential obstacles, including limitations on the equipment that can be deployed and increased expenses for readying equipment for the technology.