Is Liquid Cooling a Cure for the Moore’s Law Breakdown?

A Microsoft project touted its recent re-investment in immersion cooling technology as not just a way to overcome the derailment of Moore’s Law by physics, but the only way. Hypothesis or hyperbole?

Scott Fulton III, Contributor

June 18, 2021


An “Innovation Story” published by Microsoft last April re-introduced readers to the company’s work with immersion cooling for data center servers, particularly as it relates to Azure.  But then it brought up an unusual connection — one whose existence may be a question in itself.  If two-phase immersion cooling were present in production data center environments, wrote Microsoft’s John Roach, the scalability limitations of processor designs at the atomic level could be overcome.

Put another way:  If heat is the obstacle that derailed Moore’s Law, why not use immersion cooling to transfer heat off the chips?  The observation that transistor densities could double every 18 months, while keeping processors affordable and desirable in the market, could get back on track, albeit submerged.
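For a sense of the compounding involved, here is a back-of-the-envelope sketch in Python.  The 18-month doubling period comes from the observation above; the starting transistor count is purely illustrative.

```python
# Back-of-the-envelope Moore's Law projection: transistor density
# doubles every 18 months, per the observation cited above.
DOUBLING_PERIOD_MONTHS = 18

def projected_transistors(start_count: float, months_elapsed: float) -> float:
    """Ideal transistor count after months_elapsed months of doubling."""
    return start_count * 2 ** (months_elapsed / DOUBLING_PERIOD_MONTHS)

if __name__ == "__main__":
    start = 1e9  # a hypothetical 1-billion-transistor processor
    for years in (3, 6, 9):
        count = projected_transistors(start, years * 12)
        print(f"After {years} years: {count / 1e9:.0f} billion transistors")
```

On paper, that is how a 1-billion-transistor part becomes a 64-billion-transistor part in nine years, which is exactly the kind of compounding physics has been interrupting.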

That assertion prompted Greg Cline, Omdia’s principal analyst for SD-WAN and data center, along with Vladimir Galabov, head of Omdia’s Cloud and Data Center Research Practice, to draw a stunning conclusion.  In their April 21 report, “Immersion cooling is heating up,” they wrote that Microsoft expanded its investments in liquid cooling technologies because “it is the only way to offset the slowdown of Moore’s Law.”

Could this case have possibly been overstated, maybe just a bit?  Is Omdia really saying immersion cooling is the one way data centers will overcome the breakdown of Moore’s Law, and that eventually they’ll all just have to sink or swim with it?

Data Center Knowledge (whose parent company, Informa, is also parent of Omdia) put the question to Dr. Moises Levy, Omdia’s new principal analyst for data center power and cooling.

“We are currently reaching a physical limit in transistor miniaturization,” responded Dr. Levy.  “At 7 nm, each transistor is the size of 10 hydrogen atoms laid side by side.  It is more expensive and technically difficult to keep up Moore’s Law.

“When we are unable to continue shrinking the size of each transistor to pack more on a processor,” he continued, “we would need to start making our processors bigger and bigger.  This means they would require more and more power.  Liquid cooling systems enable thermal management for higher density of electronics, since air cooling systems are no longer effective.  Liquid cooling solutions are currently being adopted in many data centers, and contribute to improving the power to cooling ratio, eliminating the need for complex airflow management, and helping to achieve sustainability goals linked with carbon emissions and water consumption.”
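To put Dr. Levy’s “power to cooling ratio” point in concrete terms, here is a minimal sketch using power usage effectiveness (PUE), the ratio of total facility power to IT equipment power.  Every wattage figure below is an assumption for illustration, not a measurement from Microsoft, Omdia, or any real facility.

```python
# Illustrative power-to-cooling comparison using PUE (power usage
# effectiveness): total facility power divided by IT equipment power.
# All wattage figures are assumptions, not measured data.

def pue(it_kw: float, cooling_kw: float, overhead_kw: float) -> float:
    """PUE = (IT + cooling + other overhead) / IT; 1.0 is the ideal floor."""
    return (it_kw + cooling_kw + overhead_kw) / it_kw

# Hypothetical 1 MW IT load: air cooling spends far more on fans and
# chillers than an immersion tank, where the fluid carries heat away.
air_cooled = pue(it_kw=1000, cooling_kw=450, overhead_kw=100)
immersion = pue(it_kw=1000, cooling_kw=60, overhead_kw=100)

print(f"Assumed air-cooled PUE: {air_cooled:.2f}")  # 1.55
print(f"Assumed immersion PUE:  {immersion:.2f}")   # 1.16
```

The closer the ratio gets to 1.0, the smaller the share of the facility’s power budget spent on anything other than the computing itself.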

Yeah, sure, but... are immersion tanks truly an inevitability?  Maybe not, Dr. Levy responded, if you’re willing to accept some even wilder possibilities.

“Another way to keep up Moore’s law is through new technological advances in nanotechnology and quantum computing,” he wrote.  “Companies such as Intel, IBM, Microsoft, and Google are already working on quantum computing, where we talk about quantum bits (qubits) and subatomic particles.”

In an article last January for Consulting-Specifying Engineer, Dr. Levy proposed a new means of visualizing performance metrics in a data center by measuring four key beneficial attributes — productivity, efficiency, sustainability, and operations — weighed in each instance against risk.  The result is what he calls a data center site risk metric, which he first proposed in a 2017 paper for the IEEE as a means of comparing the overall efficiency of multiple data centers against one another.  It would be interesting to see how Dr. Levy would evaluate a fully immersed data center in the context of site risk.
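Dr. Levy’s actual formulation lives in that 2017 IEEE paper.  Purely as an illustration of the idea described above, the hypothetical sketch below scores each of the four attributes and discounts each score by an assumed risk; none of the numbers or the weighting scheme comes from his work.

```python
# Hypothetical sketch only: this is NOT Dr. Levy's published metric.
# It merely illustrates "beneficial attributes weighed against risk"
# by discounting each attribute score (0-1) by an assumed risk (0-1).

ATTRIBUTES = ("productivity", "efficiency", "sustainability", "operations")

def risk_weighted_score(scores: dict[str, float], risks: dict[str, float]) -> float:
    """Average attribute score after discounting each by its risk."""
    return sum(scores[a] * (1 - risks[a]) for a in ATTRIBUTES) / len(ATTRIBUTES)

# Invented numbers for a fully immersed facility, for illustration only.
immersed_site = risk_weighted_score(
    scores={"productivity": 0.90, "efficiency": 0.85,
            "sustainability": 0.80, "operations": 0.70},
    risks={"productivity": 0.10, "efficiency": 0.10,
           "sustainability": 0.15, "operations": 0.30},
)
print(f"Illustrative risk-weighted score: {immersed_site:.2f}")  # 0.69
```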

About the Author

Scott Fulton III

Contributor

Scott M. Fulton, III is a 39-year veteran technology journalist, author, analyst, and content strategist, the latter of which means he thought almost too carefully about the order in which those roles should appear. Decisions like these, he’ll tell you, should be data-driven. His work has appeared in The New Stack since 2014, and in various receptacles and bins since the 1980s.

