Building a More Sustainable Data Center: Challenges and Opportunities in the AI Era

Data centers are under pressure to meet the demands of AI while improving sustainability. Discover the challenges and opportunities shaping this transformation.

Stephen Lawton

September 17, 2024

The surge in power requirements has resulted in the need for imminent data center design adaptations. (Image: Alamy)


Data centers use massive amounts of power and water to run and cool racks of servers, 24 hours a day, seven days a week. According to the International Energy Agency, data centers account for 1-1.5% of the world’s electricity consumption, and this is only expected to increase over the coming years.

The escalating demands of artificial intelligence (AI) are driving global data centers toward smaller, denser servers with hotter components. This surge in power requirements has created an immediate need for design adaptations in data center infrastructure.

AI Spending Soars

Spending in the global AI infrastructure market – including data centers, as well as networks and other hardware that support the use of AI applications – is expected to reach $423 billion by 2029, growing at a compound annual rate of 44% over the next six years. Meanwhile, sustainability pressures continue to grow.

According to the Uptime Institute Global Data Center Survey 2022, “Most operators expect carbon emissions reporting requirements soon – yet many are unprepared.”  

Governments worldwide are increasing their efforts to expand sustainability programs to reduce the strain on power grids and to identify ways to better manage water resources. Over the past decade, the results have been promising.

Power Usage Effectiveness

Power usage effectiveness (PUE), one of the primary metrics used for the past 15 years to measure energy efficiency, has seen recent improvement. PUE has stayed relatively flat since 2018, when the average for the largest data centers measured was 1.58; by 2022 it had edged down to 1.55, meaning that, on aggregate, a data center expends 55% as much energy on cooling, power distribution, and ancillary facility functions as on IT. The lower the ratio, the lower the expenditure on power, water, and related sustainability costs.
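
As a simple illustration of the metric, the sketch below computes PUE from annual energy figures. The absolute kilowatt-hour values are hypothetical; only the 1.55 ratio reflects the survey data cited above.

```python
# PUE = total facility energy / IT equipment energy.
# The kWh figures below are hypothetical; only the 1.55 ratio comes from the survey data above.
it_energy_kwh = 1_000_000          # energy consumed by servers, storage, and network gear
facility_energy_kwh = 1_550_000    # IT energy plus cooling, power distribution, lighting, etc.

pue = facility_energy_kwh / it_energy_kwh
overhead_ratio = pue - 1.0         # share of IT energy spent on non-IT overhead

print(f"PUE: {pue:.2f}")                               # 1.55
print(f"Overhead vs. IT load: {overhead_ratio:.0%}")   # 55%
```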

The Data Center Thermal Management & Sustainability End-User Survey 2022, conducted by the research firm Omdia, found that hyperscale data centers report the best PUE figures: 84% of respondents said they have a PUE lower than 1.5, and 44% specified lower than 1.25. Most respondents indicated their PUE will remain about the same through 2025, the report noted. However, 17% of respondents indicated a PUE higher than 1.5, demonstrating that there is room to improve energy efficiency.

Other report highlights: sustainability ranked among the top three selection criteria for survey respondents, and 45% of respondents said they use renewable energy for their data centers, though it represents less than 25% of their total power budget.

ESG Programs

Sustainability is not just a compliance exercise aimed at reducing negative environmental impact; it can also bring financial benefits to an organization. According to Gartner’s Unlock the Business Benefits of Sustainable IT Infrastructure report, “[Infrastructure and operations’] contribution to sustainability strategies tends to focus on environmental impact, but sustainability also can have a significant positive impact on non-environmental factors, such as brand, innovation, resilience and attracting talent.”

As a result, boards should embrace the financial opportunities of companies’ environmental, social, and governance (ESG) compliance rather than consider it just another unavoidable compliance expense without a discernible return on investment (ROI).

Gartner predicts 75% of organizations will have implemented a data center infrastructure sustainability program by 2027. The report's authors noted: “The environmental benefits of sustainable data centers and cloud services are clear, but the business benefits they deliver are often overlooked.” Sustainable IT is just one of the ways sustainability programs create value for the enterprise, the report notes.

To improve data center resilience, Gartner recommends that organizations expand their use of renewable energy through long-term power purchase agreements to contain costs, generate their own power where feasible, and reuse and redeploy equipment as much as possible to maximize the value of those resources.

Data Center Sustainability Resources

Resources abound for data center operators looking to reduce power and water consumption, including the DoE’s Energy Star program and its addendum, the IEA’s Data Centers and Data Transmission Networks, InformationWeek’s article How Can Data Centers Reduce Water Usage and Improve Efficiency?, Data Center Knowledge’s 5 Data Center Cooling Mistakes to Avoid, and a myriad of research reports, vendor-sponsored white papers, and technical documents online.

These resources can aid in the design of new data centers, helping to ensure that the proper balance of heat management, water, and water-replacement resources is included before construction begins.


Right-Sizing the Data Center

Over the past several years, the needs of data centers have changed significantly, said Jonathan Meade, chief operating officer at Meade Engineering, a company that designs the electrical infrastructure for data centers. “Capacity need overall, electrically speaking, is dramatically higher. The demand for what customers who are either building the data centers or utilizing the components within the data centers requires more power, which inherently is more resources,” he said.

The increased power demand is primarily due to denser servers that draw more energy. Meade said he used to build server cabinets that required 5 kW of power, which “was all the heat those servers could withstand.” Today, those same cabinets with denser hardware need 20 to 30 kW of power.

Interestingly, denser servers with high-powered graphics processing units (GPUs) for AI applications have reduced cooling requirements compared with central processing units (CPUs) for traditional business applications. Today, Meade noted, server racks that used to require 5 kW of power can now operate at 20-34 kW using the same amount of cooling due to advancements in server technology. “That’s really where that innovation has been pushing,” he said.

However, while there are some advances where more powerful processors need less cooling per processor, the vast increase in the GPU density of data centers overall means there are higher power and cooling demands.

The cost to build an “average-powered base building” for a data center, not including the computing resources, ranges from $125 to $200 per square foot, according to JLL, a professional services firm that specializes in real estate and investment management. As AI applications drive up the cost of server hardware and require more power and cooling, the “average” cost of constructing a data center will rise as well.

A key challenge facing data center operators building new facilities today is determining just how much energy capacity to build for, without under- or over-provisioning. Should too little power be brought into the facility, it might not be able to supply the appropriate amount of power to all the servers, let alone power the cooling operations. But overbuilding capacity wastes power and drives up costs.
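
A back-of-the-envelope sizing calculation illustrates the trade-off. The sketch below is a minimal example; the rack count, per-rack density, design PUE, and growth margin are all assumed values, not figures from the article.

```python
# Hypothetical capacity-planning sketch: facility power = IT load x PUE, plus a growth margin.
racks = 500
kw_per_rack = 20.0          # assumed dense-rack load (in line with the 20-30 kW figures above)
design_pue = 1.4            # assumed facility overhead
growth_margin = 0.15        # assumed headroom for future expansion; overshooting wastes capital

it_load_kw = racks * kw_per_rack
facility_load_kw = it_load_kw * design_pue
provisioned_kw = facility_load_kw * (1 + growth_margin)

print(f"IT load: {it_load_kw / 1000:.1f} MW")                                    # 10.0 MW
print(f"Facility load at PUE {design_pue}: {facility_load_kw / 1000:.1f} MW")    # 14.0 MW
print(f"Provisioned capacity: {provisioned_kw / 1000:.1f} MW")                   # 16.1 MW
```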

Clean Energy

Uwe Erlenwein, head of data center construction at IONOS in Baden-Württemberg, Germany, agrees with Gartner that installing a power-generating facility or deploying renewable energy-generating infrastructure can greatly reduce long-term power costs. This is particularly the case in Europe, where the focus is on clean and renewable energy. Erlenwein said that because data centers require such massive amounts of power, it is better to build them close to clean power sources such as solar, wind, or hydroelectric power.

Power Innovations

Transmitting power over a long distance certainly works, but power is lost in transmission, so siting data centers closer to the source is more power- and cost-efficient. IONOS, for example, is building its own wind farm to provide power to some new data centers.

Fifteen years ago, Erlenwein said, all the energy came from traditional, non-renewable sources, including coal, natural gas, and other fossil fuels. Today, the combination of clean sources is providing increased levels of power to European data centers.

Advances also are seen in the construction of data centers, he noted. Photovoltaic (PV) systems, which use solar panels to convert sunlight into electricity, were considered a fire risk in past years but now are popular for data centers. Erlenwein said his company recently completed a 1.6 MW solar plant adjacent to its data center in France to provide power when demand surges on the grid.


Nuclear-Powered Data Centers

One power source popular for large data centers has a sometimes less-than-positive image among the public: nuclear. However, the IEA considers nuclear power a safe and clean energy source. Small reactors are used worldwide, often in submarines and large naval vessels.

The World Nuclear Association estimates that more than 160 nuclear-powered vessels are currently in operation, their energy generated by more than 200 small nuclear reactors. Nuclear power currently supplies roughly 18.2% of US electricity generation, according to the US Energy Information Administration.

“I think you’ll see self-generation, whether it’s hydro, or even nuclear, because of the growing strain on the grid,” said Sean Farley, vice president of data center strategy for the Americas at JLL. Farley, who has previously managed data center operations for Rockwell Automation and Microsoft, said he expects to see an increase in the use of nuclear power for data centers over the next five years, although he acknowledges that this is “kind of aspirational thinking. That's just what some of these folks have to do to make sure they have this level of reliability.”

Alternative Power Sources

If the public grid is not perceived as reliable, he said, then self-generation, with the public grid as a back-up, might become the option of choice for very large-scale power requirements. The public grid is the primary source of power today, with diesel generators using vegetable oil or natural gas as the secondary source. It is all about reducing costs, he said. “Find the most efficient way to run your power plant and use as little power as possible for a given critical load.”

Erlenwein noted that hydrogen is used in Europe as a power source, and non-explosive propane as a cooling source, but these technologies are less common in the US due to concerns about what might happen if a problem occurs. The move to propane is due to its low global warming potential and the reduced use of chemical refrigerants, he said. The European trend is to use natural refrigerants already found in the atmosphere.


Another power source data centers require is batteries. The current technology is lithium-ion, but lithium, like other critical battery minerals, is a finite resource. John Young, founder and president of CyberDef, a cybersecurity consultancy that specializes in compliance, policies, processes, and procedures, cautions that new battery technologies are needed. Electric vehicles and other new electric technologies that require large, heavy lithium-ion batteries demand a lot of the world’s resources, he said. As such, a new battery technology is needed to supply tomorrow’s high-use applications.

Young, like Farley, also has a long history of data center management experience in technology and defense. He spent 22 years at IBM, including running its data center. Before that, Young ran the data center for US government supplier McDonnell Douglas.

Young noted that retrofitting older data centers is an emerging, albeit expensive trend. Adding the ability to use renewable energy – especially solar generated in the Southwest US – can improve the available power significantly. “If we have the right collectors that are able to store it in proper batteries, and we can make efficient use of it, it could change everything,” he added. He recommends building solar arrays in Arizona where it is arid, the land is flat, and there’s a low risk of natural disasters.

He also agrees that newer, non-water-cooling technologies will aid significantly. Relying on evaporative cooling and forced air alone to cool data centers wastes a lot of water that could otherwise be used for drinking, he said.

If power generation is the ‘heads’ on the proverbial sustainability coin, then cooling would be ‘tails.’ Water usage for cooling by data centers is massive, as is air flow for cooling server components.

Air Cooling

A recent Omdia survey reported that air handling units (AHUs) and chillers are the thermal management systems preferred by survey respondents, with direct expansion selected as the primary heat rejection method, followed by evaporative cooling.

Furthermore, the survey indicated that liquid cooling is at “an early stage of adoption in data centers”: 93% of those using liquid cooling apply it to less than 15% of data center IT loads as measured in kilovolt-amperes, or kVA. (kVA measures apparent power, the total power drawn from the supply, while kilowatts [kW] measure real power, the portion actually converted into useful work; the ratio of real to apparent power is the power factor.)
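
To make the distinction concrete, here is a minimal sketch; the load and power factor values are hypothetical and are not drawn from the survey.

```python
# Apparent power (kVA) vs. real power (kW): kW = kVA x power factor.
# The figures below are hypothetical and only illustrate the conversion.
apparent_power_kva = 100.0   # what the power distribution gear must be sized for
power_factor = 0.95          # assumed value, typical of modern server power supplies

real_power_kw = apparent_power_kva * power_factor

print(f"Apparent power: {apparent_power_kva:.0f} kVA")
print(f"Real (working) power: {real_power_kw:.1f} kW")   # 95.0 kW
```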

The interesting twist is that the IBM System/360 Model 91 Processor mainframe, introduced in 1964, was water-cooled. Digital Equipment Corp.’s VAX 11/780, introduced in 1977, also was water-cooled. Both systems were popular in data centers for more than two decades.

Potential for Liquid Cooling

Today’s liquid-cooling techniques generally don’t involve actual water, but rather non-conductive liquids such as mineral oil.

“There’s a point, somewhere between 50 and 100 kilowatts per rack, where you can’t move air around fast enough to cool the cabinet,” said Farley. “A new way to cool data center servers is needed.”

“Liquid cooling could become very quickly the cooling methodology for AI super-dense environments,” he continued. “That removes the traditional air moving and handling systems that we've been using for a long time in the space and when those go away – your chilled water loops and cooling towers and evaporative cooling – all goes away. And poof, you could run very, very low water consumption data centers, because you're essentially dunking your servers in mineral oil.”

Liquid cooling is essential because no other method can remove the heat from the much denser racks of processors, he noted. That said, it could be an “unanticipated consequence of AI, which in some ways is generating all this new resource consumption. It may put an end to the use of water, traditional water-based cooling systems in data centers.”
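
Farley’s 50-to-100-kW threshold can be sanity-checked with a rough heat-removal calculation. The sketch below is a minimal example; the rack power and allowable air temperature rise are assumed values, and standard sea-level air properties are used.

```python
# Approximate airflow needed to remove a rack's heat with air alone:
# volumetric flow = power / (air density x specific heat x temperature rise).
rack_power_w = 50_000        # 50 kW rack, the low end of the range Farley cites (assumed)
delta_t_k = 12.0             # assumed air temperature rise across the rack, in kelvin
air_density = 1.2            # kg/m^3, roughly sea-level air
air_specific_heat = 1005.0   # J/(kg*K)

flow_m3_per_s = rack_power_w / (air_density * air_specific_heat * delta_t_k)
flow_cfm = flow_m3_per_s * 2118.88   # convert m^3/s to cubic feet per minute

print(f"Required airflow: {flow_m3_per_s:.1f} m^3/s (~{flow_cfm:,.0f} CFM) for a single rack")
```

Roughly 3.5 cubic meters of air per second for one rack shows why fan-based cooling stops scaling as rack densities climb into this range.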

Challenging Conventional Wisdom

Farley said that a reluctance to challenge conventional thinking about how to cool data centers could well be holding back advancements in the industry. However, companies like Microsoft offer compelling case studies showing why straying off the beaten path can produce unforeseen benefits.

Traditionally, data centers were kept at a temperature determined by American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) guidance on the optimal operating temperature of server components. “You kept the data hall really cold and you wanted to keep your equipment happy,” Farley said. In approximately 2007 or 2008, Microsoft and other companies produced research showing that the cost of operating a data center at those temperatures exceeded the purchase price of the hardware over the asset’s lifetime.

After meeting with Intel and other chip vendors whose products were used in the data center, Microsoft determined that the ASHRAE operating temperature model was colder than required. By increasing the ambient temperature of the data halls by up to 12 degrees Fahrenheit, the data center was able to reduce its cooling cost significantly.  

Another bit of outside-the-box thinking – literally – was asking Hewlett-Packard and Dell to take a shipping container, install 2,000 servers in it, and ship it directly to the Microsoft data center. Doing so eliminated a great deal of cardboard waste. “Deployment was much faster and much more sustainable,” he noted. “It gave us a box with hard walls, which is easy to move around and drive efficiency up. We hit a PUE of 1.1.”

James Grice, an attorney and partner at Akerman LLP, said that while data centers look at leading-edge technologies for planning purposes, they tend to stay with well-established technologies for production servers. They tend to be reluctant to trust leading-edge technologies for servers that are contractually required to deliver very high uptime. The possibility of a new technology failing, leading to a massive lawsuit, is simply not worth the risk. “It’s not an industry where we're open to innovative ideas,” he said. “We’re not open to trying things out on the fly.”

Grice added that many of the coal-fired power plants being decommissioned are not being replaced immediately. He suggests that data centers affected by decommissioned plants replace the lost power with renewable sources instead.

However, he emphasized that the premium cost of power that customers of data centers must pay is not always the top priority. “If you need to serve latency-sensitive applications in New York City, being in a New York City data center is where you need to be.”

The Future of Sustainable Data Center Operations

Even as far more compute power is packed into smaller servers, the demands of AI and the greater heat generated by GPU-based servers compared with CPU-based servers will continue to put higher demands on cooling and power systems. Combine that with the increased migration of data center resources from corporate premises to the cloud, and we can expect to see continued high demand for data centers.

In the future, data centers will likely be built with the ability to channel more power to each rack and to employ improved chip-cooling technologies using water substitutes. Smaller, off-grid data centers, perhaps powered by small nuclear power plants similar to those in submarines, might well dot the nation in areas where local power grid capacity falls short of what the data center requires.

With an eye toward sustainability to improve and protect the environment, dramatic advances in AI applications, and corporate reliance on the cloud, the future for data center operations is indeed bright.
