Energy Efficiency Not Enough in Push for Data Center Sustainability
Energy efficiency is only part of the solution when it comes to data center sustainability, with right-sizing datasets and equipment also playing a role, according to HPE.
July 18, 2024
This article originally appeared in Light Reading.
Energy efficiency is part of the solution to reduce emissions from data centers, but it's not enough in and of itself. That was a message echoed by Hannah Brier, senior sustainable transformation technologist at Hewlett Packard Enterprise (HPE), during the company's recent event in London.
While the issue of data center energy use has been around for years now, it is being exacerbated by the AI boom, according to Brier.
"When you think about your infrastructure and you think about your data centers, straight away, you've got increased power density. So, that rack that you have in these data centers, there is more compute power going on in there, which also means there's going to be more heat generated as well," she said.
This is relevant for the telecommunications industry, which has not eschewed the AI hype. Indeed, quite the opposite is true, judging by recent industry events where generative AI (GenAI) has been a prominent topic. Yet, many telcos now have their own emissions targets to think about.
AI is highly emissions intensive, due to both training the models and using them. For example, Google's 2024 environmental report showed its emissions have risen by a whopping 48% since 2019, due mainly to AI. Perhaps even more worryingly, the company notes reducing emissions will be difficult because of the energy needed to power "the greater intensity of AI compute, and the emissions associated with the expected increases in our technical infrastructure investment."
Together with cryptocurrencies, AI is part of the reason why energy use from data centers may double by 2026, according to the International Energy Agency (IEA). At the same time, some countries have started to restrict new data centers because of concerns over their impact on the grid.
Too Much Data?
During her presentation, Brier highlighted HPE's approach to data center sustainability, which includes energy efficiency but goes beyond it to areas like software. Here, she noted, tweaks can be made to lower energy consumption.
Hardware efficiency is another factor. Often companies have more equipment than they need, which means some is unused. This not only goes against sustainability principles, but also drives up costs.
Meanwhile, Brier acknowledged that AI isn't always the right solution to a problem: despite all the current hype, it won't solve everything.
Some telcos have made similar observations. For example, Orange's chief AI officer, Steve Jarrett, told Light Reading earlier this year that the operator is looking to other solutions where applicable. "You don't want to use the large language model sledge hammer to hit every nail," he said.
Sue Preston, vice president of worldwide advisory and professional services for HPE global sales, added that not all AI is created equal, noting that predictive algorithms have been used for years without causing significant energy problems.
When companies do train AI, they often collect excess data "for a rainy day," without a clear purpose, Brier said. This echoes a recent report from Omdia (a sister company of Light Reading), created in partnership with NTT DATA and NetApp, which found that around 60% of the data companies store goes unused.
Earlier this year, Rika Nakazawa, group vice president for connected industry and head of sustainability for Americas at NTT DATA, told Light Reading that a possible solution is tagging data to indicate an expiration date.
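The expiry-tagging idea Nakazawa describes can be sketched in a few lines. The snippet below is purely illustrative: the `expires_on` field name, the TTL values, and the helper functions are assumptions for the sake of the example, not a description of how NTT DATA implements it.

```python
from datetime import date, timedelta

def tag_with_expiry(record: dict, ttl_days: int, today: date) -> dict:
    """Attach an expiration date so unused data can be purged later."""
    record["expires_on"] = (today + timedelta(days=ttl_days)).isoformat()
    return record

def purge_expired(records: list[dict], today: date) -> list[dict]:
    """Keep only records whose expiration date has not yet passed."""
    return [r for r in records if date.fromisoformat(r["expires_on"]) >= today]

# Example policy: debug logs kept 30 days, survey data kept a year.
today = date(2024, 7, 18)
records = [
    tag_with_expiry({"kind": "debug_log"}, ttl_days=30, today=today),
    tag_with_expiry({"kind": "survey"}, ttl_days=365, today=today),
]

# 90 days later, only the survey data survives the purge.
later = today + timedelta(days=90)
print([r["kind"] for r in purge_expired(records, later)])  # ['survey']
```

In practice the same policy would typically be enforced by the storage layer itself (for example, lifecycle rules in object storage) rather than application code, but the principle is the same: data that outlives its stated purpose is deleted instead of sitting on powered hardware.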
Right-sizing the dataset is also important when it comes to training AI, according to Brier. "When you're looking at your datasets you need to make sure that they are adequately sized before you train the model," she said.
Making Data Centers Cool
That is not to say energy efficiency does not matter. The energy used to power computing is roughly equivalent to that needed for cooling, according to the IEA. Here, Brier said, HPE is working to reduce the impact by using liquid cooling, which is more energy efficient and was originally developed by HPE for its supercomputers.
HPE is also taking this a step further in its recently announced partnership with Danfoss to reuse the heat generated by data centers.
The collaboration started with a modular data center built by HPE for Danfoss when it became a customer of HPE's GreenLake cloud platform, Preston said during the event. The data center was integrated with Danfoss' systems so that its waste heat could warm buildings on the company's estate.
In June, both companies announced a collaboration pairing HPE's modular data centers with Danfoss' heat reuse technology. They now want to work with joint customers to deploy the solution, Preston said, targeting use cases including greenhouses, agriculture and district heating.
It's worth noting that this idea isn't completely new. Companies are already using similar approaches to heat everything from homes to swimming pools, and even to farm eels.
Despite all these efforts, however, no predictions suggest that the carbon footprint associated with data centers and AI will go down anytime soon.