Study: Data Centers Responsible for 1 Percent of All Electricity Consumed Worldwide
Latest research shows that massive gains in computing power are accompanied by only a modest increase in energy use, thanks to efficiency improvements and the shift to the cloud.
The amount of computing done in data centers more than quintupled between 2010 and 2018. However, the amount of energy consumed by the world’s data centers grew only six percent during that period, thanks to improvements in energy efficiency.
That’s according to research published today in the academic journal Science. The slow growth in energy consumption relative to the growth in computing power reflects the ongoing shift of computing from old, inefficient data centers operated by traditional enterprises, such as banks, insurance companies, and retailers, to newer facilities built by providers of cloud computing services, such as Amazon Web Services, Microsoft Azure, and Google Cloud.
According to the study, in 2018, the world’s data centers consumed 205 terawatt-hours of electricity, or about 1 percent of all electricity consumed that year worldwide. They consumed 1 percent in 2010 as well.
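As a sanity check on the “about 1 percent” figure, one can divide the study’s 205 TWh by an assumed global electricity consumption of roughly 23,000 TWh for 2018 — a commonly cited ballpark that does not appear in the article:

```python
# Rough share-of-global check. The world total is an outside assumption,
# not a figure from the study.
data_center_twh = 205     # the study's 2018 estimate for data centers
world_twh = 23_000        # approximate global electricity consumption, 2018 (assumed)

share = data_center_twh / world_twh
print(f"{share:.1%}")     # prints 0.9%, consistent with "about 1 percent"
```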
Titled “Recalibrating global data center energy-use estimates,” the report seeks to dispel the often fantastically exaggerated estimates of data center energy consumption and its growth curve that circulate as society increasingly relies on digital services.
In general, commercially operated cloud data centers are much better optimized for efficiency than the legacy enterprise data centers, since their operators have strong business incentives to waste less energy. The less energy a cloud data center uses, the higher the provider’s profit margin.
Corporate data centers, on the other hand, often lack such incentives; their managers are rewarded for maintaining uptime, not for doing so efficiently. These facilities are notorious not only for being designed and operated inefficiently but also for having their energy consumption tracked loosely by their managers – or not tracked at all.
The latest data center energy use findings come as European Union officials entertain imposing energy efficiency regulations on the bloc’s data center operators. As they do so, commercial data center providers have been lobbying them to create rules that would incentivize traditional enterprises to move out of their old data centers and into commercially operated facilities faster.
Writing about the new study on a company blog, Urs Hölzle, Google’s senior VP of technical infrastructure, said that on average, “a Google data center is twice as energy efficient as a typical enterprise data center. And compared with five years ago, we now deliver around seven times as much computing power with the same amount of electrical power.”
The new study, authored by a group of researchers from Northwestern University, UC Santa Barbara, and the US Department of Energy, shows that the energy efficiency gains the data center industry demonstrated a decade ago have persisted, although the latest gains aren’t as dramatic as they once were.
One of the authors is Jonathan Koomey, a former Lawrence Berkeley National Laboratory scientist and one of the leading authorities on data center energy use and its impact on the environment. He's been studying the subject for more than two decades.
A previous Koomey-led study of data center energy use in the US, which was paid for and published by the US Department of Energy in 2016, found that data centers in America collectively consumed 2 percent of all electricity used nationwide.
The 2016 study found that in the US, data center energy consumption grew by 4 percent between 2010 and 2014. That’s after it grew 24 percent in the preceding 5 years, and nearly 90 percent between 2000 and 2005.
The Shift to the Cloud
Speaking with DCK Thursday, Koomey said he and the other researchers found that there had been a dramatic decrease in recent years in the amount of “traditional” enterprise data center capacity online and a corresponding decrease in overall energy consumption by this class of computing facilities.
“Traditional data centers go down a lot,” he said. “The shift is both to hyperscale and to the cloud non-hyperscale.”
The authors considered hyperscale any cloud-provider data center larger than 40,000 square feet, Eric Masanet, associate professor at Northwestern University and the study’s lead author, explained in an email.
Smaller traditional data centers housed 79 percent of the world’s compute instances in 2010, the researchers wrote. By 2018, 89 percent of compute instances were hosted by cloud data centers, both hyperscale and smaller cloud computing facilities.
Better Gas Mileage
Servers, storage, and network hardware on their own consumed more energy in 2018 (130 TWh) than they did in 2010 (92 TWh). But these devices use energy much more efficiently now than they did a decade ago, delivering a lot more computing for every watt-hour used.
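Putting the article’s own numbers together shows the scale of that efficiency gain. Taking “more than quintupled” as a conservative 5x growth in computing against the 92-to-130 TWh growth in IT device energy:

```python
# Efficiency gain implied by the article's figures. The 5x compute growth
# is a lower bound ("more than quintupled"); energy figures are from the article.
compute_growth = 5.0                  # computing output, 2010 -> 2018 (lower bound)
energy_2010, energy_2018 = 92, 130    # IT device energy use, TWh

energy_growth = energy_2018 / energy_2010          # ~1.41x
efficiency_gain = compute_growth / energy_growth   # computing per TWh
print(f"computing per unit of energy improved at least {efficiency_gain:.1f}x")
```

So even by this lower-bound reading, each watt-hour did roughly three and a half times as much computing in 2018 as in 2010.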
Additionally, the authors said, new data suggested that modern data center infrastructure systems (cooling and power) had become so much more efficient that the decrease in their energy use was “enough to mostly offset the growth in total IT device energy use.”
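One way to see the infrastructure contribution is the implied overhead ratio (akin to PUE, total facility energy divided by IT energy), computed here from the article’s 2018 figures; the article does not state this ratio itself:

```python
# Implied facility overhead for 2018, from the article's own numbers.
total_2018 = 205   # TWh, all data center energy use
it_2018 = 130      # TWh, servers, storage, and network hardware

implied_overhead = total_2018 / it_2018
print(f"implied average overhead ratio in 2018: {implied_overhead:.2f}")
```

An average ratio of about 1.58 means cooling and power systems added roughly 58 percent on top of IT energy; as older facilities with higher overhead retire, that multiplier shrinks, which is how infrastructure gains can offset IT growth.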
Fighting 'Simple-Minded Extrapolations'
The findings contradict reports that come out periodically painting the world’s growing appetite for digital services as creating a new, quickly growing environmental threat in the form of cloud computing infrastructure.
One report the authors point to was published by the Independent in 2016. It said data centers had gone from consuming “virtually nothing 10 years ago to consuming about 3 per cent of the global electricity supply and accounting for about 2 per cent of total greenhouse gas emissions,” or the same carbon footprint as the airline industry. The report went on to predict that the total amount of data center energy use would triple over the next 10 years.
Koomey said such reports were based on too simplistic an approach to calculating energy consumption. The typical mistakes, he explained, include using projections of future data growth and extrapolating data center energy consumption growth to support it without taking into account energy efficiency gains; or taking a data growth rate from a particularly high-growth period and extrapolating it for many years into the future.
Combine those two, and “you end up with the possibility of some [projection] mistakes being very large,” he said. “Simple-minded extrapolations can get you in real trouble.”
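The failure mode Koomey describes can be sketched numerically: project demand growth alone versus demand growth net of efficiency gains. Both growth rates below are illustrative assumptions, not figures from the study:

```python
# Illustrative only: why extrapolating demand growth without accounting
# for efficiency gains inflates energy projections. Rates are made up.
demand_growth = 1.25      # assume computing demand grows 25%/yr (illustrative)
efficiency_gain = 1.20    # assume efficiency improves 20%/yr (illustrative)
years = 10
energy_index = 100        # starting energy use, indexed to 100

naive = energy_index * demand_growth ** years
adjusted = energy_index * (demand_growth / efficiency_gain) ** years

print(f"naive projection:    {naive:.0f}")    # ~931, more than 9x growth
print(f"with efficiency:     {adjusted:.0f}") # ~150, about 1.5x growth
```

The same demand curve yields wildly different energy projections depending on whether efficiency is modeled, which is how “virtually nothing to 3 percent” narratives arise.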
Koomey and his colleagues used a “bottom-up” approach to make their estimates. “We like to count gadgets,” he said. Instead of drawing conclusions based on data growth projections alone, “we’re focused much more on making sure that the physical characteristics of the systems are well represented.”
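A “count the gadgets” estimate in that spirit multiplies device counts by typical power draw, then applies a facility overhead factor. Every input below is an illustrative placeholder, not a number from the study:

```python
# Illustrative bottom-up estimate: count devices, multiply by average power
# draw, then apply facility overhead. All inputs are made-up placeholders.
HOURS_PER_YEAR = 8760

fleet = {  # device class: (global count, average watts per device)
    "servers": (50_000_000, 250),
    "storage": (10_000_000, 150),
    "network": (5_000_000, 100),
}
overhead = 1.6  # assumed facility overhead multiplier (cooling, power)

it_wh = sum(n * w for n, w in fleet.values()) * HOURS_PER_YEAR
it_twh = it_wh / 1e12
total_twh = it_twh * overhead
print(f"IT devices: {it_twh:.0f} TWh, with overhead: {total_twh:.0f} TWh")
```

Because each input is a physical quantity that can be checked against shipment data and hardware specs, errors stay bounded, unlike compounding growth-rate assumptions.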