Opinion: De-Electrification of the American Economy

Even a big pro-coal policy shift from Washington may not result in higher demand for thermal coal.

Bloomberg

April 18, 2017

Electricity transmission pylons (Photo: Sean Gallup/Getty Images)

Justin Fox (Bloomberg View) -- For more than a century after the advent of commercial electrical power in the late 1800s, electricity use in the U.S. rose and rose and rose. Sure, there were pauses during recessions, but the general trajectory was up. Until 2007.

The initial drop in electricity use in 2008 and 2009 could be attributed partly to the economic downturn. But the economy grew again in 2010, and every year since. Electricity use in the U.S., meanwhile, is still below its 2007 level, and seemingly flatlining. The change is even more dramatic if you measure per-capita electricity use, which has fallen for six years in a row. We're now back to the levels of the mid-1990s, and seemingly headed lower.

This is a really big deal! For one thing, it's yet another explanation -- along with tighter federal emissions rules, the natural gas fracking boom, and the rise of solar and wind power -- for why the past few years have been so tough on coal miners. It means that even a big pro-coal policy shift from Washington may not result in higher demand for thermal coal. For another, it seems to settle a turn-of-the-millennium debate about the electricity demands of the digital economy.

Businessman and technology analyst Mark P. Mills, now a senior fellow at the right-leaning Manhattan Institute, kicked things off in 1999 with a report stating that computers and the internet were already responsible for 13 percent of U.S. electricity demand and would be consuming 30 percent to 50 percent within two decades. In a subsequent op-ed for Forbes, charmingly titled "Dig More Coal -- the PCs are Coming," he and fellow Manhattan Instituter Peter W. Huber argued that:

"Yes, today’s microprocessors are much more efficient than their forerunners at turning electricity into computations. But total demand for digital power is rising far faster than bit efficiencies are. We are using more chips -- and bigger ones -- and crunching more numbers. The bottom line: Taken all together, chips are running hotter, fans are whirring faster, and the power consumption of our disk drives and screens is rising. For the old thermoelectrical power complex, widely thought to be in senescent decline, the implications are staggering."

A group of scientists at Lawrence Berkeley National Laboratory who studied energy use were dubious of these claims, and published a series of reports calling them into question. One 2003 paper concluded that direct power use by computers and other office and network equipment accounted for just 2 percent of electricity consumption in 1999 -- 3 percent if you counted the energy used in manufacturing them.

Since then, the digital takeover of the economy has continued apace. But it hasn't translated into an explosion in electricity demand. The "old thermoelectric power complex" was decidedly not on the cusp of a big boom in 1999. Instead, per-capita electricity use more or less stopped growing after then. Now it is falling.

Part of the reason is that a grim new economic era dawned in 2000 or 2001 that has been characterized by slow growth, declining labor-force participation and general malaise -- all of which tend to depress energy demand. But if you measure electricity use per dollar of real gross domestic product, the decline is just as pronounced, and it began much earlier than the fall in per-capita demand.
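
For concreteness, "electricity use per dollar of real GDP" is simply total consumption divided by inflation-adjusted output. Here is a rough sketch in Python; the input figures are illustrative round numbers, not official EIA or BEA data:

    # Electricity intensity of the economy: kilowatt-hours consumed per
    # dollar of real GDP. The inputs below are illustrative round numbers,
    # not official EIA/BEA statistics.

    def electricity_intensity(total_kwh, real_gdp_dollars):
        """Return kWh of electricity used per dollar of real GDP."""
        return total_kwh / real_gdp_dollars

    # Roughly 3.7 trillion kWh of retail electricity sales against roughly
    # $17 trillion of real GDP works out to about 0.22 kWh per dollar.
    print(f"{electricity_intensity(3.7e12, 17.0e12):.2f} kWh per dollar of real GDP")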

In an article published in the Electricity Journal in 2015, former Lawrence Berkeley energy researcher Jonathan G. Koomey, now a consultant and a lecturer at Stanford, and Virginia Tech historian of science Richard F. Hirsh offered five hypotheses for why electricity demand had decoupled from economic growth (which I've paraphrased here):

  1. State and federal efficiency standards for buildings and appliances have enabled us to get by with less electricity.

  2. Increased use of information and communications technologies has also allowed people to conduct business and communicate more efficiently.

  3. Higher prices for electricity in some areas have depressed its use.

  4. Structural changes in the economy have reduced demand.

  5. Electricity use is being underestimated because of the lack of reliable data on how much energy is being produced by rooftop solar panels.

The Energy Information Administration actually started estimating power generation from small-scale solar installations at the end of 2015, after Koomey and Hirsh's paper came out, and found that it accounted for only about 1 percent of U.S. electricity. That estimate could be off, and there's surely room for more study, but mismeasurement of solar generation doesn't seem to be the main explanation here.

Which leaves, mostly, the possibility that life in the U.S. is changing in ways that allow us to get by with less electricity. This still isn't necessarily good news -- those "structural changes in the economy" include a shift away from manufacturing toward sectors that may not provide the kinds of jobs or competitive advantages that factories do. When you look at electricity use by sector, in fact, it's the decline in industrial use since 2001 that stands out.

Still, some of that decline is surely due to efficiency gains. The corporate focus on costs has increasingly come to include energy costs, and parts of the corporate world have also reorganized themselves in ways that make saving energy more of a priority.

Consider the shift to cloud computing. From 2000 to 2005, electricity use by data centers in the U.S. increased 90 percent. From 2005 to 2010, the gain was 24 percent. As of 2014, data centers accounted for 1.8 percent of U.S. electricity use, according to a 2016 Lawrence Berkeley study, but their electricity demand growth had slowed to a crawl (4 percent from 2010 to 2014). What happened? The nation outsourced its computing needs to cloud providers, for whom cutting the massive electricity costs of their data centers became a competitive imperative. So they innovated, with more-efficient cooling systems and new ways of scaling back electricity use when servers are less busy.
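
Those period totals are easier to compare as compound annual growth rates. A quick back-of-the-envelope calculation in Python, using only the percentages cited above:

    # Convert the period growth figures cited above into compound annual
    # growth rates, to make the slowdown easier to compare.

    def annual_rate(total_growth, years):
        """Compound annual growth rate implied by total growth over a period."""
        return (1 + total_growth) ** (1 / years) - 1

    periods = [
        ("2000-2005", 0.90, 5),  # +90 percent over 5 years
        ("2005-2010", 0.24, 5),  # +24 percent over 5 years
        ("2010-2014", 0.04, 4),  # +4 percent over 4 years
    ]

    for label, growth, years in periods:
        print(f"{label}: ~{annual_rate(growth, years):.1%} per year")
    # Prints roughly 13.7%, 4.4%, and 1.0% per year -- growth slowing to a crawl.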

In much of the world, of course, electricity demand is still growing. In China, per-capita electricity use has more than quadrupled since 1999. Still, most other developed countries have experienced a plateauing or decline in electricity use similar to that in the U.S. over the past decade. And while the phenomenon has been most pronounced in countries such as the U.K., where the economy has been especially weak, it's also apparent in Australia, which hasn't experienced a recession since 1991.

So is electricity use in the developed world fated to decline for years to come? Well, not exactly fated. Check out that bottom line in the last chart. Transportation now accounts for just 0.3 percent of retail electricity use in the U.S. If the shift to electric vehicles ever picks up real momentum, that's going to start growing, and fast. Dig more coal (or drill for more natural gas, or build more nuclear reactors, or put up more windmills and solar panels) -- the Teslas are coming.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
