In 1998 false reports began circulating in the media about Bob Hope's death, prompting him to say that "rumors of my death are greatly exaggerated." Perhaps the same applies to some - not all - of the projections of exponential growth in electricity demand from data centers and artificial intelligence (AI). Everybody, it seems, is analyzing and forecasting how much data centers will grow and how much extra electricity will be needed to keep them running. The Electric Power Research Institute (EPRI) is looking into this, as are many others. The US National Academies of Sciences, Engineering and Medicine has organized a workshop on the Implications of Artificial Intelligence-Related Data Center Electricity Use and Emissions, noting that:
"Global adoption of artificial intelligence (AI) in recent years has spurred significant construction and investment in new data centers and cloud computing. These data centers require large-scale continuous power, posing challenges for both local electric grids and broader climate goals."
To get a sense of the scale, Tract, a developer of master-planned data center parks, announced in August 2024 its acquisition of a 2,069-acre land parcel in Buckeye, near Phoenix, Arizona, with the intention of developing up to 20 million square feet of dedicated space for use by as many as 40 data centers. The company said it was working with the local utility to secure up to 1.8 GW of capacity. As a point of reference, the two recently completed reactors at the Vogtle Nuclear Power Plant in Georgia, Units 3 and 4, have a combined capacity of 2.234 GW.
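As a rough back-of-envelope illustration of that scale - assuming, purely for arithmetic's sake, that the 1.8 GW were spread evenly across all 40 planned facilities - the figures quoted above work out as follows:

```python
# Back-of-envelope sketch using only the figures quoted above.
requested_capacity_gw = 1.8   # capacity Tract is seeking from the local utility
max_data_centers = 40         # planned number of data centers on the parcel
vogtle_3_and_4_gw = 2.234     # combined capacity of Vogtle Units 3 and 4

per_site_mw = requested_capacity_gw * 1000 / max_data_centers
share_of_vogtle = requested_capacity_gw / vogtle_3_and_4_gw

print(f"Average load per data center: {per_site_mw:.0f} MW")   # ~45 MW
print(f"Share of Vogtle 3+4 capacity: {share_of_vogtle:.0%}")  # ~81%
```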
Noting that on average, a ChatGPT query consumes nearly 10 times as much electricity to process as a Google search, Goldman Sachs Research estimates that data center power demand will grow 160% by 2030. At present, data centers worldwide consume 1-2% of overall power, with the percentage likely to rise to 3-4% by the end of the decade. A single ChatGPT query requires 2.9 watt-hours of electricity, compared with 0.3 watt-hours for a Google search, according to the International Energy Agency (IEA).
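The "nearly 10 times" figure follows directly from the IEA per-query numbers; a minimal check:

```python
# Quick check of the "nearly 10 times" claim using the IEA per-query figures cited above.
chatgpt_wh_per_query = 2.9
google_wh_per_query = 0.3

ratio = chatgpt_wh_per_query / google_wh_per_query
print(f"ChatGPT vs Google energy per query: {ratio:.1f}x")  # ~9.7x, i.e. "nearly 10 times"
```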
Utilities, of course, have experienced big spikes in power demand growth in the past, including with the rapid adoption of TVs, central air conditioning and other appliances - and more recently electricity-hungry electric vehicles and heat pumps. The electrification of heating, transport and industry is also expected to add to demand. By some estimates, data centers are expected to use as much as 8% of US power by 2030, compared with 3% in 2022.
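To put those percentages in absolute terms, a hedged sketch - assuming, as an outside figure not from the article, total US electricity consumption of roughly 4,000 TWh per year, held flat for simplicity - looks like this:

```python
# Illustrative translation of the percentage estimates above into absolute terms.
# Assumption (not from the article): ~4,000 TWh of total US electricity use per year.
us_total_twh = 4000

share_2022 = 0.03
share_2030 = 0.08

print(f"Data centers in 2022: ~{us_total_twh * share_2022:.0f} TWh")  # ~120 TWh
print(f"Data centers in 2030: ~{us_total_twh * share_2030:.0f} TWh")  # ~320 TWh
```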
While all indicators point upward, there are those who deflate much of the hype, primarily because of expected energy-efficiency improvements that many of the projections don't fully consider (Box next page). During the period 2010 to 2018, for example, global data-center services increased 6.5x but their electricity use barely increased - by an estimated 1.065x - according to Jon Koomey and colleagues at the Lawrence Berkeley National Laboratory (LBL). Their 2016 report on US data center energy use, which is being updated, notes:
"From 2000-2005, server shipments increased by 15% each year resulting in a near doubling of servers operating in data centers. From 2005-2010, the annual shipment increase fell to 5%, partially driven by a conspicuous drop in 2009 shipments (most likely from the economic recession), as well as from the emergence of server virtualization across that 5-year period. The annual growth in server shipments further dropped after 2010 to 3% and that growth rate is now expected to continue through 2020. This 3% annual growth rate coincides with the rise in very large "hyperscale" data centers and an increased popularity of moving previously localized data center activity to collocation or cloud facilities. In fact, nearly all server shipment growth since 2010 occurred in servers destined for large hyperscale data centers, where servers are often configured for maximum productivity and operated at high utilization rates, resulting in fewer servers needed in the hyperscale data centers than would be required to provide the same services in traditional, smaller, data centers."
The chips are getting more powerful and more energy frugal
The Economist's Technology Quarterly, which appeared in the 26 Sept 2024 issue, focused on the computer chip industry, which sits at the center of projections about data center electricity demand. The technological progress taking place at the cutting edge of chipmaking is mind-boggling, to say the least.
"The blackwell chip from Nvidia, shovel-maker for the artificial-intelligence (AI) gold rush, contains 208 billion transistors spread over two "dies", pieces of silicon each about 800 square millimeters in area, that house the processor circuitry. The two dies are linked by a blazing 10 terabytes (i.e., ten thousand gigabytes) per second chip-to-chip connection. Each die is flanked by four blocks of high-bandwidth memory (hbm) chips that together store 192 gigabytes of data."
The article notes that all chipmakers are pushing to boost computing power while keeping energy consumption in check. This is significant because "Over the course of a year one of these megachips, which cost $70,000, will consume 5.2 MWh - about half the energy of an average American household."
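The household comparison roughly checks out; a hedged calculation, assuming (from EIA figures, not the article) average US household consumption of about 10,500 kWh per year:

```python
# Check of the "about half the energy of an average American household" comparison.
# Assumption (not from the article): ~10,500 kWh per US household per year.
chip_annual_mwh = 5.2
household_annual_mwh = 10.5
hours_per_year = 8760

print(f"Chip vs household: {chip_annual_mwh / household_annual_mwh:.0%}")         # ~50%
print(f"Implied average draw: {chip_annual_mwh * 1e6 / hours_per_year:.0f} W")    # ~590 W
```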
Perhaps data centers won't be as energy hungry as some expect them to be. NVIDIA's popular graphics processing units (GPUs), for example, have become 45,000x more efficient at running large language models over the past 8 years, and 25x more efficient in the latest iteration of chip designs made specifically for AI. NVIDIA claims that if cars were as efficient as its latest AI chips in converting energy to useful work, you could drive 280,000 miles on a single gallon of fuel - enough to get to the moon on less than a gallon of gasoline.
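Both claims can be sanity-checked with simple arithmetic; the sketch below assumes an average Earth-moon distance of about 239,000 miles, a figure not given in the article:

```python
# Two quick checks on the NVIDIA claims quoted above.
miles_per_gallon_claim = 280_000
distance_to_moon_miles = 239_000   # assumption: average Earth-moon distance
print(f"Gallons to reach the moon at the claimed efficiency: "
      f"{distance_to_moon_miles / miles_per_gallon_claim:.2f}")          # ~0.85 gallons

# Average yearly improvement implied by a 45,000x efficiency gain over 8 years.
total_gain, years = 45_000, 8
print(f"Implied average yearly gain: ~{total_gain ** (1 / years):.1f}x")  # ~3.8x per year
```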
Wouldn't it be nice if everything were so much more efficient?