The increasing electricity demand from data centers, driven by the rise of AI technologies, could significantly threaten grid reliability in the United States, according to a recent report by the Conference Board. While AI tools promise future efficiencies in electricity generation, storage, transmission, and use, the near-term surge in AI-driven demand may hinder the transition to a net-zero energy system.
The Conference Board’s May 30 report highlights the potential doubling of electricity demand from data centers over the next decade, which poses operational risks to other businesses and threatens grid reliability. Those risks are heightened by limited grid and transformer capacity and by lengthy permitting and interconnection timelines for new generation and transmission projects.
Alexander Heil, senior economist at the Conference Board, and Ivan Pollard, leader of the Conference Board’s Marketing and Communications Center, noted that global energy consumption from data centers, including AI and cryptocurrency mining, could jump from 460 TWh in 2022 to over 1,000 TWh by 2026, citing an International Energy Agency projection.
In the U.S., data centers’ electricity consumption could rise to more than 9% of total generation by 2030, according to the Electric Power Research Institute. Some models predict even more dramatic growth in the next five to six years, said Josh Claman, CEO of Accelsius, a provider of computer chip cooling technology.
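As a rough check, the growth implied by these projections can be computed directly. The Python sketch below uses the IEA and EPRI figures quoted above; the total U.S. generation figure of roughly 4,200 TWh per year is our assumption, in line with recent annual totals, not a number from either source.

```python
# Back-of-the-envelope check on the projections quoted above.
iea_2022, iea_2026 = 460.0, 1_000.0   # global data center demand, TWh (IEA)
years = 2026 - 2022

# Compound annual growth rate implied by the IEA projection.
cagr = (iea_2026 / iea_2022) ** (1 / years) - 1
print(f"Implied global growth rate, 2022-2026: {cagr:.1%}/yr")   # ~21.4%/yr

# Demand implied by EPRI's ">9% of U.S. generation by 2030" figure.
# Assumption (ours, not the report's): total U.S. generation stays
# near ~4,200 TWh/yr, roughly its recent level.
us_total_twh = 4_200.0
print(f"Implied U.S. data center demand by 2030: "
      f"{us_total_twh * 0.09:,.0f}+ TWh/yr")                     # 378+ TWh/yr
```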
High-end predictions for data center energy consumption may seem exaggerated, but they cannot be entirely dismissed, Claman noted. Companies like Accelsius, LiquidStack, and Marathon Digital Holdings are developing efficient cooling systems to address this issue. Cooling currently accounts for about 40% of data centers’ energy consumption. Two-phase liquid chip-cooling systems can reduce server racks’ power consumption by 50% or more, enhancing overall efficiency as data centers transition from air- to water-based cooling.
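The facility-level arithmetic behind those figures is straightforward. In the sketch below, applying the 50% savings to the cooling share of consumption, and treating the remaining 60% as IT load, are our simplifying assumptions; the 40% and 50% figures come from the article.

```python
# Illustrative facility-level arithmetic using the figures quoted above.
cooling_share = 0.40     # cooling's share of total consumption (per the article)
cooling_savings = 0.50   # reduction from two-phase liquid cooling (per the article)

# Assumption (ours): the 50% savings applies to the cooling share.
facility_reduction = cooling_share * cooling_savings
print(f"Facility-wide energy reduction: {facility_reduction:.0%}")   # 20%

# Equivalent PUE framing, assuming the remaining 60% is all IT load
# (a simplification): halving cooling overhead cuts PUE from ~1.67 to ~1.33.
it_share = 1.0 - cooling_share
pue_before = 1.0 / it_share
pue_after = (it_share + cooling_share * (1.0 - cooling_savings)) / it_share
print(f"PUE: {pue_before:.2f} -> {pue_after:.2f}")
```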
Despite these advancements, demand for AI could outpace efficiency improvements, sustaining the rapid growth in data centers’ electricity consumption. Potential breakthroughs in silicon-based chip technology or quantum computing could reduce energy demands, but such innovations are likely a decade away from commercialization.
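A toy model makes that race between demand and efficiency concrete: if workloads grow faster than efficiency improves, net consumption still rises. Both rates below are invented for illustration, not taken from the report.

```python
# Toy model of demand outpacing efficiency; both rates are illustrative.
workload_growth = 0.30   # assumed annual growth in AI compute demand
efficiency_gain = 0.15   # assumed annual drop in energy per unit of compute

net_growth = (1 + workload_growth) * (1 - efficiency_gain) - 1
print(f"Net annual consumption growth: {net_growth:.1%}")   # ~10.5%
# At ~10.5%/yr, consumption still doubles in about 7 years (rule of 70),
# despite steady efficiency gains.
```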
The load growth from data centers could impede progress toward utility, state, and federal net-zero goals. Smaller utilities, unable to quickly deploy renewable generation, may be hardest hit. For example, Oregon’s Umatilla Electric Cooperative saw its carbon emissions quadruple between 2018 and 2021 due to an influx of Amazon data centers.
Utilities lacking adequate renewable capacity risk losing high-load customers with ambitious net-zero goals, since such customers might relocate to areas where their environmental requirements can be met. Utilities and data center operators can mitigate this threat by siting facilities near low-carbon power sources, such as solar, wind, or nuclear plants, so the facilities can draw on that energy directly.
The report suggests that AI-driven load growth could benefit the U.S. nuclear industry: a significant increase in electricity demand from data centers and AI could strengthen the case for nuclear energy as an emissions-free power source. Public Service Enterprise Group is already in talks with data center operators to sell power from its New Jersey nuclear plants.
While AI tools could enhance grid efficiency, the extent to which AI can offset its own demand growth remains uncertain. More sophisticated AI could make electricity generation, storage, delivery, and use more efficient, but Heil and Pollard see conventional solutions, such as energy efficiency, peak shifting, expedited permitting, and broader adoption of small nuclear reactors, as more immediate measures to offset AI-driven growth in power demand.
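Of those measures, peak shifting is the easiest to illustrate: deferring flexible data center work, such as batch jobs, out of peak hours lowers peak demand without reducing the total energy served. The load profile and flexible share in the sketch below are invented for illustration.

```python
# Toy peak-shifting example; the hourly load profile and flexible share
# are invented for illustration.
hourly_load_mw = [80, 75, 70, 70, 75, 85, 95, 105, 115, 120, 125, 130,
                  135, 140, 145, 150, 155, 160, 150, 140, 120, 105, 95, 85]
flexible_mw = 20             # deferrable load assumed to run during the peak
peak_hours = range(14, 20)   # assumed afternoon/evening peak, 2-8 pm

shifted = list(hourly_load_mw)
for h in peak_hours:
    shifted[h] -= flexible_mw          # defer flexible work off the peak
deferred = flexible_mw * len(peak_hours)

overnight = [0, 1, 2, 3]               # run the deferred work overnight
for h in overnight:
    shifted[h] += deferred / len(overnight)

print(f"Peak demand: {max(hourly_load_mw)} MW -> {max(shifted):.0f} MW")
# Peak demand: 160 MW -> 140 MW; total energy served is unchanged.
```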