A new whitepaper from the Electric Power Research Institute (EPRI), titled “Powering Intelligence: Analyzing Artificial Intelligence and Data Center Energy Consumption,” offers a striking projection of AI power needs. The 35-page report indicates that electricity consumption by U.S. data centers could more than double by 2030, rising as much as 166% above 2023 levels.
AI Queries Consume Substantially More Power
EPRI attributes this surge primarily to generative AI, which consumes substantially more energy per query than traditional search. A ChatGPT request requires about 2.9 watt-hours, roughly ten times the approximately 0.3 watt-hours of a typical Google search. Emerging applications such as image, audio, and video generation pose further energy demands without historical precedent.
Analyzing Energy Consumption Across Applications
The report examines five key use cases, including Google search and ChatGPT. Among the AI applications studied, ChatGPT is the least energy-intensive per query. However, researchers warn that if Google integrates similar AI capabilities into its search functions, each search could require 6.9 to 8.9 watt-hours, roughly two to three times the energy of a ChatGPT query.
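The per-query comparisons above reduce to simple ratios. A minimal sketch, using only the watt-hour figures quoted from the report:

```python
# Per-query energy figures cited in the EPRI report (watt-hours).
GOOGLE_SEARCH_WH = 0.3      # typical Google search
CHATGPT_QUERY_WH = 2.9      # ChatGPT request
AI_SEARCH_WH = (6.9, 8.9)   # projected AI-integrated Google search (range)

# ChatGPT vs. a traditional search: roughly ten times the energy.
chatgpt_vs_search = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"ChatGPT vs. search: {chatgpt_vs_search:.1f}x")  # 9.7x

# AI-integrated search vs. ChatGPT: roughly two to three times.
low, high = (w / CHATGPT_QUERY_WH for w in AI_SEARCH_WH)
print(f"AI search vs. ChatGPT: {low:.1f}x to {high:.1f}x")  # 2.4x to 3.1x
```

The ratios show why "over three times" only holds at the top of the projected range; the low end is closer to two and a half.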
Anticipating Supply Constraints
EPRI's four forecasts for U.S. data center electricity usage from 2023 to 2030 range from low (3.7% annual growth) to high (15% annual growth). Under the high-growth scenario, data center electricity usage could surge to 403.9 TWh/year by 2030, a 166% increase over 2023 levels. Even the low-growth scenario projects a 29% increase, to 196.3 TWh/year.
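The scenario figures are consistent with seven years of compounding from a 2023 baseline. A rough check, where the ~152 TWh baseline is backed out from the report's high-scenario 2030 figure rather than quoted directly:

```python
# Sanity check on EPRI's 2023-2030 growth scenarios (7 years of compounding).
YEARS = 7

# Back out the implied 2023 baseline (~152 TWh) from the high-growth case;
# this value is inferred, not stated in the text above.
BASELINE_TWH = 403.9 / (1 + 0.15) ** YEARS

for name, cagr in [("low", 0.037), ("high", 0.15)]:
    projected = BASELINE_TWH * (1 + cagr) ** YEARS
    increase = (projected / BASELINE_TWH - 1) * 100
    # The low scenario lands near the report's 196.3 TWh/year figure.
    print(f"{name}: {projected:.1f} TWh/yr in 2030 ({increase:.0f}% increase)")
```

Compounding 3.7% for seven years yields about a 29% cumulative increase, and 15% yields about 166%, matching both quoted scenarios.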
The geographic distribution of this growth raises concerns. In 2023, fifteen states accounted for 80% of national data center loads, with Virginia alone responsible for 25%. Projections suggest that under the high-growth scenario, data centers could account for 46% of Virginia's total electricity consumption by 2030.
Data center types contribute to this demand in different proportions: enterprise data centers account for an estimated 20-30% of total load, while co-location and hyperscale centers together make up the remaining 60-70%. Hyperscale facilities, operated by cloud giants such as Amazon and Google, are at the forefront of this growth, with new centers boasting capacities between 100 and 1,000 megawatts, enough to power up to 800,000 homes.
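The homes-powered comparison implies an average household draw of about 1.25 kW. A minimal sketch of that arithmetic, where the household figure is an assumption consistent with the article's numbers rather than a value from the report:

```python
# Sanity check on "1,000 MW can power up to 800,000 homes".
# AVG_HOME_KW is an assumed average U.S. household draw (~1.25 kW),
# implied by the article's figures, not stated in the report.
HYPERSCALE_MW = 1000
AVG_HOME_KW = 1.25

homes = HYPERSCALE_MW * 1000 / AVG_HOME_KW  # MW -> kW, then divide
print(f"{homes:,.0f} homes")  # 800,000 homes
```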
Shifting Data Center Procurement Strategies
As enterprises rush to secure the latest GPU-equipped servers from leading vendors like Nvidia, the challenge extends beyond hardware procurement. The escalating power needs of these systems elevate the importance of data center capacity—reminiscent of the dot-com boom era in 1999.
To navigate this landscape, businesses must adapt their strategies to resemble those of hyperscale competitors. Companies like Amazon and Google prioritize securing long-term data center capacity through multi-year contracts with power providers and operators.
Many enterprises may need to reevaluate their traditional “three bids and a buy” procurement model. As data center capacity becomes increasingly constrained, this approach may prove ineffective. Instead, businesses should explore longer-term partnerships with data center suppliers, committing to specific capacity levels in exchange for reliable access to resources.
One industry executive shared insights on this trend: “Many data center equipment suppliers aren’t responding to RFPs as they used to. Now they operate on a model where they guarantee a certain capacity each month or quarter.” According to the executive, a decade ago 100% of that supplier's revenue came through competitive bidding; today, the figure is just 25%.
For enterprise IT leaders, embracing this shift will require strategic foresight and collaboration across IT, facilities, and finance departments. Investing in data center infrastructure now, even at the potential expense of short-term gains, may be essential for competing effectively in an AI-driven future.