AI has broken the grid’s forecasting logic. Data-center demand could double—or stall—by 2028. The problem isn’t prediction, it’s governance. Smarter rules, transparent data, and flexible pricing—not overbuild—will determine how well the grid survives the AI era.
Artificial intelligence has shattered the linear logic of energy demand. By 2028, U.S. data-center electricity use could soar to 580 terawatt-hours—or stall at 325—a swing larger than Florida’s annual consumption. This volatility is not a failure of forecasting; it is a failure of governance. Institutions built for predictability are misfiring in an era defined by structural uncertainty. Developers obscure real load data, utilities are rewarded for overbuilding, and regulators—fearing shortage more than surplus—approve speculative capacity that locks in stranded assets, higher rates, and new emissions.
AI data centers are not merely larger versions of what came before—they are a new species of infrastructure: power-dense, computationally volatile, and capital-intensive. Yet our policy frameworks still assume they behave like steady industrial loads. The result is a dangerous feedback loop: incomplete data drives uncertain forecasts; uncertainty fuels fear; fear drives overbuild; and overbuild deepens the uncertainty it was meant to solve.
Drawing on new analyses from Lawrence Berkeley National Laboratory and Duke University, this first installment of the AIxEnergy Framework Series argues that the solution is not better prediction but smarter governance. The path forward begins with mandatory data disclosure, performance-based regulation that rewards accuracy over expansion, and flexible pricing that makes adaptability profitable. The question is no longer how to forecast demand—but how to build systems resilient enough to thrive when we can’t.
The New Energy Frontier
By 2028, U.S. data-center electricity consumption could reach 580 terawatt-hours—or it could plateau near 325. That 255 TWh swing—larger than Florida’s entire annual usage—isn’t a rounding error. It is a governance gap. Our institutions still assume that energy demand grows linearly, incrementally, and predictably. Artificial intelligence has shattered that assumption.
The AI data center is not an incremental technology. It is an entirely new species of infrastructure—power-dense, computationally volatile, and capital-hungry. Each new facility operates at a power density ten times that of a traditional data center, with clusters of GPUs consuming megawatts per rack. AI load is not a steady hum of cloud activity; it is a pulsing, asymmetric wave of compute intensity that shifts with algorithms, model architectures, and data availability.
This volatility is reshaping the electric grid faster than regulation can adapt. In the past 18 months, utilities from Georgia Power to Dominion have announced double-digit revisions to load forecasts driven by AI data center demand. Yet the spread of uncertainty among those forecasts remains massive. Some expect AI demand to add 7 percent to system load; others predict 20 percent. The range isn’t analytical noise—it’s policy paralysis in numerical form.
We are living through the most rapid structural change in electricity demand since the postwar industrial boom, but without the tools to govern it. The result is a dangerous feedback loop: incomplete data produces uncertain forecasts; uncertainty fuels fear of shortage; fear drives overbuild; and overbuild deepens the very uncertainty it was meant to cure.
Our energy institutions were designed for a world that changed slowly, where infrastructure could be planned in decades and demand modeled in trends. That world no longer exists. AI is forcing utilities, regulators, and investors to plan in real time for technologies whose load profiles shift in months. Yet we continue to apply the same static governance logic—rewarding confidence, not accuracy; expansion, not adaptability.
The paradox is that our forecasts are getting better while the world they describe is getting less predictable. Data quality improves, but the system’s behavior becomes more chaotic. The result is an explosion of risk—financial, operational, and political—locked inside an illusion of certainty.
The Shape of Uncertainty
Lawrence Berkeley National Laboratory’s 2024 analysis projects that U.S. data-center electricity use could land anywhere between 325 and 580 TWh by 2028, a spread of roughly 80 percent relative to the low-end estimate.¹ That range, the largest uncertainty band in any modern U.S. load forecast, reveals a structural flaw in how the energy sector handles unknowns. Some of the uncertainty stems from missing data, some from unpredictable variables like chip efficiency and cooling, and some from the unknowable: future AI applications that might multiply or moderate electricity use.
Forecasting assumes uncertainty is random noise around a trend line. But AI introduces uncertainty that is systemic and strategic. Developers withhold operational data, utilities retain structural incentives to overbuild, and innovation keeps resetting the baseline. This is not a modeling problem. It is a governance problem.
Utilities, regulators, and policymakers are being asked to plan for an industry whose power demand could double or plateau—and to do so without access to verifiable operational data. The opacity is extraordinary. Even as utilities scramble to expand generation portfolios, they lack basic visibility into how much power is being used, when, and for what type of workload. Is the load continuous inference, which runs 24/7, or intermittent training, which can flex in time? Those distinctions matter profoundly for grid design, but right now they are guesses.
The implications are enormous. The 255 TWh uncertainty band represents 40–50 gigawatts of potential generation capacity. Building to the high end and landing near the low end could strand tens of billions of dollars in assets while locking in decades of emissions.² The risks extend beyond cost: overbuilds crowd out renewables, inflate rates, and erode public support for the energy transition. And yet the opposite risk, underbuilding, could stall innovation and economic growth. The policy problem is not choosing between the two; it is designing a governance system capable of adjusting between them.
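The conversion behind that 40–50 GW figure is simple arithmetic: divide the annual energy band by the hours in a year to get average load, then divide by an assumed fleet capacity factor to get nameplate capacity. The minimal sketch below reproduces it; the 60–70 percent capacity factor is an illustrative assumption for this example, not a figure taken from the cited analyses.

```python
# Back-of-envelope check: convert the 255 TWh/year uncertainty band into generation
# capacity. The 60-70% capacity factor is an illustrative assumption, not a value
# drawn from the LBNL analysis.

HOURS_PER_YEAR = 8760

def capacity_gw(annual_twh: float, capacity_factor: float) -> float:
    """Nameplate capacity (GW) implied by a given annual energy (TWh)."""
    average_load_gw = annual_twh * 1000 / HOURS_PER_YEAR  # TWh/yr -> GWh/yr -> average GW
    return average_load_gw / capacity_factor

band_twh = 580 - 325  # the 255 TWh band between the high and low scenarios
for cf in (0.70, 0.60):
    print(f"capacity factor {cf:.0%}: ~{capacity_gw(band_twh, cf):.0f} GW")
# Roughly 42 GW at a 70% capacity factor and 49 GW at 60%, consistent with 40-50 GW.
```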
A Taxonomy for Turbulence
Uncertainty is not a single phenomenon. It operates across multiple layers, each demanding a different kind of response.
- Aleatoric (natural) — Weather and load variability, statistically manageable and well understood.
- Measurement — Missing or unreliable operational data from data centers and interconnection queues.
- Parametric — Unknowns about chip efficiency, rack density, and cooling innovation trajectories.
- Epistemic — Ambiguity about the system itself: what counts as a “data center” in a globally distributed compute network that shifts workloads across continents?
- Knightian (deep) — The unknowable future: Will AI saturate, or will every device, vehicle, and system become a micro data center?
Measurement and parametric uncertainty—short-term and reducible—can be addressed through mandatory disclosure, standardized reporting, and performance-based regulation (PBR) that rewards accuracy within a 1–3 year horizon. Epistemic and Knightian uncertainty—long-term and irreducible—require flexibility and optionality, not misplaced precision. Governance must distinguish between what can be measured and what must be managed.
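One way to make that distinction operational is to treat the taxonomy as a lookup table pairing each class of uncertainty with its horizon and governance response. The sketch below simply encodes the mapping described above; the entries paraphrase this article rather than any regulatory standard.

```python
# Illustrative encoding of the uncertainty taxonomy above. Horizons and responses
# paraphrase this article; they are not drawn from any regulatory standard.
UNCERTAINTY_TAXONOMY = {
    # class:        (planning horizon,         governance response)
    "aleatoric":    ("operational",            "statistical forecasting and reserves"),
    "measurement":  ("short-term, reducible",  "mandatory disclosure and standardized reporting"),
    "parametric":   ("short-term, reducible",  "performance-based regulation rewarding accuracy"),
    "epistemic":    ("long-term, irreducible", "flexibility and optionality in contracts and assets"),
    "knightian":    ("long-term, irreducible", "reversible, modular infrastructure"),
}

def governance_response(uncertainty_class: str) -> str:
    """Return the governance response this framework suggests for a given class."""
    return UNCERTAINTY_TAXONOMY[uncertainty_class][1]
```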
Treating all uncertainty as forecast error is like treating turbulence as bad piloting—it blames the wrong mechanism. We cannot smooth the flight path of AI’s energy curve, but we can build a better aircraft.
When Forecasting Becomes Fiction
Utilities are planning and permitting gigawatts of capacity based on speculative signals from developers. Industry insiders acknowledge that data-center developers file five to ten times more interconnection requests than projects they actually intend to build.³ These “phantom data centers” distort demand projections and drive unnecessary capacity commitments.
Rate-of-return regulation turns this uncertainty into profit. Utilities are rewarded for capital investment, not for accuracy. When in doubt, they build. Regulators approve capacity for the high-end scenario because no one wants to be blamed for shortage. The result: systematic overbuild, socialized cost, and fossil lock-in.⁴
This is not malfeasance—it is inertia. Every actor behaves rationally within a flawed incentive structure. Developers inflate requests to secure optionality; utilities expand capital portfolios to earn regulated returns; regulators approve to avoid risk. The outcome is predictable: excess capacity, higher costs, and slower decarbonization. The only irrational behavior is the collective assumption that uncertainty can be ignored.
In this ecosystem, uncertainty isn’t an error margin—it’s a business model.
A Governance Reframe
If uncertainty is the problem, the solution must begin with disclosure. We cannot manage what we cannot measure. Every interconnection approval should require transparent reporting of workload composition (training vs. inference), annual utilization rates, cooling type, and flexibility potential. These disclosures would allow regulators to distinguish between continuous and interruptible load—transforming forecasting from conjecture into constraint-based planning.
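To make the disclosure requirement concrete, a filing could be as simple as a structured record. The sketch below is a hypothetical schema covering the fields named above; the field names, units, and checks are assumptions for illustration, not an existing regulatory template.

```python
# Hypothetical disclosure record for an interconnection filing. Field names, units,
# and checks are illustrative assumptions, not an existing regulatory template.
from dataclasses import dataclass

@dataclass
class DataCenterDisclosure:
    facility_id: str
    training_share: float       # fraction of annual energy used for training (time-flexible)
    inference_share: float      # fraction used for inference serving (largely continuous)
    annual_utilization: float   # average utilization of contracted capacity, 0-1
    cooling_type: str           # e.g. "air", "liquid", "immersion"
    curtailable_mw: float       # load the operator commits to shed on grid request
    max_curtailment_hours: int  # hours per year that curtailment can be called

    def validate(self) -> None:
        """Basic sanity checks a regulator could run on a filing."""
        assert abs(self.training_share + self.inference_share - 1.0) < 1e-6
        assert 0.0 <= self.annual_utilization <= 1.0

filing = DataCenterDisclosure("campus-01", training_share=0.4, inference_share=0.6,
                              annual_utilization=0.65, cooling_type="liquid",
                              curtailable_mw=30.0, max_curtailment_hours=100)
filing.validate()
```

Even a record this minimal would let a planner separate firm from interruptible load before approving new capacity.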
Next, regulators must reward accuracy, not volume. PBR frameworks should compensate utilities for narrowing forecast error and penalize chronic overestimation. A utility that saves ratepayers billions by deferring unnecessary capacity should earn returns comparable to those gained from new capital investment. The incentive should shift from building to managing uncertainty.
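To illustrate the mechanism, consider an incentive tied to forecast error: small misses earn a reward, while over-forecasting beyond a dead band draws a penalty. The sketch below is a hypothetical formula with assumed parameters, not a proposal from any existing PBR docket.

```python
# Sketch of an accuracy incentive for a PBR framework. The dead band and dollar rates
# are assumed for illustration, not values from any actual docket.

def pbr_adjustment(forecast_twh: float, actual_twh: float,
                   reward_rate: float = 2.0e6,   # $ per percentage point of error avoided
                   penalty_rate: float = 1.0e6,  # $ per percentage point of over-forecast
                   dead_band: float = 0.02) -> float:
    """Positive values reward accuracy; negative values penalize over-forecasting."""
    error = (forecast_twh - actual_twh) / actual_twh   # > 0 means the utility over-forecast
    if abs(error) <= dead_band:
        return reward_rate * (dead_band - abs(error)) * 100
    if error > dead_band:
        return -penalty_rate * (error - dead_band) * 100  # chronic overestimation pays
    return 0.0  # large under-forecasts earn nothing here; a real mechanism might differ

print(pbr_adjustment(101.0, 100.0))  # 1% miss: rewarded (+$2.0M with these parameters)
print(pbr_adjustment(110.0, 100.0))  # 10% over-forecast: penalized (-$8.0M)
```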
Third, we must price flexibility. Duke University’s Nicholas Institute found that new loads willing to curtail roughly 0.5–1 percent of their annual energy could allow the existing grid to integrate nearly 100 GW of new demand.⁵ Geometric demand charges and narrow time-of-use (TOU) surcharges make such micro-curtailments financially rational. Once flexibility becomes an asset class, overbuild loses its logic.
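The logic behind that finding can be shown with a toy load-duration calculation: if new load is willing to shed during the few hours when the existing system nears its peak, it can ride on capacity that already exists. The numbers below are synthetic and the system is deliberately simplified; the sketch illustrates the mechanism, not the Duke study’s actual model.

```python
# Toy "curtailment-enabled headroom" calculation on a synthetic hourly load shape.
import numpy as np

rng = np.random.default_rng(0)
HOURS = 8760
system_peak_gw = 100.0

# Synthetic system load: a seasonal swing plus noise, topping out near 90 GW.
load_gw = 70 + 10 * np.sin(np.linspace(0, 2 * np.pi, HOURS)) + rng.normal(0, 3, HOURS)

def required_curtailment_share(new_load_gw: float) -> float:
    """Share of the new load's annual energy that must be shed so the combined load
    never exceeds the existing peak (assumes the new load absorbs all shedding)."""
    excess_gw = np.clip(load_gw + new_load_gw - system_peak_gw, 0, None)
    return float(excess_gw.sum() / (new_load_gw * HOURS))

for gw in (5, 10, 15, 20):
    share = required_curtailment_share(gw)
    print(f"{gw:>2} GW of new flexible load -> curtail ~{share:.2%} of its annual energy")
```

In this toy system, roughly 20 GW of flexible load can be absorbed while shedding on the order of one percent of its energy; the Duke analysis applies the same logic to real system load data.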
Finally, we must build optionality into infrastructure itself. Flexible interconnection contracts, modular substations, and reversible investments can make systems resilient to uncertainty. If AI demand slows, capacity can be repurposed. If it accelerates, it can be expanded without starting from scratch. Resilience is no longer about redundancy—it’s about reversibility.
The Clock Is Ticking
Global data-center electricity use is expected to more than double by 2030, driven primarily by AI computing loads.⁶ In the past year alone, Google, Amazon, and Microsoft announced 15 GW of new data-center capacity—projects that will test the limits of U.S. grid infrastructure.⁷
Meanwhile, local governments are beginning to push back. In Northern Virginia—the world’s largest data-center hub—officials are debating moratoriums on new interconnections until power and land constraints ease. In Oregon, environmental regulators have halted expansions over water-use concerns. In Georgia, new substations are facing community opposition over noise and visual impact.⁸
At the same time, utilities have requested $29 billion in rate increases for 2025, more than twice the average of the past decade.⁹ For regulators and ratepayers, the message is clear: uncertainty is expensive. What once looked like prudent planning now resembles speculative overreach.
The political tolerance for overbuilding is shrinking. The next phase of the AI energy era will not be defined by shortage—it will be defined by accountability.
From Forecasting to Governance
We have entered the post-forecast era of energy planning. The question is no longer how to predict demand, but how to remain resilient when we can’t. That means treating transparency as infrastructure, rewarding prudence over projection, and designing markets that value adaptability as much as capacity.
This transformation will not be easy. It requires regulators to trade control for clarity, utilities to exchange growth for accuracy, and developers to give up opacity for credibility. But the alternative—a grid built on guesswork—is unsustainable.
Uncertainty is not the enemy. It is the environment. The grid’s real crisis isn’t forecasting—it’s the assumption that the future can still be known.
Roadmap for the Series
This first article defines the foundation: that the defining challenge of the AI-powered grid is not forecasting error but structural uncertainty. The rest of the AIxEnergy Framework Series: Managing Data Center Energy Uncertainty builds on this insight, translating diagnosis into design.
Part II — Phantom Data Centers: How Strategic Opacity Drives Overbuild
An investigation into how speculative interconnection practices and opaque disclosure allow developers to flood queues with fictitious projects—and how utilities’ capital incentives perpetuate the illusion of demand. We’ll examine policy tools that can separate signal from noise, including disclosure reform, probabilistic planning, and “load underwriting.”
Part III — The Utilization Paradox: Scarcity and Waste Inside AI Infrastructure
A technical deep dive into AI computing itself. We’ll explore how low GPU utilization (often just 60–70 percent) and I/O bottlenecks conceal massive temporal flexibility. We’ll show how distinguishing between training (flexible) and inference (rigid) workloads redefines grid operations and turns waste into capacity.
Part IV — Flexibility Is the New Capacity: Unlocking 100 GW Without New Generation
This article will quantify how small, well-priced adjustments in load timing can unlock enormous system value. We’ll translate Duke University’s “curtailment-enabled headroom” findings into actionable pricing mechanisms—geometric demand charges, TOU surcharges, and pass-through incentives for data centers and their compute tenants.
Part V — From Uncertainty to Action: A Five-Mechanism Framework for AI-Era Governance
The series culminates in a full policy blueprint. Five mechanisms—granular disclosure, short-term PBR incentives, flexibility pricing, technology mandates tied to decarbonization, and speed-to-market rules for verified clean power—form a unified framework for adaptive governance.
Together, these five articles will outline a new approach to managing data-center energy uncertainty—one that treats uncertainty not as a forecasting failure but as a design constraint for a faster, cleaner, and more accountable grid.
References
- Lawrence Berkeley National Laboratory, Data Center Energy Use Forecast 2024 (Berkeley, CA: LBNL, 2024).
- Michael Leifman, Managing Data Center Energy Uncertainty: A Framework to Prevent Overbuild, Control Costs, and Unlock Grid Flexibility (Washington, DC: AIxEnergy, 2025).
- Brian Martucci, “A Fraction of Proposed Data Centers Will Get Built,” Utility Dive, May 15, 2025.
- “The Murky Economics of the Data Center Investment Boom,” The Economist, September 30, 2025.
- Duke University Nicholas Institute, Curtailment-Enabled Headroom: Integrating Large Loads with Existing Capacity (Durham, NC: Duke University, 2024).
- International Energy Agency, “AI Is Set to Drive Surging Electricity Demand from Data Centres,” April 10, 2025.
- Lisa Martens, “Tech Giants Announce 15 GW of New AI Data Center Capacity,” Reuters, April 22, 2025.
- John Funk, “Local Governments Push Back Against AI Data Centers,” Canary Media, July 3, 2025.
- U.S. Energy Information Administration, Electric Power Monthly, June 2025.
Next in the AIxEnergy Framework Series: Part II — Phantom Data Centers: How Strategic Opacity Drives Overbuild.
Michael Leifman is a strategist with 20+ years advancing clean energy, technology, and policy. Former ERM Partner leading utility and state climate strategies; Founding Principal of Tenley Consulting; ex-GE innovation strategist; early DOE/EPA analyst. He holds M.S. degrees from Johns Hopkins and Carnegie Mellon, and a B.A. from the University of Chicago.