Managing Data Center Uncertainty Part V — From Uncertainty to Action: A Five-Mechanism Integrated Framework

AI data center demand uncertainty is a governance failure, not a modeling problem. This article proposes five integrated regulatory tools to force transparency, align incentives, unlock flexibility, avoid overbuild, and protect ratepayers while accelerating decarbonization.


This article is part of the AIxEnergy Series: Managing Data Center Energy Uncertainty, drawn from the author's full research paper. The complete version is available directly from the author by request at michael.leifman1@gmail.com.

In Part I of this article series, we discussed how AI is driving unprecedented uncertainty in electricity demand, with U.S. data-center use swinging between 325 and 580 TWh by 2028. The issue isn’t forecasting—it’s governance. Without transparency, flexibility, and better incentives, utilities overbuild, costs rise, and emissions lock in.

In Part II of this article series, we reviewed how phantom data centers distort U.S. energy planning. Developers overfile interconnection requests, utilities profit from overbuilding, and regulators approve speculative capacity. Misaligned incentives create costly overbuild and fossil lock-in—requiring governance reform, transparency, and accountability.

Part III of this article series revealed that what looks like scarcity is, in many cases, inefficiency masked by opacity—a failure of synchronization between computation and electricity, compounded by a lack of data on workload types. The takeaway was clear: the cleanest power plant is the unused GPU. Unlocking that efficiency requires not more technology, but more transparency and governance that distinguish between workload types and create appropriate economic incentives for each.

Part IV revealed that the U.S. grid has nearly 100 GW of hidden capacity that could be unlocked through very small, well-timed data center curtailments—capacity that is technically feasible today but inaccessible under current economic conditions. With the right incentives, data centers can deliver this flexibility at a massive scale, enabling rapid AI-driven load growth without building new electricity generation capacity and transforming a looming grid constraint into a structural advantage.


In this concluding article of the series, we contend that AI data center risk stems from poor governance, not bad forecasts, and we outline a coordinated regulatory framework to prevent overbuild, cut costs, and unlock grid flexibility before billions are stranded. This series has already established that U.S. data center electricity demand could reach 580 TWh by 2028—or plateau at 325 TWh. This 255 TWh uncertainty band, larger than Florida's total consumption, threatens tens of billions in stranded assets and decades of emissions lock-in. Strategic opacity by developers creates phantom data centers that distort planning. GPU utilization averaging just 60-70% conceals enormous flexibility potential. And geometric demand charges could unlock nearly 100 GW of grid capacity without new generation.

But pricing flexibility alone won't prevent overbuild or accelerate decarbonization. Effective governance requires four additional mechanisms working in concert with demand charges: mandatory granular disclosure that reveals workload composition and flexibility potential; performance-based regulation that rewards forecast accuracy in time horizons where improvement is achievable; technology incentives that overcome capital cost barriers for efficiency and storage; and speed-to-market incentives that exploit developers' most valuable asset—time—while using costly commitments to signal genuine intent.

Together, these five mechanisms address every dimension of the challenge. They reduce near-term uncertainty through better data, realign utility incentives away from capital spending bias, unlock technically feasible flexibility through compelling price signals, drive efficiency and decarbonization, and control costs by preventing unnecessary capacity additions. This is not theoretical—it builds on proven regulatory approaches, validated technical potential, and commercial demonstrations.

The political obstacles are real. Utilities will resist changes to capital expenditure-based regulation. Developers will claim competitive harm from disclosure requirements. Some regulators will favor economic development over ratepayer protection. But the alternative—business-as-usual planning that treats speculative demand as certain—virtually guarantees the overbuild catastrophe these mechanisms prevent.

The Affordability Crisis Demands Action Now

Before detailing the mechanisms, the stakes must be clear. This is not merely an efficiency or climate problem—it is rapidly becoming a political crisis that threatens the coalition supporting grid modernization and decarbonization.

Utilities have requested $29 billion in rate hikes for 2025, with AI data centers cited as a contributing factor. Resources for the Future notes that 25% of low-income households already face energy burdens exceeding 6% of income—the threshold for "highly burdensome." Electricity prices are rising faster than overall inflation. Residential customers already experiencing rate increases will face further pressure if tens of billions in overbuild costs are socialized across rate bases.

The political coalition supporting decarbonization and grid modernization will fracture if ratepayers conclude they are paying for phantom data centers. Resources for affordable housing, education, and healthcare will be diverted to cover stranded generation assets that never should have been built.

Most electricity affordability policy focuses on bill assistance, rate design reform, and subsidies—accepting rising costs as inevitable and focusing on who pays. Cost avoidance through accurate planning is an equally important but underutilized affordability strategy.

If performance-based regulation prevents 4 GW of unnecessary gas generation, those $4-5 billion in costs never enter the rate base. If geometric demand charges unlock the nearly 100 GW of flexibility identified in the Duke University study using existing infrastructure, ratepayers avoid $10-20 billion in new generation and transmission investments. These are permanent cost reductions, not subsidies requiring ongoing appropriations.

The mechanisms proposed address affordability directly: disclosure enables right-sized infrastructure; forecast accuracy PBR prevents stranded assets from entering rate base; geometric demand charges reduce need for expensive peaker plants; technology incentives lower system-wide capacity costs; speed-to-market for decarbonization eliminates fuel price volatility and reduces long-term generation costs.

The affordability solution is not just bill assistance—it is cost avoidance through accurate planning.

Mechanism 1: Granular Demand Disclosure Requirements

The first mechanism addresses the measurement uncertainty that makes accurate planning impossible. Without transparent operational data, utilities cannot distinguish between speculative and genuine projects, between flexible and firm loads, or between facilities that can provide grid services and those that cannot.

The Proposal: PUCs and ISOs require data center developers to provide:

  • Minute-by-minute power consumption data for 12+ months from the three most recent comparable facilities
  • Multi-year demand projections (monthly average and peak) for proposed facilities
  • Technology specifications: chip generations, cooling systems, efficiency metrics
  • Workload composition: percentage breakdown of training vs. inference operations, with detail on workload types (batch processing vs. real-time services) and temporal flexibility characteristics
  • Demand response and load-shifting capabilities
  • Behind-the-meter generation and storage plans

Why This Matters: No reliable public data exists on training/inference workload splits, yet this fundamentally determines which flexibility mechanisms are appropriate. Temporal load-shifting works for training; electrical storage is required for inference. Without this information, utilities cannot design targeted demand response programs and regulators cannot assess whether proposed facilities can realistically provide grid services.

The absence of workload composition data is itself a critical planning challenge that mandatory disclosure directly addresses.

Implementation: Phase in over 2-3 years, starting with AI data centers 50 MW and above. Facilities qualifying as "AI data centers" must demonstrate power densities ≥35 kW/rack average. Traditional enterprise data centers (<35 kW/rack average) remain under existing disclosure frameworks regardless of total MW.

Protect commercially sensitive information through confidential filings to state PUCs, similar to existing protocols for utility resource planning data. Penalties for non-compliance: lower queue priority, higher interconnection charges, or denial of expedited approvals.

The Rationale: This mechanism reduces measurement and parametric uncertainty by creating apples-to-apples comparisons across proposals. It signals developer seriousness—speculative projects avoid binding commitments. It creates accountability for projections by establishing benchmarks against which actual performance can be measured. And it enables informed decision-making by regulators who currently plan based on guesswork.

Mechanism 2: Performance-Based Regulation for Forecast Accuracy and Uncertainty Management

The second mechanism realigns utility incentives from capital expenditure to effective uncertainty management and forecast accuracy—but only where achievable. Different time horizons have fundamentally different uncertainty characteristics, and policy should target the uncertainties that can realistically be reduced.

Critical Distinction: This proposes targeted, data center-specific performance incentives, not comprehensive performance-based regulation. The scope is deliberately narrow: PUCs would establish performance metrics and incentives/penalties specifically for utility forecasting and planning related to data center loads, implemented through a PUC order or limited rulemaking rather than comprehensive rate case restructuring.

The Proposal: Under this data center-specific PBR, utilities would:

  • Earn bonuses for near-term (1-3 year) data center demand forecasts within ±10% of actual, where parametric and measurement uncertainty can be managed through better data collection and developer accountability
  • Earn bonuses for proactive uncertainty reduction efforts in all time horizons: requiring developer deposits, conducting facility audits, implementing pilot flexibility programs
  • Face penalties for data center-related overbuilding leading to <70% capacity utilization over rolling 5-year periods
  • Receive rewards for reducing peak demand through data center demand response programs
  • Be held accountable for stranded assets from excess data center capacity in near-term planning windows where uncertainty was reducible
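
The bonus and penalty thresholds above can be sketched in a few lines. This is an illustrative sketch only: the ±10% accuracy band and 70% utilization floor come from the proposal, while the function names and structure are hypothetical assumptions, not a prescribed implementation.

```python
# Sketch of the data center-specific PBR tests described above.
# Thresholds (±10% error band, 70% utilization floor) are from the
# proposal; function names and inputs are illustrative assumptions.

def forecast_error(actual_mw: float, forecast_mw: float) -> float:
    """Signed relative forecast error: positive means overforecast."""
    return (forecast_mw - actual_mw) / actual_mw

def earns_accuracy_bonus(actual_mw: float, forecast_mw: float,
                         band: float = 0.10) -> bool:
    """Bonus if the 1-3 year forecast lands within ±10% of actual load."""
    return abs(forecast_error(actual_mw, forecast_mw)) <= band

def faces_overbuild_penalty(avg_load_mw: float, built_capacity_mw: float,
                            floor: float = 0.70) -> bool:
    """Penalty if rolling 5-year capacity utilization falls below 70%."""
    return (avg_load_mw / built_capacity_mw) < floor

# Hypothetical example: a utility forecast 500 MW of new data center
# load, 460 MW materialized, and 700 MW of capacity was built.
print(earns_accuracy_bonus(actual_mw=460, forecast_mw=500))    # ~8.7% error
print(faces_overbuild_penalty(avg_load_mw=460, built_capacity_mw=700))
```

An 8.7% miss would earn the accuracy bonus, but the 66% utilization of the 700 MW build would trigger the overbuild penalty, illustrating how the two tests pull in different directions.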

Distinguishing Time Horizons: Near-term forecasts (1-3 years) should achieve reasonable accuracy given known project pipelines and current technology. Parametric and measurement uncertainty dominate but can be substantially reduced through better data collection (Mechanism 1), developer deposits, and facility audits. Utilities should be held accountable for accuracy in this window because the tools for improvement exist.

Long-term forecasts (5-10 years) face genuine Knightian uncertainty about AI use cases and technological evolution. Utilities cannot be held fully accountable for these unknowables, but can be incentivized to build flexibility and optionality into long-term planning—modular generation, transmission corridors designed for future expansion, contracts with flexibility provisions. The goal is not perfect long-term forecasting (impossible) but rather planning approaches that minimize regret regardless of which scenario unfolds.

Implementation Timeline: Unlike comprehensive PBR reform requiring 3-5 years of multi-stakeholder processes, data center-specific PBR can be implemented through PUC investigatory dockets (6-12 months), targeted performance orders (12-18 months total), and pilot programs (24-36 months). Total realistic timeline: 2-3 years from initiation to implementation.

Why This Is Feasible: Narrow scope focusing only on data center forecasting, not entire utility operations. High-stakes urgency creates political will. Data availability makes verification straightforward. Natural pilot jurisdictions exist—states with high data center concentration (Virginia, Ohio, Texas) face immediate need. Less disruptive than comprehensive reform because utilities maintain existing rate structures for other customers.

The Rationale: This aligns utility incentives with ratepayer interests in time horizons where accuracy is achievable. It encourages utilities to pressure developers for accurate data in near-term planning. It reduces capital expenditure bias for data center-serving infrastructure specifically. It distinguishes between manageable near-term uncertainty and inherent long-term uncertainty. And it rewards uncertainty reduction efforts even when perfect forecasting remains impossible.

Precedent Exists: The UK's RIIO framework rewards distribution network operators for efficient investment and innovation. Hawaii's PUC implemented PBR for customer satisfaction, renewable integration, and data sharing. Academic research documents that traditional rate-of-return regulation creates capital bias—this mechanism addresses it surgically for data center planning without requiring wholesale regulatory restructuring.

Mechanism 3: Geometric Demand Charges (Covered in Part IV)

Part IV detailed geometric demand charges and time-of-use surcharges—the pricing mechanism that makes flexibility economically compelling. The 150 MW example showing 0.9-year payback and $680+ million NPV demonstrates how appropriate price signals transform technical potential into business reality. This mechanism works in concert with the others: disclosure (Mechanism 1) reveals which facilities can provide flexibility, PBR (Mechanism 2) gives utilities incentives to pursue it, and the mechanisms below overcome capital cost barriers and accelerate deployment.
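
The payback and NPV logic behind that business case can be sketched generically. The inputs below are hypothetical placeholders, not the actual Part IV assumptions behind the 0.9-year and $680+ million figures; the sketch only shows how avoided demand charges flow through the two standard metrics.

```python
# Generic payback and NPV helpers for a flexibility investment evaluated
# against avoided demand charges. All inputs are illustrative placeholders.

def simple_payback_years(capex: float, annual_savings: float) -> float:
    """Years of constant annual savings needed to recover upfront cost."""
    return capex / annual_savings

def npv(capex: float, annual_savings: float, years: int,
        discount_rate: float) -> float:
    """Net present value of a stream of constant annual savings."""
    present_value = sum(annual_savings / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
    return present_value - capex

# Hypothetical example: $50M upfront, $55M/yr in avoided demand charges,
# a 10-year horizon, and an 8% discount rate.
print(simple_payback_years(50e6, 55e6))   # just under one year
print(npv(50e6, 55e6, 10, 0.08))          # strongly positive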

Mechanism 4: Technology Incentives Coupled with Storage Mandates

The fourth mechanism addresses the capital cost barriers that prevent voluntary adoption of flexibility and efficiency technologies, even when geometric demand charges make the economics attractive.

The Proposal: Provide tax credits, accelerated depreciation, or expedited permitting for data centers that:

  • Install on-site battery storage ≥20% of peak load capacity
  • Commit to discharging storage during top 50-100 system peak hours
  • Deploy advanced cooling demonstrating >20% efficiency improvement
  • Achieve PUE <1.3 within two years

Enhanced Cooling R&D Incentives: Given that cooling system constraints represent a major barrier to operational flexibility, regulators should establish dedicated R&D incentive programs for next-generation cooling technologies. These could include:

  • Grants or tax credits for facilities piloting novel cooling approaches (immersion cooling, advanced liquid cooling, phase-change systems)
  • Accelerated depreciation for cooling infrastructure investments that enable faster load ramping without thermal stress
  • Performance bonuses for facilities achieving both superior PUE (<1.2) and demonstrated ability to modulate load without thermal degradation
  • Public-private partnerships to develop and validate cooling systems specifically designed for flexible operations

The Economic Logic: A 150 MW data center at PUE 1.2 uses 180 MW total. Same compute at PUE 1.6 uses 240 MW—60 MW saved per facility, enough to avoid building a $60-70 million peaker plant. Multiply across dozens of facilities and avoided generation costs reach billions in savings for all ratepayers.

Battery storage provides multiple value streams beyond peak shaving. Research demonstrates data center energy storage can provide valuable grid services including frequency regulation, voltage support, and reactive power management while maintaining operational reliability. These services generate additional revenue that improves investment returns beyond what geometric demand charges alone provide.

The Rationale: This mechanism directly addresses the peak demand problem that drives generation overbuild. It's technology-agnostic within categories—developers choose how to achieve thresholds based on their operational needs. It creates positive externalities because storage can provide grid services beyond data center operations. And cooling innovations enable both efficiency gains and operational flexibility, with improved PUE delivering permanent energy savings while better thermal management enables the load ramping that flexibility requires.

Mechanism 5: Speed-to-Market Incentives for Verifiable Decarbonization Commitments

The fifth mechanism exploits developers' most valuable asset—time—while using costly commitments to signal genuine intent and prevent phantom projects from distorting planning.

The Context: First-mover advantages in AI are worth billions. A 6-12 month delay can be competitively catastrophic when training the next frontier model or launching a new service. Speed-to-market is more valuable to serious developers than any subsidy regulators could offer.

The Proposal: Regulators offer expedited approvals, higher interconnection queue priority, or faster permitting timelines to developers who commit to one or more of the following:

Option A: Clean Baseload Commitment

  • Place funds in escrow for a carbon capture and storage (CCS) facility or new "clean firm" generation (advanced nuclear, enhanced geothermal) equivalent to 100% of facility load, operational within 5 years

Option B: 100% Clean Energy Matching

  • Execute PPAs for renewable energy equivalent to 100% of annual consumption (must be additional capacity, not existing generation)

Option C: Virtual Power Plant Financing

  • Finance VPP capacity equal to 15-20% of peak load, capable of dispatching during system peak hours

Option D: Load Flexibility + Advanced Efficiency

  • Accept binding load caps during top 100 peak hours + deploy advanced cooling (PUE <1.2) + implement AI-driven load management

Option E: Best-in-Class Efficiency Design

  • Commit to PUE <1.2 from day one with advanced cooling, AI load management, independent verification, and financial penalties for non-compliance

Option F: Siting Flexibility for Grid Optimization

  • Commit to co-location strategies that leverage existing interconnection capacity (such as Colectric's "Power Couples" approach at retired power plant sites) and add substantial renewable generation and storage at the site

Preventing Perverse Incentives: A legitimate concern is that expedited approval could encourage more speculative requests—developers submitting multiple phantom projects to capture queue positions. The escrow and binding commitment requirements are specifically designed to prevent this.

Unlike interconnection requests that cost only modest application fees, these mechanisms require substantial upfront financial commitments. Option A places millions in escrow—speculators won't tie up capital they need elsewhere. Option B requires executing binding PPAs that create long-term financial obligations developers only sign for projects they intend to build. Option C requires creditworthy commitments that financial institutions vet through due diligence. Options D & E create downside risk through binding performance standards with financial penalties that phantom projects avoid.

All options include provisions where failure to meet milestones results in forfeiture of queue position, financial penalties, or return of expedited approval benefits.

The Key Insight: Revealed preference through costly action. Cheap talk (interconnection requests, zoning applications) is easy to multiply. Expensive commitments (escrow, PPAs, binding penalties) signal genuine intent. Only developers with serious projects will make these investments, naturally filtering speculative requests without regulatory assessment of "seriousness."

The Rationale: This mechanism exploits speed-to-market value while encouraging strategic siting that reduces transmission costs and accelerates deployment. Facilities that can locate flexibly gain competitive advantage through faster interconnection (4-6 years vs. 8+ years for traditional approaches), while the grid benefits from avoided transmission infrastructure and efficient use of existing interconnection capacity at retired power plant sites.

Co-location at sites with surplus interconnection capacity represents win-win optimization: developers get speed, ratepayers avoid transmission costs, and the system efficiently utilizes existing infrastructure. The binding commitments ensure only serious projects capture these benefits while advancing decarbonization goals.

Recent Validation: Enhanced geothermal projects like Fervo Energy's Utah Innovative Power Pathway demonstrate commercial viability of clean firm power for data centers. Google has committed to procuring 115 MW of next-generation geothermal energy from Fervo. Colectric's analysis demonstrates significant potential for co-location approaches, identifying technical potential to serve 470 GW of data center load at existing U.S. power plant sites with 85% clean energy.

How the Five Mechanisms Work Together

The power of this framework lies in integration. Each mechanism addresses a different dimension of the challenge, and together they reinforce rather than duplicate.

Mechanism 1 (Disclosure) creates the information foundation. Without knowing workload composition and flexibility characteristics, the other mechanisms cannot be designed effectively or targeted appropriately.

Mechanism 2 (Performance-Based Regulation) realigns utility incentives to reward accuracy in near-term horizons and flexibility in long-term planning, addressing the capital expenditure bias that drives overbuild.

Mechanism 3 (Geometric Demand Charges) makes flexibility economically rational once utilities and developers know which facilities can provide it and what types of flexibility are appropriate for their workloads.

Mechanism 4 (Technology Incentives) overcomes the capital cost barriers that might prevent investments even when payback periods are attractive, accelerating the deployment of efficiency and flexibility technologies.

Mechanism 5 (Speed-to-Market) filters phantom projects through costly commitments while accelerating decarbonization and encouraging optimal siting that avoids transmission costs.

These mechanisms serve multiple constituencies simultaneously. Ratepayers avoid tens of billions in stranded assets. Data center developers gain speed-to-market advantages and regulatory certainty. Utilities' incentives align with accurate planning rather than capital expansion. Climate advocates see accelerated decarbonization. Grid operators gain flexibility resources reducing peaker plant needs. The political coalition potential is substantial if advocates articulate how preventing overbuild serves all these interests.

Implementation: Three Parallel Tracks

The solution is not waiting for perfect evidence or universal buy-in before any jurisdiction moves. Implementation should proceed on three parallel tracks:

Track 1: Early Adopters (12-24 months) States facing acute pressure (Virginia, Ohio, Georgia) can implement Mechanisms 1, 3, 4, and 5 through PUC orders within that window. These early adopters provide real-world validation and demonstrate whether the mechanisms deliver the promised benefits. They prevent billions in imminent overbuild while generating evidence for broader adoption.

Track 2: Structured Pilots (12-36 months) States with significant but less immediate growth (Texas, North Carolina, Arizona) can run structured pilots testing refined approaches. These pilots generate rigorous evidence about effectiveness, identify implementation challenges, and allow mechanisms to be adapted to different market structures and regulatory contexts. Pilot jurisdictions learn from early adopters while contributing to the evidence base.

Track 3: Evidence-Based Adoption (24-48 months) Jurisdictions with minimal current activity can monitor results from early adopters and pilots, then adopt refined mechanisms incorporating lessons learned. Late adopters benefit from proven approaches and avoid costly trial-and-error, while still preventing overbuild before it occurs in their regions.

This parallel approach maximizes learning while minimizing delay. Early adopters prevent billions in imminent overbuild. Pilots generate evidence and refine mechanisms. Late adopters benefit from proven approaches. Action starts now through multiple tracks rather than waiting for comprehensive evidence before any jurisdiction moves.

Timeline expectations must be realistic: Even targeted reforms face political challenges. Mechanisms 1 (disclosure), 3 (geometric charges), 4 (technology incentives), and 5 (speed-to-market) can be implemented relatively quickly (12-24 months) through PUC orders. Mechanism 2 (targeted PBR) requires more stakeholder engagement and pilot testing (24-36 months).

The urgency stems from capacity decisions being made now for 2028-2035 demand. A 2-3 year regulatory process still prevents most imminent overbuild if initiated promptly. The window for action is closing, but it has not closed.

The Political Obstacles Are Real But Not Insurmountable

Implementation faces genuine resistance. Utilities will resist changes to capital expenditure-based regulation because their business model depends on it. Developers will claim competitive harm from disclosure requirements, arguing transparency helps competitors. Some regulators will favor economic development over ratepayer protection, viewing data centers as desirable industrial load.

But political conditions in specific jurisdictions create opportunities. States with concentrated data center development and vocal ratepayer advocates—Virginia, Ohio, Georgia—face immediate pressure as electricity bills rise. Utility commissions that have already initiated data center investigations—Texas ERCOT, Oregon—have established momentum. Climate-progressive states—California, Washington, New York—can frame these mechanisms as decarbonization strategies aligned with their climate goals.

The alternative makes political resistance unsustainable. Business-as-usual planning that treats speculative demand projections as certainties virtually guarantees tens of billions in stranded assets, decades of emissions lock-in, and political backlash undermining grid modernization. When ratepayers see electricity bills rising to cover generation that serves phantom data centers, the political coalition supporting energy transition will fracture.

Preventing that fracture requires acting before the crisis becomes undeniable. The data center energy transition is fundamentally a governance and affordability challenge. We have the analytical tools to understand the uncertainty, the technical capabilities to provide flexibility, and the policy instruments to incentivize efficient outcomes. What we need is regulatory will to deploy them before tens of billions in potentially unnecessary commitments lock in for three decades.

Conclusion: The Choice Between Design and Drift

This series has traced a clear arc. Part I established that the defining challenge is structural uncertainty, not forecasting inadequacy. Part II exposed how strategic opacity and perverse incentives create phantom data centers that distort planning. Part III revealed the utilization paradox—60-70% GPU utilization concealing enormous flexibility potential. Part IV quantified how geometric demand charges can unlock nearly 100 GW without new generation. And this final installment has provided the complete policy toolkit: five integrated mechanisms that address every dimension of the challenge.

The mechanisms are not theoretical. They build on proven regulatory approaches, validated technical potential, and commercial demonstrations. Disclosure requirements draw on standard practices in utility resource planning. Performance-based regulation follows models from the UK and Hawaii. Geometric demand charges apply pricing principles long used in industrial rate design. Technology incentives mirror successful programs for renewable deployment. Speed-to-market mechanisms use revealed preference logic common in project finance.

The AI data center energy challenge is typically framed as a crisis: exploding demand, inadequate supply, climate setbacks, affordability pressures. This framing is not wrong, but it is incomplete. The deeper truth is that the challenge stems from design choices that can be redesigned. Strategic opacity by developers, perverse utility incentives under rate-of-return regulation, and regulatory information asymmetries are institutional arrangements that policy can address. Current rate structures that fail to incentivize technically feasible flexibility can be reformed. Affordability concerns that focus exclusively on bill assistance can be supplemented with cost avoidance strategies that prevent overbuilding.

The opportunity is to transform uncertainty from crisis to catalyst for smarter, cleaner, more affordable energy systems. The mechanisms exist. The technical potential is validated. The economic logic is sound. The political conditions in some jurisdictions are favorable.

The question is whether regulators will deploy these mechanisms before the window closes—or whether we'll drift into a future of stranded assets, emissions lock-in, and ratepayer backlash that could have been prevented.

The choice is between design and drift. Between governance that shapes outcomes and inertia that accepts them. Between policy mechanisms that prevent overbuild and political mechanisms that allocate blame after the fact.

The tools are ready. The evidence is compelling. The time is now.

References

Resources for the Future (2025). "Electricity Affordability 101."

PowerLines (2025). "Utilities Have Requested $29 Billion in Rate Hikes for 2025, Surpassing 2024." January 2025.

Lawrence Berkeley National Laboratory (2024). United States Data Center Energy Usage Report. LBNL-2001637.

Norris, T.H., Patiño-Echeverri, D., & Dworkin, M. (2025). Rethinking Load Growth: Assessing the Potential for Integration of Large Flexible Loads in US Power Systems. Nicholas Institute for Environmental Policy Solutions, Duke University.

Ofgem (2014). RIIO-ED1: Final determinations for the slow-track electricity distribution companies. Office of Gas and Electricity Markets, UK. November 2014.

Hawaii Public Utilities Commission (2020). Decision and Order No. 37787 Establishing Performance-Based Regulation Framework. Docket No. 2018-0088. December 23, 2020.

Ghatikar, G., Ganti, V., Matson, N., & Piette, M.A. (2012). Demand Response Opportunities and Enabling Technologies for Data Centers: Findings From Field Studies. Lawrence Berkeley National Laboratory. LBNL-5763E.

Fervo Energy (2025). UIPA: The Enhanced Geothermal Data Center Corridor. July 2025.

Google (2024). Google commits to Fervo Energy's next-generation geothermal energy project in Nevada. Corporate press release.

Colectric (2025). "Unlocking Clean Power for Data Centers Where No One Else Can." Company presentation, October 2025.

Michael Leifman (2025). Managing Data Center Energy Uncertainty: A Framework to Prevent Overbuild, Control Costs, and Unlock Grid Flexibility. Full research paper available from author at michael.leifman1@gmail.com.