
Reading the Grid: Power Availability Analysis for Data Center Developers

Power availability is the most common reason viable data center sites stall or fail. This post walks through the five grid variables developers must analyze before committing capital, and explains what AI can and cannot automate in this process.

By Build Team · March 20, 2026 · 4 min read


Utility reserve margins, interconnection queues and load growth projections are the real gatekeepers in data center site selection. Here's what to analyze before committing capital.

A site that clears every other hurdle — land, zoning, fiber, labor — can still fail on power. And the failure mode is slow. Interconnection queue delays are measured in years, not months. Getting the grid analysis right before committing capital is one of the highest-leverage steps in data center development.

Why Power Has Become the Gating Variable

Data center power demand in the U.S. is projected to reach 35-50 GW by 2030, according to Lawrence Berkeley National Laboratory's 2024 report on electricity use in U.S. data centers. National grid capacity is straining to absorb that growth. In constrained markets like Northern Virginia, Hillsboro and Phoenix, utilities are already rationing new service commitments.

The result: developers who show up to a promising site without a clear-eyed grid analysis are wasting time on sites that will never clear.

The Five Variables to Analyze

1. Utility Reserve Margins

Reserve margin is the cushion between peak demand and available generating capacity. The North American Electric Reliability Corporation (NERC) publishes annual regional assessments. A region with a reserve margin below 15% is under stress. Markets across MISO and PJM have seen margins tighten as thermal generation retires faster than new capacity comes online.

For developers, the core question is: can this utility serve 50MW or 100MW of new load without creating reliability issues? Utilities are now answering that question with multi-year queues rather than straightforward approvals.
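The reserve-margin arithmetic itself is simple, and it shows why the concern is cumulative load rather than any single request. A minimal sketch, using hypothetical capacity and demand figures:

```python
def reserve_margin(capacity_mw: float, peak_demand_mw: float) -> float:
    """Reserve margin as a percentage of peak demand."""
    return (capacity_mw - peak_demand_mw) / peak_demand_mw * 100

# Hypothetical region: 130 GW of available capacity against a 115 GW peak.
margin = reserve_margin(130_000, 115_000)         # ~13.0%, under the 15% stress line
with_dc = reserve_margin(130_000, 115_000 + 100)  # add one 100 MW data center
```

A single 100 MW load barely moves a regional margin; the reliability question utilities are weighing is what happens when dozens of such requests arrive concurrently.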

2. Interconnection Queue Position and Study Status

Interconnection queues are public records, posted by ISOs and utilities under Federal Energy Regulatory Commission (FERC) rules. A new large-load customer — a data center drawing 50MW or more — typically triggers a new load interconnection study, separate from generation interconnection. Study timelines range from 6 to 24 months depending on the ISO.

Understanding where a site sits in this process, and whether the relevant utility has a track record of approving large-load requests efficiently, is essential. Some utilities in secondary markets have fast-tracked large customers. Others have frozen new service agreements entirely.

3. Substation Capacity and Proximity

Available capacity at the nearest substation is a starting constraint, not a later-stage check. Developers need to identify substations capable of serving large loads and confirm available headroom. In practice, this means formal pre-application discussions with the utility's transmission team.

Distance to substation drives transmission infrastructure cost. Each mile of new transmission line adds roughly $3 million, depending on voltage level and terrain. Sites more than five miles from suitable substation infrastructure face a cost threshold that changes the underwriting materially.
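The underwriting sensitivity to substation distance can be sketched with a toy cost model; the per-mile figure below is an assumed placeholder standing in for a site-specific engineering estimate:

```python
def transmission_cost(miles: float, cost_per_mile: float = 3_000_000) -> float:
    """Rough new-line cost. cost_per_mile is an assumption --
    real figures vary with voltage level and terrain."""
    return miles * cost_per_mile

# A site 2 miles from the substation vs. one 6 miles out, at the assumed rate:
near = transmission_cost(2)  # $6,000,000
far = transmission_cost(6)   # $18,000,000
```

The delta between those two sites is pure infrastructure cost before a single rack is deployed, which is why distance belongs in the initial screen rather than late-stage diligence.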

4. Load Growth and Competing Demand

AI compute demand is not the only load hitting these grids. EV charging infrastructure, industrial reshoring and crypto mining have added concurrent load in many markets. Understanding what else is queued ahead of your site matters — both for timeline and for utility willingness to engage.

Some utilities publish load growth forecasts in their Integrated Resource Plans (IRPs). These are public documents. Reading the IRP for a target utility is a baseline step in grid analysis, not a specialized skill.

5. Moratorium and Policy Risk

Several major utilities have declared temporary moratoria on new large-load interconnection requests. Dominion Energy, Georgia Power and others have at various points restricted new data center service commitments in constrained zones. A site-level power screen must include a policy check: is the utility accepting new large-load applications, and are there geographic restrictions within their territory?

How AI Fits Into This Analysis

The data exists. The problem has always been assembly — pulling queue data, IRP filings, reserve margin reports and substation capacity disclosures from different sources on different schedules in different formats.

AI agents can now monitor interconnection queue updates, flag when a target utility's reserve margin crosses a threshold, parse IRP amendments and build a synthesized grid profile for a site in hours rather than weeks. The assembly problem is largely solved.
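A minimal sketch of that monitoring layer, assuming a hypothetical data model (a real agent would populate it from queue postings, NERC assessments and IRP filings):

```python
from dataclasses import dataclass

@dataclass
class UtilitySnapshot:
    """Hypothetical rollup of one utility's latest public disclosures."""
    name: str
    reserve_margin_pct: float
    accepting_large_load: bool

def flag_alerts(snapshots: list[UtilitySnapshot], margin_floor: float = 15.0) -> list[str]:
    """Flag utilities whose reserve margin crosses the stress threshold
    or that have frozen new large-load applications."""
    alerts = []
    for s in snapshots:
        if s.reserve_margin_pct < margin_floor:
            alerts.append(f"{s.name}: reserve margin {s.reserve_margin_pct:.1f}% below {margin_floor:.0f}%")
        if not s.accepting_large_load:
            alerts.append(f"{s.name}: large-load applications frozen")
    return alerts
```

The value is not the logic, which is trivial, but keeping the inputs current across dozens of utilities that publish on different schedules.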

What AI does not replace: the formal utility pre-application discussion, the legal review of service agreements and the negotiation of power purchase agreement terms. Those require people with the relationships and authority to commit.

The Site Scoring Framework

When evaluating power for a data center site, score on five variables:

  1. Reserve margin above 15% in the relevant RTO/ISO

  2. Utility has a track record of approving large-load interconnections in under 18 months

  3. Substation capacity available within three miles

  4. No active moratorium or freeze on new large-load service commitments

  5. IRP projects positive load growth absorption over the planning horizon

Sites that clear four of five are worth advancing. Sites that fail two or more on the power screen don't make the shortlist, regardless of other attributes.
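The screen above reduces to a pass/fail check. A minimal sketch, where the key names are shorthand for the five criteria (a real screen would carry supporting evidence behind each boolean):

```python
POWER_SCREEN = [
    "reserve_margin_above_15pct",
    "large_load_approvals_under_18_months",
    "substation_capacity_within_3_miles",
    "no_active_moratorium",
    "irp_absorbs_projected_load_growth",
]

def advance_site(results: dict[str, bool]) -> bool:
    """Advance only if the site clears at least four of the five variables
    (equivalently: failing two or more drops it from the shortlist)."""
    return sum(results[k] for k in POWER_SCREEN) >= 4
```

A site failing only the substation-distance check, for example, still advances; failing that plus an active moratorium does not.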

Developers who get this analysis right early save months of team time on sites that will never execute.