New York, May 16: Ask anyone who manages a large commercial building in northern Virginia what changed in the last two years. The answer is almost always the same: the electricity bill.
- The Grid Was Not Designed for AI Data Centers at This Scale
- Microsoft, Google, and Amazon: Three Engines Behind the AI Data Center Surge
- The Anatomy of a Dangerous 76 Percent AI Data Center Power Cost Spike
- Ireland and Singapore: AI Data Centers Are Straining Grids Globally
- Who Actually Gets the Bill for AI Data Center Energy Demand
- The Renewable Promise That Does Not Match the AI Data Center Reality
- Regulators Are Starting to Move on AI Data Center Energy Use
- What the AI Data Center Power Cost Spike Means for Every Business
Not just higher. Structurally, uncomfortably, budget-busting higher.
The kind of higher that forces an unplanned conversation with the CFO. The kind that makes procurement teams rethink contracts they assumed were locked in. And the engine behind it, more than any other single factor, is the relentless expansion of AI data centers.
Commercial power costs in the highest-density AI data center markets have surged 76 percent against a 2023 baseline, according to Wood Mackenzie’s Q1 2026 Power Market Outlook.
That number has been circulating in energy circles for weeks. It lands differently depending on who is reading it. For a hyperscaler with a long-term power purchase agreement in place, it is manageable. For a food processing plant in Ohio trying to renew its annual electricity contract, it is a different kind of problem entirely.
This is the part of the AI boom that does not make the keynote slides.
The Grid Was Not Designed for AI Data Centers at This Scale
To understand what is happening, start in Loudoun County, Virginia, a suburb of Washington, D.C. once known for horse farms and wine country.
Today, it processes an estimated 70 percent of the world’s internet traffic, according to the Northern Virginia Technology Council. Row after row of low-slung, fortress-like buildings sit behind security fencing, humming with compute power that touches everything from streaming video to mortgage approvals to the large language models people use to write emails.
The density of AI data centers in that single corridor has fundamentally changed how PJM Interconnection, the regional grid operator covering 13 states and the District of Columbia, thinks about the future.
PJM’s 2025 long-range transmission planning report revised load growth projections upward by more than 40 percent in a single planning cycle, driven primarily by AI data centers and compute clusters. The report noted, with the quiet urgency of an institution that cannot afford to panic publicly, that transmission infrastructure upgrades typically require 5 to 12 years to complete.
The AI data centers, meanwhile, are breaking ground in months.
That gap between how fast the machines are being built and how fast the grid can catch up is exactly where the 76 percent lives.
Microsoft, Google, and Amazon: Three Engines Behind the AI Data Center Surge

Microsoft committed more than $80 billion to global AI data center investment across 2025 and 2026, according to reporting by The Wall Street Journal.
Most of that capital is going into Azure infrastructure designed to support its deepening partnership with OpenAI, whose compute demands have grown faster than almost anyone anticipated. New AI data centers in Arizona, Wisconsin, and the United Kingdom have added significant new load to grids already running lean.
Google is not far behind. The company announced in February 2026 a $75 billion capital expenditure plan for the fiscal year, with AI data center construction accounting for the majority of that spend, per Bloomberg.
The scale is worth sitting with for a moment: $75 billion in a single year, on infrastructure that will require power at industrial scale for the next two to three decades.
Amazon Web Services has taken a quieter approach. It has methodically locked in power capacity across Georgia and Ohio through long-term contracts that effectively guarantee priority access to grid resources.
State utility commissions in both states have fielded complaints from industrial manufacturers and agricultural operators who argue those agreements crowd out traditional commercial users.
It is a reasonable grievance. The contracts were structured to give the largest AI data center operators certainty. The question no one asked loudly enough at the time is: certainty at whose expense?
The Anatomy of a Dangerous 76 Percent AI Data Center Power Cost Spike
When demand from AI data centers grows faster than supply in a regulated power market, utilities have to procure emergency capacity, upgrade transmission infrastructure, and maintain larger reserve margins.
Those costs do not disappear. They get passed through to commercial ratepayers on the next billing cycle.
In deregulated markets, the signal is faster and more brutal. Wholesale capacity prices spike, and large commercial buyers bear the exposure directly.
The clearest evidence showed up in February 2026, when PJM held its annual capacity auction for the 2026 to 2027 delivery year.
Clearing prices came in at $269.92 per megawatt-day for most of the region, up from $28.92 per megawatt-day the year before, according to figures published by PJM and widely reported by Reuters and Bloomberg.
A ninefold increase in the wholesale cost of reliability.
That figure will work its way through commercial electricity contracts across the region over the next 12 to 18 months, quietly, without fanfare, the way most infrastructure costs eventually find the people who cannot avoid them.
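For readers who want to run the arithmetic themselves, here is a rough back-of-the-envelope sketch in Python. The two clearing prices are the PJM figures above; the 5 MW peak load is an illustrative assumption, not drawn from any specific facility, and real capacity charges depend on how a utility allocates auction costs to its customers.

```python
# Rough pass-through estimate for PJM's capacity auction jump.
# Clearing prices are the reported PJM figures; the peak load is
# an illustrative assumption for a mid-size industrial facility.

OLD_PRICE = 28.92    # $/MW-day, prior delivery year
NEW_PRICE = 269.92   # $/MW-day, 2026-2027 delivery year
PEAK_LOAD_MW = 5.0   # assumed peak load (illustration only)
DAYS = 365

old_annual = OLD_PRICE * PEAK_LOAD_MW * DAYS
new_annual = NEW_PRICE * PEAK_LOAD_MW * DAYS

print(f"Old capacity cost: ${old_annual:,.0f}/yr")  # ~$52,779
print(f"New capacity cost: ${new_annual:,.0f}/yr")  # ~$492,604
print(f"Increase: {NEW_PRICE / OLD_PRICE:.1f}x")    # ~9.3x
```

On those assumed numbers, one facility’s annual capacity cost rises by roughly $440,000, which is the kind of line item that surfaces in a contract renewal rather than a press release.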
S&P Global Commodity Insights has corroborated the trend line that Wood Mackenzie documented. In Ashburn, Virginia, colocation providers have already begun renegotiating long-standing contracts with enterprise clients, according to reporting by Data Center Dynamics.
The old pricing no longer works.
Turns out what happens in Ashburn does not stay in Ashburn. These pricing signals move through supply chains, into commercial lease negotiations, into the operating models of anyone who runs something that plugs into a wall.
Ireland and Singapore: AI Data Centers Are Straining Grids Globally
The United States is not alone in this.
Ireland’s national grid operator, EirGrid, reported in its 2025 Generation Capacity Statement that AI data centers now account for approximately 21 percent of the country’s total electricity demand, up from roughly 5 percent a decade ago.
Twenty-one percent of a nation’s electricity budget flowing to facilities that primarily serve global technology corporations is the kind of statistic that tends to concentrate political minds.
EirGrid has stated publicly that it cannot guarantee capacity for new entrants without substantial new generation coming online. It has advised the Irish government to consider mandatory curtailment provisions for large commercial operators during peak demand periods.
Singapore dealt with this earlier than most. The government imposed a moratorium on new AI data center construction in 2019, only partially lifted it in 2022, and is once again reviewing the policy. Grid operators flagged that approved projects have already consumed available headroom in the city-state’s transmission infrastructure.
The Infocomm Media Development Authority has required minimum Power Usage Effectiveness ratings as a condition for development approval. AI data centers that do not meet the threshold do not get built. Full stop.
It is not a perfect system. But it is the most direct answer to a problem that most jurisdictions are still tiptoeing around.
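The metric behind Singapore’s approach is simple to state: Power Usage Effectiveness is total facility power divided by the power reaching the IT equipment itself, so 1.0 is the theoretical floor and everything above it is cooling and conversion overhead. The sketch below illustrates the math; the 1.3 threshold and the sample loads are assumptions for illustration, not official IMDA figures.

```python
# Power Usage Effectiveness (PUE) = total facility power / IT equipment power.
# A score of 1.0 would mean zero overhead; real facilities run higher.
# The threshold and loads below are illustrative assumptions.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

THRESHOLD = 1.3  # assumed approval threshold (illustration only)

site = pue(total_facility_kw=13_000, it_load_kw=10_000)
print(f"PUE = {site:.2f}, approved: {site <= THRESHOLD}")
```

A facility drawing 13 MW to deliver 10 MW of compute scores 1.30 on this metric; every kilowatt of overhead trimmed moves it further under whatever bar a regulator sets.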
In the United Kingdom, National Grid Electricity System Operator released data in early 2026 showing that grid connection requests from AI data center operators have created a queue stretching to 2036 in some regions.
The government responded by designating AI infrastructure as a strategic priority project category. The designation allows certain AI data center projects to bypass elements of the standard connection process.
What it cannot do is manufacture electricity that does not yet exist.
Who Actually Gets the Bill for AI Data Center Energy Demand
The hyperscalers themselves are largely protected, at least for now.
Microsoft, Google, and Amazon secured power through long-term contracts and direct utility partnerships that lock in rates well below current spot prices. They planned for this. They had the capital, the legal teams, and the foresight to structure their procurement before the market tightened around AI data center demand.
Small and mid-size manufacturers do not have those tools.
Office building operators, regional hospital systems, universities, logistics companies, and retail chains renew power contracts on shorter cycles, often annually, with limited ability to hedge. They are absorbing the spike in full.
The National Association of Manufacturers raised exactly this concern in a January 2026 policy submission to the Federal Energy Regulatory Commission. It argued that preferential capacity treatment for AI data center operators amounts to a cross-subsidy drawn from traditional industrial users.
FERC has not formally acted on that submission.
The commission is reviewing interconnection queue rules under a proposed rulemaking that would require more rigorous impact assessments before large new loads, including those from AI data centers, can access the grid. The process is deliberate by design. The grid stress is not waiting.
The Renewable Promise That Does Not Match the AI Data Center Reality
Every major hyperscaler has a clean energy commitment on the books.

Microsoft, Google, and Amazon all have net-zero or 100 percent renewable matching targets, announced with press releases and sustainability reports and earnest executive commentary.
The gap between those pledges and what is actually happening on the grid is significant.
Solar and wind capacity cannot be deployed as quickly as AI data center demand is being added. Nuclear, which offers the around-the-clock firm power that AI data centers actually require, has long development timelines and enormous capital requirements.
Microsoft’s investment in the restart of Three Mile Island’s Unit 1, a 20-year power purchase agreement with Constellation Energy reported by the Financial Times, represents a genuine attempt to close that gap through existing nuclear capacity.
Google has signed similar long-term agreements for nuclear power in the United States and Europe, including commitments with Kairos Power for small modular reactor capacity expected online in the early 2030s.
Still, these arrangements serve the procurement needs of the very largest AI data center operators. They do nothing to relieve the aggregate grid stress created by the broader demand surge.
For the utility operator managing the system in real time, a renewable pledge filed in a sustainability report and a natural gas peaker plant running at capacity on a summer evening are two entirely different things.
Regulators Are Starting to Move on AI Data Center Energy Use
The policy response has been uneven, which is about what you would expect from institutions not designed to move at the speed of capital markets.
In the United States, the Department of Energy’s Liftoff Report on AI data centers and the power grid, released in late 2024, identified the demand growth trajectory as one of the most significant reliability challenges facing the North American grid.
It called for accelerated permitting, streamlined interconnection processes, and updated building codes for AI data center efficiency. The report was thorough. Implementation is, as ever, a longer story.
The European Union has moved with more structure. The revised Energy Efficiency Directive requires large-scale AI data centers to report energy consumption data annually, with member states empowered to impose minimum efficiency standards.
The European Commission is expected to propose binding performance thresholds for new AI data center construction as part of its Industrial Decarbonization Strategy, with a consultation process ongoing through early 2026.
Singapore’s efficiency-first model is drawing serious attention from other jurisdictions. Ireland’s Commission for Regulation of Utilities is reportedly evaluating a comparable framework, according to the Irish Times. No formal proposal has been issued, but the conversation is happening with a seriousness it lacked a year ago.
What the AI Data Center Power Cost Spike Means for Every Business
If you run a business with meaningful energy costs, the 76 percent figure is not a background statistic. It is already showing up in your operating model, whether or not you have traced it to its source yet.
Energy procurement has moved, for an increasing number of large companies, from a back-office function to a boardroom priority.
Companies are locking in longer-term power purchase agreements, investing in on-site generation and storage, and in some cases relocating energy-intensive operations to markets where grid capacity is less constrained. CBRE’s March 2026 commercial real estate report noted that industrial land values adjacent to constrained power corridors have risen sharply as developers compete for sites with favorable grid access.
For founders and executives building AI-native companies, the energy question carries its own specific weight.
The cost of inference, meaning the electricity required to run large language models at scale, is among the fastest-growing line items in hyperscaler cost structures. Those costs flow downstream into API pricing, into SaaS margins, into the unit economics of every business built on top of AI data center infrastructure.
According to reporting by The Information, OpenAI spent more than $4 billion on compute in 2024, with that figure projected to climb sharply through 2026.
If wholesale power costs continue climbing at anything close to the rate implied by recent capacity auction results, the compute bill grows with them, and so does everything priced on top of it.
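The pass-through logic is worth making concrete. If electricity is some share of the cost of serving a query, a power price increase flows through in proportion to that share. The sketch below uses the article’s 76 percent figure, but the 40 percent power share and the $1.00 unit cost are illustrative assumptions, not figures from any provider.

```python
# Illustrative pass-through: how a power price increase moves a unit cost
# when power is only part of that cost. All inputs here are assumptions.

def cost_after_power_spike(base_cost: float, power_share: float,
                           power_increase: float) -> float:
    """New unit cost when the power component rises by `power_increase`."""
    power = base_cost * power_share
    other = base_cost * (1 - power_share)
    return other + power * (1 + power_increase)

# Assume power is 40% of a $1.00 unit of inference and rises 76%.
new_cost = cost_after_power_spike(1.00, power_share=0.40, power_increase=0.76)
print(f"${new_cost:.2f}")  # $1.30, roughly a 30% increase in unit cost
```

Even with power at well under half the cost stack, a 76 percent energy spike translates into a double-digit increase in everything priced on top of it.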
The industry has been remarkably effective at telling the story of AI as a productivity revolution, a creative force, an economic transformation. That story is not wrong. But it is incomplete.
Behind every model output, every AI-generated report and code suggestion and customer service interaction, there is a physical building inside a cluster of AI data centers drawing enormous amounts of power from a grid that was not designed to carry this load.
Someone is paying for that.
The question of whether the right people are paying for it, in the right proportion, is the regulatory and market question of the next several years.
It will be answered in capacity auction clearing prices, in FERC rulemaking proceedings, in utility commission hearings in Georgia, Ohio, Dublin, and Singapore, in the electricity contracts that mid-size manufacturers sign in 2027 when their current deals expire.
AI data centers do not run on ambition or investor conviction or a well-timed press release. They run on electrons. And right now, across the markets where they have concentrated most heavily, demand has outrun everything built to supply it.
That is the real cost of building the future this fast.