Energy + AI Infrastructure

The Grid Cannot Wait

The White House convenes emergency talks. Utilities threaten to cut off data centers. And completed server farms sit dark, waiting for power that won't arrive for years. This week, the AI power crisis stopped being theoretical.

01

Washington Finally Notices the Lights Flickering

When the White House convenes an emergency meeting with bipartisan governors to discuss something as unglamorous as power grid capacity, you know we've passed a threshold. This week, that meeting happened, and the subtext was clear: the AI boom is straining American infrastructure to the breaking point.

The immediate trigger? PJM Interconnection, the grid operator serving 65 million people across the mid-Atlantic and Midwest, faces such severe capacity constraints that officials are now urging emergency power auctions to incentivize new plant construction. The governors present represented states where residents have seen electricity bills surge 12-16% in a single year—increases directly attributable to data center load subsidization.

[Chart: Residential electricity price increases in data center hub states. Virginia and Ohio lead the nation, both hosting major data center clusters.]

This marks a fundamental shift. For years, states competed aggressively for data center investment with tax breaks and cheap land. Now the calculus is inverting: those data centers demand so much power that the surrounding infrastructure cannot keep pace, and ratepayers are funding the gap. The federal government's involvement signals this is no longer a regional nuisance—it's becoming a national economic and security concern.

02

PJM Proposes the Unthinkable: Cut Off the Data Centers First

In what may become the defining regulatory battle of the AI era, PJM Interconnection dropped a proposal that has Big Tech scrambling: during grid emergencies, data centers would be among the first to lose power. Homes stay lit; server farms go dark.

The reaction was immediate and predictable. Amazon, Google, and Microsoft formally opposed the measure, calling it "discriminatory." Their argument: data centers provide critical infrastructure too—cloud services, financial systems, hospital records. Cutting them off isn't just an inconvenience; it could cascade into broader failures.

But PJM's counter-argument is just as compelling: demand is growing at 4.8% annually—roughly 3x the rate of just five years ago—and capacity additions cannot keep pace. Someone has to absorb the shortfall during emergencies, and asking families to sit in the dark while AI training runs continue is politically untenable.

The real question: If grid operators are already floating mandatory shutdowns, what happens when demand doubles again? PJM is fast-tracking new plant integration, but transmission line construction takes 7-10 years. The math doesn't work.
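As a quick sanity check on that claim, the doubling time implied by PJM's 4.8% growth rate can be computed directly. This is a back-of-envelope sketch using the standard compound-growth formula, not a PJM projection:

```python
import math

# Rule-of-thumb doubling time for compound demand growth.
# The 4.8% annual rate is the figure PJM cites; everything else is arithmetic.
growth_rate = 0.048

doubling_years = math.log(2) / math.log(1 + growth_rate)
print(f"Demand doubles in ~{doubling_years:.1f} years")  # ~14.8 years

# Transmission build-out is cited at 7-10 years: more than half a
# doubling period is consumed before new wires can carry the load.
```

Even under this simple model, a large share of each demand-doubling cycle is spent just waiting on transmission construction.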

Watch for this battle to move to FERC. The federal regulatory dimension is now unavoidable.

03

Amazon's 960-Megawatt Nuclear Bet Gets Real

For the past two years, tech giants have announced nuclear energy deals with the fervor of a Silicon Valley product launch: splashy press releases, vague timelines, enormous numbers. This week, Amazon moved from PowerPoint to project spec.

The company released detailed plans for its Washington State small modular reactor (SMR) facility: 12 X-energy modular reactors generating 960 megawatts total. Construction begins by decade's end. The broader target? 5 gigawatts of new Amazon-dedicated nuclear capacity online by 2039.
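For scale, the announced numbers work out as follows. This is simple arithmetic on the figures above; the per-module output is consistent with X-energy's roughly 80 MWe Xe-100 design:

```python
# Scale of Amazon's nuclear plan, from the figures in the announcement.
modules = 12
total_mw = 960
per_module_mw = total_mw / modules
print(f"Per-module output: {per_module_mw:.0f} MW")  # 80 MW

# The 5 GW target for 2039, expressed in Washington-sized campuses:
target_gw = 5.0
campuses = target_gw * 1000 / total_mw
print(f"Equivalent campuses needed: ~{campuses:.1f}")  # ~5.2
```

In other words, the Washington facility would need to be replicated roughly five times over to hit the 2039 target.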

Why nuclear, and why now? The answer is baseload. Solar and wind are cheap but intermittent. Natural gas is fast to deploy but exposes operators to fuel price volatility and emissions liability. Nuclear—especially modular nuclear—offers 24/7 output with zero carbon and a 40+ year operational lifespan. For a company planning to run AI infrastructure for decades, the economics make sense.

The challenge remains regulatory. SMR technology has never been deployed at scale in the U.S., and the licensing process remains slower than construction timelines abroad. Amazon's move forces the question: will the NRC streamline approvals, or will these projects die in permitting purgatory?

04

The Cruelest Irony: Built, But Powerless

In Silicon Valley, a peculiar form of tech ghost town has emerged: state-of-the-art data centers, fully constructed, completely empty. Not because there's no demand—there's overwhelming demand—but because there's no power to run them.

The Santa Clara utility confirmed this week that it cannot complete the grid upgrades necessary to energize several completed facilities until 2028. Interconnection wait times in some California regions have ballooned to 4-10 years. Developers who broke ground in 2024 expecting to go live by 2026 are now staring at multi-year delays.

This infrastructure lag represents a kind of supply chain crisis that doesn't fit neatly into the usual categories. The buildings exist. The servers are ready to ship. The customers are lined up. But the wires connecting generation to load simply aren't there, and building them requires easements, environmental reviews, and construction timelines measured in congressional terms, not quarters.

The result? A growing migration away from constrained metros. Texas, with its independent grid and faster permitting, is increasingly the destination of choice.

05

Texas Becomes the AI Industry's Power Oasis

While other grids struggle to add capacity, ERCOT—Texas's independent grid operator—just approved another 830 megawatts for Galaxy Digital's Helios campus in West Texas. That brings the single site's total capacity to over 1.6 gigawatts, making it one of the largest flexible loads in North America.

[Chart: ERCOT large load queue growth, from 40 GW in 2024 to 226 GW in January 2026, a 465% increase.]

The numbers are staggering. ERCOT's large load interconnection queue has exploded from 40 gigawatts in 2024 to 226 gigawatts as of this month. That's not a typo: in two years, the queue has grown by 465%. Much of this is speculative—projects that may never break ground—but even if 10% reaches completion, Texas is looking at a fundamental transformation of its power infrastructure.
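The queue figures above can be checked in a few lines. The ~85 GW comparison point is a rough historical ERCOT peak-demand figure, not a number from this article:

```python
# Verifying the queue-growth figures cited above.
queue_2024_gw = 40
queue_2026_gw = 226

pct_increase = (queue_2026_gw - queue_2024_gw) / queue_2024_gw * 100
print(f"Queue growth: {pct_increase:.0f}%")  # 465%

# Even a 10% completion rate implies enormous new load:
realized_gw = queue_2026_gw * 0.10
print(f"10% completion = {realized_gw:.1f} GW")  # 22.6 GW
# For scale: ERCOT's historical system peak is roughly 85 GW,
# so even the 10% scenario adds about a quarter of today's peak.
```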

Why Texas? Three reasons: an independent grid that doesn't require FERC approval for new interconnections, abundant land with minimal NIMBY resistance, and a political culture that views industrial development as inherently good. While California's data centers sit waiting for power, Texas is building the infrastructure to absorb the entire industry's growth.

The long-term implications are profound. If the AI industry becomes concentrated in a single state's grid, ERCOT's reliability decisions could determine which companies can operate and which cannot. Concentration risk cuts both ways.

06

Air Cooling Hits the Wall. Literally.

For decades, data center cooling meant one thing: push cold air over hot chips, exhaust the heat. Simple. Reliable. Boring. That era ended this week—not with a press release, but with physics.

The newest AI accelerators—NVIDIA's Blackwell series, AMD's MI400 family—generate power densities exceeding 1,000 watts per chip. At those levels, air cooling simply cannot remove heat fast enough. The chips throttle or fail. No amount of fans or chilled air changes the thermodynamic reality.
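A first-order heat balance illustrates the constraint. The sketch below uses textbook air properties and an assumed 15 °C allowable temperature rise; none of the values beyond the 1,000 W figure come from the chip vendors:

```python
# Why ~1,000 W chips strain air cooling: Q = m_dot * c_p * delta_T,
# solved for the mass flow needed to carry the heat away.
# Property values are standard textbook numbers, not vendor specs.

chip_power_w = 1000.0   # per-accelerator dissipation cited above
delta_t_c = 15.0        # assumed allowable air temperature rise

cp_air = 1005.0         # J/(kg*K), specific heat of air
rho_air = 1.2           # kg/m^3, air density at room conditions

m_dot_air = chip_power_w / (cp_air * delta_t_c)   # kg/s of air required
vol_flow_cfm = m_dot_air / rho_air * 2118.88      # m^3/s -> CFM

print(f"Airflow per chip: ~{vol_flow_cfm:.0f} CFM")  # ~117 CFM

# A typical single-phase dielectric coolant, for comparison
# (order-of-magnitude properties only):
cp_fluid = 1300.0       # J/(kg*K)
m_dot_fluid = chip_power_w / (cp_fluid * delta_t_c)
print(f"Coolant mass flow: ~{m_dot_fluid * 1000:.0f} g/s")
```

Over 100 CFM of directed airflow per chip, multiplied across tens of thousands of accelerators, is where fans and chilled air stop scaling; a dense liquid moves the same heat with a far smaller, quieter flow.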

Infinium's launch of its "Infinium Edge" immersion cooling platform this week is one of dozens of similar announcements. The industry is now forecast to grow at 51% CAGR through 2030, transforming data center architecture from rows of air-cooled racks to tanks of dielectric fluid with submerged electronics.

[Chart: Liquid cooling market projected to grow from $2 billion in 2024 to over $23 billion by 2030.]

This isn't just a component swap—it's a complete rethink of how data centers are built and operated. New facilities need plumbing infrastructure. Existing facilities need retrofitting. Technicians need retraining. The $200 billion data center construction pipeline now includes a $23 billion cooling system overhaul that almost no one budgeted for two years ago.
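The projection figures are internally consistent: compounding the 2024 market at the cited 51% rate for six years lands near the $23 billion number. This is a cross-check of the numbers above, not an independent forecast:

```python
# Cross-checking the market projection: does 51% CAGR take $2B (2024)
# to ~$23B by 2030?
base_2024_b = 2.0
cagr = 0.51
years = 6  # 2024 -> 2030

projected_b = base_2024_b * (1 + cagr) ** years
print(f"Implied 2030 market: ~${projected_b:.1f}B")  # ~$23.7B
```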

The Infrastructure Bottleneck Is the AI Bottleneck

The coming months will determine whether the AI boom continues unabated or slams into physical constraints that no amount of software optimization can overcome. Power generation, transmission capacity, and cooling infrastructure are now the binding constraints on AI progress—not algorithms, not talent, not capital. The companies and regions that solve the energy problem will define the next decade of the industry. Everyone else will wait.