AI’s Biggest Bottleneck Isn’t Chips — It’s America’s Power Grid

A recent report from Goldman Sachs highlights that the largest hurdle facing the U.S. in its AI ambitions isn’t a lack of compute or talent, but the strain on its electricity infrastructure. AI-driven data centers currently account for around 6% of U.S. electricity demand and are projected to reach 11% by 2030.

The power-grid challenge is being exacerbated by shrinking “spare” capacity — the buffer between total generation and peak demand. In recent years, U.S. spare generation capacity has fallen from roughly 26% to 19%. Goldman projects that if AI demand continues unabated, spare capacity could drop below a critical threshold of 15%, raising risks of grid instability.

Part of the tension stems from the fact that while AI infrastructure is expanding rapidly (the U.S. now hosts about 44% of global data-center capacity), power-plant additions and transmission build-out are proceeding more slowly. Meanwhile, other nations such as China are expanding power-generation capacity more aggressively — Goldman estimates China could have around 400 GW of spare power capacity by 2030.

The implications are significant for industry, policymakers and infrastructure planners. If the power bottleneck isn’t addressed, it could slow AI rollout, raise operating costs, and even shift strategic advantage toward regions with more robust energy systems. For the AI race, winning on hardware and algorithms may matter less than having reliable, abundant electricity.

About the author

TOOLHUNT

Effortlessly find the right tools for the job.
