Big Tech’s race to build AI capacity is colliding with the realities of the U.S. power system. Goldman Sachs estimates that, after a decade of flat demand, roughly 50 gigawatts of additional capacity will be needed to serve new data centers, triggering a wave of utility spending on equipment and transmission. The scramble to secure grid hookups has produced ambitious interconnection requests and fast-tracked deals, even as the business case for some projects shifts.
That leaves utilities—and potentially customers—exposed if demand doesn’t materialize. Grid connections can run about $102 per kilowatt; when projects are delayed or canceled, those costs risk being socialized through higher rates or absorbed by utility investors. Microsoft has walked away from data centers totaling about 2 gigawatts, underscoring uncertainty around AI compute needs and hardware supply.
Regulatory scrutiny is intensifying. PJM’s market monitor urged FERC to reject a PECO-Amazon transmission pact, a sign that fast-tracked arrangements may face pushback. For utilities, surging AI load promises growth but also raises planning, regulatory, and execution risks. For consumers, the AI boom could mean higher bills before any benefits reach the broader economy.
Related article:
EIA: U.S. utilities forecast faster electricity demand growth from data centers and electrification