Investors are pouring an estimated $3 trillion into AI-ready data centers through 2029, according to Morgan Stanley, as hyperscalers race to build dense, power-hungry facilities optimized for training large language models. Unlike traditional server farms, AI sites pack $4 million cabinets of Nvidia chips tightly together to minimize nanosecond-scale latency and enable massive parallel processing—design choices that push electricity demand into the gigawatt range and create sharp, hard-to-manage load spikes. Operators are pursuing a patchwork of energy strategies, from gas turbines operating off-grid to long-term bets on nuclear and renewables. Microsoft has struck nuclear-related deals through Constellation, Google is targeting 24/7 carbon-free power by 2030, and Amazon touts itself as the largest corporate buyer of renewable energy. Regulators are starting to scrutinize side effects, including heavy water use; Virginia is weighing approvals tied to consumption, and a proposed UK site faces pushback from a regional utility. While some warn of “bragawatts” and question whether the spending is sustainable, proponents argue AI’s impact could justify unprecedented buildouts, calling these facilities the “real estate of the tech world.”