OpenAI struck a seven-year, $38 billion agreement to run its AI workloads on Amazon Web Services, securing access to vast fleets of Nvidia GB200 and GB300 chips as it scales ChatGPT and trains next-generation models. The pact, which begins immediately and ramps through 2026 with room to expand in 2027, bolsters AWS after investor concerns that it was lagging Microsoft and Google in the AI race; Amazon shares jumped about 4.7% on the news. The move follows OpenAI's restructuring, which loosened Microsoft's preferential rights, and comes as the startup outlines $1.4 trillion in long-term compute ambitions. That scale of buildout raises fresh questions about energy use: labs estimate AI data centers could consume up to 12% of U.S. power by 2028. Industry observers say the deal is about securing cost-effective GPU capacity rather than new content access, while exuberant AI valuations and spending have stoked bubble worries.