Microsoft aims to shift its AI infrastructure toward in-house data-center chips, a move that could reduce its reliance on Nvidia and AMD as demand for computing power surges. CTO Kevin Scott said the company will prioritize price-performance as it ramps deployment of Microsoft silicon, including the Azure Maia AI Accelerator and the Cobalt CPU, and designs full-stack systems spanning networking and advanced cooling. Scott described a severe shortage of compute capacity since ChatGPT's debut, noting that even aggressive buildouts are falling short. The strategy mirrors efforts at Google and Amazon to tailor chips to specific workloads as Big Tech's AI-related capital spending tops $300 billion.