AMD's Chief Technology Officer has said that artificial intelligence (AI) inference, the step in which a trained model produces predictions from new data, will increasingly run on personal devices such as phones and laptops rather than remaining concentrated in large data centers. Such a shift would decentralize AI processing and bring AI-powered applications closer to the edge of the network, where they can respond with lower latency and without a round trip to a remote data center. The comment reflects a broader industry emphasis on edge computing and suggests that technology companies will need to adapt their strategies and products to more distributed AI workloads.
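
As a concrete illustration of what on-device inference looks like in practice, the minimal sketch below loads a pre-exported neural network and runs a single prediction locally with ONNX Runtime, one common runtime for laptop- and phone-class hardware. The model file name, input shape, single-output assumption, and CPU execution provider are illustrative choices for this example, not details from AMD's remarks.

```python
import numpy as np
import onnxruntime as ort

# Load a locally stored model; "image_classifier.onnx" is a placeholder path,
# not a model referenced in the statement. Everything runs on this machine.
session = ort.InferenceSession(
    "image_classifier.onnx",
    providers=["CPUExecutionProvider"],  # execute entirely on the local CPU
)

# Build a dummy input matching an assumed 1x3x224x224 image tensor.
input_name = session.get_inputs()[0].name
dummy_image = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Inference happens here, on-device, with no call to a remote data center.
# Assumes the model has a single output tensor of class scores.
(logits,) = session.run(None, {input_name: dummy_image})
print("Predicted class:", int(np.argmax(logits.squeeze())))
```

The same pattern, a model file paired with a local runtime, is what laptop and phone accelerators are designed to speed up, and it is the kind of workload the comment anticipates shifting out of data centers.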
