AMD has announced new details about its next-generation AI chips, the Instinct MI400 series, set to ship next year. These chips can be assembled into a complete server rack called Helios, enabling thousands of chips to work together as a unified "rack-level" system. This matters for AI customers, such as cloud service providers and companies developing large language models, who need systems that can span entire data centers and draw enormous amounts of power.
AMD's rack-level technology allows its latest chips to compete with NVIDIA's Blackwell-based systems, which combine 72 GPUs in a single rack. While NVIDIA dominates the data center GPU market, AMD aims to challenge that position with competitive pricing and lower operational costs. AMD says its MI355X chips, already in mass production, offer significant cost and performance advantages over NVIDIA's offerings.
AMD expects the AI chip market to exceed $500 billion by 2028. The company has acquired or invested in 25 AI firms over the past year to strengthen its capabilities. AMD's Instinct chips are already in use at major AI customers, including OpenAI and Tesla, and Oracle plans to offer its customers clusters with more than 131,000 MI355X chips. AMD's strategy also relies on open-source networking technology to tie its systems together, in contrast with NVIDIA's proprietary approach.