Highlights:
- Advanced Micro Devices Inc. launches next-generation graphics processing units, including the AMD Instinct MI325X accelerator, aimed at the AI computing market.
- The MI325X boasts superior memory capacity and bandwidth compared to Nvidia's H200 chip, positioning it competitively for AI workloads.
- New chips will be available for shipping in late 2024, with widespread availability anticipated from major platform providers starting in early 2025.
Advanced Micro Devices Inc. (NASDAQ: AMD) has introduced a new suite of next-generation graphics processing units (GPUs) in a strategic effort to capture a share of the burgeoning artificial intelligence (AI) computing market, traditionally dominated by Nvidia Corp. The flagship product in this lineup is the AMD Instinct MI325X accelerator, which has been specifically engineered for data center and AI applications.
AMD claims that the MI325X offers significant advantages over Nvidia's current flagship H200 chip, providing 1.8 times the memory capacity and 1.3 times the memory bandwidth. The MI325X is also designed to deliver 1.3 times the half-precision (FP16) and 8-bit floating-point (FP8) compute performance of the H200, making it a compelling option for demanding AI workloads such as large language models (LLMs).
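For readers who want a sense of where those multipliers come from, the back-of-the-envelope check below compares the two accelerators' published memory figures. The absolute numbers are assumptions drawn from the vendors' spec sheets at launch (141 GB of HBM3e and 4.8 TB/s for the H200, 256 GB of HBM3e and 6 TB/s for the MI325X) and do not appear in this article; this is a rough illustration of the claimed ratios, not an official benchmark.

```python
# Rough sanity check of AMD's relative claims against publicly listed specs.
# Assumed figures (not from this article): H200 = 141 GB HBM3e, 4.8 TB/s;
# MI325X at launch = 256 GB HBM3e, 6 TB/s.
h200 = {"memory_gb": 141, "bandwidth_tbs": 4.8}
mi325x = {"memory_gb": 256, "bandwidth_tbs": 6.0}

mem_ratio = mi325x["memory_gb"] / h200["memory_gb"]          # ~1.82
bw_ratio = mi325x["bandwidth_tbs"] / h200["bandwidth_tbs"]    # 1.25

print(f"Memory capacity ratio:  {mem_ratio:.2f}x")  # consistent with the ~1.8x claim
print(f"Memory bandwidth ratio: {bw_ratio:.2f}x")   # roughly the ~1.3x claim
```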
These advanced chips are slated to begin shipping in the fourth quarter of 2024, with broad system availability expected in the first quarter of 2025 from a range of platform providers, including Dell Technologies, Eviden, Gigabyte, Hewlett Packard Enterprise, Lenovo, Supermicro, and others.
Forrest Norrod, executive vice president and general manager of AMD's data center solutions business group, stated that the company is committed to delivering high-performance solutions that enable customers to bring AI infrastructure to market rapidly and at scale.
Additionally, AMD has previewed its upcoming MI350 chip series, which is positioned to rival Nvidia's forthcoming Blackwell chips. Built on the new CDNA 4 architecture, the MI350 series is projected to deliver up to a 35-fold improvement in AI inference performance over the current MI300 series, which is based on CDNA 3. Commercial availability of the MI350 series is anticipated in the second half of 2025.
Despite AMD's advancements, Nvidia continues to hold a technological lead, with its Blackwell chips already beginning to ship to customers. Analysts at Morgan Stanley forecast substantial revenue for Nvidia in the upcoming January quarter, further solidifying its position in the AI market. OpenAI recently announced the arrival of its first Nvidia B200 Blackwell chip, marking a significant milestone for Nvidia's product line.