AMD adds ultra-fast memory to flagship Instinct AI accelerator as it waits for next-generation CDNA 4 architecture: The Instinct MI325X accelerator has twice the memory and 30% more bandwidth compared to Nvidia's H200.


AMD has unveiled new CPU, NPU, and GPU architectures aimed at “driving end-to-end AI infrastructure from the data center to PCs,” along with an expanded AMD Instinct accelerator roadmap and a new Instinct accelerator, the MI325X, which it says will be available in Q4 2024.

The new Instinct MI325X offers 288 GB of HBM3E memory and 6 TB/s of memory bandwidth. AMD says this gives it twice the memory capacity and 1.3 times the bandwidth of “the competition,” i.e. the Nvidia H200, as well as 1.3 times better compute performance.
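For context, those ratios roughly check out if one assumes Nvidia's published H200 figures of 141 GB of HBM3E and 4.8 TB/s of memory bandwidth (an assumption; the announcement does not spell out the baseline numbers). A minimal sketch of the arithmetic:

```python
# Sanity check of AMD's comparison against the H200.
# Baseline H200 specs (141 GB HBM3E, 4.8 TB/s) are assumed from Nvidia's
# public spec sheet, not taken from AMD's announcement.
mi325x = {"memory_gb": 288, "bandwidth_tbps": 6.0}
h200 = {"memory_gb": 141, "bandwidth_tbps": 4.8}

memory_ratio = mi325x["memory_gb"] / h200["memory_gb"]              # ~2.04x
bandwidth_ratio = mi325x["bandwidth_tbps"] / h200["bandwidth_tbps"]  # 1.25x

print(f"Memory capacity: {memory_ratio:.2f}x the H200")    # ≈ 2x
print(f"Memory bandwidth: {bandwidth_ratio:.2f}x the H200")  # ≈ 1.3x when rounded
```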
