Samsung says it has developed the industry's first 12-stack HBM3E 12H DRAM, surpassing Micron Technology and potentially laying the groundwork for Nvidia's next generation of AI cards.
The South Korean tech giant's HBM3E 12H offers bandwidth of up to 1,280GB/s and an industry-leading capacity of 36GB, a more than 50% improvement in both metrics over the previous 8-stack HBM3 8H.
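The headline figures can be sanity-checked with back-of-the-envelope arithmetic. The sketch below assumes a 24Gb (3GB) DRAM die, the standard 1,024-bit HBM interface, and a 10Gbps per-pin data rate; none of these assumptions are stated in the article itself.

```python
# Rough check of the headline capacity and bandwidth numbers.
# Assumed figures (not from the article): 3GB per die, 1024-bit bus, 10Gbps/pin.

die_capacity_gb = 3            # assumed 24Gb DRAM die -> 3GB
capacity_12h = die_capacity_gb * 12   # 12-stack -> 36 GB
capacity_8h = die_capacity_gb * 8     # 8-stack  -> 24 GB

print(capacity_12h, "GB vs", capacity_8h, "GB")   # 36 GB vs 24 GB
print(capacity_12h / capacity_8h)                 # 1.5 -> the ">50% improvement"

# Bandwidth = per-pin speed (Gbps) * interface width (bits) / 8 bits per byte
pin_speed_gbps = 10      # assumed per-pin data rate
bus_width_bits = 1024    # standard HBM interface width
bandwidth_gbs = pin_speed_gbps * bus_width_bits / 8
print(bandwidth_gbs, "GB/s")   # 1280.0 GB/s
```

Under these assumptions the numbers line up: 12 stacked dies yield 36GB (1.5x the 8-stack's 24GB), and a 10Gbps pin speed across a 1,024-bit interface gives the quoted 1,280GB/s.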
The 12-stack HBM3E 12H uses an advanced thermal compression non-conductive film (TC NCF), which allows the 12-layer stack to meet current HBM package requirements while maintaining the same height as 8-layer products. These advancements result in a more than 20% increase in vertical density compared to Samsung's HBM3 8H product.
The battle heats up
“Industry AI service providers increasingly require higher-capacity HBMs, and our new HBM3E 12H product has been designed to respond to that need,” said Yongcheol Bae, executive vice president of memory product planning at Samsung Electronics. “This new memory solution is part of our drive to develop core technologies for high-capacity HBM and to provide technology leadership for the high-capacity HBM market.”
Meanwhile, Micron has begun mass production of its 24GB HBM3E 8H, which will be used in Nvidia's latest H200 Tensor Core GPUs. Micron claims its HBM3E consumes 30% less power than competing products, which the company says makes it well suited to generative AI applications.
Although Samsung missed out on supplying Nvidia's more expensive AI card, its 36GB HBM3E 12H beats Micron's 24GB HBM3E 8H in both capacity and bandwidth. As AI applications continue to grow, Samsung's HBM3E 12H positions it well for future systems requiring more memory, such as Nvidia's powerful B100 Blackwell AI GPU, which is expected to arrive later this year.
Samsung has already begun sampling its 36GB HBM3E 12H to customers, and mass production is expected to begin in the first half of this year. Micron will begin shipping its 24GB HBM3E 8H in the second quarter of 2024. Competition between the two memory giants is expected to intensify as demand for high-capacity memory solutions continues to grow in the era of AI.