Samsung missed out on Nvidia's most expensive AI card, but it has beaten Micron to 36GB HBM3E memory. Could the new technology power the B100, the successor to the H200?

Samsung says it has developed the industry's first 12-stack HBM3E DRAM, the HBM3E 12H, surpassing Micron Technology and potentially laying the groundwork for Nvidia's next generation of AI cards.

The South Korean tech giant's HBM3E 12H offers bandwidth of up to 1,280GB/s and an industry-leading capacity of 36GB, with both figures improving by more than 50% over the 8-stack HBM3 8H.
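
As a rough sanity check on that claim, here is a minimal sketch. The HBM3 8H baseline figures (16GB capacity, roughly 819GB/s bandwidth) are assumptions based on commonly quoted 8-stack HBM3 specs, not numbers stated in this article:

```python
# Sanity check of the "more than 50% improvement" claim.
# Baseline HBM3 8H figures are assumed (typical 8-stack HBM3 specs),
# not taken from the article.

HBM3E_12H = {"capacity_gb": 36, "bandwidth_gbps": 1280}
HBM3_8H = {"capacity_gb": 16, "bandwidth_gbps": 819.2}  # assumed baseline

def pct_gain(new: float, old: float) -> float:
    """Percentage improvement of `new` over `old`."""
    return (new / old - 1) * 100

for key in ("capacity_gb", "bandwidth_gbps"):
    print(f"{key}: +{pct_gain(HBM3E_12H[key], HBM3_8H[key]):.0f}%")
# capacity_gb: +125%
# bandwidth_gbps: +56%
```

Under those assumed baseline figures, both gains comfortably exceed 50%, consistent with Samsung's claim.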
