- Samsung begins commercial shipments of HBM4 as AI memory competition heats up
- HBM4 hits per-pin speeds of 11.7 Gbps, bringing bandwidth and efficiency gains for data centers
- Samsung expands production plans with a roadmap extending to HBM4E and custom memory variants
Samsung says it has not only begun mass production of HBM4 memory but has also shipped the first commercial units to customers, which it claims is an industry first for the new generation of high-bandwidth memory.
HBM4 is based on Samsung's sixth-generation 10nm-class (1c) DRAM process and uses a 4nm logic base die, which reportedly helped the South Korean memory giant achieve stable performance without redesigns as it ramped up production.
This is a technical claim that will likely be tested once large-scale deployments begin and independent performance results appear.
Up to 48GB capacity
The new memory runs at a per-pin transfer speed of 11.7 Gbps, with headroom of up to 13 Gbps in certain configurations.
Samsung compares that to the 8 Gbps industry baseline, with its own HBM3E reaching 9.6 Gbps. Total memory bandwidth rises to 3.3 TB/s per stack, roughly 2.7 times that of the previous generation.
Capacity ranges from 24GB to 36GB in 12-layer stacks, with 16-layer versions coming later that could push capacity to 48GB for customers who need denser configurations.
Power usage is a key concern as HBM designs add more pins, and this generation moves from a 1,024-pin to a 2,048-pin interface.
Samsung says it improved power efficiency by about 40% compared to HBM3E through low-voltage circuit design and adjustments to power distribution, along with thermal changes that improve heat dissipation.
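Those headline figures hang together arithmetically: a stack's bandwidth is simply the per-pin speed multiplied by the interface width. The quick sketch below is our own illustrative arithmetic based on the numbers quoted above, not a Samsung specification, and the function name is ours.

```python
# Illustrative arithmetic only, based on the figures quoted in this article.
# Stack bandwidth = per-pin data rate x number of pins, converted from Gbit/s to TB/s.

def stack_bandwidth_tbps(pin_speed_gbps: float, interface_bits: int) -> float:
    """Per-stack bandwidth in TB/s from per-pin speed (Gbit/s) and interface width (bits)."""
    return pin_speed_gbps * interface_bits / 8 / 1000  # bits -> bytes, then GB -> TB

hbm3e = stack_bandwidth_tbps(9.6, 1024)       # ~1.2 TB/s
hbm4_base = stack_bandwidth_tbps(11.7, 2048)  # ~3.0 TB/s
hbm4_peak = stack_bandwidth_tbps(13.0, 2048)  # ~3.3 TB/s, matching the quoted peak figure

print(f"HBM3E: {hbm3e:.1f} TB/s")
print(f"HBM4:  {hbm4_base:.1f}-{hbm4_peak:.1f} TB/s, up to {hbm4_peak / hbm3e:.1f}x HBM3E")
# The 16-layer 48GB option likewise implies roughly 3GB per DRAM die (36GB / 12 layers).
```

Read that way, the quoted 3.3 TB/s lines up with the upper 13 Gbps speed on a 2,048-bit interface, and the roughly 2.7x uplift is measured against HBM3E's roughly 1.2 TB/s per stack.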
“Rather than taking the conventional path of using existing proven designs, Samsung took the leap and adopted the most advanced nodes such as 1c DRAM and 4nm logic process for HBM4,” said Sang Joon Hwang, executive vice president and head of memory development at Samsung Electronics.
“By leveraging our process competitiveness and design optimization, we can ensure substantial performance margin, allowing us to meet our customers' increasing demands for higher performance, when they need it.”
The company also points to its manufacturing scale and in-house packaging as key reasons it can meet expected demand growth.
That includes closer coordination between foundry and memory teams, as well as partnerships with GPU manufacturers and hyperscalers creating custom AI hardware.
Samsung says it expects its HBM business to grow significantly throughout 2026, with HBM4E samples planned for later this year and customized HBM samples in 2027.
Whether competitors respond with similar timelines or faster alternatives will determine the duration of this initial advantage.