SK Hynix, a key Samsung competitor, says its entire 2024 production of high-bandwidth stacked DRAM is already sold out. These chips are crucial for AI processors in data centers, but the company is staying tight-lipped about its biggest customers.
Kitae Kim, SK Hynix's newly appointed vice president and head of HBM sales and marketing, confirmed the news in an interview posted on SK Hynix's website.
“Proactively securing customer purchase volumes and negotiating more favorable terms for our high-quality products are the fundamentals of semiconductor sales operations,” Kim said. “When we have great products in our hands, it's a matter of speed. Our planned HBM production volume for this year has already been exhausted. Although 2024 is just beginning, we have already started preparing for 2025 to stay ahead of the market.”
'Highly sought after'
As EE News Europe points out, the shortage of HBM3 and HBM3E chips could potentially hamper growth in the memory and logic sectors of the semiconductor industry this year.
“HBM is a revolutionary product that has challenged the notion that semiconductor memory is just one part of an overall system. In particular, SK Hynix's HBM has exceptional competitiveness,” Kim added.
“Our advanced technology is highly sought after by global technology companies,” he added, leaving us wondering who his company's most important customers might be. Nvidia and AMD are known to be voracious for high-bandwidth memory chips, but there are other players in the highly competitive AI market that might be interested in snapping up HBM stock to avoid being left in the lurch.
Interestingly, while SK Hynix cannot manufacture enough of its current HBM products to meet the high demand, its main rivals in this space, Samsung and Micron, are now focusing on HBM3E. Micron has begun “volume” manufacturing of its 24GB HBM3E 8H, which will be used in Nvidia's latest H200 Tensor Core GPUs. At the same time, Samsung has started sampling its 36GB HBM3E 12H to customers, and this could well be the memory used in Nvidia's B100 Blackwell AI powerhouse, which is expected to arrive later this year.
However, SK Hynix won't be left behind for long. It is expected to begin manufacturing its own 36GB HBM3E in the first half of this year, following an upgrade of its Wuxi plant in China.