South Korean memory giant SK Hynix has made several big announcements in recent months, including plans to build the world's largest chip factory and a mobile storage chip that could make phones and laptops faster.
The company has also begun collaborating with Taiwanese semiconductor foundry TSMC to develop and produce the next generation of high-bandwidth memory, known as HBM4, which promises a significant boost to HPC and AI performance and could end up in the rumored Nvidia H300 GPU.
At the recent 16th IEEE International Memory Workshop (IMW 2024), held in Seoul, South Korea, SK Hynix revealed more details about its plans for HBM4, which is expected to be widely available in 2026 (more on that in a second). Naturally, as the largest manufacturer of HBM, the company had a lot to say on the subject.
Accelerating HBM development
The company's development roadmap shows that HBM4 will use the same die density as HBM3E (24 Gb), but with 16 layers instead of 12. It will also offer 1.65 TB/s of bandwidth, up from HBM3E's 1.18 TB/s. Capacity rises from 36 GB to 48 GB, and each cube will carry 2,048 I/O pins, double that of its predecessor. SK Hynix also claims the chip's power consumption will be roughly halved.
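As a quick sanity check on those figures, the sketch below works through the arithmetic behind them: stack capacity follows from per-die density times layer count, and dividing total bandwidth by the pin count gives an approximate per-pin data rate. The per-pin rates are derived here for illustration and were not stated by SK Hynix.

```python
# Back-of-the-envelope check of the quoted HBM3E and HBM4 figures.
# Per-pin data rates below are our own derivation, not vendor numbers.

GBIT = 10**9    # decimal gigabit, as memory vendors typically quote
TBYTE = 10**12  # decimal terabyte

def stack_capacity_gb(die_density_gbit: int, layers: int) -> float:
    """Total stack capacity in GB from per-die density and layer count."""
    return die_density_gbit * layers / 8

def per_pin_rate_gbps(bandwidth_tbps: float, io_pins: int) -> float:
    """Approximate per-pin data rate in Gb/s from total stack bandwidth."""
    return bandwidth_tbps * TBYTE * 8 / io_pins / GBIT

# HBM3E: 24 Gb dies x 12 layers, 1.18 TB/s over 1,024 pins
print(stack_capacity_gb(24, 12))                 # 36.0 GB
print(round(per_pin_rate_gbps(1.18, 1024), 1))   # ~9.2 Gb/s per pin

# HBM4: 24 Gb dies x 16 layers, 1.65 TB/s over 2,048 pins
print(stack_capacity_gb(24, 16))                 # 48.0 GB
print(round(per_pin_rate_gbps(1.65, 2048), 1))   # ~6.4 Gb/s per pin
```

In other words, the headline bandwidth gain comes mainly from doubling the interface width, since the implied per-pin rate actually drops, which is consistent with the claimed reduction in power consumption.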
According to PC Watch, “the keynote also touched on the next-generation HBM4E module. Commercialization is planned for 2028, and maximum I/O bandwidth is likely to exceed 2 TB/s. Details such as storage capacity and the DRAM array configuration are still unknown.”
The really interesting part is that PC Watch also reports that a slide at the end of the keynote indicated the company's production schedule will be accelerated, with “the commercialization of the 'HBM4' module brought forward to 2025 and that of the 'HBM4E' module to 2026.”
If that's the case, SK Hynix is likely responding to the threat from archrival Samsung, which is developing its own HBM4 module that is expected to debut next year.