SK hynix has highlighted that not only its 2024 but almost all of its 2025 HBM volume has been sold out as demand for AI reaches an all-time high.
During its recent press conference, SK hynix announced plans to invest in a new M15X fab in Cheongju and the Yongin Semiconductor Cluster in Korea, along with advanced packaging facilities in Indiana in the US.
SK hynix revealed that the growing demand for AI has exhausted all of its 2024 HBM capacity, and even its 2025 volume has almost entirely been sold out, which shows how essential fast HBM memory is for current and next-gen data centers. NVIDIA, one of SK hynix's key partners, is leveraging its HBM3 and HBM3E memory solutions for the Hopper H200 and the Blackwell AI GPU lineup. The company expects to begin sampling 12-Hi HBM3E DRAM soon, with production commencing in the coming quarter (Q3 2024).
The company forecasts a fast expansion of AI technology from data centers into a wider range of on-device applications such as smartphones, PCs, and automobiles
Demand for ultra-fast, high-capacity, and low-power memory products for AI applications is expected to show an explosive increase
The company has the industry’s best technologies for various products including HBM, TSV-based high-capacity DRAM, and high-performance eSSD
SK hynix is ready to provide the industry’s best customized memory solutions to customers through strategic collaboration with global business partners
On the production side, the 2024 HBM output is already sold out, while the 2025 volume is almost sold out
On the HBM technology side, the company plans to provide samples of 12-high HBM3E with the industry’s best performance in May, enabling the start of mass production in Q3
The company aims for qualitative growth through better cost competitiveness and higher profitability, driven by an increase in sales of value-added products
The plan is to continue to improve financial soundness by raising the level of cash holdings through flexible investment responses to changing demand circumstances
The company is committed to contributing to the domestic economy and helping advance Korea’s position as an AI memory powerhouse by growing into a company trusted by customers, one not swayed by business circumstances in the AI era
Besides HBM3E, SK hynix is also mass producing DRAM modules with capacities of more than 256 GB and has already commercialized the world's fastest LPDDR5T solution for mobile devices. Looking ahead, SK hynix plans to introduce several next-generation memory solutions such as HBM4, HBM4E, LPDDR6, 300 TB SSDs, CXL Pooled Memory Solutions, and PIM (Processing-In-Memory) modules.
In the DRAM space, the company is mass producing HBM3E and modules with an ultra-high capacity of more than 256GB, and has commercialized the world’s fastest LPDDR5T
The company is a top provider of AI memory in the NAND space as well, as the sole supplier of QLC-based SSDs of more than 60TB
Development of next-generation products with improved performance is underway
The company is planning to introduce innovative memory products such as HBM4, HBM4E, LPDDR6, 300TB SSDs, CXL Pooled Memory Solutions, and Processing-In-Memory
SK hynix’s proprietary MR-MUF (Mass Reflow Molded Underfill) is a core technology for HBM packaging
Views that MR-MUF will face technological challenges in higher stacking are incorrect, as seen in SK hynix’s successful mass production of 12-high HBM3 with Advanced MR-MUF technology
MR-MUF lowers the pressure from chip stacking to the 6% level, raises productivity by 4 times by shortening the time required for the process, and improves heat dissipation by 45% versus the previous technology
Advanced MR-MUF, recently introduced by SK hynix, improves heat dissipation by a further 10% by adopting a new protective material, while maintaining the existing advantages of MR-MUF
Advanced MR-MUF, which adopts a high-temperature, low-pressure methodology known for excellent warpage control, is an optimal solution for high stacking, and development of technology to realize 16-high stacking is underway
The company plans to adopt Advanced MR-MUF for the realization of 16-high HBM4, while preemptively reviewing Hybrid Bonding technology
Separately, the company announced last month a plan to build advanced packaging facilities for AI memory in West Lafayette, Indiana
Mass production of AI products such as next-generation HBM at the Indiana fab is slated to start in 2H 2028
For HBM, SK hynix will continue to leverage its MR-MUF technology for packaging the DRAM stacks. The advanced version of the technology is being used to mass produce 12-Hi HBM3 memory modules, increasing productivity by 4x and improving heat dissipation by 45% compared to previous technologies.
The same packaging technology will pave the way for 16-Hi HBM memory, and the company is also currently reviewing the use of Hybrid Bonding technology for its 16-Hi HBM4 modules. Mass production of this next generation of HBM memory is expected to commence at the Indiana fab by 2H 2028, while standard HBM4 modules are expected to enter mass production by 2026 for the next chapter in AI.