Samsung’s 8-Layer HBM3E Passes Quality Checks For NVIDIA’s AI Chips

NVIDIA's AI chips will finally start utilizing Samsung's HBM3E chips for faster performance and better efficiency after passing quality checks.
After denying reports that its HBM3E had failed NVIDIA's tests due to heat and high power consumption, Samsung has announced that its fifth-generation HBM chips, aka HBM3E, have finally passed NVIDIA's qualification tests for use in its data center products, Reuters reports.
Samsung has been trying to get its HBM3E approved for some time, and it was rumored a few weeks ago that it had passed the tests. With approval now confirmed, Samsung can supply its high-bandwidth memory chips for NVIDIA's GPUs and AI accelerators.
While the deal between the two companies is yet to be signed officially, Samsung is expected to start supplying HBM3E chips to NVIDIA in the fourth quarter of 2024. HBM3E brings a significant boost in memory bandwidth over the HBM3 used in NVIDIA's Hopper GPUs: while HBM3 pairs a 1024-bit data path with a pin speed of 6.4 Gb/s, HBM3E raises the pin speed to 9.6 Gb/s. This lifts per-stack memory bandwidth to over 1,200 GB/s, compared to 819 GB/s for HBM3.
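The bandwidth figures above follow directly from the bus width and pin speed. A minimal sketch of that arithmetic (the function name is illustrative, not from any vendor API):

```python
def hbm_bandwidth_gbps(bus_width_bits: int, pin_speed_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s: bus width (bits) x pin speed (Gb/s),
    divided by 8 to convert bits to bytes."""
    return bus_width_bits * pin_speed_gbps / 8

# HBM3: 1024-bit data path at 6.4 Gb/s per pin
hbm3 = hbm_bandwidth_gbps(1024, 6.4)    # ~819.2 GB/s
# HBM3E: same 1024-bit data path at 9.6 Gb/s per pin
hbm3e = hbm_bandwidth_gbps(1024, 9.6)   # ~1228.8 GB/s
print(f"HBM3: {hbm3:.1f} GB/s, HBM3E: {hbm3e:.1f} GB/s")
```

The roughly 50% bandwidth gain comes entirely from the faster pin speed; the data-path width is unchanged between the two generations.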
NVIDIA has been using HBM3 memory since June 2022, with SK Hynix as its main supplier of HBM3 chips. As AI, machine learning, and data analytics workloads demand more memory bandwidth at lower power consumption, HBM3E is currently the only viable successor to HBM3 for these tasks. Samsung did rework its HBM3E design to deliver the power efficiency and thermals that AI processors require, but it denied that those issues were the reason its newer chips had not been approved by NVIDIA.
While Samsung's 8-layer HBM3E has now passed NVIDIA's tests, SK Hynix is already on course to ship its 12-layer HBM3E chips, which the company says are fully booked for Q3 2024. SK Hynix hit its target yield of 80% back in May, significantly ahead of schedule. "Samsung is still playing catch up in HBM," says Dylan Patel, founder of SemiAnalysis, and SK Hynix has already deployed its HBM3E chips in NVIDIA's current H200 and next-gen Blackwell B100 GPUs.
Alongside Samsung and SK Hynix, Micron is the third major supplier of HBM and is also reportedly supplying its HBM3E chips to NVIDIA, having entered mass production in February this year. HBM3E is expected to account for up to 60% of total HBM chip sales by Q4 2024, while demand for HBM chips is estimated to grow at an annual rate of 82% through 2027, according to SK Hynix.
News Source: Reuters