NVIDIA to Initiate HBM3e Adoption In Q1 2024 With Hopper H200, HBM4 Expected to Debut by 2026

It looks like the HBM industry has centered itself around NVIDIA, as TrendForce estimates a market where NVIDIA's AI orders dominate both current and next-gen HBM supply.
According to the research firm, NVIDIA is set to hand a sizable portion of its HBM orders to the Korean giant Samsung, as the two companies build business ties that could prove vital for the AI industry.
It was reported back in September that Samsung had managed to gain Team Green's trust by passing multiple qualification checks for its HBM products, and TrendForce now discloses that Samsung could complete the process by December, with orders starting to flow in from the start of next year. Samsung would then be responsible for supplying memory for current-gen AI GPUs such as the in-demand H100 and A100.
Apart from HBM3, adoption of the next-gen HBM3e standard is on track as well. Industry sources say that suppliers such as Micron, SK Hynix, and Samsung have already begun sampling HBM3e, with qualification results expected sometime in 2024. As a quick recap, HBM3e is also expected to power NVIDIA's Blackwell AI GPUs, which are rumored to launch by Q2 2024 and should deliver decisive performance-per-watt uplifts through the adoption of a chiplet design.
Moving on to the interesting bit, NVIDIA has a lot planned for its customers in 2024. The company has already announced the H200 Hopper GPU, which is expected to see mass adoption next year, followed by the introduction of the B100 "Blackwell" AI GPUs, both of which will be based on HBM3e memory.
Apart from the conventional route, NVIDIA is also rumored to unveil ARM-based CPUs for the AI segment, diversifying the market and ramping up the competition. Intel and AMD are expected to introduce their own AI solutions as well, notably the next-gen AMD Instinct GPUs and Intel's Gaudi AI accelerators, both featuring HBM3e memory.
Finally, TrendForce gives a rundown of what we can expect from HBM4. The upcoming memory standard will come with a complete revamp of its onboard die configuration, as the base logic die is rumored to move to a 12nm process node for the first time and to act as the driver behind 3D-packaged DRAM/GPU stacks, fostering collaboration between foundries and memory suppliers. HBM4 is expected to mark the next-gen transition in computing power, and it could be the key to future breakthroughs in the AI industry.
News Source: TrendForce