HBM Industry to Witness Growth in Market Share With SK Hynix Holding the Throne

The HBM (High Bandwidth Memory) industry is experiencing an economic revival, courtesy of AI-driven demand. SK Hynix stands to capitalize the most on this exponential growth, as the company is expected to see a substantial increase in market share.
TrendForce reports that immense demand from the AI sector has left HBM suppliers facing a greater volume of orders than in previous quarters. HBM is a crucial component of AI GPUs, and the surge in demand for those GPUs has had a knock-on effect on suppliers such as SK Hynix.
In 2022, SK Hynix held the lion's share of the market at 50%, while Samsung and Micron stood at 40% and 10%, respectively. However, the landscape is set to shift drastically in the coming years.
Upcoming AI GPU releases, such as AMD's Instinct MI300 and NVIDIA's H100, are said to feature the next-gen HBM3 process. SK Hynix is currently the sole supplier of HBM3, with Samsung and Micron expected to commence production in 2024. That gives SK Hynix the upper hand, and the company is projected to strengthen its position further with a market share of 53%. Samsung and Micron are expected to see slight gains as well, though that will depend on buyer interest.
However, there is another development worth highlighting. Korean media (via Ctee) has reported that Micron is working with TSMC to have its 2nd-generation HBM3 memory incorporated into GPU packages for NVIDIA. In simple terms, Micron aims to position itself with TSMC as a supplier of HBM3 memory, since the process has received positive customer feedback. This matters for two reasons: it shows how competitive the market has become, and it could spell trouble for SK Hynix, which currently sits at the top.
TrendForce has also shared figures on expected AI server shipments for 2023. The race among companies to launch generative AI applications has driven up demand for the necessary hardware; as a result, AI server shipments are anticipated to grow by 15.4% in 2023, with a CAGR (Compound Annual Growth Rate) of 12.2% projected from 2023 to 2027.
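To put that growth rate in perspective, here is a minimal back-of-the-envelope sketch of what a 12.2% CAGR compounds to by 2027. The baseline is normalized to 1.0 as an assumption, since TrendForce's absolute shipment figures aren't quoted here.

```python
# Back-of-the-envelope illustration of the projected 12.2% CAGR (2023-2027).
# The 2023 baseline is normalized to 1.0 because absolute shipment figures
# aren't given in the article.
def compound_growth(base: float, rate: float, years: int) -> float:
    """Apply a compound annual growth rate over the given number of years."""
    return base * (1 + rate) ** years

shipments_2023 = 1.0   # normalized baseline (assumption, not a real figure)
cagr = 0.122           # 12.2% CAGR projected from 2023 to 2027

for year in range(2023, 2028):
    level = compound_growth(shipments_2023, cagr, year - 2023)
    print(f"{year}: {level:.3f}x the 2023 level")
```

Run over four compounding years, this works out to roughly 1.59x the 2023 level by 2027, i.e. close to 59% cumulative growth on top of the 15.4% jump expected this year.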
This rapid growth in AI server shipments goes hand in hand with rising memory usage per server. A typical AI server currently carries a total of 1.2–1.7 TB of memory, but as demand for greater computing capability grows, an individual AI server is projected to reach 2.2–2.7 TB, which explains the surge in memory demand.
It was previously reported that the AI boom has translated into an economic boost for companies like SK Hynix, whose shares have risen by almost 50% since the start of 2023. The company has also reportedly received a request from NVIDIA to sample its next-gen HBM3E DRAM. SK Hynix is clearly in the lead for now; however, with Samsung recently proposing a "hybrid" strategy to NVIDIA, the balance could still shift, though nothing is certain yet.
News Source: TrendForce