NVIDIA Spends Big On HBM3E Memory Purchases For Upcoming Hopper H200 & Blackwell B100 AI GPUs


NVIDIA has reportedly spent big on HBM3E memory as it tries to secure enough stock for its next-gen Hopper H200 & Blackwell B100 AI GPUs.

Korean outlet Biz.Chosun reports that NVIDIA has placed orders for huge volumes of HBM3E memory from the likes of SK hynix and Micron to prepare for its next-gen products aimed at the AI segment. Locking up supply this way restricts other companies' access to the memory, leaving them with limited stock or none at all.

The report states that NVIDIA has prepaid around 700 billion to 1 trillion won for HBM3E memory. That is a wide range, but given the huge demand across the industry, we expect the actual amount to be near the trillion-won mark. That works out to roughly $775 million US per supplier for the pre-payments alone, with the combined figure likely exceeding a billion US dollars.

According to industry sources on the 26th, SK hynix and Micron are each known to have received between 700 billion and 1 trillion won in advance payments from NVIDIA to supply cutting-edge memory products.

NVIDIA's large advance payments have spurred investment by memory semiconductor companies that were struggling to expand HBM production capacity. In particular, SK hynix, the largest supplier, is reportedly planning to funnel the advance payment from NVIDIA into expanding its TSV (through-silicon via) facilities, which are the bottleneck in HBM production capacity. This is evidenced by the fact that work on establishing a new TSV line proceeded smoothly in the third quarter of last year. Micron's investment in TSV facilities is likewise expected to receive a boost.

Meanwhile, Samsung Electronics is also reported to have recently completed HBM3 and HBM3E product qualification tests with NVIDIA and signed a supply contract.

via Biz.Chosun (Machine Translated)
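For readers who want to check the currency math above, here is a minimal sketch of the conversion. The 1,290 KRW/USD exchange rate is an assumption used purely for illustration, not a figure from the report.

```python
# Minimal sketch: converting the reported KRW prepayments to USD.
# The exchange rate below is an assumption for illustration, not from the report.
KRW_PER_USD = 1290  # assumed ballpark rate

def krw_to_usd(krw: float) -> float:
    """Convert Korean won to US dollars at the assumed rate."""
    return krw / KRW_PER_USD

# Reported per-supplier prepayment range: 700 billion to 1 trillion won.
low_krw, high_krw = 700e9, 1e12

print(f"Per supplier: ${krw_to_usd(low_krw) / 1e6:,.0f}M to ${krw_to_usd(high_krw) / 1e6:,.0f}M")
# -> Per supplier: $543M to $775M
print(f"Both suppliers combined (upper end): ${krw_to_usd(2 * high_krw) / 1e9:.2f}B")
# -> Both suppliers combined (upper end): $1.55B
```

Under that assumed rate, the numbers line up with the roughly $775 million per-supplier figure and the over-a-billion combined total cited above.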

The AI industry is evolving rapidly, with companies like Intel and AMD ramping up their development to chase down NVIDIA in the segment, so Team Green must take "preemptive" measures to retain its dominance.

As for HBM3E itself, NVIDIA currently plans to debut the memory in its next-gen Blackwell AI GPUs, which are rumored to launch by Q2 2024 and are expected to deliver decisive gains in performance per watt through the adoption of a chiplet design. NVIDIA's Hopper H200 GPU will also be equipped with the world's fastest HBM3E memory, so the new memory standard carries a lot of significance for NVIDIA's success in the AI & HPC markets.

NVIDIA has high revenue expectations for the data center segment, so it is natural for the company to stay a step ahead of others. NVIDIA plans to generate a whopping $300 billion through AI-driven sales by 2027, which makes ensuring a steady supply of its AI GPUs to customers a primary target. These HBM3E pre-orders should also give the HBM industry a sizable boost, especially when it comes to expanding facilities, since order backlogs have been a major issue for firms like SK hynix and Micron; prepaid, pre-committed supply makes delivering HBM considerably easier.

NVIDIA is currently in a dominant position and doesn't look ready to give up its throne yet. The advancements Team Green has made in its CUDA platform, along with its hardware, have reshaped the market, and it will be interesting to see what the future holds.

News Source: Chosun Biz
