Micron’s Partnership With NVIDIA For H200 AI GPUs Shocks SK Hynix & Others In The HBM Race
Micron's recent partnership with NVIDIA has reportedly "shocked" others involved in the HBM race, as the firm manages to capture most of the spotlight.

Korean media reports that Team Green's decision to use Micron's HBM3E for its H200 AI GPU has stirred up a fresh debate among HBM manufacturers, with Micron capturing one of the most coveted design wins in the next-gen memory market.

Micron's 24 GB 8-high (8H) HBM3E will be part of NVIDIA's upcoming H200 AI accelerator, marking a massive achievement for a company that has lagged significantly behind SK hynix and Samsung in this segment. Micron's strong showing with HBM3E proved vital in securing the agreement with NVIDIA, tilting things in the company's favor. Here is how Micron describes the capabilities of its HBM3E (a quick check of the bandwidth arithmetic follows the list):

  • Superior Performance: With pin speed greater than 9.2 gigabits per second (Gb/s), Micron's HBM3E delivers more than 1.2 terabytes per second (TB/s) of memory bandwidth, enabling lightning-fast data access for AI accelerators, supercomputers, and data centers.
  • Exceptional Efficiency: HBM3E leads the industry with ~30% lower power consumption compared to competitive offerings. To support increasing demand and usage of AI, HBM3E offers maximum throughput with the lowest levels of power consumption to improve important data center operational expense metrics.
  • Seamless Scalability: With 24 GB of capacity today, HBM3E allows data centers to seamlessly scale their AI applications. Whether for training massive neural networks or accelerating inferencing tasks, Micron's solution provides the necessary memory bandwidth.
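
For context on the first bullet, the quoted figure follows from simple arithmetic: per-stack bandwidth is the per-pin data rate multiplied by the interface width. The sketch below is a minimal back-of-the-envelope check that assumes the standard 1024-bit HBM3E stack interface (assumed here, not stated in the article) and uses illustrative pin speeds.

```python
# Back-of-the-envelope check of Micron's HBM3E bandwidth claim.
# Assumption (not from the article): one HBM3E stack uses the standard
# 1024-bit-wide interface, so per-stack bandwidth = pin speed x bus width.

def hbm3e_stack_bandwidth_tbps(pin_speed_gbps: float, bus_width_bits: int = 1024) -> float:
    """Per-stack bandwidth in TB/s from per-pin data rate (Gb/s) and interface width (bits)."""
    bits_per_second = pin_speed_gbps * 1e9 * bus_width_bits
    return bits_per_second / 8 / 1e12  # bits -> bytes, then bytes/s -> TB/s

if __name__ == "__main__":
    for pin_speed in (9.2, 9.6):  # illustrative data rates
        print(f"{pin_speed} Gb/s per pin -> ~{hbm3e_stack_bandwidth_tbps(pin_speed):.2f} TB/s per stack")
    # 9.2 Gb/s works out to ~1.18 TB/s, so it takes a pin speed "greater than 9.2 Gb/s"
    # (roughly 9.4 Gb/s and up) to clear the 1.2 TB/s figure Micron quotes.
```
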
However, Micron gaining NVIDIA's trust doesn't mean the firm will gain an edge over the segment's behemoth, SK hynix. It was recently reported that SK hynix has sent its 12-layer HBM3E samples to NVIDIA for qualification tests, which means the firm could join the pool of suppliers for Team Green's upcoming AI solutions. Given that SK hynix has long been a major supplier to NVIDIA, it's doubtful that Micron will take its place, although the balance could shift when it comes to market share; as of now, SK hynix retains an overall 54% share of the HBM industry.

However, one thing is certain: with how the HBM market is evolving, we will witness a highly competitive segment, with the likes of SK hynix, Micron, and Samsung battling for the market's throne. Each firm will need to scale up the production volume it can offer, along with the price-to-performance of its respective HBM products.

News Source: Korea JoongAng Daily
