SK Hynix To Develop 96 GB & 128 GB DDR5-Based CXL 2.0 Memory Solutions Next Year

SK Hynix is working towards the development of DDR5-based CXL 2.0 memory solutions for the AI segment, especially for "memory-hungry" LLMs.

CXL stands for Compute Express Link, an interconnect standard built on top of PCIe that enables fast, coherent data sharing between the CPU and attached devices such as GPUs and, in the case of AI, dedicated accelerators.

CXL improves on conventional PCIe interfaces by letting devices such as GPUs access system memory coherently, which can enhance performance significantly. The protocol has recently drawn massive interest from the markets, given that current memory configurations are unable to provide the capacity required by large-scale AI models. To address this, SK Hynix and other firms are working on adopting CXL-based memory solutions.
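As a rough illustration of what memory expansion over CXL looks like in practice: on current Linux kernels, a CXL Type 3 memory expander is typically enumerated as a CPU-less NUMA node once it is onlined, so software can target it with standard NUMA tooling. The node number and application name below are assumptions for the sketch, not something from this article.

```shell
# Hedged sketch, assuming a Linux host with a CXL memory expander
# already onlined by the kernel as NUMA node 1 (node number varies).

# List NUMA nodes; a CXL-attached expander shows up as a node
# with memory but no CPUs.
numactl --hardware

# Bind a hypothetical application's allocations to the CXL-backed
# node, so its working set lives in the expanded memory pool.
numactl --membind=1 ./my_app
```

This is how "more memory capacity" over CXL is usually consumed today: the OS presents the device-attached DRAM as just another memory tier, and placement is controlled with existing NUMA mechanisms.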

SK Hynix's Vice President of System Architecture discussed the possibilities of developing CXL memory modules at an event in Korea, explaining how the firm is positioning itself in this segment. Here is what the official had to say:

Currently, AI memory is essentially the Graphics Processing Unit (GPU), but the High Bandwidth Memory (HBM) is not that large, so it is always 'memory hungry'. We are diligently working on tuning CXL memory for AI.

Kyoung Park - VP of Research at SK hynix (via Business Korea)

SK Hynix is reportedly working on DDR5-based 96 GB and 128 GB CXL 2.0 memory products, which it plans to bring to market by the second half of 2025. The CXL interconnect lets other components on the system access this memory directly, while also making room for additional memory capacity beyond what the CPU's own slots support.

Recently, Panmnesia, a Korean startup, showcased a cutting-edge CXL IP that allows GPUs to tap memory from DRAM or even SSDs, expanding beyond their built-in HBM, which points to another direction this technology can take.

Unfortunately, SK Hynix hasn't showcased its CXL 2.0 memory solutions yet, so we cannot comment on their effectiveness for AI applications right now. However, Samsung is expected to bring in its own 256 GB CXL 2.0 memory module this year, which might give us a clearer idea of what to expect from this technology moving forward.
