SK hynix Begins Sampling HBM3E, Volume Production Planned For H1 2024

2023-08-21 By admin

SK hynix on Monday announced that it has completed initial development of its first HBM3E memory stacks and has begun sampling the memory to a customer. The updated (“extended”) version of the high bandwidth memory technology is scheduled to begin shipping in volume in the first half of next year, with hardware vendors such as NVIDIA already lining up to incorporate the memory into their HPC-grade compute products.

First revealed by SK hynix back at the end of May, HBM3E is an updated version of HBM3 that is designed to clock higher than the current memory, though specific clockspeed targets seem to vary by manufacturer. For SK hynix, as part of today’s disclosure the company revealed that their HBM3E memory stacks will be able to hit data transfer rates as high as 9 GT/sec per pin, which, across HBM’s standard 1,024-bit interface, translates to a peak bandwidth of 1.15 TB/sec for a single memory stack.
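
As a quick sanity check, that bandwidth figure follows directly from the per-pin data rate and the interface width; the short Python sketch below reproduces the math. The 1,024-bit width is taken from the HBM3 standard and assumed to carry over to HBM3E, since SK hynix’s announcement does not state it explicitly.

```python
# Back-of-the-envelope check of SK hynix's quoted HBM3E bandwidth.
# Assumption: HBM3E keeps HBM3's 1,024-bit-wide interface per stack.

data_rate_gtps = 9.0          # transfers per second per pin, in GT/s (from SK hynix)
interface_width_bits = 1024   # standard HBM3 interface width (assumed to carry over)

# Each transfer moves one bit per pin, so:
bandwidth_gbps = data_rate_gtps * interface_width_bits / 8   # in GB/s
print(f"{bandwidth_gbps:.0f} GB/s = {bandwidth_gbps / 1000:.2f} TB/s per stack")
# -> 1152 GB/s = 1.15 TB/s, matching the figure quoted above.
```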

Curiously, SK hynix has yet to reveal anything about the planned capacity for their next-gen memory. Previous research from TrendForce projected that SK hynix would mass produce 24 GB HBM3E stacks in Q1 2024 (in time to address applications like NVIDIA’s GH200 with 144 GB of HBM3E memory), boosting capacity over today’s 16 GB HBM3 stacks. And while this still seems likely (especially in light of the NVIDIA announcement), for now it remains unconfirmed.
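
For context on how those capacity figures line up, six 24 GB stacks work out to exactly the 144 GB cited for GH200; the sketch below just spells out that arithmetic. The six-stack count is an assumption on our part, based on the six HBM placements used on NVIDIA’s H100-class GPUs, and is not something SK hynix or TrendForce has confirmed.

```python
# How projected 24 GB HBM3E stacks map to NVIDIA's 144 GB GH200 figure.
# Assumption: the GH200's GPU keeps six HBM stack placements, as on H100.

stack_capacity_gb = 24   # projected HBM3E stack capacity (TrendForce)
stacks_per_gpu = 6       # assumed number of HBM sites on the GPU

total_gb = stack_capacity_gb * stacks_per_gpu
print(f"{stacks_per_gpu} x {stack_capacity_gb} GB = {total_gb} GB of HBM3E")
# -> 6 x 24 GB = 144 GB, matching the GH200 capacity cited above.
```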

TrendForce HBM Market Projections (Source: TrendForce)

Meanwhile, SK hynix also confirmed that its HBM3E stacks are set to use its Advanced Mass Reflow Molded Underfill (MR-MUF) technology, which improves their heat dissipation by 10%. But thermals are not the only benefit MR-MUF can provide. MR-MUF employs an improved underfill between layers, which both improves heat transfer and reduces the thickness of the stack, in turn allowing the construction of 12-Hi HBM stacks that are only as tall as 8-Hi modules. This does not automatically mean that we are dealing with 12-Hi HBM3E stacks here, of course.

At present, SK hynix is the only high-volume manufacturer of HBM3 memory, giving the company a very lucrative position, especially with the explosion in demand for NVIDIA’s H100 and other accelerators for generative AI. And while the development of HBM3E is meant to help SK hynix keep that lead, they will not be the only memory vendor offering faster HBM next year. Micron also threw their hat into the ring last month, and where those two companies are, Samsung is never too far behind. In fact, all three companies seem to be outpacing JEDEC, the organization responsible for standardizing DRAM technologies and various memory interfaces, as that group has still not published finalized specifications for the new memory.