SK Hynix delivers 24GB HBM3 chips to customers with industry-first 12-layer tech
SK Hynix’s new 24GB HBM3 delivers insane memory densities to customers
Using their “Industry First” 12-layer technology, SK Hynix has created 24GB HBM3 memory modules, offering customers a 50% increase in memory density over their prior HBM3 chips.
Thanks to the company’s new Advanced Mass Reflow Molded Underfill (MR-MUF) and Through Silicon Via (TSV) technologies, SK Hynix has reduced the thickness of a single DRAM chip by 40%, allowing them to achieve the same stack height with their new 24GB products as their 16GB product. This enables easy product integration for chipmakers, and allows manufacturers to quickly adopt these new DRAM modules when they become available.
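The stack-height claim can be sanity-checked with some quick arithmetic. As a sketch, we assume the prior 16GB HBM3 part is an 8-layer (8-Hi) stack; the layer counts are our assumption, while the 40% thickness reduction comes from SK Hynix’s announcement.

```python
# Normalised thickness of one DRAM die in the old 16GB stack
OLD_DIE_THICKNESS = 1.0
# SK Hynix quotes a 40% reduction in single-die thickness
NEW_DIE_THICKNESS = OLD_DIE_THICKNESS * (1 - 0.40)

# Assumed layer counts: 8-layer 16GB stack vs 12-layer 24GB stack
old_stack_height = 8 * OLD_DIE_THICKNESS    # 8.0
new_stack_height = 12 * NEW_DIE_THICKNESS   # 7.2

# Twelve thinner dies stack no taller than eight of the old ones,
# which is how the 24GB package keeps the 16GB product's height.
print(new_stack_height <= old_stack_height)  # True
```

Under these assumptions, twelve thinned dies come out at roughly 7.2 old-die heights, comfortably inside the original 8-layer budget.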
SK Hynix has already sampled their 24GB memory chips to customers, who are now evaluating these chips for use with future products. Nvidia currently uses five 16GB HBM3 chips to create their Hopper H100 Tensor Core GPU, giving it 80GB of total memory. With SK Hynix’s new 24GB modules, Nvidia could create GPUs with 120GB of memory, and if Nvidia taps into the full potential of their Hopper silicon, they could create GPUs with six HBM3 chips and 144GB of HBM3 memory.
By enabling the creation of AI processors with larger memory pools, SK Hynix’s latest HBM3 memory chips have the potential to transform the AI market by enabling the use of larger data sets with higher levels of performance. With this in mind, we expect a lot of companies to utilise these new memory chips with their latest AI processors.
SK Hynix has confirmed that their new 24GB memory chips will enter mass production in the first half of 2023, which means that we will have to wait a while before we can expect to see these new chips within any new AI accelerators, GPUs, FPGAs, or processors.