SK Hynix unveils the world’s first 48GB HBM3E modules
SK Hynix reveals insane memory densities with their new 16-high HBM3E memory stacks
SK Hynix is ready to power the next wave of AI advancement. While companies like Nvidia are seen at the forefront of the AI race, memory manufacturers have a huge role to play. Memory bandwidth and memory capacity are two of the biggest limiters of AI performance, and SK Hynix is tackling both with its new 16-high 48GB HBM3E memory stacks.
By creating 16-high HBM3E memory stacks, SK Hynix can now deliver 48GB of memory per module. That’s two RTX 4090s’ worth of memory on a single HBM stack! Samples of these new memory modules will be provided to customers in early 2025. Furthermore, these new stacks reportedly deliver 18% higher AI training performance and 32% higher AI inference performance than 12-high HBM3E memory stacks.
Current 12-high HBM3E memory stacks top out at 36GB per stack, so alongside the performance gains, SK Hynix’s new 16-high stacks deliver 12GB of additional memory per stack. That extra capacity can significantly increase the total memory of AI accelerators, allowing them to work on larger, more advanced workloads.
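As a quick back-of-the-envelope check on those figures (assuming each DRAM die in the stack is 3GB, which follows from 36GB spread across 12 layers):

$$12 \times 3\,\text{GB} = 36\,\text{GB}, \qquad 16 \times 3\,\text{GB} = 48\,\text{GB}, \qquad 48\,\text{GB} - 36\,\text{GB} = 12\,\text{GB extra per stack.}$$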
(Image from SK Hynix)
Originally, 16-high HBM memory stacks were expected to arrive with HBM4. SK Hynix has brought 16-high stacking to HBM3E to demonstrate technological leadership over its competition and to stabilise the technology ahead of HBM4’s launch. This is good news for customers and SK Hynix alike. After all, customers want faster (and higher capacity) memory, and SK Hynix wants to sell more premium products.
You can join the discussion on SK Hynix’s 48GB HBM3E memory modules on the OC3D Forums.