HBM3: Cheaper, Up To 64GB On-Package, And Terabytes-Per-Second Bandwidth

Megalith

Here is some talk about the next iteration of High Bandwidth Memory, which Samsung and Hynix are working on. HBM is basically a stacked RAM configuration, meaning space savings and bandwidth improvements. The third generation is expected not only to increase the density of the memory dies but also to allow more of them to be stacked on a single chip.

HBM1, as used in AMD's Fury graphics cards, was limited to 4GB of memory. HBM2, as used in Nvidia's workstation-only P100 graphics card, features higher-density stacks and up to 16GB of memory, but is prohibitively expensive for consumer cards. HBM3 will double the density of the individual memory dies from 8Gb to 16Gb (~2GB), and will allow more than eight dies to be stacked together in a single chip. Graphics cards with up to 64GB of memory are possible. HBM3 will also feature a lower core voltage than HBM2's 1.2V, as well as more than two times the peak bandwidth: HBM2 offers 256GB/s of bandwidth per layer of DRAM, while HBM3 doubles that to 512GB/s. The total amount of memory bandwidth available could well be terabytes per second.
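A rough back-of-envelope check of those numbers: the die density, stack height, and per-stack bandwidth below come from the summary, while the four-stack package is an assumption (mirroring how the Fury and P100 boards were built), not something the article states.

Code:
# Back-of-envelope HBM3 math using the figures quoted above.
# Assumption: a four-stack package, as on Fury (HBM1) and P100 (HBM2).

DIE_DENSITY_GBIT = 16      # HBM3 die: 16Gb (~2GB), double HBM2's 8Gb
DIES_PER_STACK = 8         # "more than eight" are possible; eight used here
STACKS_PER_PACKAGE = 4     # assumed stack count (same as Fury / P100)
BW_PER_STACK_GBPS = 512    # HBM3 per-stack bandwidth, double HBM2's 256GB/s

stack_capacity_gb = DIE_DENSITY_GBIT * DIES_PER_STACK / 8   # Gb -> GB
package_capacity_gb = stack_capacity_gb * STACKS_PER_PACKAGE
total_bandwidth_tb_s = BW_PER_STACK_GBPS * STACKS_PER_PACKAGE / 1024

print(f"Per-stack capacity: {stack_capacity_gb:.0f} GB")      # 16 GB
print(f"Package capacity:   {package_capacity_gb:.0f} GB")    # 64 GB
print(f"Total bandwidth:    {total_bandwidth_tb_s:.1f} TB/s") # 2.0 TB/s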
 
So if HBM2 is prohibitively expensive for consumer cards, does that mean Volta and Vega will both still be on GDDR5x at best?
 
The summary/article is worded incorrectly. "per layer" should be "per stack" (i.e. a device).
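For reference, the per-stack figure falls straight out of the interface width and pin speed. A rough sketch of that arithmetic; the 1024-bit bus and ~2Gb/s-per-pin rate are standard HBM2 figures assumed here, not taken from the article:

Code:
# Where HBM2's 256GB/s per *stack* comes from: each stack exposes a
# 1024-bit interface running at roughly 2Gb/s per pin.
bus_width_bits = 1024   # pins per HBM2 stack
pin_rate_gbps = 2       # Gb/s per pin

per_stack_gb_s = bus_width_bits * pin_rate_gbps / 8  # bits -> bytes
print(f"Per-stack bandwidth: {per_stack_gb_s:.0f} GB/s")  # 256 GB/s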

I wonder what the sweet spot will be for on-package memory, since it's almost certain it won't match full system memory size due to cost. In the Intel forum I noted that the way Crystal Well changed in the Skylake-R models is somewhat similar to the goals of HBM.
 
So if HBM2 is prohibitively expensive for consumer cards, does that mean Volta and Vega will both still be on GDDR5x at best?
I am under the impression HBM2 is only so expensive because Nvidia put out a product before yields and production had ramped up. I would be very surprised if there weren't $400-500 cards with HBM2 by this time next year.
 
So if HBM2 is prohibitively expensive for consumer cards, does that mean Volta and Vega will both still be on GDDR5x at best?

From Nvidia, only GV100 (Tesla) will use HBM. For AMD, an HPC card would use HBM.

But it doesn't matter to a gamer if there is no benefit. GM200 and GP104, for example, didn't need HBM to beat an HBM card.
 
Well I don't want to wait another 4 years for this stuff so I'll be fine with 16 GB HBM2 VRAM :)

 
I was under the impression that the power requirements of these dies were increasing exponentially with capacity and bandwidth? I saw a slide somewhere.
 