GDDR6 Memory Costs 70% More than GDDR5

Megalith

Pricing lists from two major electronics component distributors show that GDDR6 memory carries a whopping premium over GDDR5: “14 Gbps GDDR6 memory chips from Micron Technology cost over 70 percent more than common 8 Gbps GDDR5 chips of the same density, from the same manufacturer.” TechPowerUp suggests that graphics card manufacturers can save around $22 per card by using six GDDR5 chips instead of six GDDR6 chips.

Although GDDR6 is available in marginally cheaper 13 Gbps and 12 Gbps trims, NVIDIA has only been sourcing 14 Gbps chips. Even the company's upcoming RTX 2060 performance-segment graphics card is rumored to implement 14 Gbps chips in variants that feature GDDR6. The sheer disparity in pricing between GDDR6 and GDDR5 could explain why NVIDIA is developing cheaper GDDR5 variants of the RTX 2060.
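For a rough sense of the per-chip dollar amounts behind those percentages, here is a quick back-of-the-envelope sketch in Python. It uses only the two figures above (the ~70 percent premium and the ~$22 savings across six chips); the chip prices it prints are implications of those figures, not quotes from the distributors' lists.

```python
# Back-of-the-envelope check of the two reported figures: a ~70%
# per-chip premium for 14 Gbps GDDR6 over 8 Gbps GDDR5, and ~$22
# saved per card by swapping six GDDR6 chips for six GDDR5 chips.

PREMIUM = 0.70           # GDDR6 price premium over same-density GDDR5
SAVINGS_PER_CARD = 22.0  # dollars saved per card (TechPowerUp estimate)
CHIPS_PER_CARD = 6       # six 8 Gb chips -> a 6 GB card

delta_per_chip = SAVINGS_PER_CARD / CHIPS_PER_CARD  # GDDR6 minus GDDR5
gddr5_chip = delta_per_chip / PREMIUM               # since delta = 0.70 * GDDR5
gddr6_chip = gddr5_chip * (1 + PREMIUM)

print(f"implied GDDR5 chip price: ${gddr5_chip:.2f}")  # ~$5.24
print(f"implied GDDR6 chip price: ${gddr6_chip:.2f}")  # ~$8.90
print(f"six-chip difference:      ${delta_per_chip * CHIPS_PER_CARD:.2f}")
```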
 
It would help if you included how much RAM costs per board.

8 GB of GDDR5 costs about 50-60 bucks. That puts the 70% markup of GDDR6 at about 100 bucks, so 8 GB of GDDR6 would be like buying almost 14 GB of GDDR5 for the same money. Though I'm guessing TechPowerUp is suggesting some kind of striped setup (think RAID 0) for the GDDR5 RAM? Otherwise I'm not sure how six GDDR5 chips would make up for the 75% increase in per-pin bandwidth (14 Gbps vs. 8 Gbps) that GDDR6 provides.

source: https://www.gamersnexus.net/guides/3032-vega-56-cost-of-hbm2-and-necessity-to-use-it
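As a minimal sketch of the arithmetic in that post, assuming the poster's $50-60 estimate for 8 GB of GDDR5 (take the $55 midpoint; it is an estimate, not a distributor quote) and the article's 70 percent premium, the numbers work out as below. It also shows why the per-pin bandwidth gap is about 75 percent rather than 200 percent.

```python
# Cost vs. bandwidth sketch for the post above. The $55 figure is the
# midpoint of the poster's own $50-60 estimate for 8 GB of GDDR5.

GDDR5_GBPS = 8.0        # per-pin data rate, common GDDR5
GDDR6_GBPS = 14.0       # per-pin data rate of the chips NVIDIA sources
GDDR5_8GB_COST = 55.0   # assumed cost of 8 GB of GDDR5 (poster's estimate)
PREMIUM = 0.70

gddr6_8gb_cost = GDDR5_8GB_COST * (1 + PREMIUM)
bandwidth_gain = GDDR6_GBPS / GDDR5_GBPS - 1.0

print(f"8 GB GDDR6 estimate: ${gddr6_8gb_cost:.0f}")  # ~$94
print(f"bandwidth gain:      {bandwidth_gain:.0%}")   # 75%, not 200%

# Note: no RAID-0-style trick is needed; GDDR chips already run in
# parallel, each driving a 32-bit slice of the memory bus. Matching
# GDDR6 bandwidth with GDDR5 means a wider bus or faster chips.
```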
 
I expect this premium to drop once GDDR6 supply outstrips GDDR5 supply. Adoption hasn't been wonderful for GDDR6, with many of the currently marketed cards selling somewhat poorly.
 
Is anyone surprised? When memory makers collude to keep prices artificially inflated, shit like this happens.
 
So the price/performance ratio is pretty good.

Jen was right, the more you buy, the more you save. o_O
 
Is HBM ever going to go mainstream???
If it drops in price, sure, why not?
Otherwise... hmm
But it is getting used for the AI junk everyone is buzzing about, which will soon ruin our lives for the next 20 years or so until we discover it's all a bunch of bullshit.
 
Honestly, the 2000 series from NVIDIA is just too expensive to manufacture. You have that massive die on top of an upcharge for using workstation-class compute cores, then you have the upcharge for GDDR6 RAM; you really aren't going to get cheap cards until quite a few stars align. Also, who said they sold poorly? Despite the price they seem to be doing alright, and certain lines are sold out pretty consistently. I would say they probably aren't ending up in the hands of gamers, though; I think people are using them as cheap workstation compute cards, and you probably have those banking on mining being able to leverage those cores as well.
 
By the time HBM goes mainstream, persistent non-volatile memory will be a thing.
 
All in all it seems like a bad decision for the market segment these were intended for. As for mining, it is basically dead and gone; I doubt NV made this decision based on the mining craze, but who knows. I have a feeling the pricing is set high simply because they can, as there's no competition from AMD in the performance segment of any RTX card.
 
With a 70% increase over GDDR5, does anyone really believe the rumors AMD intends to launch a Navi 3080 with 8GB of GDDR6 @ $250? $399 maybe, but $250 just seems too good to be true.
 
More likely GDDR5X running at some higher speeds. I wouldn't expect to see any GDDR6 in anything selling for under $600, but I would happily be wrong on this one.
 
Honestly, I seriously doubt they are stupid enough to deliberately piss off gamers, who are known for being ravenous fans.

Like I said, I truly believe most of the price jump is because they are afraid to cannibalize their workstation lines. Most people don't remember the revenue they lost with the 9800 GX2; colleges in particular were buying them up by the scoop for research at $650 for the EVGA SSC versions, versus paying thousands per Tesla card.

Combine that with GDDR6 price gouging, a massive die, and the R&D for the new reference cooler design, and the card is expensive to produce.

I will seriously give them the benefit of the doubt here, the same as I'd give it to AMD.
 
I don't suppose anyone has numbers for what the GDDR4 to GDDR5 changeover looked like at the same point. Just wondering if this is another "see, greedy corporations" story, or if the foundry process for GDDR6 really is just more costly. Could easily be option A (investors do pay companies to be exactly that, after all), or it could be option B.
 
The RTX 2060 comes out next week for $349 and it uses 6GB of GDDR6.
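Combining that $349 MSRP with the chip prices implied by the distributor figures earlier in the thread gives a rough idea of how much of the card is memory cost; this is a sketch built on those derived estimates, not an actual bill of materials.

```python
# Rough memory share of the RTX 2060's $349 MSRP, using the chip
# prices implied earlier in the thread (derived estimates, not quotes).

GDDR6_CHIP = 8.90   # implied 14 Gbps 8 Gb GDDR6 chip price
GDDR5_CHIP = 5.24   # implied 8 Gbps GDDR5 chip price, same density
CHIPS = 6           # 6 GB card
MSRP = 349.0

memory_cost = CHIPS * GDDR6_CHIP
savings_with_gddr5 = CHIPS * (GDDR6_CHIP - GDDR5_CHIP)

print(f"GDDR6 cost: ${memory_cost:.2f} ({memory_cost / MSRP:.1%} of MSRP)")
print(f"a GDDR5 variant would shave about ${savings_with_gddr5:.2f}")
```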
 
Never been happier to be wrong :D The reviews actually show it doing a good job for the price point as well, so way to not fuck up, nVidia. Keep trying!
 
Doesn't it pretty much sound exactly like a bad decision? They pissed off gamers and everyone else who needs a video card, and they didn't deliver with the RTX thing. IMHO their AI push should have stayed on a separate line of cards, and Tensor cores shouldn't have come to a consumer line given the steep cost. Like everyone joked about ray tracing, the intent is that you'll be running 1080p with a $1,400 video card. This isn't 2009 anymore. So anyhow, while it is a faster card, it is clear that very few feel it is worth the price hike. For most folks, now that the 1080 Ti is gone, it is the only option for getting a new higher-end card (also considering that the remaining 1080 Tis are highly overpriced as well).
 
Higher bandwidth at the same density likely means more transistors and more silicon, which means more expensive.
 
Of course it does.

Everything costs more.

Memory cartels control the prices around here.
 
Before I start, I will not excuse NVIDIA's marketing team and tactics, which are complete garbage. But...

In short, no. Ray tracing is the next progression of rendering systems: sprites -> vectors -> rasterization -> ray tracing. If you want to move the industry onto newer technologies, at some point you have to make a transition. I will agree RTX was an extremely calculated risk; AMD was nowhere in sight except at the mainstream and budget tiers. Did RTX "fail"? No, it really didn't; it did exactly what it was supposed to, and NVIDIA did deliver. Game developers didn't deliver, and that is the problem. Just like DX10 and 64-bit Vista, you will probably have to wait a year or two until people start using it and it becomes a bit more commonplace in the industry. This is nothing new at all; if you have been around the industry for any length of time, this exact scenario has played out a dozen times over. If you want to move the industry, you move it as a whole, not segmented or fractured. Ten years from now we will still be playing AAA rasterized games, because studios like doing the least amount of work to maximize profits, and people want GTX and RTX to coexist, which will drive the price up on both because now they need two separate product segments...

Battlefield was not a best-case use scenario for RTX, and like tessellation, they were overusing it.

...and BTW, people are running the updated version of Battlefield now, which is way more optimized, and they are getting 1440p at 60+ frames on Ultra with tweaked settings, so that argument is going in the garbage right now.

I am going to reiterate: this was marketed at prosumers, not consumers. If you are going to bitch about price, then you probably aren't the intended segment for the product. This is something completely new, and it's expensive: a massive die with expensive DRAM and workstation compute clusters.

I am sorry for the people that didn't get exactly what they want, which is probably 8K HDR 240 Hz adaptive-sync gaming for $300... real life doesn't work that way.
 
I totally agree that ray tracing is the next step, but BFV only uses it for reflections, so it's really a far cry from being a ray-traced game. You can barely tell the difference in IQ. 1080p/1440p @ 60 is really a performance segment meant for 1060/1070 cards, not a flagship for $1,400, so the argument still stands. Same goes for Metro, where they use it for some effects. Basically my issue is that it's way too early for this to be on the consumer market, especially given the price premium and a market that has hardly recovered from the mining price hikes. I personally feel RTX should have stayed on the professional line of cards or been a separate segment, though it is clear NV would need a separate production line for that and likely wouldn't get enough sales, so we have RTX. I have 1080 Tis in SLI with a 4K HDR monitor, so I am very much in the segment they market this to, and yet I don't feel it. The 2070 and 2060 are also well outside their normal segments, and those are not prosumer cards, so it is not about bitching but more a feeling that NV has the market by the balls, with products that are too expensive and don't quite deliver what was promised. Given how it rolled out, it also doesn't appear too many devs are even trying to push RTX features in their games. I feel about it kind of the same way I feel about VR: it's cool tech, but its overall cost is a problem for quicker adoption, and slow adoption doesn't exactly look appealing for game developers to embrace it. Almost a chicken-and-egg thing. I don't see you running a 2080 Ti, so I'm not sure why you are so defensive of NV.
 
I agree to an extent, but the ball has to get rolling at some point; slow adoption and the long game is what I believe NVIDIA is banking on here. Start with small things like reflections and work your way up as power increases. Also, NVIDIA isn't targeting us currently; they have come out openly and stated they aren't targeting previous-gen customers but customers that are a couple of generations or more behind, and that you really shouldn't be jumping on every iteration. We know that isn't always the case, though. You currently have $1,400 worth of GPUs in SLI; there is zero reason for you to upgrade.

We can both debate what NVIDIA should have done, but the fact of the matter is that neither you nor I own the company. NVIDIA caters to prosumers more than anyone, and prosumers don't care about price, only progress.

I am waiting for 7nm EUV/UVL before I invest money in a personal rig GPU. Professionally I'm up to date and current, but let's not get into that, please.

I will back NVIDIA up on this because I believe they are making the right move: progress. People seem to only want the same old same old, which is just more frame rates and more resolution, but you have to realize that at 8K you are going to top out, and then what? You've beaten the human eye. Personally I'd rather have a fully ray-traced rendering system chasing the uncanny valley than a flawed raster approach. I think the only people that are upset are the benchmark chasers that just don't see value in changing things. An example would be having a 450 hp V8 or a 450 tq electric car... one is the same old same old, the other roasts supercars in a quarter-mile drag race and is an SUV. What good is 1080p 240 Hz, or 4K-8K, with no context?

I mean, it's your preference and your money, but the hate train just needs to stop already. New tech is expensive, always has been, and part of being a prosumer is being the guinea pig; you should know that. Like I said before, this is absolutely nothing new, and every single time, forums act like it's the end of the world.
 
I don't think it's a hate train but rather severe disappointment that many gamers simply cannot afford the new generation of graphics cards, which is pretty much where my sentiment is. Sure, each of the new cards is a lot faster than the previous tier of the same name, and the 2080 Ti is the de facto fastest single card; I just feel they screwed up the pricing and priced them out of most people's reach. Folks like us will just wait an iteration or two, others will go to old-gen and used cards, so only those building new high-end rigs have a reason to buy into the new GPU line, not because of prices or performance, but simply because it logically makes sense. I do agree that someone had to do it first; I'm just not sure the new pricing scheme, coming after a year of inflated prices due to mining that prevented many gamers from being able to afford an upgrade, is a good move, nor does it show much care for gamers. I'm sure this lineup does cost them more to make, and it's all business as we know, but since nobody else can light a competitive fire under their asses, they no doubt are exploiting it. Personally, I don't back any company, as they only answer to their shareholders and have to deliver more profit each year. I simply go for the best-performing, most stable solution, and it just so happens NV has had that market in their pocket for some time.
 
It's business. AMD is nowhere to be found other than mainstream and budget; I mean seriously, two years late they have a card that matches the 1080 Ti, and AMD cards almost never sell for MSRP due to supply. On a side note, to be fair, the monitor market is worse than graphics cards. Other than that, they will be discounting the cards at some point. I get that people are pissed about the price, but buy it or don't; you are either going to save a little longer for it, as there is no real alternative, or take a step backwards.
 