NVIDIA GeForce RTX 3090 Ampere GPU to sport 4,992 CUDA cores and 12 GB of 18 Gbps VRAM

14 Gbps GDDR6 is $11.65 per GB, so the 2080 Ti has about $128.15 worth of GDDR6 on it. 22 GB would be double that. WAY more.

Edit: Fixing numbers.
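As a side note, the arithmetic behind those figures is just price per GB times capacity; a minimal sketch, assuming the quoted $11.65/GB spot price and 11 GB on the 2080 Ti (both figures are the ones cited above, not confirmed BOM data):

```python
# Back-of-the-envelope VRAM cost, using only the figures quoted above.
# Assumptions: $11.65/GB for 14 Gbps GDDR6 and 11 GB on a 2080 Ti.
PRICE_PER_GB = 11.65  # USD

def vram_cost(capacity_gb: float, price_per_gb: float = PRICE_PER_GB) -> float:
    """Naive memory cost: capacity times per-GB price."""
    return capacity_gb * price_per_gb

print(f"2080 Ti, 11 GB: ${vram_cost(11):.2f}")  # ~$128.15
print(f"Doubled, 22 GB: ${vram_cost(22):.2f}")  # ~$256.30
```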

Are those bulk prices though? NVidia probably gets a massive discount.
 

Not bulk. Though even with bulk prices you're likely looking at $70-$100 worth of memory, depending on how big the discount is. Doubling that would still be a huge cost increase for no real practical purpose.
 

You think they pay even that much?

One would think they pay far less than that because they're building millions of units? But I haven't a clue.
 
We need some more 4K 144Hz/240Hz monitors or 1440p ultrawides to handle these cards. A 2080Ti is more than enough for 1440p unless you run really heavy ray tracing.
 
You think they pay even that much?

One would think they pay far less than that because they're building millions of units? But I haven't a clue.

$70 is around a 45% discount. It's possible they get higher than that, but I'm not sure. Most estimates I've seen put bulk discounts at 20 to 40%.
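For context, the implied discount falls straight out of the numbers above; a quick sketch, assuming the ~$128.15 spot-price figure quoted earlier in the thread:

```python
# Implied bulk discount if the memory lists at ~$128.15 (11 GB x $11.65/GB, as quoted above).
list_price = 11 * 11.65  # ~$128.15

for bulk_cost in (70, 100):
    discount = 1 - bulk_cost / list_price
    print(f"${bulk_cost} paid implies roughly a {discount:.0%} discount")
# $70 -> ~45% off, $100 -> ~22% off, which brackets the 20-40% bulk estimates mentioned.
```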
 
Not bulk. Though even with bulk prices you're likely looking at $70-$100 worth of memory, depending on how big the discount is. Doubling that would still be a huge cost increase for no real practical purpose.

I wouldn't be surprised if Nvidia's cost is 20-30% of market value. They and AMD combined make up millions of unit sales, so they probably have massive discounts. As for the need, with 4K taking off and next-generation consoles, I'm sure we'll easily be able to hit 11 GB of VRAM and more in the next few years. So having 22 GB may be a bit overkill, but I'd rather have more than less.
 

I just don't get why halo consumer cards can't start shipping with 16 or 20GB of RAM. I can't imagine how much faster my DaVinci editing would be with more VRAM. I mean, 16GB system RAM is the new de facto standard. Let's get that VRAM up there too.
 
Consumer cards don't need it. If you want/need more vRAM there are already consumer options. Radeon VII namely. It's more or less a cut down Instinct card already.
 
I wouldn't be surprised if Nvidia's cost is 20-30% of market value. They and AMD combined make up millions of unit sales, so they probably have massive discounts. As for the need, with 4K taking off and next-generation consoles, I'm sure we'll easily be able to hit 11 GB of VRAM and more in the next few years. So having 22 GB may be a bit overkill, but I'd rather have more than less.

That seems like a stretch. No one is buying these individual GDDR6 chips that I know of, so listed prices are probably wholesale cost already, or possibly even manufacturing cost. These are not Radio Shack retail prices.

It took a very long time to actually need 8 GB (Doom Eternal was the first, and even then the difference between settings is minor). The main reason was getting rid of "Mega textures" in idtech7. There is just not enough need for it beyond that.

Even comparing the 2080 to the 1080ti, the 2080 has only increased its performance gap since launch. Many said the 1080ti would be more "future proof".
 

And so far the 1080ti is very future proof. There isn't any game at 1440p that the 1080ti can't handle. Now when it comes to 4k the king is still the 2080ti.
 
The 5700xt isn't equivalent to the 2070 super though. It has a pretty significant feature deficit. So you're not getting the "same" thing for less money.

The 2070 super is a response to the 5700xt though, and the $399 launch of the 5700xt is a response to the super. Thinking one company is going to be the savior of PC hardware prices is silly; we need them to go tit for tat, and that's where the consumer really benefits.
 
And so far the 1080ti is very future proof. There isn't any game at 1440p that the 1080ti can't handle. Now when it comes to 4k the king is still the 2080ti.

Control, where it loses to a 2060 Super. The point is that the 2080 proved to be a better purchase upon release, as it was a similar price to the 1080ti back then. 8 GB was enough after all.
 

Sure if you enable RTX. Then I expect RTX cards that specialize in Ray Tracing to take the lead.
 
And so far the 1080ti is very future proof. There isn't any game at 1440p that the 1080ti can't handle. Now when it comes to 4k the king is still the 2080ti.

At 60hz. If you want to push 144, or more, at 1440p the 1080ti isn't going to cut it in every game without making visual sacrifices.
 

Exactly. And this holds true with the 2060 super, and the 2070/2080 super.

Just because 1 game shows the 1080ti not doing well doesn't prove it's not future proof at all. If anything it shows there is something wrong with the drivers? And besides, who the hell plays Control? I never even heard of the game.

And who the hell buys a 1080ti to play at 1080p for 60hz? You are more likely to be CPU bound than anything else.

[Doom Eternal benchmark chart from Hardware Unboxed]


If you want to look at future proofing, here are the newest benchmarks for Doom Eternal (taken from Hardware Unboxed). You can see that the 1080ti, which is over 3 years old, is still chugging along just fine, while the 2060 super is slower than the $100 cheaper 5700XT....

If anything, Doom Eternal shows how future proof the 1080ti is, considering it is 3 years old.
 

There's no such thing as future proofing.
 
If one game cannot prove something isn't "future proof" then one game cannot prove that it is.

Exactly. Which is why, if you look across all modern games, it shows how future proof the 1080ti was! With the only exception being a game called Control.

Thank you for proving my point!
 

Games having mostly stalled in graphical jumps is not proof of "future proofing", it's luck. I simply do not believe in future proofing when it comes to hardware. You might get lucky and something will be viable for years, or you will get unlucky and your fancy card will end up only being top-end for a year or two. It has nothing to do with "future proofing" and is entirely up to how the market goes.

Also, it's more than one game. There are games where my 2080 ti struggles to reach 144 fps at 1440p without lowering settings. If you're only playing at 1440p/60 the 1080ti is great; if you want to push 1440p/144 you will be losing visual fidelity.
 
Get a card with more RAM? Why would gamers have to pay for your RAM?

If we do not communicate our wants to these companies, we will be forced to settle for what they decide is acceptable for us.

If the standard high end became 16 or 24 GB industry-wide, we're not paying extra. We're not paying 10x more for 16GB of RAM than we were paying for 8GB of RAM 7 years ago.

SSDs are now cheaper per GB than HDDs were at their peak. VRAM is marked up so much on these cards it's ridiculous.

That is not name calling or flaming. It's my opinion and a general statement, not aimed at any one member.

And why does a GPU literally have to be ONLY for gamers? Gamers buy 1650tis and rx580s. Enthusiast video encoders and performance junkies who do gaming in addition to rendering and encoding buy 2080tis on average. GPUs are not only for games.

It turns out, a real stat too, that the average global gamer uses 1080p, and 4K is much rarer than you think. No one buys a 2080ti for 1080p gaming unless they're uninformed, lied to, or don't give af.

So to stay on topic, I hope the 3080ti has 16 to 24GB of frame buffer, as we need to offer more for the segment, and that's my opinion.
 
Sure if you enable RTX. Then I expect RTX cards that specialize in Ray Tracing to take the lead.

Both Control and Red Dead 2 can't hit 60 fps at 1440p with the 1080ti even without RTX. Also, Doom Eternal is about as well-optimized a game as there is.

The lack of driver optimization will hurt the 1080ti going forward more than the vram deficit of the 2080.
 
I just don't get why halo consumer cards can't start shipping with 16 or 20GB of RAM. I can't imagine how much faster my DaVinci editing would be with more VRAM. I mean, 16GB system RAM is the new de facto standard. Let's get that VRAM up there too.
The Radeon VII has 16GB of VRAM, and if you have workloads that are at the point of requiring that, or more, then it might be time to start looking at professional/workstation GPUs.
While the VRAM ceiling always rises a bit each generation, one has to pay to play early with higher VRAM amounts.
 
Wonder if they will be using more hybrid designs like the Xbox Series X uses. Maybe 12 GB running at the full 384 bit and another 6 GB running at 192 bit for a total of 18 GB (half of the memory double stacked). This could be useful for mainstream cards as well, running 6 GB at a faster 192 bit and then a few more GB running slower where it is again double stacked.
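To put rough numbers on what a split bus like that would mean, here's a minimal sketch; the 14 Gbps per-pin rate is only an assumed example, and the 384-bit/192-bit split is the hypothetical from the post above:

```python
# Peak GDDR6 bandwidth for a hypothetical split-bus layout.
# bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8
DATA_RATE_GBPS = 14  # per pin, assumed for illustration

def peak_bandwidth(bus_width_bits: int, data_rate_gbps: float = DATA_RATE_GBPS) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

print(f"12 GB @ 384-bit: {peak_bandwidth(384):.0f} GB/s")  # 672 GB/s
print(f" 6 GB @ 192-bit: {peak_bandwidth(192):.0f} GB/s")  # 336 GB/s
```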
 
Wasn’t [H] finding VRAM issues when RT was on in BFV?

I guess the only real way to tell is if you had a Titan RTX and 2080ti....

I wish that article was still available. It was probably true for the RTX 2060 only, and that may have been pre-patch as well, when RTX performance changed greatly in BFV. RTX seems to use 1 GB rather consistently, even for path tracing (Quake 2). Shadow of the TR uses way less it seems, but takes a 2 GB+ hit on system ram.
 
Wonder if they will be using more hybrid designs like the Xbox Series X uses. Maybe 12 GB running at the full 384 bit and another 6 GB running at 192 bit for a total of 18 GB (half of the memory double stacked). This could be useful for mainstream cards as well, running 6 GB at a faster 192 bit and then a few more GB running slower where it is again double stacked.
Not after the GTX 970's delayed backlash ended such practices. While not 100% identical, it was somewhat similar. People will bitch about theoretically "missing" GBps rather than actual game performance and benchmarks.
A long time ago, it wasn't uncommon to have such setups (largely for 192-bit and 96-bit setups to get "normal" amounts of RAM, e.g., 512MB, 1GB, 2GB, etc.). Was it a majority of cards? Never. Not even in the low end (where such practices were implemented), but it did happen.
 
That seems like a stretch. No one is buying these individual GDDR6 chips that I know of, so listed prices are probably wholesale cost already, or possibly even manufacturing cost. These are not Radio Shack retail prices.

It took a very long time to actually need 8 GB (Doom Eternal was the first, and even then the difference between settings is minor). The main reason was getting rid of "Mega textures" in idtech7. There is just not enough need for it beyond that.

Even comparing the 2080 to the 1080ti, the 2080 has only increased its performance gap since launch. Many said the 1080ti would be more "future proof".
I disagree.... I have played quite a few games that used over 7GB of VRAM years before Doom Eternal (it was handy even with my old rx580): RE2, DE Mankind Dev, The Div 1 and 2, Wolf and older Doom, just to name a few. I agree with Tango, bring on the default 16GB cards :)
 
Wonder if they will be using more hybrid designs like the Xbox Series X uses. Maybe 12 GB running at the full 384 bit and another 6 GB running at 192 bit for a total of 18 GB (half of the memory double stacked). This could be useful for mainstream cards as well, running 6 GB at a faster 192 bit and then a few more GB running slower where it is again double stacked.
That'd be terrible, it'd be the 1070 debacle again.
 
Yes, Deus Ex MD on max settings can use up 12GB of VRAM (at least it looks that way, maybe more if the GPU supported it). RE2 as well.
 
Sure it uses it, but doesn't need it. Just because a game loads the entire level/world into memory doesn't mean it is using it all at all times. Sure, you might hit a hitch here or there if you do something completely unexpected, but for normal gaming, running it with an 8GB card wouldn't change much. Games that use stupid amounts of memory are just poor coding. A game like GoW 5 looks incredible and only takes 6GB with the ultra texture pack running 3440x1440. People just get hung up on VRAM usage too much. The only real instance where memory matters is people modding games to the moon or having an actual workload to use it.
 
Not after the GTX 970's delayed backlash ended such practices. While not 100% identical, it was somewhat similar. People will bitch about theoretically "missing" GBps rather than actual game performance and benchmarks.
A long time ago, it wasn't uncommon to have such setups (largely for 192-bit and 96-bit setups to get "normal" amounts of RAM, e.g., 512MB, 1GB, 2GB, etc.). Was it a majority of cards? Never. Not even in the low end (where such practices were implemented), but it did happen.

The missing 0.5 GB on the GTX 970 was about as fast as system ram, so WAY different. It was only useful as texture cache. I doubt MS would have wasted resources running a 320 bit bus if there were no benefits over running a 256 bit bus.

If you want performance AND capacity (without going up to 22/24 GB), this is the only way I see it happening with GDDR6.
 
Once again, the key word was "used", not "needed". The only thing this does for you is maybe save a little system ram, but most have plenty of that.

Even with AMD's poor memory controller, none of the Division, Deus Ex, or CoD games play better on an 8 GB 5500 compared to a 6 GB 5600.
 
You think they pay even that much?

One would think they pay far less than that because they're building millions of units? But I haven't a clue.

$70 is around a 45% discount. It's possible they get higher than that, but I'm not sure. Most estimates I've seen put bulk discounts at 20 to 40%.

I just looked up some 8 GHz GDDR6 on Digi-Key for curiosity's sake. These appear to be 512 Mb ICs vs, say, the 1Gb on a 2080 Ti, so no idea how that would impact the price, but still pretty cheap compared to the numbers I've seen floating around above, and it gives you an idea about bulk pricing when the minimum quantity is 2000.
[Digi-Key GDDR6 pricing screenshot]
 

I'm not sure who buys in trays anyway; you save money getting the ICs glued to a spool for a machine to plop them down on PCBs.

But yeah, those prices are MUCH lower than what was discussed here in this thread. I assume that the 1GB silicon is going to cost another $10 more at bulk rates. Well, you can only buy this stuff in bulk anyway.

At 2000 modules = 181 2080tis, so they are buying these in orders worth millions of dollars.
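The module count works out as below; a minimal sketch, assuming 11 one-GB memory ICs per 2080 Ti, as discussed earlier in the thread:

```python
# How many 2080 Ti boards a 2,000-module minimum order would cover,
# assuming 11 memory ICs per card (one per GB of VRAM).
MODULES_PER_CARD = 11
ORDER_SIZE = 2000  # Digi-Key minimum quantity mentioned above

print(f"{ORDER_SIZE} modules cover about {ORDER_SIZE // MODULES_PER_CARD} cards")  # ~181
```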
 

I figured the price couldn't be that high. So adding more vram shouldn't significantly alter pricing.
 