NVIDIA GeForce RTX 3090 is ~10% faster than RTX 3080

:( that's not awesome


What did you want? The ungodly performance increases will have to wait for an even larger part. You can't get a new process node plus a super-sized die plus GDDR6X for $1,500.

NVIDIA kept the 1080 Ti die just under 471 mm², while the 3090 is a porker at 628 mm² (on an even costlier process node than the 1080 Ti's 16nm).

This is just a nomenclature change; the total performance increase over the 2080 Ti is still 50% (well above what the naysayers were saying). Unfortunately, they're also using twice-the-density cutting-edge memory parts, which adds even more expense.
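A quick sanity check on that 50% number: it's just two compounded uplifts. The ~35% average for the 3080 over the 2080 Ti is an assumed reviewer-average figure, not something from this thread:

```python
# Compound the 3080-over-2080 Ti uplift with the 3090-over-3080 uplift.
# Both inputs are assumptions: ~35% is a rough reviewer average for the
# 3080 vs. the 2080 Ti, and ~10% is the figure from the headline.
uplift_3080_over_2080ti = 0.35
uplift_3090_over_3080 = 0.10

total = (1 + uplift_3080_over_2080ti) * (1 + uplift_3090_over_3080) - 1
print(f"3090 over 2080 Ti: {total:.1%}")  # prints "3090 over 2080 Ti: 48.5%"
```

So roughly the 50% quoted, give or take the assumed 3080 uplift.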
 

2080 Ti is even more of a porker tho (TU102 is 754 mm² on TSMC 12nm).

 


When you are using an ancient fucking process node, you can go bigger more easily. It was a lot less risky than going TSMC 7nm back in 2018!

And as a result, 2080 Ti had almost the same launch price as the 3090! You're nuts if you think a $300 increase for twice the memory and 50% performance increase isn't worth it.
 
What about the 3080 Super? I bet there'll be a Ti and also a Super, possibly a Titan.
I don’t think there will be a 3080 Super/Ti. There is little room between the 3080 and 3090. A 3080 Super/Ti with 20GB VRAM would completely kill off the 3090.

I think we’ll just see a 3080 variant with 20GB added for $900-1000. Then Nvidia will drop the price of the 3090 to $1200 only if AMD has anything that comes close to it.
 
When you are using an ancient fucking process node, you can go bigger more easily. It was a lot less risky than going TSMC 7nm back in 2018!

And as a result, 2080 Ti had almost the same launch price as the 3090! You're nuts if you think a $300 increase for twice the memory and 50% performance increase isn't worth it.
i only paid $1000 for my BNIB 2080 Ti FE tho, nowhere near $1500, imho

 
What about the 3080 Super? I bet there'll be a Ti and also a Super, possibly a Titan.
That would be a tight fit: increase the performance of the 3080 but keep it slower than the 3090. Unless they are just going to give it 20GB and call that Super with no speed difference ;). I guess they could get something in between, but it would really put pressure on their own 3090 at that point (especially if priced anywhere close to the original MSRP, which it probably won't be if it's got double the VRAM).
 
When you are using an ancient fucking process node, you can go bigger more easily. It was a lot less risky than going TSMC 7nm back in 2018!

And as a result, 2080 Ti had almost the same launch price as the 3090! You're nuts if you think a $300 increase for twice the memory and 50% performance increase isn't worth it.

First, $1,200 is different from $1,500. It is NOT almost the same.

Second, you'd expect a card coming out 2 YEARS AFTER the last gen to have increased specs and performance; otherwise there would be no point in upgrading. Bumping up to a different pricing tier isn't necessarily expected.

Third, it doesn't seem like AMD thought it all that risky of a bet on TSMC 7nm in 2018 as they tapped it for Zen2 and 1st gen RDNA (and tweaked versions of it going forward). They essentially bet the farm on it and are reaping the benefits.



Overall though, it does seem suspect that there is only a 10% increase in performance given the specs. I don't know that I'd agree with your assessment of the Titan branding. Titan cards are built by Nvidia. I think Nvidia wanted to distance themselves from the bad press over the extravagant price increase that the 2080 Ti represented, and naming it a 3080 Ti wouldn't have helped that image. The 3090 is "new" nomenclature that doesn't have any price history associated with it. Plus, AIBs can design their own cards vs. having a "Built by Nvidia"-only product in the Titan. I think we'll see a Quadro-based Titan at some point for $3,000+.
 
i only paid $1000 for my BNIB 2080 Ti FE tho, nowhere near $1500, imho


That's irrelevant. The listed FE price was $1,199. You can't draw conclusions about product pricing from one example from the second-hand market.

That being said, $1,199 is still nowhere near $1,499.
 
Third, it doesn't seem like AMD thought it all that risky of a bet on TSMC 7nm in 2018 as they tapped it for Zen2 and 1st gen RDNA (and tweaked versions of it going forward). They essentially bet the farm on it and are reaping the benefits.

Of course it was a risky bet. But it worked out great this time.
 
I am very late to this thread. Is there a zero missing in the headline?

Nope, it’s what we discovered after the hype train. 3090 is around 10% faster than the 3080.

To be honest, it seems as if the scalpers actually gave the community enough time to digest the information before the stock was actually taken.
 
I don't know that people were hyped for the 3090. On paper it's at best 20% faster than the 3080. So even best case there was nothing to get overly excited about for $800 extra.
 
I really don't think the 3090 will be the top-tier Nvidia 3000-series card. Look back at previous generations and you will see they almost always release their trump cards later in the product cycle. It's all about money. Everyone who can will jump on the 3080 and 3090 now. Some six months later we'll see a 3080 Ti, 3090 Ti, new Titan, or some combination of the sort. Many of the same customers will turn right around and buy the next big thing because they "have to have it."

GTX 780 May 2013
GTX 780 Ti Nov 2013

GTX 980 Sept 2014
GTX 980 Ti June 2015

GTX 1080 May 2016
GTX 1080 Ti March 2017

RTX 2080 Sept 2018
RTX 2080 Super July 2019

If you think the 3080 and 3090 won't have faster versions this series, you are very naive.
 
Yeah I think the 10GB of RAM might be a limiting factor at some point on the 3080, but boy did Nvidia hype this launch. Not really sure any of the launch cards are worth draining my loop for at 1080p high refresh rate.
 
I really don't think the 3090 will be the top-tier Nvidia 3000-series card. Look back at previous generations and you will see they almost always release their trump cards later in the product cycle. It's all about money. Everyone who can will jump on the 3080 and 3090 now. Some six months later we'll see a 3080 Ti, 3090 Ti, new Titan, or some combination of the sort. Many of the same customers will turn right around and buy the next big thing because they "have to have it."

GTX 780 May 2013
GTX 780 Ti Nov 2013

GTX 980 Sept 2014
GTX 980 Ti June 2015

GTX 1080 May 2016
GTX 1080 Ti March 2017

RTX 2080 Sept 2018
RTX 2080 Super July 2019

If you think the 3080 and 3090 won't have faster versions this series, you are very naive.

Yeah, not until almost a year later though. And if you're willing to wait a year for a faster version, you could just wait another year for the 4080, etc.
 
Exactly... people act like technology will all of a sudden stop progressing after they buy some high-end product. The fact is that anything you buy will be out of date in a matter of months.
 
I don't know that people were hyped for the 3090. On paper it's at best 20% faster than the 3080. So even best case there was nothing to get overly excited about for $800 extra.

14GB more RAM and a wider bus? May not be too relevant for gaming, but for TensorFlow/CUDA types that can't justify a Quadro it could be a good choice. It's probably the best argument against Nvidia releasing a 20GB variant of the 3080.
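To make the non-gaming case concrete, here's a rough sketch of training-memory arithmetic; the model size and the "four copies" rule of thumb (weights, gradients, two Adam moment buffers) are illustrative assumptions, and activations are ignored entirely:

```python
# Floor estimate of VRAM needed to train a dense model in fp32:
# one copy each for weights and gradients, plus two Adam moment buffers.
def train_vram_gib(n_params: int, bytes_per_value: int = 4, copies: int = 4) -> float:
    """Return a lower-bound training footprint in GiB (activations excluded)."""
    return n_params * bytes_per_value * copies / 2**30

# A hypothetical 1-billion-parameter model:
print(f"{train_vram_gib(1_000_000_000):.1f} GiB")  # prints "14.9 GiB"
```

That's over a 3080's 10GB before a single activation is stored, but comfortably inside the 3090's 24GB.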
 
I think 3090s will sell to a lot of people who aren't gamers, but I don't think those are the same people who will be refreshing at 9 AM on Thursday. Once they are regularly in stock, I think they will sell a good amount.
 
If you think the 3080 and 3090 won't have faster versions this series, you are very naive.
The rumor of Nvidia buying TSMC 5nm capacity suggests that they know Ampere on Samsung 8nm is a dead end with no room for growth. Going from Samsung 8nm to TSMC 5nm needs more than just a refresh, so they are calling it Hopper.
 
First, $1200 is different than $1500. It is NOT almost the same.

Second, you'd expect a card coming out 2 YEARS AFTER the last gen to have increased specs and performance otherwise there would be no point to upgrading. Bumping up to a different pricing tier isn't necessarily as expected.

Because memory die shrinks have hit a wall (memory density is going up just as slowly as density in other silicon chips), expecting the same old density increase every two years is not realistic.

Plus, the fact that we can't get more than a generation out of the latest-and-greatest GDDR6X standard means you have to add the cost of developing that custom memory design for just two graphics cards!


Third, it doesn't seem like AMD thought it all that risky of a bet on TSMC 7nm in 2018 as they tapped it for Zen2 and 1st gen RDNA (and tweaked versions of it going forward). They essentially bet the farm on it and are reaping the benefits.

Zen 2 only worked because the chiplet was fucking tiny: 80 mm². Even their APU is still smaller than the RX 5700 XT! Also, Zen 2 launched in July 2019, nearly a year after Turing!

RDNA1 lost money for AMD for its first half-year (and only got yields good enough to make money on the RX 5600 XT in the last six months).
The die size for the RX 5700 XT was a paltry 251 mm².

One year later, NVIDIA would be losing money charging $1,200 for the massive die of the RTX 3090, so they raised the price. That doesn't mean we won't see some big price drops after the first six months (NVIDIA would rather get paid at launch, then lower prices later).

The price drops on a mature process are the whole reason I brought up Pascal earlier, erek, but it doesn't mean customers aren't going to have to absorb those high initial costs. The 3090 is a nearly-perfect 628 mm² die on the second-most advanced process on earth. It's also accompanied by 12 to 24 chips of "didn't exist before August" cutting-edge memory.
 
I really don't think the 3090 will be the top-tier Nvidia 3000-series card. Look back at previous generations and you will see they almost always release their trump cards later in the product cycle. It's all about money. Everyone who can will jump on the 3080 and 3090 now. Some six months later we'll see a 3080 Ti, 3090 Ti, new Titan, or some combination of the sort. Many of the same customers will turn right around and buy the next big thing because they "have to have it."

GTX 780 May 2013
GTX 780 Ti Nov 2013

GTX 980 Sept 2014
GTX 980 Ti June 2015

GTX 1080 May 2016
GTX 1080 Ti March 2017

RTX 2080 Sept 2018
RTX 2080 Super July 2019

If you think the 3080 and 3090 won't have faster versions this series, you are very naive.

The 3090 won't, unless you count an actual Titan. The 3080 might get an AIB Super, but no Ti from NVIDIA. I think if we get a refresh next year, it will be on Samsung 5nm, and those parts will blow these away.
 
Something doesn't add up. If the 3090 is only 10% faster than the 3080 with more cores and 24GB of RAM, why are we even bothering waiting for a 3080 with more onboard RAM? Doesn't sound like it would make much of a difference.
 
Doesn't sound like it would make much of a difference.
Well it will cost more and provide market segmentation. People will still buy them if they have more memory than AMD cards because MOAR.
 
Something doesn't add up. If the 3090 is only 10% faster than the 3080 with more cores and 24GB of RAM, why are we even bothering waiting for a 3080 with more onboard RAM? Doesn't sound like it would make much of a difference.
Yeah at this point it's just for workstation applications.
 
Something doesn't add up. If the 3090 is only 10% faster than the 3080 with more cores and 24GB of RAM, why are we even bothering waiting for a 3080 with more onboard RAM? Doesn't sound like it would make much of a difference.

Some people swear that 10GB isn't enough. Presumably those people are willing to pay a premium for peace of mind even though in nearly every scenario there will be zero benefit.
14GB more RAM and a wider bus? May not be too relevant for gaming, but for TensorFlow/CUDA types that can't justify a Quadro it could be a good choice. It's probably the best argument against Nvidia releasing a 20GB variant of the 3080.

Yeah it’s probably a steal for people who can put the memory to good use.
 
Some people swear that 10GB isn't enough. Presumably those people are willing to pay a premium for peace of mind even though in nearly every scenario there will be zero benefit.
Crystal ball aside: Doom Eternal with max settings needs more than 8GB of VRAM AFAIK, so it doesn't seem that far-fetched that 10GB could be a limiting factor in future games. New console generations are usually accompanied by an increase in graphics fidelity/requirements on the PC as well.
Whether 20GB is useful on the 3080 depends on what resolution you intend to play at and how long you keep your GPUs.
If you swap every generation, then it doesn't seem an issue. But if you keep them for more than 3 or 4 years and play at 4K or higher, then preferring the 20GB model for future-proofing seems reasonable.
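For a feel of how texture fidelity eats VRAM, a back-of-the-envelope sketch (uncompressed RGBA8 with a full mip chain, which adds about one third; real games use compressed formats, so these are worst-case-ish numbers):

```python
# Memory for one uncompressed RGBA8 texture; a full mip chain adds ~1/3.
def texture_mib(width: int, height: int, bytes_per_pixel: int = 4,
                mipmaps: bool = True) -> float:
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base
    return total / 2**20

# Doubling the resolution quadruples the footprint:
print(f"{texture_mib(2048, 2048):.0f} MiB")  # prints "21 MiB"
print(f"{texture_mib(4096, 4096):.0f} MiB")  # prints "85 MiB"
```

Multiply by the thousands of textures in a modern game and the 10GB-vs-20GB question stops being academic at 4K.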
 
Something doesn't add up. If the 3090 is only 10% faster than the 3080 with more cores and 24GB of RAM, why are we even bothering waiting for a 3080 with more onboard RAM? Doesn't sound like it would make much of a difference.
All it takes is a game updating with even-higher-resolution textures and it might push past the edge of 10GB. Remember, hardware has to lead and games follow; right now the PS5 and Xbox are going to lead.

However, if you are old enough, you know what it is like to run out of RAM, and it SUCKS. Who remembers expanded memory for X-Wing and TIE Fighter? Yeah, never again. Twice as much RAM helps dispel those demons.
 
I think 3090s will sell to a lot of people who aren't gamers, but I don't think those are the same people who will be refreshing at 9 AM on Thursday. Once they are regularly in stock, I think they will sell a good amount.
I'm getting one for analyzing firewall rules and intrusion detection.

But if anything this does a pretty good job at showing that the memory on the 3080 isn't really a limiting factor in any meaningful way.
 
Crystal ball aside: Doom Eternal with max settings needs more than 8GB of VRAM AFAIK


No, it doesn't. VRAM being allocated aggressively by the game engine (when you have a ton of it available) is not the same as the game REQUIRING that much VRAM.

See the 4K results at TechPowerUp:

You have a 5% falloff in relative performance between the RX 5600 XT and RX 5700 at 4K versus 1080p, meaning RDNA1 does not need more than 6GB of VRAM (5% is lost in the noise).

You have a ZERO PERCENT DIFFERENCE between the RTX 2060 and RTX 2060 Super at 4K versus 1080p, so there is still spare VRAM capacity (after compression) on Turing.

[TechPowerUp relative performance charts at 1920×1080 and 3840×2160]

Really, I would say the actual VRAM used by DOOM Eternal on modern architectures is somewhere between 4GB and 6GB at 4K (and 3GB at 1080p).
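The falloff test used here can be written out explicitly: compare a card's performance relative to a higher-VRAM card at 1080p and again at 4K; if the ratio shrinks, VRAM pressure (not compute) is the likely culprit. The frame rates below are made-up numbers purely for illustration:

```python
# Relative-performance falloff: how much of card A's standing vs. card B
# is lost going from 1080p to 4K. A big drop suggests A is VRAM-limited.
def falloff(fps_a_1080: float, fps_b_1080: float,
            fps_a_4k: float, fps_b_4k: float) -> float:
    rel_1080 = fps_a_1080 / fps_b_1080
    rel_4k = fps_a_4k / fps_b_4k
    return 1 - rel_4k / rel_1080

# Hypothetical: A is 90% of B at 1080p but only 85.5% of B at 4K.
print(f"{falloff(90, 100, 34.2, 40):.1%} falloff")  # prints "5.0% falloff"
```

Anything within a few percent is noise; a double-digit falloff is the signature of actually running out of VRAM.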
 
I'm getting one for analyzing firewall rules and intrusion detection.

But if anything this does a pretty good job at showing that the memory on the 3080 isn't really a limiting factor in any meaningful way.
Really? no joke?
 
An hour long? Why do these talking heads believe I want to watch an hour-long video answering something that can be done in 2 minutes?

Exactly my thoughts. I wasn't gonna say anything, but I'm not watching an hour-long video for something that I already know...

Timestamps?
 
Really? no joke?
Yeah, I could probably get away with a much smaller and "cheaper" Quadro, but really, where is the fun in that? And if accounting signs off on it, then... Yay?

But yes, seriously: AI is moving into the firewall and threat-prevention space at a pretty rapid pace, as it is already being used against us to an alarming and highly successful degree.
 
Yeah, I could probably get away with a much smaller and "cheaper" Quadro, but really, where is the fun in that? And if accounting signs off on it, then... Yay?

Any information on this setup? A tutorial? GitHub repos? Anything? Just curious if you're willing to share... is this nudging closer to AIOps?
 
Any information on this setup? A tutorial? GitHub repos? Anything? Just curious if you're willing to share... is this nudging closer to AIOps?
It's a product set put together by Palo Alto, I don't know the specifics on it yet as my install date keeps getting pushed back.
 
  • Like
Reactions: erek
like this
I don't know all the ins and outs, but I think the 3080 was about hype, and it will be impossible to ever find, so the real card is the 3090. I just don't see any business giving that much performance increase and cutting the price. So they get all the media attention with the 3080's price and performance, then they're impossible to find due to some "bots" (aka there was never any stock), and now we get the 3090, which is the mainstream card with the price hike over last gen.
 