NVIDIA GeForce RTX 3090 is ~10% faster than RTX 3080

And yet many of the other real-world gaming scenarios I've seen seem to indicate between 5-15%. Wonder why that is.
(Previous post), and because clickbait headlines can be deceiving. They tend to latch onto the extreme negative end of something, since that's more viral. And since most people don't bother to drill down past a headline to realize "wait, there's more to the story; the reality is actually more nuanced", the cycle continues. You ultimately just have to take everything with a grain of salt and wait until you get more data points from other sources.
 
I'm guessing a lot of streamers who feel the need for online attention, showing off the latest hard-to-get gear to their subscribers?


...and since every kid out there is a wannabe streamer these days...
I'm snagging one for development. Raw assets are big, so huge amounts of memory help during that phase, but developing on Quadros doesn't work too well because the drivers keep you from testing your stuff effectively. So a lot of the stuff I build currently I develop on a Quadro system, then have to test on an old 7700K running a 1060 to see what it actually runs like on a "normal" PC. I'm hoping that when I get the new system running with the 3090, I can build and test in one go thanks to a more consistent environment.
Edit: Getting these past accounting is going to be a pain. My options are either an FE or the Gigabyte Turbo; getting sign-off on an RGB-X Ultra Gamer blah blah blah is going to be difficult at best.
 
Because you need 4K gaming to make this thing spread its wings (anything lower will compress the gap down to 5-10%)

If you're not GPU-limited, then you're wasting your $1500.

At 1440p, the delta drops to 12%:

View attachment 282388

And here at 1080p, the delta drops to a measly 7%!

View attachment 282389

The reason this card needs more than 1440p to really shine: at 1440p, it's still CPU-limited in half the games.

Look at that: >60 fps IN EVERY GAME THEY TESTED at 4K Ultra! AVERAGE fps = 115!


View attachment 282390

It's still ludicrously expensive, but for those who demand the NO-COMPROMISE BEST, NVIDIA is providing it!

Total performance improvement over the 2080 Ti is a little over 50%
Even if I’m GPU-limited, $800 extra (over twice the cost of a 3080) for a 16% improvement is a massive waste of money to me. It just does not financially compute.
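For anyone who wants to sanity-check that value math, here's a minimal sketch using the figures from this thread ($699 3080, $1499 3090, ~16% uplift); the helper name is just for illustration:

```python
def dollars_per_perf(price, relative_perf):
    """Dollars paid per unit of relative performance (baseline card = 1.0)."""
    return price / relative_perf

# 3080 at $699 MSRP as the 1.0x baseline; 3090 at $1499 with ~16% more performance.
rtx_3080 = dollars_per_perf(699, 1.00)   # ~$699 per performance unit
rtx_3090 = dollars_per_perf(1499, 1.16)  # ~$1292 per performance unit

# The 3090 works out to roughly 85% more dollars per unit of performance.
print(round(rtx_3080), round(rtx_3090))
```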
 
That's difficult for me to process, and I've run Titans and pairs of GPUs in SLI over many, many years.
In the past even with garbage tier SLI scaling (except for negative scaling) if you bought a second GPU for $500-800 you’d still probably have at least a 30-70% performance boost in most games.
 
The strength of multi-GPU was that you could push image quality higher when FPS was capped at 50 or 60 Hz. The fact that it's difficult to quantify improved graphics against simpler metrics like raw framerate made it hard to market.

Assuming it worked. Lucid's Virtu MVP also suffered the same fate for the same reasons.
 
Told you it would be (nearly) 20% faster at stock wattage.
That's because you're comparing an OCed 3090 to a stock 3080.

And yet many of the other real-world gaming scenarios I've seen seem to indicate between 5-15%. Wonder why that is.

If we compare a stock 3090 to a stock 3080 at 4K we get ~10% difference:
1600980349437.png


If we compare an OC'ed 3080 Gaming X to the 3090 STRIX (both the highest OCs I could find on techpowerup with a fast search) the difference is around 12~13%.
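All of those deltas come down to the same simple ratio of average framerates; a quick sketch (the fps values below are placeholders for illustration, not numbers pulled from the charts above):

```python
def percent_delta(baseline_fps, faster_fps):
    """Percentage by which faster_fps exceeds baseline_fps."""
    return (faster_fps / baseline_fps - 1) * 100

# Hypothetical average-fps pairs showing how a ~10% and a ~15% gap are computed:
print(round(percent_delta(100.0, 110.0), 1))  # 10.0
print(round(percent_delta(80.0, 92.0), 1))    # 15.0
```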
 
So kinda like the last RTX Titan and the 2080 Ti?
Some people can't see past their own little world. Paying double or triple for that last 10-20% of extra performance per generation has kinda been the norm.

Nvidia knew their target market for the 3090; it's not like they pulled $1499 out of thin air, these things were number-crunched and focus-grouped to high hell. Just a cursory glance at the CUDA results, Blender, etc. tells the tale: these cards are very attractive for usage that generates revenue, since they cut down render times and speed up video editing. Dudes playing Fortnite in their underwear who can't understand why "the 3090 is such a ripoff" weren't necessarily the target market.
 
People seem to have goldfish attention spans and memories, or just can't see beyond their own little world. Paying double or triple for that last 10-20% of extra performance per generation has kinda been the norm.

Nvidia knew their target market for the 3090; it's not like they pulled $1499 out of thin air, these things were number-crunched and focus-grouped to high hell. Just a cursory glance at the CUDA results, Blender, etc. tells the tale: these cards are very attractive for usage that generates revenue, since they cut down render times and speed up video editing. Dudes playing Fortnite in their underwear proclaiming the 3090 a "ripoff" weren't necessarily the target market.
Yeah: rendering, encoding, AI upscaling. If gaming isn’t your main focus but workstation workflows are, then this is the card for you. Basically all the features of a Quadro, but without the price tag associated with signed drivers and specialist support.
 
Another feather in ASUS' cap.

Anyone else find it odd that there's VERY little info about EVGA cards? The only two "reviews" I've seen are JayzTwoCents and BPS Customs, and both of those are half-assed.
 
Hope Gigabyte puts together a decent card, I need a blower design and something with a generic name. And Gigabyte is the only one doing that so far.
 
So kinda like the last RTX Titan and the 2080 Ti?

The difference is with the Titan you get the performance probably a year earlier so you get to enjoy it longer. Eventually a card will come along and give you more for less.
 
So consumers will be praising bots for avoiding this first batch of boards !?
To be honest, YES!!! I know I am. The absolute need to buy these cards immediately, right now, is basically gone.

But here's a question: how is anyone going to know what a Rev1 vs. Rev2 is if the AIB doesn't tell you?
 
To be honest, YES!!! I know I am. The absolute need to buy these cards immediately, right now, is basically gone.

But here's a question: how is anyone going to know what a Rev1 vs. Rev2 is if the AIB doesn't tell you?
I’m sure there would be a revision number on the board and a change in the part number. So while the card may still be called the 3080 RGB Ultra-X SNUF Edition, the rest of the data should let you know.
 
Isn't Zotac a budget brand? At least that's how I've always viewed them. I would never buy a top end card made as cheaply as possible to save $20 on $1500.
 
Question:

I was looking for an article that tracked historic performance difference between the top two cards at the same release time for each generation. I have a friend who is telling me the 5-15% performance jump on the 3090 over the 3080 is actually larger than normal or at least typical, but it seemed at first blush that this delta was lower than typical.

I can't find a good source on this either way. Anyone run across one to share? Trying to put some historic context on this x2 cost for 10% more performance we are seeing on the 3090 vs. 3080?
 
Question:

I was looking for an article that tracked historic performance difference between the top two cards at the same release time for each generation. I have a friend who is telling me the 5-15% performance jump on the 3090 over the 3080 is actually larger than normal or at least typical, but it seemed at first blush that this delta was lower than typical.

I can't find a good source on this either way. Anyone run across one to share? Trying to put some historic context on this x2 cost for 10% more performance we are seeing on the 3090 vs. 3080?

The 2080 Ti was about 15-25% faster (depending on resolution) than the 2080 and only cost $300 more ($999 vs $699). Many people recall the 2080 Ti coming in at $1199, but that was the Founders Edition MSRP; other AIBs had it for $999 or close to it.

https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/
https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/33.html
 
The 2080 Ti was about 15-25% faster (depending on resolution) than the 2080 and only cost $300 more ($999 vs $699). Many people recall the 2080 Ti coming in at $1199, but that was the Founders Edition MSRP; other AIBs had it for $999 or close to it.

https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/
https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/33.html

The 2080 Ti was a $1200 card. That $999 price was a myth.
 
So basically not a gaming card.... but it can be held up as one if NV needs 10% extra to best AMD.

It's a win as long as Big Navi is more like Medium Navi. It ends up looking a bit more desperate, imo, if Big Navi is equal to the 3080.... and probably to everyone if the 3080 ends up < Big Navi.
 
In the past even with garbage tier SLI scaling (except for negative scaling) if you bought a second GPU for $500-800 you’d still probably have at least a 30-70% performance boost in most games.

To me it *felt* more like 30% when it worked. In practice it was more like -50% to +30%. I’d honestly rather pay double for 20-30% in one card (which I realize the 3090 isn’t; 15% at most if you have the PL headroom, unless VRAM comes into play).

When I was younger it was fun to mess around with. Now I just want things to work.
 
That article essentially corroborates what I said.
How does it corroborate what you said? You said the $999 price was a myth. I showed you that it wasn’t. The 2080 Ti Black was the first $999 card and was deemed a “unicorn” at the time it was released (1-2 months after launch).
 
It's basically a titan.. until the titan actually comes out, then it's just the ugly duckling ;).

Either way, do you really think they will be available for purchase anytime soon? I mean a real purchase, not a 3080 "purchase". They are higher-binned parts, so they'll be even more scarce, although demand may be a bit lower since a lot of people won't be too keen on spending over 2x the money for <20% more performance unless they have a compelling workload that can use the VRAM (Blender rendering, AI of sorts, or some computational workload that uses lots of VRAM). I'll probably be holding off until stocks stabilize and I know what AMD has to offer before jumping on any hype trains.
I really don’t think we will see a Titan card; it was too nice of a loophole for specific markets, mine included. The Titan was perfect for anybody with a use case that needed the feature set of a Quadro but not the qualified drivers, and who was happy living outside the official support bubble so they could save a few thousand. Education was a big user of the Titans. Now, by moving the Titan into the 3090 and pushing this 8K gaming campaign, the branding is “3090 RGB Gaming Extreme Ultra SNUF-X Edition”, which makes it near impossible to get past government regulators, who will flag it to high hell, so I can’t get accounting sign-off. That forces me to the much more costly Quadro lineup. Discontinuing the Titan branding was a strategic move, so if they do release a Titan I will be super happy, but I doubt it will come to pass; now I await the new Quadro announcement.
 
How does it corroborate what you said? You said the $999 price was a myth. I showed you that it wasn’t. The 2080 Ti Black was the first $999 card and was deemed a “unicorn” at the time it was released (1-2 months after launch).

Exactly. Meaning no other $999 cards were available. And we have no indication the Ti Black was available in any meaningful quantity either. Either way, one SKU doesn’t change the overall picture.
 
Exactly. Meaning no other $999 cards were available. And we have no indication the Ti Black was available in any meaningful quantity either. Either way, one SKU doesn’t change the overall picture.
Not at launch, but plenty of manufacturers did end up releasing a variant in that price range. The 2080 Ti was a limited product to begin with; it was never intended for mainstream adoption, so chip availability and pricing generally kept it that way. The ASUS 2080 Ti Turbo, for example.
 
Exactly. Meaning no other $999 cards were available. And we have no indication the Ti Black was available in any meaningful quantity either. Either way, one SKU doesn’t change the overall picture.
Nothing you said changes the fact that it wasn’t a myth. It existed, and as you said, you have no indication whether it was available in meaningful quantity, which means it also could have been available in quantities that met the demand for a $999 2080 Ti. It existed; it wasn’t a ‘myth’, regardless of whether it was one SKU or ten.
 
Nothing you said changes the fact that it wasn’t a myth. It existed, and as you said, you have no indication whether it was available in meaningful quantity, which means it also could have been available in quantities that met the demand for a $999 2080 Ti. It existed; it wasn’t a ‘myth’, regardless of whether it was one SKU or ten.

Point is making statements like “the 2080 Ti was only $300 more than the 2080” are misleading at best. That clearly was not the case.
 
Point is making statements like “the 2080 Ti was only $300 more than the 2080” are misleading at best. That clearly was not the case.
How is it misleading? You could get a 2080 Ti for $300 more than a 2080 as I showed you could do with the 2080 Ti Black. It was clearly the case.
 