RTX 3xxx performance speculation

Back in the day there weren't as many alternatives, especially as the few game benchmarks were either limited, skewed, or not universal. But that changed fast, and only in the really old days did they make any sense.

I remember when T&L appeared.
I remember the first time (Geforce 3) I tried AA.
I remember Comanche on DirectX 8.1
Every time, these features came with added image quality (better suspension of disbelief)... but also a hearty performance price.

Some people want added complexity to perform 100% better at 50% the power and 25% of the cost

Right now all I want is Cyberpunk 2077 and a 3090 for winter... cost is not a factor.
I've worked for 30+ years; I can afford gaming as a hobby.
It costs nothing compared to my martial arts :LOL:
 
People have acted spoiled since Maxwell -> Pascal.
Either forum posters' average age dropped a lot or people's ability to remember fails big time 🤷‍♂️

Not only spoiled by it, but mis-remembering and exaggerating the gains as well.

Here is a typical (ROTTR) 980 Ti -> 1080 uplift: 28%.
https://images.anandtech.com/graphs/graph10325/82855.png

The leaked bench (SOTTR) 2080 Ti -> 3080 uplift: 33%

Ampere's gains are equal to Pascal's gains, and yet somehow that doesn't match the faulty memories, so it isn't good enough.
 
130% of a 2080 Ti for 60% of the cost? Sign me up. The only thing preventing me from jumping to a 2080 Ti was the outrageous price, not the perf increase over the 1080 Ti.

And the 3080 seems to be a very good bump (+70% over a 1080 Ti?) for all the Pascal users, for a reasonable price tag too. Really not that far from what I paid for an 8800 GTX back in the day, and that was in 2008(?) money!
 
It was fun back in the 3DMark 2000 and 2001 days. It was still useless as a benchmark, due to not being real world, but in demo mode I found it to be a fun tech demo of what the PCs of the time were capable of if really pushed.


I used to play that more for the music; it was practically a music video :D. And what do you know, reflections without using screen-space reflections (back then you basically mirrored the geometry for the reflective surfaces, which would be rather costly to do now). I use 3DMark for system optimization, as in OCing memory, GPU, and VRAM, and sometimes for stability testing while stressing the GPU and having something else stress the CPU. For comparing to other systems, it may be able to help identify a gross problem if your system is compared against a similar system in their database. As a benchmark for comparing gaming performance across different hardware, it's weak. If 3DMark incorporated different game engines such as Unreal, Unity, and CryEngine into separate benchmarks, it would probably be more useful.
 
Not only spoiled by it, but mis-remembering and exaggerating the gains as well.

Here is a typical (ROTTR) 980 Ti -> 1080 uplift: 28%.
https://images.anandtech.com/graphs/graph10325/82855.png

The leaked bench (SOTTR) 2080 Ti -> 3080 uplift: 33%

Ampere's gains are equal to Pascal's gains, and yet somehow that doesn't match the faulty memories, so it isn't good enough.
You use one game and then conclude that is the performance difference? This is 23 games at 1440p. With the 1080 Ti as the 100% baseline, the 980 Ti sits at 57%; inverting that, 1/0.57 = 1.75, so the 1080 Ti is a factor of 1.75, i.e. 75%, faster overall using those 23 games as data points. Ampere performance is still not yet fully known, but it does not appear to be anywhere close to that kind of generational increase.
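A quick sketch of that baseline inversion, using the 57% figure from the chart above (the helper function is just my own illustration):

```python
def percent_faster(baseline_share):
    """Given the old card's share of the new card's performance (e.g. 0.57),
    return how much faster the new card is, in percent."""
    return (1.0 / baseline_share - 1.0) * 100.0

# 980 Ti at 57% of a 1080 Ti -> the 1080 Ti is ~75% faster
print(round(percent_faster(0.57)))  # 75
```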

 
Just to throw this out to my [H] brethren, Best Buy gives a one-time 10% off coupon for your birthday month. I am going to use it. The question is: on a 3080 or a 3090 ;).

$800 for 18% perf increase is rouughhhhh.
 
Just to throw this out to my [H] brethren, Best Buy gives a one-time 10% off coupon for your birthday month. I am going to use it. The question is: on a 3080 or a 3090 ;).

$800 for 18% perf increase is rouughhhhh.

Too bad my coupon was for last month.
 
Back in the day 3dmark was cool because it implemented new graphics features before they were available in games. Nowadays 3dmark visuals look like ass and don't do justice to new features. I had to read the manual to learn that Time Spy was heavily using volumetric lights and order independent transparency. Both are really nice but they weren't used in a very compelling way in Time Spy.

Don't get me started on the new ray tracing benchmark. That thing is pure trash.

Yeah, still remember enjoying the eye candy of 3DMark03 - GT4: Mother Nature. Those were the days!

 
Stress testing is just about the only use I see for the synthetic GPU benches.
I think it's decent to see where your PC performs relative to similar configurations. It can be an easy way to see if something is performing worse than it should. Other than that, it's just e-peen.
 
Here is the performance rating of the 980 Ti compared to the 780 Ti, taken from the German website computerbase.de; the 980 Ti has 50% more performance than the 780 Ti.

Here is the performance rating of the GTX 1080 Ti compared to the GTX 980 Ti, also from computerbase.de: the 1080 Ti was 66% better than the 980 Ti at 1440p on average across all tested games.


Next is the RTX 2080 Ti compared to the GTX 1080 Ti at 1440p, also a performance rating from computerbase.de; as we can see, the 2080 Ti is only 32% faster than the 1080 Ti for, in some cases, double the price (custom 2080 Tis are $1400 and up).

For the same price as a 1080 Ti you got only 8% more performance with the non-Ti 2080.

With the 3000 series it basically comes back to normal performance-wise, while the prices are still inflated.

We can also compare the Ti cards to the non-Ti cards and analyze the performance increase from gen to gen.

Basically we got a 20% to 30% increase with the new non-Ti cards compared to the previous Ti cards, at a cheaper price.

Now we are cheering Nvidia for a 30% increase of the non-Ti cards over the previous Ti card, at an increased price if we compare it to Pascal.
Actually the picture gets much clearer if we just remove the fuck-up that Turing was.
 
Actually the picture gets much clearer if we just remove the fuck-up that Turing was.

Agreed. It'll be simpler to just directly compare the 3080 to the 1080/Ti at $699 and take the square root of the increase to account for the two-generation gap. With Turing in the mix, Ampere's $/fps improvement looks better than it should.

The 3080 is not overpriced compared to Pascal though. 80%+ performance increase over 4 years for the same price is right in line with the norm.
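To put the square-root idea in concrete terms, here's a rough sketch; the 1.8x overall factor is just the ~80% figure above taken as an assumption, not a measured result:

```python
# Back-of-envelope: if the 3080 ends up ~1.8x a 1080 Ti at the same price
# (assumed from the ~80% figure above), the implied per-generation gain
# over the two-generation gap is the square root of the overall factor.
overall_gain = 1.80
per_generation = overall_gain ** 0.5
print(f"~{(per_generation - 1) * 100:.0f}% per generation")  # ~34%
```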
 
Here is the performance rating of the 980 Ti compared to the 780 Ti, taken from the German website computerbase.de; the 980 Ti has 50% more performance than the 780 Ti.
Here is the performance rating of the GTX 1080 Ti compared to the GTX 980 Ti, also from computerbase.de: the 1080 Ti was 66% better than the 980 Ti at 1440p on average across all tested games.

Next is the RTX 2080 Ti compared to the GTX 1080 Ti at 1440p, also a performance rating from computerbase.de; as we can see, the 2080 Ti is only 32% faster than the 1080 Ti for, in some cases, double the price (custom 2080 Tis are $1400 and up).
For the same price as a 1080 Ti you got only 8% more performance with the non-Ti 2080.

With the 3000 series it basically comes back to normal performance-wise, while the prices are still inflated.

We can also compare the Ti cards to the non-Ti cards and analyze the performance increase from gen to gen.

Basically we got a 20% to 30% increase with the new non-Ti cards compared to the previous Ti cards, at a cheaper price.

Now we are cheering Nvidia for a 30% increase of the non-Ti cards over the previous Ti card, at an increased price if we compare it to Pascal.
Actually the picture gets much clearer if we just remove the fuck-up that Turing was.
What I find interesting is that the R9 Fury was faster than the 980 Ti according to TechPowerUp, and in your link it is not much slower than the 980 Ti. That is not how I remember those two cards; no way was the Fury really that close to the 980 Ti. The way that data is collected misses the biggest part of why people buy GPUs: how a card actually performs when someone is using it. The Fury would initially have stutters, hitching, and graphical anomalies unless settings were reduced. Brent would catch those real issues while others just hit a button and looked at the numbers afterwards. Plus the 980 Ti overclocked to the heavens and the Fury did not.

The Vega 64 and 1080 were very competitive with each other, with the advantage at that time for Vega being the cheaper cost of FreeSync monitors. Yet the tone of the day was: look how much power it takes! The noise! Well, the AIBs took care of the noise, as did the LC version, which was too expensive. Ampere takes even more power than Vega but does look like it has much better coolers. The Vega 56 was even better against the 1070 and creamed it when overclocked, volt-modded, etc. Nvidia was able to make a hell of a lot more Pascals at a profitable price while AMD could not. Plus the mining wave hit big.

Bringing this back on topic, Ampere coming from Pascal does look like a real generational good buy vs. Turding.
 
The Vega 64 and 1080 were very competitive with each other, with the advantage at that time for Vega being the cheaper cost of FreeSync monitors. Yet the tone of the day was: look how much power it takes! The noise! Well, the AIBs took care of the noise, as did the LC version, which was too expensive. Ampere takes even more power than Vega but does look like it has much better coolers. The Vega 56 was even better against the 1070 and creamed it when overclocked, volt-modded, etc. Nvidia was able to make a hell of a lot more Pascals at a profitable price while AMD could not. Plus the mining wave hit big.

Bringing this back on topic, Ampere coming from Pascal does look like a real generational good buy vs. Turding.

The reality of Vega was that it was a day late and a dollar short, shipping something like a year and a half after the GTX 1080.

So when it showed up, more than a year later, with no price or performance advantage, using a LOT more power, and with loud AMD reference coolers for everyone, no one but an AMD fan was going to care at that point. AMD missed the boat. They were lucky the mining bubble started at almost the same time so they could sell those cards at full price.

Now as far as Ampere taking more power, it's delivering MUCH more performance for that power. Vega wasn't.
 
If you're gonna be late, you'd better offer SOMETHING. I'd argue the 5700 XT/5700 at least offered a decent price, while Vega offered nothing. Hopefully Navi offers more performance, a lower price and/or lower power consumption, or some combination of those. It also isn't going to be too late, so I guess that's good.
 
The reality of Vega was that it was a day late and a dollar short, shipping something like a year and a half after the GTX 1080.

So when it showed up, more than a year later, with no price or performance advantage, using a LOT more power, and with loud AMD reference coolers for everyone, no one but an AMD fan was going to care at that point. AMD missed the boat. They were lucky the mining bubble started at almost the same time so they could sell those cards at full price.

Now as far as Ampere taking more power, it's delivering MUCH more performance for that power. Vega wasn't.
A year and 2 months later, and the Vega 56 slaughtered the 1070 when it was OCed. For the Vega 64, agreed, the reference designs sucked, except for the LC, which really was too much. A Vega 64 with proper cooling gets a 15%-20% performance boost from OCing the HBM and GPU, way past what the 1080 can do. You're right, a power hog overall. Ampere is yet to be fully tested on how well it can OC and how the power requirements change. AMD allows a 50% power boost on their GPUs (very good reference designs); Nvidia allows much less on average but normally did not need it. Ampere may be more impressive in other aspects; so far it looks average for a performance bump in today's games. We just don't know yet.
 
That, and a lot of people don't know how to calculate percentage faster. If you look where it's GPU-limited, SOTTR at 4K, the 3080 is ~65% faster than the 2080S and 33% faster than the 2080 Ti. Exactly in line with expectations given the DF numbers and those released by Nvidia.

Gamers are notoriously bad at math. When comparing 67 fps to 100 fps, the former can be framed as '33% slower' or the latter as '50% faster'. Video charts (garbage tier) just framed it in the less impressive way.

The 2080 is only slightly faster than the 2070S. So when comparing the 2080 to the 3080, the 3080 would be around 80% faster in Tomb Raider and Far Cry in non-CPU-limited scenarios.

How is anyone complaining about this??
 
FYI, Nvidia changed the FE review NDA to the 16th. Guess people haven't been getting cards in time to publish by the 14th.

Two more days to be patient, I guess. At least we will still have reviews before the cards go on sale.
 
VC is reporting a 3060 Ti in October to go with the 3070. Logical, since it's a cut-down 3070:

https://videocardz.com/newz/nvidia-geforce-rtx-3060-ti-gets-4864-cuda-cores-and-8gb-gddr6-memory

Interesting. I thought they would have saved the Ti refreshes until way later. This card has about 85% of the GPU power and 100% of the bandwidth of the 3070, so maybe 90% of 3070 performance. $425 would be a great price.
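For what it's worth, a rough sketch of that kind of estimate; the 85%/100% ratios are the guesses above, and the geometric-mean weighting is my own assumption, not anything measured:

```python
# Rough sketch: estimate relative performance as the geometric mean of the
# compute ratio and the memory-bandwidth ratio. Ratios and weighting are
# assumptions for illustration, not measured numbers.
compute_ratio = 0.85    # assumed 3060 Ti : 3070 shader throughput
bandwidth_ratio = 1.00  # same 256-bit GDDR6 bus per the leak

estimate = (compute_ratio * bandwidth_ratio) ** 0.5
print(f"~{estimate * 100:.0f}% of 3070 performance")  # ~92%
```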

Unlike normal releases, nVidia isn't waiting until AMD becomes a threat before offering a better value.

Just beating AMD is no longer enough. Nvidia must see the new consoles as a real threat and is now pushing the mid-range (which should be close to the new console performance) at a much better value.

Now that the Series S was revealed, maybe we will see some value punch in the budget lineup.
 
Interesting. I thought they would have saved the Ti refreshes until way later. This card has about 85% of the GPU power and 100% of the bandwidth of the 3070, so maybe 90% of 3070 performance. $425 would be a great price.

Unlike normal releases, nVidia isn't waiting until AMD becomes a threat before offering a better value.

Just beating AMD is no longer enough. Nvidia must see the new consoles as a real threat and is now pushing the mid-range (which should be close to the new console performance) at a much better value.

Now that the Series S was revealed, maybe we will see some value punch in the budget lineup.

I don't see how consoles are a threat to dGPUs. Seems like a meme to me. PC gamers are PC gamers. Console gamers are console gamers. There's some crossover, but it's not significant if the dGPU sales numbers are true. I suspect they are.
 
You use one game and then conclude that is the performance difference? This is 23 games at 1440p. With the 1080 Ti as the 100% baseline, the 980 Ti sits at 57%; inverting that, 1/0.57 = 1.75, so the 1080 Ti is a factor of 1.75, i.e. 75%, faster overall using those 23 games as data points. Ampere performance is still not yet fully known, but it does not appear to be anywhere close to that kind of generational increase.


Yeah....uhm.. what is the problem?

The "use one game and conclude" gripe is misplaced. Are you even following the thread?

So, first off:
1080 Ti vs. 980 Ti is not the same comparison as the vanilla 3080 vs. the 2080 Ti.

Compare the 3080 to the non-Ti 2080, or perhaps the 3090 to the 2080 Ti.

Secondly:
this whole conversation was driven by a supposed leak that, even if it was true, had hardly any games in it to compare.

So yeah, basically, considering the conversation around the probably fake leak and the tiny sample of games, there is nothing wrong with someone posting a single game from Maxwell to Pascal. It was a hit against how people were drawing conclusions based on the leaked results.
 
Gamers are notoriously bad at math. When comparing 67 fps to 100 fps, the former can be framed as '33% slower' or the latter as '50% faster'. Video charts (garbage tier) just framed it in the less impressive way.

The 2080 is only slightly faster than the 2070S. So when comparing the 2080 to the 3080, the 3080 would be around 80% faster in Tomb Raider and Far Cry in non-CPU-limited scenarios.

How is anyone complaining about this??
Actually it's rather simple math: to get from 67 fps to 100 fps takes 67 x 1.49 = 100 (~). You're looking at the performance increase from the previous GPU, which in your example was 67 fps, to the new-generation GPU; to take that 67 fps to 100 fps, it has to have 49% better performance, a factor of 1.49, to get there. The gist is that you're working out how much faster the new is than the old, and the OLD is the baseline GPU against which you measure how much performance it takes to get to the new.

You don't take the new GPU's performance and subtract the old GPU's to get a percentage, especially when working in percentages, since each card's percent is relative to its own performance and does not translate to the other GPU's. 1% of the 67 fps GPU's performance is different from 1% of the 100 fps GPU's. Yes, I agree people continue to mess this up, but what is really sad is that companies will use that misunderstanding to basically state plain lies. Why is the 2080 being compared to the 3080 anyways? To inflate the numbers; it should be the 2080 Super, but it doesn't matter -> real testing will hopefully be released shortly.
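A quick sketch of the two framings, using just the 67/100 fps numbers from above:

```python
# 67 fps vs 100 fps: the baseline you divide by changes the percentage,
# which is why "33% slower" and "~49% faster" describe the same gap.
old_fps, new_fps = 67.0, 100.0

faster = (new_fps / old_fps - 1.0) * 100.0   # new card vs old baseline
slower = (1.0 - old_fps / new_fps) * 100.0   # old card vs new baseline

print(f"new card is {faster:.0f}% faster")   # ~49% faster
print(f"old card is {slower:.0f}% slower")   # 33% slower
```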
 
Why is the 2080 being compared to the 3080 anyways? To inflate the numbers; it should be the 2080 Super, but it doesn't matter -> real testing will hopefully be released shortly.

Because the 2080 was the original 2nd tier Turing card, just like how the 3080 is the original 2nd tier Ampere part.

If there is a 3080 Super refresh next year, one would compare that to a 2080 Super.

Let's not fall into the trap of comparing a 3070 to a 2070 Super either. That was basically a 2080, and the 3070 should rightfully be compared to a 2070.
 
Because the 2080 was the original 2nd tier Turing card, just like how the 3080 is the original 2nd tier Ampere part.

If there is a 3080 Super refresh next year, one would compare that to a 2080 Super.

Let's not fall into the trap of comparing a 3070 to a 2070 Super either. That was basically a 2080, and the 3070 should rightfully be compared to a 2070.
Seriously? The card Nvidia has been selling in the exact same price category for the last year is not the baseline, but one that has not been made for over a year should be??? Why not just compare it to the 1080 Ti then; it would be about the same performance difference. Anyway, it's not too important and I think we are wasting time on that aspect. All it does is inflate the numbers a little. It will be AMD vs. Nvidia's new generation of cards that really matters, with some comparisons to previous generations for perspective for people to use to decide.
 
Seriously? The card Nvidia has been selling in the exact same price category for the last year is not the baseline, but one that has not been made for over a year should be???

And what if Nvidia had released a refresh of the 980 in the form of a 980 Super with more cores, faster memory, or whatever at the same $550? It would have made the 1080 look less impressive.

The point is that Ampere shouldn't be 'punished' just because Nvidia gave some better values on the refresh cards for the 20 series.
 
All it does is inflate the numbers a little. It will be AMD vs. Nvidia's new generation of cards that really matters, with some comparisons to previous generations for perspective for people to use to decide.

Many are just sick of AMD's hype and childish 'leaks'. Steve at Gamer's Nexus explains this well in his latest video.

"Poor volta"
:whistle:
"The sun will come out tomorrow"

Good grief.
 
Many are just sick of AMD's hype and childish 'leaks'. Steve at Gamer's Nexus explains this well in his latest video.

"Poor volta"
:whistle:
"The sun will come out tomorrow"

Good grief.
Maybe folks should just ignore tweets if they get so upset over nothing. The rest is past junk; that AMD marketing department has been virtually fired, so hopefully they do better this round. Agree they need to lay off Reddit. There was some common-sense reasoning done by Tech Jesus there.
 
Maybe folks should just ignore tweets if they get so upset over nothing. The rest is past junk; that AMD marketing department has been virtually fired, so hopefully they do better this round. Agree they need to lay off Reddit. There was some common-sense reasoning done by Tech Jesus there.

Ha ha. That's a good one! You funny.

On what planet do people ignore or stay away from things that might bother them?

This modern age is all about the endless search for something to be upset and offended over. Well beyond just going out of their way; for many, it seems to have become their entire life's mission.

You can't be interfering with that!
 
Seriously? The card Nvidia has been selling in the exact same price category for the last year is not the baseline, but one that has not been made for over a year should be??? Why not just compare it to the 1080 Ti then; it would be about the same performance difference. Anyway, it's not too important and I think we are wasting time on that aspect. All it does is inflate the numbers a little. It will be AMD vs. Nvidia's new generation of cards that really matters, with some comparisons to previous generations for perspective for people to use to decide.

Well, when speaking in terms of generational advancement, you absolutely should keep the refreshes out. It's a rather specific discussion when comparing launches of a new generation to those of past generations.

You seem to want to talk about this new generation versus what's available on the market. There is nothing wrong with that conversation; it's totally worthwhile and great to talk about, it is just not at all the same thing.

The market doesn't sit still, it's always changing. Without a doubt, Ampere will have refreshes, and considering this is a whole new node with Samsung, we could see some huge improvements as the process matures.
 
Well, when speaking in terms of generational advancement, you absolutely should keep the refreshes out. It's a rather specific discussion when comparing launches of a new generation to those of past generations.

You seem to want to talk about this new generation versus what's available on the market. There is nothing wrong with that conversation; it's totally worthwhile and great to talk about, it is just not at all the same thing.

The market doesn't sit still, it's always changing. Without a doubt, Ampere will have refreshes, and considering this is a whole new node with Samsung, we could see some huge improvements as the process matures.
I should not have mentioned it at all; my mistake, since it is really unimportant. Thanks.
 
I wonder if 10 GB of memory would be enough for a Pimax 8KX, especially if used for Microsoft Flight Simulator 2020. I heard that game uses more than 10 GB on one 4K screen. I have an Oculus CV1 right now but I want to have the ability to upgrade to a Pimax 8KX or something similar in the future. I don't think 10 GB is going to cut it, but I'm hesitant to pay more than double for a 3090 when a 20 GB 3080/3080 Ti could be around the corner.
 
I wonder if 10 GB of memory would be enough for a Pimax 8KX, especially if used for Microsoft Flight Simulator 2020. I heard that game uses more than 10 GB on one 4K screen. I have an Oculus CV1 right now but I want to have the ability to upgrade to a Pimax 8KX or something similar in the future. I don't think 10 GB is going to cut it, but I'm hesitant to pay more than double for a 3090 when a 20 GB 3080/3080 Ti could be around the corner.

FS 2020 sucks up memory like no tomorrow at 4K and ultra settings, around 10 GB without dense terrain. I'd definitely be looking for a card with >10 GB. Wait for either the 20 GB 3080 or one of the Big Navi cards with 16 GB.
 
FS 2020 sucks up memory like no tomorrow at 4K and ultra settings, around 10 GB without dense terrain. I'd definitely be looking for a card with >10 GB. Wait for either the 20 GB 3080 or one of the Big Navi cards with 16 GB.

From experience, you don't want to get burnt buying the card version with too little memory, only to have to buy the bigger-memory version a month or two later.
 
I received some flak recently for merely linking a table made in a topic here which looked at VRAM 'usage' in modern games. The counter-argument was that true usage (not merely what is allocated/reserved) is lower, and thus 10 GB of VRAM is more than enough even for the most demanding titles.

That said, no one I've seen making this counterpoint has explained how true usage can be measured, apart from one game which supposedly distinguishes it: Flight Sim 2020. Even Steve from Gamers Nexus mentioned that GPU-Z's reported VRAM 'usage' isn't representative of the real usage, which makes it confusing, tbh.
 