RTX 3xxx performance speculation

it's like people have forgotten how to adjust settings for their rig.

I've been feeling that for a number of years now. The "problems" people complain about absolutely amaze me. Apparently these days it's all native-res at the highest quality settings or bust. I don't get it. Maybe because I'm a kid from the 80s, who started gaming in the 90s, and back then the main goal was to render things smoothly enough to be usable. That often meant 320x240 at sub-30fps and it was glorious. When I hear people whine about 60fps 1440p... yeah, cry me a river.
 

Probably a fair amount of gear lust and peer pressure involved as well. Technology has indeed changed a lot and everybody wants the best they can afford (or can't afford).

Many claim Q2 RTX is unplayable at 60fps. Wonder what they think it was originally played at?
 
All of the RT 3XXX will run so fast I will have to turn off the turbo button on my case just to slow the games down enough to make them playable. Just like the old days! ;)


In reality, 15-20% improvement, same pricing as current gen.

1/3 will buy them, 1/3 won't and will hold out for the next gen which will be super awesome you don't even know, and 1/3 will buy AMD then try to explain it's just as good, they aren't giving in to NV corporate greed, and the NV killer is around the corner.
 
3080/3080ti are going to be beast cards. Why? Cuz I say so, j/k. The 7nm die shrink from Nvidia will not disappoint; expect the same perf boost as going from the 980 to the 1080.
 

The native resolution issue is in large part due to the shift to LCDs and how poorly they scaled, which conditioned an entire generation into having to play at native res. More solutions have appeared recently, from separate internal rendering resolutions built into games to better scaling and post-sharpening options, which might change how people think about this as we move towards the transition to 4K.

I would also say frame rates mattered more with the shift to LCDs, due to people wanting to claw back the loss in motion clarity.

In terms of the max settings issue, I feel that was in large part due to the Xbox 360/PS3 console generation. The shift towards multiplatform, console-focused development, combined with the fact that the hardware in those consoles was eclipsed shortly into their life cycle even for mainstream buyers, basically conditioned people to treat console+ and really max settings as the standard for how games would run. This will be interesting going forward, in that the general upward movement in hardware has slowed and the static platform of the next-gen consoles will stay stronger for longer in terms of raw hardware.
 
LOL at "having" to play at native res. Your explanations are fine and nothing we didn't already know, but I assure you a QHD panel set at 1080p looks better than any CRT at 320x240. Further, neither of them are "unplayable" like some would complain. My point was about being spoiled with what we have, not about not understanding why some cry about what I consider to be luxury. Seriously, anyone who can games at 1080p60 should feel accomplished AF. But maybe I'm just getting old.

Not achieving 4k or QHD at 144hz is not a "problem", it's being spoiled. So is "having" to play at native res, or at absolute high settings. Pure idiocy. You buy a good value card, and as it gets old you make a few compromises until you buy the next one. And you certainly don't upgrade every year because the value proposition is stupid. This last part is just my own life philosophy though, not a perspective on how things are these days. People overpay constantly for little to no gain... And before someone tells me people will spend on what makes them happy... I can tell you you'll be happier if you put that money in a retirement account and see your money grow with compound interest for when you're old and have all the time in the world to buy that GeForce RTX 35,080 TI and play the newly released Half Life 3 in 2049.
 
Well, until we start getting into significantly higher PPIs where 200%+ desktop scaling is necessary, there is an IQ boost to at least sending the native resolution to the monitor. Monitor scalers are usually the worst in the business, and video cards, as well as the games themselves with dynamic resolution options, do a much better job.
 
LOL at "having" to play at native res. Your explanations are fine and nothing we didn't already know, but I assure you a QHD panel set at 1080p looks better than any CRT at 320x240. Further, neither of them are "unplayable" like some would complain. My point was about being spoiled with what we have, not about not understanding why some cry about what I consider to be luxury. Seriously, anyone who can games at 1080p60 should feel accomplished AF. But maybe I'm just getting old.

Not achieving 4k or QHD at 144hz is not a "problem", it's being spoiled. So is "having" to play at native res, or at absolute high settings. Pure idiocy. You buy a good value card, and as it gets old you make a few compromises until you buy the next one. And you certainly don't upgrade every year because the value proposition is stupid. This last part is just my own life philosophy though, not a perspective on how things are these days. People overpay constantly for little to no gain... And before someone tells me people will spend on what makes them happy... I can tell you you'll be happier if you put that money in a retirement account and see your money grow with compound interest for when you're old and have all the time in the world to buy that GeForce RTX 35,080 TI and play the newly released Half Life 3 in 2049.
At least resolution scaling in games is becoming common and doesn't seem to have that ridiculous stigma that turning down quality settings does. Though personally, I'd rather run native resolution and turn some settings down.
 
LOL at "having" to play at native res. Your explanations are fine and nothing we didn't already know, but I assure you a QHD panel set at 1080p looks better than any CRT at 320x240. Further, neither of them are "unplayable" like some would complain. My point was about being spoiled with what we have, not about not understanding why some cry about what I consider to be luxury. Seriously, anyone who can games at 1080p60 should feel accomplished AF. But maybe I'm just getting old.

Not achieving 4k or QHD at 144hz is not a "problem", it's being spoiled. So is "having" to play at native res, or at absolute high settings. Pure idiocy. You buy a good value card, and as it gets old you make a few compromises until you buy the next one. And you certainly don't upgrade every year because the value proposition is stupid. This last part is just my own life philosophy though, not a perspective on how things are these days. People overpay constantly for little to no gain... And before someone tells me people will spend on what makes them happy... I can tell you you'll be happier if you put that money in a retirement account and see your money grow with compound interest for when you're old and have all the time in the world to buy that GeForce RTX 35,080 TI and play the newly released Half Life 3 in 2049.


How much being at native resolution matters depends heavily on the content. For smoother, more analog content like video, it's just about irrelevant. That doesn't mean low res will look as good as high res; it just means that 1800p upscaled to a 4K panel will still look better than 1440p native. Non-native really doesn't harm the image quality much at all.

Now if we are talking sharp-edged 2D computer-generated content, 1440p native will usually look better than 1800p on 4K, because very sharp, blocky content gets disrupted at non-native resolutions.

Modern 3D game rendering with crappy TAA anti-aliasing is so soft that it is closer to video than to 2D computer graphics, so you can often go non-native without much impact. It's the 2D overlaid HUD that jumps out as getting wrecked at non-native resolution.

Which is why every game should have some kind of resolution scaling for the 3D rendering built in. Render the 2D HUD overlay at native (costs nothing) and drop the 3D rendering resolution to boost performance; it often isn't that noticeable. In this scenario I bet that 1800p on 4K would still look better than 1440p native.
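For a rough sense of the savings, here's a back-of-envelope sketch (assuming standard 16:9 modes, with "1800p" taken as 3200x1800, and that shading cost scales roughly with pixel count):

```python
# Rough pixel-count comparison for the render resolutions discussed above.
# Assumes 16:9 modes; "1800p" is taken as 3200x1800.
resolutions = {
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "4K":    (3840, 2160),
}

native_4k_pixels = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    share = 100 * pixels / native_4k_pixels
    print(f"{name}: {pixels / 1e6:.2f} MPix ({share:.0f}% of native 4K shading work)")
```

So 1800p shades roughly 70% of the pixels of native 4K, while 1440p shades only about 45%, which is why a modest internal-resolution drop buys so much performance with the HUD still crisp at native.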
 
With the Radeon Nano and a 4K monitor, I played most of my games at 1800p or lower, then upscaled to 4K, which the drivers did automatically with great results. It was definitely better IQ than my 1440p monitor (a Korean IPS model, a Catleap, which was rather good in itself). I saw no issues with the monitor running at native resolution while the game was being rendered at a lower resolution and upscaled. I just find it hard to believe many others don't seem to know about this, have not tried it and so on. Now with Nvidia hardware (GTX 1070), scaling sucked pretty bad, so maybe it was an AMD advantage at the time.
 

Another factor with 4K monitors specifically (especially smallish ones), that most won't admit, is that it's overkill and hides a lot of issues. For some they are already at the point of needing 150-200% scaling in Windows to make it usable (or cranking font sizes in every application).
 

I use a 32", at 125% scaling and I would not want anything smaller. I do love the real estate for photo editing and such, and some games look fantastic when scaled to it or even native.
Currently my RTX2070 scales beautifully from 1440p to 4k.
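For anyone curious what those scaling percentages work out to, here's a quick sketch (assuming the effective Windows workspace is simply physical pixels divided by the scale factor):

```python
# Approximate "effective" desktop workspace of a 4K panel at common
# Windows scaling factors (physical pixels divided by the scale).
physical_w, physical_h = 3840, 2160
for scale in (1.00, 1.25, 1.50, 2.00):
    eff_w, eff_h = round(physical_w / scale), round(physical_h / scale)
    print(f"{int(scale * 100)}% scaling -> roughly {eff_w}x{eff_h} of UI real estate")
```

At 125% on a 32" panel you still get roughly a 3072x1728 workspace, which is why it doesn't feel cramped.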
 
Is there a statistic over 1080Ti vs 2080Ti sales?
I'm curious to read how many of each have been sold.

I doubt they've released numbers like that, but it's really irrelevant, because regardless of those numbers, if you're buying at these severely marked-up prices you're objectively telling them to keep or raise future prices and directly contributing to the price gouging.

So at some point, if you're considerate at all, you have to stop considering if you can personally afford it right now and consider voting with your wallet in order to at least keep pricing in check, because eventually it will get to a point where you either can't afford it or at least don't want to pay what they're asking for their next product (which is probably the case for most here that went for a 2080 or lower instead, as I did).

Anyways, to stay on topic: I'm predicting a relatively small 15-20% performance increase at the same or more expensive price points for the 2080Ti successor.
 
Steam survey is about as good as you'll get for those numbers. Last time I looked the 2080ti was barely a percent.

This says enough; 1080ti 1.6% 2080ti 0.6%
Just buy it!
Highlights the irrelevancy of halo stuff in real-world use.

Steam isn't very accurate though and has had issues classifying AMD cards for a long time. But it's the best we've got.
 
Hard to compare sales since launch though, considering the 1080ti has been out 18 months longer than the 2080ti. But I would assume flagship cards have mostly accounted for around 1% of Steam users in the past as well.
 
True, it could creep up a little. That said, sales will be tapering off as next gen approaches. I'd say you're mostly seeing people vote with their wallets.
The 2070 is about as popular as the 1080ti and the 2080 is almost at 1%.
When you move a pricing tier up like that, that tends to happen.
 
Above 1440p I think there are diminishing returns in fidelity, so I agree that running at, say, 1800p with image sharpening can give really good results on top of the increased performance. The real problem is the selection of 4K monitors on the market, especially if you want above 60 Hz refresh rates. Most are 27-28", and then there is a huge gap between 32" and 43", which would be the sweet spot for 4K where you would need less scaling but would still have sharp text.
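To put rough numbers on that sweet-spot argument, here's a quick PPI sketch for 4K at those sizes (nominal diagonals assumed):

```python
import math

# Rough pixels-per-inch of a 3840x2160 panel at the diagonal sizes mentioned.
diag_pixels = math.hypot(3840, 2160)
for inches in (27, 32, 43):
    print(f'{inches}": ~{diag_pixels / inches:.0f} PPI')
```

Roughly 163 PPI at 27", 138 PPI at 32" and 102 PPI at 43", which is why the 43" class needs far less scaling while text stays reasonably sharp.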
 
In Canada prices are far too high. Gonna wait to see the die shrink and improvements to ray tracing and get a 3080ti when it comes out. Have two 1080tis at the moment. Also Intel and AMD are coming with cards as well. Gonna be a busier year for GPUs and CPUs in 2020.
 
If you could run a game at 60fps in online multiplayer it gave you such a competitive advantage that graphics held almost no utility at that point other than draw distance.
 
Yeah. Some people would rather ignore the massive increase in die size (and thus cost) and pretend it's just big evil Nvidia raising prices for no reason at all.


Not to mention further ballooning costs because of a massive jump in the number of masks needed per chip at the foundry.

The GTX 1080 was 7.2 billion xtors on 16nm; the RTX 2080 is 13.6 billion xtors on 12nm, and that's using far more masks for Turing. The overall cost to produce chips is getting to be insane. Dr. Lisa Su's opening talk at Hot Chips 2019, on how TSMC 7nm is damn expensive, far more than GloFo's 14/12nm, was very eye-opening! I suggest the "I hate Ngreedia" crowd watch it before they next decide to further demonize either company.
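A quick sketch of what that jump means for die area, using the transistor counts above and the commonly quoted die sizes (GP104 roughly 314 mm^2, TU104 roughly 545 mm^2; treat those as approximate):

```python
# Transistor count, die area and resulting density for GTX 1080 vs RTX 2080.
chips = {
    "GTX 1080 (GP104, 16nm)": (7.2e9, 314),   # ~314 mm^2, commonly quoted figure
    "RTX 2080 (TU104, 12nm)": (13.6e9, 545),  # ~545 mm^2
}
for name, (xtors, area_mm2) in chips.items():
    density = xtors / 1e6 / area_mm2  # million transistors per mm^2
    print(f"{name}: {xtors / 1e9:.1f}B xtors, {area_mm2} mm^2, ~{density:.0f} MTr/mm^2")
```

The die grew roughly 1.7x in area while density barely moved, since 12nm is an optimized 16nm rather than a real shrink, and bigger dies with more mask layers is exactly where the extra cost comes from.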
 
No one has to demonize Nvidia, they do a fine job all by themselves.

AMD Navi 10 is 10.3 billion transistors (if both count transistors the same way) but also 251 mm^2 at 225 W. That's less than half the size of the 545 mm^2 TU104 chip at 250 W. Will Nvidia be able to get that kind of density increase and still cool the chip sufficiently to maintain high clock speeds? I see a lot of unknowns here, but Navi is looking rather dazzling in its ability to be cooled while still maintaining relatively high clock speeds, and despite a more expensive node AMD still beats Nvidia in perf/$. Nvidia will not be competing against Navi overall but against Navi 2, which should make for a very interesting 2020, that is, if Nvidia can get Ampere out.
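Working out the densities implied by those numbers (a quick sketch using only the figures in the post; the two vendors may well count transistors differently):

```python
# Transistor density and board power from the figures quoted above.
navi10 = {"xtors": 10.3e9, "area_mm2": 251, "watts": 225}
tu104  = {"xtors": 13.6e9, "area_mm2": 545, "watts": 250}

for name, chip in (("Navi 10 (7nm)", navi10), ("TU104 (12nm)", tu104)):
    density = chip["xtors"] / 1e6 / chip["area_mm2"]  # MTr/mm^2
    print(f"{name}: ~{density:.0f} MTr/mm^2 at {chip['watts']} W")

ratio = (navi10["xtors"] / navi10["area_mm2"]) / (tu104["xtors"] / tu104["area_mm2"])
print(f"Density ratio: ~{ratio:.1f}x")
```

That comes out to roughly 41 vs 25 MTr/mm^2, i.e. about 1.6x, which is the density figure the reply below pokes at.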
 
The days of massive transistor count increases (and resulting performance gains) are behind us.


I remember the mods on Beyond3D, some of them actual industry insiders, saying that back in the 8800 GTX days...

"Oh, we'll never see 3 billion... 5 billion... 10 billion xtor GPU chips!! Trust me, I work for so-and-so..." I used to hang on their every word; actual AMD/NV engineers on THE Beyond3D forum were saying this stuff.

ALL were wrong. NV's chief architect said publicly at Hot Chips 2019 (this fall), the day after Lisa gave her keynote, that he fully expects NV to stay with a fully monolithic chip design on the high end for 2 more generations!! Given that they plan out 5 to 7 years at a time, and he is privy to TSMC's 3nm EUV fab roadmap, I think he is telling us the truth.
 
Clarification: transistor counts will keep increasing. It's just that we won't see massive jumps in a single generation as in the past.
 
I have no reason to doubt a chief architect would say that for Nvidia, just like Intel's would have gone on about how successful 10nm was going to be. Things don't always turn out as well as you plan; plenty of people have seen their plans go up in smoke over the years.
 
A node that has 1.6x density fits more transistors per square millimeter?

 
You totally missed the point or I was very unclear. Will Nvidia be able to push 250w through half the area and still maintain clock speed? Or have double the area and twice the transistors and still be at 250w?

 
It's unclear why you think NVidia would have any more issues cooling 7nm GPUs than AMD does.
 
Yeah I don't get it either. Not sure what area has to do with anything. Especially since it's a very good bet that Nvidia's 7nm 5700xt competitor will require far less power.
 
I think Mangoseed's point is good. Area is very much key in keeping something cool enough that it will work. While density is potentially 1.5x higher on a smaller node, power consumption does not drop to 67%, more like 90%. Add in additional transistors for improved performance and I see a wall. Make the chip bigger to allow better heat transfer and yield probably becomes a much bigger issue. I still like MangoSeed's point: if Nvidia made a chip the same size as Navi 10, how would it perform compared to it? I just find 2020 looks like a very interesting year. As for Ampere, I think it will have to be pretty amazing from a design standpoint to be launched next year, and I am talking about big Ampere for gaming.
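In rough numbers, the power-density gap being described (a sketch using the die sizes and board powers quoted earlier in the thread):

```python
# Watts per mm^2 of die, using the figures quoted earlier in the thread.
parts = {
    "Navi 10 (251 mm^2, 225 W)": 225 / 251,
    "TU104  (545 mm^2, 250 W)": 250 / 545,
}
for name, w_per_mm2 in parts.items():
    print(f"{name}: ~{w_per_mm2:.2f} W/mm^2")

print(f"Ratio: ~{(225 / 251) / (250 / 545):.1f}x")
```

So Navi 10 is already pushing roughly twice the heat per square millimeter that TU104 does; a hypothetical 250 W part on a die half TU104's size would be in the same boat, which is the cooling question being raised here.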
 
Now think about this: AMD's 251 mm^2 Navi 10 draws 225 W, but then AMD increases efficiency significantly -> less power -> I see huge clock boost potential resulting. For the smaller nodes, where size really does matter for yield much more than usual, pushing clock speed to get more out of fewer transistors seems to be the way to go, with as much efficiency as possible. Navi 10's small area, high clock speed and yet rather high power shows a lot of potential if they can get the efficiency even better. I just see Ampere and RDNA 2 pushing clock speeds way up to get performance up while keeping dies as small as possible for reasonable yields.
 
It's going to be a GPU and CPU year, 2020. Intel is supposed to release discrete cards. AMD and Nvidia competition is good for us. Still holding off for the RTX 3080 Ti. Not impressed with the RTX 2080 Ti.
 
A lot of the current shortcomings of displays are tied directly to the inherent sacrifices in moving from CRT to LCD tech. We lost the instant response (no lag), and created ghosting, bleed, and a myriad of other limitations by adopting LCD and its brethren. While improvements have been made, and we are finally seeing some of the benefits from years of said reworks, it still just isn't as clean as running on even a low-end CRT. OLED and future technology will alleviate many of the hurdles we self-imposed, but honestly, playing a game at 85 fps or more on a CRT was like poetry in motion. It just flowed in a way that can't be described to a new generation of gamers, let alone sub-30fps console button mashers. There is a pinnacle; it was reached, lost to time, and will one day be reached again. Until then, pay your aging CRT repairman, who alone holds the lost art of reviving our ancient treasures, to fix up your FW-900 one last time. Eventually there will be no one and no parts left to bring her back.
 
I'm going to say that Nvidia needs to provide at least a 30% bump to make upgrading worthwhile.

I'd also like to see them dramatically increase Raytracing performance.

Right now the RTX performance loss is too great to make it worthwhile.

If they can enable RTX while still providing at least a 20% performance upgrade over 20xx series performance, then that could pique my interest in an upgrade.

Also, please have those updated outputs: HDMI, DisplayPort and PCIe 4.0!
 
I would go along with that performance-wise, but also priced at a more reasonable rate. HDMI 2.1, DisplayPort 2.0 and PCIe 4.x: I would say HDMI 2.1 is mandatory for the next high-end GPU, with the other two very close to mandatory as well.
 
They're too young to know the glory of a CRT where every resolution is native! :cool:

And no input lag, limited colour gamut, etc... :rolleyes:
 