GPU - diminishing returns reached

I don't think you understand the fine wine concept (as in cards get better as they age with driver improvements, not worse). I'm not even claiming that's happening. I don't think they optimize drivers for new games on older cards.

Here's a review from a reputable site at the time of the 2080 launch, comparing it to the 1080Ti. A whole whopping 1% faster at 1440p AND 4K.

[attachment: 2080 launch review benchmark chart vs. 1080Ti]

Here's a more recent review (2060 Super) showing that the 1080Ti is now 11% behind the 2080, and even behind the 2070 Super:
[attachment: 2060 Super review relative-performance chart]

You can claim fail, lies, and fuckery all you want, but show me some actual numbers.

All this shows is that Turing drivers matured over time.
 
I saw it pretty much the same way; the 1080Ti held up well. Ended up with the 5700XT anyway due to much better support for my FreeSync II monitor. This coming generation may prove to be worthwhile, but for me it will have to be a monitor upgrade, because at 1440p the 1080Ti and 5700XT work extremely well. Unless there are must-have games that need the newer hardware with usable, worthwhile features, I can see many just skipping yet another generation.
Heck, the resale on the 1080Ti is insane!
 
I stayed with an X58 X5660 and a 290X for a long time to bide my time, and going Ryzen 1000 series paid off, as it gave me a CPU to flash my older boards. I chose the RX 5700 because the performance of two 290Xs at 150 watts for $329, paired with a 65-watt Ryzen 3600, is a sweet package.

It's free to clock itself up to 4.4GHz with its auto overclocking under an AIO: https://valid.x86.fr/bench/imttyy/1

The board cost me $62 new, and the memory is just mixed leftovers.
That is a nice, quick, low-power-draw setup.
 
Here's a casual observation. At launch, the 1080Ti was essentially equal to the 2080 which was disappointing at best (I cancelled my 2080 pre-order when I saw reviews). Now, a 1080Ti can barely keep up with a 2070. Why is that? Nvidia wants you to buy a 2070 instead of a used 1080Ti and isn't putting in the effort to optimize for Pascal cards.

Probably because the 2070 has a different architecture than the 1080 Ti. You seem to think the biggest gains in software optimization come late in the product's lifecycle, but that makes no sense. I would expect larger gains in the first few driver iterations, then diminishing returns after that.
 
Probably because the 2070 has a different architecture than the 1080 Ti. You seem to think the biggest gains in software optimization come late in the product's lifecycle, but that makes no sense. I would expect larger gains in the first few driver iterations, then diminishing returns after that.

Indeed it does; Turing altered the SM/CUDA core ratio compared to Pascal... but he does not care for facts, FYI.
 
Probably because the 2070 has a different architecture than the 1080 Ti. You seem to think the biggest gains in software optimization come late in the product's lifecycle, but that makes no sense. I would expect larger gains in the first few driver iterations, then diminishing returns after that.

At this point we are at the end of both product life cycles, so diminishing returns all around? I think at best you would see a slight bump in performance over time with a newer card, not a complete shift down the product stack for the older cards in comparison to newer cards, unless there was little or no effort to maintain optimizations for older cards with newer game engines, etc.

I'm not sure exactly why Factum agrees with your post, because he tried to argue that there are no "fine wine" drivers and that the old HardOCP article proved it, but then when Nvidia increases performance over time, it's to be expected, and to disagree means that facts don't matter. I mean, I don't think anyone is disagreeing that they are different architectures.

To put it in perspective, essentially what people are saying in this thread is that if the 980Ti was the equivalent of the 1070 at launch, then by the end of the Pascal era the 980Ti would only be performing like a 1060 (i.e., the next card down the product stack) because of optimizations. That, of course, didn't happen, and the 980Ti is still a similar performer to the 1070. Likewise, I would expect the 1080Ti to always have similar performance to a 2080.
 
I've railed on about the diminished improvements and massive price upticks in GPUs (mainly propelled by nV, with AMD following suit) over the past 3-4 years, and the somewhat nefarious tactics taken to get there, so I will refrain here.

I will only add this anecdote as an example of how shit the market has been, and will likely continue to be... I had a local person offering to purchase a 2080 Super I was selling recently, and he was offering to pay within £40 of what I had bought the card for 5 months prior (card prices have jumped here in the UK recently). He was telling me he wanted to "upgrade" from his GTX 1080 Ti, and I actually convinced him he should not buy my card (or any other) in the current state of things. I couldn't have him go down the path I did over the past 4 years from the 1080 Ti to the RTX 2080 to Radeon VII to RTX 2080 -- all complete side grades and lost money on my part, the only added value I derived from them was keeping me entertained playing with new GPUs.
 
I've railed on about the diminished improvements and massive price upticks in GPUs (mainly propelled by nV, with AMD following suit) over the past 3-4 years, and the somewhat nefarious tactics taken to get there, so I will refrain here.

I will only add this anecdote as an example of how shit the market has been, and will likely continue to be... I had a local person offering to purchase a 2080 Super I was selling recently, and he was offering to pay within £40 of what I had bought the card for 5 months prior (card prices have jumped here in the UK recently). He was telling me he wanted to "upgrade" from his GTX 1080 Ti, and I actually convinced him he should not buy my card (or any other) in the current state of things. I couldn't have him go down the path I did over the past 4 years from the 1080 Ti to the RTX 2080 to Radeon VII to RTX 2080 -- all complete side grades and lost money on my part, the only added value I derived from them was keeping me entertained playing with new GPUs.

Should have just bit the bullet and started off with a 2080ti ;)
 
Should have just bit the bullet and started off with a 2080ti ;)

Hah, that's just about what my wife said. Every time I go to sell a piece of hardware because I'm not happy with its performance she tells me "PLEASE, next time just spend the extra money to get the better one so you're not constantly trading."

<and me on my high horse> "Hmmmfff! I'm not giving nVidia $1200 for a card that will last me 3+ years, when I can give them $1800 over 3 years for cards that are slower!!"
 
Hah, that's just about what my wife said. Every time I go to sell a piece of hardware because I'm not happy with its performance she tells me "PLEASE, next time just spend the extra money to get the better one so you're not constantly trading."

<and me on my high horse> "Hmmmfff! I'm not giving nVidia $1200 for a card that will last me 3+ years, when I can give them $1800 over 3 years for cards that are slower!!"

I got the same advice. I was doing it with trucks and cars. I kept trading every year or so because I didn't get what I wanted. Once I did, that nonsense stopped.
 
Hah, that's just about what my wife said. Every time I go to sell a piece of hardware because I'm not happy with its performance she tells me "PLEASE, next time just spend the extra money to get the better one so you're not constantly trading."

<and me on my high horse> "Hmmmfff! I'm not giving nVidia $1200 for a card that will last me 3+ years, when I can give them $1800 over 3 years for cards that are slower!!"

So true! Buy once, cry once! 😢
 
So true! Buy once, cry once! 😢
I never regret my purchases, nor have the urge to upgrade in the short term. I only upgraded from a GTX1070 to a GTX1070Ti after a few months because I practically got it for free. I skipped Turing. I figured 1st-gen RTX was going to be a mess, and I wasn't wrong. Now it's much better thanks to DLSS 2.0, so I figure Ampere will be my next upgrade. I might even go Turing if the price is right.
 
I don't think you understand the fine wine concept (as in cards get better as they age with driver improvements, not worse). I'm not even claiming that's happening. I don't think they optimize drivers for new games on older cards.

Here's a review from a reputable site at the time of the 2080 launch, comparing it to the 1080Ti. A whole whopping 1% faster at 1440p AND 4K.

[attachment: 2080 launch review benchmark chart vs. 1080Ti]

Here's a more recent review (2060 Super) showing that the 1080Ti is now 11% behind the 2080, and even behind the 2070 Super:
[attachment: 2060 Super review relative-performance chart]

You can claim fail, lies, and fuckery all you want, but show me some actual numbers.

Bad comparison. Apart from what Armenius said, which is the correct way to do the calculations, you can't compare relative performance from one site to another: the games are different, the settings are different, the game samples aren't the same size, and the whole testing machine is different. If you remove Rainbow Six Siege from the TechPowerUp chart, "overall" the 1080Ti would be faster.
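To make that concrete, here's a minimal sketch of how a single outlier title can swing an "overall" relative-performance figure. All the FPS numbers and the generic game names below are invented for illustration; they are not TechPowerUp's data, just an assumed scenario showing the arithmetic.

```python
# Minimal sketch: how one outlier game skews an "overall" relative-performance
# number. All FPS figures here are invented for illustration only.
from statistics import geometric_mean

# Average FPS per game as (1080 Ti, 2080) pairs -- hypothetical numbers.
results = {
    "Game A": (95, 97),
    "Game B": (88, 90),
    "Game C": (110, 108),
    "Rainbow Six Siege": (160, 210),  # pretend outlier favoring Turing
}

def relative_perf(data):
    """Geometric mean of per-game (2080 / 1080 Ti) FPS ratios."""
    return geometric_mean(b / a for a, b in data.values())

print(f"2080 vs 1080 Ti, all games:       {relative_perf(results):.1%}")

trimmed = {k: v for k, v in results.items() if k != "Rainbow Six Siege"}
print(f"2080 vs 1080 Ti, outlier removed: {relative_perf(trimmed):.1%}")
```

With the outlier included, the 2080 comes out roughly 8% ahead; drop that one game and the two cards are within about 1% of each other. Which is also why the mix of games a site tests matters as much as the driver version.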
 
I got a 1060 FE at launch ($250), upgraded to a 1070 about... 3 years ago ($200), then upgraded that to a 1080Ti last year ($400) when the price/performance on the 20XX series absolutely did not impress me. Was $400 too much? Perhaps, but it was better than $800 for a 2080 Super, which was the next step up in performance.

Silly? Yeah, but my income over those years increased considerably, and I'm also now trying to keep two systems gaming-capable instead of one. A very large chunk of these upgrades was due to me and my partner playing co-op Monster Hunter World.

Pretty much, my experience with the Founders Edition 1060, a Ryzen 1 system also bought at launch, and a first-gen NVMe drive has made me realize that I don't want to deal with the launch of completely new platforms again. If I don't let my computer lapse into an unplayable state between major upgrades, I won't be as tempted to get the new hotness before a few patches and price drops. I'll likely upgrade to a GeForce 3000 series (or Radeon 6000, or whatever it's called) 6 months after the product launch, after everything has been adequately tested.
 
Still on my GTX 1080 for this; waiting on the 3090, will upgrade to that. 1440p G-Sync at 144Hz is fine on a 1080 right now.


This. So far, on the games that I run anyway, my 1080Ti handles them just fine. The cost/benefit just isn't there so far. Even the latest Assassin's Creed ran without a hitch at 1440p @ 144Hz. I have a feeling that the 30x0 cards will be just as silly in pricing as the 20x0 series. I'd really like to see AMD step up into the high-end cards for some competition to drive prices down. Then again, I'm not sure that I need 4K gaming, which would be the driving factor for me to upgrade so drastically.
 
This. So far, on the games that I run anyway, my 1080Ti handles them just fine. The cost/benefit just isn't there so far. Even the latest Assassin's Creed ran without a hitch at 1440p @ 144Hz. I have a feeling that the 30x0 cards will be just as silly in pricing as the 20x0 series. I'd really like to see AMD step up into the high-end cards for some competition to drive prices down. Then again, I'm not sure that I need 4K gaming, which would be the driving factor for me to upgrade so drastically.

I just picked up a 2080Ti FE on a good deal from the forum here; my wife needs my 1080, so that was a good excuse. Still planning on the 3XXX though, as I want to WC it, and I'm not going to track down a block for a 2XXX this close to release.

I figure I can sell the 2080TI for $600 in a few months.
 
I just picked up a 2080Ti FE on a good deal from the forum here; my wife needs my 1080, so that was a good excuse. Still planning on the 3XXX though, as I want to WC it, and I'm not going to track down a block for a 2XXX this close to release.

I figure I can sell the 2080TI for $600 in a few months.

I did the same a few months back. I'd been planning since last year to wait for Ampere. By early this year, I was casually searching for 2080 Tis in case I found a deal... but I just couldn't pay the $1,100-$1,300 being asked for a year+ old GPU on the cusp of a new release. Then 'rona happened. I started to assume Ampere would be later than the May/June time frame I was hoping for. Oddly enough, it also seemed like my 1080 Ti was appreciating in value, I suspect due to people being stuck at home. Early this year they were selling in the high $300s. In late March I was able to sell it for $430 (and had a lot of offers... probably could have gotten more). Ended up scoring what I think was a good deal on a used 2080 Ti ($840 for a premium model; I could live with a $410 net to upgrade).

I purchased it with the intent to sell it again when Ampere came out, hoping it held some value. Now that I have it, I honestly don't know if I'll do that. It was a bigger upgrade than I expected... it got me over that 4K60 hump that the 1080 Ti just couldn't do anymore. While there will obviously come a time when the 2080 Ti will no longer be a 4K60 card... I question if it's going to matter if DLSS 2.0 becomes mainstream. My experience with it is limited (haven't needed it yet), but everything I've seen suggests the performance is real and the quality hit is negligible to non-existent.

CP2077 is going to be the tell, I think. I do not expect the 2080 Ti to be able to max it out at 4K60. But, if it can be done with DLSS 2.0 enabled, and really doesn't look any worse for it, that'll be enough for me to hang onto my 2080 Ti for a while.
 
We haven't hit diminishing returns. AMD's just been sucking cock and Nvidia's been dragging their feet because of it.

We're actually starving for a new GPU that can do games at 4k @ 120fps.

We're not even CLOSE to diminishing returns on any type of computer hardware.
 
I never regret my purchases, nor have the urge to upgrade in the short term. I only upgraded from a GTX1070 to a GTX1070Ti after a few months because I practically got it for free. I skipped Turing. I figured 1st-gen RTX was going to be a mess, and I wasn't wrong. Now it's much better thanks to DLSS 2.0, so I figure Ampere will be my next upgrade. I might even go Turing if the price is right.

I was a dummy and missed a deal to go from a 7700K to a 9900K for little to nothing out of pocket, with just a motherboard to sell.
 
We haven't hit diminishing returns. AMD's just been sucking cock and Nvidia's been dragging their feet because of it.

We're actually starving for a new GPU that can do games at 4k @ 120fps.

We're not even CLOSE to diminishing returns on any type of computer hardware.

Actually, NVIDIA read the writing on the wall years ago.
They realized that "Moore's Law" would lead to a point in the future where simple numbers of transistors mean "parity".
So they planned to expand their ecosystem, so they would have more features, making them an easy choice.
And it is starting to show... DLSS, RTX Voice, Ansel, GeForce Experience... and I suspect this is just the beginning.
NVIDIA is the GPU market leader, and unlike Intel on the CPU (process) side, they kept their eye on the ball.

And the price increase per node means the doubling of performance for the same price is a ship that has sailed long ago... some people are in for a rude awakening on the next process nodes 😉
 
Actually, stuff like DLSS is snake oil.

You can't cheat on resolution. DLSS looks like blurry shit. When the entire point is visual fidelity, tech like that is a complete joke.

Even stuff like AA is a hack job. There is no substitute for simply more pixels.
 
Actually, stuff like DLSS is snake oil.

You can't cheat on resolution. DLSS looks like blurry shit. When the entire point is visual fidelity, tech like that is a complete joke.

Even stuff like AA is a hack job. There is no substitute for simply more pixels.

You seem to have fallen behind the curve:


Show me your data that validates your claims? :)
 
So... I purchased a 2070 Super after having my RX 480 for many years; games needed more. I game at ultrawide 2K (non-144), and my RX 480 was dipping as games demanded more. It was noticeable in the game Squad on some of the larger maps filled with trees. I did not want to view the game in a lesser mode. The 2070S allows me to keep it maxed out and running in the high 90s to low 100s; buttery smooth! I was looking at going with the 5700 XT, but there were so many issues and I didn't want to wait any longer. While people state the 5700 XT is near or above the 2070 in some cases, I did not want to deal with the crap drivers and messing about; I just wanted it to work, and Nvidia delivered. The last time I had an Nvidia card was my GeForce2 MX 200, which was crap at the time. I hope AMD catches up, but I cannot deal with their driver issues any longer.
 
So... I purchased a 2070 Super after having my RX 480 for many years; games needed more. I game at ultrawide 2K (non-144), and my RX 480 was dipping as games demanded more. It was noticeable in the game Squad on some of the larger maps filled with trees. I did not want to view the game in a lesser mode. The 2070S allows me to keep it maxed out and running in the high 90s to low 100s; buttery smooth! I was looking at going with the 5700 XT, but there were so many issues and I didn't want to wait any longer. While people state the 5700 XT is near or above the 2070 in some cases, I did not want to deal with the crap drivers and messing about; I just wanted it to work, and Nvidia delivered. The last time I had an Nvidia card was my GeForce2 MX 200, which was crap at the time. I hope AMD catches up, but I cannot deal with their driver issues any longer.

I understand that not everyone can be pleased in this world, but I read all the AMD driver hate here and then go to Best Buy's website, where the XFX 5500 XT 8GB has 72 reviews from real owners, with 67 of 75 above 4 stars... so there is no catch-up to be made, really, according to those people.
 
Actually, stuff like DLSS is snake oil.

You can't cheat on resolution. DLSS looks like blurry shit. When the entire point is visual fidelity, tech like that is a complete joke.

Even stuff like AA is a hack job. There is no substitute for simply more pixels.
Well, now you know of DLSS 2.0, but I have to agree that it is still a form of snake oil. It is great for a game that supports it, and if you happen to want to buy that particular game, but... it is hardly used by anything. I play in VR and was really hoping that this feature would have blossomed into something that would be of great use by now. It isn't.
 
We haven't hit diminishing returns. AMD's just been sucking cock and Nvidia's been dragging their feet because of it.

We're actually starving for a new GPU that can do games at 4k @ 120fps.

We're not even CLOSE to diminishing returns on any type of computer hardware.
I think you misunderstand the point of this thread. They're talking about diminishing returns in terms of how much further they can push transistor counts in silicon, meaning they're reaching a point where it's becoming increasingly difficult to make a graphics card faster than it is now.
There is no such thing as a fast enough graphics card. That much is obvious. I can't drive triple 8K monitors at 240Hz/240fps, so obviously I haven't hit my performance target yet.
 
What's happening in the GPU market? My last purchase was a 1080Ti FE back in 3/2017 for $699, and I'm not seeing any compelling reason to upgrade. It looks like the AMD alternative is the RX 5700 XT and Nvidia's is the RTX 2080Ti. The AMD might match the 1080Ti but still might be a bit behind, and the 2080Ti can beat it but costs $1200+ for what I consider a modest performance gain. Are we in the same type of market that we were in with Intel CPUs in the years before Ryzen? No reason to upgrade until multiple generations of GPU improvements?

My 2080Ti allows me to play almost any game at 1440p with most of the eye candy on, without worrying about my minimum frames dropping below 60fps. That's something my old (Pascal) Titan X couldn't come close to doing. No way could I ever go back; the difference is night and day. That being said, I likely won't upgrade until I can get the same performance with a 4K monitor.
 
I think you misunderstand the point of this thread. They're talking about diminishing returns in terms of how much further they can push transistor counts in silicon, meaning they're reaching a point where it's becoming increasingly difficult to make a graphics card faster than it is now.
There is no such thing as a fast enough graphics card. That much is obvious. I can't drive triple 8K monitors at 240Hz/240fps, so obviously I haven't hit my performance target yet.

Given how massively parallel GPUs are, I don't get why they can't just make them bigger. It's not like CPUs, where additional cores don't necessarily get you anything. GPUs are parallel by nature. Who gives a shit if you can't keep packing in transistors at the same form factor? Just make them bigger and slap water cooling blocks on them.

[image: 3dfx Voodoo 5 6000 with 4-way SLI on a single board]
 
Given how massively parallel GPUs are, I don't get why they can't just make them bigger. It's not like CPUs, where additional cores don't necessarily get you anything. GPUs are parallel by nature. Who gives a shit if you can't keep packing in transistors at the same form factor? Just make them bigger and slap water cooling blocks on them.

[image: 3dfx Voodoo 5 6000 with 4-way SLI on a single board]

Because computing is cheap compared to moving data...reality is more complicated than you suggest.
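To put a rough number on one of those complications: below is a back-of-the-envelope sketch using the textbook Poisson defect-yield model (yield ≈ e^(-A·D0)). The defect density is an assumed, illustrative value, not any foundry's real figure; monolithic dies also can't exceed the lithography reticle limit (roughly 800 mm², about where the biggest GPUs already sit).

```python
# Back-of-the-envelope: why "just make the die bigger" gets expensive fast.
# Textbook Poisson yield model: fraction of defect-free dies = exp(-A * D0).
# D0 is an assumed, illustrative defect density, not real foundry data.
import math

D0 = 0.1  # defects per cm^2 (assumed)

def poisson_yield(area_cm2, d0=D0):
    """Probability that a die of the given area has zero defects."""
    return math.exp(-area_cm2 * d0)

for area in (2.0, 4.0, 8.0):  # die area in cm^2 (8 cm^2 is near the reticle limit)
    y = poisson_yield(area)
    # Relative cost per *good* die: area / yield (ignoring wafer-edge waste).
    print(f"{area:.0f} cm^2 die: yield ~ {y:.0%}, relative cost per good die ~ {area / y:.1f}")
```

Under those assumed numbers, quadrupling the die area roughly halves yield and makes each good die about seven times more expensive, before you even get to the problem of feeding it with memory bandwidth.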
 
Given how massively parallel GPUs are, I don't get why they can't just make them bigger. It's not like CPUs, where additional cores don't necessarily get you anything. GPUs are parallel by nature. Who gives a shit if you can't keep packing in transistors at the same form factor? Just make them bigger and slap water cooling blocks on them.

[image: 3dfx Voodoo 5 6000 with 4-way SLI on a single board]
Think what we will need for full holographic projection without goggles. I want that card now.
 
I think you misunderstand the point of this thread. They're talking about diminishing returns in terms of how much further they can push transistor counts in silicon, meaning they're reaching a point where it's becoming increasingly difficult to make a graphics card faster than it is now.
There is no such thing as a fast enough graphics card. That much is obvious. I can't drive triple 8K monitors at 240Hz/240fps, so obviously I haven't hit my performance target yet.

I just want something to play that doesn't suck.
Warzone is not my deal.
Tried Valorant and Hyper Scape, meh.
Didn't like Apex.
Siege sucks.

So right now it's a moot point; with 1-player games I go a week or 2, then I'm off them.
 
I just want something to play that doesn't suck.
Warzone is not my deal.
Tried Valorant and Hyper Scape, meh.
Didn't like Apex.
Siege sucks.

So right now it's a moot point; with 1-player games I go a week or 2, then I'm off them.
Cool. Head to the games subforum and complain about it, I guess.
I personally feel there is an absurd number of games to play. If you can't find something you like, that's on you, and/or maybe you're just not that into games.

Try and stay on topic. This subforum is about GPUs. And this particular thread is about diminishing returns from silicon, in particular regarding GPUs.
 
That could be a whole separate issue since they had problems at first.

I don't think that was isolated to the 2080Tis, but you're right, it was probably worse given the bigger die/more components, whatever that was.
 