GPU - diminishing returns reached

mlcarson
What's happening in the GPU market? My last purchase was a 1080 Ti FE back in 3/2017 for $699, and I'm not seeing any compelling reason to upgrade. It looks like the AMD alternative is the RX 5700 XT and Nvidia's is the RTX 2080 Ti. The AMD might match the 1080 Ti but still might be a bit behind, and the 2080 Ti can beat it but costs $1200+ for what I consider a modest performance gain. Are we in the same type of market that we were in with Intel CPUs in the years before Ryzen? No reason to upgrade until multiple generations of GPU improvements?
 
I saw it pretty much the same way; the 1080 Ti held up well. I ended up with the 5700 XT anyway due to much better support for my FreeSync 2 monitor. This coming generation may prove to be worthwhile, but for me it will have to come with a monitor upgrade, because at 1440p the 1080 Ti and 5700 XT work extremely well. Unless there are must-have games that need the newer hardware, with usable, worthwhile features, I can see many people just skipping yet another generation.
 
The RTX 2080 Ti is 20-50% faster than the GTX 1080 Ti, depending on the application/game. Whether that is worth the $1200+ price tag is subjective.

This. I had one 1080 Ti, then two 1080 Ti's, and when I upgraded to a single 2080 Ti it was night-and-day different. Games that were struggling to hit over 80 FPS were now hitting 100-140 FPS at 4K HDR. NVIDIA did make the 2080 Ti expensive, but to say that there are diminishing returns is untrue.
 
Next gen is in the pipe. Nvidia should be at 7nm, which should help, and AMD has both the PS5 and Xbox Series X driving their cause. I saw the same with the 980 Ti, where until the 1080 Ti came out there was no reason to upgrade. There are some cool enhancements with RTX like DLSS, but that's really just the tip of the iceberg IMHO.
 
The wildcard will be driver support. Nvidia drops support for older generations pretty much out of the gate, so you start to see a drop in performance on newer titles that is sharper than the hardware might suggest it should be. AMD has fared better, but only because they haven’t veered away from GCN for years. Now that AMD is moving away from GCN, we’ll probably see the same pattern.

That said, there is still a lot of horsepower on the higher end cards of last generation like the 1080Ti, so notwithstanding new features, they’ll still likely be playable for a while, even if not optimal.
 
What's happening in the GPU market? My last purchase was a 1080 Ti FE back in 3/2017 for $699, and I'm not seeing any compelling reason to upgrade. It looks like the AMD alternative is the RX 5700 XT and Nvidia's is the RTX 2080 Ti. The AMD might match the 1080 Ti but still might be a bit behind, and the 2080 Ti can beat it but costs $1200+ for what I consider a modest performance gain. Are we in the same type of market that we were in with Intel CPUs in the years before Ryzen? No reason to upgrade until multiple generations of GPU improvements?

Moore's Law died in like 2011. From December 2011 to April 2016 we were stuck on 28nm, rather than the typical two-year node cadence we had previously. So that was basically 4.5 years stuck on 28nm.

Nvidia is finally dropping a node later this year, in Q3 2020, which works out to around a 4.5-year cadence again, about the same stretch as 28nm. Sure, it felt quicker, because they kept renaming the node from 20nm to 16nm to 12nm, but the transistor size was essentially the same, so we were effectively on the same node the entire time.
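Rough math on those gaps, using the dates as stated above (the Q3 2020 entry is just a mid-quarter placeholder, since nothing has launched yet):

```python
from datetime import date

# Approximate node-transition dates, as stated in the post above.
# The Q3 2020 entry is a placeholder mid-quarter date, not a real launch.
transitions = [
    ("28nm era begins", date(2011, 12, 1)),
    ("16nm-class era begins", date(2016, 4, 1)),
    ("7nm-class expected (Q3 2020)", date(2020, 8, 15)),
]

for (name_a, date_a), (name_b, date_b) in zip(transitions, transitions[1:]):
    years = (date_b - date_a).days / 365.25
    print(f"{name_a} -> {name_b}: {years:.1f} years")

# Prints roughly 4.3 and 4.4 years, versus the ~2-year cadence we used
# to get before 28nm.
```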
 
The wildcard will be driver support. Nvidia drops support for older generations pretty much out of the gate, so you start to see a drop in performance on newer titles that is sharper than the hardware might suggest it should be.

How do you mean? It looks like Nvidia's latest drivers are still supporting cards that came out in 2012. Or do you mean they stop optimizing the drivers on specific games for older cards?
 
Yeah, the returns are there if you want to pay.

I have a 2080 Ti but honestly I'm not sure I will buy another card at these prices.
Just skip a gen. It's what I did with the 2000 series. I saw the performance gains and the "newness" of RTX as a tax. Not that I disagreed with the direction, I just didn't see the point yet. I got a used 1080 for $300. No complaints about the performance I'm getting; I got a good deal. I'm more inclined to purchase a 3000 now that more and more things are supporting the feature set, so I feel my money is better spent. But I'm gaming at 1440p, so I'm not eyeing the 3080 Ti. Though I might aim higher if I decide to get an Index...but I'm thinking I'll wait for the next revision of the Index.
 
How do you mean? It looks like Nvidia's latest drivers are still supporting cards that came out in 2012. Or do you mean they stop optimizing the drivers on specific games for older cards?

The latter. So, for example, you'll see a narrower gap in performance between an RTX 2060 and a GTX 1080 Ti on a newer title than you will on an older one.

Probably the best way to see how this evolves is to compare the performance of a 780 to an HD 7970 in newer titles. Since AMD's core architecture remains more or less the same to this day, owners of that card got significantly more legs out of it than 780 owners did, as Kepler's performance dropped like a stone in newer titles, to the point that there was a conspiracy theory floating around that Nvidia was intentionally nerfing its performance to incite upgrades.

“Support” was probably a bad choice of words on my part; “optimization” would be more accurate. The implication, of course, is that you might end up upgrading a GPU sooner than you'd expect based on how newer titles are optimized. For example, if you didn't care about RTX ray tracing and Nvidia kept optimizing the 1080 Ti for newer titles, you could probably get a lot more use out of the card than you otherwise would, simply because that card is a beast.
 
Yeah, the returns are there if you want to pay.

I have a 2080 Ti but honestly I'm not sure I will buy another card at these prices.

Agreed. Mine came in a prebuilt system. (I know, I know. It's my first ever.) If I had to buy it stand-alone I'd probably target the 2080 Super...
 
Moore's Law died in like 2011. From December 2011 to April 2016 we were stuck on 28nm, rather than the typical two-year node cadence we had previously. So that was basically 4.5 years stuck on 28nm.

Nvidia is finally dropping a node later this year, in Q3 2020, which works out to around a 4.5-year cadence again, about the same stretch as 28nm. Sure, it felt quicker, because they kept renaming the node from 20nm to 16nm to 12nm, but the transistor size was essentially the same, so we were effectively on the same node the entire time.

How did they get away with that?
 
Moore's Law died in like 2011. From December 2011 to April 2016 we were stuck on 28nm, rather than the typical two-year node cadence we had previously. So that was basically 4.5 years stuck on 28nm.

Nvidia is finally dropping a node later this year, in Q3 2020, which works out to around a 4.5-year cadence again, about the same stretch as 28nm. Sure, it felt quicker, because they kept renaming the node from 20nm to 16nm to 12nm, but the transistor size was essentially the same, so we were effectively on the same node the entire time.
TSMC's 12nm has a slightly tighter MMP (minimum metal pitch), increasing density by about 19.85% (from around 28.2 million transistors/mm² to 33.8). Power efficiency between 20nm and 16nm improved by 21%, even though the CPP (contacted poly pitch) and MMP were exactly the same.
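For anyone who wants to sanity-check that density figure, the arithmetic is just a ratio (numbers are the ones quoted above, not official TSMC datasheet values):

```python
# Density figures quoted above (millions of transistors per mm^2);
# these are the post's numbers, not official TSMC datasheet values.
density_16nm = 28.2
density_12nm = 33.8

gain = (density_12nm / density_16nm - 1) * 100
print(f"16nm -> 12nm density gain: {gain:.1f}%")   # ~19.9%, i.e. the ~19.85% above

# Equivalently, the same amount of logic needs about 17% less area:
area_ratio = density_16nm / density_12nm
print(f"Relative area for the same transistor count: {area_ratio:.2f}x")  # ~0.83x
```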
 
There is a compelling reason to upgrade for people on 1440p ultrawide or 4K 144Hz+ monitors. With that said, I think we are approaching a limit on the human eye's ability to appreciate higher resolutions in graphics, so hopefully ray tracing gets some strong development behind it.
 
Yeah, 4K is pretty nice but we are approaching a limit. For example, monitors have been getting larger, but there is a limit on what is comfortable on a desk (which is probably in the 30 - 40" range).

Similarly, 4K on a 40" screen looks great. I'm sure 8K is cool, but it can't look 4 times better, at least not given the same screen size / distance and the boundaries of human perception.
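To put a very rough number on that, here's a quick pixels-per-degree estimate for a 40" 16:9 panel. The ~36" viewing distance and the ~60 ppd figure commonly quoted for 20/20 acuity are my assumptions, not anything from this thread, so treat it as a ballpark:

```python
import math

# Rough pixels-per-degree (ppd) estimate for the 40" example above.
# Assumptions (mine): 16:9 panel, ~36" viewing distance, and ~60 ppd
# as the commonly quoted threshold for 20/20 visual acuity.

def pixels_per_degree(diagonal_in, horiz_res, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # width of a 16:9 panel
    ppi = horiz_res / width_in                       # horizontal pixels per inch
    # pixels covered by one degree of visual angle at this viewing distance
    return 2 * distance_in * math.tan(math.radians(0.5)) * ppi

for name, horiz_res in [("4K", 3840), ("8K", 7680)]:
    ppd = pixels_per_degree(diagonal_in=40, horiz_res=horiz_res, distance_in=36)
    print(f'{name} on a 40" panel at 36": {ppd:.0f} ppd')

# 4K already lands around ~69 ppd, past the ~60 ppd threshold;
# 8K roughly doubles that, which is why it can't look "4x better".
```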

I think it will be like audio. I mean, we have been using basically the same standards for decades, and while there have been improvements (like surround sound) we are still not that far away from when CDs came out in the 80's.
 
Yeah, 4K is pretty nice but we are approaching a limit. For example, monitors have been getting larger, but there is a limit on what is comfortable on a desk (which is probably in the 30 - 40" range).

Similarly, 4K on a 40" screen looks great. I'm sure 8K is cool, but it can't look 4 times better, at least not given the same screen size / distance and the boundaries of human perception.

I think it will be like audio. I mean, we have been using basically the same standards for decades, and while there have been improvements (like surround sound) we are still not that far away from when CDs came out in the 80's.

For me 32" is the perfect size, since you can see the full screen without panning and it's great for productivity as well. Ultimately, once we get to 32" HDR 4K 144Hz, I don't really see myself going past that. I know these are coming this year, but I'm not sure I can do the $3500 price tag; they should come down in price though. I personally would rather game at 4K 60fps than 1440p at 144Hz, but that's just me, so I need it in a 4K monitor.
 
I've actually been pretty happy with 1080p ultrawide. It seemed like a strange decision at first, but I really like high refresh rate (this monitor is 166Hz) and 4K was too slow.

I did like the look of 4K and HDR, and 60Hz was acceptable but not ideal. I agree that if they can make a 35" HDR 21:9 4K 144Hz monitor, I will be a happy man.
 
The newer GPUs are mainly for people who game at higher resolutions...if you're still rocking 1080p then no, there isn't a compelling reason to upgrade...180 fps vs 150 fps makes no difference...another reason to upgrade is ray tracing, which is about to blow up now that the next-gen consoles support it
 
What's happening in the GPU market? My last purchase was a 1080 Ti FE back in 3/2017 for $699, and I'm not seeing any compelling reason to upgrade. It looks like the AMD alternative is the RX 5700 XT and Nvidia's is the RTX 2080 Ti. The AMD might match the 1080 Ti but still might be a bit behind, and the 2080 Ti can beat it but costs $1200+ for what I consider a modest performance gain. Are we in the same type of market that we were in with Intel CPUs in the years before Ryzen? No reason to upgrade until multiple generations of GPU improvements?

Correct.

AMD is not competitive at the high end. Nvidia has no need to innovate since nobody can touch them. Prices are going up due to lack of competition.

Like you mentioned, this is exactly the same thing that happened to Intel from the release of Sandy Bridge all the way through to when AMD released Ryzen.

Hopefully Intel and AMD can release high-end GPUs this year to compete with Nvidia.
 
Correct.

AMD is not competitive at the high end. Nvidia has no need to innovate since nobody can touch them. Prices are going up due to lack of competition.

Like you mentioned, this is exactly the same thing that happened to Intel from the release of Sandy Bridge all the way through to when AMD released Ryzen.

Hopefully Intel and AMD can release high-end GPUs this year to compete with Nvidia.

Unlike Intel, NVIDIA has not rested too much on their laurels...they enjoy solitude at the high end, and their R&D keeps spreading into new uses (RTX Voice being the latest new way to use their GPUs).
You are sadly deluding yourself if you think NVIDIA has "no need to innovate"...AMD is very much playing catch-up to NVIDIA, like it or not.
 
What's happening in the GPU market? My last purchase was a 1080 Ti FE back in 3/2017 for $699, and I'm not seeing any compelling reason to upgrade. It looks like the AMD alternative is the RX 5700 XT and Nvidia's is the RTX 2080 Ti. The AMD might match the 1080 Ti but still might be a bit behind, and the 2080 Ti can beat it but costs $1200+ for what I consider a modest performance gain. Are we in the same type of market that we were in with Intel CPUs in the years before Ryzen? No reason to upgrade until multiple generations of GPU improvements?
Used 2080ti’s aren’t 1200 bucks.
 
I've actually been pretty happy with 1080p ultrawide. It seemed like a strange decision at first, but I really like high refresh rate (this monitor is 166Hz) and 4K was too slow.

I did like the look of 4K and HDR, and 60Hz was acceptable but not ideal. I agree that if they can make a 35" HDR 21:9 4K 144Hz monitor, I will be a happy man.

I agree. This is why I'm very happy with 3440x1440 on a 34'' ultrawide. I had a 29'' ultrawide @ 2560x1080 and it never had enough vertical pixels. I moved to a 27'' QHD @ 2560x1440 155Hz and knew then that 3440x1440 120Hz would be a nice sweet spot that isn't all the way to 4K.
 
How do you mean? It looks like Nvidia's latest drivers are still supporting cards that came out in 2012. Or do you mean they stop optimizing the drivers on specific games for older cards?

Here's a casual observation. At launch, the 1080Ti was essentially equal to the 2080 which was disappointing at best (I cancelled my 2080 pre-order when I saw reviews). Now, a 1080Ti can barely keep up with a 2070. Why is that? Nvidia wants you to buy a 2070 instead of a used 1080Ti and isn't putting in the effort to optimize for Pascal cards.
 
Correct.

AMD is not competitive at the high end. Nvidia has no need to innovate since nobody can touch them. Prices are going up due to lack of competition.

Like you mentioned, this is exactly the same thing that happened to Intel from the release of Sandy Bridge all the way through to when AMD released Ryzen.

Hopefully Intel and AMD can release high-end GPUs this year to compete with Nvidia.
I don't know how you can say that, considering all the new hardware and software tech NVIDIA has brought into the fold in the past 5 years, while AMD keeps using the same basic IC design and trying to make it work with the graphics demands of new games. You would think the underdog would be the one trying to innovate and stand out from the market leader, but it seems like AMD is the one resting on their laurels, perfectly happy with their position in the gaming-console and budget-to-midtier PC markets.
 
Still on my GTX 1080 for this; waiting on 3090, will upgrade to that. 1440P G-sync at 144hz is fine on a 1080 right now.
 
Here's a casual observation. At launch, the 1080Ti was essentially equal to the 2080 which was disappointing at best (I cancelled my 2080 pre-order when I saw reviews). Now, a 1080Ti can barely keep up with a 2070. Why is that? Nvidia wants you to buy a 2070 instead of a used 1080Ti and isn't putting in the effort to optimize for Pascal cards.

Alternative suggestion: optimization for new hardware takes time too, so they're finding more things they can do on the 2xxx series that they couldn't on the 1xxx, while the 1000 series has already had most of its optimizations. (Also, game devs have had more time to find tricks and tuning they couldn't manage before.) See: consoles, and how early games compare to later ones in terms of quality. Once you figure out tricks to make the hardware do more than it could before, you use them.

Now, I agree Nvidia probably isn't investing time in the 1xxx-series cards anymore, but there's more to that story too - multiple actors, if you will. :)
 
Here's a casual observation. At launch, the 1080Ti was essentially equal to the 2080 which was disappointing at best (I cancelled my 2080 pre-order when I saw reviews). Now, a 1080Ti can barely keep up with a 2070. Why is that? Nvidia wants you to buy a 2070 instead of a used 1080Ti and isn't putting in the effort to optimize for Pascal cards.

Stop the lies...HardOCP even did an article about how that was a lie.
The only ones still trying to use that lie are die-hard AMD fanboys.
I even think Kyle & Co. debunked the "Fine Wine" thing.

More fails from you?
 
Alternative suggestion: optimization for new hardware takes time too, so they're finding more things they can do on the 2xxx series that they couldn't on the 1xxx, while the 1000 series has already had most of its optimizations. (Also, game devs have had more time to find tricks and tuning they couldn't manage before.) See: consoles, and how early games compare to later ones in terms of quality. Once you figure out tricks to make the hardware do more than it could before, you use them.

Now, I agree Nvidia probably isn't investing time in the 1xxx-series cards anymore, but there's more to that story too - multiple actors, if you will. :)

Turing changed (once again) how many CUDA cores there are in an SM...but he doesn't care for facts, he is posting lies as an argument...that should give you a hint about the level of fuckery involved ;)
 
I personally think the generational performance increases we get are fine, since the 2080 Ti was 25-45% faster depending on the target Hz, plus RT, DLSS, etc. The price left something to be desired but was understandable given the die was ~50% larger.

What I would agree with is that the graphical fidelity of games have hit diminishing returns. For the first time I basically have no desire to upgrade. I'm selling my main rig's 2080ti because I expect something fancy with Ampere, but it's more for a lateral swap for funsies than a desire for more performance.
 
Stop the lies...HardOCP even did an article about how that was a lie.
The only ones still trying to use that lie are die-hard AMD fanboys.
I even think Kyle & Co. debunked the "Fine Wine" thing.

More fails from you?

I don't think you understand the fine wine concept (as in cards get better as they age with driver improvements, not worse). I'm not even claiming that's happening. I don't think they optimize drivers for new games on older cards.

Here's a reputable site review at the time of the 2080 launch compared to the 1080Ti. A whole whopping 1% faster at 1440p AND 4K.

1440p_2080_1080Ti.png

Here's a more recent review (2060 super) showing that the 1080Ti is now 11% behind the 2080 and even behind the 2070 Super:
relative-performance_2560-1440.png

You can claim fail, lies, and fuckery all you want, but show me some actual numbers.
 
I don't think you understand the fine wine concept (as in cards get better as they age with driver improvements, not worse). I'm not even claiming that's happening. I don't think they optimize drivers for new games on older cards.

Here's a reputable site review at the time of the 2080 launch compared to the 1080Ti. A whole whopping 1% faster at 1440p AND 4K.

View attachment 255860

Here's a more recent review (2060 super) showing that the 1080Ti is now 11% behind the 2080 and even behind the 2070 Super:
View attachment 255864

You can claim fail, lies, and fuckery all you want, but show me some actual numbers.

In Nvidia's defense, they've released multiple architectures in the time span that AMD has stagnated and kept polishing the same GCN turd. Most people don't stay at the same company for long, and I doubt there's a whole lot of institutional knowledge still around to tweak older architectures. It's unlikely to be malice and more likely just a lack of competence.
 
I don't think you understand the fine wine concept (as in cards get better as they age with driver improvements, not worse). I'm not even claiming that's happening. I don't think they optimize drivers for new games on older cards.

Here's a reputable site review at the time of the 2080 launch compared to the 1080Ti. A whole whopping 1% faster at 1440p AND 4K.

View attachment 255860

Here's a more recent review (2060 super) showing that the 1080Ti is now 11% behind the 2080 and even behind the 2070 Super:
View attachment 255864

You can claim fail, lies, and fuckery all you want, but show me some actual numbers.
You're reading the chart wrong.

The original 2080 review from TPU shows the 1080 Ti is (92 - 100)/100 = 8% slower.

The 2060 Super review shows the 1080 Ti is (113 - 122)/122 = 7.4% slower.

Think of it in terms of FPS. If we convert the 2060 Super's 100% to, say, 70 FPS:
  • 1080 Ti = 113% of a 2060 Super = 70 * 1.13 = 79.1 FPS
  • 2080 = 122% of a 2060 Super = 70 * 1.22 = 85.4 FPS
  • (79.1 - 85.4)/85.4 ≈ -7.4%
You can do the above calculation with literally any base number for the 2060 Super and come up with the same 7.4%, technically meaning the 1080 Ti got faster compared to the 2080 over time at 1440p.
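If you'd rather not eyeball the chart math, the same check takes a few lines (the percentages are the ones read off the two TPU charts above):

```python
# Relative-performance percentages read off the two TechPowerUp charts
# quoted above; both cards are expressed against the same baseline card.
def percent_vs(card_pct, reference_pct):
    """How far 'card' sits from 'reference' (negative = slower)."""
    return (card_pct - reference_pct) / reference_pct * 100

# 2080 launch review: 2080 = 100%, 1080 Ti = 92%
print(f"2080 launch review: 1080 Ti is {percent_vs(92, 100):+.1f}%")    # -8.0%

# 2060 Super review: 2060 Super = 100%, 1080 Ti = 113%, 2080 = 122%
print(f"2060 Super review:  1080 Ti is {percent_vs(113, 122):+.1f}%")   # -7.4%
```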
 
No, we haven't hit diminishing returns for GPUs. The GTX 1080 Ti just pushed out the reason to upgrade for many users for longer than they're used to. I normally upgrade my GPU every 1-2 years, but this 1080 Ti has been trucking along for 3 years without breaking a sweat. It's held up exceptionally well, both performance- and value-wise, especially compared to the next-generation cards that released at roughly the same price. The best part is I'd guess it still has another 1-2 years of running new games at 1440p 60+ FPS, especially if you don't care about ray tracing.
 
Welcome to why so many people were disappointed with Turing. It felt like a sidegrade, where you essentially got equivalent price/performance to Pascal while gaining RTX support and the much-improved NVENC encoder. To get a compelling upgrade over a Pascal part, you had to spend significantly more money. When the SUPER lines were released in response to Navi finally being announced, price to performance improved in the upper-midrange to high end, but nothing really changed for anyone who already had a 1080 Ti.

I will give credit to the 1660, as it was a decent buy when it came out, and the 1660 Super was even better. The lower-end to midrange card market had been weird for quite some time before that, as the cards to buy were basically just different versions of AMD's RX 470/480 with gradual price drops.
 
NVENC is pretty cool though; AFAIK streamer kids don't need capture cards with it.
 
I find it really funny when people invest in a high-end card to last a long time, yet complain when nothing comes out that makes them want to upgrade. :D:D:rolleyes::rolleyes:

OP, you did fine; your purchase lasted a whole generation. Now get ready for Ampere/Big Navi.
 
I stayed with an X58 system (Xeon X5660) and a 290X for a long time to bide my time. Going with the Ryzen 1000 series paid off, as it gave me a CPU to flash my older boards with, and I chose the RX 5700 for roughly the performance of two 290Xs at 150 watts for $329. Paired with a 65-watt Ryzen 3600, it's a sweet package.

It's free to clock itself up to 4.4GHz with auto overclocking under an AIO: https://valid.x86.fr/bench/imttyy/1

The board cost me $62 new, and the memory is just mixed leftovers.
 