Is there any evidence Nvidia cripples performance on old GPUs?

biggles

I have read a few articles on this that said no. But this week I was trying to get Forza 6 Apex running without hiccups and micro-stutter on my main rig (see sig). I eventually determined it would play smoothly at 1080p, but only with AA disabled and vsync off. Aliasing is pretty ugly on a 27" 1080p monitor, and the tearing is noticeable.

Next I tried Forza on the laptop (Core i7 6820K, 32 GB RAM, Nvidia GTX 970M 3 GB). Set to high and dynamic, it ran great. The 970M is a newer chipset (Maxwell vs. Kepler) but a weaker GPU than the GTX 780. Also, this is an MSI laptop with G-Sync, so the tearing was completely gone. In short, a superior gameplay experience on the less powerful laptop.
 
Same driver version?

I ask only because the latest drivers (the ones they put out for Mordor) completely cripple my FPS in everything, and it's like G-Sync isn't even on when it is.

Off the top of my head, a driver rollback or clean install would probably be the first thing to try, but I'm not the best when it comes to diagnosing problems. No patience for it. :p
 
No, there is no evidence. Nvidia doesn't seem to cripple performance on old cards so much as they simply don't spend the same time and effort optimizing them as they do their current cards. You also don't get "magical" extra performance out of the cards a few months after launch like you do from AMD cards.
 


Someone would have to show that the same game, on the same card, decreases in performance over time in order to prove this.
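For what that test would look like in practice: a rough sketch along these lines, run on one card and one benchmark scene under each driver version. The log file names and driver numbers are placeholders, and it assumes frametime logs (one milliseconds value per line) captured with something like FRAPS or PresentMon.

Code:
# Same game, same card, same scene -- only the driver version changes.
# File names and driver versions below are made up for illustration.

def load_frametimes(path):
    # one frametime in milliseconds per line; skip blanks
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def avg_fps(frametimes_ms):
    # average FPS over the run = frames / total seconds
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

for driver, log in [("344.11 (launch era)", "gtx780_344.11.log"),
                    ("375.70 (current)", "gtx780_375.70.log")]:
    print(f"{driver}: {avg_fps(load_frametimes(log)):.1f} avg FPS")

If the number drops meaningfully on the newer driver across repeated runs, that would be actual evidence; otherwise it's still anecdote.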
 
I would chalk all of that up to driver versions. Just because a driver is newer doesn't mean it's better, and that's especially true for older hardware.
Personally, when I have a driver version that works, I don't usually upgrade unless I have a reason to. Others like to upgrade their drivers as soon as they come out and have good luck, so I dunno.
 
Dunno about evidence, but anecdotally it totally seems so. GeForce Experience definitely nerfs "recommended" settings well below actual capacity on old hardware.
 
Dunno about evidence, but anecdotally it totally seems so. GeForce Experience definitely nerfs "recommended" settings well below actual capacity on old hardware.

That isn't a nerf, not remotely. You aren't forced to use those settings. The recommended GFE settings are almost always below what the hardware is capable of.
 
If you were Nvidia or AMD, would you want a 3+ year old GPU to perform well enough that people would not want to buy a newer one?
 
No, but going out of your way to actually hamper the performance of older cards (like making them run worse on a newer driver version than an older one) would be both:

1. A stupid PR loss, since everyone would hate that company for it.
2. Fruitless. It's not like older versions of a driver magically disappear when a new version comes out; not even the Illuminati has THAT kind of pull, let alone a company like nVidia or Intel, which aren't even on the same level as, say, Google.
 
Well, at this point it is anecdotal. The articles on this do not show older cards performing worse, and this is the first case I have uncovered. It may also be that Forza 6 simply was not optimized for Kepler but is optimized for Maxwell. Just a guess, though.
 
That isn't a nerf, not remotely. You aren't forced to use those settings. The recommended GFE settings are almost always below what the hardware is capable of.
Don't be a "person of different abilities"; you get what I mean.
 
I have certainly experienced this too. It's simply the driver versions. They don't optimize for the older generations.
 
No, we actually have evidence to the contrary. NVIDIA simply doesn't continue to optimize older hardware after it has been superseded by newer hardware.
 
That isn't a nerf, not remotely. You aren't forced to use those settings. The recommended GFE settings are almost always below what the hardware is capable of.

It's also about what people value. For example, I find that GFE is far too aggressive on settings. It'll say I can turn AA up to levels that I find unacceptable. Why? Because I demand solid 60 fps performance, not a 60 fps average; I basically never want to drop below it at any time. That's the standard I like and what I optimize for. Other people are the opposite: they can deal with sub-30 fps so long as everything is pretty. They'll max graphics right up to the line of a game becoming unplayable.

nVidia's recommendations are what they feel is the best experience. People will disagree with them and that's ok.
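For what it's worth, the difference between those two standards is easy to see in a frametime log. A minimal sketch with invented numbers:

Code:
# 57 smooth frames at ~15 ms each, plus three 50 ms hitches (made-up data).
frametimes_ms = [15.0] * 57 + [50.0] * 3

avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
floor_fps = 1000.0 / max(frametimes_ms)

print(f"average fps:     {avg_fps:.1f}")    # ~59.7 -- looks like "60 fps"
print(f"worst-frame fps: {floor_fps:.1f}")  # 20.0 -- the hitches you feel

An "average 60" run can still stutter hard; the "solid 60" standard is really a constraint on the worst frame, not the mean.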
 
It's also about what people value. For example, I find that GFE is far too aggressive on settings. It'll say I can turn AA up to levels that I find unacceptable. Why? Because I demand solid 60 fps performance, not a 60 fps average; I basically never want to drop below it at any time. That's the standard I like and what I optimize for. Other people are the opposite: they can deal with sub-30 fps so long as everything is pretty. They'll max graphics right up to the line of a game becoming unplayable.

nVidia's recommendations are what they feel is the best experience. People will disagree with them and that's ok.

Yep. GFE can be a good starting point, but it's rarely a one-size-fits-all solution.
 
Yep. GFE can be a good starting point, but it's rarely a one-size-fits-all solution.

I think it is mostly designed for people who don't know what they are doing. You like games but aren't a computer person? OK, click here and we'll give you a good experience. For those of us who are fanatics, well, nVidia often publishes great guides on their site that cover each and every setting, what its visual impact is, and what its performance impact is.
 
Does GFE work with games from the Windows Store? I was under the impression that it will not.
 
Dunno about evidence, but anecdotally it totally seems so. GeForce Experience definitely nerfs "recommended" settings well below actual capacity on old hardware.

Anecdotal, but GeForce Experience made my 980 Ti SC run SC2 at 4K ultra (which obviously felt slightly off, which is why I noticed) vs. my friend's 1070, which ran at 1080p ultra. Keep in mind I only have a 1080p monitor right now, which makes this setting even weirder.
 
Anecdotal, but GeForce Experience made my 980 Ti SC run SC2 at 4K ultra (which obviously felt slightly off, which is why I noticed) vs. my friend's 1070, which ran at 1080p ultra. Keep in mind I only have a 1080p monitor right now, which makes this setting even weirder.
If you have DSR enabled in the control panel, it will use those resolutions when its gameplay profiling shows good results. You can change the resolution it optimizes for by changing that game's settings.
 
Having spent enough hours troubleshooting this, I am not going to test the following theory: maybe the performance difference is due to vsync and/or anti-aliasing being handled in-game on the desktop but in the Nvidia software on the laptop. Perhaps having the Nvidia software control those on the desktop would have fixed it. That is the frustrating part of PC gaming: you are constantly testing different combinations of settings to get a game to run right, instead of spending that time enjoying the game.
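And the combinations pile up fast. A toy illustration (the option lists here are hypothetical):

Code:
from itertools import product

# Hypothetical option lists -- the point is how quickly the test matrix grows.
vsync = ["off", "in-game", "driver"]
aa = ["off", "FXAA", "2x MSAA", "4x MSAA"]
detail = ["medium", "high", "ultra"]

combos = list(product(vsync, aa, detail))
print(f"{len(combos)} combinations to test")  # 3 * 4 * 3 = 36
for v, a, d in combos[:3]:  # first few, just to show the shape
    print(f"vsync={v}, aa={a}, detail={d}")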
 
Aha: RAM upgrade from 8 GB to 16 GB this week, and the game runs just fine now. I'd suggest anybody playing this NOT play with the suggested minimum of 8 GB of RAM; there's lots of stutter otherwise.
 
Nice necro thread...

But I should add:

Nvidia does not cripple old cards per se; rather, they simply stop all performance optimisation for a series of cards the exact second a new series is released. AMD, on the other hand, keeps updating and optimising cards even when new ones come out. This is why the AMD 7970 is faster than a 780 Ti in many newer games, even though it gets left in the dust by a 680 in older titles...

Edit: my friend and I have a running joke that the shift in Nvidia's driver focus is so immediate that a conversation with a customer service representative on the day a new series of GeForce cards is released might go a bit like:

"Oh that is no good, your GTX 1080 should be performing much better, make sure you download the recent driver, and send me a DXDIAG of your setup so I can work out the..." *that exact second a new series of cards are announced by Jen Hsun at a conference* "... I'm sorry sir we can't promise newer games are going to run well on legacy hardware, we suggest upgrading to the new, modern series. "
 
I can't stand the typical settings GeForce Experience defaults to. It's usually nothing but jacked-up details and framerates in the 30s or 40s.
There should be some sort of setting where you can say "I want 30/60/120 fps no matter what" and roll with that.
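Something like the sketch below is what I mean: benchmark each preset once, then have the tool pick the best-looking preset whose worst-case framerate still clears your target. (The preset names and numbers are made up, and as far as I know GFE doesn't expose anything like this today.)

Code:
# Made-up per-preset benchmark results: (name, minimum fps observed),
# ordered best-looking first.
PRESETS = [
    ("Ultra", 34),
    ("High", 52),
    ("Medium", 71),
    ("Low", 103),
]

def pick_preset(target_fps):
    # highest-quality preset whose minimum fps stays at or above the target
    for name, min_fps in PRESETS:
        if min_fps >= target_fps:
            return name
    return PRESETS[-1][0]  # nothing qualifies: fall back to the lowest

print(pick_preset(60))  # -> Medium
print(pick_preset(30))  # -> Ultra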
 