Nvidia Nerfing 10-Series With New Drivers?

Brent_Justice

Moderator
Joined
Apr 17, 2000
Messages
17,755
Note: I am not saying they are, but this video brings up the question.

You knew this question would come up; it always does. The video is interesting, if a bit incomplete.

I will have to do my own testing. We are starting our full 2080 Ti review now, and I was planning to use the newest driver for the 10-series testing as well, to keep the driver version the same. So when I get to it I'll do a little comparison for my own sake, just to see; I'm really interested in the Shadow of the Tomb Raider outlier.

 
Brought to you by the company who charged 40% more for the replacement products...

40% more? What data points are you using? I imagine you are talking about the RTX cards, in which case realize that the 2080 is a significantly larger die than the 1080 Ti. That said, I personally think RTX was a mistake. Straight-up CUDA cores would have made more money, at least in the near term.

On topic, I am really curious about [H]’s findings. I often reference the work [H] did on “fine wine” for both nVidia and AMD.
 
Guess it is a good thing I finished Shadow of the Tomb Raider before the new drivers then?
 
40% more? What data points are you using? I imagine you are talking about the RTX cards, in which case realize that the 2080 is a significantly larger die than the 1080 Ti. That said, I personally think RTX was a mistake. Straight-up CUDA cores would have made more money, at least in the near term.

On topic, I am really curious about [H]’s findings. I often reference the work [H] did on “fine wine” for both nVidia and AMD.

We definitely do, and will, cover driver performance over time, from the launch driver to current drivers. It will be interesting to see how performance compares from the launch driver to, say, a driver a year from now.
 
This guy is just dumb.

His own tests show a drop-off of 3-8 fps in newer games. Shadow of the Tomb Raider lost a full 5-7 frames at 4K! That's a big drop at 4K, where every frame counts. I can't get above 44 fps in its own benchmark with a 1080 Ti, yet this guy shows over 50? Yes, certain games were better, but the ones most people are playing show a clear drop in performance. I hope more writers/vloggers test this and call them out for it.

Yeah, they nerfed it. I just realized I am on 411.70 with my 44 fps. Going to drop back to the 399 driver to see what I get in SotTR.
 
40% more? What data points are you using? I imagine you are talking about the RTX cards, in which case realize that the 2080 is a significantly larger die than the 1080 Ti. That said, I personally think RTX was a mistake. Straight-up CUDA cores would have made more money, at least in the near term.

On topic, I am really curious about [H]’s findings. I often reference the work [H] did on “fine wine” for both nVidia and AMD.

You're right...I meant to say 72% more. Data points: 1080Ti FE = $699; 2080Ti FE = $1199, so 72% more money.

I don't mind extra RTX features as essentially a preview of the future at this point. I absolutely mind a 72% higher price point.
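
For what it's worth, here is the arithmetic behind that figure, using the two FE prices quoted above:

\[
\frac{1199 - 699}{699} \approx 0.715 \quad\Longrightarrow\quad \text{about } 72\%\ \text{more}
\]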
 
This doesn't make much sense. But neither did Apple nerfing battery life.

For any currently released title, just use the older driver if you feel Nvidia is nerfing anything.

It's not like Nvidia disables all older drivers when a new driver is released.
 
Clickbait farming using 5-year-old FUD?

 
I did a test between 399 and 411.70: 5 passes on each, DDU used to clean drivers, no GeForce Experience, just drivers. In Shadow of the Tomb Raider the results were exactly the same.
EVGA 1080 Ti Black Edition, 2025 MHz core and 5700 MHz memory on both drivers.

[Benchmark screenshots: Shadow of the Tomb Raider results on both drivers]


I know it is only one test, but this is one of the most popular games out. If they didn't nerf it in this, why would they in others? Off to physical therapy, have fun.

P.S. The CPU is actually slightly overclocked to 4200 MHz; it only shows 4000. This CPU has been a bad overclocker from day one. Out of all the CPUs I have had over the last 25 years, this one is horrible.
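
For anyone who wants to run this kind of before/after comparison themselves, here is a minimal sketch of how to summarize repeated benchmark passes per driver and check whether the gap is bigger than the run-to-run noise. The FPS numbers are hypothetical placeholders, not Snowbeast's actual passes, and the 2x-stdev rule of thumb is just one reasonable cutoff:

# Minimal sketch: summarize repeated benchmark passes for two drivers and
# compare the difference against run-to-run variation. FPS values below are
# hypothetical placeholders.
from statistics import mean, stdev

runs = {
    "399.24": [44.1, 44.0, 44.2, 43.9, 44.0],   # avg FPS of each of 5 passes
    "411.70": [44.0, 44.1, 43.9, 44.0, 44.1],
}

summary = {}
for driver, fps in runs.items():
    summary[driver] = (mean(fps), stdev(fps))
    m, s = summary[driver]
    print(f"{driver}: mean {m:.1f} fps, stdev {s:.2f} fps over {len(fps)} passes")

(m_old, s_old), (m_new, s_new) = summary["399.24"], summary["411.70"]
delta = m_new - m_old
noise = max(s_old, s_new)

# Crude rule of thumb: only call it a regression if the change in the mean
# clearly exceeds the spread of the individual passes.
if abs(delta) <= 2 * noise:
    print(f"Difference of {delta:+.2f} fps is within run-to-run noise.")
else:
    print(f"Difference of {delta:+.2f} fps is larger than the noise; worth a closer look.")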
 
You always see this every card generation. I don't think Nvidia would be stupid enough to put themselves in hot water again like they have before with fucky drivers. They know their audience will data mine the shit out of every driver release. Any small discrepancy could be attributed to a detail change or slight rendering adjustments.
 
You always see this every card generation. I don't think Nvidia would be stupid enough to put themselves in hot water again like they have before with fucky drivers. They know their audience will data mine the shit out of every driver release. Any small discrepancy could be attributed to a detail change or slight rendering adjustments.
If you look on Reddit, every driver release has the same user comparing benchmark results of the new driver against the older one. That is how neurotic they are.

Focusing improvements on the newest hardware is not "nerfing" (god, I hate that term...) the older hardware.
 
This kind of test, given how modern GPUs react to temperature, should be done in a controlled ambient environment, not in mom's kitchen. Things we don't know that matter most:

1. How many times each test was run, and how repeatable the results were.

2. GPU clocks and temperatures (most changes in performance always come from here). Testing during the day vs. at night, or even across the whole day, gives different results as the ambient temperature varies. (Most nerdy YouTubers never take this into consideration, and those numbers are never shown.) A rough sketch of how to log this follows below the list.

3. Canned benchmark (which I think it is in this case) or real-world testing.

4. The system used.

5. Aaaand last, but not least important, margin of error.
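
A minimal sketch of what logging point 2 could look like in practice, assuming nvidia-smi is on the PATH and a single GPU at index 0; the query fields and polling interval are just one reasonable choice, not the only way to do it:

# Minimal sketch: poll nvidia-smi for SM clock and temperature while a
# benchmark pass is running, then print a summary of how stable they were.
# Assumes nvidia-smi is on the PATH and the GPU under test is index 0.
import subprocess
import time
from statistics import mean, stdev

def sample_gpu(seconds=60, interval=2):
    """Collect (sm_clock_mhz, temp_c) samples every `interval` seconds."""
    samples = []
    for _ in range(int(seconds / interval)):
        out = subprocess.run(
            ["nvidia-smi", "-i", "0",
             "--query-gpu=clocks.sm,temperature.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        clock_mhz, temp_c = (int(x) for x in out.split(","))
        samples.append((clock_mhz, temp_c))
        time.sleep(interval)
    return samples

if __name__ == "__main__":
    # Start the benchmark, then run this alongside it.
    samples = sample_gpu(seconds=30, interval=2)
    clocks = [c for c, _ in samples]
    temps = [t for _, t in samples]
    print(f"SM clock: {mean(clocks):.0f} MHz (stdev {stdev(clocks):.0f}), "
          f"temperature: {min(temps)}-{max(temps)} C")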
 
This kind of test, given how modern GPUs react to temperature, should be done in a controlled ambient environment, not in mom's kitchen. Things we don't know that matter most:

1. How many times each test was run, and how repeatable the results were.

2. GPU clocks and temperatures (most changes in performance always come from here). Testing during the day vs. at night, or even across the whole day, gives different results as the ambient temperature varies. (Most nerdy YouTubers never take this into consideration, and those numbers are never shown.)

3. Canned benchmark (which I think it is in this case) or real-world testing.

4. The system used.

5. Aaaand last, but not least important, margin of error.
I was just thinking that the test conditions need to be controlled, which none of these video "reviewers" seem to do, especially with Windows 10 these days. Casually installing the newest driver on the PC you use every day isn't going to be sufficient.
 
There are reports and confirmation over at Primegrid that the 411 drivers are causing computation errors on some subprojects and 30% slowdowns on others vs. 399 on Pascal and Maxwell cards.
 
Well, my test conditions are:

Not Mom's Kitchen
My home theater with an Epson HC4000 super heater creator 5000
Generally stays 70-74 degrees F
Had to turn a fan on at the end of testing as I started sweating just sitting there
Tests were done 5 times on each driver, with a clean install using DDU

Margin of error? Literally 44 fps every time. LITERALLY. A ±1% "switched from guacamole to spinach dip between tests" margin of error.
 
Well, my test conditions are:

Not Mom's Kitchen
My home theater with an Epson HC4000 super heater creator 5000
Generally stays 70-74 degrees F
Had to turn a fan on at the end of testing as I started sweating just sitting there
Tests were done 5 times on each driver, with a clean install using DDU

Margin of error? Literally 44 fps every time. LITERALLY. A ±1% "switched from guacamole to spinach dip between tests" margin of error.

Lol hahaha, I trust your results more than the guy in the OP video...
 
Lol hahaha, I trust your results more than the guy in the OP video...
Especially since Snowbeast probably did a better job controlling the variables to prevent major fluctuations in GPU or CPU clock speed. Normally the people doing this kind of testing are pretty bad at it, since most of them don't bother setting a fixed clock speed for the GPU and/or CPU and running the fans at max.
 
Yeah, if I had time to do more I would. Because in the real world my equipment is in an enclosed case, not on an open workbench.

I just don't have patience like Brent and Kyle. Doing the same task over and over gets nerve-wracking for me.
My hat's off to them for being able to do everything really in depth.
 
I saw this yesterday and was going to send it to Kyle, but every time I bring up something like this I get blasted in the forum.

My main question is: did the Nvidia launch driver support both the 10 series and the 20 series? I see that most reviewers used 399.24 for the 10 series, which seems to show the two GPUs in a fair light. But 411.63 shows what Nvidia is capable of doing to the 10 series. The only product really affected was the 1080 Ti. The 1070 Ti remained the same, while Nvidia knew reviewers were going to pit the 1080 Ti against every RTX card.
 
40% more? What data points are you using? I imagine you are talking about the RTX cards, in which case realize that the 2080 is a significantly larger die than the 1080 Ti. That said, I personally think RTX was a mistake. Straight-up CUDA cores would have made more money, at least in the near term.

On topic, I am really curious about [H]’s findings. I often reference the work [H] did on “fine wine” for both nVidia and AMD.

I disagree. Turing is a game changer. The only way to advance that R&D is to toss it (Tensor cores) onto a product and sell a few hundred thousand. Then the next series of cards will have a very significant hike in performance across the board. Someone has to pay for that tech advancement, though... that would be us.
 
FFS, we see this every time NVidia drops a major driver update, and JUST when we stopped seeing ignorant posts of people claiming NVidia nerfs their older GPUs....

NVidia does a lot of shady and shitty stuff, but let's not make up stupid shit just because we hate the company. Besides, even if a game or two is OFF, it doesn't mean they intentionally borked your product; it could also be an oversight that will be patched later. If so, then AMD "nerfed" all my other cards more often....

I'm not at my 1080 (non-Ti) with a 1700X, or I'd test it in a few games as well.
 
Nvidia drivers for older-gen cards fall off badly. Nothing new here.

Historically this has been proven categorically false, as seen on such websites as [H] ( https://m.hardocp.com/article/2017/02/08/nvidia_video_card_driver_performance_review ).

From the incredibly small sample size of this current gripe (one driver), we have virtually no ability to show anything of merit. If an issue is identified, it will be remedied. They are still selling the 10xx series as a full lineup in their product stack. There is virtually nothing to be gained by torching the performance of their high-margin (highest-yielding) product segment. However, feel free to continue spouting uninformed slander; it is the internet, after all, right?
 
Historically this has been proven categorically false, as seen on such websites as [H] ( https://m.hardocp.com/article/2017/02/08/nvidia_video_card_driver_performance_review ).

From the incredibly small sample size of this current gripe (one driver), we have virtually no ability to show anything of merit. If an issue is identified, it will be remedied. They are still selling the 10xx series as a full lineup in their product stack. There is virtually nothing to be gained by torching the performance of their high-margin (highest-yielding) product segment. However, feel free to continue spouting uninformed slander; it is the internet, after all, right?
You have Google; look at how AMD cards catch up over time to Nvidia offerings that were far faster at launch, in newly released games. That is what I am referring to, not something along the lines of losing 20% fps in a game that has already been out for a while.
Here is a quote from that [H] article:
"Looking toward the AMD GPUs in Fallout 4 Page we find that the GeForce GTX 980 Ti is pulling ahead a bit farther than the AMD Radeon R9 Fury X in this game. However, once again the AMD Radeon R9 Fury X is the most improved in terms of performance update impact with new drivers. We also noticed the AMD Radeon R9 Fury X, and RX 480, hold up better using the "Ultra" Godrays."
 
Historically this has been proven categorically false, as seen on such websites as [H] ( https://m.hardocp.com/article/2017/02/08/nvidia_video_card_driver_performance_review ).

From the incredibly small sample size of this current gripe (one driver), we have virtually no ability to show anything of merit. If an issue is identified, it will be remedied. They are still selling the 10xx series as a full lineup in their product stack. There is virtually nothing to be gained by torching the performance of their high-margin (highest-yielding) product segment. However, feel free to continue spouting uninformed slander; it is the internet, after all, right?

And it starts anew...
 
You have Google; look at how AMD cards catch up over time to Nvidia offerings that were far faster at launch, in newly released games. That is what I am referring to, not something along the lines of losing 20% fps in a game that has already been out for a while.
Here is a quote from that [H] article:
"Looking toward the AMD GPUs in Fallout 4 Page we find that the GeForce GTX 980 Ti is pulling ahead a bit farther than the AMD Radeon R9 Fury X in this game. However, once again the AMD Radeon R9 Fury X is the most improved in terms of performance update impact with new drivers. We also noticed the AMD Radeon R9 Fury X, and RX 480, hold up better using the "Ultra" Godrays."

What does AMD cards getting faster over time have to do with nVidia?
 
What does AMD cards getting faster over time have to do with nVidia?
I mean, what does it not have to do with it? You can buy two types of cards, AMD and Nvidia. If AMD scales better over time, they either have a better product or release better drivers for older cards. However, Nvidia is not releasing optimized drivers for their older cards, and in this case we use the other manufacturer to compare the falloff of the cards over time as new games are released.
 
Here is a quote from that [H] article:
"Looking toward the AMD GPUs in Fallout 4 Page we find that the GeForce GTX 980 Ti is pulling ahead a bit farther than the AMD Radeon R9 Fury X in this game. However, once again the AMD Radeon R9 Fury X is the most improved in terms of performance update impact with new drivers. We also noticed the AMD Radeon R9 Fury X, and RX 480, hold up better using the "Ultra" Godrays."

So you are going to use an AMD observation to determine Nvidia is nerfing their older products via driver updates? You do know that doesn't make any sense right? Why are you still arguing this point? It's okay to make a mistake, just learn from it and move on.
 
So you are going to use an AMD observation to determine Nvidia is nerfing their older products via driver updates? You do know that doesn't make any sense right? Why are you still arguing this point? It's okay to make a mistake, just learn from it and move on.
Come on, man. As an example: if the Nvidia card was 20% faster than the AMD card when it came out, and then over time that lead fades to 5% or even swings in AMD's favor, what is that? Did AMD come to my house and upgrade the card while I was sleeping, like some kind of tooth fairy? The drivers have fallen off on the Nvidia side compared to the competition. You have to compare Nvidia to something. I mean, how else can this be seen? If Nvidia were the only one in the market I would agree with you, but that is not how it is.
And I am not saying they are nerfing their cards; I am saying they stop optimizing drivers for older cards.
 
...or it could mean AMD's driver team is badly underfunded and it takes them years to get the most out of their hardware, while nVidia has superior optimization from the beginning. This seems to me a far simpler and more plausible explanation than "nVidia gimps their old hardware" conspiracy theories.

You have Google; look up the term "Occam's razor"...
 
I mean, what does it not have to do with it? You can buy two types of cards, AMD and Nvidia. If AMD scales better over time, they either have a better product or release better drivers for older cards. However, Nvidia is not releasing optimized drivers for their older cards, and in this case we use the other manufacturer to compare the falloff of the cards over time as new games are released.

Don't be silly and be one of those ignorant people... There's a reason why AMD cards "age exceptionally well" over time, and it's called GCN. They have been using GCN since the HD 7000 series; it carried through the R9 200, R9 300, R9 Fury, RX 400/500, and RX Vega. The core architecture is still shared across those GPUs, so most driver and software optimizations made for newer GPUs also apply to older GPUs all the way back to the HD 7000 series. Is that hard to understand?

The same scenario is actually happening with Nvidia since Maxwell, which shares a lot of its architecture with Pascal and now with Volta/Turing, so we can expect at least this generation of GPUs to keep "passively" improving over time. That's the reason the GTX 900 series is still relevant today, and it will stay that way until Nvidia makes a noticeable architecture jump, as it did from Fermi to Kepler and from Kepler to Maxwell. The same will apply to AMD: as soon as they make a noticeable architecture jump, they will forget about older generations of GPUs, which has been happening since Polaris. They focused on Polaris while ignoring the much stronger R9 Fury and Fury X, to the point where in modern titles they tend to be neck and neck. AMD realized they need to keep people upgrading and save money at the same time, and to achieve that they need to always focus on the newer tech that saves money and makes the newer cards more appealing to their own customers. It's not hard to understand, or is it?
 
...or it could mean AMD's driver team is badly underfunded and it takes them years to get the most out of their hardware, while nVidia has superior optimization of their newest hardware. This seems to me a far simpler and more plausible explanation than "nVidia gimps their old hardware" conspiracy theories.
I mean, if AMD's driver team is badly underfunded and they somehow manage to make their cards better over time, how could a company that is not underfunded not do the same thing? I do see what you are saying, but I find it a little hard to believe that Nvidia has near-100% optimization at launch and can't do anything else to improve the drivers as time passes.
 
Don't be silly and be one of those ignorant people... There's a reason why AMD cards "age exceptionally well" over time, and it's called GCN. They have been using GCN since the HD 7000 series; it carried through the R9 200, R9 300, R9 Fury, RX 400/500, and RX Vega. The core architecture is still shared across those GPUs, so most driver and software optimizations made for newer GPUs also apply to older GPUs all the way back to the HD 7000 series. Is that hard to understand?

The same scenario is actually happening with Nvidia since Maxwell, which shares a lot of its architecture with Pascal and now with Volta/Turing, so we can expect at least this generation of GPUs to keep "passively" improving over time. That's the reason the GTX 900 series is still relevant today, and it will stay that way until Nvidia makes a noticeable architecture jump, as it did from Fermi to Kepler and from Kepler to Maxwell. The same will apply to AMD: as soon as they make a noticeable architecture jump, they will forget about older generations of GPUs, which has been happening since Polaris. They focused on Polaris while ignoring the much stronger R9 Fury and Fury X, to the point where in modern titles they tend to be neck and neck. AMD realized they need to keep people upgrading and save money at the same time, and to achieve that they need to always focus on the newer tech that saves money and makes the newer cards more appealing to their own customers. It's not hard to understand, or is it?
Ok, thanks for proving my point: Nvidia does not optimize drivers for old cards.
 
Ok, thanks for proving my point: Nvidia does not optimize drivers for old cards.

Oh, so you have been negligently ignoring the difference between optimizing drivers for current tech and nerfing older cards? Awesome, now I know what kind of person you are, which will save me some replies later. Nothing can cure blind ignorance.
 
Oh, so you have been negligently ignoring the difference between optimizing drivers for current tech and nerfing older cards? Awesome, now I know what kind of person you are, which will save me some replies later. Nothing can cure blind ignorance.
Read my other posts; I am not saying they are nerfing cards, I never said that. All I am saying is that Nvidia does not optimize drivers for old cards, and that is why they fall off as time passes. Blind ignorance takes on a new meaning when you can't even read what someone is saying.
 