Nvidia Nerfing 10-Series With New Drivers?

Discussion in 'nVidia Flavor' started by Brent_Justice, Oct 1, 2018.

  1. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,811
    Joined:
    Apr 17, 2000
    Note: I am not saying they are, but this video brings up the question.

    You knew this question would come up; it always does. This video is interesting, if a bit incomplete.

    I will have to do my own testing. We are starting our full 2080 Ti review now. I was planning to use the newest driver for 10-series testing as well, to keep the driver version the same, so when I get to it I'll do a little comparison for my own sake. I'm really interested in the Shadow of the Tomb Raider outlier.

     
    IdiotInCharge and nEo717 like this.
  2. kirbyrj

    kirbyrj Why oh why didn't I take the BLUE pill?

    Messages:
    23,344
    Joined:
    Feb 1, 2005
    Brought to you by the company that charged 40% more for the replacement products...
     
  3. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    6,840
    Joined:
    Feb 22, 2012
    40% more? What data points are you using? I imagine you are talking about the RTX cards, but realize the 2080 is a significantly larger die than the 1080 Ti. That said, I personally think RTX was a mistake; straight-up CUDA cores would have made more money, at least in the near term.

    On topic, I am really curious about [H]'s findings. I often reference the work [H] did on "fine wine" for both nVidia and AMD.
     
    Araxie likes this.
  4. M76

    M76 [H]ardness Supreme

    Messages:
    8,146
    Joined:
    Jun 12, 2012
    Guess it is a good thing I finished Shadow of the Tomb Raider before the new drivers, then?
     
  5. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,811
    Joined:
    Apr 17, 2000
    We definitely do, and will, cover driver performance over time, from launch driver to current drivers. It will be interesting to see how performance compares from the launch driver to, say, a driver a year from now.
     
  6. SnowBeast

    SnowBeast [H]ard|Gawd

    Messages:
    1,200
    Joined:
    Aug 3, 2003
    This guy is just dumb.

    His own tests show a drop-off of 3-8 fps in newer games. Shadow of the Tomb Raider was a full 5-7 frames at 4K! That's a big drop at 4K, where every frame counts: a 5-7 fps drop at around 50 fps is roughly a 10-14% regression. I can't get above 44 fps in its own benchmark with a 1080 Ti, yet this guy shows over 50? Yes, certain games were better, but the ones most people are playing show a clear drop in performance. I hope more writers/vloggers show this and call them out for it.

    Yeah, they nerfed it. I just realized I am on 411.70 with my 44 fps. Going to drop back to the 399 driver to see what I get in SotTR.
     
    N4CR likes this.
  7. kirbyrj

    kirbyrj Why oh why didn't I take the BLUE pill?

    Messages:
    23,344
    Joined:
    Feb 1, 2005
    You're right...I meant to say 72% more. Data points: 1080 Ti FE = $699; 2080 Ti FE = $1,199, so ($1,199 - $699) / $699 ≈ 72% more money.

    I don't mind extra RTX features as essentially a preview of the future at this point. I absolutely mind a 72% higher price point.
     
    Dayaks likes this.
  8. jmilcher

    jmilcher 2[H]4U

    Messages:
    3,900
    Joined:
    Feb 3, 2008
    This doesn't make much sense. But neither did Apple nerfing battery life.

    For any currently released title, just use the older driver if you feel Nvidia is nerfing anything.

    It's not like Nvidia disables all older drivers when a new driver is released.
     
  9. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    16,243
    Joined:
    Jan 28, 2014
    Clickbait farming using 5-year-old FUD?

     
    Mav451, Trimlock and Araxie like this.
  10. SnowBeast

    SnowBeast [H]ard|Gawd

    Messages:
    1,200
    Joined:
    Aug 3, 2003
    I did a test between 399 and 411.70: 5 passes on each, DDU used to clean the drivers, no GeForce Experience, just drivers. In Shadow of the Tomb Raider the results were exactly the same.
    EVGA 1080 Ti Black Edition, 2025 MHz core and 5700 MHz memory on both drivers.

    [Three benchmark screenshots: Shadow of the Tomb Raider results on both drivers]

    I know it is only one test, but this is one of the most popular games out. If they didn't nerf it in this one, why would they in others? Off to physical therapy, have fun.

    P.S. The CPU is actually slightly overclocked to 4200 MHz; it only shows 4000. This CPU has been a bad overclocker from day one. Of all the CPUs I've had over the last 25 years, this one is horrible.
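    For anyone wanting to repeat this kind of A/B driver test, the pass-to-pass math is easy to script. A minimal sketch in Python, with made-up FPS numbers standing in for real benchmark passes (not SnowBeast's actual per-pass data):

        import statistics

        # Five benchmark passes per driver (illustrative values only).
        runs = {
            "399.24": [44.0, 44.0, 44.0, 44.0, 44.0],
            "411.70": [44.0, 44.0, 44.0, 44.0, 44.0],
        }

        baseline = statistics.mean(runs["399.24"])
        for driver, fps in runs.items():
            mean = statistics.mean(fps)
            spread = statistics.stdev(fps)  # run-to-run variation
            delta = 100.0 * (mean - baseline) / baseline
            print(f"{driver}: {mean:.1f} fps avg, stdev {spread:.2f}, "
                  f"{delta:+.1f}% vs 399.24")

    A flat average and near-zero stdev on both drivers, as reported here, is exactly what "no nerf" looks like.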
     
    Last edited: Oct 1, 2018
    DCookSta, GoodBoy, N4CR and 15 others like this.
  11. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,811
    Joined:
    Apr 17, 2000
    Thanks for the test!
     
    SnowBeast likes this.
  12. NickJames

    NickJames Viagra Required

    Messages:
    6,494
    Joined:
    Apr 28, 2009
    You see this every card generation. I don't think Nvidia would be stupid enough to put themselves in hot water again like they've done before with fucky drivers. They know their audience will data-mine the shit out of every driver release. Any small discrepancy could be attributed to a detail change or slight rendering adjustments.
     
    Armenius likes this.
  13. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    16,243
    Joined:
    Jan 28, 2014
    If you look on Reddit, every driver release has the same users comparing benchmark results of the new driver to the older one. That is how neurotic they are.

    Focusing improvements on the newest hardware is not "nerfing" (god, I hate that term...) the older hardware.
     
    SnowBeast and Araxie like this.
  14. Araxie

    Araxie [H]ardness Supreme

    Messages:
    6,303
    Joined:
    Feb 11, 2013
    Given how modern GPUs react to temperature, this kind of test should be done in a temperature-controlled environment, not in mom's kitchen. The things we don't know are the most important:

    1. How many times each test was run, and how repeatable the results were.

    2. GPU clocks and temperatures (most performance changes always come from here). Testing during the day vs. at night, or even across a whole day, produces different results as the temperature varies. (Most nerdy YouTubers never take this into consideration, and those numbers are never presented.)

    3. Canned benchmark (which I think it is in this case) or real-world testing.

    4. The system used.

    5. And last, but not least important: the margin of error (see the sketch below).
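    As a rough illustration of what point 5 means with only five passes, here is a minimal sketch in Python; the t-value is hard-coded for n = 5, and the FPS values are invented, not measured:

        import statistics

        # 95% confidence interval for a 5-pass benchmark sample.
        # t-critical for n - 1 = 4 degrees of freedom, hard-coded to
        # avoid a scipy dependency. FPS values are illustrative only.
        T_95_DF4 = 2.776

        fps = [61.8, 62.4, 61.5, 62.1, 61.9]
        mean = statistics.mean(fps)
        sem = statistics.stdev(fps) / len(fps) ** 0.5  # std. error of mean
        half = T_95_DF4 * sem

        print(f"{mean:.1f} fps, 95% CI +/-{half:.2f} fps "
              f"({100 * half / mean:.1f}% of the mean)")

    If two drivers' intervals overlap, a 1-2 fps gap between their averages is indistinguishable from noise at that sample size.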
     
    IdiotInCharge, SnowBeast and Armenius like this.
  15. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    16,243
    Joined:
    Jan 28, 2014
    I was just thinking that the test conditions need to be controlled, which none of these video "reviewers" seem to do. Especially with Windows 10 these days. Casually installing the newest driver on the PC you use every day isn't going to be sufficient.
     
    Araxie likes this.
  16. Grebuloner

    Grebuloner Limp Gawd

    Messages:
    412
    Joined:
    Jul 31, 2009
    There are reports and confirmation over at Primegrid that the 411 drivers are causing computation errors on some subprojects and 30% slowdowns on others vs. 399 on Pascal and Maxwell cards.
     
  17. SnowBeast

    SnowBeast [H]ard|Gawd

    Messages:
    1,200
    Joined:
    Aug 3, 2003
    Well my test conditions are:

    Not Mom's Kitchen
    My Home Theater with an Epson HC4000 super heater creator 5000
    Generally stays 70-74 degrees F
    Had to turn the fan on at the end of testing, as I started sweating just sitting there.
    Tests were done 5 times on each driver with a clean install using DDU

    Margin of error? Literally 44 fps every time. LITERALLY. A ±1%, "switched from guacamole to spinach dip between tests" margin of error.
     
  18. Araxie

    Araxie [H]ardness Supreme

    Messages:
    6,303
    Joined:
    Feb 11, 2013
    Lol hahaha, I believe your results more than the guy in the OP video...
     
    Red Falcon, Armenius and SnowBeast like this.
  19. Chimpee

    Chimpee [H]ard|Gawd

    Messages:
    1,187
    Joined:
    Jul 6, 2015
    SnowBeast especially probably did a better job controlling the variables to prevent major fluctuations in GPU or CPU clock speed. Normally the people doing this kind of testing are pretty bad at it, since most of them don't bother setting a fixed GPU and/or CPU clock speed and running the fans at max.
     
  20. SnowBeast

    SnowBeast [H]ard|Gawd

    Messages:
    1,200
    Joined:
    Aug 3, 2003
    Yeah, if I had time to do more I would. Because real world, my equipment is in an enclosed case, not on an open workbench.

    I just don't have patience like Brent and Kyle. Doing the same task over and over just gets nerve-wracking for me.
    My hat's off to them for being able to do everything really in depth.
     
    N4CR, Red Falcon and Armenius like this.
  21. Rvenger

    Rvenger [H]ard|Gawd

    Messages:
    1,491
    Joined:
    Sep 12, 2012
    I saw this yesterday and was going to send it to Kyle, but every time I bring up something like this I get blasted in the forum.

    My main question is: did the Nvidia launch driver have compatibility for both the 10 series and the 20 series? I see that most reviewers used 399.24 for the 10 series, which seems to show the two GPUs in a fair light. But 411.63 shows what Nvidia is capable of doing to the 10 series. The only product really affected was the 1080 Ti; the 1070 Ti remained the same, even though Nvidia knew reviewers were going to pit the 1080 Ti against every RTX card.
     
  22. Mr. Baz

    Mr. Baz 2[H]4U

    Messages:
    2,813
    Joined:
    Aug 17, 2001
    I disagree. Turing is a game changer. The only way to advance that R&D is to toss it (Tensor cores) on a product and sell a few hundred thousand. Then, the next series cards will have a very significant hike in performance across the board. Someone has to pay for that tech advancement though....that would be us.
     
    Maddness likes this.
  23. 5150Joker

    5150Joker 2[H]4U

    Messages:
    2,806
    Joined:
    Aug 1, 2005
    So now 1-2 frame differences between driver versions is considered nerfing? Oh lord.
     
    Montu, defaultluser, Armenius and 4 others like this.
  24. Trimlock

    Trimlock [H]ardForum Junkie

    Messages:
    15,127
    Joined:
    Sep 23, 2005
    FFS, we see this every time NVidia drops a major driver update, JUST when we stopped seeing ignorant posts of people claiming NVidia nerfs their older GPUs....

    NVidia does a lot of shady and shitty stuff, but let's not make up stupid shit just because we hate the company. Besides, even if they get a game or two OFF, that doesn't mean they intentionally borked your product; it could also be an oversight that will be patched later. By that standard, AMD "nerfed" all my other cards more often....

    I'm not at my 1080 (non-Ti) with a 1700X right now, or I'd test a few games as well.
     
    Armenius likes this.
  25. defaultluser

    defaultluser I B Smart

    Messages:
    12,076
    Joined:
    Jan 14, 2006
    This is a C-O-N-spiracy
     
  26. euskalzabe

    euskalzabe Gawd

    Messages:
    829
    Joined:
    May 9, 2009
    Not to defend Nvidia, but this has been demonstrated to be a bogus paranoid rumor multiple times now. It makes no sense for NV to do this. For anyone, really.
     
    Armenius, Montu and defaultluser like this.
  27. filip

    filip [H]ard|Gawd

    Messages:
    1,044
    Joined:
    Aug 15, 2012
    Nvidia driver support for older-gen cards falls off badly. Nothing new here.
     
    dragonstongue likes this.
  28. oldmanbal

    oldmanbal [H]ard|Gawd

    Messages:
    1,863
    Joined:
    Aug 27, 2010
    Historically this has been proven categorically false, as seen on such websites as [H] ( https://m.hardocp.com/article/2017/02/08/nvidia_video_card_driver_performance_review )

    From the incredibly small sample size of this current gripe (one driver), we have virtually no ability to show anything of merit. If an issue is identified, it will be remedied. They are still selling the 10xx series as a full lineup in their product stack. There is virtually nothing to be gained by torching the performance of their high-margin (highest-yielding) product segment. However, feel free to continue spouting uninformed slander; it is the internet, after all, right?
     
  29. filip

    filip [H]ard|Gawd

    Messages:
    1,044
    Joined:
    Aug 15, 2012
    You have Google; look at how, as time passes, AMD cards catch up in newly released games to Nvidia offerings that were once far faster. That is what I am referring to, not something along the lines of losing 20% FPS in a game that has already been out.
    Here is a quote from that [H] article:
    "Looking toward the AMD GPUs in Fallout 4 Page we find that the GeForce GTX 980 Ti is pulling ahead a bit farther than the AMD Radeon R9 Fury X in this game. However, once again the AMD Radeon R9 Fury X is the most improved in terms of performance update impact with new drivers. We also noticed the AMD Radeon R9 Fury X, and RX 480, hold up better using the "Ultra" Godrays."
     
  30. Trimlock

    Trimlock [H]ardForum Junkie

    Messages:
    15,127
    Joined:
    Sep 23, 2005
    And it starts anew....
     
    filip likes this.
  31. NukeDukem

    NukeDukem 2[H]4U

    Messages:
    2,215
    Joined:
    Feb 15, 2011
    What does AMD cards getting faster over time have to do with nVidia?
     
    Armenius and oldmanbal like this.
  32. filip

    filip [H]ard|Gawd

    Messages:
    1,044
    Joined:
    Aug 15, 2012
    I mean, what does it not have to do with it? You can buy two types of cards, AMD and Nvidia. If AMD scales better over time, they have a better product or release better drivers for older cards. Nvidia, however, is not releasing optimized drivers for their older cards, and in this case we use the other manufacturer to compare how the cards fall off over time as new games are released.
     
  33. oldmanbal

    oldmanbal [H]ard|Gawd

    Messages:
    1,863
    Joined:
    Aug 27, 2010
    So you are going to use an AMD observation to determine that Nvidia is nerfing their older products via driver updates? You do know that doesn't make any sense, right? Why are you still arguing this point? It's okay to make a mistake; just learn from it and move on.
     
  34. filip

    filip [H]ard|Gawd

    Messages:
    1,044
    Joined:
    Aug 15, 2012
    Come on, man, as an example: if the Nvidia card was 20% faster than the AMD card when it came out, and over time that lead fades to 5% or even swings in AMD's favor, what is that? Did AMD come to my house and upgrade the card while I was sleeping, like some kind of tooth fairy? The drivers have fallen off on the Nvidia side compared to the competition. You have to compare Nvidia to something; how else can this be seen? If Nvidia were the only one in the market I would agree with you, but that is not how it is.
    And I am not saying they are nerfing their cards; I am saying they stop optimizing drivers for older cards.
     
  35. NukeDukem

    NukeDukem 2[H]4U

    Messages:
    2,215
    Joined:
    Feb 15, 2011
    ...or it could mean the AMD driver team is badly underfunded and it takes them years to get the most out of their hardware, while nVidia has superior optimization from the beginning. That seems to me a far simpler and more plausible explanation than the "nVidia gimps their old hardware" conspiracy theories.

    You have Google, look up the term "Occam's razor"...
     
  36. Araxie

    Araxie [H]ardness Supreme

    Messages:
    6,303
    Joined:
    Feb 11, 2013
    Don't be silly and so willfully ignorant... There's a reason why AMD cards "age exceptionally well" over time, and it's called GCN. They have been using GCN since the HD 7000 series; it carried through the R9 200, R9 300, R9 Fury, RX 400/500, and RX Vega. The core architecture is still shared across those GPUs, so most driver and software optimizations made for newer GPUs also apply to older GPUs, back to the HD 7000 series. Is that so hard to understand?

    The same scenario is happening with Nvidia since Maxwell, which shares a lot of its architecture with Pascal and now with Volta/Turing, so we can expect at least this generation of GPUs to keep "passively" improving over time. That's why the GTX 900 series is still relevant today, and it will stay that way until Nvidia makes a noticeable architecture jump, as it did from Fermi to Kepler and from Kepler to Maxwell. The same will apply to AMD: as soon as they make a noticeable architecture jump, they will forget about older generations of GPUs. That has already been happening since Polaris; they focused on Polaris while ignoring the much stronger R9 Fury and Fury X, to the point where in modern titles the cards tend to be neck and neck. AMD realized they need to keep people upgrading while saving money at the same time, and to achieve that they always need to focus on the newer tech that saves money and makes newer cards more appealing to their own customers. It's not hard to understand, is it?
     
    IdiotInCharge, Armenius, N4CR and 2 others like this.
  37. filip

    filip [H]ard|Gawd

    Messages:
    1,044
    Joined:
    Aug 15, 2012
    I mean, if the AMD driver team is badly underfunded and they somehow manage to make their cards better over time, how could a company that is not underfunded not do the same thing? I do see what you are saying, but I find it a little hard to believe that Nvidia has near-100% optimization at launch and can't do anything else to improve the drivers as time passes.
     
  38. filip

    filip [H]ard|Gawd

    Messages:
    1,044
    Joined:
    Aug 15, 2012
    OK, thanks for proving my point: Nvidia does not optimize drivers for old cards.
     
  39. Araxie

    Araxie [H]ardness Supreme

    Messages:
    6,303
    Joined:
    Feb 11, 2013
    Oh, so you have been negligently ignoring the difference between driver optimization for current tech and nerfing older cards? Awesome. I know what kind of person you are now, which saves me some answers later; nothing can save blind ignorance.
     
  40. filip

    filip [H]ard|Gawd

    Messages:
    1,044
    Joined:
    Aug 15, 2012
    Read my other post: I am not saying they are nerfing cards; I never said that. All I am saying is that Nvidia does not optimize drivers for old cards, and that is why they fall off as time passes. Blind ignorance takes on a new meaning when you can't even read what someone is saying.