video: Do Nvidia drivers cause planned obsolescence?

polonyc2

Fully [H]
Joined
Oct 25, 2004
Messages
25,858
Debunking claims of intentionally reduced performance on older graphics cards to force users to upgrade

for some time now there have been claims that Nvidia drivers intentionally cripple older hardware, trying to force users to upgrade...this is the so-called "planned obsolescence" approach to hardware...LinusTechTips posted a video today where they've done the dirty work!...hundreds of benchmarks covering five years of hardware, from the GTX 680 through the current-generation GTX 1080 Ti...for the drivers, each card was tested with the driver released one version after the initial product launch, and every release from there forward...the tested games consist of The Witcher 3 (with and without HairWorks), Metro: Last Light, and BioShock Infinite...

 



I watched it earlier, was a good video, good on them for putting in the hours to do that much benchmarking
 
... ok, I'm gonna give him this one try to redeem himself from all the other crap he has made. this should be a pretty easy (but big) job so hopefully he can't fuck it up..
 
The ALL CAPS title and look on Linus' face in the thumbnail is enough for me to ignore it. I never understood the devotion to LTT to begin with. He's after that 12-21 crowd that hangs out in r/PCMasterRace.
 
Especially since the current WHQL driver spans from Pascal all the way back to Fermi - and has since Pascal launched. (The rather weird quirk is that the very day I was swapping GPUs (from a GTX550Ti to a GTX1050Ti) was ALSO the day nVidia released a new WHQL Game-Ready driver - it was certainly not planned on my part! As I mentioned, I was under the knife the morning the new GPU arrived - and since the procedure was same-day surgery (colonoscopies often are), I was doing the swap within an hour after I got home - which was itself one hour after I regained consciousness; the 'scopy itself was less than an hour under general anaesthesia. My original plan was, in fact, to reinstall the SAME driver I had been running.)

In the three days since, I've been bringing games out of mothballs (and in some cases - such as Crysis 2 and 3 - I found a few moth corpses amongst the balls; I had never thrown either version of Crysis at the '58).

The OTHER reason I was bringing games off the shelf has to do with one of the more infamous (PC tech) videos by a relatively mainstream outfit - PCPerspective's "Maxwell in an OEM PC". Lest we forget, the GTX750/Ti were not merely mainstream-priced, but they mostly did not require any more juice than is sent through the PCI-E slot. Before the resurgence of GPU-based mining, the GTX1050Ti and AMD's RX460 aimed squarely at the same turf that Maxwell owned until it stopped being produced - gaming performance in a LOT of games for relative cheapskates. Because that was, indeed, nVidia's plan, I thought of the various no-power-except-the-slot Pascal cards (from everybody) as Maxwell Redux. But was it really like that?

Therefore, I didn't just throw my Usual Suspects (RTS games, city-builders, etc.) that I normally played on the '58/Fermi tag-team at the new card; I went back and looked for games that I would not have DARED play on said tag-team (the latest being Ashes of the Singularity, with both the Escalation add-on and the 2.3 update). Ashes - Escalation is a decided outlier - especially for a '58; consider that the publisher (Stardock) lists an i5 as the absolute minimum, while the '58 is a dual-core. In fact, it's a Haswell i3 minus a few features - such as HTT - along with having less cache than an i3 of the same LGA. The surprise was that I didn't take as big a bath as expected - using the included benchmarking tool, I pulled an average of 23 fps at 1080p with all settings firewalled. Not a steady 30 fps - but it didn't miss by much. Definitely unexpected turf for a '58 - let alone a stock-clocked '58.
I couldn't bench DX12 - this game with DX12 has been problematic for nVidia GPUs with quad-core CPUs, and had the '58 gotten through at all, I doubtless would have been suspected of shens. (The game crashed outright - so back to DX11 I went.) Still, when it comes to DX11, Pascal - even shackled to the '58 - definitely MORE than cuts the ketchup in games that use one or two cores - which is STILL a LOT of games, including some - but not most - entry-level DX12 games.

But the game in my rota that had me planning to pull the trigger is a city-builder - Anno 2205. The latest in the "Anno" series - like its predecessors - uses, at MOST, two cores. However, unlike most two-core games, it WILL leverage a beefy GPU - even a virtual one. (Before I pulled that trigger, I had been playing the game on LiquidSky - the OTHER virtual-computer gaming service. GeForce Now is currently not only still in beta, but only for Macs; while LiquidSky is ALSO still in beta, it's a more widespread open beta - for Windows and Android, currently. Anno 2205 is small enough that even a bottom-end SkyComputer can fit the whole game and your trainer of choice - with space to spare - AND butter-smooth graphics, depending on your Internet connection of course.) The ONLY area where LS has me beat is the SkyComputer's network performance and the amount of virtual RAM; however, each SkyComputer is a Xen-based virtual instance running on Xeon hardware with virtualized nVidia GRID (Tesla M60) GPUs - basically, Maxwell taking anabolic steroids and hitting the gym. Such hardware is far from cheap.
What made Maxwell a value play was that it was a one-piece upgrade for cheapskates, and so far (to my own eyeballs), if it weren't for mining, I have every reason to think the same applies to Pascal - and the GTX1050Ti specifically.
 
As a matter of fact, when the 680 was released, they capped the 580's max overclock at 1000MHz.

Not many noticed, but I had mine under water at 1050MHz and immediately did. I upgraded to a 680; then sometime later - 6+ months - I noticed they'd unlocked it again. Nothing was ever said, but seeing my 580 at 1050MHz beat a stock 680 in some cases said it all to me. Not so at 1000MHz.

That's a fact! :cool::D
 

normally I don't pay much attention to his videos, but no matter the source, if a topic or video is well done I'll give it a look...he definitely did a good job testing driver performance over time
 