NVIDIA's new driver features ultra-low latency mode, integer scaling, and sharpening

5150 I lub j0000! I do! This is me, lubbing j0000000! :love: And GoldenTiger. And even Snowdog (in a platonic, manly sort of way.)

I'm not the one who said, 'conspiracy', and I didn't think of it in those terms. My original statement was more of a three-dudes-farting-on-the-couch statement - "you got another 20% out of the card?? You had that much fat lying around?" The announced speed improvements (with few clarifications), the big reveal at Gamescom and their current competition with AMD sorta pulled one of my nostril hairs.

We're pushing this way out of bounds. I LOVE YOU! :love:


P.S. I'm one of the people who installed the driver and got the 'GeForce Experience' experience, too.

P.P.S. I lub j000! I do! :love:

P.P.P.S. You big poopyhead.

P.P.P.P.S. There is no 'grovel' emoji? :notworthy:

 
I am looking forward to trying the scaling paired with DLSS on my laptop to see what it does for Anthem...
 
Today on [H] I learned nVidia is an evil, super greedy corporation. Also one that apparently gimps its own products because it likes losing market share? If they relaunched a chip at a higher price point and it magically did better, maybe I'd bite.

I’ll have to check out that sharpening filter. Might come in handy for RT. You know, what DLSS was supposed to be...
 
5150 I lub j0000! I do! This is me, lubbing j0000000! :love: ...
Ummmmm, did your visiting nurse not show up with your meds today?
 
Dude, this is America, you can't get a visiting nurse in this country! Shit, I can't even get my dealer to come to my house.

And apparently "I lub j0000!" is not universally understood as the Cartman voice (that might have been local to my old guild boards, and maybe my last job, where I used it a lot, my bad...) Although, as a perfectly normal (normal, I tell you) meterosexual man, I am not afraid to say, "I love you" to another man, but that's irrelevant. /s point is that I am not / was not thinking 'conspiracy', when I posted, I was mostly thinking, "errrr.... wait a minute...."

I have used bulletin boards like this one for decades, and I'm old enough to spot the point where a thread is about to descend into a "No, YOU'RE the idiot" match, where two, three or five people trade posts that are a series of multi-quotes followed by one-line rebuttals that make no sense. No amount of proof will rescue a runaway thread, so I'll take the bullet, I'm the fool, and it was kinda my bad that I pulled Snowdog in front of a bullet that I thought was pointed at me.

Now it would be cool if you would say, "No worries, just another strawman day on the Internet."


P.S. <attempting to come up with grandiose closing statement>
 
It's never worth arguing with conspiracy theorists.

plonk

Ah, because corporations doing practical and pretty much effortless things to make themselves a lot more money is unheard of and a conspiracy. Corporations never price-fix, never practice forced obsolescence, never disable GPU cores, never hold back anything more than what they have to in order to make the same amount of money or more...

The issue there is that you've chosen to label regular critical thinking as conspiracy.

Also, your very first statement, "No one spends extra money to make their own product worse", a position that is refuted by the existence of things such as software editions, like Windows 10 Home and Pro, shows that you don't have a reality-based understanding of the topic. Why aren't all editions of Windows 10 simply Windows 10 Enterprise, which has everything from all the other editions in it? Because Microsoft makes more money by locking advanced capabilities behind higher-priced options. The exact same applies to GPUs and how much performance is on tap. Why does Nvidia sell cut-down versions of higher-tier GPUs at lower prices, when it costs Nvidia money to stunt the performance of the cut-down GPUs? No sane company would spend money to make their product worse, right?

Actually, companies would, and they do as standard practice - and it's nothing new. The more performance Nvidia or another GPU company gives out now, the less performance people will pay for later. But, if Nvidia isn't better than AMD, they lose market share and profits. So, the strategic thing to do is to be just enough ahead of AMD to entice buyers to their product, but to not be so far ahead that Nvidia loses out on future GPU purchases because the old stuff is still good enough.

Windows 10 Home exists, and so do an untold number of other such analogous items, therefore companies do spend extra money to make their own products worse. And they do it because being strategic with what they offer can make them more money. This is basic stuff.

Your denial of the reality of what is a basic business strategy represents the conspiracy theory.
 
Comprehension issue? I wasn't referring to low latency mode.

I was referring to the weird theory that NVidia was purposefully gimping their own GPUs' gaming performance, as if that were some kind of nefarious benefit.

The ludicrous theories I see on this board make me think there must be a contest I wasn't told about.

I see you still haven't looked into the miraculous nVidia driver performance increases which always suddenly appeared as soon as competitors caught up or surpassed them. There's very good reason to believe nVidia is holding back performance increases through drivers because the company has a history of doing so.

You can claim "conspiracy" all you want but it's not me or anyone else miraculously releasing new drivers increasing performance. That's nVidia miraculously doing that when the competition is looking better and better. You're also attempting to claim that the drivers are "gimped" in order to make your argument sound better. No one is claiming that but you. Holding back on improvements until it's deemed necessary is not the same as "gimping" something.
 
The peanut gallery is funny when it comes to nVidia. They release an amazing driver and people claim they've been holding out--damned if you do and damned if you don't. I think it's important that major tech sites re-review the 5700/5700 XT against the 20 series after this new driver, especially in games like Apex Legends where NVIDIA should have a considerable lead. Almost everyone from mid-tier Pascal to high-end Turing is reporting massive jumps in fps in Apex and other games.

Here's a nice video:



Damn you NVIDIA, holding out all this awesome performance just in time for Navi. You evil evil bastard Mr. Leather Jacket.

 
I see you still haven't looked into the miraculous nVidia driver performance increases which always suddenly appeared as soon as competitors caught up or surpassed them. ...

nVidia boosting Pascal helps nVidia financially... how?
 
So in essence when AMD gets performance improvements from driver updates it's Fine Wine™, but when Leather Jacket Man gets performance improvements from driver updates it's only because they have previously intentionally gimped performance.
Whatever happened to judging everyone by the same measure?
 
I see you still haven't looked into the miraculous nVidia driver performance increases which always suddenly appeared as soon as competitors caught up or surpassed them. ...

In reality, they improve drivers on a fairly evenly spaced cadence, and you are reading some kind of special purpose/meaning into the releases you happen to take notice of.

People are particularly prone to seeing this kind of pattern when it really isn't there. Especially people with an agenda, which predisposes them to seeing a behavior they are already primed to believe in.

Take this latest driver improvement that has some of you absurdly theory-crafting: this is far from the ideal time for NVidia to "ungimp" their cards' performance.

That would have been the Super launch, right before the 5700 series launch.

That was the biggest press week for both cards; the reviews from that period will be the resource people look to when deciding which cards to buy for many months to come.

You can clearly see that NVidia made a major effort to try and quash the AMD launch and build its own momentum. They did a HW re-spin on their cards, lowered their prices and improved performance. They were not holding back because, just like I said, that week of releases and reviews was absolutely key. NVidia absolutely wanted their best showing.

Now your wacky theory is that during that big release/review week, NVidia were still just purposefully gimping their own cards.

That is just a laughably absurd theory. It would have been completely detrimental to NVidia's goals to hold back any performance option they had at that time.

Anyone thinking clearly (i.e. unbiased, without an agenda) should be able to see how badly NVidia would have missed their most important timing window, despite your absurd claim that they miraculously always deliver these gains at the right time.

So your wacky theory doesn't come close to holding water.

What NVidia's current driver release might be is a bit of a Rorschach test for how deep some people's bias runs, when they make up, or blindly believe, such poorly constructed theories without thinking them through.
 
Ah, because corporations doing practical and pretty much effortless things to make themselves a lot more money is unheard of and a conspiracy. Corporations never price-fix, never practice forced obsolescence, never disable GPU cores, never hold back anything more than what they have to in order to make the same amount of money or more...

The issue there is that you've chosen to label regular critical thinking as conspiracy.

Also, your very first statement, "No one spends extra money to make their own product worse", a position that is refuted by the existence of things such as software editions, like Windows 10 Home and Pro, shows that you don't have a reality-based understanding of the topic. Why aren't all editions of Windows 10 simply Windows 10 Enterprise, which has everything from all the other editions in it? Because Microsoft makes more money by locking advanced capabilities behind higher-priced options.

Those are different editions of operating systems... it's too complex to maintain completely different builds, so they disable features you aren't paying for and don't want/need. Nothing new, but also not proof that nVidia purposely hampers performance.

The exact same applies to GPUs and how much performance is on tap. Why does Nvidia sell cut-down versions of higher-tier GPUs at lower prices, when it costs Nvidia money to stunt the performance of the cut-down GPUs? No sane company would spend money to make their product worse, right?

And this is done with CPUs as well, by Intel and AMD. They lock out the defective portions and sell the 8-core die as a 6-core. It 1) keeps them from wasting expensively produced dies, and 2) supports different markets without the expense of a different (expensive) die tape-out. It's not "spending extra money to cripple performance", it's "spend a little to save a lot". A completely logical thing to do, and wholly unsupportive of your argument.

Actually, companies would, and they do as standard practice - and it's nothing new. The more performance Nvidia or another GPU company gives out now, the less performance people will pay for later. But, if Nvidia isn't better than AMD, they lose market share and profits. So, the strategic thing to do is to be just enough ahead of AMD to entice buyers to their product, but to not be so far ahead that Nvidia loses out on future GPU purchases because the old stuff is still good enough.

You've made some logic errors by taking well-known tactics that save companies production and engineering effort/costs, as explained above, as evidence that nVidia would gimp performance to be "just good enough".

One makes sense, the other is a bit of fantasy with no proof.

Windows 10 Home exists, and so do an untold number of other such analogous items, therefore companies do spend extra money to make their own products worse. And they do it because being strategic with what they offer can make them more money. This is basic stuff.

Your denial of the reality of what is a basic business strategy represents the conspiracy theory.

Already refuted, see above.
************************************************************************************************************************************************************
So, I've installed these new drivers. With the low latency mode set to Ultra, Star Wars Battlefront II is crashing pretty often. Turned it back to off (default) and it's better. Definitely feels like they made some big changes under the hood... vs. "oh, they just decided to ungimp their chips"...

Funny thing is, if they really were gimping their own GPUs this way, it would mean they are light-years ahead of AMD in performance/design/efficiency... as AMD has had to go 7nm and run their GPUs on the bleeding edge just to equal a 1080 Ti (over 2 years old) / 2070 Super (and then only in some games), and can't even touch the 2080 Super or 2080 Ti except in their wet dreams...
 
Those are different editions of operating systems... it's too complex to maintain completely different builds, so they disable features you aren't paying for and don't want/need. Nothing new, but also not proof that nVidia purposely hampers performance.

That is the exact same thing. Disabling and removing features takes time and money, and lots of coding.

It doesn't have to be done: Microsoft could simply sell Enterprise to everyone, and then Microsoft would spend less time and money on development and support while customers would get more. But it makes Microsoft more money to spend extra time and money degrading and limiting their own product, selling no more of the full thing than is necessary to retain market share. And it takes far more effort to cut down Windows than it would to modify some, possibly very few, variables in a driver.
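(A toy sketch of the point, not Microsoft's actual code: the feature names are real Windows Pro-vs-Home examples, but the table and function are made up for illustration. Every check like this is code someone had to write, test, and support purely to switch a capability off for the cheaper tier.)

```python
# Toy illustration only: one codebase, features switched off per edition.
# The cheaper SKU needs MORE engineering (gating code, testing, support),
# not less. The FEATURES table here is hypothetical.
FEATURES = {
    "home":       {"bitlocker": False, "group_policy": False},
    "pro":        {"bitlocker": True,  "group_policy": True},
    "enterprise": {"bitlocker": True,  "group_policy": True},
}

def feature_enabled(edition: str, feature: str) -> bool:
    # Every check like this exists purely to *remove* capability
    # from the lower tiers of the same product.
    return FEATURES.get(edition, {}).get(feature, False)

assert not feature_enabled("home", "bitlocker")
assert feature_enabled("pro", "bitlocker")
```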

Pointing out that fact wasn't an attempt at proving that Nvidia is doing it with their drivers. It was, however, showing that "No one spends extra money to make their own product worse" is a false statement. Actually, lots of businesses do that, and that's kind of the idea behind forced obsolescence.

And this is done with CPUs as well, by Intel and AMD. They lock out the defective portions and sell the 8-core die as a 6-core. It 1) keeps them from wasting expensively produced dies, and 2) supports different markets without the expense of a different (expensive) die tape-out. It's not "spending extra money to cripple performance", it's "spend a little to save a lot". A completely logical thing to do, and wholly unsupportive of your argument.

Nope, it's very relevant. It also isn't done simply to lock out defective portions, but to reduce the performance of non-defective parts of a chip so that it can be sold as a lower model, or so that people don't get too much performance for their money, costing the company future sales.

Companies don't want another Sandy Bridge situation.

You've made some logic errors by taking well-known tactics that save companies production and engineering effort/costs, as explained above, as evidence that nVidia would gimp performance to be "just good enough".

One makes sense, the other is a bit of fantasy with no proof.

Actually, that claim is fantasy because you've misunderstood what you've read: explaining the concept isn't an attempt at proving Nvidia is doing it. The argument I made refuted the claim that "No one spends extra money to make their own product worse". Taking that refutation as a claim that Nvidia is proven to be doing it is something coming from your own mind.

Already refuted, see above.

I see that you failed to refute my post in any way.


So, I've installed these new drivers. With the low latency mode set to Ultra, Star Wars Battlefront II is crashing pretty often. Turned it back to off (default) and it's better. Definitely feels like they made some big changes under the hood... vs. "oh, they just decided to ungimp their chips"...

Funny thing is, if they really were gimping their own GPUs this way, it would mean they are light-years ahead of AMD in performance/design/efficiency... as AMD has had to go 7nm and run their GPUs on the bleeding edge just to equal a 1080 Ti (over 2 years old) / 2070 Super (and then only in some games), and can't even touch the 2080 Super or 2080 Ti except in their wet dreams...

Those are logically fallacious arguments. Your anecdotal experience is anecdotal. And if Nvidia were gimping their drivers, it wouldn't mean Nvidia is any further ahead of AMD than the performance gains themselves already show. The gain is the same size either way: it tells you how far ahead Nvidia's hardware is, whether it is revealed by new driver development or by un-gimping the driver software.
 
Let's play along, and give you the point that companies will do whatever to sell multiple tiers of hardware/software. They do it.

Still just a supposition on your part that nVidia is purposely gimping performance via drivers. And there's never been proof of this. The above practices are not evidence or proof that nVidia gimps performance in drivers. So your statement is still in conspiracy theory land/fantasy.

It's far more likely that the occasional complete rewrite of code can produce significantly more efficient code, resulting in better performance. I say occasional because these big jumps are actually pretty rare these days... I can't even remember the last time an nVidia driver delivered up to 28% performance jumps (in one game)... Maybe back in the Detonator days with TNT2s and the first GeForce GPUs, when the GPU was a brand-new piece of computing hardware. Makes sense they would learn over time how to code for it better.

Edit: In fact, [H] did a few reviews where they took several old generations of both AMD and nVidia cards and re-tested them with both the cards' launch driver and the current driver at the time. If nVidia (or AMD) was gimping performance on purpose, we would have seen it in those results, and we didn't. nVidia had the most consistent performance between the launch driver and the current driver; there were small increases, usually just a few percent or a few fps. AMD, on the other hand, had some larger swings. We took that to mean that AMD was finally getting the drivers right, not that AMD had purposely gimped their cards and then ungimped them...
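(For the curious, the arithmetic behind those launch-vs-current comparisons is trivial. Here's a sketch with made-up fps numbers, not the actual [H] review data.)

```python
# Hypothetical launch-driver vs. current-driver results (fps) --
# numbers invented for illustration, NOT the actual [H] data.
launch  = {"DOOM (Vulkan)": 98.0, "Hitman (DX12)": 71.0, "GTA V (DX11)": 88.5}
current = {"DOOM (Vulkan)": 104.0, "Hitman (DX12)": 75.5, "GTA V (DX11)": 89.0}

for game, old_fps in launch.items():
    new_fps = current[game]
    delta = (new_fps - old_fps) / old_fps * 100  # percent change
    print(f"{game}: {old_fps:.1f} -> {new_fps:.1f} fps ({delta:+.1f}%)")
```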

Someone go find those reviews; we got some readers that need educating.
 
Someone go find those reviews; we got some readers that need educating.

Ask and you shall receive!

https://www.hardocp.com/article/2017/01/30/amd_video_card_driver_performance_review_fine_wine

https://www.hardocp.com/article/2017/02/08/nvidia_video_card_driver_performance_review/
 
So in essence when AMD gets performance improvements from driver updates it's Fine Wine™, but when Leather Jacket Man gets performance improvements from driver updates it's only because they have previously intentionally gimped performance.
Whatever happened to judging everyone by the same measure?

GREEN MAN BAD!
 

Thanks! These are exactly the ones I remember reading.

Brent Justice said:
NVIDIA Driver Performance Over Time

Simply put we did not see a lot of NVIDIA driver performance improvements over time on either video card. When we did experience driver performance gains over time the actual percentage difference was very small, generally well under 5%. There was the occasional game where a bug fix, or new game support was added that boosted performance. However, from "Game Ready" driver to "Game Ready" driver, the improvements were nil to none.

In newer APIs like Vulkan API in DOOM and DX12 in Gears of War 4 we saw the largest improvements on the newer GeForce GTX 1080 video card using the Pascal architecture. In Hitman DX12 for example the GeForce GTX 1080 received performance improvements, while the GeForce GTX 980 Ti was stagnant.

In Gears of War 4 we saw a greater improvement with the GeForce GTX 1080 than we did with the GeForce GTX 980 Ti. The Maxwell GPU GeForce GTX 980 Ti does not support Async Compute in Gears of War 4. However, Pascal GPUs (GTX 1080/1070/1060/1050) do support Async Compute. Only in DOOM with the Vulkan API, did we see both improve almost equally.

Outside of Vulkan API and DX12 API games, both the GeForce GTX 980 Ti and GeForce GTX 1080 were rather stagnant in terms of performance updates in DX11 games. We just did not see anything worthy of noticing while playing or changing the gameplay experience. Most performance differences fell within what we would consider a margin with the real-world game testing we do.

Maybe now we can get back on topic, to discussing the new driver...

Anyone else having game crashes since this latest driver, specifically when enabling the Ultra low latency option?
 
This is far from the ideal time for NVidia to "ungimp" their cards' performance.

That would have been the Super launch, right before the 5700 series launch.
Good point. If they really intentionally held performance back, the Super card launch would have been the time to unleash this, not freaking Gamescom - a lowly gaming show for euromullets with the terrifying sandals and black socks ensemble.
 
You are still running at half the native resolution of your monitor. It isn't going to look great.

The idea behind integer scaling is you can take one pixel and use it to fill four pixels. If done right, it won't look blurry at all.
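(A minimal sketch of the idea in Python/NumPy, assuming a 2x factor and an RGB frame: pure nearest-neighbour pixel duplication, not NVIDIA's actual driver code.)

```python
import numpy as np

def integer_scale(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Scale an image by a whole-number factor via pure pixel duplication.

    Every source pixel becomes a factor x factor block of identical
    pixels, so edges stay razor sharp -- no interpolation, no blur.
    """
    # Repeat rows, then columns: an (H, W, C) image becomes
    # (H * factor, W * factor, C).
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# A 1280x720 frame scaled 2x fills a 2560x1440 panel exactly.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
scaled = integer_scale(frame, factor=2)
assert scaled.shape == (1440, 2560, 3)
```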
 
The idea behind integer scaling is you can take one pixel and use it to fill four pixels. If done right, it won't look blurry at all.

Maybe there is a better word to describe it. But I consider a native 1280x720 panel displaying a 1280x720 image blurry compared to a native 2560x1440 panel displaying a 2560x1440 image.
 
Anyone know how long driver releases are worked on before they are released? Like the current version might have been a few months in development? I'm asking b/c I don't really know.
 
Guys take a look at this.




It notes performance improvements, but in other titles there is a deficit. Can anyone else confirm this? I will be checking into it soon, once I install it.
 
Guys take a look at this.




It notes performance improvements, but in other titles there is a deficit. Can anyone else confirm this? I will be checking into it soon, once I install it.

I've always wondered, when AMD or nVidia release drivers and claim improvements in specific games, whether they made general driver improvements that happen to affect certain games, or added game-specific optimizations. Subtle difference, I know, but if they specifically target certain games, and I happen not to play those games, then the driver at best does nothing for me and, as that reviewer saw, could hamper other games.
 
Guys take a look at this.



It notes performance improvements, but in other titles there is a deficit. Can anyone else confirm this? I will be checking into it soon, once I install it.


Nearly every driver release sees some performance difference in some games. GTA V, for example, saw a massive fps loss on Pascal for a few releases until nVidia fixed it. There is never a one-driver-fits-all scenario, but so far this one is the best in a long time. I don't think anyone minds some fps lost in Crysis 3. The most worrying loss in that video is Metro Exodus; anyone intent on playing it at max fps could revert drivers.
 