NVIDIA's new driver features ultra-low latency mode, integer scaling, and sharpening

5150Joker

Supreme [H]ardness
Joined
Aug 1, 2005
Messages
4,569
Well, that didn't take long for NVIDIA to respond to Navi. The new 436.02 Game Ready Driver adds the following features:


Game Ready
  • Provides increased performance and the optimal gaming experience for Apex Legends, Battlefield V, Forza Horizon 4, Strange Brigade, and World War Z

Gaming Technology
  • Adds Beta support for GPU Integer Scaling
  • Adds Beta support for Ultra-Low Latency Mode
  • Adds support for new Freestyle Sharpen Filter
  • Adds support for new G-SYNC compatible monitors
Driver release notes: https://us.download.nvidia.com/Windows/436.02/436.02-win10-win8-win7-release-notes.pdf


This isn't the old max pre-rendered frames setting; it's supposedly a new, improved algorithm that matches AMD's Anti-Lag feature for Navi.

Next up is support for integer scaling, which Intel recently announced, but it seems NVIDIA beat everyone to the punch. Unfortunately, this is only for Turing for now.
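For anyone wondering what the feature actually does: integer scaling is just nearest-neighbor upscaling locked to whole-number factors, so each source pixel is duplicated into a clean block instead of being smeared by bilinear filtering. A rough sketch of the idea (illustration only, obviously not NVIDIA's driver code):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Nearest-neighbor "integer scaling": every source pixel becomes an exact
// factor x factor block of identical pixels, so no blur is introduced.
// Conceptual sketch only, not NVIDIA's driver implementation.
std::vector<uint32_t> integerScale(const std::vector<uint32_t>& src,
                                   int srcW, int srcH, int factor)
{
    const int dstW = srcW * factor;
    const int dstH = srcH * factor;
    std::vector<uint32_t> dst(static_cast<size_t>(dstW) * dstH);
    for (int y = 0; y < dstH; ++y)
        for (int x = 0; x < dstW; ++x)
            dst[static_cast<size_t>(y) * dstW + x] =
                src[static_cast<size_t>(y / factor) * srcW + (x / factor)];
    return dst;
}

int main()
{
    // Example: a 1280x720 frame scaled 2x fills a 2560x1440 panel exactly.
    std::vector<uint32_t> frame(1280 * 720, 0xFF00FF00u);  // solid green test frame
    auto scaled = integerScale(frame, 1280, 720, 2);
    std::printf("scaled to %zu pixels (2560x1440 = %d)\n",
                scaled.size(), 2560 * 1440);
    return 0;
}
```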


Additionally, NVIDIA has created a new filter for Freestyle simply called "Sharpen," which improves visual clarity at a lower performance cost than the old Freestyle sharpening and is supposed to be on par with AMD's RIS, but with much broader support and more granular adjustment. According to Eurogamer, "They also point out that the amount of sharpening can be adjusted from 0 to 100 per cent, applied on a per-game basis and works with games across all major graphics APIs: DX9, DX11, DX12 and Vulkan".
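NVIDIA hasn't published how the new Sharpen filter works internally, so purely as an illustration of this kind of post-process filter, here's a classic unsharp-mask sharpen with a 0-100% strength knob like the Freestyle slider. The kernel and the strength mapping below are assumptions, not NVIDIA's algorithm:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Illustrative 3x3 unsharp-mask sharpen on a grayscale float image.
// 'strengthPercent' mimics the 0-100% slider in the Freestyle UI; the kernel
// and the strength mapping are assumptions, not NVIDIA's actual filter.
std::vector<float> sharpen(const std::vector<float>& img, int w, int h,
                           float strengthPercent)
{
    const float k = std::clamp(strengthPercent, 0.0f, 100.0f) / 100.0f;
    std::vector<float> out(img);
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            // 3x3 box blur as the low-pass estimate around this pixel.
            float blur = 0.0f;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    blur += img[(y + dy) * w + (x + dx)];
            blur /= 9.0f;
            // Add back a scaled amount of the high-frequency detail.
            const float c = img[y * w + x];
            out[y * w + x] = std::clamp(c + k * (c - blur), 0.0f, 1.0f);
        }
    }
    return out;
}

int main()
{
    std::vector<float> img(64 * 64, 0.5f);
    img[32 * 64 + 32] = 1.0f;                  // one bright pixel on a grey field
    auto out = sharpen(img, 64, 64, 50.0f);    // "50%" strength
    std::printf("center pixel after sharpen: %f\n", out[32 * 64 + 32]);
    return 0;
}
```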


Finally, the driver brings some game performance improvements, including for very popular titles like Apex Legends, which sees up to a 23% boost at 1080p.



Overall a pretty solid release. It had some issues when it was first released, with GeForce Experience being force-installed, but that should be remedied by now.

Eurogamer article: https://www.eurogamer.net/articles/...eger-scaling-and-other-fan-requested-features
 

KazeoHin

[H]F Junkie
Joined
Sep 7, 2011
Messages
8,774
Uh... How bad did the old drivers have to be that you could squeeze an extra 20% out of your card? You've been optimizing for 10 months?

Nvidia didn't have a reason to improve drivers for 10 months. Why waste the effort if people only have the choice of buying your cards?
 

Gideon

2[H]4U
Joined
Apr 13, 2006
Messages
3,168
Well, now DLSS is dead if they've got time to do this but still can't make DLSS better.
 

Meeho

Supreme [H]ardness
Joined
Aug 16, 2010
Messages
5,781
This isn't the old max pre-rendered frames setting; it's supposedly a new, improved algorithm that matches AMD's Anti-Lag feature for Navi.

Hmm, looks exactly like it:


Next up is support for integer scaling, which Intel recently announced, but it seems NVIDIA beat everyone to the punch. Unfortunately, this is only for Turing for now.

Finally! Shitty to limit it to Turing, but not unexpected for a corp.

Additionally, NVIDIA has created a new filter for Freestyle simply called "Sharpen," which improves visual clarity at a lower performance cost than the old Freestyle sharpening and is supposed to be on par with AMD's RIS, but with much broader support and more granular adjustment. According to Eurogamer, "They also point out that the amount of sharpening can be adjusted from 0 to 100 per cent, applied on a per-game basis and works with games across all major graphics APIs: DX9, DX11, DX12 and Vulkan".

This is just terrible, but more examples are needed.
 

5150Joker

Supreme [H]ardness
Joined
Aug 1, 2005
Messages
4,569
Hmm, looks exactly like it:





Finally! Shitty to limit it to Turing, but not unexpected for a corp.



This is just terrible, but more examples are needed.


The low latency feature will have to be tested, especially at Ultra, and I'm sure some website like TechSpot/HWUB will get around to it. NVIDIA claims it's new, so we'll see. I'm hoping integer scaling isn't kept as a Turing-only feature and eventually makes its way down to Maxwell/Pascal after it gets out of beta.
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,967
Uh... How bad did the old drivers have to be that you could squeeze an extra 20% out of your card? You've been optimizing for 10 months?

For some of us old fogies this is just par for the course for nVidia. Back in the day, nVidia would release new drivers with rather large performance improvements any time a competitor brought out a new product or an improvement on a current product. This happened way too regularly for it to have been anything resembling coincidence, and I'm betting we're seeing the same thing here, although the beta nature of many of the new features indicates this could be a rushed feature update.
 

Krenum

Fully [H]
Joined
Apr 29, 2005
Messages
18,948
Thanks for the info OP. Will install this new driver tonight.
 

GoodBoy

2[H]4U
Joined
Nov 29, 2004
Messages
2,461
Installed these last night but didn't enable the ultra low latency option. Gonna do that tonight.

Improvements look SWEET!! :)
 

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
11,262
Still no tests available on how the integer scaling works?

What kind of testing would you need? It should be zero overhead and what it does is simple and obvious, though of very limited usability.
 
Joined
Apr 22, 2011
Messages
820
Installed and tested last night with Ultra Low Latency; it "feels" more instant than before, imo. Would like to hear what others have to say. Hopefully it gets tested with actual equipment.
 

Lepardi

Limp Gawd
Joined
Nov 8, 2017
Messages
369
What kind of testing would you need? It should be zero overhead and what it does is simple and obvious, though of very limited usability.
Confirmation that it really works for lossless scaling (720p for 1440p and 1080p for 4K, both exact 2x factors where every source pixel should map to a clean 2x2 block). Saw a forum report complaining that it is still blurry.
 

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
11,262
Confirmation that it really works for lossless scaling (720p for 1440p and 1080p for 4K). Saw a forum report complaining that it is still blurry.

You are still running at half the native resolution of your monitor. It isn't going to look great.
 

Lepardi

Limp Gawd
Joined
Nov 8, 2017
Messages
369
You are still running at half the native resolution of your monitor. It isn't going to look great.
Looks great to me if it's just sharp like dropping resolution on a CRT.

R6: Siege used to have a lossless checkerboard upscaling coupled with 2x MSAA. Even though it was 720p, it looked great on a 27" monitor. And the performance difference was mindblowing for the small jaggie tradeoff.
 

odditory

Supreme [H]ardness
Joined
Dec 23, 2007
Messages
6,482
Uh... How bad did the old drivers have to be that you could squeeze an extra 20% out of your card? You've been optimizing for 10 months?
Weird way of looking at it. Engineers make breakthroughs and find better ways of doing things with the benefit of time.

Better, equally obtuse question: how bad does the competition have to be that they can't catch up even when Nvidia has one hand tied behind their back?
 

Armenius

Extremely [H]
Joined
Jan 28, 2014
Messages
36,240
For some of us old fogies this is just par for the course for nVidia. Back in the day nVidia would release new drivers with rather large performance improvements any time a competitor brought out a new product or an improvement on a current product. This happened way too regularly for it to have been anything resembling coincidence and I'm betting we're seeing the same thing here although the beta nature of many of the new features indicates this could be a rushed feature update.
You mean like the driver that exposed multithreaded rendering for DX11 that AMD never got around to doing themselves?
 

Krenum

Fully [H]
Joined
Apr 29, 2005
Messages
18,948

I've seen other people using this term "That's Illegal". I think it's the new form of saying something is "Cool". You kids and your newfangled lingo....


Anyways, do I need GeForce Experience to enable sharpening and integer scaling support? I usually just install the driver without it.
 

Delicieuxz

[H]ard|Gawd
Joined
May 11, 2016
Messages
1,409
Uh... How bad did the old drivers have to be that you could squeeze an extra 20% out of your card? You've been optimizing for 10 months?

They could have been bad by design, so that Nvidia could open up additional performance only if necessary to compete against AMD.

Nvidia didn't have a reason to improve drivers for 10 months. Why waste the effort if people only have the choice of buying your cards.

You mean Nvidia didn't have a reason to remove the built-in gimping.
 

Nobu

[H]F Junkie
Joined
Jun 7, 2007
Messages
8,978
I've seen other people using this term "That's Illegal". I think it's the new form of saying something is "Cool". You kids and your newfangled lingo....


Anyways, do I need GeForce Experience to enable sharpening and integer scaling support? I usually just install the driver without it.
As opposed to "that's legit"?
 

Delicieuxz

[H]ard|Gawd
Joined
May 11, 2016
Messages
1,409
:wtf:


No one spends extra money to make their own product worse.

Is there a competition for outlandish theories?

That is an assumptive position.

Who says it would cost any significant amount, or anything at all? It could be something that takes a couple of hours to tweak, or it could even cost less if it's achieved by leaving out some known optimization paths. Or it could be done by leaving in older, less-optimized code while a more optimized version is kept in storage in case it becomes needed. It could likely be done by tweaking a few known spots that reduce efficiency across the board. It is not sound to assume any serious work would be required to make a GPU perform less optimally, whether by 0.5% or 50% of its optimal performance.

If it happened, it would be a choice to make more money by not giving purchasers more performance for their dollar than necessary to stay competitive. The people who bought 20XX GPUs will be moved to upgrade their GPUs sooner if they have less performance on tap. Offering less performance per dollar allows Nvidia to stretch out performance increases and ultimately make more profit along the way.
 

Lepardi

Limp Gawd
Joined
Nov 8, 2017
Messages
369
I've seen other people using this term "That's Illegal". I think its the new form of saying something is "Cool". You kids and your newfangled lingo....


Anyways, Do I need the "Geforce Experience" to enable Sharpening and integer support? I usually just install the driver without it.
[Screenshot: the integer scaling option in the NVIDIA Control Panel, from the Gamescom 2019 Game Ready Driver announcement]


Integer scaling should be part of the drivers themselves, but sharpening is GeForce Experience exclusive.
 

mda

2[H]4U
Joined
Mar 23, 2011
Messages
2,197
This driver was crashing Division 2 for me on the first machine in my sig with or without the low latency... rolled back to the earlier driver (one of the ones released in July) and crash was gone.
 

seanreisk

[H]ard|Gawd
Joined
Aug 29, 2011
Messages
1,711
Not ever worth arguing with conspiracy theorists.

plonk

I'll stand by my statement. Turns out 'ultra-low latency mode' is not new, and was previously called 'max pre-rendered frames'. It has been offered in GeForce drivers for years but has been kept low-key. There is speculation (no facts) that pre-rendered frames and GSync 1.0 didn't play well together, and nVidia was making a push for GSync monitors. And frankly, GSync monitors are the logical choice for nVidia to put their weight behind because smooth framerates are going to give a much better user experience.

And ultra-low latency mode is supposed to be for DX9/DX10/DX11 games only because DX12 already gives control of frame queuing to the game/developer.
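For reference, the equivalent knob a DX10/DX11 application has had on its own side for years is the DXGI maximum frame latency, which caps how many frames the CPU may queue ahead of the GPU. Here's a minimal sketch of the app-side call (standard Windows/D3D11 APIs, nothing NVIDIA-specific; the driver's max pre-rendered frames / low latency setting effectively overrides roughly this same queue depth from outside the game):

```cpp
// Build on Windows, link d3d11.lib.
#include <d3d11.h>
#include <dxgi.h>

int main()
{
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &context)))
        return 1;

    // Cap the render-ahead queue to a single frame. A driver-side
    // "max pre-rendered frames" / low latency setting limits this same
    // queueing behavior from outside the application.
    IDXGIDevice1* dxgiDevice = nullptr;
    if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                         reinterpret_cast<void**>(&dxgiDevice)))) {
        dxgiDevice->SetMaximumFrameLatency(1);  // queue at most one frame ahead
        dxgiDevice->Release();
    }

    context->Release();
    device->Release();
    return 0;
}
```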

So ... A feature that GeForce drivers already had is rebranded 'ultra-low latency mode' and is announced with fanfare at Gamescom. No conspiracy, just marketing.

A basic description is at Anandtech.


P.S. The other features are fine.
 

5150Joker

Supreme [H]ardness
Joined
Aug 1, 2005
Messages
4,569
I'll stand by my statement. Turns out 'ultra-low latency mode' is not new, and was previously called 'max pre-rendered frames'. It has been offered in GeForce drivers for years but has been kept low-key. There is speculation (no facts) that pre-rendered frames and GSync 1.0 didn't play well together, and nVidia was making a push for GSync monitors. And frankly, GSync monitors are the logical choice for nVidia to put their weight behind because smooth framerates are going to give a much better user experience.

And ultra-low latency mode is supposed to be for DX9/DX10/DX11 games only because DX12 already gives control of frame queuing to the game / developer.

So ... A feature that GeForce drivers already had is rebranded 'ultra-low latency mode' and is announced with fanfare at Gamescom. No conspiracy, just marketing bullshit.

A basic description is at Anandtech.


P.S. The other features are fine.


The source you quoted isn't even sure if the algorithm is new or not, so until there's actual testing done, I'll believe NVIDIA:

"Meanwhile, perhaps the oddest part of all of this isn’t the first time that NVIDIA has offered Ultra mode. Until earlier this decade, NVIDIA’s drivers also supported a queue size of 0, which is why I’m not sure this entirely counts as a new feature. However given the tricky nature of queuing and the evolution of OSes, it’s also entirely possible that NVIDIA has implemented a newer algorithm for pacing frame submission."
 

seanreisk

[H]ard|Gawd
Joined
Aug 29, 2011
Messages
1,711
The source you quoted isn't even sure if the algorithm is new or not so until there's actual testing done, I'll believe NVIDIA...

5150 I lub j0000! I do! This is me, lubbing j0000000! :love: And GoldenTiger. And even Snowdog (in a platonic, manly sort of way.)

I'm not the one who said 'conspiracy', and I didn't think of it in those terms. My original statement was more of a three-dudes-farting-on-the-couch statement - "you got another 20% out of the card?? You had that much fat laying around?" The announced speed improvements (with few clarifications), the big reveal at Gamescom, and their current competition with AMD sorta pulled one of my nostril hairs.

We're pushing this way out of bounds. I LOVE YOU! :love:


P.S. I'm one of the people who installed the driver and got the 'GeForce Experience' experience, too.

P.P.S. I lub j000! I do! :love:

P.P.P.S. You big poopyhead.

P.P.P.P.S. There is no 'grovel' emoji? :notworthy:
 

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
11,262
So ... A feature that GeForce drivers already had is rebranded 'ultra-low latency mode' and is announced with fanfare at Gamescom. No conspiracy, just marketing.

Comprehension issue? I wasn't referring to low latency mode.

I was referring to the weird theory that NVidia was purposefully gimping their own GPUs' gaming performance, like that was some kind of nefarious benefit.

The ludicrous theories I see on this board make me think there must be a contest I wasn't told about.
 