Nvidia GeForce Game Ready Driver 416.34 WHQL Has Been Released

cageymaru

Nvidia GeForce Game Ready Driver 416.34 WHQL has been released, and it provides the optimal gaming experience for Call of Duty: Black Ops 4, SOULCALIBUR VI, and GRIP. Fixed issues in this release include:

- Games launch to a black screen when DSR is enabled. [2411501]
- Some games produce red/green/blue shimmering lines when played in full-screen mode with G-SYNC enabled. [2041443]
- [Windows Defender Application Guard][vGPU][Surround]: Surround cannot be enabled from the NVIDIA Control Panel when running the Edge browser with Application Guard over vGPU. [200444614]
- [PUBG]: Issues with shadows may occur in the game. [2414749]
- When HDR is enabled, games show green corruption. [2400448]

The release notes are located here.

Windows 10 issues:

- [Windows Defender Application Guard][vGPU][Surround]: The Edge browser with Application Guard cannot be opened when Surround is enabled. [200443580]
- [GeForce GTX 1060]: AV receiver switches to 2-channel stereo mode after 5 seconds of audio idle. [2204857]
- [GeForce GTX 1080 Ti]: Random DPC watchdog violation error when using multiple GPUs on motherboards with PLX chips. [2079538]
- [SLI][HDR][Battlefield 1]: With HDR enabled, the display turns pink after changing the refresh rate from 144 Hz to 120 Hz using in-game settings. [200457196]
- [Firefox]: Cursor shows brief corruption when hovering over certain links in Firefox. [2107201]
- [Far Cry 5]: Flickering occurs in the game. [2400207]
 
It seems both of the 416 drivers cause flickering in Witcher 3, and there is still flickering in Far Cry 5. This has been reported by many, and even the YouTuber duderandom84 had to revert to the 411 drivers in the middle of his 2080 Ti Witcher 3 video. If they can't even get drivers right in huge games like Witcher 3 and Far Cry 5, then how the hell can we expect not to have issues in less popular games?
 
Misterbobby, changing the negative LOD bias setting could get rid of the flicker. I always enable it to prevent flicker, especially if the texture is white and AA is used. Actually, the control panel has a lot of settings no one tinkers with but should. I always use high quality textures, clamp LOD bias, use fast refresh, and the highest available refresh for preferred refresh.
 
It seems both of the 416 drivers cause flickering in Witcher 3, and there is still flickering in Far Cry 5. This has been reported by many, and even the YouTuber duderandom84 had to revert to the 411 drivers in the middle of his 2080 Ti Witcher 3 video. If they can't even get drivers right in huge games like Witcher 3 and Far Cry 5, then how the hell can we expect not to have issues in less popular games?
I had to figure this out on my own yesterday. Sure enough, it was the 416 drivers causing this. 411 doesn't have this weirdness.
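For anyone rolling back who wants to double-check which driver actually ended up installed, a quick sanity check (assuming `nvidia-smi` is available, which the Nvidia driver install normally drops into `C:\Program Files\NVIDIA Corporation\NVSMI` on Windows):

```shell
# Print just the installed Nvidia driver version, e.g. 411.70
nvidia-smi --query-gpu=driver_version --format=csv,noheader
```

Handy after a driver cleanup to confirm the reinstall actually took.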
 
Misterbobby, changing the negative LOD bias setting could get rid of the flicker. I always enable it to prevent flicker, especially if the texture is white and AA is used. Actually, the control panel has a lot of settings no one tinkers with but should. I always use high quality textures, clamp LOD bias, use fast refresh, and the highest available refresh for preferred refresh.
Interesting. It was my understanding that this was one of those settings that only worked in OpenGL. I need to enable this option globally from now on, as generally nothing good can come from using a LOD bias < 0.
 
I had to figure this out on my own yesterday. Sure enough, it was the 416 drivers causing this. 411 doesn't have this weirdness.


Oh look, it is very similar to some of the freaky issues that The Division used to have that they took over a year to fix. Color me not surprised.

On the other hand, my R9-390 never had the issue. Only after switching to a 1080 did I start to see that.
 
...changing the negative LOD bias setting could get rid of the flicker. I always enable it to prevent flicker, especially if the texture is white and AA is used. Actually, the control panel has a lot of settings no one tinkers with but should. I always use high quality textures, clamp LOD bias, use fast refresh, and the highest available refresh for preferred refresh.

The driver release notes say that Negative LOD Bias clamp is no longer supported on Fermi or higher. Listed on page 9 under limitations of this release: https://us.download.nvidia.com/Windows/416.34/416.34-win10-win8-win7-desktop-release-notes.pdf
Negative LOD bias clamp for DirectX applications is not supported on Fermi-based GPUs and later.

So does changing only the negative LOD bias clamp setting affect that flicker in the Witcher 3?

I do not usually mess with anything in the Nvidia control panel, other than making sure the G-Sync settings are OK and match the display.

Since that is an apparently deprecated setting, I wonder if a driver uninstall and cleanup with a driver cleaner, a reboot, and then a fresh reinstall of the latest 416 drivers would get rid of those glitches. It could be that old registry settings are enabling old deprecated functions, resulting in the occasional glitch, in this case that texture flicker. It could also be some overlooked bug in the driver. If enabling a deprecated function fixes the flicker, that would probably be a good clue for the driver programmers to find and fix it.
 
That is texture corruption... I wouldn't say flicker. Game and/or driver issue without a doubt. The Division has had similar issues with DX12 enabled. It is the same blocky corruption too.

I have been setting LOD bias for four generations, and apparently it is only for OGL and has been disabled for the past three gens (whoa o0, need to read the manuals further next time). This setting prevented antialiased lines from flickering as you move the POV. It worked for, say, lamp posts or grated surfaces.

If you use G-Sync, you should always have V-Sync disabled in the control panel and in the game, if a configuration option is given.
 
Lol

 
Misterbobby, changing the negative LOD bias setting could get rid of the flicker. I always enable it to prevent flicker, especially if the texture is white and AA is used. Actually, the control panel has a lot of settings no one tinkers with but should. I always use high quality textures, clamp LOD bias, use fast refresh, and the highest available refresh for preferred refresh.

Actually, as of a little while back, negative LOD bias now does NOTHING. Which is hilarious, because they left the option in the control panel and didn't even bother to update the tooltip (and never do anymore), like they did at one time many years ago when Triple Buffering started affecting only OpenGL and not DirectX.
 
Interesting. It was my understanding that this was one of those settings that only worked in OpenGL. I need to enable this option globally from now on, as generally nothing good can come from using a LOD bias < 0.

Sorry, I should have replied to you too... Negative LOD bias in the NV Control Panel no longer does anything, since the 980/970 cards I believe, but don't quote me as to exactly when it stopped working for most cards on the market. You can see it in the release notes for the newest drivers, and probably earlier ones too.
 
It seems both of the 416 drivers cause flickering in Witcher 3, and there is still flickering in Far Cry 5. This has been reported by many, and even the YouTuber duderandom84 had to revert to the 411 drivers in the middle of his 2080 Ti Witcher 3 video. If they can't even get drivers right in huge games like Witcher 3 and Far Cry 5, then how the hell can we expect not to have issues in less popular games?


I played Far Cry 5 again for the first time in months, and it seems like my frame rate dropped from around 90-something to the low 60s.

Hmmm.

Guess my 1080 Ti is garbage now that a new card is out.
 
Nvidia pulling an Apple? Like how Apple gimped battery life on older phones? Hmm. I wonder if the 2080s have these issues? I noticed some weird shit in BF1 last night. Texture corruption. Ground textures looked like soup.
 
I played Far Cry 5 again for the first time in months, and it seems like my frame rate dropped from around 90-something to the low 60s.

Hmmm.

Guess my 1080 Ti is garbage now that a new card is out.

Most likely, you have GeForce Experience set to automatically optimize games... I've found that the auto optimization isn't always accurate on ultra-widescreen displays (which is what I have). Hell, it could just overestimate performance if it doesn't take things like the CPU into account. Really, I think it just makes a best guess.

Turn off "Automatically optimize games". Let it do one complete pass on optimizing performance. Then play each game and make adjustments as needed. I haven't had any unusual performance shifts with my 1080 Ti since making those adjustments.
 
Nvidia pulling an Apple? Like how Apple gimped battery life on older phones? Hmm. I wonder if the 2080s have these issues? I noticed some weird shit in BF1 last night. Texture corruption. Ground textures looked like soup.

Courage.
 