Nvidia WHQL 419.17 Driver Released

AlphaAtlas

Nvidia just uploaded a new set of drivers optimized for Anthem and Dirt Rally 2.0. Among other things, Nvidia says the WHQL 419.17 drivers add SLI support for Apex Legends and Far Cry New Dawn and feature enhanced SLI support for Anthem. Nvidia fixed an artifacting bug in Battlefield V and a black texture issue in Doom, but notes that some G-Sync devices may have issues on Windows 10 and that Apex Legends might still have some issues. You can download the drivers straight from Nvidia's website, or wait for Windows Update to push them to your PC. Thanks to Armenius and cageymaru for the tip.

Nvidia also mentioned that these drivers support the recently released Video Codec SDK 9.0. Turing GPUs now support encoding HEVC B-frames, which should substantially improve H.265 video quality and/or bitrates, and Nvidia has also added some features to reduce encoding overhead. Both AMD and Nvidia GPUs have supported hardware encoding for years, but the feature has largely been ignored in mainstream applications due to the reduced encoding quality compared to CPUs, software quirks, and a general reluctance to switch from H.264 to HEVC. Now that Turing is approaching the quality levels of CPU encoding (though just how close it gets is up for debate), I suspect we'll see more applications make use of dedicated GPU video encoding blocks.
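For anyone who wants to try the new B-frame support, here's a rough sketch driving ffmpeg's hevc_nvenc encoder from Python. This assumes a Turing GPU, these drivers, and an ffmpeg build recent enough to expose b_ref_mode; the filenames and bitrate are just placeholders, not settings Nvidia recommends.

import subprocess

# Rough sketch: encode a clip with HEVC B-frames on Turing NVENC via
# ffmpeg's hevc_nvenc encoder. Assumes ffmpeg was built with NVENC
# support; "input.mp4"/"output.mkv" are placeholder filenames.
cmd = [
    "ffmpeg", "-y",
    "-i", "input.mp4",         # placeholder source file
    "-c:v", "hevc_nvenc",      # NVENC HEVC encoder
    "-rc", "vbr",              # variable bitrate rate control
    "-b:v", "8M",              # example target bitrate
    "-bf", "3",                # allow up to 3 consecutive B-frames
    "-b_ref_mode", "middle",   # use B-frames as references (Turing only)
    "-c:a", "copy",            # pass audio through untouched
    "output.mkv",
]
subprocess.run(cmd, check=True)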
 
Good news about the encoder improvements. I just swapped a 1060 for a 2060 for the superior NVENC. And no, it's not as good as CPU-only encoding, but it's fantastic in applications where doing the encode in a fraction of the time is more important than the delta in quality. A godsend for streamers as well.
 
Anyone done any SLI testing with Anthem on these drivers? My dual 1080 Tis are still struggling at 1440p.
 
Nvidia also mentioned that these drivers support the recently released Video Codec SDK 9.0. Turing GPUs now support encoding HEVC B-frames, which should substantially improve H.265 video quality and/or bitrates, and Nvidia has also added some features to reduce encoding overhead. Both AMD and Nvidia GPUs have supported hardware encoding for years, but the feature has largely been ignored in mainstream applications due to the reduced encoding quality compared to CPUs, software quirks, and a general reluctance to switch from H.264 to HEVC. Now that Turing is approaching the quality levels of CPU encoding (though just how close it gets is up for debate), I suspect we'll see more applications make use of dedicated GPU video encoding blocks.

I wish they hadn't screwed up hardware video decode on Linux.

For years they used a library called VDPAU, which was great. In fact, it was the best way to hardware decode video on Linux, but they discontinued it a couple of years ago, so under Linux no Nvidia board can hardware decode anything more than basic 8-bit HEVC.

This means that Blu-rays encoded with 10-bit HDR HEVC are freaking slide shows on my Kodi box...

They replaced VDPAU with a new library called NVDEC, but apparently the design sucks to the point where none of the open source projects like Kodi seem willing to use it, despite it being available as part of ffmpeg 4... Something to do with it depending on the poorly written EGLStreams driver Nvidia uses.

The industry seems to have coalesced around VAAPI on AMD and Intel at this point and it really works well. I wish NVIDIA would stop trying to buck open standards.
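If you're curious which of these decode paths your own ffmpeg build actually exposes, a quick check from Python looks something like this (assuming ffmpeg is on your PATH):

import subprocess

# List the hardware decode backends (vdpau, nvdec/cuda, vaapi, ...)
# the local ffmpeg build was compiled with. The first output line is
# the "Hardware acceleration methods:" header, so skip it.
out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccels"],
    capture_output=True, text=True, check=True,
).stdout
accels = {line.strip() for line in out.splitlines()[1:] if line.strip()}
for name in ("vdpau", "nvdec", "cuda", "vaapi"):
    print(f"{name}: {'yes' if name in accels else 'no'}")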
 
I wish they hadn't screwed up hardware video decode on Linux.

For years they used a library called VDPAU, which was great. In fact, it was the best way to hardware decode video on Linux, but they discontinued it a couple of years ago, so under Linux no Nvidia board can hardware decode anything more than basic 8-bit HEVC.

This means that Blu-rays encoded with 10-bit HDR HEVC are freaking slide shows on my Kodi box...

They replaced VDPAU with a new library called NVDEC, but apparently the design sucks to the point where none of the open source projects like Kodi seem willing to use it, despite it being available as part of ffmpeg 4...

MPV uses NVDEC, IIRC. And you can configure Kodi to use it as an external player.
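If anyone wants to test that path, a minimal sketch that launches mpv with its NVDEC hwdec enabled looks like this. mpv must be installed, the filename is a placeholder, and mpv falls back to software decode if the GPU can't handle the codec:

import subprocess

# Play a file through mpv's NVDEC hwdec path, with a verbose
# video-decoder log so you can confirm hardware decoding kicked in.
# "movie.mkv" is a placeholder, e.g. a 10-bit HEVC rip.
subprocess.run([
    "mpv",
    "--hwdec=nvdec",        # request NVDEC hardware decoding
    "--msg-level=vd=v",     # verbose decoder messages
    "movie.mkv",
])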
 
The industry seems to have coalesced around VAAPI on AMD and Intel at this point and it really works well. I wish NVIDIA would stop trying to buck open standards.
Well, if you are unhappy with Nvidia trying to buck open standards, vote with your wallet on future builds for a while.
 
Well, if you are unhappy with Nvidia trying to buck open standards, vote with your wallet on future builds for a while.

If only AMD made anything fast enough, I would.

I'm about ready to upgrade my Pascal Titan X to something faster to help me hit 4K ultra at 60 fps, and only just now has AMD come out with something on the level of what I already have, in the Radeon VII.

I'm probably holding off until next gen at this point. I hope AMD comes out with something worthwhile in that time, but I'm not holding my breath.

I'm willing to wait a year, but I had better see 4K performance 50% faster than my Pascal Titan.
 
SLI sure would be nice in BFV and Metro Exodus...

SLI works in Battlefield V. You have to use DX11 and the BF1 profile. I'm able to max it out at 4K with no MSAA on two 1080s and get 90+ fps. Totally playable and no flicker, unlike BF1, which flickered for me.
 
If only AMD made anything fast enough, I would.

I'm about ready to upgrade my Pascal Titan X to something faster to help me hit 4K ultra at 60 fps, and only just now has AMD come out with something on the level of what I already have, in the Radeon VII.

I'm probably holding off until next gen at this point. I hope AMD comes out with something worthwhile in that time, but I'm not holding my breath.

I'm willing to wait a year, but I had better see 4K performance 50% faster than my Pascal Titan.
I'm in the same boat. I want to ditch SLI, but nothing is going to get me to the performance point I'm looking for in a single card, which means I'm going to have to wait something like five years at the current rate of generational performance increases, unless Intel decides to throw its dick around with its forthcoming dGPU.
 
SLI works in Battlefield V. You have to use DX11 and the BF1 profile. I'm able to max it out at 4K with no MSAA on two 1080s and get 90+ fps. Totally playable and no flicker, unlike BF1, which flickered for me.

Tried it with my 2080s. No luck with the BF1 profile and Nvidia Inspector. I really want mGPU/DX12 to work with DLSS and ray tracing.
 
Re: bugs in new games, it's always tempting to blame whoever fixes a problem for having caused it. But I recall an interview with a former Nvidia engineer discussing how often even AAA games ship with rendering, texture, shader, and other bugs that, for the sake of the product experience, Nvidia bypasses or overrides in the driver.

My experience working at software companies says that's very likely. I know AMD often has similar patch notes for new games, but I've never heard anyone from AMD accept blame or credit for the cause or solution of those bugs. Still, I've always wondered.

The bugs probably go both ways, and they all have a silent gentlemen's agreement not to stir up drama.

TL;DR: Grandma wisdom says when you point one finger, you have three more pointing right back at you.
 
I'm in the same boat. I want to ditch SLI, but nothing is going to get me to the performance point I'm looking for in a single card, which means I'm going to have to wait something like five years at the current rate of generational performance increases, unless Intel decides to throw its dick around with its forthcoming dGPU.
If only AMD made anything fast enough, I would.

I'm about ready to upgrade my Pascal Titan X to something faster to help me hit 4K ultra at 60 fps, and only just now has AMD come out with something on the level of what I already have, in the Radeon VII.

I'm probably holding off until next gen at this point. I hope AMD comes out with something worthwhile in that time, but I'm not holding my breath.

I'm willing to wait a year, but I had better see 4K performance 50% faster than my Pascal Titan.


Rumor is July-ish.

We'll see. In the past, AMD was always late and disappointing, but much of that was to be blamed on GlobalFoundries. We'll see how this AMD-TSMC partnership plays out against Nvidia-Samsung. Over the next 18 months, conventional wisdom and historical precedent may no longer apply for errrbody.
 
I'm in the same boat. I want to ditch SLI, but nothing is going to get me to the performance point I'm looking for in a single card, which means I'm going to have to wait something like five years at the current rate of generational performance increases, unless Intel decides to throw its dick around with its forthcoming dGPU.

Intel's previous advantage was being a generation ahead on fabs, and thus double the transistor density. That, plus the ability to throw major bucks into GPU design and possibly build a pure raytracing chip from the ground up, was supposed to offset the established GPU makers' experience and optimizations.

But TSMC's fabs, with the Radeon VII and maybe even Navi being a generation ahead of anything Intel can ship, really wreck that former advantage. IIRC, the original talk was of building a pure raytracing chip with traditional rendering as a software-compatible mode.

For my part, I've contented myself with playing at 1080p and sitting my recliner a few extra feet back from the TV; now I can't really tell the difference between 1080p and 4K.

TL;DR: Not expecting anything serious from Intel for several years.
 
but notes that some G-Sync devices may have issues on Windows 10 and that Apex Legends might still have some issues.

It's truly ironic. Nvidia makes G-Sync. Nvidia makes their own drivers. And Nvidia's drivers, lately, have been causing issues for G-Sync?!?
 
It's truly ironic. Nvidia makes G-Sync. Nvidia makes their own drivers. And Nvidia's drivers, lately, have been causing issues for G-Sync?!?

Yeah, I installed the driver and it caused visual problems on the edge and center of my Acer Predator G-Sync monitor that I couldn't capture in a screenshot. Rebooting my PC resolved the issue. Avoid this driver if you can, especially if you have a G-Sync monitor.
 
Yeah, I installed the driver and it caused visual problems on the edge and center of my Acer Predator G-Sync monitor that I couldn't capture in a screenshot. Rebooting my PC resolved the issue. Avoid this driver if you can, especially if you have a G-Sync monitor.

Then I'll definitely avoid it. My G-Sync (PG278Q) rig is paired with a 1080 Ti and I haven't updated the driver since 399.24. It seems that ever since the RTX drivers came out, I've read of constant issues for either Pascal or G-Sync. I experienced similar problems back in the day with my 970s when we were crossing over from Maxwell to Pascal. It plays SOTTR, RE2, and Metro Exodus just fine, so I don't see any need for 'optimized' drivers yet. On my other rig, a 2080 Ti paired with a 4K/HDR/V-Sync TV, the post-400 drivers have, so far, brought various improvements.

I'm actually curious, with these new GTX cards coming out now, whether they'll finally split the drivers between RTX and GTX, since they don't necessarily seem to mix well.
 