GeForce Game Ready Driver 344.48 WHQL

HardOCP News

NVIDIA sends word that its brand new GeForce Game Ready Driver 344.48 WHQL is now available.

The new GeForce Game Ready driver, release 344.48 WHQL, allows GeForce owners to continue to have the ultimate gaming experience. This driver brings support for Dynamic Super Resolution (DSR) to Kepler and Fermi desktop GPUs. In addition, this Game Ready WHQL driver ensures you'll have the best possible gaming experience for the latest new blockbuster titles including Lords of the Fallen, Civilization: Beyond Earth, and Elite: Dangerous.
 
Application Profiles
*Lichdom: Battlemage – SLI profile added
*Lords of the Fallen – SLI profile added
*Ryse: Son of Rome – SLI profile added, stereo blocked
*Sleeping Dogs Definitive Edition – SLI profile added
*The Vanishing of Ethan Carter – SLI profile added
 
Surround - NVIDIA Control Panel
*Added support for up to 5 displays
*Added support for G-SYNC displays
 
NVIDIA G-SYNC
*Added support for cloned G-SYNC displays as well as cloned G-SYNC/non-G-SYNC displays
*Added support for G-SYNC displays in a Surround configuration

Very nice, but still no acknowledgement of the voltage issue in SLI? Also says MFAA is now available, but only in single-card configurations... :mad:
 
Is this confusing to anyone else?

3D Vision Profiles
Added or updated the following profiles:
• Dead Rising 3 – Not Recommended
• Strife – rated as Fair

3D Compatibility Mode Support
Support for 3D Compatibility Mode has been added for the following games:
• Dead Rising 3 – rated as Excellent
• Strife – rated as Excellent

*edit* Apparently, Compatibility Mode is an alternate form of 3D Vision that is enabled by default for titles with specific profiles. However, it requires DX10/11 and does not support Surround mode.
 
Still no SLI profile for Dead Rising 3... damn it, Nvidia.

This is why I bought two of your GTX 980s - to overpower games like this.

Dead Rising 3 can't hold a solid 60 FPS @ 1080p on a single GTX 980.

They're really dropping the ball on this one.
 
I kinda expected DSR to make the trip (if at all possible) to Kepler (where larger VRAM loadouts became commonplace) - however, DSR as a Fermi option is a shocker.

Refresh my memory - what is the OLDEST Fermi-based GPU?

GTX 480
 
Wow I'm looking forward to this, I've been wanting to test DSR ever since it was announced.
 
I'm not all that shocked about Fermi support, honestly. There are slower modern cards (GTX 750) that are getting DSR support, so why not?
 
I'm not all that shocked about Fermi support, honestly. There are slower modern cards (GTX 750) that are getting DSR support, so why not?

Two words - GPU architecture.

GTX 750 is "baby Maxwell", and is not necessarily "slower" than Kepler. It is "bandwidth starved" compared to Kepler, but that is all.

The shock comes in with the GTX 550 Ti - a Fermi-era card that is still rather commonplace. Yes, it supports DSR. I have enabled it with deliberately conservative settings (double my current screen resolution, maximum quality), and so far only for my least-taxing game, Bejeweled 3. That should create a DSR resolution of 3840x2160 if I have the math right. I had planned to apply DSR only to games that are firewalled otherwise, and Bejeweled 3's settings ARE, in fact, firewalled otherwise - hence it makes a great testbed for DSR.
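For anyone checking that math: NVIDIA's DSR factors are pixel-count multipliers, so each axis scales by the square root of the factor. A quick sketch (plain arithmetic, nothing driver-specific):

```python
import math

def dsr_resolution(native_w, native_h, factor):
    """Internal render resolution for a given DSR factor.

    DSR factors multiply the total pixel count, so each axis
    is scaled by the square root of the factor.
    """
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

# The 4.00x factor doubles each axis: 1920x1080 renders at 3840x2160.
print(dsr_resolution(1920, 1080, 4.00))    # (3840, 2160)

# The factor NVIDIA displays as 1.78x is (4/3)^2 = 16/9,
# which turns a 1080p panel into an internal 1440p render.
print(dsr_resolution(1920, 1080, 16 / 9))  # (2560, 1440)
```

So "double current screen resolution" on each axis corresponds to the 4.00x factor, and the 3840x2160 figure checks out for a 1080p panel.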
 
Still no SLI profile for Dead Rising 3... damn it, Nvidia.

This is why I bought two of your GTX 980s - to overpower games like this.

Dead Rising 3 can't hold a solid 60 FPS @ 1080p on a single GTX 980.

They're really dropping the ball on this one.
The developers do not want to work with NVIDIA to get SLI working, so NVIDIA just forced single-GPU in the drivers since there were problems with the default AFR modes.

HardOCP Dead Rising 3 Patch 4 Announcement said:
There is a new patch out for Dead Rising 3 that addresses a crash-on-boot issue. Also, if you are running a SLI / CrossFire set up, you are going to want to read the quote below:
We also want to comment on some of the issues people are having when trying to force SLI and Crossfire to work. Some players have set up their systems to force usage of both GPUs by default, and some players have tried to force SLI/Crossfire ON while playing DR3. While this can benefit some games, DR3 PC does not make use of dual-GPUs. If your system is set to force dual-GPU usage when playing games, you can expect to see lower frame rates when compared to not forcing this setting.
 
Two words - GPU architecture.

GTX 750 is "baby Maxwell", and is not necessarily "slower" than Kepler. It is "bandwidth starved" compared to Kepler, but that is all.
I was referring to the GTX 480 mentioned above, not the entire Kepler architecture.

A GTX 480 is still decidedly faster than a GTX 750 :p

The shock comes in with GTX 550 Ti - a rather commonplace (still) Fermi-era card.
Not all that shocking. If DSR supports one Fermi-based card, it supports another :p
 
Any benchmarks of DSR out there? I'm seeing 10-30% hit. Currently, I'm sticking with 3x, because it looks damn good at 1080p.
 
1.78x (1440p on a 1080p panel) hits me for around 15% with SLI 970s. 4.00x (4K) hits me for around 50%.

I'm pretty sure I saw a review somewhere that did some benchmarks with DSR... But I would imagine the performance hit being similar to running at that actual resolution.
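Back-of-the-envelope: if a game were purely pixel-bound, frame time would scale with the DSR factor, since the factor is exactly the pixel-count multiplier. A toy estimate (illustrative only; real-world hits are smaller because geometry, CPU, and other per-frame work doesn't scale with resolution):

```python
def worst_case_fps(native_fps, dsr_factor):
    """Rough lower bound on FPS under DSR, assuming frame time is
    dominated by per-pixel work (shading / fill rate), which scales
    with the DSR pixel-count factor."""
    return native_fps / dsr_factor

# A 60 FPS game under 4.00x DSR, if fully pixel-bound:
print(worst_case_fps(60, 4.00))  # 15.0
```

The hits reported above (~15% at 1.78x, ~50% at 4.00x) are much milder than this bound, which suggests those games weren't purely pixel-bound at native resolution.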
 
SLI and voltages are still not fixed. Tested at stock settings.

Blah
 
Combining DSR and a PhysX-heavy game = mega performance hit. I don't recommend using it in any of the Batman games :D

Presumably, both DSR and PhysX use the same set of compute units to do their work, and compete for resources.
 
Combining DSR and a PhysX-heavy game = mega performance hit. I don't recommend using it in any of the Batman games :D

Presumably, both DSR and PhysX use the same set of compute units to do their work, and compete for resources.

I had no issues with Borderlands TPS at one notch below 4K with Ultra PhysX, but Batman is probably more taxing. DSR should be fine for most DX9/10 games at or near 4K, especially with SLI.
 
And this, friends, is a major reason why I avoid both SLI and Crossfire like the plague.

It'll never be as smooth and trouble-free as single-GPU. I want my system to work, without worrying about random weird issues caused by an increased-complexity configuration.

Honestly not feeling the need for multi-GPU anyway. Still gaming just fine at 5760x1200 with a single GTX 780.
 
I love SLI.

It's pretty trouble-free. Dead Rising 3 is the first game I've played in quite some time that hasn't supported it perfectly.

For everything else its brute-force power is very helpful. I couldn't hold 60 FPS on half the stuff I'm playing now @ 4K DSR if I only had one GPU.
 
excitement fading :(

anger growing :mad:

I said the following on the GeForce forums regarding this:

:(

"Well, having them confirm the issue is being worked on is a big step. Maybe they have a beta on the way that will address this. Keep in mind the WHQL process at Microsoft takes several weeks, so this driver was probably finalized and submitted for WHQL certification before they confirmed this issue.

I don't want people to think I'm irrationally optimistic, but sadly we need to wait just a little longer."
 
And this, friends, is a major reason why I avoid both SLI and Crossfire like the plague.

It'll never be as smooth and trouble-free as single-GPU. I want my system to work, without worrying about random weird issues caused by an increased-complexity configuration.

Honestly not feeling the need for multi-GPU anyway. Still gaming just fine at 5760x1200 with a single GTX 780.

No choice if you want to push 120 frames, even at 1080p.
 
And this, friends, is a major reason why I avoid both SLI and Crossfire like the plague.

It'll never be as smooth and trouble-free as single-GPU. I want my system to work, without worrying about random weird issues caused by an increased-complexity configuration.

Honestly not feeling the need for multi-GPU anyway. Still gaming just fine at 5760x1200 with a single GTX 780.
Either you're cherry picking your games or it sounds like antialiasing quality isn't much of a priority for you.

I hear you, though, about SLI / Crossfire; honestly, the way it's set up is insane. My understanding is that the graphics vendors have to create a custom SLI profile for EVERY SINGLE GAME instead of using some unified solution. I guess they have the money to do it, but it still strikes me as so damn inefficient.
 
No choice if you want to push 120 frames, even at 1080p.
You have plenty of choices to get 120FPS at 1080p, but they involve turning down a couple settings rather than adding a second (headache-inducing) card.

Also, 120 FPS isn't relevant to a lot of people (me included). I won't be getting a 120Hz monitor until they come in 8-bit IPS or OLED with G-Sync and/or FreeSync. I'll stick with my current 60Hz IPS monitors until a worthy upgrade is available.

Either you're cherry picking your games or it sounds like antialiasing quality isn't much of a priority for you.
Running native resolution at 60+ FPS is priority #1.

I don't mind FXAA in most games, which immediately opens up all kinds of headroom. I can also stand turning down a couple settings if it means NOT having to deal with SLI / Crossfire.

And if I ever run into a title I just CAN'T run at 5760x1200, I also have the option of dropping to single-screen. That pretty much signals that it's upgrade-time for me, though.

I hear you, though, about SLI / Crossfire; honestly, the way it's set up is insane. My understanding is that the graphics vendors have to create a custom SLI profile for EVERY SINGLE GAME instead of using some unified solution. I guess they have the money to do it, but it still strikes me as so damn inefficient.
Yeah, it's FAR from an ideal implementation.

Lucid actually had the right idea. Virtualize both graphics chips and hide them behind a single logical GPU. Wish Nvidia would implement a similar SLI mode...

I also wonder why we have yet to see them implement a dual-GPU card with both GPU dies on a single package (similar to the first generation of dual-core CPUs, where it was two physical chips on one package under the heatspreader). This was addressed as a single dual-core processor, and was indistinguishable from a true single-die dual-core as far as any software running on it is concerned. Why can't the same setup work for a graphics processor?
 
Other Updates: "Added support for Oculus Rift's drivers on G-SYNC systems"

Finally! It sucked having to uninstall the rift or disable g-sync in order to use one or the other.
 
The DSR stuff is kind of fun to mess with, though also easy to break things if you push too far.

I can set 3620x2036 for FFXIV on my 2560x1440 screen, and it's still 60 FPS and looks pretty sharp with no FXAA, but if I push to 5120x2880 I get no image at all.

It's kind of a pain that it has to be fullscreen-only too, since it makes mousing between displays a bit more of a hassle. Also funny that FRAPS now grabs screenshots at 3620x2036, putting my screenshot folder in danger of exploding....

Wish I had some 970's to toy with.... sigh.
 
I can set 3620x2036 for FFXIV on my 2560x1440 screen, and it's still 60 FPS and looks pretty sharp with no FXAA, but if I push to 5120x2880 I get no image at all.
Bet you something in the rendering engine is using a 32-bit integer and freaks out if the resolution goes higher than 4096 pixels on either axis.

I've run into that problem in some older games when running Surround (mostly older OpenGL titles). They totally crap out if the screen resolution goes wider than 4096 pixels.
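As an aside, 4096 is also exactly where an unsigned 12.4 fixed-point screen coordinate (12 integer bits, 4 subpixel bits, packed into 16 bits) wraps around, a format some older rasterizer code used. This is purely a toy illustration of that hypothesis, not a claim about any particular engine:

```python
def to_fixed_12_4(x):
    """Pack a screen coordinate into unsigned 12.4 fixed point
    (16 bits total); values at or above 4096 wrap around."""
    return int(x * 16) & 0xFFFF

def from_fixed_12_4(v):
    """Unpack a 12.4 fixed-point value back to a float."""
    return (v & 0xFFFF) / 16

print(from_fixed_12_4(to_fixed_12_4(3620)))  # 3620.0 -- fits
print(from_fixed_12_4(to_fixed_12_4(5120)))  # 1024.0 -- wrapped past 4096
```

Which would match the symptom: 3620x2036 renders fine, while coordinates past 4096 silently wrap and produce garbage or nothing.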
 
Bet you something in the rendering engine is using a 32-bit integer and freaks out if the resolution goes higher than 4096 pixels on either axis.

I've run into that problem in some older games when running Surround (mostly older OpenGL titles). They totally crap out if the screen resolution goes wider than 4096 pixels.

That's a good guess; I hadn't thought of it myself.

I got it up to a higher resolution, 3840x2160, but at that point the frame rate starts to drop seriously, so for now I won't bother with it. For some extra-high-resolution wallpaper screenshots, though, I could see myself messing with it some more.
 
After I downloaded and installed this driver, I can't delete the installer from my desktop; it says it's in use by Internet Explorer. WTF?

Is anyone else having the same issue?
 
You have plenty of choices to get 120FPS at 1080p, but they involve turning down a couple settings rather than adding a second (headache-inducing) card.

Also, 120 FPS isn't relevant to a lot of people (me included). I won't be getting a 120Hz monitor until they come in 8-bit IPS or OLED with G-Sync and/or FreeSync. I'll stick with my current 60Hz IPS monitors until a worthy upgrade is available.

I hate compromising on settings just to hit 120 FPS. Until they make a single GPU powerful enough to run everything maxed out at steady 120 frames at 1080p SLI is the only way to go for me.

I personally don't mind SLI since I'm not sensitive to microstutter, and my personal experience is it doesn't become an issue until frames start dipping into the low 40s.
 
After I downloaded and installed this driver, I can't delete the installer from my desktop; it says it's in use by Internet Explorer. WTF?

Is anyone else having the same issue?

Firefox pulls this same gag occasionally. Reboot and delete, or restart the desktop (Explorer), browser, etc.
 
Yeah, looks nice for games that don't support Surround.

Really wanted to try out DSR in some older games (Left4Dead2, Space Marine, etc.) in Surround, but the DSR option vanishes from the list as soon as Surround is enabled.
 
Application Profiles
*Lichdom: Battlemage – SLI profile added
*Lords of the Fallen – SLI profile added
*Ryse: Son of Rome – SLI profile added, stereo blocked
*Sleeping Dogs Definitive Edition – SLI profile added
*The Vanishing of Ethan Carter – SLI profile added
 
Surround - NVIDIA Control Panel
*Added support for up to 5 displays
*Added support for G-SYNC displays
 
NVIDIA G-SYNC
*Added support for cloned G-SYNC displays as well as cloned G-SYNC/non-G-SYNC displays
*Added support for G-SYNC displays in a Surround configuration

Very nice, but still no acknowledgement of the voltage issue in SLI? Also says MFAA is now available, but only in single-card configurations... :mad:

Granted, I'm finished with the game, but I wish they would release a damn SLI profile for Shadow of Mordor. The workaround is fine, but I would like to see what a real profile would have done.

Also pretty bummed about the MFAA only for single cards right now. I guess they will add it to SLI configurations soon though. DSR only took a week or two to get added to SLI.
 
I hope by the time GM200 is out we'll have a semi-decent driver. Otherwise fuck this.
 
It might just be placebo, but I used DSR in World of Tanks. Not only is my FPS about 20 higher (I was at about 50 before; now I'm at 65-75 FPS), but 4x DSR makes the tanks look AWESOME.

I'll have to do more testing.
 
I randomly got a BF4 crash using an overclock that used to be rock solid on the previous driver. Hopefully it's just a coincidence.
 