GeForce 344.75 WHQL

pandora's box

What’s New in Version 344.75
  • The latest GeForce Game Ready driver, release 344.75 WHQL, provides support for Maxwell’s new Multi-Frame Sampled Anti-Aliasing (MFAA) mode. In addition, this Game Ready WHQL driver ensures you'll have the best possible gaming experience for Far Cry 4, Dragon Age: Inquisition, The Crew, and World of Warcraft: Warlords of Draenor.

Game Ready
  • Best gaming experience for Far Cry 4, Dragon Age: Inquisition, The Crew, and World of Warcraft: Warlords of Draenor.

Gaming Technology
  • Supports Multi-Frame Sampled Anti-Aliasing (MFAA) mode.

Application Profiles
  • Added or updated the following profiles:
• Alien: Isolation
• Assassin's Creed: Unity
• Borderlands The Pre-Sequel
• Call of Duty: Advanced Warfare
• Castlevania: Lords of Shadow 2
• Dead Rising 3 - SLI-Single profile
• Divinity: Original Sin
• Dragon Age: Inquisition
• Elite Dangerous
• Escape Dead Island
• F1 2014
• Far Cry 4
• FIFA 15
• Gauntlet - SLI disabled
• GRID Autosport
• Heroes of the Storm
• IL-2 Sturmovik: Battle of Stalingrad
• Lichdom: Battlemage
• Lords of the Fallen
• MechWarrior Online
• Metro Redux
• Monster Hunter Online Benchmark
• Ryse: Son of Rome
• Sid Meier's Civilization: Beyond Earth
• Skyforge
• Sleeping Dogs Definitive Edition
• Strife
• The Crew
• The Vanishing of Ethan Carter
• TitanFall
• Train Simulator 2015
• World of Warcraft: Warlords of Draenor

Differing GPU Voltages in SLI Mode
  • When non-identical GPUs are used in SLI mode, they may run at different voltages. This occurs because the GPU clocks are kept as close as possible, and the clock of the higher performance GPU is limited by that of the other. One benefit is that the higher performance GPU saves power by running at slightly reduced voltages. An end-user gains nothing by attempting to raise the voltage of the higher performance GPU because its clocks must not exceed those of the other GPU.
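
If you want to observe this behavior on your own SLI rig, here's a minimal sketch using the NVML Python bindings (assuming the pynvml package is installed; that choice of tool is mine, not something from the release notes). NVML doesn't expose core voltage, so the graphics clock and power draw are the closest observable proxies for what the note describes: watch whether the two GPUs report matching clocks but different power draw.

```python
# Poll each GPU's graphics clock and power draw once a second.
# Assumes the pynvml NVML bindings (pip install nvidia-ml-py).
# NVML does not expose core voltage, so clock + power stand in for it.
import time
import pynvml

pynvml.nvmlInit()
handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
           for i in range(pynvml.nvmlDeviceGetCount())]

try:
    for _ in range(30):  # ~30 seconds of samples
        readings = []
        for i, h in enumerate(handles):
            clock = pynvml.nvmlDeviceGetClockInfo(h, pynvml.NVML_CLOCK_GRAPHICS)
            watts = pynvml.nvmlDeviceGetPowerUsage(h) / 1000.0  # mW -> W
            readings.append(f"GPU{i}: {clock} MHz / {watts:.1f} W")
        print("   ".join(readings))
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```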

Software Modules
  • NVIDIA PhysX System Software - version 9.14.0702
  • HD Audio Driver - version 1.3.32.1
  • CUDA - version 6.5
  • GeForce Experience - 16.13.65.2


Windows 7/8 64bit WHQL

Windows 7/8 32bit WHQL

Windows XP 32bit

Windows XP 64bit

Windows 7/8 Notebook 32bit WHQL

Windows 7/8 Notebook 64bit WHQL

344.75 WHQL Release Notes PDF
 
Differing GPU Voltages in SLI Mode
When non-identical GPUs are used in SLI mode, they may run at different voltages. This occurs because the GPU clocks are kept as close as possible, and the clock of the higher performance GPU is limited by that of the other. One benefit is that the higher performance GPU saves power by running at slightly reduced voltages. An end-user gains nothing by attempting to raise the voltage of the higher performance GPU because its clocks must not exceed those of the other GPU.


Wait, I thought this was occurring on identical GPUs, and that people were indeed gaining by evening them out.
 
Differing GPU Voltages in SLI Mode
When non-identical GPUs are used in SLI mode, they may run at different voltages. This occurs because the GPU clocks are kept as close as possible, and the clock of the higher performance GPU is limited by that of the other. One benefit is that the higher performance GPU saves power by running at slightly reduced voltages. An end-user gains nothing by attempting to raise the voltage of the higher performance GPU because its clocks must not exceed those of the other GPU.

Wait, I thought this was occurring on identical GPUs, and that people were indeed gaining by evening them out.
It is occurring on identical GPUs. Someone on the GeForce forums even tested on identical reference cards with nearly identical ASIC quality and the same thing happened. That is, before the moderators deleted all of his posts...

And yes, people are gaining performance and stability from evening them out.

My personal opinion is that NVIDIA can't figure out the problem, so instead of admitting a screw-up they've released a statement saying this is "normal" behavior... The issue didn't even exist before the 344.11 drivers came out.
 
Not the first time it's happened, and certainly won't be the last. Just look up "soldergate."

It's shit like this that makes me want to jump ship to AMD. Yeah yeah, space heaters, driver issues, blah blah. Not like the drivers are any better for Maxwell cards :rolleyes:
 
Not the first time it's happened, and certainly won't be the last. Just look up "soldergate."

It's shit like this that makes me want to jump ship to AMD. Yeah yeah, space heaters, driver issues, blah blah. Not like the drivers are any better for Maxwell cards :rolleyes:

+1. Seriously considering a pair of R390Xs with HBM.
 
Like AMD/ATI have never done anything shady...

I'm sure they have in the past, but if anything they've been doing good things more recently. Competition desperately needs to be stimulated in AMD's favor for a while. Nvidia has had too many good quarters and needs to be humbled again. I personally wouldn't mind if the R390X kicks their ass, forcing them to lower prices and refocus on the things that made Nvidia what they are today: customer service, excellent driver support, and listening to the fan base and community.

They have seriously been screwing up lately.
 
Not the first time it's happened, and certainly won't be the last. Just look up soldergate

It's shit like this that makes me want to jump ship to AMD. Yeah yeah space heater driver issues blah blah. Not like the drivers are any better for Maxwell cards :rolleyes:

I just sold my second 970. If AMD's 390X has some kind of native downsampling support, I will buy one. No more dual-card setups for me.

Here is what I posted in a thread over at Rage3D:

I simply don't have the patience anymore to deal with the issues SLI brings to the table. For one, even though you're getting a higher frame rate, it still doesn't feel as smooth as a single card. I disabled SLI in Far Cry 4 because I was having issues with shadows being too dark, and I was amazed at how smooth it felt, even though the fps was lower. Same goes for Dragon Age: Inquisition: I was having issues with flashing shadows in SLI. I disabled it, and 1) the shadow flickering is gone, and 2) it feels incredibly smooth. It's been so long since I've used a single-card setup that I think I forgot how smooth it can be.

Sure, I may not be using DSR in the latest games anymore, or insane amounts of AA, but being able to play the latest games without waiting on a new driver is awesome. SMAA takes care of enough of the aliasing for me, so stuff like TXAA and 8xMSAA is overkill in my eyes. TXAA looks like crap anyway.

I feel like as we get more next-gen console releases, SLI is going to get buggier and buggier. There's so much going on in games now that SLI needs to be reworked; I'd almost prefer split-frame rendering at this point. The fact that Nvidia's own features don't even work with SLI is telling. Some people are still having issues getting DSR working with SLI, and two cards in SLI run at different voltages. Nvidia claims forcing them to the same voltage via third-party programs doesn't improve performance, when in fact it does. Sounds like they either can't fix the issue or won't spend the time to fix it.

Nvidia: if you are going to release a driver as "Game Ready" for Far Cry 4, Assassin's Creed: Unity, and Dragon Age: Inquisition, you should probably make sure SLI works properly. The glaring bugs in all three of these games make it look like whoever came up with their SLI profiles did it blindfolded.

Finally, the simple fact that I am not gaming as much these days makes it hard to justify that second card. I sold it for what I bought it for.
 
Can someone please check whether the "render test" (bus interface) problem still exists with a GTX 970 on the new drivers? The render test worked properly with the 344.65 drivers and my GTX 970, but I had tearing when playing H.264/DivX video files on 344.65. I'm currently using 344.48 without any issues.

Thanks in advance.
 
However, is Xfire any better than SLI in terms of compatibility and performance? I had thought both were equally bad at it.
 
I'll wait.

I usually don't play these kinds of games on day one anyway: too expensive, probably buggy, and driver support will most likely be subpar (on both sides). I'll bite when they're at a 50% discount; there's no harm in waiting for single-player games to come down in price.

Edit: Has anyone tried MFAA? How big is the performance hit, and how does it actually look?
 
However, is Xfire any better than SLI in terms of compatibility and performance? I had thought both were equally bad at it.

XFire with XDMA has frame pacing and far more bandwidth than SLI (roughly 32 GB/s over PCIe vs. ~900 MB/s over the SLI bridge). It's way better at 4K resolution in particular.
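
For context, the 32 GB/s figure is just the bidirectional bandwidth of a PCIe 3.0 x16 slot, which XDMA uses instead of a dedicated bridge. A quick back-of-envelope check (my own arithmetic, not from the post):

```python
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding, 16 lanes, both directions.
lane_gb_s = 8 * (128 / 130) / 8      # ~0.985 GB/s per lane, per direction
one_way   = lane_gb_s * 16           # ~15.75 GB/s per direction
both_ways = one_way * 2              # ~31.5 GB/s -> the quoted "32 GB/s"
print(f"{one_way:.2f} GB/s one way, {both_ways:.2f} GB/s bidirectional")
# versus a dedicated SLI bridge at roughly 1 GB/s (the "900MB/s" figure).
```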
 
Actually, as the drivers mature my SLI experience gets better and better, with BF4 at least. Very smooth. Moving to the Windows 10 tech preview from Windows 7 really helped too. I think Windows 7 is dead for gaming and is not as well supported as developers would have us believe.

You can always disable the second card if necessary, and sometimes it is necessary. That's why I sprang for two 980s instead of 970s: always get the fastest single GPU, IMHO, then add a second GPU for the anti-aliasing, etc. If you have two graphics cards you are an uber-minority enthusiast anyway. I dunno, I end up waiting for these crappy ports to hit the bargain bin anyway, and by that time the drivers are ready. Win-win.

Oh, and speaking of latency: since I installed the GTX 980 and enabled SLI, I was experiencing actual system DPC latency that was screwing up my high-end USB audio DAC. I ran a tool designed to test system latency and found that the throttling of the GPUs was causing it; by throttling I mean what is commonly referred to as Nvidia's power miser. At some point, either through a driver update or a tool that disables power miser, I was able to eliminate the latency and got better performance from the system all around, though I honestly don't know what solved the issue. When people talk about latency, I'm not sure they realize the extent to which it can be a problem with video drivers. It can wreak havoc with your audio and the rest of your system, and I believe it affects gameplay as well.
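
If you want a rough read on this without a dedicated DPC tool, here's a crude user-mode probe (my own sketch; it measures scheduler wakeup jitter, which is only a proxy for true DPC latency). It requests 1 ms sleeps and records how late the wakeups actually are; large spikes tend to line up with the kind of audio dropouts described above.

```python
# Crude user-mode latency probe: request 1 ms sleeps and record the overshoot.
# Only a proxy -- real DPC latency needs a kernel-level tool -- but big spikes
# here usually correlate with audio glitches.
import time

overshoots = []
for _ in range(5000):
    start = time.perf_counter()
    time.sleep(0.001)                                  # ask for 1 ms
    overshoots.append((time.perf_counter() - start - 0.001) * 1000.0)

overshoots.sort()
print(f"median overshoot: {overshoots[len(overshoots) // 2]:.3f} ms")
print(f"99th percentile:  {overshoots[int(len(overshoots) * 0.99)]:.3f} ms")
print(f"worst:            {overshoots[-1]:.3f} ms")
```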
 
XFire with XDMA has frame pacing and far more bandwidth than SLI (roughly 32 GB/s over PCIe vs. ~900 MB/s over the SLI bridge). It's way better at 4K resolution in particular.

Bandwidth, yes; frame pacing, not even close. In fact, AMD didn't even bother with frame pacing until Catalyst 13.8, which is why XFire has traditionally suffered much more from microstutter than SLI. There's even a piece on this at this very site.
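
Microstutter is easy to put a number on if you log frame times (FRAPS-style tools write one timestamp per frame). A minimal sketch, assuming a plain-text capture named frametimes.txt with one cumulative frame-end timestamp in milliseconds per line (the filename and format are assumptions, so adjust the parsing for your tool): well-paced frames cluster tightly around the average frame time, while microstutter shows up as a fat tail.

```python
# Quantify microstutter from a frame-time log: compare the average
# frame time against the 99th-percentile frame time.
import statistics

with open("frametimes.txt") as f:                 # assumed filename/format
    stamps = [float(line) for line in f if line.strip()]

# Per-frame durations in ms, sorted for percentile lookup.
deltas = sorted(b - a for a, b in zip(stamps, stamps[1:]))

avg = statistics.mean(deltas)
p99 = deltas[int(len(deltas) * 0.99)]

print(f"avg frame time : {avg:.2f} ms ({1000.0 / avg:.1f} fps)")
print(f"99th percentile: {p99:.2f} ms")
# A p99 far above the average (say, 2x) is the numeric signature of
# microstutter: average fps looks fine but frame delivery is uneven.
```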
 