980 Ti SLI scaling issue with new 4K monitor

Island

Gawd
Joined
Sep 28, 2005
Messages
883
Greetings, fellas. I ran into a strange issue that I was hoping to get some answers on. I recently replaced my Dell U3011 with a new Samsung UN48JS9000. I'm running it over HDMI in UHD and PC/Game mode. I have the newest NVIDIA drivers installed, and I'm still on Windows 8.1 64-bit. With my Dell 30" monitor, my 980 Tis were killing most every game I play, with GPU usage pegged in the high 90s on both cards. Even when I enabled DSR via GeForce Experience, games played perfectly.

Using my new Samsung 4K TV, I notice that when I play games like The Witcher 3, Skyrim (heavily modded), etc., the Afterburner OSD shows the main GPU at only around 70-75% usage and the second card around 80%. WTH is going on? You would think that with a higher native resolution more of the GPU would be needed, so utilization should remain high. My cards do overclock when I launch my games, so they are boosting, and temps are still fine. Any help would be appreciated. I wonder if this is a bug with the newest drivers? Should I do a reinstall? Thanks, fellas!
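For anyone who wants to sanity-check what the Afterburner OSD reports without a full overlay, here's a minimal sketch that shells out to `nvidia-smi` and parses per-GPU utilization. The query flags are NVIDIA's standard CLI options; the parsing assumes the usual `csv,noheader` output of one "NN %" line per card.

```python
import subprocess

def parse_gpu_utilization(csv_text):
    """Parse 'utilization.gpu' CSV output from nvidia-smi into a list of ints."""
    percents = []
    for line in csv_text.strip().splitlines():
        # With --format=csv,noheader each line looks like "72 %".
        percents.append(int(line.replace("%", "").strip()))
    return percents

def query_gpu_utilization():
    """Shell out to nvidia-smi; return [] if the tool isn't available."""
    cmd = ["nvidia-smi", "--query-gpu=utilization.gpu",
           "--format=csv,noheader"]
    try:
        out = subprocess.run(cmd, capture_output=True, text=True,
                             check=True, timeout=5).stdout
    except (OSError, subprocess.SubprocessError):
        return []
    return parse_gpu_utilization(out)

# Canned example showing the kind of uneven split described above:
sample = "72 %\n81 %\n"
print(parse_gpu_utilization(sample))  # -> [72, 81]
```

Polling this in a loop once a second is enough to confirm whether one card really is sitting 10 points below the other.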
 
Probably related to tessellation issues. Currently with water on high or ultra if I am near a coastline my GPU utilization drops down to the 50-60% range with overclocked Titan Xs and fps gets locked to 40. IIRC tessellation is bugged in multi GPU setups on the Witcher 3 engine, like only one card is doing the tessellating which is causing low fps in SLI.
 
This happens in multiple games for me on my js9000 with triple 780 classifieds. Fps will lock up at 40 or so, but sometimes fixes when I resize the game res.

Though I'm also having issues with some games stuttering at an unplayable pace when SLI is enabled at full screened-2160p 60hz. Disable SLI and the issues go away. I think drivers are buggy at this res right now, which is disappointing.
 

If you look on the GeForce forums, there are more than a few of us having these issues. I do think it's more driver-related, but I'm wondering if a new 6700K would help. I don't think my 3770K at 4.6GHz really needs to be upgraded at this point, but who knows. In benchmarks with certain games I've seen anywhere from a 20-30% increase with Skylake over Ivy Bridge.
 
At 4K, I doubt that your OC'd 3770K bottlenecks your gaming performance. Those 20-30% gains show up at 720p and 1080p.

Welcome to multiple-card shenanigans.
 

I just installed HWiNFO and added it to RivaTuner's OSD so I could monitor the CPU cores. All four cores never get maxed out; the third core is the most utilized at maybe 40-60%, while the other three stay below 25% in games like The Witcher 3 and GTA 5. So I still think this is a driver issue with SLI scaling. I'm still only seeing 70% on the main GPU and 75-80% on the second 980 Ti.
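The reasoning here (no single core pegged, therefore probably not a CPU bottleneck) can be sketched as a quick check over logged per-core samples. The 95% threshold is my own assumption, not something any tool defines:

```python
def cpu_bound_core(samples, threshold=95.0):
    """Given per-core utilization samples (each a list of percents, one
    entry per core), return the index of a core that stays pegged above
    `threshold` across all samples, or None if no core is saturated.
    A single saturated core is the classic sign of a main-thread bottleneck."""
    if not samples:
        return None
    n_cores = len(samples[0])
    for core in range(n_cores):
        if all(s[core] >= threshold for s in samples):
            return core
    return None

# Roughly the numbers reported above: core 3 busiest at 40-60%, rest under 25%.
log = [
    [20.0, 15.0, 55.0, 22.0],
    [18.0, 24.0, 48.0, 19.0],
    [21.0, 12.0, 60.0, 23.0],
]
print(cpu_bound_core(log))  # -> None: no core saturated, so not CPU-bound
```

With numbers like these, a saturated render thread is ruled out, which is why the low GPU usage points back at the driver or the game's frame limiter.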
 
The drivers went downhill recently it seems. I sold my 2nd Titan X because games that worked before stopped working (like BF4). I don't have time to fiddle so I ditched it.

I agree it's likely a driver issue. A 3770k at 4.6 is no joke.
 
Those GPU percentages are what I'm getting with my 980 Ti SLI at 4K with a 5820 as well. Card one bounces from 70-90% and card two is usually at 85-90%.
 
I've experienced the same with my setup. My GPU usage will go from 50-60% on one card and maybe 75% on the other up to 75-80%/85%, depending on the area. The experience is incredibly choppy too, especially when I look around in game (a fixed perspective is definitely smoother).

I've tried digging into this for quite a while and struggled to find anything that fixes it. One person on the Nvidia forums claims it has to do with vsync being bugged in Witcher 3. Like someone else mentioned, it seems that if the game doesn't hit 60fps in certain areas, it'll cap you below 60 and GPU usage goes way down. I have found that turning vsync off - which needs to be done in the Nvidia Control Panel, because turning it off in-game doesn't work - does bring back my GPU usage (at least to the 80-90% range), but then there's the tearing.

The only perfect play experience I've been able to get with the game is on my Asus ROG Swift. Then again, it kind of defeats the purpose of the 980ti setup I just upgraded to since my 980 SLI setup was more than enough to run that monitor. I really wanted (mostly) smooth play in UHD, but apparently that's a pipe dream for this game.
 
It is a vsync issue with The Witcher 3. Prior to 1.07, vsync in borderless windowed mode and fullscreen behaved differently. Borderless will lock you at 40fps if your fps is consistently a few frames below 60, like in the mid 50s. Fullscreen, up until 1.06, would move fluidly between 40 and 60 with proper triple buffering, rather than the binary 40 / 55-60 behavior. What's interesting is that for whatever reason I get fluid frame rates between 40 and 60fps in cutscenes, but during gameplay, when I drop below 57-58fps, I get locked to 40fps.

But if you look at the Xbox One, areas that used to run at 30fps in 1.06 run at 20fps in 1.07/1.08. It seems to me CDPR simply consolidated all their platforms and game modes to run with the same double-buffering mode, where you're locked to 2/3 of the vsync refresh rate, maybe to stabilize fps swings but at the cost of higher frame rates. Either way, the game was much smoother at launch with proper triple buffering on all platforms; in the latest patches CDPR traded smoothness for less input lag.
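The numbers in this post hang together if you model them. With double-buffered vsync, a finished frame has to wait for the next vblank, so effective fps gets quantized (60 → 30 → 20 at 60Hz), while the 40fps lock described above is a separate game-side cap at 2/3 of the refresh rate. This is a toy model of that behavior, not CDPR's actual implementation:

```python
import math

def double_buffered_fps(render_ms, refresh_hz=60.0):
    """With double-buffered vsync, the effective frame time is the render
    time rounded UP to a whole number of refresh intervals, so fps steps
    down in discrete jumps: 60 -> 30 -> 20 -> 15 at 60Hz."""
    interval_ms = 1000.0 / refresh_hz
    intervals = max(1, math.ceil(render_ms / interval_ms))
    return refresh_hz / intervals

def two_thirds_cap(refresh_hz):
    """The hard cap described above: 2/3 of the vsync refresh rate."""
    return refresh_hz * 2.0 / 3.0

# A frame that takes 18ms of raw GPU work (~55fps uncapped) at 60Hz:
print(double_buffered_fps(18.0))  # -> 30.0: quantized down, not 55
print(two_thirds_cap(60.0))       # -> 40.0: the lock seen in 1.07+
print(two_thirds_cap(30.0))       # -> 20.0: matches the Xbox One drop
```

It also explains the low GPU usage: once capped at 40fps, the cards only need two-thirds of the work they could do, so utilization falls into the 70s.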
 

Thanks for this, very interesting post, because 39-40fps is exactly where my FPS has been hovering. It stays there even if I set every setting to low and turn off HairWorks/AA, so I know it's not the cards holding me back. So this is a game issue rather than a driver issue?
 