A bit disappointed with 980Ti in SLI

Nebell

I ran 670 SLI and I must admit those cards worked like a charm; I was able to max almost everything at 1080p until one of them broke down.
Fortunately I had a valid warranty, so I decided to upgrade, and I bought my first 980 Ti at about the same time as I went to 4K resolution.

While a single 980 Ti is perfectly capable of chewing through most resource-heavy games at 4K, I still wanted to get those few demanding games to run at 60 fps.

So I bought a second 980Ti this weekend.

3DMark shows a good improvement, almost twice the performance, but that application is developed with a focus on supporting the latest hardware & tech.
The biggest problem I have is that some games don't support SLI. For example Rust. That game is gorgeous, but if I max it out I'm at 25-30 fps.
The Witcher 3 is another issue. Since it's a PC game from the ground up, I expected it to have better SLI scalability (and it does, according to reviews); however, I went from 31 fps to 40 fps with everything maxed out.
Fallout 4 is at about 40-45 fps with everything maxed out.
I have yet to test the new Tomb Raider and GTA V, but this does not really impress me as much as my pair of 670s did.
 
That's modern SLI for you: poor scaling, no profiles at launch... Send the card back, save your money for top-end Polaris/Pascal, and you'll be better off.
 
I think for all the criticism it gets, AMD scales much better in multi-card configurations than Nvidia. I absolutely loved my 290X CF setup; 4K was no problem. The bad scaling is why I decided not to get a second 980 Ti. I'll wait for the next thing, or switch back to AMD CF.
 
I think for all the criticism it gets, AMD scales much better in multi-card configurations than Nvidia. I absolutely loved my 290X CF setup; 4K was no problem. The bad scaling is why I decided not to get a second 980 Ti. I'll wait for the next thing, or switch back to AMD CF.

AMD does scale pretty well if and when there is a Crossfire profile for a particular game.

Also, you might see a driver set released 6 months after the game is out.

Nvidia has its issues, but at least they are on top of driver releases. Unfortunately some games are just not designed for SLI/CF at release... that's the curse of multi-card ownership.
 
SLI has been completely useless these last couple of years. 75% of the top AAA titles either don't support it or have piss-poor scaling. RIP SLI
 
I assume you are familiar with Nvidia Inspector and SLI compatibility bits? I have had good luck with my setup when it comes to new/unsupported/broken games, recently Fallout 4 and The Witcher 3 in particular.

780 Tis here, at 1440p.

However, when The Witcher 3 launched I was getting an average of 30-40 fps while running it in surround at 7680x1440...

I'd certainly expect you to be doing better than you report. You are monitoring clock speeds, VRAM usage, GPU load levels, etc., right? Are both cards getting 90%+ usage?

And you don't have AA cranked up to some ridiculous level, do you?
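
An easy way to spot-check that is nvidia-smi, which ships with the driver. A rough sketch that polls both cards once a second (field names per "nvidia-smi --help-query-gpu"; on Windows you may need the full path to the exe):

Code:
# poll per-GPU load, VRAM use, and core clock once a second via nvidia-smi
import subprocess, time

QUERY = ["nvidia-smi",
         "--query-gpu=index,utilization.gpu,memory.used,clocks.gr",
         "--format=csv,noheader"]

while True:
    print(subprocess.check_output(QUERY).decode().strip())
    time.sleep(1.0)

If one card sits way below the other, the game (or its profile) isn't really using SLI.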
 
I don't have issues with scaling. There are some games that may not support SLI, but most do after a couple of weeks. Without SLI, gaming can be a chore, especially if you like high FPS.

For me at 1440p I get proper scaling in Rise of the Tomb Raider, The Witcher 3, Assassin's Creed Syndicate, etc. Not sure where the 75% comment is coming from, but SLI not being supported is the exception rather than the rule. I can't see why you are getting 40-45 FPS in The Witcher 3 at 4K; possibly because you are still maxing out AA. I used to get 50-55 FPS when I played at 4K with DSR. Now I prefer 1440p with G-Sync and higher FPS over 4K with shitty FPS.
 
The only game where I saw considerable gains is GTA V. Other than that, games such as The Witcher 3 do not see much improvement whatsoever.

SLI is a waste for 980 Tis. The 970s and 980s for some reason scale a lot better than the Tis in my experience.
 
The Witcher 3 is another issue. Since it's a PC game from the ground up, I expected it to have better SLI scalability (and it does, according to reviews); however, I went from 31 fps to 40 fps with everything maxed out.
Fallout 4 is at about 40-45 fps with everything maxed out.

Kind of sounds like something else may be wrong with your setup. However, for Fallout 4 I would turn down/off God Rays and tweak Shadow Distance, as both of those settings tend to kill performance regardless of your specs.
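
If you want a quick A/B test: god rays can be toggled from the in-game console, and shadow distance lives in the prefs ini. Roughly like this (command and setting names from memory, so double-check them):

Code:
gr off                      (in-game console; "gr quality 0" also works, IIRC)

; Fallout4Prefs.ini, under [Display]
fShadowDistance=8000        ; ultra default is around 20000
fDirShadowDistance=8000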
 
Yeah, you really don't need max AA @ 4K. That's an FPS killer.
 
Yeah, you really don't need max AA @ 4K. That's an FPS killer.

"Need" is such a dirty word.

If it was on a what-you-need basis, we might as well still be playing at 320x200 with Gouraud shading.
 
The only game where I saw considerable gains is GTA V. Other than that, games such as The Witcher 3 do not see much improvement whatsoever.

SLI is a waste for 980 Tis. The 970s and 980s for some reason scale a lot better than the Tis in my experience.

The 970 and 980 cards are a lot slower. You get more benefit out of having two because there are few games that can choke a 980Ti. You basically have to run at 4k or higher to stress them. Even then some titles are still fine on a single card at 4k.
 
I would be too if I spent that much and got less than stellar results.
 
I was considering adding another Titan X to my system too, but quite a few recently released games I've been interested in have had poor SLI support or none at all. Which is a shame, because I had a pretty good experience with SLI prior to Maxwell. Like MorgothPI said, I decided to save my money and wait for the next generation of cards. I plan on building a new system around the end of the year (or whenever Skylake-E comes out) anyway.
 
Surprising, as I'm pretty happy with Titan X SLI performance (60+ fps in most games, except, unsurprisingly, The Witcher 3) at 3x 1440p + G-Sync. G-Sync in this case really helps with SLI-related microstutter. Even with G-Sync I do turn off some options to stay above 60. Most of the time the difference between features off and on (God Rays, HBAO+, Ultra shadows) isn't noticeable when playing fast-motion games.
 
Surprising, as I'm pretty happy with Titan X SLI performance (60+ fps in most games, except, unsurprisingly, The Witcher 3) at 3x 1440p + G-Sync. G-Sync in this case really helps with SLI-related microstutter. Even with G-Sync I do turn off some options to stay above 60. Most of the time the difference between features off and on (God Rays, HBAO+, Ultra shadows) isn't noticeable when playing fast-motion games.

This has generally been my experience as well. My Titan Xs have performed well in every game I've played lately, excluding Arkham Knight. That's just a broken piece of shit and not NVIDIA's fault.
 
Surprising, as I'm pretty happy with Titan X SLI performance (60+ fps in most games, except, unsurprisingly, The Witcher 3) at 3x 1440p + G-Sync. G-Sync in this case really helps with SLI-related microstutter. Even with G-Sync I do turn off some options to stay above 60. Most of the time the difference between features off and on (God Rays, HBAO+, Ultra shadows) isn't noticeable when playing fast-motion games.

I don't know why you don't get 60+ fps in The Witcher 3. I run two 980 Tis in SLI at 3440x1440 and always get 60-85 fps in all areas.
 
7680x1440 is more than double the pixels of 3440x1440...

Yeah, not really comparable, as OP's 4K res is closer to mine (11.1M vs 8.3M pixels). I do get high 50s in The Witcher 3 with hair off, shadows on high, AA off, and foliage distance on high (everything else maxed), which is very reasonable for the resolution I'm running and still very high graphics quality. When I do max everything out in The Witcher 3 I get 30-42 fps, similar to OP, so either his system is running fine or we both have something wrong.
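
Quick math on those pixel counts:

Code:
# pixels per frame at each resolution, in megapixels
for w, h in [(7680, 1440), (3840, 2160), (3440, 1440)]:
    print("%dx%d: %.1f MP" % (w, h, w * h / 1e6))
# -> 7680x1440: 11.1 MP, 3840x2160: 8.3 MP, 3440x1440: 5.0 MP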
 
AMD does scale pretty well if and when there is a Crossfire profile for a particular game.

Also, you might see a driver set released 6 months after the game is out.

Nvidia has its issues, but at least they are on top of driver releases. Unfortunately some games are just not designed for SLI/CF at release... that's the curse of multi-card ownership.
Both AMD and Nvidia have plenty of games that scale 90%+, but that's with profiles, and when you get those profiles really just depends on the game most of the time... Multi-card setups can be a labor at times to get the power you thought you'd get.
 
This has generally been my experience as well. My Titan Xs have performed well in every game I've played lately, excluding Arkham Knight. That's just a broken piece of shit and not NVIDIA's fault.

It's Nvidia's fault: every game they touch with their GamesDontWork malware becomes a POS. Funny how non-GamesDontWork titles don't have these troubles. But I do understand your position and your views :D;)
 
I have no clue what Nvidia Inspector is. I'll have to check it out when I get home.
I never used SLI profiles with my 670s, but I wouldn't mind starting to use them if that means better performance.

You guys are right that AA might be the problem here. I have it on, but in the games I paid attention to (mainly Rust) it made a visible difference, probably because I use a large TV and therefore have a lower pixel density.

I read in reviews that The Witcher 3 scales to about 170% with SLI at 4K (980 Ti). So I was surprised to go from 30 fps to 40 fps; that's only about 133%.
Funny thing is, DayZ went from 30-50 fps to a 15-20 fps slideshow. Not sure what happened there, lol :)
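
For reference, my scaling math (SLI fps as a percentage of single-card fps):

Code:
def scaling(single_fps, sli_fps):
    return 100.0 * sli_fps / single_fps

print(scaling(30, 40))  # ~133, what I'm actually getting
print(scaling(30, 51))  # ~170, what the reviews would imply (~51 fps)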

I'm not going to wait on Pascal. Nvidia is definitely not going to release a Pascal Ti anytime soon. First they will drop the Titan and the 980/970 equivalents, then the 1080 Ti will come about 6 months later. Titans will be way too expensive to SLI; I can afford two 980 Tis but not two Titans ;)
The 1080 is probably not going to be worth getting over a 980 Ti.
I'm more of a strategy/RPG/survival gamer, so 30-40 fps actually works fine for me.
 
Buy the most powerful single card you can afford and don't bother with SLI/Crossfire. Not worth it.
 
Buy the most powerful single card you can afford and don't bother with SLI/Crossfire. Not worth it.

Not really true. For 1080p 120Hz and 1440p 60Hz, sure! But for 1440p 120Hz or 4K 60Hz, single cards are not quite adequate. SLI is your only option if you want to play at top quality at the big-boy resolutions.

But on the subject of AA: I would recommend basic FXAA at 4K, as the slight blurriness it adds is completely overshadowed by the jaggedness it removes. It also gets rid of shader and post-process aliasing. I use 4K at 40" and FXAA is still just fine with me. MSAA is nice, but go over 2x or 4x in modern titles and it kills your FPS, and SSAA or DSR is completely out of the question unless you are playing really old games, like Half-Life 2 old.
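
To put numbers on why SSAA/DSR is out of the question at 4K (DSR factors multiply the pixel count of the base resolution):

Code:
# effective render load for Nvidia's DSR factors on a 3840x2160 base
base_mp = 3840 * 2160 / 1e6
for factor in (1.78, 2.25, 4.0):
    print("%.2fx DSR: %.1f MP per frame" % (factor, base_mp * factor))
# 4.00x DSR means shading ~33 MP per frame, four 4K screens' worth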
 
Oh boy, I need to read up on AA :)
Gone are the days when you could just buy the best stuff and then crank up the settings without worrying too much.
 
Dual-GPU solutions will leave you feeling that way until developers figure out a simple way to get this into every game. That will probably be a while.

Funny thing is, DayZ went from 30-50 fps to a 15-20 fps slideshow. Not sure what happened there, lol
I love DayZ, but it's an awfully programmed game. It is so random in its performance.
 
Nvidia Inspector is amazing :D
I just went from 30-34 fps in Rust to over 55 fps by using The Witcher 3 SLI bits.
Thanks for that tip!
 
Just be aware that using compatibility bits from other games may or may not cause graphical anomalies with SLI. If you notice anything weird, just try a different bit. I think there is a database somewhere where users have compiled which bits work best with different games.
 
Nvidia Inspector is amazing :D
I just went from 30-34 fps in Rust to over 55 fps by using The Witcher 3 SLI bits.
Thanks for that tip!

No problem. That tool is an absolute must for SLI.

It also renders MSI Afterburner and EVGA Precision 99% useless. Not to mention I became very suspicious of the overlay functions of both programs, suspecting the graphs and the overlays themselves of causing problems with games and the Nvidia drivers after a while.

Don't forget to run the monitors in the background so you can watch GPU usage, GPU power, voltage, VRAM usage, clock speeds, bus usage, and temps. It becomes a VERY enlightening tool for tracking down issues with SLI.
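
If you'd rather roll your own logger, the NVML Python bindings (pynvml, a.k.a. nvidia-ml-py) expose the same counters. A minimal sketch, assuming the bindings are installed:

Code:
# log per-GPU load, VRAM, core clock, power, and temp once a second
import time
import pynvml

pynvml.nvmlInit()
handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
           for i in range(pynvml.nvmlDeviceGetCount())]

while True:
    for i, h in enumerate(handles):
        util = pynvml.nvmlDeviceGetUtilizationRates(h)      # percent
        mem = pynvml.nvmlDeviceGetMemoryInfo(h)             # bytes
        mhz = pynvml.nvmlDeviceGetClockInfo(h, pynvml.NVML_CLOCK_GRAPHICS)
        watts = pynvml.nvmlDeviceGetPowerUsage(h) / 1000.0  # NVML reports mW
        temp = pynvml.nvmlDeviceGetTemperature(h, pynvml.NVML_TEMPERATURE_GPU)
        print("GPU%d: %3d%% load, %5d MB vram, %4d MHz, %5.1f W, %dC"
              % (i, util.gpu, mem.used // 2**20, mhz, watts, temp))
    time.sleep(1.0)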
 
How does Inspector render Afterburner useless? They perform completely different functions.
 
Just be aware that using compatibility bits from other games may or may not cause graphical anomalies with SLI. If you notice anything weird, just try a different bit. I think there is a database somewhere where users have compiled which bits work best with different games.

I haven't noticed any anomalies in Rust so far. Bushes only update their visuals when I'm almost next to them, but I'm not sure if that was the case even before I tried editing.

The database you mentioned must be this? http://forums.guru3d.com/showthread.php?t=392715
 
How does Inspector render Afterburner useless? They perform completely different functions.

You're able to overclock your GPU via Inspector, so if you use its own way of logging counters, or something separate to monitor the cards' clocks/temps, then you don't really need Afterburner or PrecisionX.

I still personally prefer to just alter SLI bits or flags within Inspector but do all of the overclocking and monitoring through Afterburner. I haven't encountered issues due to the Afterburner OSD, and if I did, it's simple enough to deactivate it for certain applications. Nothing on the same level as the Steam overlay or UPlay or something.
 
GTA V rocks in 980 Ti SLI
2560x1440
pulling like 140 FPS with everything maxed out

Try getting those frames on a single card
 
How does Inspector render Afterburner useless? They perform completely different functions.

Hmm, no sir, they do not. Nvidia Inspector does everything Afterburner does and a whole lot more. You need to take a closer look. The only thing I can think of that Inspector doesn't do is fan profiles, and fan profiles really are more efficient when used with a custom BIOS.
 
What game really needs dual 980 Tis?

At 4K, you'd be surprised. Using multiple monitors can also have you pushing well beyond 4k resolution. At 7680x1600 there were very few games where I felt like the performance attained through a single GPU was ever enough.
 
At 4K, you'd be surprised. Using multiple monitors can also have you pushing well beyond 4k resolution. At 7680x1600 there were very few games where I felt like the performance attained through a single GPU was ever enough.

Still trying to figure out why some games are poorly programmed,
e.g. ARK: Survival Evolved.

Good luck trying to run this game at 2560x1440 on a GTX 980 Ti;
it runs like crap.
 
What game really needs dual 980 Tis?

Out of all the games in my library, only really old ones and Civ 5 are able to run at decent framerates at 4K with one 980 Ti.

Most titles still struggle with two 980 Tis in SLI, unless you turn the quality down so far as to gimp the game.

Even a slightly older game like Metro 2033, for instance, struggles at 4K with two 980 Tis.

I'd argue that unless you play exclusively older or very light-requirement games, there is no single-GPU solution currently fast enough if you plan on going 4K.

I don't even think dual 980 Tis are fast enough. Hopefully dual big Pascals will be sufficient when they launch.
 
Still trying to figure out why some games are poorly programmed,
e.g. ARK: Survival Evolved.

Good luck trying to run this game at 2560x1440 on a GTX 980 Ti;
it runs like crap.

It's not always a matter of poor programming. I wish people would stop regurgitating that nonsense. Performance doesn't scale linearly with graphics quality across different engines, so you can't say "this game looks worse than Crysis 3, so it should run faster" or whatever.
 
It's not always a matter of poor programming. I wish people would stop regurgitating that nonsense. Performance doesn't scale linearly with graphics quality across different engines, so you can't say "this game looks worse than Crysis 3, so it should run faster" or whatever.

Yeah, there are many factors to consider.

Some third-party maps in Red Orchestra 2 are almost as hard on my video cards as Metro 2033.

Metro 2033 gets away with a lot because it is essentially a single-player tunnel shooter, and it saves a lot of rendering power by not having to render outside landscapes. (This fact is mostly hidden by the gas mask effects in the outdoor scenes.)

Red Orchestra 2, on the other hand, has vast outdoor 32v32 maps where players take pot shots at each other with scoped rifles over long distances. In addition to relatively easy-to-render square objects like buildings, there are trees, rivers, grass, bushes, etc., all of which have much higher polygon counts. As a result, the overall graphical fidelity of the game is lower. That doesn't mean the rendering engine is somehow less efficient; it simply means it is a different game with a different type of graphics processing need.

Now, I'm not familiar with Ark Survival, but just looking at screenshots, it appears to have many of the same high-polygon outdoor environment issues RO2 does, just with smaller maps.
 