The State of SLI/Xfire in 2016

Kloudzero

What's everyone's perception about the effectiveness of running dual cards? Do you feel the support both in the quantity of games and quality of implementation has increased, decreased or stayed about the same?

I'll refrain from voicing my opinion for now.
 
What's everyone's perception about the effectiveness of running dual cards? Do you feel the support both in the quantity of games and quality of implementation has increased, decreased or stayed about the same?

I'll refrain from voicing my opinion for now.

I don't think either camp has put much effort into multi-GPU solutions in the past 8 months. Maybe they are both waiting on DX12 games to launch, hoping the game devs will do all the work.
 
I don't think either camp has put much effort into multi-GPU solutions in the past 8 months. Maybe they are both waiting on DX12 games to launch, hoping the game devs will do all the work.

You know, now that you mention it, I haven't heard anything about DX12 as it relates to SLI/Xfire. Any word on whether it will make implementation easier? You often hear devs complain about the complexity and hassle of coding it.
 
I think the pure lack of desire for gaming companies to develop for this has led to it never really taking off for either camp, so while the capability will likely always remain, I think that going forward we will see single card solutions pushing the envelope.
 
My perspective is real limited. I bought 2 PC games in 2015. Fallout 4 and SWBF.

SWBF has awesome XFIRE support. Fallout 4's xfire support is complete shit.
 
It's basic economics: why invest in something that only 1-5% actually own/implement?

Only [H] members and a very minuscule minority of gamers invest in SLI/Crossfire.
 
Been SLIing for 3 years and a couple generations of cards. Definitely seems like support has been dwindling since I got into the game, especially over the last year or so. Haven't really had any major problems, but even still, I think next round I'll be eyeing a single GPU, as it certainly doesn't look like there is an effort to improve.
 
My perspective is real limited. I bought 2 PC games in 2015. Fallout 4 and SWBF.

SWBF has awesome XFIRE support. Fallout 4's xfire support is complete shit.

And that's the big thing that gets to me. I can understand issues at launch with new games, but how long has Fallout 4 been out, and it still hasn't been fixed? Is it really that difficult to make a profile for? It's not like it is an indie release.
 
CF/SLI was never officially part of the DX11 (and under) APIs; it was hacked onto the API by IHVs. So unless an IHV did it themselves, very few developers ever learned how to implement it. DX12 is the first Microsoft API (technically Mantle was the first API overall) which officially supports multi-GPU, so the information required to do mGPU should be more easily available to developers. The flip side is that IHVs probably still need to help guide developers at least part of the way, since this is so new. So far only the AoTS developers have mGPU working (some of whom also helped develop DX, I believe, so they know what they're doing).
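
For anyone curious what "officially supports multi-GPU" looks like on the programming side, here's a rough sketch of my own (not taken from any shipping game) of how a DX12 app can ask the API how many GPU "nodes" sit behind a linked adapter and whether they can share resources. Under DX11 there was simply nothing like this to call; the driver did the AFR magic behind your back.

Code:
// Minimal sketch: querying explicit multi-GPU support through the public D3D12 API.
// Link against d3d12.lib and dxgi.lib; error handling trimmed for brevity.
#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP/software adapter

        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            continue;

        // A "node" is one physical GPU behind this adapter. SLI/CrossFire-style
        // linked adapters report more than one node on a single device.
        UINT nodes = device->GetNodeCount();

        D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                    &options, sizeof(options));

        wprintf(L"%s: %u node(s), cross-node sharing tier %d\n",
                desc.Description, nodes,
                static_cast<int>(options.CrossNodeSharingTier));
    }
    return 0;
}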
 
I mean, seriously... graphics cards put out a lot of heat. If you have two or three high-end cards in a cramped space, you'll need an extreme cooling solution, an expensive PSU, and decent cable management. And you can kiss a dedicated sound card goodbye unless you have an X99 or similar -E chipset because SLI will eat up all your available PCI-E bandwidth otherwise. I don't like fumbling around with running cables behind a motherboard plate or setting up watercooling.
 
CF/SLI was never officially part of the DX11 (and under) APIs; it was hacked onto the API by IHVs. So unless an IHV did it themselves, very few developers ever learned how to implement it. DX12 is the first Microsoft API (technically Mantle was the first API overall) which officially supports multi-GPU, so the information required to do mGPU should be more easily available to developers. The flip side is that IHVs probably still need to help guide developers at least part of the way, since this is so new. So far only the AoTS developers have mGPU working (some of whom also helped develop DX, I believe, so they know what they're doing).

Ah, that's great info. Appreciate the insight. So we might potentially see some quantity/quality improvements as DX12 improves.
 
I mean, seriously... graphics cards put out a lot of heat. If you have two or three high-end cards in a cramped space, you'll need an extreme cooling solution, an expensive PSU, and decent cable management. And you can kiss a dedicated sound card goodbye unless you have an X99 or similar -E chipset because SLI will eat up all your available PCI-E bandwidth otherwise. I don't like fumbling around with running cables behind a motherboard plate or setting up watercooling.

I'm sorry you're lazy and/or a wuss.

Seriously though, not wanting it for yourself is one thing. Talking down about it in general? Stupid. Short-sighted.

If there's a single card solution that'll work at your resolution and the settings you're aiming for in the titles you play... Great. But there isn't, always.
 
Having been running SLI or Crossfire cards for the past 5-6 years, the only huge problem I've encountered has been Star Wars Battlefront having a flickering loading screen, but that game tends to suck on all levels, so I consider it a non-issue.
 
I used to run SLI 8800GTXs back in the day, and back then it was a no-brainer. I just turned on SLI and forgot about it, it worked with practically every game. I got a 760 now, and the original plan was to SLI it with another, but I got a new job and decided to invest in a 980Ti.

A year or two after I got the 8800s my wife built a new system, and she put twin ATI 6950s in there. Still using them to this day, and still having CF problems. Only works for maybe 3/5 of games, and the ones it doesn't work for it royally screws up. The Catalyst Control Center gives her fits as well; that thing was constantly crashing or causing issues. Some games would actually run better with the CCC not running!

So that's been my experience, that SLI just works and CF doesn't always.
 
I'm sorry you're lazy and/or a wuss.

Lazy? Okay, I'll own that one. A wuss? Nope. More like cheap and unwilling to put up with buggy games. I'd at least try SLI if I had the money to burn on two/three cards and an X99 motherboard to run them at full speed. I don't have money to throw away on bricked cards if something goes wrong. Yeah, I'm a wuss because I don't want to risk blowing my last $1000 on two 980Tis and then being stuck with Intel HD Graphics for a year because the cards died, and all for the pleasure of waiting on driver updates to make every new game work right.

Seriously though, not wanting it for yourself is one thing. Talking down about it in general? Stupid. Short-sighted.

If there's a single card solution that'll work at your resolution and the settings you're aiming for in the titles you play... Great. But there isn't, always.

If you read what I wrote, all I talked about were reasons why I didn't want it for myself and didn't think it was worth it. There was no reason for you to take it so personally. Do I not have a right to speak critically of a technology on a Hardware forum?

And I don't see why you have to shut down criticism of something just because you like it. SLI has always been buggier than running a single-GPU solution. Look at all the people who have been frustrated by their games not working correctly on it until there are driver updates. Most people I know who ran SLI in the past are relieved when a single GPU comes out that can outperform their SLI setup so they can ditch it.

If you're trying to run games in 4K on today's hardware, you make a fair point that you're likely stuck with SLI even though it's inferior to just having a good single card. But my point was that most people don't NEED to play their games at 4K with max settings in 2016. 2018 or 2020, they probably will.

NVidia markets this stuff and doesn't do a good job making it work. You shouldn't be angry with me for finding fault with the technology. It's only through people pointing out the limitations and flaws in a technology that it can be improved or replaced with something even better.
 
And that's the big thing that gets to me. I can understand issues at launch with new games, but how long has Fallout 4 been out, and it still hasn't been fixed? Is it really that difficult to make a profile for? It's not like it is an indie release.

SLI / Crossfire has to be more than an afterthought during the development of the game for AMD or Nvidia to make a profile for it. To be exact, if Bethesda had developed the game with SLI / Crossfire in mind from the get-go, there would be no need for a profile. That's how real mGPU is supposed to work. The Just Cause 3 developers stated that they have no intention of implementing mGPU in their game. Every so often you see someone on these boards complaining that it doesn't work and that AMD / Nvidia should make it work. Sorry, but it just doesn't work that way. It's something the developer has to have enough pride in their work to write proper code for.

The good thing is that many new game engines are getting mGPU support baked into the engine, so if the developer writes proper code, mGPU can be easily implemented. The new VR initiatives are all based on mGPU, like AMD LiquidVR, so all of those games are supposed to be mGPU compatible. AMD just released their Pro Duo mGPU card for VR developers to make it easier for them to have access to the proper equipment to test on. Hopefully this trickles down to non-VR games, since it would be asinine to write a proper VR-compatible engine that supports mGPU and then intentionally break the engine's mGPU support for non-VR users.
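
To make it more concrete what "writing proper code" means at the simplest level, here's a toy sketch of my own (not anything from Bethesda or an engine vendor) of the alternate-frame-rendering bookkeeping an engine has to carry once a second GPU is in play: each frame gets assigned to one node via a bitmask, and every node needs its own copy of per-frame objects like command allocators and fences.

Code:
// Toy sketch of alternate-frame-rendering (AFR) scheduling across GPU "nodes".
// No real graphics calls -- it just shows the per-node bookkeeping an engine
// has to carry once more than one GPU is in play.
#include <cstdint>
#include <cstdio>
#include <vector>

struct PerNodeResources {
    // In a real engine: command allocators, queues, render targets, fences...
    uint32_t nodeMask;   // single-bit mask identifying this GPU node
    uint64_t lastFrame;  // last frame this node was given
};

int main() {
    const uint32_t nodeCount = 2; // e.g. two cards in a linked adapter

    std::vector<PerNodeResources> nodes;
    for (uint32_t i = 0; i < nodeCount; ++i)
        nodes.push_back({ 1u << i, 0 });

    for (uint64_t frame = 0; frame < 8; ++frame) {
        // AFR: frame N is built and presented by node (N mod nodeCount).
        PerNodeResources& node = nodes[frame % nodeCount];

        // Before reusing this node's allocators, the engine must wait until
        // the frame it rendered nodeCount frames ago has actually completed.
        std::printf("frame %llu -> nodeMask 0x%X (previous frame on this node: %llu)\n",
                    (unsigned long long)frame, node.nodeMask,
                    (unsigned long long)node.lastFrame);

        node.lastFrame = frame; // pretend we submitted and signalled a fence here
    }
    return 0;
}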
 
I used to run SLI 8800GTXs back in the day, and back then it was a no-brainer. I just turned on SLI and forgot about it, it worked with practically every game. I got a 760 now, and the original plan was to SLI it with another, but I got a new job and decided to invest in a 980Ti.

A year or two after I got the 8800s my wife built a new system, and she put twin ATI 6950s in there. Still using them to this day, and still having CF problems. Only works for maybe 3/5 of games, and the ones it doesn't work for it royally screws up. The Catalyst Control Center gives her fits as well; that thing was constantly crashing or causing issues. Some games would actually run better with the CCC not running!

So that's been my experience, that SLI just works and CF doesn't always.
Sorry, can't let this go. If you have CCC crashing, you've got issues that aren't with CCC. In all my years I have never had CCC crash or cause any issues, and that's completely separate from CF (although never had issues with CF). There is another issue with her computer, and it sure isn't CCC.
 
I had SLI 9600gt and more recently SLI 660ti. I have upgraded to a 980ti and it just works.
For me, it seems, support has improved somewhat. But it still remains quite niche, requiring the use of NV Inspector to make sure it works. I think it is only worth it if you are SLI'ing two of the best GPUs available (like the 980Ti) or you play certain games that you know will work with it. It's really disappointing when a new, long-awaited release does not work well with SLI and you either have to play on low settings or at low FPS, or spend time trying to fix it. For some people time is worth a lot, and having a powerful single-GPU solution is a much better investment.
Tl;dr: I think SLI support improved somewhat in the last ~4-5 years, but it still remains far from a good solution.
 
Yea, a bit OT, but I get what you're saying. We've never been able to isolate a specific issue, as it was intermittent and seemed to be related to the version of CCC she was running at the time. Of course, I can't be 100% on that, but the latest version she installed hasn't been giving her those specific troubles.

(although never had issues with CF)

Never, not at all? You have never had a game that stuttered so bad in CF that you had to disable it for a massive boost in framerate or never experienced the shadow banding in Skyrim with CF enabled?


To address the main topic of the thread, and reading through all the posts, the conclusion I am tempted to come to is: people have a surprisingly wide variety of experiences using SLI or CF. That makes it less appealing. I do hope that mGPU and DX12 makes it more of a standard, for consistency across the board.
 
Well, mGPU is, once again, left for the developers to implement. Seeing how most of the market is at 1080p and single-GPU solutions are easily capable of running new games at the highest settings at high framerates, only the very high-end market would benefit from well-implemented multi-GPU solutions. And that is just too few consumers for someone to spend development time on.
The current state of console gaming does not help: this generation did not push graphics as far as the last one, so most developers will not bother to create something that will only be used by high-end PCs. I hope I'm wrong and we will get a new Crysis soon. Maybe the console refresh will help, but it seems they are just catching up to the power of mid-range PCs. So, essentially, I think SLI/CF/mGPU will stay at the same inconsistent level of support. I hope I am wrong.
 
The GPU vendors need to push for multi-GPU; the developers see NO use for it, and won't waste precious time developing and testing something a tiny fraction of the player base will utilise. The GPU vendors are the ones who benefit from SLI and CFX: one person = 2+ sales. So AMD and Nvidia need to push these modes; the devs won't do it unprompted.
 
One theory is that the console upgrades will come with multiple GPUs. We know AMD is making those semi-custom SoCs, and all actual data indicates that Polaris 10 is not a high-end GPU. So one way for them to offer 980Ti+ performance in the PS4 Neo and NeXbox One would be to pop two Polaris 10s in each and tell the console game devs to learn how explicit multi-GPU (or Vulkan, for PS4) works.

If they can convince the big engine developers (Unreal, Crytek, Unity, Frostbite) to support explicit multi-GPU on PC by leveraging console games, that would be a critical mass and would get everybody on board.
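
For what it's worth, here's a sketch of what that could look like on the Vulkan side, using the device-group mechanism (which only became core later, in Vulkan 1.1, so treat this as an illustration of the direction rather than something you can ship against today's drivers): the engine asks the loader which physical GPUs can be driven as one logical device, then splits work across them explicitly.

Code:
// Sketch: asking Vulkan which physical GPUs can be driven as one logical
// device group (the explicit multi-GPU path). Uses the device-group API that
// became core in Vulkan 1.1; link against the Vulkan loader (vulkan-1).
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app = { VK_STRUCTURE_TYPE_APPLICATION_INFO };
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ci = { VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
    ci.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS)
        return 1;

    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);

    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, { VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES });
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i) {
        // A group with physicalDeviceCount > 1 is a linked set of GPUs the
        // engine can treat as one logical device and split work across.
        std::printf("group %u: %u GPU(s), subsetAllocation=%u\n",
                    i, groups[i].physicalDeviceCount,
                    (unsigned)groups[i].subsetAllocation);
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}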
 
I've run SLI/XFire three times: 2x 7800GTs, 2x HD6950s, and 2x GTX 770s. Every single config had its quirks. By far, SLI was/is the most turnkey. Even though many of my games did not have SLI profiles at the time (e.g. Attila Total War), it was easy to set up and just worked.

On the other hand, I had so many problems with the Xfire HD6950s that I would not consider another Xfire build until I hear things have drastically improved (looking at you, DX12). I'll go into the details if someone asks, but basically my driver wouldn't trigger 3D core/memory speeds until I manually created and hotkeyed profiles. That didn't come to me overnight; it took several days of troubleshooting to finally figure out because I had never seen anything like it before. Outside of that, I got wonky texture anomalies in my favorite games (e.g. Total War), making Xfire unplayable even with the performance bump.
 