Any point to SLI anymore?

I've run two cards in SLI since the 8800 GT. Yeah, the SLI scaling issue with unsupported games sucks, and having to force it in Inspector has its inherent issues... BUT... #1, I am a graphics whore... #2, I can afford it... #3, when SLI works, it makes a big difference.

I am currently playing Kingdom Come (my new favorite game of all time). SLI isn't supported, so I used the SLI profile fix found online. BOOM: FPS went from the 20s-30s up to the 40s-50s at 4K/ultra. That's why I stay with SLI: I at least have the opportunity/option to increase FPS when it's possible. With a single card, you're stuck with what you get...
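For the curious, a quick back-of-the-envelope way to put a number on that kind of uplift (the 25 and 45 fps inputs are just my rough midpoints of the ranges above, not benchmarks):

```python
# Rough multi-GPU scaling math; the FPS values are illustrative
# midpoints of the ranges quoted above, not measurements.

def sli_scaling(fps_single: float, fps_sli: float, gpus: int = 2):
    """Return (speedup, efficiency per additional GPU)."""
    speedup = fps_sli / fps_single
    # Efficiency: how much of each *extra* GPU turned into frames.
    efficiency = (speedup - 1) / (gpus - 1)
    return speedup, efficiency

speedup, eff = sli_scaling(fps_single=25.0, fps_sli=45.0)
print(f"speedup: {speedup:.2f}x, second-GPU efficiency: {eff:.0%}")
# -> speedup: 1.80x, second-GPU efficiency: 80%
```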
 
I would say if you're looking to play at enthusiast-level detail settings (4K+) and want the highest possible FPS, SLI and Crossfire should certainly be considered. Single-GPU solutions just won't cut it at the high end of performance and detail quality.
 
^^ Yup. I was an early adopter of 2160p. I was an early adopter of 1600p. Etc.
Single cards don't cut it at those cutting-edge resolutions.
SLI works great for the vast majority of games I've played/play. Good scaling.
That said, the 1180 Ti *should* be the first real single-card 4K solution. The 1080 Ti is not. I'll be moving to a single card on release day for the 1180 Ti. Hopefully it'll be at least a 20%-25% jump over the 1080 Ti.
 
I've run SLI'd systems starting with the NV 6600s. Often I'd buy one card and add a second later, when I felt I needed extra graphics power or when a sale made it hard to resist. Other times I'd go SLI from the start if I found a good deal (like when BFG went out of business and MC had a mad sale).

As others have said, when it works it's awesome and I'd rather have the ability to scale up or turn it off if there is a problem.

However, the 1080 Ti I have now handles most of my needs. I SLI'd a pair early on but returned one card, as a single card is fine for what I do now; it was close enough, performance-wise, to the 980 Ti SLI config it replaced.

Anxious to see what the next Ti variant will do.
 
[image]


The good ole days are long gone.
 
I run my two Titans pretty well; the games that support SLI run beautifully. The games that don't... well... we don't talk about them...

At 4K you really only have one single-card choice.
 
Any new resolution demands far more than the GPU tech of its day can offer.
If you jump into a new resolution right away, well... not much choice but to run SLI.
Right about the end of every high resolution's run, a single card becomes adequate, and then it starts all over again.
8K (4320p) will be no different, except that GPU tech is WAY behind the curve on that one: it requires 400% of the GPU power that 4K/2160p does. Good luck with that.
We're at least 4-5 generations away from a single-card solution at max settings/60 fps.
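That 400% figure checks out on raw pixel count alone. A quick sketch, assuming the usual 16:9 resolutions and that GPU cost scales roughly linearly with pixels:

```python
# Pixel counts for common 16:9 resolutions, and each one's cost
# relative to 4K, assuming work scales ~linearly with pixel count
# (this ignores geometry, CPU load, memory bandwidth, etc.).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, px in pixels.items():
    print(f"{name:>6}: {px / 1e6:5.2f} MP ({px / pixels['4K']:.2f}x 4K)")
# 8K works out to exactly 4.00x the pixels of 4K -- the "400%" above.
```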
 
Pretty much this. I've been running higher resolutions since the first 30" displays hit at 2560x1600 and then I moved up to three of those for NV Surround / AMD Eyefinity. A single graphics card has never provided enough performance for that at any point since SLI came out.
 
This guy runs four Titan Xp cards (2017 model) at 8K res.



That is nothing but epeen. The real shit is all the headaches involved, which no one running that much epeen will admit to. I know, 'cause I ran quads for years until it was more trouble than it's worth.
 
Been playing with a couple of 1080 Tis for a few weeks now... not my first disco with SLI and Crossfire. Nothing much has changed over the years. In titles where it works right... it's bangin' fast.
In titles that don't support it, or where the implementation is weak... not... so... much.

I pulled my second card last night to see how far down I need to cut the eye candy in titles like GTA5 to get a solid 60 fps, and it's not too bad. I'm thinking of trying a single Vega 64 to see if FreeSync on my three Omens is worth the trade-off.

My2c.
 
Recommend copious research here, including this forum. I recall one user ditching their Vega(s?) due to lack of software support.

Thanks for the heads-up... I've read quite a bit and have had more than a few PMs with Archaea, who I think you're referring to. I'm holding onto at least one, if not both, of the 1080 Tis for the time being. Just switching between one and two cards currently to see the real-world impact on gaming. I just have no personal basis for an opinion on whether FreeSync is worth it or not.

thx
-scoot
 
Been playing with a couple of 1080 Tis for a few weeks now... not my first disco with SLI and Crossfire. Nothing much has changed over the years. In titles where it works right... it's bangin' fast.
In titles that don't support it, or where the implementation is weak... not... so... much.

I pulled my second card last night to see how far down I need to cut the eye candy in titles like GTA5 to get a solid 60 fps, and it's not too bad. I'm thinking of trying a single Vega 64 to see if FreeSync on my three Omens is worth the trade-off.

My2c.
Wouldn't it make much more sense to buy G-Sync monitors and play games without tons of input lag from V-Sync and SLI, and not worry about hitting 60 fps, SLI compatibility, or other issues like microstuttering? :wideyed:
 
Would make more sense had I not just purchased the 2 additional Omens... hindsight's always 20/20 ;p
 
It will only hurt until you either change GPUs to AMD or change monitors to G-Sync... or move to an alternate dimension where NV adds FreeSync support sooner than never :dead:
 
Wouldn't it make much more sense to buy G-Sync monitors and play games without tons of input lag from V-Sync and SLI, and not worry about hitting 60 fps, SLI compatibility, or other issues like microstuttering? :wideyed:

Tons of input lag? I am a top FPS player in many games and have NO input lag problems with G-Sync. Microstuttering doesn't happen anymore. Why do you guys who don't own SLI, or haven't owned it in years, keep repeating old, played-out talking points?
 
I too own 1080 Ti SLI.

In games that support it, I activate SLI.

For those that don't, I disable it.

For ME the expense is worth it, even though many games are now being released without SLI support. In fact, most of the games I play run fully maxed out on a single 1080 Ti at 3440x1440 @ 100 Hz.

And I like having the option for more eye candy if the game supports it. If it doesn't, I don't give it another thought.

 
This is exactly my problem with Crossfire and modern SLI (read: newer than Voodoo 2) setups: they aren't fire-and-forget. Until they are, this technology is dead to me and to most gamers. Most gamers aren't going to fudge around with it to see if it can be made to work when, with a single card, they can just load in and play.
 
Tons of input lag? I am a top FPS player in many games and have NO input lag problems with G-Sync. Microstuttering doesn't happen anymore. Why do you guys who don't own SLI, or haven't owned it in years, keep repeating old, played-out talking points?

I've been waiting for some substantiation here, and it's been pretty quiet ;).

Personally, I would believe that there is some inherent lag with SLI; AFR more or less demands it from a technical perspective. But just as FreeSync provides an experience close to G-Sync, I wonder whether that lag actually 'hurts more than helps', and I'd wager that the answer is specific to the testing scenario (and not just the system configuration or game).

This is exactly my problem with Crossfire and modern SLI (read: newer than Voodoo 2) setups: they aren't fire-and-forget. Until they are, this technology is dead to me and to most gamers. Most gamers aren't going to fudge around with it to see if it can be made to work when, with a single card, they can just load in and play.

Eh, they're not fire-and-forget, but if you need the performance and you're willing to pay for it (so probably not you, but I've done it in the past), it's worth tinkering a smidge to get it working.

Generally speaking, the impetus to make the jump to SLI (or Crossfire) is more performance, either because a single card is inadequate or because the user simply wants the fastest solution available (most of [H]), whether for a specific subset of games or even just one game. In my case it was the Battlefield series, which has supported multi-GPU quite well over the last decade.

Of course, the move to lower-level graphics APIs like DX12 and Vulkan has thrown a bit of a wrench into multi-GPU, but that's largely been resolved in the most popular game engines.
 
Idiotincharge,

Essentially, yes, some specialized testing would show there is input lag, but that is quantitative, and humans don't measure experiences quantitatively. I am talking about the qualitative: we experience the feeling, the motion, and the sensations of the experience. That is the human factor. And I am telling you guys I can detect none of this input lag, which technically exists but never intrudes on my sense of play. G-Sync is really that good. And SLI, when properly implemented, is amazing.
 
Tons of input lag? I am a top FPS player in many games and have NO input lag problems with G-Sync. Microstuttering doesn't happen anymore. Why do you guys who don't own SLI, or haven't owned it in years, keep repeating old, played-out talking points?
Why would I need to have SLI for my comments about SLI adding input lag to be valid? That is just how Alternate Frame Rendering works: it has to add delay. Split Frame Rendering doesn't add input lag, but it usually scales poorly, so most games' SLI profiles use AFR.

How much delay it adds may vary from game to game, and it is possible that some games which already have huge input lag (e.g., some add at least two frames of delay) don't suffer further. Most likely, however, it always adds at least one frame of lag. You might not feel it, but 'I don't feel it' is an invalid benchmark, just like the countless people who claim they don't feel input lag from laggy monitors (and especially TVs) or from V-Sync. A friend had a TV with 100 ms of input lag and claimed he didn't notice it... completely coincidentally, he stopped playing online shooters on his console, and when I saw him playing one I could see he couldn't hit anything, and I remember he used to be pretty skilled.

This is how I see people defending SLI as having no input lag: they are just defending their buying decisions. The amount of input lag that becomes a problem differs from person to person, but:
1. You should be aware that it exists and, to be more conscious about the whole thing, preferably do actual comparisons. More often than not, people who do open-minded testing and notice such issues stay away from anything that causes them, and are glad they know about them.
2. Do not tell potential adopters of this tech that there is no such issue, because that is just not fair, especially since an SLI setup is quite expensive.

BTW, I am not saying G-Sync adds delay; it does not. Tests are done in such a way that they measure the time from pressing a button to the shot being registered on a high-speed camera. With V-Sync OFF, a new frame can be presented mid-refresh, so the shot (which appears in the lower part of the screen!) may on average register slightly faster, but that is hardly 'removing input lag': it is a visual artifact that makes the image ugly. In a fair comparison, which is G-Sync plus a frame-rate limiter, G-Sync has lower input lag than V-Sync ON, and the same amount of lag as Fast Sync plus a frame-rate limiter with a properly fine-tuned monitor refresh rate that avoids stuttering with Fast Sync.
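To put rough numbers on the 'at least one frame' point: a minimal sketch of what one extra queued frame costs at a given frame rate (a simplification, since real pipelines already buffer frames and AFR implementations vary):

```python
# Latency cost of keeping one more frame in flight (as AFR does),
# as a function of frame rate. A simplification: real pipelines
# already buffer frames, and actual AFR behavior varies by game.

def added_lag_ms(fps: float, extra_frames: int = 1) -> float:
    """Milliseconds of lag added by extra_frames more queued frames."""
    return extra_frames * 1000.0 / fps

for fps in (60, 90, 144):
    print(f"{fps:3d} fps: +{added_lag_ms(fps):.1f} ms per extra queued frame")
# 60 fps: +16.7 ms | 90 fps: +11.1 ms | 144 fps: +6.9 ms
```

Whether ~7-17 ms of extra lag is something a given player can actually feel is exactly the qualitative-vs-quantitative argument happening above.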
 
This is exactly my problem with Crossfire and modern SLI (read: newer than Voodoo 2) setups: they aren't fire-and-forget. Until they are, this technology is dead to me and to most gamers. Most gamers aren't going to fudge around with it to see if it can be made to work when, with a single card, they can just load in and play.

Most gamers are still running 1920x1080 displays like it's 2007. At that resolution, a multi-GPU solution is a waste of money. Beyond that, I rarely have to do anything in the control panel: SLI either works or it doesn't, and in the cases where it doesn't work very well, I haven't been lacking performance to the point of needing to disable it. I haven't had to fuck with the settings for SLI since Mass Effect: Andromeda first came out, and even that was somewhat of an exception to the rule. I can't remember the last time I had to mess with those settings before that.
 
Agreed. My SLI setups:

2x Voodoo2 12MB
2x6600GT
2x7800GT
2x780

...never once did I have to screw with the settings. Enabled SLI in NVCP, changed some Global Settings, fired up games. That's it. Didn't even bother going into the profile settings for individual games.
 
I love how people get angry at those of us who have SLI. They make up all these reasons not to own SLI, but God forbid some of us do own it and like it.

I have had SLI since the Voodoo 2s, when it was the original Scan-Line Interleave, not the Scalable Link Interface that NVIDIA stole and renamed. I have had Crossfire with the golden fingers and then without, and if you ask me, I have always felt Crossfire was better than SLI, but I no longer own a Texas GPU, so I have to run a Cali team-green setup for now. I hope SLI continues. For instance, I run 3440x1440 UW, and Far Cry 5, a dumb game in concept but great to look at, hits about 50 to 60 fps on ultra. In SLI I hit an average of 90, and it feels way better.


There is a reason the Far Cry 5 dev team made the game SLI-optimized: they knew it was going to punish GPUs at good-looking resolutions. You may be happy with all your sliders to the left, but many of us like them all the way to the right, and we like higher resolutions.
Thanks, SLI!
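Looked at as frame times, those numbers explain the 'feels way better' part. A tiny sketch, assuming ~55 fps as the midpoint of the quoted 50-60:

```python
# Average frame time at the single-GPU vs. SLI frame rates quoted
# above (~55 fps is an assumed midpoint of "50 to 60"). Illustrative.
for label, fps in (("single GPU", 55), ("SLI", 90)):
    print(f"{label:>10}: {1000 / fps:5.1f} ms per frame")
# single GPU: ~18.2 ms, SLI: ~11.1 ms -- about 7 ms less per frame.
```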
 
I've owned multi-GPU systems as long as you have, and I've usually found SLI to work better than Crossfire; very rarely has that not been the case. In fact, I learned that 7680x1600 was beyond what a 7970 Crossfire setup could do with Eyefinity enabled: the cards were powerful enough back then, but it required more bandwidth than the Crossfire bridge provided. The very next generation, Crossfire was redesigned. I never had that issue with my GTX 680 3-way SLI setup, which replaced my 7970s. I never could get my 4870 X2s to work in Crossfire; shit, the 4870 X2s didn't work worth a damn on their own, and internal Crossfire only worked about half the time.

All that said, there are a few Crossfire setups that worked as well as any SLI setup ever has. The X1950 XTXs in Crossfire were simply spectacular, and dual-GPU cards like the 5970, which ran Crossfire internally, were amazing. However, the 5970 was shit in pairs.

Overall, I've been very happy with my multi-GPU setups, and I've occasionally disabled them to see what difference, if any, it makes. Very rarely have I found zero benefit to having it on. About the most I've had to do to make it work was rename an executable to match a game with an existing SLI profile; I had to do this with Andromeda before an optimized driver came out. My Titan X (Maxwell) cards were absolutely excellent, and that might have been the best multi-GPU configuration I've had since 8800 GTXs in SLI and, later, 3-way SLI.
 
I've not run either a Crossfire or SLI setup but my understanding was that, when it worked, Crossfire has/had better scaling than SLI. Is that not the case?
 
It varies by generation due to drivers and GPU architecture. And even when the scaling was better, the end performance wasn't always: sometimes better scaling couldn't close the gap between AMD's best card and NVIDIA's best. I've also run into more games lacking Crossfire support than lacking SLI support, and AMD used to take months to deliver drivers with support for a given game.
 
It's 2:30 and I still can't sleep... my experience with SLI goes back to the 9800 GX2, dual GTX 280s, and a little 5870 Crossfire. Every single time, I've had significant issues messing with settings to get the second GPU working in popular and/or graphics-heavy games. Even when multi-GPU is working well, there are many games that don't support it at all. Not to mention multi-display multi-GPU bugs, microstuttering, and half-second freezes from time to time. While SLI worked well most of the time back in the 780 Ti days, the onus for functioning multi-GPU setups is now on developers rather than on NVIDIA. Even if I had plenty of scrilla to burn, you wouldn't see me buying a multi-GPU setup, because of the time it wastes to set up. Juggling drivers, games, and configs isn't worth it to me. I'm not 20 years old anymore, and I don't enjoy dicking around with my computer anymore. I'd rather make money, enjoy the comfort of an attractive woman, or do something productive. If you're someone with very few social opportunities who enjoys pissing away money (from broke student to rich moron) for lack of opportunities elsewhere, then go ahead. Come join the neckbeard parade online and compare my overclocks to yours!
 