It was actually very well supported for a good period of time.
And that's one reason I said the post made no sense. On top of that, what SLI did for us isn't the same as what G-Sync does.
It was actually very well supported for a good period of time.
True, I was just trying to decipher his post.
Naaa, there are cases where some folks play only a few particular games, even for years; if SLI works well with those games, it may benefit them greatly. Everyone is different, and some are way different. It is up to the person to research, ask questions, and figure out their best solution. The answer to SLI could be yes, maybe, or no, in my view.
The only time you should choose an SLI setup is if you have literally no other choice.
Instead of Nvidia selling you two cards to make more money, Nvidia just charges double the price to make even more money from you.
SLI was useful for many years, particularly in the G80 days, when Nvidia invested more resources into getting games working better with it.
My guess is that it will make a comeback moving forward, because if AMD/Nvidia go ahead with chiplet designs for GPUs, they will need SLI-like technology to run properly.
For the most part, all older games that supported SLI are still supported; in those cases, performance with 2x 1070 Ti will indeed increase. If one has a good backlog of older games that one will play, SLI might be a rather smart option. If all you play are new games, then SLI is probably a poor choice, unless you play the very few newer titles that work well with it.
For games that do not work well, or just plain don't work, with SLI: just turn off SLI and the game will work just as well as it would with a single card. Why do people bring up degraded performance with SLI as if one has to suffer through it in a particular game? TURN IT OFF.
Buying the single fastest card does not mean the fastest performance in all cases. Case in point: my 2x 1080 Ti will rip any single 2080 Ti in Shadow of the Tomb Raider, with the over-95% scaling that game engine delivers. Overall, I would say the 2080 Ti would be the better, more consistent performer across a broad selection of games. And of course, if one wants the feel of lower performance with a 2080 Ti, forcing you to use lower resolutions for good frame rates, use RT in the few titles that support it.
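To put rough numbers on that scaling claim, here's a back-of-the-envelope sketch. The 60 FPS baseline is just an assumed example figure, not a measurement:

```cpp
#include <cstdio>

// Effective FPS for N GPUs given per-extra-GPU scaling efficiency.
// scaling = 0.95 means each added card contributes 95% of a full card.
double multiGpuFps(double singleCardFps, int gpuCount, double scaling) {
    return singleCardFps * (1.0 + (gpuCount - 1) * scaling);
}

int main() {
    // Assumed baseline: one card at 60 FPS in a well-scaling title.
    printf("2x @ 95%% scaling: %.0f FPS\n", multiGpuFps(60.0, 2, 0.95)); // ~117
    printf("2x @ 50%% scaling: %.0f FPS\n", multiGpuFps(60.0, 2, 0.50)); // ~90
    return 0;
}
```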
Naaa, there are cases where some folks play only a few particular games, even for years; if SLI works well with those games, it may benefit them greatly. Everyone is different, and some are way different. It is up to the person to research, ask questions, and figure out their best solution. The answer to SLI could be yes, maybe, or no, in my view.
So really, and thanks for all the erudite responses, it is hit or miss; but still, in a lot of games two 1070 Tis can bring a decent performance boost.
Let's say I got an extra 1070 Ti for free. Is it worth it then?
When I used to run a 5970-class GPU years ago, the drivers were so borked that I would literally have to install a different set of drivers depending on what game I was playing; it was that much of a difference. Some games were choppy and unplayable with certain drivers, while others would get literally 3x the FPS with newer drivers.
I think your interpretation of what everyone said is quite optimistic. If you get one for free, you're still better off selling both for a single better card. Unless you simply want the thrill of seeing two GPUs in your case with only one doing the work most of the time...
I can clearly see that the industry isn't going to keep supporting SLI, so I am not going to go that route.
Here's the fun part: the engine developers are supporting mGPU.
I'm thinking that while there's very little return for it today, keeping the basic support updated -- even if games shipping with said engines don't use it -- is worth it with higher resolutions and refresh rates, VR, and ray tracing all on the horizon.
4k per eye with < 12ms frametimes? With ray tracing? That's well more than two of whatever is coming out next year.
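For what it's worth, the explicit graphics APIs make at least the discovery side of mGPU straightforward for engine devs. A minimal Vulkan sketch of enumerating every GPU in the system (just the enumeration; none of the actual cross-GPU rendering work, which is where the real effort goes):

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Bare-bones instance; a real engine would fill in app info and layers.
    VkInstanceCreateInfo info{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) return 1;

    // Explicit APIs expose every adapter; the engine decides how to use them.
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        printf("GPU: %s\n", props.deviceName);
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```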
Not only engine devs: Nvidia has continued to work on mGPU and new tech for it as well. NV is still working with AFR but has put more effort into Checkerboard Frame Rendering (CFR) for NVLink-capable GPUs. CFR is still scaling at about 50% in Crysis, compared to 90% for AFR, but there is no micro-stutter with CFR. CFR is available in the drivers but requires activation in the inspector.
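To illustrate what checkerboard rendering means, here's a toy sketch of the tile-assignment idea; this is only an illustration of the concept, not how Nvidia's driver actually implements CFR:

```cpp
#include <cstdio>

// Toy checkerboard assignment: tile (tx, ty) goes to GPU 0 or GPU 1.
// Both GPUs render parts of the *same* frame, unlike AFR where whole
// frames alternate -- which is why CFR avoids AFR's micro-stutter.
int gpuForTile(int tx, int ty) {
    return (tx + ty) % 2;
}

int main() {
    const int tilesX = 8, tilesY = 4; // assumed tile grid, for illustration
    for (int ty = 0; ty < tilesY; ++ty) {
        for (int tx = 0; tx < tilesX; ++tx)
            printf("%d", gpuForTile(tx, ty));
        printf("\n");
    }
    return 0;
}
```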
I was reading that RDR2's SLI support seems to work really well, and the scaling is impressive, for all the other issues with the game.
Here is 3DCenter's info on CFR; their forums have a ton more. It is in German, just FYI.
http://www.3dcenter.org/news/nvidia-wiederbelebt-multigpu-rendering-mittels-neuem-cfr-modus
With regard to VR, I would gladly pick up another GPU if mGPU were implemented in VR games. Unfortunately, the developers of the VR games I play, mainly DCS/P3D, have said they have little interest in mGPU. It is a shame.
With regard to VR, I would gladly pick up another GPU if mGPU were implemented in VR games.
Does CFR "just work"? I am fine with a true 50% increase in FPS if there's zero additional micro-stutter and it just works all the time. From your last line, it sounds like it still needs developer support.
I personally won't dabble in mGPU if it's not as easy as a single GPU. I am OK with enabling it in the inspector, but it has to "just work" after that.
My last line was mainly regarding VR, since most VR titles use custom engines and the dev teams won't spend time on mGPU at this point.
I thought one of the major issues with multi-card and VR is latency.
VR is so sensitive to uneven frame rates and lag that it's difficult to solve both.
Perhaps the new chiplet designs that share memory between cores can help.
SLI was useful for many years, particularly in the G80 days, when Nvidia invested more resources into getting games working better with it. My guess is that it will make a comeback moving forward, because if AMD/Nvidia go ahead with chiplet designs for GPUs, they will need SLI-like technology to run properly.
Isn't the current rumor that Nvidia is looking at a chiplet design for their next GPU architecture in a couple of years?
That's actually the thing they'd be trying to solve: use one GPU per eye. At most there'd need to be a simple bit of logic to make sure that the GPUs were synced, but otherwise not a big deal.
I heard they were doing this but haven't seen any results.
I wonder what the scaling for that would actually look like. It seems like such an obvious solution, but I have a feeling it's not as great as one might expect. There is a lot of work in rendering each frame of a game that is shared between both eyes in VR, such as shadow maps, and if you split the rendering between two GPUs, you'd either have to have a way of sharing the shadow maps between the GPUs or render them independently on each one, which would undermine the benefit of using two or more.
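Here's that trade-off in sketch form; the names and millisecond figures are made up purely for illustration. The per-eye work splits cleanly across two GPUs, but the shared prep work either gets duplicated on both or has to be copied across the link:

```cpp
#include <cstdio>

// Hypothetical per-frame costs in milliseconds, for illustration only.
const double sharedWork = 4.0; // shadow maps etc., identical for both eyes
const double perEyeWork = 6.0; // view-dependent rendering, per eye

int main() {
    // One GPU: render shared work once, then both eyes sequentially.
    double oneGpu = sharedWork + 2 * perEyeWork;

    // Two GPUs, one per eye, each duplicating the shared work.
    double twoGpus = sharedWork + perEyeWork;

    printf("1 GPU:  %.1f ms/frame\n", oneGpu);   // 16.0
    printf("2 GPUs: %.1f ms/frame\n", twoGpus);  // 10.0
    // Speedup is 1.6x, not 2x: the duplicated shared work caps the scaling,
    // and copying shadow maps between GPUs instead adds its own latency.
    return 0;
}
```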
Unless...
they create a non-standard link that connects two cards down one cable.
We would be at their mercy for extensions.
I don't see why binding two cables together wouldn't work, honestly. Obviously the cable would be a little bulkier and the headset would need to be able to accept the extra signal lines, but that's the easy stuff.
It's the turning round, twisting the cables round each other.
That adds more movement resistance and makes it more likely to damage a cable.
Aye, no doubt -- they'd want to do a custom cable, not glue two together, to prevent that as much as possible.