Any new resolution demands far more than the current GPU tech can offer.
If you jump into a new resolution right away, well... not much choice but to run SLI.
Right around the end of every high resolution's run, a single card finally becomes adequate, and then the cycle starts all over again.
8K (4320p) will be no different, except that GPU tech is WAY behind the curve on that one. It pushes four times the pixels of 4K (2160p), so it needs roughly 400% of the GPU power. Good luck with that.
At least 4-5 gens away from a single card solution with max settings/60fps.
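The 400% figure above follows straight from the pixel counts; here's a quick sanity check (plain arithmetic, no assumptions beyond the standard UHD resolutions):

```python
# Pixel counts for the standard UHD resolutions discussed above.
res_4k = 3840 * 2160   # "4K" UHD, 2160p
res_8k = 7680 * 4320   # "8K" UHD, 4320p

print(res_4k)           # 8294400 pixels
print(res_8k)           # 33177600 pixels
print(res_8k / res_4k)  # 4.0 -> four times the pixels, hence ~400% the GPU load
```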
This guy runs 4 Titan Xp (2017 model) cards at 8K res.
Have there been any games that have utilised DX12 split-frame rendering?
I'm thinking of trying a single Vega 64 to see if Freesync on my 3 Omens is worth the trade off.
Recommend copious research here, including this forum. I recall one user ditching their Vega(s?) due to lack of software support.
Wouldn't it make much more sense to buy G-Sync monitors and play games without tons of input lag from V-Sync and SLI and not worry about hitting 60fps or SLI compatibility and other issues like micro stuttering?

Been playing with a couple of 1080 Tis for a few weeks now.. not my first disco with SLI & Crossfire. Nothing much has changed over the years. In titles where it works right... it's bangin' fast.
In titles that don't support it or the implementation is weak.. not..so..much.
I pulled my second card last night to see how far down I need to cut the eye candy in titles like GTA5 to get 60fps solid and it's not too bad. I'm thinking of trying a single Vega 64 to see if Freesync on my 3 Omens is worth the trade off.
My2c.
Wouldn't it make much more sense to buy G-Sync monitors and play games without tons of input lag from V-Sync and SLI and not worry about hitting 60fps or SLI compatibility and other issues like micro stuttering?
It will only hurt you until you either change GPU to AMD or change monitors to G-Sync... or move to an alternate dimension where NV will add support for FreeSync sooner than never.

Would make more sense had I not just purchased the 2 additional Omens.. hindsight always 20/20 ;p
No they're not...
Wouldn't it make much more sense to buy G-Sync monitors and play games without tons of input lag from V-Sync and SLI and not worry about hitting 60fps or SLI compatibility and other issues like micro stuttering?
I too own 1080 ti SLI
In games that support it I activate SLI
For those that do not I disable
For ME the expense is worth it even though many games are now being released without SLI support. In fact most of the games I play run fully maxed out on a single 1080 Ti at 3440x1440 @ 100Hz refresh rate.
And I like having the option for more eye candy if the game supports it. If it doesn't I don't give it another thought
Tons of input lag? I am a top FPS player in many games and have NO input lag problems from G-Sync. Microstuttering doesn't happen anymore. Why do you guys who don't own or haven't owned SLI in years give old played-out talking points?
This is exactly my problem with Crossfire and modern SLI (read: newer than Voodoo 2) setups - they aren't fire and forget. Until they are, this technology is dead to me and most gamers. Most gamers aren't going to fudge around with it to see if it can be made to work when with a single card, they can just load in and play.
I've been waiting for some substantiation here, and it's been pretty quiet.
Personally, I would believe that there is some inherent lag with SLI. AFR more or less demands it from a technical perspective. But just like FreeSync provides a close experience to GSync, I wonder if that lag actually 'hurts more than helps'- and I'd wager that the answer is likely specific to the testing scenario (and not even just system configuration or game).
Eh, they're not fire and forget, but if you need the performance and you are willing to pay for it (so probably not you, but I've done it in the past), it's worth tinkering a smidge to get working.
Generally speaking the impetus to make the jump to SLI (or Crossfire) is to get more performance, either because a single card is inadequate or because the user simply wants the fastest solution available (most of [H]), for a specific subset of games, or even just one specific game. In my case it was the Battlefield series which has supported multi-GPU quite well over the last decade.
Of course, the move to lower-level graphics APIs like DX12 and Vulkan has thrown a bit of a wrench into multi-GPU, but that's largely been resolved in the most popular game engines.
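To put a rough number on the "inherent lag" point above: a back-of-the-envelope sketch (my own simplification, not a measurement) is that AFR adds roughly one extra queued frame per extra GPU, so input-to-display latency can grow even while the frame rate rises:

```python
# Toy AFR latency model: assumes ~one frame in flight per GPU.
# This is a simplification for illustration, not benchmark data.
def afr_latency_ms(fps: float, gpus: int) -> float:
    frame_time = 1000.0 / fps      # time to render one frame, in ms
    frames_in_flight = gpus        # AFR keeps each GPU working on its own frame
    return frames_in_flight * frame_time

print(afr_latency_ms(60, 1))   # single card at 60 fps: ~16.7 ms
print(afr_latency_ms(100, 2))  # 2-way AFR at 100 fps: 20 ms -> faster fps, but laggier
```

Under this toy model, 2-way AFR at 100 fps shows smoother motion than a single card at 60 fps yet delivers frames slightly later relative to input, which is consistent with the idea that the lag's impact depends on the testing scenario.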
How do they breathe?
Why would I need to own SLI for my comments about SLI adding input lag to be valid? It is just how the Alternate Frame Rendering tech works: it has to add delay. Split Frame Rendering doesn't add input lag but usually scales poorly, so most games' SLI profiles use AFR.

Tons of input lag? I am a top FPS player in many games and have NO input lag problems from G-Sync. Microstuttering doesn't happen anymore. Why do you guys who don't own or haven't owned SLI in years give old played-out talking points?
This is exactly my problem with Crossfire and modern SLI (read: newer than Voodoo 2) setups - they aren't fire and forget. Until they are, this technology is dead to me and most gamers. Most gamers aren't going to fudge around with it to see if it can be made to work when with a single card, they can just load in and play.
Most gamers are still running 1920x1080 displays like it's 2007. At that resolution, a multiGPU solution is a waste of money. Beyond that, I rarely have to do anything in the control panel; SLI works or it doesn't. In cases where it doesn't work very well it isn't as though I've been lacking in performance to a point where I've needed to disable SLI or anything. I haven't had to fuck with the settings for SLI since Mass Effect Andromeda first came out. Even that was somewhat of an exception to the rule. I can't remember the last time I had to mess with those settings prior to that.
Overall, I've been very happy with my multiGPU setups and I've occasionally disabled it to see what if any difference it makes. Very rarely have I found zero benefit to having it on. About the most I've had to do in order to make it work was to rename an executable to another name that matches a game with an existing SLI profile. I had to do this with Andromeda before an optimized driver came out for it. My Titan X (Maxwell) cards were absolutely excellent and might have been the best multi-GPU configuration I've had since 8800GTX's in SLI and later 3-Way SLI.
I've not run either a Crossfire or SLI setup but my understanding was that, when it worked, Crossfire has/had better scaling than SLI. Is that not the case?