G-sync and SLI. Regretting getting 2nd card...

Discussion in 'Video Cards' started by Johnny78, Oct 10, 2018 at 1:58 PM.

  1. Johnny78

    Johnny78 [H]ard|Gawd

    Messages:
    1,270
    Joined:
    Jan 13, 2007
    So I've noticed that with this last generation of cards (Pascal), there have been lots of cries about the death of SLI and how it's not worth the money/trouble. I've never been one of those people. I've always had a good experience with SLI: not perfect, and not everything was supported, but for the most part I've enjoyed it and seen a nice performance bump in most games I play. That said, my last experience with SLI was with two 980 Tis, which overall worked very well.

    Fast forward to today. I've been using a Titan Xp for the last 9 months or so, and this last week I grabbed a 2nd Titan Xp when I came across a good deal. After getting everything set up, I ran the usual set of benchmarks (Valley, Heaven and Firestrike) and everything looked good score-wise (Firestrike Extreme was around 20.5K). I loaded up a couple of games, however, and noticed that my FPS were really not much different from a single Xp. GPU utilization was only around 70-75% on GPU 1 and maybe 80-85% on GPU 2. After much googling, I came across complaints that G-sync and SLI don't work well together on Pascal cards: with G-sync enabled, GPU load drops by a noticeable amount. Sure enough, I disabled G-sync, fired up Witcher 3, and lo and behold, both my GPUs were at 98-99% load and my frame rate rose by about 30 FPS.
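    (For anyone who wants to check the same thing on their own rig: rather than eyeballing an overlay, you can log per-GPU load with a few lines of Python. A minimal sketch using NVIDIA's NVML bindings, assuming `pip install nvidia-ml-py`; the one-second poll interval is just an example.)

    ```python
    # Minimal per-GPU utilization logger via NVIDIA's NVML Python bindings.
    # Run it while the game is going, once with G-sync on and once with it
    # off, and compare the printed loads.
    import time
    import pynvml

    pynvml.nvmlInit()
    handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
               for i in range(pynvml.nvmlDeviceGetCount())]
    try:
        while True:
            # Percent of time each GPU was busy over the last sample window.
            loads = [pynvml.nvmlDeviceGetUtilizationRates(h).gpu for h in handles]
            print(" | ".join(f"GPU{i}: {load:3d}%" for i, load in enumerate(loads)))
            time.sleep(1.0)
    except KeyboardInterrupt:
        pass
    finally:
        pynvml.nvmlShutdown()
    ```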

    I'm using an Acer Predator X34 by the way.

    Apparently this has been a known issue for the last couple of years, and Nvidia has acknowledged it but has not found a fix. So now I'm a little disappointed that I either have to play with G-sync off, or disable one of the cards (which obviously defeats the point of SLI). Lame that Nvidia can't get their own technologies to work well with each other.

    Anyone else run into this? What would you recommend? Sell the 2nd Titan Xp and just go back to one card? Get a new monitor that doesn't have G-sync (what else is out there that's around the same size)?

    Anyway, I think I'm now going to have to apply for membership into the "SLI sucks" camp. :sick:


    Pic of rig.

    [image: u69Xtea.jpg]
     
    Last edited: Oct 11, 2018 at 5:18 PM
    Aireoth, Master_shake_ and Armenius like this.
  2. NoOther

    NoOther [H]ardness Supreme

    Messages:
    6,844
    Joined:
    May 14, 2008
    I don't have the issue with G-sync (I don't use it; too much money for too little benefit), but I have two Titan Xps and have noticed minimal gains from SLI. It largely depends on the games you play, though. I have had some games actually work better with SLI disabled.
     
    IdiotInCharge likes this.
  3. Johnny78

    Johnny78 [H]ard|Gawd

    Messages:
    1,270
    Joined:
    Jan 13, 2007
    Do you not use G-sync because you disable it, or because you don't have a G-sync-enabled monitor? I'm specifically talking about using a G-sync monitor and SLI with Pascal cards.
     
  4. NoOther

    NoOther [H]ardness Supreme

    Messages:
    6,844
    Joined:
    May 14, 2008
    I don't buy G-sync monitors because of the cost; the benefit-to-cost ratio is not in my favor. I know what you were talking about. I was adding my own experiences with SLI and Pascal cards.
     
  5. Randall Stephens

    Randall Stephens Limp Gawd

    Messages:
    179
    Joined:
    Mar 3, 2017
    Are you sure you're not seeing benefits because you're being blinded by all the lights?????????

    I'm only joking and being an :asshat:
     
    cybereality, Flexion, Doc Doc and 5 others like this.
  6. Johnny78

    Johnny78 [H]ard|Gawd

    Messages:
    1,270
    Joined:
    Jan 13, 2007
    I change the lighting schemes like I change my underwear. It's usually not so seizure inducing. Lol.
     
    Doc Doc and Armenius like this.
  7. Supercharged_Z06

    Supercharged_Z06 2[H]4U

    Messages:
    2,155
    Joined:
    Nov 13, 2006
    Some folks like rave parties on their desks, some don’t. :-p
     
    Flexion and Armenius like this.
  8. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    15,407
    Joined:
    Jan 28, 2014
    One of the reasons I frequent NVIDIA's forum is to stay informed of issues like this. People have been complaining to NVIDIA about G-Sync + SLI issues for quite a while now, and they still have yet to fix it. I recall a post from an NVIDIA representative explaining that they cannot reproduce the problem in their test environment, so they neither acknowledge the issue in the driver notes nor attempt to fix it.
     
  9. ccityinstaller

    ccityinstaller 2[H]4U

    Messages:
    3,527
    Joined:
    Feb 23, 2007
    It is entirely possible that the issue lies in the hardware of the G-sync module and a firmware update is unable to fix it, meaning Nvidia would have to issue a recall and incur the cost and bad PR for .0001% of their customer base. They clearly decided that it is not in their best interest to fix it. If it can be replicated across multiple builds with different cards and different G-sync monitors, I would head to a class-action attorney and be first in line, so that you actually get more than $1.50 when Nvidia settles the suit.

    You could also sue Nvidia in small claims court for the cost of the hardware/LCD, since they do not work properly together. It's a roll of the dice, but it's super cheap and they cannot bring an attorney. If they do not show up, in most states you win by default.

    Sorry to hear about your issues. I am glad this is not an issue with Freesync and multi-gpu configs.
     
  10. Master_shake_

    Master_shake_ Little Bitch

    Messages:
    7,515
    Joined:
    Apr 9, 2012
  11. ccityinstaller

    ccityinstaller 2[H]4U

    Messages:
    3,527
    Joined:
    Feb 23, 2007
    Looks like a Corsair HX570 or something like that, or it's the smaller Thermaltake version of the big-ass View 71 TG I have. They look so badass, but those glass panels really kill thermal performance. I have my 420mm rad set as an intake, with a rear 120mm intake pulling fresh air in for the M.2 heatsink/VRM area, and my 360mm rad mounted up top as exhaust. The glass panel was causing my temps to be 10C higher than my old setup. I thought something was wrong with my loop until I pulled that top panel off and the temperatures dropped almost instantly. Granted, not everyone has an 8-core CPU and 3 RX Vegas working at 100% 24/7, but it's a poor setup for thermals.
     
  12. ssnyder28

    ssnyder28 2[H]4U

    Messages:
    3,232
    Joined:
    May 9, 2012
    Why not try SLI with the two cards and G-sync turned off before you make a decision? Two Titan Xps in SLI should average FPS higher than your monitor's refresh rate, as long as the game you're playing supports it.
     
    N4CR likes this.
  13. Johnny78

    Johnny78 [H]ard|Gawd

    Messages:
    1,270
    Joined:
    Jan 13, 2007
    Yeah, it's a 570X from Corsair. It's a beautiful case. There's glass on all sides, so you have to manage cables carefully in the back. I'm not noticing any temperature issues. The glass panels over the radiators sit about an inch above the chassis (giving a floating look) and there's plenty of airflow. Under full load, one GPU sits in the low to mid 40s while the other is usually about 5 degrees higher. I have a 360mm and a 240mm radiator cooling the entire loop. Again, no issues with temperatures.
     
  14. Johnny78

    Johnny78 [H]ard|Gawd

    Messages:
    1,270
    Joined:
    Jan 13, 2007
    Yeah, I haven't had a chance to do much gaming with G-sync turned off to see if I notice any microstutter or other issues. I'll be doing that tonight to see if it's an option; I'll be playing Shadow of the Tomb Raider to see how it feels. Surprisingly, even with two Titan Xps, it's not a given that I'm above the monitor's 100 Hz refresh rate. The resolution is pretty high at 3440x1440, and I usually max out AA and every other option, which puts a lot of load on the GPUs. But I do like my eye candy.
     
  15. linuxdude9

    linuxdude9 Limp Gawd

    Messages:
    491
    Joined:
    Dec 25, 2004
    Which CPU and motherboard do you have?

    Whatever NVIDIA is doing, G-Sync requires a lot of PCI-E bandwidth, at least on pre-Turing cards. Also, on pre-Turing cards, games with temporal dependencies (like TAA in The Witcher 3) use a lot of PCI-E bandwidth for transferring data between the cards. I wouldn't do SLI without full x16/x16 bandwidth.

    For the Witcher 3, try the SLI bits listed here https://www.forum-3dcenter.org/vbulletin/showpost.php?p=10674669&postcount=2008
     
    Last edited: Oct 10, 2018 at 5:44 PM
  16. ccityinstaller

    ccityinstaller 2[H]4U

    Messages:
    3,527
    Joined:
    Feb 23, 2007
    Hmm... I do have a 3rd GPU to throw into the mix and am running an 8C/16T CPU maxed out, but I do not like my temps with it on. It's not like they are "bad" at 62C, lol, but I like them around 50C! :) Also forgot to mention that was peak summer with 75-77F ambient and no AC.

    I went with the smoked TG, so I do not have to worry about rear cable management too much.
     
  17. DoubleTap

    DoubleTap [H]ard|Gawd

    Messages:
    1,901
    Joined:
    Dec 16, 2010
    I'm reading everything I can about Pascal SLI + G-sync and this is the first I've heard of your issue.

    I have 3x 1440p G-sync monitors on a single 1080 Ti and have been maneuvering my system (case, mobo, PSU) to go SLI + Surround + G-sync, but damn if you can find much testing or info on that config.
     
  18. Johnny78

    Johnny78 [H]ard|Gawd

    Messages:
    1,270
    Joined:
    Jan 13, 2007
    I know, right? I like to think of myself as fairly well versed in known hardware issues, and I had never heard of it either. But apparently it's a thing. I can find dozens of threads on the GeForce forums describing this exact thing.


    https://forums.geforce.com/default/topic/1065687/sli/sli-gsync-low-gpu-usage/


    https://forums.geforce.com/default/...ou-fix-gsync-with-sli-low-gpu-usage-problem-/


    https://forums.geforce.com/default/...-1080-sli-low-gpu-usage-while-g-sync-enabled/


    Believe me, if I had known about it, I would have seriously paused before adding a 2nd card. The thing is, SLI worked just fine with this same monitor back when I had my 980ti's. The issue seems to be with Pascal generation cards.
     
  19. ccityinstaller

    ccityinstaller 2[H]4U

    Messages:
    3,527
    Joined:
    Feb 23, 2007
  20. N4CR

    N4CR 2[H]4U

    Messages:
    2,802
    Joined:
    Oct 17, 2011
    Was going to say this, and if not, then lock it to a rate that works manually.
     
  21. Chas

    Chas [H]ardness Supreme

    Messages:
    7,065
    Joined:
    Oct 31, 2005

    I can tell you that NVSurround is essentially an abandoned feature.

    It works... but there are annoying interface issues (like the inability to make the taskbar span or retract).

    Many apps still only maximize to a single screen.

    And as far as I can tell, NV hasn't really touched this feature much since it was introduced.
     
  22. Vega

    Vega [H]ardness Supreme

    Messages:
    5,701
    Joined:
    Oct 12, 2004
    At one time I was SLI's biggest proponent. Not anymore. Get the fastest single GPU and call it a day.
     
    chappedstick likes this.
  23. DoubleTap

    DoubleTap [H]ard|Gawd

    Messages:
    1,901
    Joined:
    Dec 16, 2010
    I've been using it for about 7 years and it works fine for me with no issues.

    Almost all SLI scaling tests are at 4K or 1440p; it's very hard to find info on how SLI scales in Surround.
     
  24. Seyumi

    Seyumi Limp Gawd

    Messages:
    169
    Joined:
    Mar 30, 2011
    Can anyone else confirm whether this is still an issue? I was about to upgrade to two 2080 Tis, but I also use G-Sync. I looked at the links to the GeForce forums, and it seems this has been going on for a few years. 90% of my rigs have been SLI (though not currently), but I'm pretty new to the G-Sync party and am using my first G-Sync monitor. I know about all the SLI issues and gripes, but if enabling G-Sync with SLI (in addition to all the normal SLI issues) will drop my GPU usage by 25%, then I'd rather not waste my time and money.
     
  25. Johnny78

    Johnny78 [H]ard|Gawd

    Messages:
    1,270
    Joined:
    Jan 13, 2007
    Okay, so after playing Shadow of the Tomb Raider for about 5 hours last night, I've come to the opinion (at least based on this game) that SLI is not worth the trouble or expense. Really a shame, too. In addition to G-sync affecting GPU utilization with SLI, there is another bug (or glitch, or just the nature of the tech) where TAA (temporal anti-aliasing) doesn't play well with SLI, resulting in either bad performance or weird light artifacting. In my case, if I enable TAA while running SLI, the in-game benchmark crashes the second I try to run it. With SLI, I can only enable SMAA, which is the lowest-quality anti-aliasing option. TAA, SMAAT2x and SMAA4x all crash my game when I try to run the benchmark with SLI. With just one card enabled, all the other AA settings work fine.


    So, like people suggested, I tried actually playing the game with SLI enabled but G-sync turned off. I tried this with two different settings. First, I enabled V-sync, which locked the frame rate at 100 FPS (my max refresh). All settings were maxed out except for AA, which was set to SMAA (due to the other AA settings not liking SLI). With two Titan Xps in SLI, the system easily held a locked 100 FPS. Then I kept everything the same but turned off V-sync. Again, all maxed settings with the exception of SMAA. At this setting my FPS hovered in the 140s and 150s, and both GPUs were running at nearly full load.

    Lastly, I played the game with SLI disabled and G-sync enabled, with all the same settings except AA, now set to SMAAT2x (SMAA4x is ridiculously taxing). At this setting, a single Titan Xp kept frame rates in the mid 70s on average. Definitely below the max refresh rate.

    So, the million dollar question: which provided the best playing experience? IT WAS NO CONTEST WHATSOEVER! The G-sync setup was by far the best. Even though the frame rates were far lower (70s versus a locked 100 and a variable 140s-150s), the actual playing experience is so damn buttery smooth by comparison. Don't get me wrong, non-G-sync is not bad and there wasn't really any stutter (in fact, if I didn't have G-sync to compare against, I would probably think it's great), but when you compare them side by side, the G-sync experience is far superior. For the guy above who said he doesn't think it's worth it, I would reconsider. Also, comparing the two side by side, the non-SLI experience looked better, because without SLI I can enable SMAAT2x, which makes things like Lara's hair look so much better.

    So, bottom line: one card (with G-sync) plays far better AND looks better than the SLI experience, even though SLI delivers far higher frame rates.
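    (Side note for anyone trying to quantify that "buttery smooth" feeling: average FPS hides stutter, so frame-time percentiles tell you more. A rough sketch, assuming a PresentMon-style CSV log with an MsBetweenPresents column; the file name is made up.)

    ```python
    # Rough frame-time analysis: smoothness lives in the tail of the
    # frame-time distribution, not in the FPS average. Assumes a
    # PresentMon-style CSV with an "MsBetweenPresents" column.
    import csv
    import statistics

    def frame_time_report(path):
        with open(path, newline="") as f:
            times = sorted(float(row["MsBetweenPresents"])
                           for row in csv.DictReader(f))
        avg = statistics.mean(times)
        p99 = times[int(0.99 * (len(times) - 1))]  # 99th-percentile frame time
        print(f"avg: {avg:.2f} ms ({1000.0 / avg:.1f} FPS)")
        print(f"p99: {p99:.2f} ms (~{1000.0 / p99:.1f} FPS, the '1% low' feel)")

    frame_time_report("sottr_gsync_run.csv")  # hypothetical log file
    ```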


    So yeah, I'm disappointed, and I'm about 95% convinced I'm going to sell this 2nd Titan Xp. It's not worth the expense if you're getting an inferior final product.
     
  26. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    8,499
    Joined:
    Oct 19, 2004
    After reading your first post, I was going to ask: why? I'd take G-sync or FreeSync over SLI or CrossFire any day of the week, after trying all four options in the last couple of years: a pair of Fury Xs and a pair of Vega 56s against a trio of FreeSync HP Omen 32s, then a pair of 1080 Tis against the same FreeSync Omens. The 1080 Tis on the FreeSync monitors sucked, which made me want VRR back, so I had to buy a G-sync monitor. When I bought the G-sync monitor, I quickly realized I didn't need two 1080 Tis and sold one.

    I can have my frame rates drop into the mid 40s with G-sync and it still feels buttery smooth (Hunt: Showdown). With SLI you need to keep the FPS exactly capped, and it STILL doesn't feel as smooth.

    Sell that second Titan Xp. There's no point to it with G-sync at 3440x1440. I've not found a game yet that can take my 1080 Ti below G-sync range on my Dell Alienware AW3418, and it doesn't matter that I don't hit 120 Hz continually... because it's smooth no matter what frame rate I'm getting.

    People who say G-sync is too much for too little either haven't experienced it or perceive things visually much differently than I do. VRR is the biggest advancement in gaming in the last decade, IMO. It improves the whole gaming experience significantly.
     
    Last edited: Oct 11, 2018 at 6:35 PM
    GoldenTiger, d8lock and Maddness like this.
  27. cybereality

    cybereality 2[H]4U

    Messages:
    3,081
    Joined:
    Mar 22, 2008
    Strange. I've never had this problem. I ran 3x 980s at one point, then 2x 1080s, and now 2x 2080 Tis with G-Sync 1440p Surround, and performance is great.
     
  28. linuxdude9

    linuxdude9 Limp Gawd

    Messages:
    491
    Joined:
    Dec 25, 2004
    Johnny78: Does each of your cards have a full x16 PCI-E connection, or are you running x16/x8 or x8/x8?

    Did you try those Witcher 3 SLI bits I posted above?
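    (If you want a quick way to check, NVML can report the link per card. A small sketch, assuming the nvidia-ml-py Python bindings; note the current width can downshift at idle due to PCIe power management, so look while the GPUs are under load.)

    ```python
    # Print current vs. maximum PCIe link generation/width per GPU via NVML.
    import pynvml

    pynvml.nvmlInit()
    for i in range(pynvml.nvmlDeviceGetCount()):
        h = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(h)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        print(f"GPU{i} {name}: "
              f"Gen{pynvml.nvmlDeviceGetCurrPcieLinkGeneration(h)} "
              f"x{pynvml.nvmlDeviceGetCurrPcieLinkWidth(h)} "
              f"(max Gen{pynvml.nvmlDeviceGetMaxPcieLinkGeneration(h)} "
              f"x{pynvml.nvmlDeviceGetMaxPcieLinkWidth(h)})")
    pynvml.nvmlShutdown()
    ```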
     
  29. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,300
    Joined:
    Nov 12, 2012
    I was running 770s in SLI and had a lot of problems with G-sync.

    Also, I'm into competitive gaming, so input lag is important to me. I did some tests with a high-speed camera.
    Any extra FPS you get from SLI doesn't reduce input lag. And if you're hitting your FPS cap, it can actually make input lag worse compared to a single card.

    So I stopped using SLI.
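    (The back-of-the-envelope reasoning, for what it's worth: with alternate frame rendering, each GPU still spends a full frame time on its own frame; the frames are just interleaved. Throughput doubles, but the time from input to finished frame doesn't shrink. Numbers below are purely illustrative.)

    ```python
    # Why AFR SLI raises FPS without cutting input lag: frames start twice as
    # often, but each one still takes the full single-GPU render time.
    RENDER_MS = 10.0  # example: one GPU needs 10 ms per frame

    single_fps = 1000.0 / RENDER_MS   # 100 FPS
    single_lag_ms = RENDER_MS         # input at frame start -> ~10 ms to finish

    afr_fps = 2 * single_fps          # 200 FPS: a frame *starts* every 5 ms
    afr_lag_ms = RENDER_MS            # ...but each frame still takes 10 ms

    print(f"single GPU: {single_fps:.0f} FPS, ~{single_lag_ms:.0f} ms render latency")
    print(f"2-GPU AFR:  {afr_fps:.0f} FPS, ~{afr_lag_ms:.0f} ms render latency")
    ```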
     
    cybereality likes this.
  30. guitarslingerchris

    guitarslingerchris Failure is just success rounded down

    Messages:
    6,460
    Joined:
    Oct 29, 2004
    Better go back to a CRT then.
     
  31. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    7,373
    Joined:
    Jun 13, 2003
    Had the basic SLI challenges when running 970 SLI with G-Sync, but didn't have any 'hard stops'. Definitely noticed when that second card was missing!