G-sync and SLI. Regretting getting 2nd card...

Johnny78

So I've noticed that with this last generation of cards (Pascal), there have been lots of cries about the death of SLI and how it's not worth the money or trouble. I've never been one of those people. I've always had a good experience with SLI; not perfect, and not everything was supported, but for the most part I've enjoyed SLI and noticed a nice performance bump in most games I play. That said, my last experience with SLI was with two 980 Tis, which overall worked very well.

Fast forward to today: I've been using a Titan Xp for the last 9 months or so, and this last week I grabbed a second Titan Xp when I came across a good deal. After getting everything set up, I ran the usual set of benchmarks (Valley, Heaven and Firestrike) and everything looked good score-wise (Firestrike Extreme was around 20.5K). I loaded up a couple of games, however, and noticed that my FPS was really not that much different from my single Xp. GPU utilization was only around 70-75% on GPU 1 and maybe 80-85% on GPU 2. After much googling, I came across complaints that G-sync and Pascal SLI cards don't seem to work well together: with G-sync enabled, the GPU load drops by a noticeable amount. Sure enough, I disabled G-sync, fired up Witcher 3, and lo and behold, both my GPUs were around 98-99% load and my FPS rose by about 30.
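(Side note for anyone who wants to log this outside of an in-game overlay: the snippet below is only an illustrative sketch, not how the numbers in this thread were captured. It assumes the NVML Python bindings are installed, e.g. pip install nvidia-ml-py, and just prints each card's load once a second while you play.)

```python
# Hypothetical helper: print per-GPU utilization once a second (Ctrl+C to stop).
# Assumes the NVML Python bindings are installed: pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
           for i in range(pynvml.nvmlDeviceGetCount())]

try:
    while True:
        readings = []
        for i, h in enumerate(handles):
            util = pynvml.nvmlDeviceGetUtilizationRates(h)  # .gpu is a percentage
            readings.append("GPU%d: %3d%%" % (i, util.gpu))
        print("  ".join(readings))
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```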

I'm using an Acer Predator X34 by the way.

Apparently this has been a known issue for the last couple of years, and Nvidia has acknowledged it but has not found a fix. So now I'm a little disappointed that I either have to play with G-sync off or disable one of the cards (which obviously defeats the point of SLI). Lame that Nvidia can't get their own technologies to work well with each other.

Anyone else run into this? What would you recommend? Sell the second Titan Xp and just go back to one card? Get a new monitor that doesn't have G-sync (what else is out there that's around the same size?).

Anyway, I think I'm now going to have to apply for membership into the "SLI sucks" camp. :sick:


Pic of rig.

 
I don't have the issue with G-sync (I don't use it; too much money for too little benefit), but I have 2 Titan Xps and noticed minimal gains from SLI. It largely depends on the games you play, though. I have had some games actually work better with SLI disabled.
 
You don't use G-sync because you disable it, or you don't have a G-sync enabled monitor? I'm specifically talking about using a G-sync monitor and SLI with Pascal cards.
 
I don't buy G-sync monitors because of the cost. The benefit-to-cost ratio is not in my favor. I knew what you were talking about; I was just adding my own experiences with SLI and Pascal cards.
 
One of the reasons I frequent NVIDIA's forum is to be informed of issues like this. People have been complaining to NVIDIA about G-Sync + SLI issues for quite a while now and they still haven't fixed it. I think I recall a post from an NVIDIA representative explaining that they cannot reproduce the problem in their test environment, so they neither acknowledge the issue in the driver notes nor attempt to fix it.
 
It is entirely possible that the issue lies in the hardware of the G-sync module and a firmware update is unable to fix it, meaning Nvidia would have to issue a "recall" and incur the cost and bad PR for .0001% of their customer base. They clearly decided that it is not in their best interest to fix it. If it can be replicated across multiple builds with different cards and different G-sync monitors, I would be heading to a class action attorney and getting in line first, so that you actually get more than $1.50 when Nvidia settles the suit.

You could also sue Nvidia in small claims court for the cost of the hardware/LCD, since they do not work properly. It's a roll of the dice, but it's super cheap and they cannot bring an attorney. If they do not show up, in most states you win by default.

Sorry to hear about your issues. I am glad this is not an issue with FreeSync and multi-GPU configs.
 
Looks like a Corsair HX570 or something like that, or it's the smaller Thermaltake version of the big-ass View 71 TG I have. They look so badass, but those glass panels really kill performance. I have my 420mm rad set as an intake, with a rear 120mm intake pulling fresh air in for the M.2 heatsink/VRM area in my case, and my 360mm rad mounted up top as exhaust. The glass panel was causing my temps to be 10C higher than my old setup. I thought something was wrong with my loop until I pulled that top panel off and the temperatures dropped nearly instantly. Granted, not everyone has an 8-core CPU and 3 RX Vegas working 100% 24/7, but it's a poor setup.
 
Why not try SLI with the two cards and G-sync turned off before you make a decision? Two Titan Xps in SLI should have an average FPS higher than your monitor's refresh rate, as long as the game you're playing supports it.
 
Looks like a Corsair HX570 or something like that, or it's the smaller Thermaltake version of the big-ass View 71 TG I have. They look so badass, but those glass panels really kill performance. I have my 420mm rad set as an intake, with a rear 120mm intake pulling fresh air in for the M.2 heatsink/VRM area in my case, and my 360mm rad mounted up top as exhaust. The glass panel was causing my temps to be 10C higher than my old setup. I thought something was wrong with my loop until I pulled that top panel off and the temperatures dropped nearly instantly. Granted, not everyone has an 8-core CPU and 3 RX Vegas working 100% 24/7, but it's a poor setup.

Yeah, it's a 570X from Corsair. It's a beautiful case. There's glass on all sides, so you have to manage your cables carefully in the back. I'm not noticing any temperature issues. The glass panels over the radiators sit about an inch above the chassis (giving a floating look) and there's plenty of airflow. Under full load, one GPU sits in the low to mid 40s while the other is usually about 5 degrees higher. I have a 360 and a 240 radiator cooling the entire loop. Again, no issues with temperatures.
 
Why not try SLI with the two cards and G-sync turned off before you make a decision? Two Titan Xps in SLI should have an average FPS higher than your monitor's refresh rate, as long as the game you're playing supports it.

Yeah, I haven't had a chance to do much gaming with G-sync turned off to see if I notice any microstutter or other issues. I'll be doing that tonight to see if that's an option, playing Shadow of the Tomb Raider to see how it feels. Surprisingly, even with two Titan Xps it's not always a given that I'm above the monitor's 100 Hz refresh rate. The resolution is pretty high at 3440x1440 and I usually max out AA and every other option, which puts a lot of load on the GPUs. But I do like my eye candy.
 
Which CPU and motherboard do you have?

Whatever Nvidia is doing, G-Sync requires a lot of PCI-E bandwidth, at least on pre-Turing cards. Also, on pre-Turing cards, games with temporal dependencies (like TAA in The Witcher) use a lot of PCI-E bandwidth for transferring data between the cards. I wouldn't do SLI without full 16x/16x bandwidth.
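(Not something from the post above, just an illustrative way to verify what link each card has actually negotiated rather than trusting the motherboard manual. It assumes the NVML Python bindings, e.g. pip install nvidia-ml-py, are available; note the link generation can downshift at idle, so check while the cards are under load.)

```python
# Rough sketch: report each GPU's current vs. maximum PCIe link width/generation.
# Assumes the NVML Python bindings are installed: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    h = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(h)
    if isinstance(name, bytes):  # older bindings return bytes
        name = name.decode()
    cur_w = pynvml.nvmlDeviceGetCurrPcieLinkWidth(h)
    max_w = pynvml.nvmlDeviceGetMaxPcieLinkWidth(h)
    cur_g = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(h)
    max_g = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(h)
    print("GPU%d %s: PCIe x%d gen %d (max x%d gen %d)"
          % (i, name, cur_w, cur_g, max_w, max_g))
pynvml.nvmlShutdown()
```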

For the Witcher 3, try the SLI bits listed here https://www.forum-3dcenter.org/vbulletin/showpost.php?p=10674669&postcount=2008
 
Yeah, it's a 570X from Corsair. It's a beautiful case. There's glass on all sides, so you have to manage your cables carefully in the back. I'm not noticing any temperature issues. The glass panels over the radiators sit about an inch above the chassis (giving a floating look) and there's plenty of airflow. Under full load, one GPU sits in the low to mid 40s while the other is usually about 5 degrees higher. I have a 360 and a 240 radiator cooling the entire loop. Again, no issues with temperatures.

Hmm... I do have a 3rd GPU to throw into the mix and am running an 8C/16T CPU maxed, but I do not like my temps with it on. It's not like they are "bad" at 62C, lol, but I like them around 50C! :) Also, forgot to mention that was peak summer with 75-77F ambient and no AC.

I went with the smoked TG version so I do not have to worry about rear cable management too much.
 
I'm reading everything I can about Pascal SLI + G-sync and this is the first I've heard of your issue.

I have 3x 1440p G-sync monitors on a single 1080 Ti and have been maneuvering my system (case, mobo, PSU) toward SLI + Surround + G-sync, but damn if you can find much testing or info on that config.
 
I know, right? I like to think of myself as fairly well versed in known hardware issues and I had never heard of it either. But apparently it's a thing; I can find dozens of threads on the GeForce forums describing exactly this.


https://forums.geforce.com/default/topic/1065687/sli/sli-gsync-low-gpu-usage/


https://forums.geforce.com/default/...ou-fix-gsync-with-sli-low-gpu-usage-problem-/


https://forums.geforce.com/default/...-1080-sli-low-gpu-usage-while-g-sync-enabled/


Believe me, if I had known about it, I would have seriously paused before adding a second card. The thing is, SLI worked just fine with this same monitor back when I had my 980 Tis. The issue seems to be with Pascal-generation cards.
 
Why not try SLI with the two cards and G-sync turned off before you make a decision? Two Titan Xps in SLI should have an average FPS higher than your monitor's refresh rate, as long as the game you're playing supports it.
Was going to say this, and if not, then manually lock it to a rate that works.
 
I'm reading everything I can about Pascal SLI + G-sync and this is the first I've heard of your issue.

I have 3x 1440p G-sync monitors on a single 1080 Ti and have been maneuvering my system (case, mobo, PSU) toward SLI + Surround + G-sync, but damn if you can find much testing or info on that config.


I can tell you that NVSurround is essentially an abandoned feature.

It works... but there are annoying interface issues (like the inability to make the taskbar span or retract).

The fact that many apps still only maximize to a single screen...

And as far as I can tell, NV hasn't really even touched this feature much since it was introduced.
 
I've been using it for about 7 years and it works fine for me with no issues.

Almost all SLI scaling tests are 4K or 1440p - it's very hard to find info on how SLI scales in Surround.
 
Can anyone else confirm if this is still an issue? I was about to upgrade to two 2080 Tis, but I also use G-Sync. I looked at the links to the GeForce forums and it seems this has been going on for a few years. 90% of my rigs have been SLI (though not at the moment), but I'm pretty new to the G-Sync party and currently using my first G-Sync monitor. I know about all the SLI issues and gripes, but if enabling G-Sync with SLI (in addition to all the normal SLI issues) will drop my GPU usage by 25%, then I'd rather not bother wasting my time and money.
 
Okay, so after playing Shadow of the Tomb Raider for about 5 hours last night, I've come to the opinion (at least based on this game) that SLI is not worth the trouble or expense. Really a shame, too. In addition to G-sync affecting GPU utilization in SLI, there is another bug (or glitch, or just the nature of the tech) where TAA (temporal anti-aliasing) doesn't play well with SLI, resulting in either bad performance or weird light artifacting. In my case, if I enable TAA while running SLI, the in-game benchmark crashes the second I try to run it. With SLI, I can only enable SMAA, which is the lowest-quality anti-aliasing option. TAA, SMAAT2x and SMAA4x all crash the game when I try to run the benchmark (using SLI). With just one card enabled, all the other AA settings work fine.


So, like people suggested, I tried actually playing the game with SLI enabled but G-sync turned off. I tried this at two different settings. First, I enabled V-sync, which locked the frame rate at 100 fps (my max refresh). All settings were maxed out except for AA, which was set to SMAA (due to the other AA settings not liking SLI). With two Titan Xps in SLI the system easily held a locked 100 fps. Then I kept everything the same but turned off V-sync: again all maxed settings with the exception of SMAA. At this setting my FPS hovered in the 140s and 150s, and both GPUs were running at nearly full load.

Lastly, I played the game with SLI disabled and G-sync enabled, with all the same settings except AA now set to SMAAT2x (SMAA4x is ridiculously taxing). At this setting a single Titan Xp kept frame rates in the mid 70s or so on average, definitely below the max refresh rate.

So, the million-dollar question: which provided the best playing experience? IT WAS NO CONTEST WHATSOEVER! The G-sync setup was by far the best experience. Even though the frame rates were far lower (70s versus a locked 100 and a variable 140s), the actual playing experience is so damn buttery smooth by comparison. Don't get me wrong, non-G-sync is not bad and there wasn't really any stutter (in fact, if I didn't have G-sync to compare against I would probably think it's great), but when you compare them side by side the G-sync experience is far superior. For the guy above who said he doesn't think it's worth it, I would reconsider. Also, comparing the two side by side, the non-SLI experience looked better, because without SLI I can enable SMAAT2x, which makes things like Lara's hair look so much better.

So, bottom line: one card (with G-sync) plays far better AND looks better than the SLI experience, even though SLI puts out far higher frame rates.


So yeah, I'm disappointed, and I'm about 95% convinced I'm going to sell this second Titan Xp. It's not worth the expense if you're getting an inferior final product.
 
After reading your first post I was going to ask WHY? I'd take G-sync or FreeSync over SLI or Crossfire any day of the week, having tried all four options in the last couple of years: a pair of Fury Xs and a pair of Vega 56s against a trio of FreeSync HP Omen 32s, and then a pair of 1080 Tis against the same FreeSync Omen 32s. The 1080 Tis without VRR sucked, which made me want VRR back, so I had to buy a G-sync monitor. When I bought the G-sync monitor I quickly realized I didn't need two 1080 Tis and sold one.

I can have my frame rates drop into the mid 40s with G-sync and it feels buttery smooth (Hunt: Showdown). With SLI you need to keep the FPS exactly capped and it STILL doesn't feel as smooth.

Sell that second Titan Xp. There's no point in it with G-sync at 3440x1440. I've not found a game yet that can take my 1080 Ti below G-sync range on my Dell Alienware AW3418, and it doesn't matter that I don't hit 120 Hz continually, because it's smooth no matter what frame rate I'm getting.

People who say G-sync is too much for too little either haven't experienced it or perceive things visually very differently than I do. VRR is the biggest advancement in gaming in the last decade, IMO. It improves the whole gaming experience significantly.
 
Strange. I've never had this problem. I ran 3x 980s at one point, then 2x 1080, and now 2x 2080 Ti with G-Sync 1440p Surround, and performance is great.
 
Johnny78: Does each of your cards have a full 16x PCI-E connection, or are you running 16x/8x or 8x/8x?

Did you try those Witcher 3 SLI bits I posted above?
 
I was doing SLI 770s and had a lot of problems with G-sync.

Also I'm into competitive gaming so input lag is important. I did some tests with a high speed camera.
Any extra fps you get with SLI doesn't reduce input lag. And if you're hitting your fps cap it could actually make input lag worse compared to a single card.

So I stopped using SLI.
 
Better go back to a CRT then.
 
Had the basic SLI challenges when running 970 SLI with G-Sync, but didn't have any 'hard stops'. Definitely noticed when that second card was missing!
 
DX12 mGPU in Shadow of the Tomb Raider is working great. 2x EVGA 1080 Ti SC Black. TAA works, and so does SMAA2x but with some artifacts; SMAA is artifact-clean. I use SMAA and DSR (rendering 3620x2036 on a 2560x1440 monitor) as well as HDR - so far it has been the best visual gaming experience I've ever had. Using Adaptive Sync with the monitor at 100 Hz:
  • As a note: to get the Adaptive Sync refresh rate set, the desktop refresh rate has to be the same as what you want - for example, if a 100 Hz refresh rate is used for the monitor in game, the desktop needs to be at 100 Hz for Adaptive Sync to work at 100 Hz in the game (a rough sketch of checking this follows after the list)
  • The game is utterly beautiful! Awe-inspiring to say the least, for the most part maintaining 100 FPS with all settings maxed out minus motion blur, making it very smooth and virtually tear-free
  • HDR took some trial and error to get looking awesome in game - the desktop, on the other hand, looks terrible, and I can see why some are turned off. Still, if one has a good HDR monitor, adjustment and fine-tuning are probably needed for the best impact
  • I could not do this with a single 1080 Ti; I would be limited to 60 Hz rather than 100 Hz, and no DSR. While SMAAT2x would be more usable with a single card, the rendering and anti-aliasing would not be as good.
  • I do agree that if pure FreeSync or G-sync were used it would be better or easier to keep it smooth, but it is pretty smooth as it is. I'm using a FreeSync 2 monitor with the two 1080 Tis since the Vega FEs just do not handle this game well (they need updated drivers that work).
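(Illustrative sketch only, for the desktop-refresh-rate note in the first bullet above: one way to check, and if needed force, the Windows desktop refresh rate so it matches the in-game rate. It assumes the pywin32 package is installed; the DisplayFrequency/Fields names come from its DEVMODE wrapper, and the target rate here is just an example.)

```python
# Hypothetical sketch: make sure the desktop refresh rate matches the in-game rate.
# Assumes pywin32 is installed: pip install pywin32
import win32api
import win32con

TARGET_HZ = 100  # the refresh rate you run the game at (example value)

# Read the primary display's current mode.
dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print("Desktop is currently %d Hz" % dm.DisplayFrequency)

if dm.DisplayFrequency != TARGET_HZ:
    dm.DisplayFrequency = TARGET_HZ
    dm.Fields = win32con.DM_DISPLAYFREQUENCY  # only change the refresh rate
    result = win32api.ChangeDisplaySettings(dm, 0)
    if result == win32con.DISP_CHANGE_SUCCESSFUL:
        print("Desktop switched to %d Hz" % TARGET_HZ)
    else:
        print("Mode change failed (code %d); set it in Display Settings instead" % result)
```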
 
I know it's subjective, but you honestly prefer the way SMAA looks in this game? SMAA is completely ineffectual at anti-aliasing Lara's hair, and since, well, that's in your frame the entire time, it starts to really annoy me. TAA and SMAA2x look far better to me.
 
I've been running G-Sync displays and SLI since day 1. Never had an issue. Runs beautifully. Currently running dual Asus PG279Q displays with SLI 1080Ti cards.
 
Have you actually checked the GPU utilization in SLI and compared it with and without G-sync? It may not be something you've actually noticed, since frame rates may still be high in either case.
 
This was me back in my SLI days. You SLI'd for so long that you really had no concept of just how powerful single cards are...
 
Have you actually checked the GPU utilization in SLI and compared it with and without G-sync? It may not be something you've actually noticed, since frame rates may still be high in either case.
People that claim to "never have an issue" with a niche setup are not even worth debating with. They live in a special world of their own devoid of any facts.
 
I won't speak for Sprayingmango, but having had generations of Crossfire and SLI experience myself, including with G-Sync, I can say that due diligence goes a long way. If you know that you're running a niche setup and you do the research you can avoid much of the headache.

Hell, you can avoid much of the headache just by not trying to play games on release.

What really gets me is why they're running two G-Sync monitors side-by-side :D
 
I know it's subjective, but you honestly prefer the way SMAA looks in this game? SMAA is completely ineffectual at anti-aliasing Lara's hair, and since, well, that's in your frame the entire time, it starts to really annoy me. TAA and SMAA2x look far better to me.
I use DSR with SMAA; it looks way better than TAA or SMAA2x. I don't have a G-Sync monitor, but I am using an HDR 144 Hz monitor. Image quality is spectacular and smooth enough for me.
 
Have you actually checked the GPU utilization in SLI and compared it with and without G-sync? It may not be something you've actually noticed, since frame rates may still be high in either case.

I have, and in most games I play the utilization is maxed out or near max. In some Ubisoft games like AC (where SLI is not supported), the second card just stays at pretty low utilization.
 