Some Nvidia forum posters claim that turning on G-Sync reduces performance when using SLI. The folks over at ExtremeTech grabbed a pair of GTX 1080s to put that theory to the test, and the results were interesting. Deus Ex: Mankind Divided, for example, dropped from 49.5 FPS to 44.6 FPS with G-Sync enabled in DX11 mode, but G-Sync had no measurable effect in DX12 mode. Far Cry 5, meanwhile, seemed to suffer more of a G-Sync deficit as anti-aliasing was cranked down. G-Sync does appear to affect SLI performance in certain games, but the reason for the drop remains unclear.

One pattern, however, is clear: enabling G-Sync alongside SLI is not guaranteed to tank your frame rate. We also tested Metro Last Light Redux with a forced AFR2 rendering profile, and that game showed no performance drop at all between G-Sync enabled and disabled. Deus Ex: Mankind Divided showed a small penalty that vanished in DX12, while Hitman takes too large a performance hit in DX12 with SLI to ever justify using that mode. Three games, three different performance models.

Testing forum reports is always dicey, not because forum commenters are liars, but because most don't provide enough technical information to be sure you're reproducing the problem correctly. I don't have an explanation for this issue at the moment, and I realize that "timing problem" is extremely vague; it's a theory that happens to fit some common-sense facts. Collectively, our results suggest the issue is real and that the performance gaps could be as large as Nvidia users say, particularly if they continue to worsen as frame rate increases.
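For readers who want to quantify results like these, here is a minimal sketch of the arithmetic involved, using the Deus Ex figures quoted above. The helper function names are my own illustration, not part of any benchmarking tool; the frame-time conversion is included because small FPS deltas are often easier to reason about in milliseconds per frame.

```python
def pct_drop(fps_off: float, fps_on: float) -> float:
    """Percentage of performance lost when G-Sync is enabled."""
    return (fps_off - fps_on) / fps_off * 100.0

def frame_time_ms(fps: float) -> float:
    """Average frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

# Deus Ex: Mankind Divided (DX11), figures from the test above
fps_gsync_off, fps_gsync_on = 49.5, 44.6
print(f"G-Sync penalty: {pct_drop(fps_gsync_off, fps_gsync_on):.1f}%")
# -> G-Sync penalty: 9.9%
print(f"Frame time: {frame_time_ms(fps_gsync_off):.2f} ms -> "
      f"{frame_time_ms(fps_gsync_on):.2f} ms")
# -> Frame time: 20.20 ms -> 22.42 ms
```

A roughly 10 percent drop, or about 2.2 ms of extra frame time, is the kind of gap that is easy to miss without an on/off comparison at identical settings, which is part of why forum reports are hard to reproduce.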