So I've had G-SYNC since day one, when they released the add-on board for the VG248QE, and I was using a 680 at the time. It sounded like it would solve just about every complaint I had with games stuttering or tearing, and from what I can recall, it did. Fast forward a few years: I now have an Acer X34 and a 1080 Ti, and just about every recent AAA game I play stutters. What happened?
I know the recommended way to set up G-SYNC has changed quite a bit over the years. Most recently, I believe it's: make sure G-SYNC is enabled in the NVCP (it's on by default), disable V-SYNC in-game, then set a framerate cap a couple FPS below your screen's refresh rate. That's 88 in my case, since I've overclocked the X34 to 90Hz.
Is that still accurate? Because that's what I'm doing, and a lot of games are a mess regardless. I use an FPS counter and games are just about always at that 88, maybe dipping to 60-70 at times, but isn't G-SYNC supposed to make those dips unnoticeable? I'm starting to wonder if it's down to poor coding and optimization in the games themselves, and no amount of hardware I throw at it will do anything.
Edit: Just found this. It's similar to what I have above, except it recommends a framerate limit of -3 instead of -2, plus one other key difference: apparently you're supposed to enable V-SYNC in the NVCP as well (which is NOT on by default), otherwise G-SYNC loses the ability to compensate for sudden framerate drops. This explains a lot, and it's what you wanted at launch, but I swear they changed it 1.5-2 years ago to where you DIDN'T want V-SYNC on in the NVCP... or I dreamt the whole thing and it never changed at all.
https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/
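To keep it all straight, the full recipe from the linked guide can be summarized as a sketch. This is just my own summary in code form, not an NVIDIA API; the function and setting names are made up for illustration, and the -3 margin is the guide's minimum recommendation:

```python
import math

def recommended_cap(refresh_hz: float, margin: int = 3) -> int:
    """Return an FPS limit at least `margin` below the refresh rate,
    so frametime spikes stay inside the G-SYNC range and V-SYNC
    never actually engages."""
    return math.floor(refresh_hz) - margin

# Hypothetical settings dict summarizing the guide's checklist:
gsync_setup = {
    "nvcp_gsync": True,    # G-SYNC enabled in NVIDIA Control Panel (the default)
    "nvcp_vsync": True,    # V-SYNC ON in the NVCP (NOT the default!)
    "ingame_vsync": False, # V-SYNC OFF inside the game itself
    "fps_cap": recommended_cap(90),  # 87 for an X34 overclocked to 90Hz
}

print(gsync_setup["fps_cap"])
```

So with the -3 rule my cap should actually be 87, not 88, on a 90Hz panel.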