G-Sync on vs. off in FPS games? (my observation)

I should start by saying I've had a G-Sync monitor before, an official one with the module, not just a G-Sync-compatible model, and I preferred the 144 Hz itself over G-Sync. I didn't find it made a huge difference in my gaming experience, and I ended up returning that monitor because it had issues. Skip ahead 3-4 years and I buy a new monitor that is G-Sync compatible. I saw that as a selling point, or at least a bonus.

Recently I've been playing a ton of Valorant with some buds. I also decided to overclock my RTX 2060, so I downloaded the 3DMark demo on Steam this weekend, and one of the things it tells you on startup is to turn G-Sync off. I went ahead and did it, ran the benchmark, then started a game a couple hours later.

This is where it gets interesting. After I joined the game I realized I hadn't turned G-Sync back on, and I didn't want to switch it mid-session and mess up my video or crash the game or something. And I was far more accurate and doing waaaaay better. I feel like G-Sync might add a bit of lag or something; I seem to be far more accurate now and I really don't notice a quality drop.

What do you guys think? I'm sort of feeling like G-Sync/FreeSync is mostly marketing hype, pushed by reviewers who rave about how wonderful it is while playing Minecraft or something and then write their reviews. Honestly I kinda doubt the pros are using it even if they're running G-Sync monitors.

Just my 2 cents, but I say give it a try for a few hours and see if it makes a difference for you.
 

You need to go read the Blur Busters article and make sure you have it set up correctly. You're also playing a beta game, so it's maybe not the best testing material. That said, it adds very little lag and IMO is great.
 
I'm sort of feeling like G-Sync/FreeSync is mostly marketing hype, pushed by reviewers who rave about how wonderful it is while playing Minecraft or something and then write their reviews.


Yeahhhhh.... You must be doing it wrong.

Check out the Blur Busters article for sure. When done right, G-Sync is the best new tech I've seen in the past decade.
 
So you guys are saying G-Sync doesn't add any lag at all? I'm pretty skeptical; this is just my observation, after all. As for doing it right, this is my second time around, and I've never had a lot of issues with screen tearing. Then again, I remember having arguments with people about V-Sync, which was friggin' garbage; it was like having a ball and chain around your mouse.

I should note I also seem to prefer uncapped FPS vs. fixed, so this could just be how my eyes process these images.
 
Setup in sig, plus an Acer XG270HU FreeSync monitor (144 Hz 1440p TN panel). No lag with FreeSync on in Modern Warfare, CS:GO, or even CS:S (I still like to play gungame on the AWP clan server sometimes, lol).
 
I am going to take a wild guess and assume you're running the game with "competitive" settings in order to achieve the highest framerate you can. If that is the case, then G-SYNC is useless to you. There is no use for G-SYNC if you're running an FPS that is >= your monitor's refresh rate.
 
Blur Busters and Battlenonsense used scientific testing methods to determine the actual lag resulting from VRR, which is negligible, and their results pretty much reflect my own experience, provided the framerate is capped sufficiently below the VRR ceiling. I'm sensitive to tearing, so I find VRR indispensable, but its usefulness will depend on your priorities and preferences, I guess.
 
Pretty sure I remember reviews where VRR adds some lag, vs uncapped.
 
Pretty sure I remember reviews where VRR adds some lag, vs uncapped.

It does; as I said in my original post, it adds very little. If you want the absolute lowest input lag, you need to run an fps 2x-3x+ your refresh rate. If the OP is running G-SYNC + V-Sync OFF, then he's not even using the tech if he stays above his max refresh. And in Valorant that's highly likely.
 
This is a whole big discussion and kind of a personal preference. The Blur Busters forum is definitely the place you want to be.

I have been testing the new 240 Hz monitors, and I have my next one coming at some point here.

From what Blur Busters says, the "optimal" settings for G-Sync are actually G-Sync on and V-Sync on, which doesn't increase input lag; there's a huge article there on this. As I understood it, they are saying it doesn't increase input lag on top of the small input lag that is already there from G-Sync. G-Sync does add input lag, but I think it's fairly minimal. All G-Sync does is smooth the screen out a bit. Some people really prefer this, but others may not care, notice, or find it worth the small input lag increase. It's not "marketing" in the sense that it does do what it is supposed to; the question is whether it's worth the extra $150-200 NVIDIA G-Sync tax on top.
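For reference, that same article also recommends capping the frame rate a few fps below max refresh so frametimes never leave the VRR window. Here's a minimal sketch of that arithmetic, assuming the commonly cited minus-3 offset from the Blur Busters article (the helper name and offset are illustrative, not an NVIDIA-documented formula):

[CODE]
# Hypothetical helper: cap fps slightly below max refresh so frametimes
# stay inside the VRR window. The -3 offset is an assumption based on
# the Blur Busters G-SYNC 101 recommendation.

def vrr_fps_cap(refresh_hz: int, offset: int = 3) -> int:
    return refresh_hz - offset

for hz in (60, 120, 144, 240):
    cap = vrr_fps_cap(hz)
    print(f"{hz} Hz panel -> cap at {cap} fps "
          f"({1000 / cap:.2f} ms/frame vs {1000 / hz:.2f} ms/frame)")
[/CODE]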

And then you get into the whole discussion of the G-Sync module vs. the FreeSync (G-Sync Compatible) stuff, and that is a whole new discussion. Blur Busters covers most of that too.
 
I am going to take a wild guess and assume you're running the game with "competitive" settings in order to achieve the highest framerate you can. If that is the case, then G-SYNC is useless to you. There is no use for G-SYNC if you're running an FPS that is >= your monitor's refresh rate.
In the case of "competitive" settings, I would think FastSync would suffice. You'd get the benefit of having the game loop running full tilt while eliminating tearing. True, GSync wouldn't be useful while the framerate is over the max threshold, but if it ever dips below it, you'd get the benefits of GSync as well.

Even without GSync, FastSync is better than nothing at all, imo. Because fuck tearing.

Valorant runs at such a high framerate that it's really difficult to defend GSync's usefulness anyway.
 
Valorant runs at such a high framerate that it's really difficult to defend GSync's usefulness anyway.

That would be a reason to use g-sync though. :)

If fps is way over the monitor's refresh rate, screen tearing will be happening, which is the point of g-sync. :) Of course you can limit fps as well, which would help take care of that.
 
That would be a reason to use g-sync though. :)

If fps is way over the monitor's refresh rate, screen tearing will be happening, which is the point of g-sync. :) Of course you can limit fps as well, which would help take care of that.
You make it sound like GSync does something when framerates exceed the refresh rate.

As a sanity check, I just ran a Source engine title with vsync off + gsync on. Framerate capped = no tearing. Framerate uncapped = tearing. Clearly GSync is doing nothing about tearing when beyond the refresh rate.

I can't vouch for input latency benefits. The Blurbusters guys seem to have covered that extensively.
 
You make it sound like GSync does something when framerates exceed the refresh rate.

As a sanity check, I just ran a Source engine title with vsync off + gsync on. Framerate capped = no tearing. Framerate uncapped = tearing. Clearly GSync is doing nothing about tearing when beyond the refresh rate.

I can't vouch for input latency benefits. The Blurbusters guys seem to have covered that extensively.

Just waking up; I think I was confusing myself there. What I meant was: if your fps is over your monitor's refresh rate and you want a smooth experience, and you normally run without capping your fps, you would want to cap fps and run g-sync, because you would probably get tearing otherwise.
 
Just waking up; I think I was confusing myself there. What I meant was: if your fps is over your monitor's refresh rate and you want a smooth experience, and you normally run without capping your fps, you would want to cap fps and run g-sync, because you would probably get tearing otherwise.
Gotcha.

Given the OP, the subject is weighing the benefits of GSync in a "competitive" FPS scenario, specifically the title Valorant. Your typical CS:GO champ is not going to want to cap anything if it nets them even the smallest advantage.

To the rest of us with potato reflexes, sure, capping the fps solves the issue, and having GSync makes the experience more tolerable.

However, to those CS:GO cyborgs, GSync will mostly go wasted.

I was more or less concurring with what Armenius stated.

I would be interested in hearing OP's Valorant experience when using FastSync exclusively.
 
One of the questions here is whether there is some base input lag penalty to enabling VRR (and G-Sync specifically).

If there is a base input lag penalty, I'd like to know what it is, and what it depends upon.

If there isn't, then the penalty is simply however far FPS is limited below the maximum refresh rate in order to ensure only whole frames are transmitted and displayed, i.e., no tearing.

Theory:

Let's take 144Hz, and limit to 141FPS. Your hypothetical penalty is then 3/144 seconds, or 2.1ms. This number decreases as refresh rates increase, and would only apply to frametimes that are shorter than 2.1ms.

In a competitive scenario where latencies stack, that might be important; over the internet, not so much.
 
One of the questions here is whether there is some base input lag penalty to enabling VRR (and G-Sync specifically).

If there is a base input lag penalty, I'd like to know what it is, and what it depends upon.

If there isn't, then the penalty is simply however far FPS is limited below the maximum refresh rate in order to ensure only whole frames are transmitted and displayed, i.e., no tearing.

Theory:

Let's take 144Hz, and limit to 141FPS. Your hypothetical penalty is then 3/144 seconds, or 2.1ms. This number decreases as refresh rates increase, and would only apply to frametimes that are shorter than 2.1ms.

In a competitive scenario where latencies stack, that might be important; over the internet, not so much.
How did you come up with 2.1ms? The difference in milliseconds per frame going from 144fps to 141fps is:

1000 ms ÷ 144 frames = 6.94 ms per frame
1000 ms ÷ 141 frames = 7.09 ms per frame

7.09 - 6.94 = 0.15 ms difference per frame rendered, 144 fps to 141 fps.

If you wait 3 frames (lag) for an action to occur, like when playing over the internet, just multiply the ms/frame numbers. For example:
  • @ 144fps: 6.94ms/frame x 3 frames lag = 20.8ms lag
  • @ 141fps: 7.09ms/frame x 3 frames lag = 21.3ms lag
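If anyone wants to plug in their own refresh/cap numbers, here is a quick sketch of the same arithmetic (the 3-frame pipeline depth is just the example above, not a measured constant):

[CODE]
# Frame time and stacked-lag arithmetic from the post above.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (144, 141):
    ft = frame_time_ms(fps)
    print(f"{fps} fps: {ft:.2f} ms/frame, 3 frames behind = {3 * ft:.1f} ms")

# Per-frame cost of capping 144 -> 141:
print(f"difference: {frame_time_ms(141) - frame_time_ms(144):.2f} ms/frame")
[/CODE]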
Some testing data with the monitor at 144 Hz and different FPS limits:
https://onedrive.live.com/view.aspx?cid=a30c9a917ac6394d&id=documents&resid=A30C9A917AC6394D!626&app=Excel&authkey=AHQ6U50hiq4wuf8&

https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/3/

From the above data, it looks like GSync adds around 20-30 ms of lag at 141-144 fps rates, about 3 frames behind. They also did other data charts with different monitor frequencies.
 
The big issue is for those capping the frame rate, either with a frame limiter in the game or outside it, through the driver or a 3rd-party tool. For AMD that would be Chill. The video below talks about Chill but also covers the results for all of them; look at the lag it introduces:



Maybe counterintuitive, but those wanting to keep lag as low as possible while staying within G-Sync or FreeSync range have another option: GPU max frequency. If you have a very high-performing game and a VRR monitor, don't use V-Sync or a frame rate limiter; use an OCing utility to adjust clock speeds to keep you within range. Lowest lag will always be max fps with no VRR and no frame rate limiter. Frame rate limiters add lag, and G-Sync/FreeSync does as well; combine the two, then add in network lag, and I think you will have some issues.
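As a rough illustration of that clock-tuning idea (assuming fps scales roughly linearly with core clock in a fully GPU-bound game, which is only an approximation; the numbers and helper are hypothetical):

[CODE]
# Back-of-envelope sketch: scale the GPU core clock so an uncapped,
# GPU-bound game lands just inside the VRR window instead of using a
# frame limiter. Linear fps-vs-clock scaling is an assumption.

def target_clock_mhz(clock_mhz: float, fps: float,
                     refresh_hz: int, headroom_fps: int = 3) -> float:
    target_fps = refresh_hz - headroom_fps  # stay below the VRR ceiling
    return clock_mhz * (target_fps / fps)

# e.g. a 1900 MHz core producing 190 fps on a 144 Hz panel:
print(f"{target_clock_mhz(1900, 190, 144):.0f} MHz")  # ~1410 MHz
[/CODE]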

For AMD, don't use Chill or V-Sync; use Anti-Lag and adjust GPU speeds to stay within VRR range. If Chill is selected, then Anti-Lag cannot be used. That is, if lag is an issue; for me it usually is not, and Chill works great.
 
Personally, after using G-Sync for a while now, I could never go back. I used it on my 2K 144 Hz monitor and now I use it on my 120 Hz 4K monitor. At the same time, I don't set my games for "competitive settings". I max shit all the way out (RT and all) and I game on a 4K monitor. With G-Sync on, 70-90 FPS can feel smooth as butter, whereas with it off it feels "jittery" to me even at the same framerate. On games where I am maxing out my monitor at 120 FPS, I add a frame cap to ensure no tearing and still leave G-Sync on.

I guess it all depends on what resolution you game at, the video card you have and its capabilities, and whether you max shit out or try to get max FPS with the lowest settings.

As for the input lag, I can't say I have ever noticed it. I also ensure V-Sync is OFF; you do not need it with G-Sync, and it could be what is adding input lag for people. I swear there was another thread here about this somewhere else... lol.
 
Am I reading this right? Can a human even react to that kind of refresh rate on the screen?
I believe the Navy did some reaction-time tests with fighter pilots, and they were able to react to stimuli shown for just 1/200 of a second in a controlled test. I would have to check. Offhand, I think most folks couldn't care less and would not notice any difference between 100 fps and 360 fps.
 
Am I reading this right? Can a human even react to that kind of refresh rate on the screen?
Humans don't see in frames; they see motion. The higher the refresh rate of the monitor and the framerate of the game, the smoother the motion will appear. Greater motion clarity leads to better reaction times and visual acuity. If you do want to know at what rate the average human can no longer detect individual frames, it is somewhere in the 200-300 Hz range, but again, our actual vision is not based on framerate, as our eyes and brains process a constant stream of incoming data.
 
Personally, after using G-Sync for a while now, I could never go back. I used it on my 2K 144 Hz monitor and now I use it on my 120 Hz 4K monitor. At the same time, I don't set my games for "competitive settings". I max shit all the way out (RT and all) and I game on a 4K monitor. With G-Sync on, 70-90 FPS can feel smooth as butter, whereas with it off it feels "jittery" to me even at the same framerate. On games where I am maxing out my monitor at 120 FPS, I add a frame cap to ensure no tearing and still leave G-Sync on.

I guess it all depends on what resolution you game at, the video card you have and its capabilities, and whether you max shit out or try to get max FPS with the lowest settings.

As for the input lag, I can't say I have ever noticed it. I also ensure V-Sync is OFF; you do not need it with G-Sync, and it could be what is adding input lag for people. I swear there was another thread here about this somewhere else... lol.
Pretty much agree. I could see a very competitive player wanting a consistent frame rate, such that any lag can be compensated for predictably and consistently, so that firing the sniper rifle in the game actually hits the target. For me, it probably doesn't make much difference :D
 
Agreed on human vision not being based on frames per second, but I am more concerned with the overall reaction time after the stimulus has been registered by the eye. There is a certain delay before the muscles can make the finger click, or rather, move the wrist and then click.
Average reaction times:
  • 0.25 s for a visual stimulus
  • 0.17 s for an audio stimulus
  • 0.15 s for a touch stimulus

Anyway, a search turns up a bunch of data on human reaction time. Here is a test to measure your speed; I get into the 180s of ms. Click on the upper section, and when it turns green, click:
https://www.humanbenchmark.com/tests/reactiontime

Edit: Interesting. I turned off FreeSync and used Anti-Lag, and I decreased my average time by about 10 ms over 5 tries; the lowest was 172 ms.

 
So you guys are saying G-Sync doesn't add any lag at all? I'm pretty skeptical; this is just my observation, after all. As for doing it right, this is my second time around, and I've never had a lot of issues with screen tearing. Then again, I remember having arguments with people about V-Sync, which was friggin' garbage; it was like having a ball and chain around your mouse.

I should note I also seem to prefer uncapped FPS vs. fixed, so this could just be how my eyes process these images.
Gsync and FreeSync have to add lag; frame-to-frame timing processing and corrections are being done, and that is never free. Same thing with using a frame rate limiter.

You might want to experiment with the above test with and without G-Sync and Vertical Sync. I see a consistent change in my test scores with and without FreeSync, about 10 ms in my case.
 
Very interesting, I will try that once I get home!

Perhaps program an AI to activate tasers placed on certain parts of your body once an enemy is sighted, since touch stimulus is the quickest? I wonder if Turing cards can be retrofitted with such tech.
lol, haptic feedback may be quicker than the eye if the feedback can be faster than the monitor. Tasers should do that :D
 
I'll try to find a link to the article, but they tested input lag in Fortnite. The lowest lag came from enabling FreeSync/G-Sync and capping your FPS at the monitor's max.
 
Most of it doesn't matter. I use 1440p 144 Hz, and going back to 1080p for 240 Hz nets me 10 ms. No thanks.

This stuff gets complicated to discuss because there are so many variables in play.

So, I am confused. You're saying your input lag is higher at 1080p @ 240 Hz vs. 1440p at 144 Hz?
 
Average reaction times:
  • 0.25 s for a visual stimulus
  • 0.17 s for an audio stimulus
  • 0.15 s for a touch stimulus

Anyway, a search turns up a bunch of data on human reaction time. Here is a test to measure your speed; I get into the 180s of ms. Click on the upper section, and when it turns green, click:
https://www.humanbenchmark.com/tests/reactiontime

Edit: Interesting. I turned off FreeSync and used Anti-Lag, and I decreased my average time by about 10 ms over 5 tries; the lowest was 172 ms.

I fall right into the median of 210ms on my slowass work computer and monitor. Will have to experiment with my gaming gear at home.
 
I fall right into the median of 210ms on my slowass work computer and monitor. Will have to experiment with my gaming gear at home.
Mine was on a 144 Hz monitor, sig below, with a wireless gaming mouse whose receiver is plugged into the keyboard next to the mouse. Might be a good way to see how different mice affect the score: wired vs. wireless, monitor refresh rates, and so on. At least we will be able to measure it. We can probably do more on the peripherals and awareness (training) end, too. Since AMD's drivers have a built-in browser which I can open inside of the game fullscreen, I might see how a real game environment, FreeSync, Chill on and off, etc., changes things.
 
Anyone who says G-Sync isn't effective or is a marketing ploy hasn't used it. It's a night-and-day difference, not just in FPS games but in all games. Best tech to have come out in the past decade, that's for sure.


Also, if you're unsure it is working, or you have a G-Sync Compatible monitor, download NVIDIA's Pendulum demo:
https://www.nvidia.com/coolstuff/demos#!/g-sync
 
This stuff gets complicated to discuss because there are so many variables in play.

So, I am confused. You're saying your input lag is higher at 1080p @ 240 Hz vs. 1440p at 144 Hz?

No, I'm saying I'd have to downgrade from 1440p to 1080p, and my gain would be a 10 ms reduction in input lag. That is not worth it to me.
 
I am going to take a wild guess and assume you're running the game with "competitive" settings in order to achieve the highest framerate you can. If that is the case, then G-SYNC is useless to you. There is no use for G-SYNC if you're running an FPS that is >= your monitor's refresh rate.

I would say that any time you get 100+ fps in most shooters, the use case for gsync/freesync goes out the window. VRR's main purpose is to get rid of tearing, and if you aren't getting tearing, why bother? Also, not all monitors are built the same: gsync compatible monitors could certainly introduce additional input lag. An actual gsync monitor with a gsync module won't have additional input lag and is directly tuned by nVidia. There are some high-end freesync/gsync compatible monitors out there that don't add input lag either, but those are the better, higher-end models.
 
Average reaction times:
  • 0.25 s for a visual stimulus
  • 0.17 s for an audio stimulus
  • 0.15 s for a touch stimulus

Anyway, a search turns up a bunch of data on human reaction time. Here is a test to measure your speed; I get into the 180s of ms. Click on the upper section, and when it turns green, click:
https://www.humanbenchmark.com/tests/reactiontime

Edit: Interesting. I turned off FreeSync and used Anti-Lag, and I decreased my average time by about 10 ms over 5 tries; the lowest was 172 ms.


The problem with that benchmark is that you can cheat it via anticipation and repeated tests until you get the number you want. Here's proof after I ran it twice:


On my desktop:
https://i.gyazo.com/29f6e5c832a2f68cbe8d7b16f2e62055.png

I often hear the age argument, and so far only one study, done on StarCraft players, says that between 24 and 40 years of age one loses 150 ms of reaction time. Keep in mind this is one study that I don't even think was peer-reviewed. In actuality, your own physical fitness and diet likely play a far bigger role than age alone.
 
I would say that any time you get 100+ fps in most shooters, the use case for gsync/freesync goes out the window. VRR's main purpose is to get rid of tearing, and if you aren't getting tearing, why bother? Also, not all monitors are built the same: gsync compatible monitors could certainly introduce additional input lag. An actual gsync monitor with a gsync module won't have additional input lag and is directly tuned by nVidia. There are some high-end freesync/gsync compatible monitors out there that don't add input lag either, but those are the better, higher-end models.
You will get tearing whenever the framerate does not match the refresh rate, regardless. It doesn't matter if you're using a 60, 144, 240, or 360 Hz monitor; tearing will always be present without some form of synchronization between frame generation and video output.
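A toy model of why, in case it helps (scanout is simplified to a constant-rate raster, and all the numbers are illustrative):

[CODE]
# Toy scanout model: with no sync, a new frame that completes mid-refresh
# is swapped in immediately, so the tear line lands wherever the raster
# happens to be at that moment.

REFRESH_HZ = 144
LINES = 1440                      # vertical resolution
SCANOUT_MS = 1000 / REFRESH_HZ    # time to scan one full refresh

def tear_line(frame_ready_ms: float) -> int:
    """Scanline where a tear appears for a frame finishing at this time."""
    offset = frame_ready_ms % SCANOUT_MS
    return int(offset / SCANOUT_MS * LINES)

# Frames arriving ~5 ms apart (about 200 fps) on a 144 Hz scanout:
for i in range(4):
    print(f"frame {i}: tear at line {tear_line(i * 5.0)}")
[/CODE]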
 