G-Sync on vs. off in FPS games? (my observations)

You will get tearing whenever the framerate does not match the refresh rate. It doesn't matter whether you're using a 60, 144, 240, or 360 Hz monitor: tearing will always be present without some form of synchronization between frame generation and video output.
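To make that concrete, here's a toy Python sketch (my own illustration, not from any driver or engine code) of where tear lines land when frame flips aren't synchronized to the scanout:

```python
# Toy model of screen tearing: a frame "flip" that lands mid-scanout
# produces a tear line at the corresponding vertical position.
def tear_points(fps, hz, scanouts=4):
    """Return tear positions (0.0 = top of screen, 1.0 = bottom) per scanout."""
    frame_time = 1.0 / fps
    scan_time = 1.0 / hz
    n = 0  # index of the next frame flip (flip n happens at n * frame_time)
    result = []
    for s in range(scanouts):
        start, end = s * scan_time, (s + 1) * scan_time
        tears = []
        while n * frame_time < end:
            t = n * frame_time
            if t > start:  # a flip exactly on the scanout boundary doesn't tear
                tears.append(round((t - start) / scan_time, 2))
            n += 1
        result.append(tears)
    return result

print(tear_points(60, 60))   # matched rates: no mid-scanout flips, no tears
print(tear_points(100, 60))  # mismatched: tear lines that drift each scanout
print(tear_points(300, 60))  # fps far above hz: several tear lines per scanout
```

Matched rates only stay tear-free if the flip phase also lines up, which is exactly what V-Sync or VRR enforces; any mismatch puts flips mid-scanout.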

Depends on the game and engine in my experience. Not all games will have pronounced tearing. I've noticed tearing in some single-player games at 100 fps, whereas I have none in others at the same framerate. In Apex Legends or Warzone I push 140+ fps constantly (on a 240 Hz display) and never notice any tearing; if it happens, it's not noticeable.
 
Anyone who says G-Sync isn't effective, or calls it a marketing ploy, hasn't used it. It's a night-and-day difference, not just in FPS games but in all games. Best tech to have come out in the past decade, that's for sure.


Also, if you're unsure it is working, or you have a G-Sync Compatible monitor, download Nvidia's Pendulum demo:
https://www.nvidia.com/coolstuff/demos#!/g-sync

Not sure I agree with that statement, but I would hope at least a little of it is true, seeing as how Nvidia charges about $150-200 more for a G-Sync module. The question is whether the small benefit is worth $150-200 over having FreeSync. Maybe to some it is.

I mean, look at the BenQ Zowie 240 Hz monitors. They still charge $500 for their monitors and they don't have FreeSync or G-Sync, lol.


I had a 240 Hz G-Sync monitor in the past and had to return it, unfortunately (the stand just didn't work on my desk). I am bored, so I currently have a 240 Hz Acer XF252Q X (FreeSync) and just ordered the HP Omen X 25 (G-Sync) to compare, because I can't remember how smooth my old G-Sync monitor was. When the monitor arrives I am eager to test FreeSync vs. G-Sync and see if I notice anything. I'll keep whichever I like better.
 
The problem with that benchmark is that you can cheat it via anticipation and repeated tests until you get the number you want. Here's proof of that after I ran it twice:

[attached screenshot: reaction-test results]

On my desktop:
https://i.gyazo.com/29f6e5c832a2f68cbe8d7b16f2e62055.png

I often hear the age argument, and so far only one study, done on StarCraft players, says that between 24 and 40 years of age one loses 150 ms of reaction time. Keep in mind this is one study that I don't even think was peer-reviewed. In actuality, your own physical fitness and diet likely play a far bigger role than age alone.
Yes, maybe on one try; that is why there is an average. Yours is a 231 ms average over two tries, meaning your first one was much longer. The delay before the screen turns green is random (I had to wait like 10 seconds once), so yes, one can get lucky if one works at it long enough. Do 5 to 10 tries and see what your average is; mine was 183 ms over 5 tries. The lowest was around 172 ms and the highest around 191 ms during that run. Let's see you get 5 tries with less than a 200 ms average. I am 60 years old, so I don't think age is the overall factor.
 
Not sure I agree with that statement, but I would hope at least a little of it is true, seeing as how Nvidia charges about $150-200 more for a G-Sync module. The question is whether the small benefit is worth $150-200 over having FreeSync. Maybe to some it is.

I mean, look at the BenQ Zowie 240 Hz monitors. They still charge $500 for their monitors and they don't have FreeSync or G-Sync, lol.


I had a 240 Hz G-Sync monitor in the past and had to return it, unfortunately (the stand just didn't work on my desk). I am bored, so I currently have a 240 Hz Acer XF252Q X (FreeSync) and just ordered the HP Omen X 25 (G-Sync) to compare, because I can't remember how smooth my old G-Sync monitor was. When the monitor arrives I am eager to test FreeSync vs. G-Sync and see if I notice anything. I'll keep whichever I like better.
The question I would have is this: G-Sync uses an additional module that sits in front of the monitor controller, meaning a two-stage process, while FreeSync/VRR is built into the monitor controller itself. Since G-Sync is a two-step process, will it be more limited in lag overall than what FreeSync can achieve? I'm not saying a low-quality FreeSync monitor will have less lag than a high-quality G-Sync one, but if both monitors have the fastest possible timing circuitry, I would think FreeSync would in the end have less lag potential. Would this really matter in the end? Probably not.
 
I would say that any time you get 100+ fps in most shooters, the use case for G-Sync/FreeSync goes out the window. VRR's main purpose is to get rid of tearing, and if you don't have tearing, why bother? Also, not all monitors are built the same; G-Sync Compatible monitors could certainly introduce additional input lag. An actual G-Sync monitor with a G-Sync module won't have additional input lag and is directly tuned by Nvidia. There are some high-end FreeSync/G-Sync Compatible monitors out there that don't add input lag either, but those are the better, higher-end models.
I have seen zero data for any G-Sync or FreeSync monitor that does not incur some sort of lag when properly tested; if you have a good source, please share.
 
Yes, maybe on one try; that is why there is an average. Yours is a 231 ms average over two tries, meaning your first one was much longer. The delay before the screen turns green is random (I had to wait like 10 seconds once), so yes, one can get lucky if one works at it long enough. Do 5 to 10 tries and see what your average is; mine was 183 ms over 5 tries. The lowest was around 172 ms and the highest around 191 ms during that run. Let's see you get 5 tries with less than a 200 ms average. I am 60 years old, so I don't think age is the overall factor.

That first test was on my iPhone. The second result was an honest test I did on my desktop: 111 ms. I could probably skew it to sub-100 if I tried.

As for the G-Sync source:

On the plus side, by removing the traditional scaler it does seem that all hardware G-sync module screens have basically no input lag. We have yet to test a G-sync screen that showed any meaningful lag, which is a great positive when it comes to gaming.


https://www.tftcentral.co.uk/articles/variable_refresh.htm

They test monitors with the module, and their results support that finding.
 
Depends on the game and engine in my experience. Not all games will have pronounced tearing. I've noticed tearing in some single-player games at 100 fps, whereas I have none in others at the same framerate. In Apex Legends or Warzone I push 140+ fps constantly (on a 240 Hz display) and never notice any tearing; if it happens, it's not noticeable.
In my experience, you'll notice more tearing in the exceed-refresh-rate scenario. In fact, you'll see multiple tear points if your framerate is a multiple of the refresh rate. This has always been the case regardless of engine.

Since you have a 240 Hz display, and probably have GSync/FreeSync enabled, you're not going to see tearing while below the refresh. Lower the refresh, turn GSync/FreeSync off, or use a display without that capability and the tearing will certainly be there regardless of refresh rate.

Then again, if you aren't actively looking for it in the first place, then ignorance is bliss.
 
In my experience, you'll notice more tearing in the exceed-refresh-rate scenario. In fact, you'll see multiple tear points if your framerate is a multiple of the refresh rate. This has always been the case regardless of engine.

Since you have a 240 Hz display, and probably have GSync/FreeSync enabled, you're not going to see tearing while below the refresh. Lower the refresh, turn GSync/FreeSync off, or use a display without that capability and the tearing will certainly be there regardless of refresh rate.

Then again, if you aren't actively looking for it in the first place, then ignorance is bliss.

I never use G-Sync/FreeSync, but I do use DyAc, and even when I had my older 144 Hz TN I never noticed tearing at 100+ fps.
 
The question I would have is this: G-Sync uses an additional module that sits in front of the monitor controller, meaning a two-stage process, while FreeSync or VRR is built into the monitor controller itself
So, 'G-Sync Ultimate' monitors / monitors that use a G-Sync module use that module as the LCD controller. They are one and the same.

Note that Nvidia has been focusing on the reduction of input lag for some time now :)
 
Am I reading this right? Can a human even react to that kind of refresh rate on the screen?

No, you did not read it right. In the context of what was previously being discussed, G-Sync does not operate beyond the maximum refresh rate of the monitor. When framerates go well beyond that, as in non-hardware-intensive competitive games, VRR is unutilized or "wasted," since it falls back to either V-Sync mode or uncapped (with tearing). When the maximum refresh rate of the monitor is higher, as with 240 Hz and 360 Hz monitors, the VRR range ceiling is extended to accommodate much higher framerates, so G-Sync stays active and is not "wasted."
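That fallback behavior can be summed up as a tiny decision table. This is just my reading of the behavior described above, not an Nvidia API, and the below-floor case varies by monitor:

```python
# Toy decision table: which mechanism governs presentation for a given
# framerate relative to the monitor's VRR range. Illustrative only.
def sync_state(fps, vrr_min, vrr_max, vsync_on):
    if vrr_min <= fps <= vrr_max:
        return "VRR active (G-Sync/FreeSync drives the refresh)"
    if fps > vrr_max:
        return "V-Sync (capped, extra lag)" if vsync_on else "uncapped (tearing)"
    return "below VRR floor (behavior varies by monitor)"

# A 144 Hz panel vs. a 360 Hz panel in a lightweight competitive game at ~300 fps:
print(sync_state(300, 48, 144, vsync_on=False))  # ceiling exceeded: tearing
print(sync_state(300, 48, 360, vsync_on=False))  # higher ceiling: VRR stays active
```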
 
Not sure I agree with that statement......but I would hope at least a little of that is true seeing as how Nvidia charges about $150-200 more for a g-sync module..... The question is, is the small benefit worth $150-200 over having freesync. That is the question. Maybe to some it is.

You don't have to buy a G-Sync monitor or a module to reap the benefits. I bought an LG 32-inch 144 Hz "G-Sync Compatible" monitor for $350 at Microcenter last year and paired it with my RTX 2070. It works great!
 
That first test was on my iPhone. The second result was an honest test I did on my desktop: 111 ms. I could probably skew it to sub-100 if I tried.

As for the G-Sync source:

On the plus side, by removing the traditional scaler it does seem that all hardware G-sync module screens have basically no input lag. We have yet to test a G-sync screen that showed any meaningful lag, which is a great positive when it comes to gaming.


https://www.tftcentral.co.uk/articles/variable_refresh.htm

They test monitors with the module, and their results support that finding.
There is no data at your link.

Here is some very detailed data:
https://onedrive.live.com/view.aspx?cid=a30c9a917ac6394d&id=documents&resid=A30C9A917AC6394D!626&app=Excel&authkey=AHQ6U50hiq4wuf8&
It's in spreadsheet format; use the tabs on the bottom to look at the data for the different monitor refresh rates with the two games tested, Overwatch and CSGO.

Looking at the CSGO data, 144 Hz monitor:
G-Sync + Vertical Sync (NV control panel), frame limit 142 -> average lag is 21 ms
V-Sync off + no FPS limit -> average lag is 15 ms

From those tests and data points (more than just the above example), G-Sync does induce some lag, but considering people's reaction time is around 200 ms, adding 6 ms more just does not seem that significant.

The G-Sync module has memory for one frame buffer. The video card has to write the frame to that buffer, and the module then transfers it to the panel, so inherently there will always be a slight delay or lag. Still, that does not look like much in the end with high-refresh-rate monitors.
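For a rough sense of scale on that one frame of buffering, here's my own arithmetic, assuming the worst case of a full refresh period per buffered frame:

```python
# One full frame of buffering costs at most one refresh period.
# Note 144 Hz comes out near the ~6 ms difference discussed above.
for hz in (60, 144, 240, 360):
    print(f"{hz:3d} Hz -> one frame = {1000 / hz:.2f} ms")
# 60 Hz  -> 16.67 ms
# 144 Hz ->  6.94 ms
# 240 Hz ->  4.17 ms
# 360 Hz ->  2.78 ms
```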
 
There is no data at your link.

Here is some very detailed data:
https://onedrive.live.com/view.aspx?cid=a30c9a917ac6394d&id=documents&resid=A30C9A917AC6394D!626&app=Excel&authkey=AHQ6U50hiq4wuf8&
It's in spreadsheet format; use the tabs on the bottom to look at the data for the different monitor refresh rates with the two games tested, Overwatch and CSGO.

Looking at the CSGO data, 144 Hz monitor:
G-Sync + Vertical Sync (NV control panel), frame limit 142 -> average lag is 21 ms
V-Sync off + no FPS limit -> average lag is 15 ms

From those tests and data points (more than just the above example), G-Sync does induce some lag, but considering people's reaction time is around 200 ms, adding 6 ms more just does not seem that significant.

The G-Sync module has memory for one frame buffer. The video card has to write the frame to that buffer, and the module then transfers it to the panel, so inherently there will always be a slight delay or lag. Still, that does not look like much in the end with high-refresh-rate monitors.

The lower average is because there is no frame limit for the V-Sync-off run, not because of a difference in input latency. Take a look at the Overwatch data, which refutes what you're saying, and it's from the source you linked:

[attached chart: Overwatch latency data]

Now look at the averages: they're essentially within margin of error, and there's no difference in input latency. Of course if you get unlimited fps your input latency will drop; that's common sense, but you're mixing apples and oranges here. In fact, don't take my word for it: go straight to the source and ask Chief Blur Busters, and he'll tell you what I've said: the G-Sync hardware module has no added input latency/signal processing.
 
You will get tearing whenever the framerate does not match the refresh rate. It doesn't matter whether you're using a 60, 144, 240, or 360 Hz monitor: tearing will always be present without some form of synchronization between frame generation and video output.
It's less visible with higher Hz/FPS, though.
 
This also depends on the person ;)
Hmm, how so? By definition, the tear offset is smaller the more frames you draw within a single, faster monitor scanout. G-Sync could still be beneficial in reducing possible microstutters, while having a minimal input lag penalty (a couple of ms or so).
 
Hmm, how so? By definition, the tear offset is smaller the more frames you draw within a single, faster monitor scanout. G-Sync could still be beneficial in reducing possible microstutters, while having a minimal input lag penalty (a couple of ms or so).
I took visible as being subjective; objectively, yes, minor tearing will be present on screen for shorter periods of time at higher refresh rates.

Subjectively, it can still be just as jarring to the viewer 😉
 
So you guys are saying G-Sync doesn't add any lag at all? I'm pretty skeptical.
You have good reason to be skeptical, because it DOES add input lag. Anyone saying otherwise has just never tested it, does not play competitively, or does not have the reaction time to tell the difference. I use G-Sync for non-competitive games if I am running under the refresh rate of my monitor, for smoother performance, but when it comes to CSGO, for example, forget it: you will be adding input lag, and if you are playing at a decently high level, you will likely notice it. So take all the questionable feedback you are getting here about there being no difference with a grain of salt.
So to RECAP: G-Sync and FreeSync work great for what they are intended to do, but for competitive gaming I will never use them, due to the perceptible hit in input lag.
 
You have good reason to be skeptical, because it DOES add input lag. Anyone saying otherwise has just never tested it, does not play competitively, or does not have the reaction time to tell the difference. I use G-Sync for non-competitive games if I am running under the refresh rate of my monitor, for smoother performance, but when it comes to CSGO, for example, forget it: you will be adding input lag, and if you are playing at a decently high level, you will likely notice it. So take all the questionable feedback you are getting here about there being no difference with a grain of salt.
So to RECAP: G-Sync and FreeSync work great for what they are intended to do, but for competitive gaming I will never use them, due to the perceptible hit in input lag.

Someone didn't read the Blur Busters article. If you have a quality monitor, especially one of the "competitive" 240 Hz 1080p ones, it is a single-digit-ms lag difference. I'm positive that if we ran a test you would not be able to tell the difference.
 
Someone didn't read the Blur Busters article. If you have a quality monitor, especially one of the "competitive" 240 Hz 1080p ones, it is a single-digit-ms lag difference. I'm positive that if we ran a test you would not be able to tell the difference.
This.

[Blur Busters G-Sync 101 chart: V-Sync off w/ FPS limits, 144 Hz]
[Blur Busters G-Sync 101 chart: V-Sync off w/ FPS limits, 240 Hz]
 
Yeah, I would have to agree. The difference in input lag with no limits whatsoever versus G-Sync is negligible; you would not notice a difference, even in competitive gaming. It is literally a single-digit input lag difference. Yes, G-Sync, V-Sync, and any frame limiter will increase input lag, but the difference is very small. There is definitely a big perception among people who don't know any better that G-Sync somehow adds immense input lag; it just isn't true. There are scenarios where certain settings can compound the input lag increase, so you want to be careful if you really do care that much, but it's pretty minimal.

I personally would rather have the smoothness over a small input lag difference. The smoothness could actually be more beneficial than the negligible input lag difference.


You don't have to buy a G-Sync monitor or a module to reap the benefits. I bought an LG 32-inch 144 Hz "G-Sync Compatible" monitor for $350 at Microcenter last year and paired it with my RTX 2070. It works great!

Unfortunately, I used to think the same way. You're partially right, but no: G-Sync is actually known for doing a better job, since it's an actual hardware module. I am going to test this real-world theory soon, as I just ordered a 240 Hz G-Sync monitor to test against my 240 Hz FreeSync Acer XF252QX. I have had many people tell me they can notice a difference between G-Sync and FreeSync. There is a difference, I think.
 
Was just about to ask whether the lag findings are applicable to FreeSync monitors as well (with Nvidia GPUs, if it matters)?
 
Hmm, how so? By definition, the tear offset is smaller the more frames you draw within a single, faster monitor scanout.
By that same token, one could also argue that the higher the refresh, the more frames there are to tear. So while your tears might be physically smaller, there are more of them to distract you.

Obviously VRR solves this outright.
 
The lower average is because there is no frame limit for the V-Sync-off run, not because of a difference in input latency. Take a look at the Overwatch data, which refutes what you're saying, and it's from the source you linked:

[attached chart: Overwatch latency data]

Now look at the averages: they're essentially within margin of error, and there's no difference in input latency. Of course if you get unlimited fps your input latency will drop; that's common sense, but you're mixing apples and oranges here. In fact, don't take my word for it: go straight to the source and ask Chief Blur Busters, and he'll tell you what I've said: the G-Sync hardware module has no added input latency/signal processing.
Your image is incomplete; what monitor refresh rate was that at? Using the same source, this is the breakdown for Overwatch at 100 Hz:
https://onedrive.live.com/view.aspx?cid=a30c9a917ac6394d&id=documents&resid=A30C9A917AC6394D!626&app=Excel&authkey=AHQ6U50hiq4wuf8&

[attached chart: Overwatch latency data, 100 Hz]


Everyone, please note: if you are above the G-Sync range, G-Sync is not engaged and is as good as off. Only the data with a frame limit within the G-Sync range, or with V-Sync turned on, will show any G-Sync lag. The monitor is still at 100 Hz regardless of the FPS the video card is outputting.

From above, with V-Sync off + a 98 FPS limit, average lag was 24 ms -> this test is still within the G-Sync range.

With V-Sync off + no FPS limit, average lag was 17 ms -> this test is outside the G-Sync range -> 7 ms less lag, or about 0.7 of a frame more current at 100 Hz.
  1. The G-Sync module's memory buffer will always be at least one frame behind when G-Sync is active: the GPU outputs the frame to the G-Sync module, and the module then transfers it to the panel.
  2. Without G-Sync you can have more up-to-date information on the monitor, as in a more recent partial frame alongside an older frame (tearing); you still have more information available on screen even if the refresh rate is lower than the FPS.
  3. Faster monitors with higher FPS will give a player more current information and feedback.
  4. Top-notch competitive players (pro players) will probably be in the <150 ms reaction-time range and relatively consistent with each other, so the one advantage is getting information more quickly to make a decision.
    1. So going from 29 ms to 17 ms of lag would give them roughly one to two frames' worth of information ahead (depending on how fast frames are coming) by not using G-Sync, using the above example.
    2. Obviously a 240 Hz+ monitor would be preferred.
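One way to sanity-check the "frames ahead" arithmetic (my own helper, not from the spreadsheet): how many frames a lag reduction is worth depends on which frame time you divide by, the monitor refresh or the uncapped render rate.

```python
def frames_ahead(lag_delta_ms, rate_hz):
    """How many frame periods a given lag reduction is worth at a given rate."""
    return lag_delta_ms / (1000.0 / rate_hz)

# Going from 29 ms to 17 ms of lag (a 12 ms delta):
print(frames_ahead(12, 100))  # vs. the 100 Hz monitor refresh: 1.2 frames
print(frames_ahead(12, 200))  # vs. an uncapped ~200 fps render rate: 2.4 frames
```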
 
Was just about to ask whether the lag findings are applicable to FreeSync monitors as well (with Nvidia GPUs, if it matters)?


Some notes:
When using G-Sync on a FreeSync monitor, never enable Vertical Sync in the Control Panel or in-game, because it will just be plain Vertical Sync. On real G-Sync monitors with G-Sync modules, the Nvidia Control Panel's Vertical Sync setting works together with G-Sync and behaves differently than pure V-Sync.

At the time of the video, the Nvidia card had significantly more lag than the AMD card on a FreeSync (Nvidia-certified G-Sync Compatible) monitor.

The old AMD frame rate limiter (in the video) was garbage and has since been removed from the drivers; Chill has taken its place, but it still adds lag.
 
Your image is incomplete; what monitor refresh rate was that at? Using the same source, this is the breakdown for Overwatch at 100 Hz:
https://onedrive.live.com/view.aspx?cid=a30c9a917ac6394d&id=documents&resid=A30C9A917AC6394D!626&app=Excel&authkey=AHQ6U50hiq4wuf8&

[attached chart: Overwatch latency data, 100 Hz]

Everyone, please note: if you are above the G-Sync range, G-Sync is not engaged and is as good as off. Only the data with a frame limit within the G-Sync range, or with V-Sync turned on, will show any G-Sync lag. The monitor is still at 100 Hz regardless of the FPS the video card is outputting.

From above, with V-Sync off + a 98 FPS limit, average lag was 24 ms -> this test is still within the G-Sync range.

With V-Sync off + no FPS limit, average lag was 17 ms -> this test is outside the G-Sync range -> 7 ms less lag, or about 0.7 of a frame more current at 100 Hz.
  1. The G-Sync module's memory buffer will always be at least one frame behind when G-Sync is active: the GPU outputs the frame to the G-Sync module, and the module then transfers it to the panel.
  2. Without G-Sync you can have more up-to-date information on the monitor, as in a more recent partial frame alongside an older frame (tearing); you still have more information available on screen even if the refresh rate is lower than the FPS.
  3. Faster monitors with higher FPS will give a player more current information and feedback.
  4. Top-notch competitive players (pro players) will probably be in the <150 ms reaction-time range and relatively consistent with each other, so the one advantage is getting information more quickly to make a decision.
    1. So going from 29 ms to 17 ms of lag would give them roughly one to two frames' worth of information ahead (depending on how fast frames are coming) by not using G-Sync, using the above example.
    2. Obviously a 240 Hz+ monitor would be preferred.

And top-notch players will turn down settings and do everything else to get as high an FPS as possible, as the graph shows.

That being said, Overwatch at 98 fps with G-Sync + V-Sync will be a FAR more enjoyable experience. At only 98 fps without any sync, the tearing and judder will make for an absolutely miserable experience. Not worth the marginally better input latency.

And with a 4K monitor I typically see around 90-110 fps in Overwatch. There is no way I'm hitting 200+ fps, which means G-Sync + V-Sync is going to offer a better experience.

The vast majority of modern games folks are playing will not run at 200+ fps without cranking down the render resolution and/or turning down all the settings.
 
When you run uncapped, all the extra FPS you're getting over your refresh rate is garbage. Normally, more FPS means smoother motion, but once your fps exceeds your monitor's refresh rate, instead of smoother motion you get tearing and stuttering, which makes it worse. Input lag will also be inconsistent.

The only advantage of an uncapped framerate is overall lower input lag.

But if you use G-Sync and set a cap under your refresh rate using the game engine's own limiter, the engine will attempt to gather input until the last possible moment before rendering the frame, and input lag will be close to the same as uncapped, depending on how well the game does it. You get consistently low input lag and a consistently smooth image with no tearing or stuttering. It's objectively better for competitive gaming.
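A minimal sketch of that late-input-gathering idea (the function names and timings are hypothetical stand-ins; real engines do this internally and far more carefully):

```python
import time

def sample_input():
    # Stand-in for polling the mouse/keyboard; returns the sample timestamp.
    return time.perf_counter()

def render(input_snapshot, render_s=0.003):
    # Stand-in for the few milliseconds of frame work before presentation.
    time.sleep(render_s)

def paced_loop(target_fps=141, frames=5, render_s=0.003):
    """Wait FIRST, sample input as late as possible, then render.

    Because the wait happens before input sampling, the input that reaches
    the screen is nearly as fresh as with an uncapped framerate.
    """
    period = 1.0 / target_fps
    deadline = time.perf_counter() + period
    samples = []
    for _ in range(frames):
        # Sleep off the frame budget, leaving just enough time to render.
        wait = deadline - time.perf_counter() - render_s
        if wait > 0:
            time.sleep(wait)
        snap = sample_input()  # latest-possible input snapshot
        render(snap, render_s)
        samples.append(snap)
        deadline += period
    return samples
```

By contrast, an external or driver-level cap tends to insert its wait after input has already been sampled, which is one reason in-engine limiters usually measure lower lag.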
 
And top-notch players will turn down settings and do everything else to get as high an FPS as possible, as the graph shows.

That being said, Overwatch at 98 fps with G-Sync + V-Sync will be a FAR more enjoyable experience. At only 98 fps without any sync, the tearing and judder will make for an absolutely miserable experience. Not worth the marginally better input latency.

And with a 4K monitor I typically see around 90-110 fps in Overwatch. There is no way I'm hitting 200+ fps, which means G-Sync + V-Sync is going to offer a better experience.

The vast majority of modern games folks are playing will not run at 200+ fps without cranking down the render resolution and/or turning down all the settings.
When you run uncapped, all the extra FPS you're getting over your refresh rate is garbage. Normally, more FPS means smoother motion, but once your fps exceeds your monitor's refresh rate, instead of smoother motion you get tearing and stuttering, which makes it worse. Input lag will also be inconsistent.

The only advantage of an uncapped framerate is overall lower input lag.

But if you use G-Sync and set a cap under your refresh rate using the game engine's own limiter, the engine will attempt to gather input until the last possible moment before rendering the frame, and input lag will be close to the same as uncapped, depending on how well the game does it. You get consistently low input lag and a consistently smooth image with no tearing or stuttering. It's objectively better for competitive gaming.

Very interesting input lag video; now I know why pros turn down settings, lower resolutions, etc. The lower input lag may not be due to higher FPS alone.

 
Very interesting input lag video; now I know why pros turn down settings, lower resolutions, etc. The lower input lag may not be due to higher FPS alone.



Lowering the settings gets you higher FPS, but that is not what directly lowers input lag. The direct connection is the time it takes to render the frame.
So if you capped the FPS and got the same FPS with high and low settings, you would still get less input lag with lower settings (as long as the frame pacing algorithm in the game is good).
For example, at high settings it would gather all input for the last 5 ms and then render the frame in 10 ms (the frame render time adding 10 ms of input lag), but at low settings it would gather all input for the last 10 ms and then render the frame in 5 ms, which means the low settings would have 5 ms less input lag even though you're getting the exact same fps.
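That arithmetic can be written out directly. This is a toy decomposition of my own, assuming a fixed frame time split entirely between input gathering and rendering:

```python
def lag_from_render(frame_time_ms, render_ms):
    """Split a fixed frame budget into an input-gathering window and render time.

    Input sampled just before rendering is render_ms stale by the time the
    frame is presented, so the render time is the part that shows up as lag.
    """
    gather_ms = frame_time_ms - render_ms
    return {"gather_ms": gather_ms, "lag_from_frame_ms": render_ms}

# Same capped frame time (15 ms), high vs. low settings:
print(lag_from_render(15, 10))  # high settings: 10 ms of frame-induced lag
print(lag_from_render(15, 5))   # low settings: 5 ms, i.e. 5 ms less lag at equal fps
```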
 


Some notes:
When using G-Sync on a FreeSync monitor, never enable Vertical Sync in the Control Panel or in-game, because it will just be plain Vertical Sync. On real G-Sync monitors with G-Sync modules, the Nvidia Control Panel's Vertical Sync setting works together with G-Sync and behaves differently than pure V-Sync.

At the time of the video, the Nvidia card had significantly more lag than the AMD card on a FreeSync (Nvidia-certified G-Sync Compatible) monitor.

The old AMD frame rate limiter (in the video) was garbage and has since been removed from the drivers; Chill has taken its place, but it still adds lag.

Thank you, that's a great video.

Chill and Nvidia's limiter seem to work as well as RTSS, which is very good if no in-game limiter is available, and it eliminates the need for extra utilities.

I don't get this, though:
"When using G-Sync on a FreeSync monitor, never enable Vertical Sync in the Control Panel or in-game, because it will just be plain Vertical Sync. On real G-Sync monitors with G-Sync modules, the Nvidia Control Panel's Vertical Sync setting works together with G-Sync and behaves differently than pure V-Sync."

How is it different between true G-Sync and G-Sync Compatible monitors?
 
I don't get this, though:
"When using G-Sync on a FreeSync monitor, never enable Vertical Sync in the Control Panel or in-game, because it will just be plain Vertical Sync. On real G-Sync monitors with G-Sync modules, the Nvidia Control Panel's Vertical Sync setting works together with G-Sync and behaves differently than pure V-Sync."

How is it different between true G-Sync and G-Sync Compatible monitors?

Yeah, I'm not sure that statement is correct. It's a good rule of thumb, just to be safe: use G-Sync within the VRR range and V-Sync off when above it. However, based on the video (around the 13:48 mark), even with V-Sync on, when the game is within the VRR range G-Sync is being used (i.e. no difference between V-Sync off/on when the game's FPS is within the VRR range).
 
Unfortunately, I used to think the same way. You're partially right, but no: G-Sync is actually known for doing a better job, since it's an actual hardware module. I am going to test this real-world theory soon, as I just ordered a 240 Hz G-Sync monitor to test against my 240 Hz FreeSync Acer XF252QX. I have had many people tell me they can notice a difference between G-Sync and FreeSync. There is a difference, I think.

G-Sync / FreeSync, it's all the same variable refresh rate technology. I'm not going to pay Nvidia a ridiculous amount of money just to see what my eyes already tell me. G-Sync is a marketing ploy through and through, but I am grateful that they have implemented compatibility with FreeSync monitors in their drivers.
 
I don't get this, though:
"When using G-Sync on a FreeSync monitor, never enable Vertical Sync in the Control Panel or in-game, because it will just be plain Vertical Sync. On real G-Sync monitors with G-Sync modules, the Nvidia Control Panel's Vertical Sync setting works together with G-Sync and behaves differently than pure V-Sync."

How is it different between true G-Sync and G-Sync Compatible monitors?
From a latency standpoint, it isn't different. Even with a G-Sync module, once you near the refresh rate the latency is almost as bad as V-Sync. That's why it's recommended to cap the framerate well below the refresh rate.

However, in the case of G-Sync module + V-Sync on, there is frametime variance compensation going on behind the scenes. This doesn't happen with V-Sync off, even within the G-Sync range.

The video does not go into frametime variance, only input latency, which is a completely different thing. Without a study on the effects of frametime variance, the advice not to enable V-Sync (i.e. leave V-Sync off) on a FreeSync panel applies, since there is no measured benefit with V-Sync on. Whether that's true across the board is up for debate.
 
G-Sync / FreeSync, it's all the same variable refresh rate technology.
Except it isn't.
I'm not going to pay Nvidia a ridiculous amount of money just to see what my eyes already tell me. G-Sync is a marketing ploy through and through, but I am grateful that they have implemented compatibility with FreeSync monitors in their drivers.
You not being able to see the difference does not mean the technology is a marketing ploy.
 
You not being able to see the difference does not mean the technology is a marketing ploy.
I typically ignore the rabble that stroll in spouting nonsense, but pretty much my thoughts exactly.

There are G-Sync module benefits not even discussed in this thread, such as variable overdrive, which I'm a big fan of, since a lot of panels are bad at overdrive even at a fixed refresh.

Those panels Asus put out with GSync + ULMB (ELMB)? That's black magic. If they ever get around to perfecting that...

There's a reason Nvidia has invested a lot of time and effort into QAing anything that wants to be labeled "G-Sync Compatible": there are a ton of crap panels out there.
 
G-Sync / FreeSync, it's all the same variable refresh rate technology. I'm not going to pay Nvidia a ridiculous amount of money just to see what my eyes already tell me. G-Sync is a marketing ploy through and through, but I am grateful that they have implemented compatibility with FreeSync monitors in their drivers.

One is software, one is hardware. I used to game on a 240 Hz G-Sync monitor and I remember it being "smoother" than my FreeSync XF252QX. You're making some random assumptions there with no experience... no offense. :) I always love randos making comments with no experience or proof to back them up, just sayin'. Anyway, to be sure, I ordered a new G-Sync monitor (because I forgot how it felt) to test right against this one, so I will know for sure in 10 days or so.
 
If you haven't recently compared the same hardware G-Sync panel against software G-Sync/FreeSync on the same system, it's hard to know. However, it's pretty obvious if you use them back to back on the same system: G-Sync hardware just feels smoother. There's no better way to describe it.
 
Worth mentioning that hardware G-Sync and G-Sync Compatible certification guarantee a VRR range spanning the monitor's refresh range from bottom to top. I think FreeSync 2 / FreeSync Premium certification is supposed to do that as well.

Ordinary FreeSync, on the other hand, has virtually no QA, and monitor manufacturers just throw it into their products to tick a marketing checkbox, which is why VRR quality varies wildly across models, with the majority being pretty useless due to very narrow VRR ranges. Nvidia tested around 500 FreeSync models and only around 20-30 passed their certification tests, last time I checked.
 