FastSync can remove 1 frame of input lag in some games compared to V-Sync OFF

XoR_

I tested FastSync and RTSS at 1 fps, and at least in DX9 games FastSync removes 1 frame of input lag. This is easily visible: right after a frame was displayed I would move the mouse, and the game would take a certain number of frames before the input registered. DX9 games on UE3 behave this way, as do some other DX9 games and games forced into DX9 mode. In DX11 I tested Tomb Raider and Ryse and observed no change; more testing is needed. There was, however, no added input lag from FastSync even in these games. OpenGL games do not seem to support FastSync at all.
 
Fast Sync basically decouples the rendering from the display synchronization. It provides a compromise between classic no-vsync and classic triple-buffered vsync. Where Fast Sync shines is when rendering framerates are greatly in excess of the target framerate (i.e. the display refresh rate). Blurbusters states Fast Sync gets near-GSync levels of input latency reduction when at 2-5x the top of the GSync range (i.e. the display refresh).
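Here's a rough sketch of that decoupling (my own toy model, not NVIDIA's actual implementation): the renderer keeps finishing frames as fast as it can, and at each refresh the display scans out whichever frame completed most recently; the rest are simply discarded.

```python
# Toy model of Fast Sync frame selection (an assumption-level sketch, not
# NVIDIA's actual implementation): at each refresh the display shows the most
# recently completed frame, so the frame on screen is at most one render
# interval old when rendering far outpaces the refresh rate.

def fast_sync_frame_ages(render_fps, refresh_hz, duration_s=1.0):
    """Age (seconds) of the frame shown at each refresh."""
    frame_done = [(i + 1) / render_fps for i in range(int(render_fps * duration_s))]
    ages = []
    for tick in range(int(refresh_hz * duration_s)):
        scanout = (tick + 1) / refresh_hz
        ready = [t for t in frame_done if t <= scanout]
        if ready:
            ages.append(scanout - ready[-1])   # newest completed frame wins
    return ages

print(max(fast_sync_frame_ages(render_fps=480, refresh_hz=100)))  # ~1.7 ms
print(max(fast_sync_frame_ages(render_fps=110, refresh_hz=100)))  # ~8.2 ms, barely fresher than a refresh
```

The numbers illustrate the same point: the benefit scales with how far the render rate exceeds the refresh rate.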

The key words here are excess framerates. When you are at or under the target framerate (ideally capped), GSync is simply better. There's more going on behind the scenes with GSync under the target framerate, which I won't go into here. So Blurbusters takes the sensible stance that GSync + a capped framerate is the better experience. In fact, they go on to point out that Fast Sync actually introduces microstuttering and more input lag under this condition, and is basically no better than classic vsync methods (capped Fast Sync doesn't make much sense anyway). And if you take a step back from this, you'll realize that GSync and Fast Sync work best at opposite ends of the spectrum. That's pretty much the real takeaway from the BB article.

Mixing GSync + Fast Sync together raises two arguments (or cons):

1) Fast Sync within the GSync range only serves to introduce slightly more input latency, which ultimately robs GSync of one of its favorable capabilities.
2) GSync is effectively disabled outside the GSync range, thus rendering in excessive framerates (a la Fast Sync) effectively negates the very thing you paid for.

The fallacy here is that you get the best of both worlds with both enabled.

If you don't have a GSync monitor, then one takeaway from this is that, assuming you can maintain excess framerates, you are getting near-GSync input latency reduction. Again, there's more to the GSync experience than just input latency reduction; however, the topic here is simply that: input latency reduction.

Also, something that Blurbusters doesn't really go into is that decoupling the renderer a la Fast Sync brings other auxiliary benefits. Most game engines run a loop that processes several inputs (variables) to determine the resulting output. The more iterations the loop runs, the more "samples" it gets to interpret. This means an uncapped game engine gets to sample more of your mouse movements, key presses, etc. While these input devices have virtually no effect on the display latency, they are just as important to the overall latency. Someone such as a CS:GO player who needs that extra sampling to interpret their reflex actions might actually find value in running (uncapped) Fast Sync.
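A rough sketch of that point (my own illustration, not any particular engine; the polling and rendering calls are hypothetical placeholders):

```python
# Rough sketch of why an uncapped game loop samples input more often: each
# iteration is another chance to observe fresh input, and a frame cap sleeps
# away time that could have been spent sampling.
import time

def run_loop(duration_s=1.0, frame_cap=None):
    iterations = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        frame_start = time.perf_counter()
        # poll_input()           <- hypothetical: another chance to see new input
        # update_and_render()    <- hypothetical engine work
        iterations += 1
        if frame_cap:
            sleep_for = 1.0 / frame_cap - (time.perf_counter() - frame_start)
            time.sleep(max(0.0, sleep_for))
    return iterations

print(run_loop(frame_cap=60))    # ~60 input samples in one second
print(run_loop(frame_cap=None))  # as many samples as the CPU can manage
```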

The counter argument to running things "uncapped", in the general sense, is that the hardware is generating unnecessary heat for otherwise diminishing returns. To that end, running things "uncapped" is always going to be subjective.
 
Mixing GSync + Fast Sync together raises two arguments (or cons):

1) Fast Sync within the GSync range only serves to introduce slightly more input latency, which ultimately robs GSync of one of its favorable capabilities.
2) GSync is effectively disabled outside the GSync range, thus rendering in excessive framerates (a la Fast Sync) effectively negates the very thing you paid for.
Perhaps I don't understand, but:
1) Fast Sync wouldn't turn on within the G-Sync range, from what I've read in another thread, so there shouldn't be a conflict inside the G-Sync range.
2) G-Sync is disabled outside of the G-Sync range? Well, yes! With or without Fast Sync, G-Sync won't work outside its range.

Finally, RivaTuner seems unreliable with Nvidia drivers in Windows 10 for setting a max FPS cap inside the G-Sync range. I couldn't get it to work recently, and it seems others couldn't either, so what tool do you recommend to cap frames? Don't say game config or ini files; that's a cop-out, and only a minority of games support it. The problem I haven't been able to solve with G-Sync alone is that when you go above the G-Sync range you get an FPS hiccup, and a bad one, so as you go in and out of the G-Sync range in an easily rendered area like a hallway, the hiccup is really bad and it happens often in games. The seeming fix was the suggestion to enable Fast Sync in the Nvidia control panel. Fast Sync doesn't seem to have the obvious mouse lag introduced by V-Sync (leave V-Sync off), and I don't sense any micro-stutter or hitching when going in and out of the G-Sync range at 120 Hz. I guess I'm saying I can't sense a problem anymore with the combination of G-Sync and Fast Sync.

So what recommendation do you have?
 
1) Fast Sync wouldn't turn on within the G-Sync range, from what I've read in another thread, so there shouldn't be a conflict inside the G-Sync range.

Yes and no. It depends if and where you are capping your fps.

There's a period of overlap within the GSync range, near the display's refresh rate, where Fast Sync is always active. It really depends on where the fps cap is imposed.

For example, say the GSync range is 30-100Hz, you enable both GSync and Fast Sync, and you also impose a 100 fps limiter. When your frame rate exceeds roughly 97 fps, Fast Sync becomes active, and you experience the same frame pacing / microstuttering issues you would if regular ol' VSync were enabled instead.

Basically, if you cap your framerate near your monitor refresh with Fast Sync enabled, it's effectively turned into regular ol' VSync. If you're going to use Fast Sync, don't cap your framerate at all, or at least keep it capped at a 2-5x (ideally 5x) multiple of your display's refresh rate.

But if you don't cap your fps anywhere near the GSync range, then yes you can claim Fast Sync is doing something (assuming you can maintain excessive framerates). Just keep in mind that as your excess framerate nears the GSync range (i.e. lowers), you're getting into a weird situation where input latency is not exactly optimal.
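Restating that example as a toy rule of thumb (the 30-100 Hz range and the ~97 fps handover are assumptions taken from the example above, not NVIDIA's documented driver behavior):

```python
# Toy rule of thumb for the example above: G-Sync range 30-100 Hz, both G-Sync
# and Fast Sync enabled, 100 fps limiter. The thresholds are assumptions from
# this thread, not documented driver logic.

def active_sync(fps, gsync_min=30, gsync_max=100, handover=0.97):
    if fps < gsync_min:
        return "below the G-Sync range"
    if fps < gsync_max * handover:
        return "G-Sync (variable refresh, lowest latency when capped here)"
    return "Fast Sync takes over (VSync-like pacing / microstutter near the cap)"

for fps in (45, 90, 98, 100):
    print(f"{fps:>3} fps -> {active_sync(fps)}")
```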

Edit: I realize now that my 1st point was worded rather poorly. Hopefully this response clarifies that point a bit.
 
Could someone please tell me in layman's terms what the best settings would be? I have a 144Hz 1080p non-GSync monitor and I mostly play Battlefield 4 and Battlefield 3. With a GTX 1060 I'm getting around 190-200fps. Should I use Fast Sync, and if so, at what settings?
 
Could someone please tell me in layman's terms what the best settings would be? I have a 144Hz 1080p non-GSync monitor and I mostly play Battlefield 4 and Battlefield 3. With a GTX 1060 I'm getting around 190-200fps. Should I use Fast Sync, and if so, at what settings?
Use a frame limiter (e.g. RivaTuner Statistics Server) to limit to exactly 100 fps, enable FastSync, and set the games to run at 100 Hz.

If your monitor's refresh rate is exactly the same as the frame rate limiter, you get V-Sync-like motion smoothness with less input lag. Because this relies on the synchronization being perfect, you can get occasional frame drops if a frame is delivered slightly too slow or too fast, so slightly more drops than V-Sync. But when I tested it, e.g. Duke Nukem felt smoother with this FastSync trick than with V-Sync, because the latter, despite not dropping frames, seemed to have some frames carry game state from two moments in game time that were too close together, resulting in a slight jerkiness-like effect. With good synchronization the frame drops are so rare they are not an issue.

Of course you need the monitor to run at exactly that refresh rate, hence the 100 Hz recommendation. A "144 Hz" mode is probably really 144/1.001 Hz, because these modes are designed to match multiples of the NTSC and cinema frame rates, which are 60/1.001 and 24/1.001 respectively, and this won't work because it causes roughly one in a thousand frames to be dropped. With a 144/1.001 Hz mode and this trick that is one frame drop every ~7 s, which is unacceptable. 100 Hz should be exactly 100 Hz, to match the 50 Hz of PAL video.
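The arithmetic behind that drop cadence, as a quick check (assuming an exact 144 fps limiter against a 144/1.001 Hz mode, with both clocks otherwise perfect):

```python
# One dropped frame roughly every 7 seconds: a 144 fps limiter against a
# 144/1.001 Hz "NTSC-style" display mode produces a small surplus of frames.
# (Assumes both clocks are otherwise exact.)

cap_fps    = 144.0
refresh_hz = 144.0 / 1.001          # ~143.856 Hz, the typical real timing

surplus_per_second = cap_fps - refresh_hz   # extra frames produced each second
print(1.0 / surplus_per_second)             # ~6.95 s between dropped frames

# A true 100 Hz mode matched with a 100 fps cap has zero surplus, hence the
# 100 Hz recommendation above.
```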

If your monitor supports strobing at 100Hz, it should improve motion resolution considerably.
 
If you're going to use Fast Sync, don't cap your framerate at all, or at least keep it capped at a 2-5x (ideally 5x) multiple of your display's refresh rate.
Why not 6x?
The monitor refresh rate should be an exact integer multiple of 1 Hz, and the frame rate should be capped at exactly the refresh rate or at an exact integer divisor or multiple of it; then FastSync works as it should.

All your ranting about input lag is completely pointless because you start digressing about things which are:
a) irrelevant to this topic, as I haven't even mentioned G-Sync at all
b) mostly wrong

If you can push 1000 fps and cap FastSync at 100 fps for a 100 Hz monitor, then you end up waiting with the first drawn frame out of the 10 you could have rendered before it is displayed. With a 500 fps cap you get the 5th frame rendered by the GPU while the screen was drawing the previous one (the frame which would be seen below the tear near the center of the screen with V-Sync OFF), and thus you get smaller input lag. If you use V-Sync ON and it is properly implemented (doesn't add additional latency), it works out exactly like the first example. If you push, say, a consistent 150 fps and set the frame rate limit to 100 fps on a 100 Hz screen, then the frame finishes rendering when it finishes, and using FastSync instead of V-Sync only differs in how buffering works within the game and whether it behaves poorly with V-Sync ON; also, FastSync is susceptible to frame drops when the frame rate limiter is not ideally synchronized with the screen refresh rate. In some games normal V-Sync and my FastSync tricks amount to the same input lag.
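A back-of-the-envelope version of those numbers (my own arithmetic, assuming an ideal limiter, a 100 Hz scanout, and that the newest completed frame is the one displayed):

```python
# Back-of-the-envelope age of the displayed frame for the scenarios above,
# assuming an ideal limiter and a 100 Hz display (10 ms per refresh). The
# newest frame completed before each refresh is the one shown.

refresh_s = 1.0 / 100

def worst_case_frame_age(capped_fps):
    """Oldest the shown frame can be at scanout, in seconds."""
    return min(refresh_s, 1.0 / capped_fps)

for fps in (100, 150, 500, 1000):
    print(f"cap {fps:>4} fps -> frame up to {worst_case_frame_age(fps) * 1000:.1f} ms old")
```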

With properly implemented V-Sync you won't have any disadvantage from allowing the game to hit the monitor refresh rate instead of staying within the G-Sync range, and no frame rate caps will be necessary because V-Sync will cap it for you. With G-Sync you always wait for the frame to finish drawing on screen, so it is like V-Sync ON on steroids: it adds even more waiting on top of the waiting when more waiting is needed, so it doesn't display the same frame before the next one has finished rendering.

But games behave differently, and some can stupidly add more input lag with V-Sync enabled even within the G-Sync range. Some even have smaller input lag with FastSync than with V-Sync OFF, which I agree is a bit strange, but if it happens, it happens; there is no arguing about it. We can only argue about what causes it, not about whether it happens in some cases. And when it does happen, this is obviously the best setting to use in those cases.

I have yet to find a DX10/11/12 game which behaves this way, but so far every DX9 game I have tested behaves this way, even games which have a switch between DX9 and DX10/11 modes.
And who should care about DX9 games? They're only most of the games ever made... :banghead:
 
I don't see how reduced lag compared to vsync OFF would be possible.
 
I just tested Mirror's Edge (the old one, based on UE3 and using DX9c) with the in-game frame rate limiter set to 1, and my findings:
1) RTSS indeed adds one frame of input lag
2) FastSync is still faster
3) The built-in frame rate limiter does not work properly with V-Sync OFF or ON, but works fine with FastSync, keeping rendered frames spread properly in the time domain no matter what FPS I choose

What do I mean by that last point? At very low frame rates like 1 fps, two frames can blend together, resulting in a frame rate that is actually more like 0.5 fps. At higher FPS, e.g. 15 fps, there is visibly uneven frame pacing.

Another thing: with V-Sync ON and a 31 fps limit in a 62 Hz mode, the animation is not at all smooth. With FastSync it is.

I will post more results from more games as I test them.
 
Could you explain in more detail how anything other than vsync off reduces input lag?
 
Let me get this straight. FastSync gives you the fluidity of Vsync ON but with less input lag. But Vsync Off will always have less input lag than FastSync and also would be the most responsive.
 
Could you explain in more detail how anything other than vsync off reduces input lag?
In the past there was a pre-render limit setting of "0"; now the minimum is "1", and this is my main suspect. Given the behavior of the UE3 game I tested when limiting the frame rate to low values, this seems like a good explanation.

The game runs properly at e.g. 2 fps with FastSync: one frame is displayed per half second, with inputs (e.g. the cursor in a menu, or the viewport in game) refreshed as soon as they can be.
Now with V-Sync OFF:
1. the top of the screen gets 2 fps, but the two frames are displayed one right after the other and then there is almost a second of waiting for the next "batch"
2. the bottom of the screen (below the tear line) is refreshed only once per second and always gets the second of the two rendered frames (which should be logical: the first frame is only displayed on part of the screen because the second finished before the first one got displayed fully)

When the mouse is moved, the change will show up only in the second frame.
With V-Sync ON it is similar to V-Sync OFF, but the whole screen shows the two consecutive frames per second.

With a more reasonable setting of 30 fps it still happens, and at the bottom of the screen (below the tear line) I get 15 fps...

Let me get this straight. FastSync gives you the fluidity of Vsync ON but with less input lag. But Vsync Off will always have less input lag than FastSync and also would be the most responsive.
Not if FastSync removes frame pre-rendering completely.
Pre-rendering is basically fooling the game into thinking that frame rendering has finished so that it uses the CPU to prepare another frame to be rendered. It is a trick that is good for boosting benchmark scores but never for actual gameplay. Obviously benchmarks sell GPUs, not low input lag...
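A minimal model of why a pre-render queue of depth 1 would cost roughly one frame of latency (this is only a sketch of the suspicion above, not a claim about what the driver or FastSync actually does):

```python
# Minimal model of the pre-render ("render ahead") suspicion: with a queue
# depth of 1 the CPU prepares the next frame while the previous one is still
# in flight, so the input sampled for a frame is one frame old by the time the
# GPU renders it. A sketch of the hypothesis above, not measured behavior.

from collections import deque

def input_age_per_frame(queue_depth, frames=6):
    queue, ages = deque(), []
    for n in range(frames):
        queue.append(n)                  # CPU submits frame n (input sampled now)
        if len(queue) > queue_depth:     # GPU drains the oldest queued frame
            submitted = queue.popleft()
            ages.append(n - submitted)   # how stale that frame's input is
    return ages

print(input_age_per_frame(queue_depth=0))  # [0, 0, 0, 0, 0, 0] -> fresh input
print(input_age_per_frame(queue_depth=1))  # [1, 1, 1, 1, 1]    -> one frame behind
```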

Obviously, even in affected games (DX9), it doesn't necessarily mean FastSync is faster. It all depends on the frame rate, the refresh rate, whether a frame rate limiter is used, etc.
With G-Sync though... FastSync is clearly better all the way.
 
This is not an explanation.
Just an observation which jumps to conclusions about what is causing the "batch of waiting".
Also, is this wait for the next batch something you observed or just theorized?


Artificial FPS limiting is not the same as real-world FPS. We know for sure that V-Sync does not halve FPS when the game is artificially limited to e.g. 50 FPS, but it does when the GPU can only render 50 FPS at full power.

In other words (on a 60 Hz monitor with double buffering), as sketched below:
The computer can only render 50 FPS with V-Sync off; with V-Sync on you get 30 FPS.
The computer can render 120 FPS with V-Sync off but is limited to 50; it still renders 50, because each frame still renders quickly, so it doesn't constantly miss the next swap.
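A quick sketch of that quantization (standard double-buffered V-Sync behavior on an assumed 60 Hz display, not specific to any game):

```python
# Effective frame rate under double-buffered V-Sync: every frame must wait for
# the next refresh boundary, so a frame whose render time exceeds one refresh
# occupies two refreshes. An artificial 50 fps cap on a GPU capable of 120 fps
# does not trigger this, because each frame still renders well within a refresh.
import math

def double_buffered_vsync_fps(render_capability_fps, refresh_hz=60):
    frame_time = 1.0 / render_capability_fps
    refreshes_per_frame = math.ceil(frame_time * refresh_hz)  # wait for next swap
    return refresh_hz / refreshes_per_frame

print(double_buffered_vsync_fps(50))   # 30.0 -> GPU-limited 50 fps halves to 30
print(double_buffered_vsync_fps(120))  # 60.0 -> plenty of headroom, full refresh rate
```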

So if you are using a frame limiter to draw conclusions about anything regarding sync and framebuffers, you are doing it wrong to begin with.



Also, any explanation of how sync works should include framebuffer swapping.
If it doesn't, I'm hesitant to believe it and suspect it's made-up BS, since framebuffer swapping is the whole point of syncing.
 
The explanation is not what is interesting here; it's the effects themselves and that they happen at all, when, as people have noticed, they obviously should not happen!

For G-Sync (or to use FastSync to simulate V-Sync ON behavior) you always need to use a frame rate limiter. It is not like I am going to run games uncapped with V-Sync OFF at 1000 fps just because it gives the lowest input lag... Besides, this behavior doesn't happen with DX10+ titles, so it is specific to games which tend to run very fast and make you want to cap them... especially those which, uncapped, run too fast or have other glitches...

If you have an NV card you can test it, as can anyone. Why would I ever make up "BS"? :confused:

And no special tests need to be made anyway; all that is needed is to enable G-Sync, then run any old DX9 game with any frame rate limiter available and compare how it runs with each sync setting, and FastSync feels one frame faster.

For reference (mostly for my future self, because no one seems to be interested in my findings anyway...), the drivers I use are 390.77 on a 980 Ti on Windows 10.0.16299.192.
 
Would using the frame limiter and fast sync work well for games in 4K 60FPS?
 
Sure if you're oblivious to stutter and hitching like some people seem to be.


I don't understand. Are you saying that 60 FPS can never appear smooth to you even if it's locked in at that rate? I got the impression from this discussion that limiting the frame rate and using FastSync creates a G-Sync/FreeSync type of smoothing effect. Why would there be stutter and hitching at a consistent, locked-in 60 FPS?
 
I don't understand. Are you saying that 60 FPS can never appear smooth to you even if it's locked in at that rate? I got the impression from this discussion that limiting the frame rate and using FastSync creates a G-Sync/FreeSync type of smoothing effect. Why would there be stutter and hitching at a consistent, locked-in 60 FPS?
All I know is that FastSync causes some hitching and stuttering at 60 fps, and it was never designed to be used at 60 fps. Nvidia made it 100% clear that it was made for extremely high frame rates and that using it at low frame rates is worse than using regular V-Sync.
 
All I know is that FastSync causes some hitching and stuttering at 60 fps, and it was never designed to be used at 60 fps. Nvidia made it 100% clear that it was made for extremely high frame rates and that using it at low frame rates is worse than using regular V-Sync.

OK, well, I gathered from this thread that it wasn't about an absolute frame rate but about a frame rate much faster than your monitor's refresh rate, which would be a different thing.
 
It can be used at slower frame rates also, why not.
With a 60 Hz monitor it needs to be at 60 Hz, and the frame rate limiter needs to be pretty well implemented.

60 Hz monitors are not 60 Hz but 60/1.001 Hz, hence the need to do some tweaking.
 