Dynamic Vsync

Suppose I don't mind the framerate being capped at 60 fps but I do mind the extra latency if it dips below. Is it possible to use vsync only if a new frame was displayed at or after the last vsync?
 
I don't understand why there would be any extra latency with v-sync on when frame rates drop below the refresh rate. Not really an issue I've dealt with, however, since going to 120 Hz monitors.
 
I don't understand why there would be any extra latency with v-sync on when frame rates drop below the refresh rate. Not really an issue I've dealt with, however, since going to 120 Hz monitors.

Because then the finished frame has to wait until the next vsync before it can be displayed.
 
I think you enable triple buffering to overcome those drawbacks of vsync IIRC.

http://www.tweakguides.com/Graphics_10.html

Problems with Triple Buffering

It may seem odd that if Triple Buffering resolves the problem of low framerates when VSync is enabled, it doesn't appear as a standard option in many games, or is not enabled by default. There are three main concerns that appear to be the reason behind this:

1. If it is not properly supported by the game in question, it can cause visual glitches. Just as tearing is a visual glitch caused by information being transferred through the buffers too fast for the monitor to keep up, so too, in theory, can triple buffering cause visual anomalies, due to game timing issues for example.

2. It uses additional Video RAM, and hence can result in problems for those with less VRAM onboard their graphics card. This is particularly true for people who also want to use very high resolutions with high quality textures and additional effects like Antialiasing and Anisotropic Filtering, since this takes up even more VRAM for each frame. Enabling Triple Buffering on a card without sufficient VRAM results in things like additional hitching (slight pauses) when new textures are being swapped into and out of VRAM as you move into new areas of a game. You may even get an overall performance drop due to the extra processing on the graphics card for the extra Tertiary buffer.

3. It can introduce control lag. This manifests itself as a noticeable lag between when you issue a command to your PC and the effects of it being shown on screen. This may be primarily due to the nature of VSync itself and/or some systems being low on Video RAM due to the extra memory overhead of Triple Buffering.

However it appears that most recent graphics cards and most new games will not experience major problems by enabling Triple Buffering. Given the fact that it can help to both remove tearing while also preventing the significant FPS drop encountered when VSync is enabled, it is at least worth trying for yourself to see the results on your system.
 
yeah V-sync = 16.6 ms added input lag

triple buffering = 33.2 ms added input lag

If it's an online FPS game I would not run either, due to the cost in input lag. If you're playing single player then by all means run it whatever way makes the game look best to you :)
 
Hmm, interesting, I didn't realize v-sync/triple buffering added input lag. Is that true?
 
Hmm, interesting, I didn't realize v-sync/triple buffering added input lag. Is that true?

Yes, but it varies depending on the title and hardware. I've only noticed input lag on some DX9 titles with SLI or CrossFire with vsync turned on.
 
I've always heard that there is less input lag with triple buffering than with regular vsync. The lag I notice most easily is cursor lag in menus in certain titles with regular vsync. The lag definitely seems to vary quite a bit between titles, but I hate screen tearing (which also seems to vary between titles). I always run triple buffering and never notice the input lag, but I have noticed it with regular vsync before.
 
I've always heard that there is less input lag with triple buffering than with regular vsync. The lag I notice most easily is cursor lag in menus in certain titles with regular vsync. The lag definitely seems to vary quite a bit between titles, but I hate screen tearing (which also seems to vary between titles). I always run triple buffering and never notice the input lag, but I have noticed it with regular vsync before.
What are you using to enable triple buffering?
 
yeah V-sync = 16.6 ms added input lag

triple buffering = 33.2 ms added input lag
Not true. With double buffering and vsync, the lag is up to 16.6 ms, but that's only if the frametime is 0 ms, which is unrealistic. If the frametime is 10 ms, the lag is 6.6 ms.
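
A tiny sketch of the arithmetic in that post, assuming (as the post does) that rendering starts right at a vblank and the frametime stays under one refresh interval; the numbers are illustrative only:

```cpp
#include <cstdio>

int main() {
    const double refresh_ms = 1000.0 / 60.0;          // ~16.7 ms between refreshes at 60 Hz
    const double frametimes_ms[] = {0.0, 10.0, 16.0}; // example frame times

    for (double ft : frametimes_ms) {
        // The finished frame waits out the remainder of the refresh interval
        // before it is scanned out, so the added wait shrinks as frametime grows.
        double wait_ms = refresh_ms - ft;
        std::printf("frametime %4.1f ms -> added wait %4.1f ms, frame is %4.1f ms old at scanout\n",
                    ft, wait_ms, ft + wait_ms);
    }
    return 0;
}
```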
 
What are you using to enable triple buffering?

I just use RadeonPro; there are other tweaks I like to enable now and then and RadeonPro does everything (obviously it only works with ATI/AMD cards). Otherwise D3DOverrider is great for just forcing triple buffering. D3DOverrider is part of RivaTuner, I believe.
 
Suppose I don't mind the framerate being capped at 60 fps but I do mind the extra latency if it dips below. Is it possible to use vsync only if a new frame was displayed at or after the last vsync?

How about just capping the framerate at 60 FPS? There's been a lot of talk about how doing that improves things with or without vsync on, and something about Nvidia adding an FPS limiter to their drivers soon.
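
A limiter of that sort is simple to sketch (illustrative only; the function name is made up, and real in-driver or in-game limiters handle timer precision and drift more carefully):

```cpp
#include <chrono>
#include <thread>

// Minimal external 60 FPS cap, independent of vsync: after each frame's work,
// sleep until the next 1/60-second slot so frames land on a fixed 60 Hz grid.
void run_capped_at_60fps() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::nanoseconds(1'000'000'000 / 60);
    auto next_deadline = clock::now() + frame_budget;

    for (;;) {
        // update_and_render();                        // the game's per-frame work would go here
        std::this_thread::sleep_until(next_deadline);  // burn off the rest of the slot
        next_deadline += frame_budget;                 // advance to the next slot
    }
}
```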
 
How about just capping the framerate at 60 FPS? There's been a lot of talk about how doing that improves things with or without vsync on, and something about Nvidia adding an FPS limiter to their drivers soon.

This would be even better.
 
If that's the goal, just enable normal vsync.
lol, what? Have you completely missed all of the other comments about vsync having its own set of issues? You have to take it on a game-by-game basis, as using vsync can result in very crappy performance and framerate fluctuations.
 
lol, what? Have you completely missed all of the other comments about vsync having its own set of issues? You have to take it on a game-by-game basis, as using vsync can result in very crappy performance and framerate fluctuations.
You can't avoid tearing without vsync.
 
yeah V-sync = 16.6 ms added input lag

triple buffering = 33.2 ms added input lag

if its an online FPS game i would not run either due to the cost in input lag. if your playing single player then by all means run it whatever way makes the game look the best to you :)

It's actually worse than that. Not only does it add up to another frame time's worth of input lag, the frame which would prompt an input is already up to a full frame behind. So with vsync you get up to 33 ms of additional lag, and triple buffering gives you up to 66 ms.
 
It's actually worse than that. Not only does it add up to another frame time's worth of input lag, the frame which would prompt an input is already up to a full frame behind. So with vsync you get up to 33 ms of additional lag, and triple buffering gives you up to 66 ms.


Triple Buffering doesn't mean what you think it means.


http://www.anandtech.com/show/2794/2


^^
Take a look at how the buffers work in triple buffering. There is a reason why it actually decreases input lag.


Editing to add the TL;DR version:
"In other words, with triple buffering we get the same high actual performance and similar decreased input lag of a vsync disabled setup while achieving the visual quality and smoothness of leaving vsync enabled."
 
Triple Buffering doesn't mean what you think it means.


http://www.anandtech.com/show/2794/2


^^
Take a look at how the buffers work in triple buffering. There is a reason why it actually decreases input lag.


Editing to add the TL;DR version:
"In other words, with triple buffering we get the same high actual performance and similar decreased input lag of a vsync disabled setup while achieving the visual quality and smoothness of leaving vsync enabled."

Yeah, I wasn't thinking.
 
Triple Buffering doesn't mean what you think it means.


http://www.anandtech.com/show/2794/2


^^
Take a look at how the buffers work in triple buffering. There is a reason why it actually decreases input lag.


Editing to add the TL;DR version:
"In other words, with triple buffering we get the same high actual performance and similar decreased input lag of a vsync disabled setup while achieving the visual quality and smoothness of leaving vsync enabled."

Perhaps this is true with some game I've never heard of, or for people who can't tell the difference between no input lag & input lag.

As for someone like me, V-Sync can go die in a corner. I can't use it in any game, period, due to the input lag it creates, and triple buffering has never helped that input lag for me. Whether it's L4D/2 or Homeworld/2, it doesn't make a difference.
 
Perhaps this is true with some game I've never heard of, or for people who can't tell the difference between no input lag & input lag.
Unfortunately there is no such animal as "no input lag". Not even with a CRT display is this achievable. Vertical sync is only one of many contributors to the lag time between you making some sort of input and the result of that input being reflected on your display.
 
Not true. With double buffering and vsync, the lag is up to 16.6 ms, but that's only if the frametime is 0 ms, which is unrealistic. If the frametime is 10 ms, the lag is 6.6 ms.


http://www.anandtech.com/show/2803/7

For input lag reduction in the general case, we recommend disabling vsync. For NVIDIA card owners running OpenGL games, forcing triple buffering in the driver will provide a better visual experience with no tearing and will always start rendering the same frame that would start rendering with vsync disabled. Only input latency after the time we would see a tear in the frame would be longer, and this by less than a full frame of latency.

Unfortunately, all other implementations that call themselves triple buffering are actually one frame flip queues at this point. One frame render ahead is fine at framerates lower than the monitor refresh, but if the framerate ever goes past refresh you will experience much more input lag than with vsync alone. For everyone without multiGPU solutions, we recommend setting flip queue or max pre-rendered frames to either 1 or 0. Set it to 1 if framerate is always less than monitor refresh and set it to 0 if framerate is always greater than or equal to monitor refresh. If it goes back and forth, only NVIDIA's OpenGL triple buffering will provide the best of both worlds without tearing and will further reduce input lag in high framerate situations.

Improperly handling vsync (enabling or disabling a 1 frame flip queue at the wrong time) can degrade performance by at least one additional whole frame. But with multiGPU options, we really don't have a choice. With more than one GPU in the system, you will want to leave maximum pre-rendered frames set to the default of 3 and allow the driver to handle everything. Input lag with multiGPU systems is something we will want to explore at a later time.


In short, what vsync does to input lag varies from game to game, and even between Nvidia and AMD in how they handle things, especially triple buffering. What they're saying is what I'm saying: if you care about input lag, disable v-sync. Yes, if v-sync is done right and doesn't delay frames because your video card(s) are running faster than 60 FPS, then less than 16 ms of input lag will be incurred, but the safest way to ensure v-sync is never holding your performance back in any game is to disable it outright. And honestly, when playing FPS games online, if you're stopping to look at the scenery you probably don't care about how you stack up against the rest of the players in the server, but to each their own. I just want to make sure people are aware that there IS a cost to running v-sync; the cost will vary, but it will undoubtedly be there.
 
Just not in any way, shape, or form close to the way you tried to describe triple buffering, though.


BTW for AMD, if you use RadeonPro you can force real triple buffering, and that is totally worth it.
 
A 120 Hz 2233RZ + a 120-ish frame limiter works great for weeding out most of the tearing and is very pleasing to the eye. Although I do sort of miss the ludicrously low input lag on my Samsung 226BW. Its image quality kind of sucks, but that doesn't bother me.
 
That's not right at all. Vsync is not a fixed 16.7 ms latency.

I'm not sure how a game is able to mess up triple buffering, but I guess some do.

Ohh, that is double-buffered vsync, in which you have your frame buffer and the "back buffer", which cannot be written until it has been sent, and thus, yeah, a 16.67 ms constant lag will be present.

But with triple-buffered vsync, which Fito got wrong on page 1 of this thread, you get rid of that input-lag feel, because you can write to a second back buffer even if the first one hasn't been sent to the monitor in its entirety.

Heck, if Fito had read the article he linked he would have seen that the extra 16-33 ms of input lag they talk about refers to displays that try to do "smart" post-processing (another great reason to use digital inputs and turn all monitor/TV-based post-processing off!), and that they say the best combo is triple buffering with a 120 Hz monitor (which indeed, for twitch shooters, is sweet!)

And edit to add:
The developers of D3DOverrider warned against that input-lag article on Anandtech, stating that the writer didn't really have much of an idea about how D3D programming works. (And that is the reason why D3DOverrider's forced triple buffering does work.)
 
As for someone like me, V-Sync can go die in a corner. I can't use it in any game, period, due to the input lag it creates
It's mostly just a matter of getting used to it. Assuming you're not one of the 0.1% still rocking a CRT, you've probably done it once already...

Yes, there are absolutely situations where it makes a real difference; I'm not about to look a seasoned Q3A player in the eye and tell them they could play just as well with vsync on. But this isn't the case for most people, most of the time.
 
The developers of D3DOverrider warned against that input-lag article on Anandtech, stating that the writer didn't really have much of an idea about how D3D programming works. (And that is the reason why D3DOverrider's forced triple buffering does work.)

Could you post the link to where they said that? I haven't found it yet.

I did find this thread, though, which contains more discussion on triple buffering:

http://forums.nvidia.com/index.php?showtopic=169911
 
Actually, in that thread there may be an answer to the OP's question:

BFG10K said:
Vsync causes input lag because the rendering system has to stall if both buffers are full but no refresh cycle is currently available. In theory triple buffering solves this problem because it allows the rendering to continue at the expense of dropped frames, but in practice I've found it can make the lag worse.

Sora said:
This is because OpenGL's form of triple buffering can drop the third frame if it's not required, whereas D3D always displays it.
This does nothing for the double-buffering lag under OpenGL; the only fix for it is still to cap below the refresh rate.

I wish someone could write a definitive post or article on vsync and triple buffering, because they are not simple subjects, there is a lot of wrong info in circulation - especially the Anandtech article, which has been quoted here a thousand times - and their effects on input lag are conditional. I still don't understand them well enough to write such an article.
 
Could you post the link to where they said that? I haven't found it yet.

I did find this thread, though, which contains more discussion on triple buffering:

http://forums.nvidia.com/index.php?showtopic=169911

I'd have to go through the archives on Guru3D, since D3DOverrider hasn't been developed in a while and the article is a bit old at this point.

But in any case you can experience it first hand: just compare normal vsync with vsync plus triple buffering forced through D3DOverrider or RadeonPro.


The reason this isn't in the standard driver is because, indeed, it isn't part of the D3D spec, and thus there are titles where you can get artifacts to show up.
 
Ohh, that is double-buffered vsync, in which you have your frame buffer and the "back buffer", which cannot be written until it has been sent, and thus, yeah, a 16.67 ms constant lag will be present.
Wrong again. You render a frame to buffer A, which takes (for example) 10 ms. Then you have to wait 6.7 ms for vsync. Buffer A is sent to the monitor, and you render a frame to buffer B.

The extra latency is 6.7 ms and the frame is 16.7 ms old when it's sent to the monitor.

With triple buffering, you can start rendering immediately (because there's always one 'free' buffer). So framerate goes up. However, the frame might now be older than 16.7 ms at the moment it gets sent to the monitor. So while framerate is higher, frame latency isn't lower. In fact, it's higher.
Both examples assume frametime < 16.7 ms.

Besides vsync, there's another choice related to triple buffering: do you display all frames and stall rendering when the monitor can't keep up, or do you discard frames?
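
A sketch of the first of those two policies, purely illustrative: a render-ahead (flip) queue presents every completed frame in order and stalls the renderer when the queue is full, which, per the Anandtech quote earlier, is what most "triple buffering" settings actually give you. The discard variant is what the triple-buffering sketch earlier in the thread shows.

```cpp
#include <deque>

// Render-ahead (flip) queue: every completed frame is shown, in order, one per
// vblank. Under vsync at high framerates this is where extra latency comes
// from, because each new frame waits behind the frames queued before it.
struct FlipQueue {
    std::deque<int> pending;          // completed frames waiting for a vblank
    static constexpr std::size_t kMax = 2; // two frames queued behind the front buffer

    bool try_submit(int frame_id) {
        if (pending.size() >= kMax) return false;  // queue full: the renderer must stall
        pending.push_back(frame_id);               // every submitted frame will be displayed
        return true;
    }

    void on_vblank(int& front_frame) {
        if (!pending.empty()) {
            front_frame = pending.front();  // oldest queued frame goes to the screen next
            pending.pop_front();
        }
    }
};
```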
 
All I know is v-sync and triple buffering can, and often do, add input lag. How much depends on how the game makes use of them and on how your drivers handle them (as well as driver settings, sometimes).

If triple buffering were so great, you would think Nvidia would set it on by default, but they do not, and that should tell you something right there about whether there is a cost to it or not.

As for a game dropping frames because you're rendering too fast, I suppose a poorly written game might do this, but most games do not. What actually happens when you go over 60 fps, and this is the cause of horizontal tearing on both CRT and LCD displays, is that when a new frame is done being rendered it gets sent to the display right away. Since 99% of the horizontal lines on a display are not at the very top or bottom, a new frame arriving mid-refresh creates a horizontal tear line (sketched below).

For the uninitiated: LCD panels do refresh the horizontal lines of the matrix going from top to bottom. The only displays I have seen that refresh the whole screen at once are plasmas (I have photos of this to prove it). One quirk of many non-gaming LCDs is that the LCD matrix is slower than the update rate, so it can actually have the side effect of blending the horizontal tear. I rarely see tears on my display because it's a 37" LCD TV with an IPS panel, so it's slower than your average cheap TN PC monitor, but even TN panels don't compare to CRTs.

Also, the word "buffer" in any explanation of how something video-related works automatically means you're adding input lag. Buffers are a major source of input lag on today's displays, especially TVs. TVs often have completely separate buffers: one is on the mainboard and is there to buffer frames for image processing/enhancement and motion estimation/enhancement; those buffers are almost always defeatable via game modes etc. The other, never advertised and often overlooked, buffer on LCD displays is the one on the panel's timing controller, the board that controls the actual LC matrix. Panels that employ what is commonly known as overdrive, but more properly called response time compensation (RTC), need to buffer frames in order to overdrive only the pixel state changes that the panel is natively slow at; if they just overdrove all the pixels you would get inverse ghosting and other bad side effects like temporary image retention, which can only be removed by turning the display off for a while to let the LCs lose the excess charge that kept them from closing fully. I have an older TN PC monitor with an old bufferless overdrive tech that gets TIR :D

I did quite a bit of research on TV input lag in my search for a suitable TV to use as a gaming monitor for my PC. What I discovered is that the vast majority of TVs on the market in the past few years have at least 30-45 ms of input lag that cannot be removed. The TV I use is one of the lowest-lag TVs ever made, and it's around 20 ms. Due to this handicap I already have from my display, I feel any added input lag in the chain much more than your average gamer, so that is also a major reason I never run v-sync.
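
The tearing mechanism described above can be sketched with a quick back-of-the-envelope calculation (illustrative numbers for a 60 Hz, 1080-row display, ignoring blanking intervals):

```cpp
#include <cstdio>

int main() {
    const double refresh_ms   = 1000.0 / 60.0; // one top-to-bottom scanout pass
    const int    visible_rows = 1080;

    // If a buffer swap lands partway through scanout, the tear appears at
    // whatever row the display had reached at that moment.
    double swap_time_ms = 5.0;                 // swap lands 5 ms into the scanout
    int tear_row = static_cast<int>((swap_time_ms / refresh_ms) * visible_rows);
    std::printf("swap at %.1f ms into the refresh -> tear near row %d of %d\n",
                swap_time_ms, tear_row, visible_rows);
    return 0;
}
```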
 
If the panel refresh and the video signal aren't in sync, it doesn't make sense to have a fixed refresh rate for the video signal. An async signal would reduce latency.
 
This is a wonderful idea that would effectively solve this problem for both camps.


The key problem I see is in how you precisely test the framerate to know whether or not you can run a v-sync'ed frame or carry on without.

I get the feeling that any system that jigs between the two would inherently add exactly the same input lag as you already get with v-sync enabled.

You'd have to come up with an exceedingly cunning test/deploy system to determine when and when not to sync the next frame.


But to be honest, all I want is just a setting that keeps it off all the time in game, but enables it for cutscene cinematics.
 
The key problem I see is in how you precisely test the framerate to know whether or not you can run a v-sync'ed frame or carry on without.
That's easy for the video card to do, isn't it?
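
As a sketch of how that per-frame decision might look (not how any actual driver implements it): if the frame finished before the upcoming vblank, sync to it as with normal vsync; if it missed the vblank, present immediately and accept a tear rather than waiting out another whole refresh interval.

```cpp
#include <chrono>

enum class PresentMode { WaitForVblank, PresentImmediately };

// Decide per frame: finished in time -> behave like ordinary vsync (no tearing,
// minimal wait); finished late -> present right away instead of idling until
// the vblank after that, which is where the extra latency would come from.
PresentMode choose_present_mode(std::chrono::steady_clock::time_point frame_done,
                                std::chrono::steady_clock::time_point next_vblank) {
    return (frame_done <= next_vblank) ? PresentMode::WaitForVblank
                                       : PresentMode::PresentImmediately;
}
```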
 