NVIDIA Adaptive VSync Technology Review @ [H]

How well does this compare to Lucid Virtu and its Virtual Vsync feature? This will be a major factor in deciding whether to go with GCN or Kepler for my next build.
 
Lucid involves a second software stack, another chip (iGPU), and (from what I have read on [H]) it doesn't exactly work well. The nice thing about nVidia's Adaptive V-Sync is that it shoots for a target (60fps) and if it misses it does the best it can until it can hit 60fps again. AFAIK Lucid just drops down to 30fps...it isn't adaptive in any way, shape, or form.
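Roughly, the difference can be sketched in a few lines (an idealized model of my own, not NVIDIA's actual driver logic; `REFRESH_HZ` and the function names are made up for illustration):

```python
REFRESH_HZ = 60

def adaptive_vsync_on(recent_fps):
    """Adaptive VSync in a nutshell: sync only while the GPU can keep
    up with the refresh rate; below that, run unsynced so the framerate
    degrades gradually (48fps stays ~48fps) instead of snapping down."""
    return recent_fps >= REFRESH_HZ

def plain_vsync_fps(render_fps):
    """Plain double-buffered vsync instead quantizes to refresh/N,
    since every frame must wait for a refresh boundary."""
    n = 1
    while REFRESH_HZ / n > render_fps:
        n += 1
    return REFRESH_HZ / n

print(adaptive_vsync_on(48))   # False -> vsync drops out, ~48fps with tearing
print(plain_vsync_fps(48))     # 30.0  -> plain double-buffered vsync falls to 30
```

The trade is tearing below 60fps in exchange for avoiding that hard drop.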
 
In these drivers I can choose "use 3D app setting" and 1-4, not 0-8. But that article says the 3D app setting is really 0 but 0 is really 3?? This is even more confusing.
My advice: don't worry about it. Changing that setting won't significantly impact your experience.
 
Just tried out Adaptive Vsync on my 680. It was grand in Skyrim, but I tried cs:source and it's terrible. It's not the same as traditional vsync in cs:source. Normal vsync makes everything smooth but the input lag is awful and makes counter strike unplayable. Adaptive Vsync just turns into a stuttering lag fest and it's even worse than normal sync!! It's like there are frames missing or something and it's jumping between them.

So for me it's turned off and I will never use it.
 
Just tried out Adaptive Vsync on my 680. It was grand in Skyrim, but I tried cs:source and it's terrible. It's not the same as traditional vsync in cs:source. Normal vsync makes everything smooth but the input lag is awful and makes counter strike unplayable. Adaptive Vsync just turns into a stuttering lag fest and it's even worse than normal sync!! It's like there are frames missing or something and it's jumping between them.

So for me it's turned off and I will never use it.
You know you can enable it on a game-by-game basis using profiles in NVIDIA's drivers right? You could enable it for Skyrim and disable it for CS:S, for example. That's one of the benefits of NVIDIA's control panel.
 
On a 60 hz monitor, if you use MSI afterburner and set the frame rate limit to 60 and game with vsync off, won't that yield the same result as adaptive vsync?
 
On a 60 hz monitor, if you use MSI afterburner and set the frame rate limit to 60 and game with vsync off, won't that yield the same result as adaptive vsync?
why do so many people think vsync is the same thing as a framerate cap? adaptive vsync has to be one of the simplest concepts out there yet some people are still confused about it. adaptive vsync will enable vsync at 60fps on a 60hz screen where the framerate cap will not. so if tearing bothers you then a framerate cap will not stop it.
 
Just tried out Adaptive Vsync on my 680. It was grand in Skyrim, but I tried cs:source and it's terrible. It's not the same as traditional vsync in cs:source. Normal vsync makes everything smooth but the input lag is awful and makes counter strike unplayable. Adaptive Vsync just turns into a stuttering lag fest and it's even worse than normal sync!! It's like there are frames missing or something and it's jumping between them.

So for me it's turned off and I will never use it.
I just spent about 15 minutes with it on in CSS and saw nothing wrong at all so it must be something on your end. and as already mentioned just turn it on in the game profiles for the games that you want to use it in.
 
why do so many people think vsync is the same thing as a framerate cap? adaptive vsync has to be one of the simplest concepts out there yet some people are still confused about it. adaptive vsync will enable vsync at 60fps on a 60hz screen where the framerate cap will not. so if tearing bothers you then a framerate cap will not stop it.

But if you set the frame rate cap at 60, then tearing won't occur, since you can't exceed the refresh rate of your monitor, which is when tearing mostly occurs?
 
But if you set the frame rate cap at 60, then tearing won't occur, since you can't exceed the refresh rate of your monitor, which is when tearing mostly occurs?
you just said two different things. you said tearing won't occur and then you said when tearing mostly occurs. tearing can occur at ANY framerate when vsync is off. CSS has hardly any tearing for me at 200 fps yet Metro 2033 will tear like crazy at 30 fps so you really have to look at it game by game. so in general a framerate cap will not stop tearing.
 
But if you set the frame rate cap at 60, then tearing won't occur, since you can't exceed the refresh rate of your monitor, which is when tearing mostly occurs?

You are disregarding the sync part of vsync, and that's the critical part. Vsync only sends a frame to the monitor when the monitor is ready to display it, so there is no tearing. Theoretically, a 60 fps cap could put a tear at the same point in every frame, if it was badly out of sync with the monitor and sending a new frame every time the monitor was halfway through the old one. It wouldn't be that bad, but it illustrates the point. Framerate cap is not the same as vsync.
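That "tear at the same point in every frame" scenario is easy to see in a toy simulation (my own sketch; the numbers and names are illustrative, not from the article):

```python
REFRESH_S = 1 / 60   # scanout period of a 60 Hz monitor

def tear_positions(present_period_s, phase_s, frames=4):
    """Screen position (0=top, 1=bottom) where each unsynced buffer
    swap lands in the scanout. A 60fps cap matches the monitor's
    *rate* but not its *phase*, so a swap arriving mid-scanout cuts
    the image at the same height every refresh: a stationary tear.
    Vsync instead holds the swap until the blanking interval."""
    return [round(((phase_s + i * present_period_s) % REFRESH_S)
                  / REFRESH_S, 3) for i in range(frames)]

# 60fps cap, half a refresh out of phase: tear stuck mid-screen
print(tear_positions(1 / 60, REFRESH_S / 2))   # [0.5, 0.5, 0.5, 0.5]
# 59fps cap: the tear line slowly crawls down the screen instead
print(tear_positions(1 / 59, 0.0))
```

Capping at exactly 60fps just parks the tear in one place; capping slightly off 60 makes it drift. Only syncing the swap to the refresh removes it.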
 
I just spent about 15 minutes with it on in CSS and saw nothing wrong at all so it must be something on your end. and as already mentioned just turn it on in the game profiles for the games that you want to use it in.

You can play CS:S on onboard graphics.. Try something that's going to stress the hardware a bit.
 
I just spent about 15 minutes with it on in CSS and saw nothing wrong at all so it must be something on your end. and as already mentioned just turn it on in the game profiles for the games that you want to use it in.

It's nothing on my end. Have tried it on a few different systems. It makes CS:S unplayable on them all. And I understand about the different profiles, I just wanted to see would it be better than Vsync and it isn't, at least not for Counter strike source. If you couldn't notice the difference between adaptive sync on and off in cs:s then you must just have really slow reaction times or else there is something wrong on your end.
 
You can play CS:S on onboard graphics.. Try something that's going to stress the hardware a bit.
The original poster claimed there was an issue with CS:S. The poster you responded to then claimed he had no issue with CS:S. Now you're suggesting that he try to find a potential fault in CS:S by playing a different game?
 
You can play CS:S on onboard graphics.. Try something that's going to stress the hardware a bit.
um you might want to look again at the context of my comment. it was a direct reply to someone having an issue with adaptive vsync with CSS.
 
It's nothing on my end. Have tried it on a few different systems. It makes CS:S unplayable on them all. And I understand about the different profiles, I just wanted to see would it be better than Vsync and it isn't, at least not for Counter strike source. If you couldn't notice the difference between adaptive sync on and off in cs:s then you must just have really slow reaction times or else there is something wrong on your end.
you claimed "Adaptive Vsync just turns into a stuttering lag fest and it's even worse than normal sync" so I am pretty sure I could detect that. for me it was perfectly smooth with adaptive vsync just like it was with regular vsync.
 
How can Adaptive Vsync even be worse than regular Vsync? It's just Vsync but turned on only over 60 FPS, which in CS Source should be pretty much all the time unless you're playing it on a Pentium III with a GF2...
 
How can Adaptive Vsync even be worse than regular Vsync? It's just Vsync but turned on only over 60 FPS, which in CS Source should be pretty much all the time unless you're playing it on a Pentium III with a GF2...
The sad part is that I'm pretty sure a P3 with a GF2 could still play CS:S, too.
 
This adaptive v-sync option is not working for me. I tried running Mass Effect 3 and Saints Row the Third and both those games are showing tearing with AVS on. Am I doing something wrong? Currently running the beta 301.24 drivers on my 680.
 
This adaptive v-sync option is not working for me. I tried running Mass Effect 3 and Saints Row the Third and both those games are showing tearing with AVS on. Am I doing something wrong? Currently running the beta 301.24 drivers on my 680.

Tearing is normal when your FPS is below 60. This is why AVS is mostly useless to me since I'm very sensitive to tearing and it shows up as soon as FPS dips even a little.
 
This adaptive v-sync option is not working for me. I tried running Mass Effect 3 and Saints Row the Third and both those games are showing tearing with AVS on. Am I doing something wrong? Currently running the beta 301.24 drivers on my 680.
are you sure you dont have conflicting global or game profile settings?
 
Tearing is normal when your FPS is below 60. This is why AVS is mostly useless to me since I'm very sensitive to tearing and it shows up as soon as FPS dips even a little.
unless he is running surround at a massive resolution there is no way he can't maintain 60fps in Mass Effect 3 with a GTX 680 though.

EDIT: some cutscenes in ME 3 may be 30 fps so maybe that is where he is seeing tearing.
 
I'm running the games at 2560X1440. My rig consists of an Intel i5 2550K (OCed to 4.8GHz), 8GB DDR3 and an EVGA GeForce 680 SC. Mass Effect 3 and Saints Row 3 run silky smooth with vsync enabled. With adaptive vsync I get screen tearing in Mass Effect 3 just by standing still and panning the camera around. Very strange.

Oh and I checked Global setting vs Program settings and everything matches.
 
No, not strange at all. Your framerate probably gets below 60FPS which means vsync gets turned off and you'll get tearing.
Try the half refresh rate setting. If that eliminates tearing then you're definitely below 60FPS.
Or are you perhaps using a 120Hz monitor?
 
If you are seeing tearing with Adaptive Vsync on, you are below the refresh rate of your monitor. It is that simple. Use FRAPS if you think your FPS isn't that low.
 
ouch this article... & this thread.... only some people are correct

no mention of triple buffering is extremely disappointing, almost like nvidia told people to not mention it

there's a big difference between 60, 30, 60, 30, 30, 60, 30, 60, 30, 30, 60, 60, 30, 60, 30, 60, 30, 30, 60 & a solid 30 that the charts showed or what some people say happens (which should only happen when vsync is double buffered, not triple, or something horribly went wrong with the game engine)
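that alternation is easy to model (an idealized sketch of my own, not measured data; real engines and queue depths vary):

```python
import math

REFRESH = 1 / 60   # 60 Hz refresh interval

def vsync_fps(render_s, buffers):
    """Idealized displayed framerate under vsync. With two buffers the
    GPU stalls until the flip, so every frame costs a whole number of
    refresh intervals (a 20ms render -> locked 30fps). With a third
    buffer the GPU keeps rendering, so delivery mixes 16.7ms and 33.3ms
    frames and averages near the true render rate."""
    if buffers == 2:
        return round(1 / (math.ceil(render_s / REFRESH) * REFRESH), 1)
    return round(min(1 / render_s, 1 / REFRESH), 1)

print(vsync_fps(0.020, buffers=2))   # 30.0 - double buffering locks to 30
print(vsync_fps(0.020, buffers=3))   # 50.0 - triple buffering averages ~50
```

the 60/30/60/30 sequence above is just what that ~50fps average looks like frame by frame.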

the easiest way to test all of this is to fire up a source engine game that has both triple buffering & fps_max (not left 4 dead), what should be seen is:

TB vsync on-fps_max > 60, great input lag, ultra smooth no tearing
TB vsync on-fps_max 60, little input lag, ultra smooth no tearing (there is lower input lag yes)
TB vsync on-fps_max < 60, almost no input lag, a little juddery no tearing
vsync off-fps_max 60, always tearing in 1 spot that may or may not slowly move across the screen (you can try 59 & 61 to watch the tear spot change position faster)
vsync off-fps_max any other, no input lag, always tearing (since it's in pretty fast motion, the tears turn into a similar judder as TB vsync when fps < 60)

now for the input lag equation, after about 40ms, things become a real struggle to control (especially mouse cursors), now ALL LCDs have input lag right? but not all are the same amount, so if vsync adds for example 25ms, a 50ms monitor would be pretty unplayable compared to a 20ms monitor with the exact same vsync delay

another thing to note, around catalyst 9.7, input lag was reduced for me on a 4870x2, so l4d is now fairly playable compared to before

should mention, a-vsync is a cool option, the more options the better (a related & to me a more important option would be a permanent cap to a user set number like 60, there is really no reason for any game menu to sit there at 3,000fps 100% gpu usage, plus this should reduce input lag with vsync, you can just use bandicam or dxtory to do this for now)

edit: forgot 1 more thing that adds input lag, prerendered frames/render ahead
 
kn00tcn: so true, QFT

I was reading this stuff and getting annoyed.

How can there be NO talk of triple buffering and the fact that a lot of games (esp console ports) do not support it. Triple buffering was "invented" a long long time ago and is the solution for this. The 30/60 fps thing is because the game is only double buffering.
Enabling triple buffering in the driver options does not successfully force it on in most games that do not support it, yet there are other tools (e.g. D3Doverrider) that are much more successful. The driver should be able to force this.

The other thing not tested or spoken about (quickly dismissed at the beginning of this discussion thread) is the potential future of widespread 120 Hz as that will alleviate the problem greatly - as you have double the frequency of vsyncs the card can hit - every 8ms vs 16ms. Though this seems like a ways off yet.

I detest tearing and besides benchmarking see no use for it. Effectively it is a broken system. How can you view an image that isn't even complete? You might be seeing mostly an old frame and 20 rows of pixels of the next frame that has been "torn" in.

With vsync off (i.e. adaptive vsync @ < 60fps) you WILL get tearing. No ifs or buts. It will happen.

Plus then, titles that do support triple buff properly will suddenly get tearing (just tested this with Crysis 2).

I am glad they (NV) are thinking about it though, I just think the option is rubbish and gimmicky right now.

How about some comparison to Lucid's vsync/hyperformance. That seems a more novel approach to tackling the vsync / input lag problem, even if it is on paper.
 
Guys, please educate yourselves. There is no such thing as triple buffering in DX. We have been over this before. The DX implementation is just a flip queue. It is NOT triple buffering. Some DX games have real triple buffering available because the developers saw fit to add it as an extra. Just because there is an in-game option doesn't mean that it has real triple buffering though. They could just be switching on the DX implementation.

There is no FORCING triple buffer in DX for a game that does not have it. There is forcing a render ahead that stops the harsh frame drops, but it adds latency.

http://www.anandtech.com/show/2794/1

Derek Wilson
UPDATE: There has been a lot of discussion in the comments of the differences between the page flipping method we are discussing in this article and implementations of a render ahead queue. In render ahead, frames cannot be dropped. This means that when the queue is full, what is displayed can have a lot more lag. Microsoft doesn't implement triple buffering in DirectX, they implement render ahead (from 0 to 8 frames with 3 being the default).

The major difference in the technique we've described here is the ability to drop frames when they are outdated. Render ahead forces older frames to be displayed. Queues can help smoothness and stuttering as a few really quick frames followed by a slow frame end up being evened out and spread over more frames. But the price you pay is in lag (the more frames in the queue, the longer it takes to empty the queue and the older the frames are that are displayed).

In order to maintain smoothness and reduce lag, it is possible to hold on to a limited number of frames in case they are needed but to drop them if they are not (if they get too old). This requires a little more intelligent management of already rendered frames and goes a bit beyond the scope of this article.

Some game developers implement a short render ahead queue and call it triple buffering (because it uses three total buffers). They certainly cannot be faulted for this, as there has been a lot of confusion on the subject and under certain circumstances this setup will perform the same as triple buffering as we have described it (but definitely not when framerate is higher than refresh rate).

Both techniques allow the graphics card to continue doing work while waiting for a vertical refresh when one frame is already completed. When using double buffering (and no render queue), while vertical sync is enabled, after one frame is completed nothing else can be rendered out which can cause stalling and degrade actual performance.
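The distinction Wilson draws reduces to which completed frame the next refresh shows. A toy sketch (hypothetical frame ids, not an actual API):

```python
from collections import deque

def next_displayed(completed, policy):
    """completed: ids of finished, not-yet-shown frames, oldest first.
    Render-ahead (the DX 'flip queue') is strict FIFO: the oldest frame
    is shown and nothing is ever dropped, so a full queue means extra
    refreshes of latency. True triple buffering shows the newest frame
    and discards the stale ones, keeping latency low."""
    q = deque(completed)
    if policy == "render_ahead":
        return q.popleft()   # oldest frame: extra frames of lag
    return q.pop()           # newest frame: stale ones are dropped

frames = [7, 8, 9]           # three frames finished since the last refresh
print(next_displayed(frames, "render_ahead"))  # 7
print(next_displayed(frames, "triple"))        # 9
```

Same buffer count, opposite latency behavior, which is why "it uses three buffers" doesn't tell you which one a game actually implements.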
 
After some time using Adaptive VSync, and testing other options regarding forcing "Triple Buffering" (or whatever you want to call it), I've decided to just cap my framerate at 60 (or slightly below 60) and live with the moderate tearing for now. It seems like all other options lead to input lag for me. [s]I may also go back to VSync + TB + framerate cap but it seems like there is even some noticeable input lag on that.[/s]

EDIT: VSync in-game + TB forced + framerate cap = almost no input lag, and so far that is the ONLY combination I've found (other than VSync completely off) that does it.
 
EDIT: VSync in-game + TB forced + framerate cap = almost no input lag, and so far that is the ONLY combination I've found (other than VSync completely off) that does it.

Agreed, though I have no experience with Adaptive VSync, I'm sure disabling VSync below 60fps is something I don't want. Everything else I've tried feels like a compromise. It may be a slight PITA to set up for each game, but, IMHO it's worth it.
 
is there some cons about this technology?
should we enable it always when we need vsync?
the only con is that there will still be tearing below your refresh rate. if you want no tearing at all in a game where you still dip below refresh rate then use regular vsync.
 
Can you elaborate more please?

How can you see tearing if you are under the monitor refresh rate?
please don't tell me you actually think tearing only occurs above refresh rate. I just don't get why people say something like that when you can fire up a game and see tearing at ANY framerate. if you don't notice it below refresh rate then how do you notice it at all?
 
How can you see tearing if you are under the monitor refresh rate?
Because frames aren't being swapped in sync with the swap interval. Tearing is going to happen at the monitor's refresh rate, below it and above it without vertical synchronization.
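A quick sketch of why (toy model; `phase_s` stands in for the arbitrary offset between the game's swaps and the monitor's scanout):

```python
REFRESH_S = 1 / 60   # 60 Hz scanout period

def tearing_swaps_per_sec(fps, phase_s=0.004, seconds=10):
    """With vsync off, buffer swaps land at arbitrary points in the
    scanout. Every swap outside a refresh boundary splits the image
    between two frames, so tearing happens at ANY framerate."""
    tears = 0
    for i in range(int(seconds * fps)):
        t = phase_s + i / fps
        m = t % REFRESH_S
        if min(m, REFRESH_S - m) > 1e-9:   # swap lands mid-scanout -> tear
            tears += 1
    return tears / seconds

print(tearing_swaps_per_sec(45))    # 45.0: every swap tears below refresh too
print(tearing_swaps_per_sec(200))   # 200.0: multiple tears per refresh above it
```

Below refresh you get fewer tears per second, which is why some people never notice them, but they are still there.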
 
if you don't notice it below refresh rate then how do you notice it at all?

Those who say tearing doesn't happen below refresh rate have not played a game in which it happens noticeably. Some games don't tear noticeably below refresh rate, and Adaptive Vsync is great for those, but some tear really badly, which I've seen myself, and you shouldn't use AV for those.
 
Those who say tearing doesn't happen below refresh rate have not played a game in which it happens noticeably. Some games don't tear noticeably below refresh rate, and Adaptive Vsync is great for those, but some tear really badly, which I've seen myself, and you shouldn't use AV for those.
yes it comes down to the game as some noticeably tear while others don't. still we have gamers that have been playing for years and still act like tearing only happens above refresh rate and that drives me nuts.
 