NVIDIA Adaptive VSync Technology Review @ [H]

A lot of games let you do that. You can set a frame rate cap and the game will keep the frame rate around that number. This allowed you to simply disable V-Sync and not experience tearing. Doom 3 and Quake IV were two notable examples of this; I believe you could do it in Unreal Tournament 2004 as well.

The EVGA Precision tool has settings like that. 100 Hz with a 50 fps cap works very well for gaming on my 560 Ti 448.
 
A lot of games let you do that. You can set a frame rate cap and the game will keep the frame rate around that number. This allowed you to simply disable V-Sync and not experience tearing. Doom 3 and Quake IV were two notable examples of this; I believe you could do it in Unreal Tournament 2004 as well.
Why do some people keep saying that, though? Tearing can occur at any framerate, not just when you exceed the refresh rate.
 
Why do some people keep saying that, though? Tearing can occur at any framerate, not just when you exceed the refresh rate.

True. It's usually more frequent and more obnoxious at high framerates than at low ones though.

That, and I think we all agree that at high framerates we are willing to give up some FPS to avoid tearing, while at low FPS we would rather keep the framerate and deal with the tearing.
 
570 SLI user here; so far, no issues that I can pinpoint as being Adaptive VSync related.

So far the experience has been very good. I don't have to bother with forcing triple buffering and frame caps now, which is nice. Very smooth gameplay experience. As others have said, if you look closely you can sometimes find tearing below 60 FPS/Hz, but it's minimal and not noticeable unless you specifically try to get it to happen.
 
Why do some people keep saying that, though? Tearing can occur at any framerate, not just when you exceed the refresh rate.

Fair enough, but it's usually not noticeable. At least I rarely, if ever, see it under those circumstances. In any case, adaptive vsync minimizes the crap out of it.
 
If the framerate drops even just a little below 60 FPS, VSync will drop all the way from 60 FPS to 30.

I'm not sure it drops in half. I think it's more like 60-45-30-20-15.

Not sure, though.
 
As some have said in this thread, it would have been nice to get a bit more information about input lag with this technology.

Not having any input lag is far more important to me than avoiding some tearing. I normally can't stand playing with Vsync at all because input lag is just so noticeable to me.
Logically, there should be input lag with this technology at framerates over 60, as vsync is on at that point.
 
I'm curious about the "30 Hz" option. Maybe it could be used to save power in games that don't have fast motion.
 
It's half: 60-30-15.

Wouldn't triple buffering help framerates between 30 and 60?

I haven't slept in a couple of days so bear with me.

OK, I think I get it; I was thinking average FPS or something... OK, I need a Red Bull...
 
Thank you for this amazing article. My next purchase will be more informed thanks to you.

As one of those people that is highly bothered by tearing, I am happy to see a solution that fits my needs.
 
I'm curious about the "30 Hz" option. Maybe it could be used to save power in games that don't have fast motion.
See below. It may be 30 Hz for 60 Hz users, but of course that would be 60 Hz for 120 Hz users using the half refresh rate setting.

I just tried the vsync half refresh rate option in Alan Wake to see how that would go, and I am very impressed. I can only average around 45 fps on the settings I am using (highest, with 2x AA) and probably only hit 60 fps about 10-15% of the time. Adaptive vsync obviously can only stop the tearing when I hit 60 fps, so I gave the half refresh rate setting a spin. I figured 30 fps would be abysmal, but it was just fine for the 15 or so minutes that I tried it. Granted, I am using a controller and this is not BF3 by any means, so 30 fps is fine for this type of game.

EDIT: I tried the half refresh rate option in Metro 2033 and it was horrible and choppy just panning the view around. Maybe it seemed okay in Alan Wake because the area I tested in was really dark. Anyway, the half refresh setting should only be a last resort on 60 Hz monitors, as 30 fps is probably going to look and feel like crap in most games.
 
I don't see why you would need it. Triple buffering is helpful when you dip below the refresh rate with vsync on, but with adaptive vsync, vsync disables itself below your refresh rate anyway.

I'm surprised there's not a single mention of triple buffering in the entire article. :confused: Most of the time, triple buffering provides nearly the same performance as adaptive Vsync, but without any tearing regardless of your frame rate.

I wish Nvidia had worked with developers of Direct3D titles to make triple buffering an option in games instead. Too many games still lack this option, forcing you to enable it via D3DOverride or similar.

The only drawback with triple buffering is that it can introduce some input lag, which can be a problem in fast-paced twitch shooters. However, in such games you'd just disable VSync altogether to make sure you're squeezing every last frame out of your system.
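
For anyone curious about the numbers, here is a toy simulation - just an illustration under simplified assumptions (constant 20 ms render time, 60 Hz display), not anything from the article or the driver - of why double-buffered vsync lands at ~30 fps while triple buffering stays near 50 fps:

Code:
# Toy simulation: effective frame rate with double- vs. triple-buffered vsync
# on a 60 Hz display when every frame takes a constant 20 ms to render.
# (Real games have variable frame times; this only shows the mechanism.)

REFRESH_HZ = 60.0
PERIOD_MS = 1000.0 / REFRESH_HZ


def double_buffered_fps(render_ms, duration_ms=1000.0):
    # Double buffering: after finishing a frame, the GPU must wait for the next
    # vblank to swap before it can start rendering the following frame.
    t, frames = 0.0, 0
    while t < duration_ms:
        t += render_ms                               # render the frame
        t = (int(t / PERIOD_MS) + 1) * PERIOD_MS     # stall until the next vblank
        frames += 1
    return frames * 1000.0 / duration_ms


def triple_buffered_fps(render_ms, duration_ms=1000.0):
    # Triple buffering: the GPU renders back to back into a spare buffer, and
    # each vblank displays the newest completed frame (if one has finished).
    finish_times, t = [], 0.0
    while t < duration_ms:
        t += render_ms
        finish_times.append(t)
    shown, newest_shown, v = 0, -1, PERIOD_MS
    while v < duration_ms:
        newest_ready = sum(1 for f in finish_times if f <= v) - 1
        if newest_ready > newest_shown:              # a new frame is ready: show it
            shown, newest_shown = shown + 1, newest_ready
        v += PERIOD_MS
    return shown * 1000.0 / duration_ms


print(double_buffered_fps(20.0))   # ~30 fps: every frame sits out a whole refresh
print(triple_buffered_fps(20.0))   # ~49 fps: only the occasional refresh repeats a frame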
 
Kyle/Brent: do you feel Adaptive VSync would tip the scale on gameplay experience when comparing AMD vs. Nvidia?
 
Can someone explain to me why I have never seen this drop-to-30-fps phenomenon with vsync enabled? I have used vsync for as long as I can remember. I have a GTX 680 now; before that, tri-SLI GTX 280s. I don't use triple buffering, and I always ran Precision on my G15 LCD to monitor fps and never saw dips to 30 fps when running vsync.

Also curious to me: I've always had tearing without vsync even at sub-refresh-rate (60 Hz) framerates. This baffles me, as the article clearly states that this does not happen, when in fact I experience it. It is part of the reason why adaptive vsync is absolutely terrible for me in games like Mass Effect 3 that use prerendered FMV cutscenes. The FMV scenes tear like crazy.

I don't get it.
 
Can someone explain to me why I have never seen this drop-to-30-fps phenomenon with vsync enabled? I have used vsync for as long as I can remember. I have a GTX 680 now; before that, tri-SLI GTX 280s. I don't use triple buffering, and I always ran Precision on my G15 LCD to monitor fps and never saw dips to 30 fps when running vsync.

Also curious to me: I've always had tearing without vsync even at sub-refresh-rate (60 Hz) framerates. This baffles me, as the article clearly states that this does not happen, when in fact I experience it. It is part of the reason why adaptive vsync is absolutely terrible for me in games like Mass Effect 3 that use prerendered FMV cutscenes. The FMV scenes tear like crazy.

I don't get it.
I hate to keep repeating this, but I know at this point people are not going to read every post in a big thread. Some games already have triple buffering and enable it automatically when you select vsync. In most other cases, the framerate you see on screen does not tell the full story. If you look at the frame time logs, you will see that a lot of time is indeed spent at 30 fps if you can't maintain 60 fps in a game using vsync without triple buffering.
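
As a rough sketch of that kind of log check, assuming a plain text log with one frame time in milliseconds per line (the file name and format here are made up for illustration):

Code:
# Count how many logged frames sit at the 60 fps and 30 fps vsync steps.
# Assumes "frametimes.txt" holds one frame time in milliseconds per line
# (a made-up format; adapt it to whatever your logging tool writes out).

PERIOD_MS = 1000.0 / 60.0          # one refresh interval on a 60 Hz display

with open("frametimes.txt") as log:
    times = [float(line) for line in log if line.strip()]

at_60 = sum(1 for t in times if abs(t - PERIOD_MS) < 2.0)      # ~16.7 ms frames
at_30 = sum(1 for t in times if abs(t - 2 * PERIOD_MS) < 2.0)  # ~33.3 ms frames

print("frames at ~60 fps:", at_60)
print("frames at ~30 fps:", at_30)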
 
Yeah, I don't see how tearing could somehow be magically gone with fps less than the refresh rate either.

The whole point of vsync is to *synchronize* the frames the video card sends with the monitor's refresh. That's part of the reason you often get noticeable input lag with vsync, since it holds back frames, causing a 'delay' between mouse/camera movement and the actual change in game. Without vsync there is no synchronization; the video card just spits out frames as soon as it has made them. If a new frame arrives while the monitor is partway through a refresh, there will be tearing. I can see how the tearing would be worse at higher fps, but it should still be there even with fps below the monitor's refresh.
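
A small sketch of why the tear position wanders even below the refresh rate - example numbers only (60 Hz display, 1080 visible lines, a steady 50 fps with vsync off), and it ignores the blanking interval:

Code:
# Where a tear lands when the buffer swap happens mid-scanout. Example
# numbers only: 60 Hz display, 1080 lines, a game at a steady 50 fps with
# vsync off; the brief blanking interval between refreshes is ignored.

REFRESH_HZ = 60.0
LINES = 1080
SCANOUT_MS = 1000.0 / REFRESH_HZ   # time to draw one frame top to bottom


def tear_line(swap_time_ms):
    """Scanline being drawn at the moment the new frame replaces the old one."""
    into_refresh = swap_time_ms % SCANOUT_MS
    return int(into_refresh / SCANOUT_MS * LINES)


for n in range(1, 5):
    swap = n * 1000.0 / 50.0       # 50 fps: a new frame every 20 ms
    print("swap at %5.1f ms -> tear near line %d" % (swap, tear_line(swap)))

# The swaps land at a different point in the scanout every time, so the tear
# wanders up and down the screen even though 50 fps is below the 60 Hz refresh.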
 
Can someone explain to me why I have never seen this drop-to-30-fps phenomenon with vsync enabled? I have used vsync for as long as I can remember. I have a GTX 680 now; before that, tri-SLI GTX 280s. I don't use triple buffering, and I always ran Precision on my G15 LCD to monitor fps and never saw dips to 30 fps when running vsync.

Also curious to me: I've always had tearing without vsync even at sub-refresh-rate (60 Hz) framerates. This baffles me, as the article clearly states that this does not happen, when in fact I experience it. It is part of the reason why adaptive vsync is absolutely terrible for me in games like Mass Effect 3 that use prerendered FMV cutscenes. The FMV scenes tear like crazy.

I don't get it.

I would suggest that you do not actually have VSync enabled, or you are just not sensitive to frame rate fluctuation. Some folks simply are not sensitive to changes in frame rate while others are.
 
I've left adaptive vsync on since the driver came out. 120 Hz, 2x 460s. It works, it's nice, it doesn't get in the way. I play some older games that blow way past 120 fps, wasting power for no real reason, so in those cases it's nice to have this enabled globally.
 
You can't always trust frame counters. A frame counter is not instantaneous; it averages over multiple frames, so the rate it displays is not necessarily representative of current frame times. If a counter averages two frames, for instance, and the first took 16.7 ms while the second took 33.4 ms, it could read about 45 fps for that instant (the average of the two per-frame rates), even though only about 40 frames' worth of time was actually being rendered per second. What the counter reports depends not only on the frame times but on how they are averaged.
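
A quick sketch of that difference using the same two frame times:

Code:
# The same two frame times read very differently depending on how they are averaged.

frame_times_ms = [16.7, 33.4]      # one fast frame, one slow frame

# Averaging each frame's instantaneous rate (what some counters effectively do):
avg_of_rates = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

# Frames actually delivered per second over the whole interval:
true_rate = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(round(avg_of_rates, 1))      # ~44.9 "fps" shown on the counter
print(round(true_rate, 1))         # ~39.9 fps actually delivered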


Monitors do not tear. Frames tear as a result of a lack of synchronization between the GPU sending data to the display and the display's refresh. Thus, the type, make, model, or frequency of the display has no bearing on which frames will be torn and which will not.


Carmack was reporting on NVIDIA's OpenGL swap-tear extension (the predecessor to adaptive vsync) prior to the introduction of Virtual Vsync.


Not quite. 60/30/~20/~15/~12/~10/etc. Each miss adds one cycle of latency; it doesn't double the latency.
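
To make the step pattern concrete, here is a minimal sketch of the arithmetic, assuming a 60 Hz display and plain double-buffered vsync (a frame that misses a refresh simply waits for the next one):

Code:
# Effective frame rate with double-buffered vsync on a 60 Hz display:
# each frame occupies a whole number of refresh intervals, so the rate
# steps through 60, 30, 20, 15, 12, 10, ... rather than halving each time.
import math

REFRESH_HZ = 60.0
PERIOD_MS = 1000.0 / REFRESH_HZ

for render_ms in (15.0, 18.0, 35.0, 55.0, 70.0):
    intervals = math.ceil(render_ms / PERIOD_MS)   # refreshes each frame spans
    print("%4.0f ms render -> %4.1f fps" % (render_ms, REFRESH_HZ / intervals))
# 15 ms -> 60.0, 18 ms -> 30.0, 35 ms -> 20.0, 55 ms -> 15.0, 70 ms -> 12.0
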
Sorry, but I believe that is not always true. Of course it's the game itself that tears, but the tearing can sometimes appear different from monitor to monitor. Not only does that seem plausible, since monitors' specs can vary wildly, but I have tested it plenty of times myself.
 
All this vsync and tearing talk makes me think I need to re-evaluate my priorities... I've never, ever actually noticed tearing in a game while I was playing it :eek:
 
Sorry, but that is false. Of course it's the game itself that tears, but the tearing can sometimes appear different from monitor to monitor. Not only does that seem like common sense, because monitors' specs can vary wildly, but I have tested it plenty of times myself.
Note the word you used: "appear". Tearing can appear less significant depending on the display. That was not your original claim.
 
Zarathustra[H];1038618702 said:
So what does the GTX 680 do when Adaptive VSync stops it from going above 60 fps?

Does that dynamic clock rate and voltage wizardry kick in, and help it save even more power, or does it stay at the same settings and just save power through less load?

I believe Kyle mentioned this in his article, but yes, it does. In CS:S, for example, I see the 680 hovering around 600 MHz on the core.
 
I've been trying this out on my GTX 470 the last few days and have been very impressed. ME3, the CODs, and Borderlands have all gotten a boost.

It's also worth mentioning, for those of us with space-heater Fermis, that this has made a noticeable difference in noise and heat output in some of these games. Witcher 2 is next.
 
Switching to full (non-adaptive) vsync, this tearing no longer happens, even though the FPS drops to 30 during these FMV scenes according to monitoring software. I really don't understand how it can be claimed this doesn't happen when it does. Is something wrong on my end, or is the statement incorrect?
I'm not sure I'm following you. What was claimed doesn't happen? The drop from 60 fps to 30 fps?
 
I'm not sure I'm following you. What was claimed doesn't happen? The drop from 60 fps to 30 fps?

Page 1 of the article (7th paragraph from the top, 3rd paragraph under the adaptive vsync header):

The result is a visually distorted image that can bother gamers. Note that tearing only occurs if the framerate exceeds the refresh rate, so if all your performance is under 60 FPS, you won't see tearing even with VSync off. Or, if you have one of those new 120Hz displays, you generally won't see tearing, as long as the game isn't rendering over 120 FPS.

I'm questioning this statement because I do not believe it to be true given my experience. "VSync off" would also apply to situations where adaptive vsync is enabled but the actual frame rate is lower than the refresh rate, causing the non-vsync state. Also note that in my posts I was discussing two different topics. Sorry for any confusion.
 
Yes, that statement is not accurate. In fact, even a perfectly consistent frame rate identical to the display's refresh rate will yield tearing if vsync is not enabled, unless you happen to be incredibly fortunate and picked just the right moment to launch the game. The odds of that are roughly 1 in however many lines the display has, so... pretty unlikely.
 
I've got a tri-SLI setup using some 3GB 580 cards, and the only game I've tested Adaptive VSync and the Framerate Limiter on is Mass Effect 3... with poor results. Now, I only tested the MP portion of the game, but I found that when the stock game VSync and Smooth Framerate are disabled, both Adaptive VSync and the Framerate Limiter result in massive stuttering and performance drops.

For some reason, with Adaptive Vsync, the constant on-off of Vsync renders the game almost unplayable to me. The framerate tends to fluctuate between 60 and 54 when running around, which is where having Adaptive Vsync on makes the game stutter like mad.

The Framerate Limiter was a whole 'nother mess. The GPU clocks are adjusted when the framerate is capped at 60, but the change in clock speed and voltage isn't as instantaneous as with Adaptive VSync, resulting in periods where my framerate drops below 20 FPS for a few seconds.

I haven't had much chance to test other games, but from what I can tell, Framerate Limiter is great for older games that you know you'll be going over 60 FPS in, and Adaptive Vsync is better for new games when you know your performance is going to dip below 60 frequently. As far as ME3 is concerned, just leave the stock Vsync and Framerate Smoothing on if you want the best experience.

Is there some sort of difference between the FPS Limiter and Adaptive Vsync for the 6 series cards versus the 5/4 series?
 
Kyle/Brent: do you feel Adaptive VSync would tip the scale on gameplay experience when comparing AMD vs. Nvidia?

I would give AMD 3-6 months before we see them add adaptive vsync to the Catalyst control panel, since they already have the code in the driver (for a certain game), just not exposed.
 
Just wondering why anyone would use this when it appears Virtual Vsync does the same thing with less overhead when HyperFormance is also enabled.
 
Page 1 of the article (7th paragraph from the top, 3rd paragraph under the adaptive vsync header):



I'm questioning this statement because I do not believe it to be true given my experience. "VSync off" would also apply to situations where adaptive vsync is enabled but the actual frame rate is lower than the refresh rate, causing the non-vsync state. Also note that in my posts I was discussing two different topics. Sorry for any confusion.

Yes, that statement is not accurate. In fact, even a perfectly consistent frame rate identical to the display's refresh rate will yield tearing if vsync is not enabled, unless you happen to be incredibly fortunate and picked just the right moment to launch the game. The odds of that are roughly 1 in however many lines the display has, so... pretty unlikely.

Technically, you are correct: screen tearing can occur below the refresh rate, but it is much less likely to be noticed. The focus of Adaptive VSync is to tackle tearing above the refresh rate and eliminate stuttering below the refresh rate compared to traditional VSync. The wording of that statement has been changed to make it clearer; sorry for the confusion.
 
Haven't had time to go through the thread, but my experience has been great using the newest NV beta drivers and a 560 Ti with a Dell 2407WFP (1920x1200, 60 Hz, S-PVA). I've only had time to test two games:

BF3 multiplayer (1920x1200, in-game settings all high except shadows on medium, motion blur off, AA off, 16x AF, AO off) - I used to play with vsync off and would get lots of tearing as my FPS jumped from the 40s to the 80s depending on the scene. With adaptive vsync on, the overall experience is much, much smoother with better resulting image quality. I'm also finding it easier to spot enemies while moving, as the image is sharper. I still notice some tearing below 60 fps, but the overall gaming experience is significantly improved. Very happy NVIDIA added this feature to their older cards.

TF2 multiplayer (1920x1200, all in-game settings maxed except motion blur off, 8x MSAA, 16x AF) - although this is not a demanding game, there are still points where the FPS will drop below 60 during intense firefights when I have AA on. I used to play with vsync and triple buffering forced on; while it was totally playable, the drops to 45 fps were jarring. With adaptive vsync, it is smoother.

Haven't noticed any input lag issues yet but I dunno how sensitive I am to that, as my reaction time is nowhere near as good as it was during the Quake 3/RA3 and CS 1.6 clan days.

Curious to try out Virtual Vsync once I get a Z77 platform PC in a few weeks.
 
I have been using it since I got my 680s and it works great. Games feel smoother. I never realized how annoying the tearing was until I tried adaptive vsync, and noticed how much smoother the screen images were. Now it would be hard to go back.
 
I believe Kyle mentioned this in his article, but yes, it does. In CS:S, for example, I see the 680 hovering around 600 MHz on the core.

As I recall, Kyle said it saves power and measured how much power it saves, but he didn't go into the why and how too much. I could have missed it if he did.

Either way, this is awesome. I just got mine this morning. Can't wait to install after work!
 
Technically, you are correct, screen tearing can occur below the refresh rate, but it is very less likely to be noticed. The focus of Adaptive VSync is to tackle tearing above the refresh rate and eliminate stuttering below the refresh rate in comparison to traditional VSync. The wording has been changed in that statement to make it more clear, sorry for the confusion.

Tearing below 60 FPS shouldn't yield multiple "tears" in a single refresh (since the system can't render more than one frame per ~16 ms); maybe that's why you find it less noticeable, but it's still there. Eurogamer regularly compares PS3 and Xbox 360 game footage, the latter of which uses a sort of adaptive vsync. Check out their videos (e.g. the Dark Souls console comparison) and I'm sure you'll notice it quite easily. Even with a 120 Hz monitor I would enable adaptive vsync. Whether it's one tear or more, it is detrimental to the gaming experience.

Another way of describing this tech is that it defines a floor below which screen tearing is considered acceptable for the sake of keeping FPS as high as possible. The next step for Nvidia would be to allow users to define that floor themselves - e.g. 30, 60, or 120 - based on the capabilities of the system.
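
Conceptually, that user-settable floor would just be a per-frame check like the sketch below - not NVIDIA's actual driver logic, and floor_fps is the hypothetical setting being asked for:

Code:
# Conceptual sketch of a user-defined "tearing floor" for adaptive vsync.
# Not NVIDIA's implementation; floor_fps is the hypothetical user setting.

def sync_this_swap(last_frame_ms, floor_fps=60.0):
    """Wait for vblank only while the game is keeping up with the chosen floor."""
    return (1000.0 / last_frame_ms) >= floor_fps

print(sync_this_swap(12.0))                    # True: ~83 fps, sync and avoid tearing
print(sync_this_swap(25.0))                    # False: 40 fps, tear rather than drop to 30
print(sync_this_swap(25.0, floor_fps=30.0))    # True: 40 fps is still above a 30 fps floor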
 
Tearing below 60 FPS shouldn't yield multiple "tears" in a single refresh (since the system can't render more than one frame per ~16 ms); maybe that's why you find it less noticeable, but it's still there. Eurogamer regularly compares PS3 and Xbox 360 game footage, the latter of which uses a sort of adaptive vsync. Check out their videos (e.g. the Dark Souls console comparison) and I'm sure you'll notice it quite easily. Even with a 120 Hz monitor I would enable adaptive vsync. Whether it's one tear or more, it is detrimental to the gaming experience.

Another way of describing this tech is that it defines a floor below which screen tearing is considered acceptable for the sake of keeping FPS as high as possible. The next step for Nvidia would be to allow users to define that floor themselves - e.g. 30, 60, or 120 - based on the capabilities of the system.

Agreed, I would like to be able to set the trigger points.
 
Doesn't triple buffering solve the issue of 40 fps becoming 30 fps too?

I'm surprised there's not a single mention of triple buffering in the entire article. :confused: Most of the time, triple buffering provides nearly the same performance as adaptive Vsync, but without any tearing regardless of your frame rate.

I wish Nvidia had worked with developers of Direct3D titles to make triple buffering an option in games instead. Too many games still lack this option, forcing you to enable it via D3DOverride or similar.

The only drawback with triple buffering is that it can introduce some input lag, which can be a problem in fast-paced twitch shooters. However, in such games you'd just disable VSync altogether to make sure you're squeezing every last frame out of your system.

Can you name some? I can't remember a single game that jumps between 60 and 30.
 