NVIDIA Adaptive VSync Technology Review @ [H]

Just installed the 301.24 beta driver yesterday evening. Only game I've had a chance to try out was SW: TFU2. Did notice a smoother gameplay experience with AVS on. Going to try out a few other games over the next couple days and the weekend.

The reason I tested TFU2 is that I was getting pretty moderate frame tearing during cutscenes/cinematics. But not anymore!

Thank you, nVidia. This is awesome!
 
And for that reason, I don't think it will see widespread adoption. I just don't see Lucid staying on top of driver updates as often as they may need to, in addition to it bringing yet another driver update for the end user to consider. It just seems to be dependent on too many variables.

I'm using it now already with my i5 2500K. It works great, even in games that the driver didn't come with a profile for. I've added 2 games so far and both worked flawlessly.

http://3dmark.com/3dm11/3220761
 
I'm sure this was being worked on before you made that post. :D

As has been mentioned before in this thread, people have been doing this manually in game engines that support a max fps configuration for over a decade.

In the Source and GoldSrc engines I believe it was just the "fps_max" console command that would do the trick. You'd just set it to equal your refresh rate, and voila, "dynamic vsync".
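
For what it's worth, all a console fps cap like that boils down to is something like this rough sketch (my own illustration, not actual engine code):

Code:
import time

# Rough sketch of what a console fps cap like "fps_max 60" boils down to:
# pad every frame out to the target frame time with a sleep. Note that
# nothing here ever talks to the display's refresh, which is why a cap
# is not literally the same mechanism as vsync.
TARGET_FPS = 60.0
TARGET_FRAME_S = 1.0 / TARGET_FPS

def render_frame():
    time.sleep(0.005)  # stand-in for ~5 ms of actual game/render work

for _ in range(300):   # run a few seconds' worth of frames
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < TARGET_FRAME_S:
        time.sleep(TARGET_FRAME_S - elapsed)  # wait out the rest of the 16.7 ms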
 
Zarathustra[H];1038624282 said:
As has been mentioned before in this thread, people have been doing this manually in game engines that support a max fps configuration for over a decade.

In the Source and GoldSrc engines I believe it was just the "fps_max" console command that would do the trick. You'd just set it to equal your refresh rate, and voila, "dynamic vsync".

Not sure if there is something more to it than just a framerate limiter, but if not, limiting FPS does not produce the same effect as VSync.
 
Zarathustra[H];1038624282 said:
As has been mentioned before in this thread, people have been doing this manually in game engines that support a max fps configuration for over a decade.

In the Source and GoldSrc engines I believe it was just the "fps_max" console command that would do the trick. You'd just set it to equal your refresh rate, and voila, "dynamic vsync".
how many times does it have to be said that a framerate cap is NOT the same as adaptive vsync?
 
how many times does it have to be said that a framerate cap is NOT the same as adaptive vsync?

It may not technically be the same, but it accomplishes pretty much the same thing.

When I had my framerate capped to the same refresh rate as my monitor I VERY rarely saw any tearing.
 
Zarathustra[H];1038624361 said:
It may not technically be the same, but it accomplishes pretty much the same thing.

When I had my framerate capped to the same refresh rate as my monitor I VERY rarely saw any tearing.
Well, most people, including myself, still see plenty of tearing at or below the refresh rate. If that did not happen then there would be no need to even come up with adaptive vsync, which lets you get rid of the tearing at least some of the time.
 
Zarathustra[H];1038624361 said:
It may not technically be the same, but it accomplishes pretty much the same thing.

When I had my framerate capped to the same refresh rate as my monitor I VERY rarely saw any tearing.

"very rarely" see tearing != no tearing.

In fact, with AVSync you will ONLY see tearing below 60 FPS, because that's when VSync is off. It's not as noticeable as with 60+ FPS and no VSync, but it's most certainly there.

And in my experience, capping the framerate to 60 or even 58-59 does not eliminate tearing even when you are up against that number. In fact, the tearing is even more noticeable because it tends to stick in the same spot on the screen rather than scrolling down the screen like normal tearing without VSync.
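
Here's a toy way to see why (numbers are purely illustrative, not anything from the driver): if the presents run at exactly 60 fps but are out of phase with the 60 Hz scanout, every swap lands at the same point mid-refresh, which is exactly that stationary tear.

Code:
# Toy model: a 60 fps cap on a 60 Hz display, with the game's present
# clock offset from the display's vblank by an arbitrary 7 ms.
REFRESH_HZ = 60.0
SCANOUT_MS = 1000.0 / REFRESH_HZ   # one full top-to-bottom refresh
CAP_FPS = 60.0
FRAME_MS = 1000.0 / CAP_FPS
PHASE_OFFSET_MS = 7.0              # arbitrary phase between game and display

for frame in range(5):
    present_time = PHASE_OFFSET_MS + frame * FRAME_MS
    # fraction of the screen the scanout has covered when the swap happens
    scan_position = (present_time % SCANOUT_MS) / SCANOUT_MS
    print(f"frame {frame}: tear at {scan_position:.0%} of screen height")

# Every frame tears at the same ~42% mark. With vsync the swap would be
# deferred to the vblank interval instead, so no tear would be visible.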
 
Every method has some drawback. I'd also like to correct a few things in this article that weren't accurate.

Vsync eliminates tearing but limits fps to the refresh rate. It also drops the framerate all the way down to half of your refresh rate if your framerate even barely drops below the refresh rate. So even if you can run at 59fps, you'll drop to a solid 30fps on a 60Hz display. Also, if you dip below and then above 60fps often, you'll get perceivable stutters because of this quantization.

Vsync off (immediate present mode) results in tearing that will show up at any frame rate. Technically it can even show up at 60fps because there's no synchronization happening with the display's refresh cycle. So even if you render at 60fps, the display refresh and the frame's present timing are out of phase.

Triple buffering with vsync solves tearing and makes you less likely to drop down to 30fps as fast (i.e. if you have a frame or two that run too long to make the vsync period deadline), so you avoid some of the stutters. It also allows you to run at average frame rates between 30fps and 60fps, but in this case it will do so by having some pattern of 16ms frames (60fps frames) followed by a 33ms frame (30fps frame), repeated over and over. Obviously this will give you a heartbeat-style stutter in the displaying of your frames, even if you're achieving some average framerate between 30fps and 60fps over the course of a second. So even if your average framerate reads 45fps, that is merely an average of higher and lower frame times, not a consistent framerate. Additionally, triple buffering requires an additional back buffer, meaning more video memory is used and there is an extra frame of input latency.
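
To put numbers on that heartbeat pattern (just illustrative arithmetic of mine, assuming a 60Hz panel where a frame is held for either one or two refresh intervals):

Code:
# A "45 fps average" on a 60 Hz display with vsync decomposes into a mix
# of 16.7 ms and 33.3 ms frames -- the average hides the uneven pacing.
REFRESH_MS = 1000.0 / 60.0         # 16.67 ms per refresh interval

fast, slow = 30, 15                # frames held for 1 and 2 intervals
total_frames = fast + slow
total_time_ms = fast * REFRESH_MS + slow * 2 * REFRESH_MS
print(f"{total_frames} frames in {total_time_ms:.0f} ms = "
      f"{1000.0 * total_frames / total_time_ms:.0f} fps average")

# -> 45 frames in 1000 ms = 45 fps, yet every third frame is held twice
# as long as the others, which is the stutter you feel even though the
# fps counter reads a steady 45.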

Adaptive vsync simply dynamically enables vsync when the framerate is able to keep up with the refresh rate and disables vsync when the framerate is lower than the refresh rate to prevent the framerate from dropping all the way to half the refresh rate. The only downside is you get tearing when you're below the refresh rate (or you can set the threshold to be half the refresh rate for 120hz displays).
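
In pseudocode-ish terms, the decision it makes each frame is roughly this (my own sketch of the behavior as described, not NVIDIA's actual driver logic; present_synced/present_immediate are just stand-in names):

Code:
# Adaptive vsync in a nutshell: sync the swap while the renderer keeps up
# with the display, present immediately once it falls behind, so the
# framerate never gets quantized down to half the refresh rate.
def present_adaptive(last_frame_ms, refresh_hz=60.0):
    refresh_interval_ms = 1000.0 / refresh_hz
    if last_frame_ms <= refresh_interval_ms:
        return "present_synced"      # vsync on: wait for vblank, no tearing
    return "present_immediate"       # vsync off: swap now, tearing possible

print(present_adaptive(12.0))  # rendering at ~83 fps -> synced
print(present_adaptive(20.0))  # rendering at ~50 fps -> immediate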

One way to display frames without a perceivable drawback is to have really high refresh rate displays... the higher the better, because the penalty of missing vsync becomes less and less as refresh rates go up. This way you could have vsync on and keep most of your framerate when you missed a vsync interval.
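
To illustrate (again just back-of-the-envelope math of mine, not from the article): with vsync on, a frame is held for a whole number of refresh intervals, so the same small miss costs far less on a faster panel.

Code:
import math

# Displayed rate with vsync = refresh_hz / ceil(render_ms / interval_ms),
# so the penalty for missing a deadline shrinks as the refresh rate rises.
def displayed_fps(render_ms, refresh_hz):
    interval_ms = 1000.0 / refresh_hz
    return refresh_hz / math.ceil(render_ms / interval_ms)

for hz in (60, 120):
    print(f"{hz} Hz: a 17 ms frame displays at {displayed_fps(17.0, hz):.0f} fps")

# 60 Hz:  missing the 16.7 ms deadline by 0.3 ms drops you to 30 fps
# 120 Hz: the same 17 ms frame still displays at 40 fps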

Another way is to have displays that only refresh when signaled that there's a new frame ready, and in the meantime the display would just continue to light up the pixels that were lit up by the last frame... That would be like some intelligent refresh technology instead of blindly updating all the display's pixels on a fixed schedule.
 
Every method has some drawback. I'd also like to correct a few things in this article that weren't accurate.

Vsync eliminates tearing but limits fps to the refresh rate. It also drops the framerate all the way down to half of your refresh rate if your framerate even barely drops below the refresh rate. So even if you can run at 59fps, you'll drop to a solid 30fps on a 60Hz display. Also, if you dip below and then above 60fps often, you'll get perceivable stutters because of this quantization.

Vsync off (immediate present mode) results in tearing that will show up at any frame rate. Technically it can even show up at 60fps because there's no synchronization happening with the display's refresh cycle. So even if you render at 60fps, the display refresh and the frame's present timing are out of phase.

Triple buffering with vsync solves tearing and makes you less likely to drop down to 30fps as fast (i.e. if you have a frame or two that run too long to make the vsync period deadline), so you avoid some of the stutters. It also allows you to run at average frame rates between 30fps and 60fps, but in this case it will do so by having some pattern of 16ms frames (60fps frames) followed by a 33ms frame (30fps frame), repeated over and over. Obviously this will give you a heartbeat-style stutter in the displaying of your frames, even if you're achieving some average framerate between 30fps and 60fps over the course of a second. So even if your average framerate reads 45fps, that is merely an average of higher and lower frame times, not a consistent framerate. Additionally, triple buffering requires an additional back buffer, meaning more video memory is used and there is an extra frame of input latency.

Informative post. So basically Adaptive VSync is overall a better option unless you can't handle the sub-60 FPS tearing.
 
Informative post. So basically Adaptive VSync is overall a better option unless you can't handle the sub-60 FPS tearing.

Exactly. It's a different set of compromises, but from a certain perspective it's as good as it gets without making the displays themselves more high tech. Right now the pixels get updated on a fixed interval of time that probably won't line up with when frames are ready. With vsync on, the display's vsync timing runs the show... it forces the frames to meet a certain deadline; if you miss the bus you have to wait until the next one comes. But given that a game's framerate is inherently dynamic because the workload is constantly changing, I think it makes more sense to have a frame being ready dictate the refresh timing instead of the other way around. Displays aren't capable of any of this trickery today.
 
Another way is to have displays that only refresh when signaled that there's a new frame ready, and in the meantime the display would just continue to light up the pixels that were lit up by the last frame... That would be like some intelligent refresh technology instead of blindly updating all the display's pixels on a fixed schedule.
It's technology we should have seen introduced years ago. Much like adaptive vsync, in fact. These things are difficult to market, though, so they just don't happen.
 
It's technology we should have seen introduced years ago. Much like adaptive vsync, in fact. These things are difficult to market, though, so they just don't happen.

Totally agree. It's also tough because it almost needs to be a standard to take off... No display maker is going to invest in doing this on their own and possibly end up with some extra cost in their product that isn't used.
 
Went to the nVidia site to download it (much faster connection here at work :D ). Something odd - if I punch into the dropdowns "GeForce 400 Series" (Windows 7 64 bit), it brings up a 296 driver, not the 300 series one. The only way I could get to the 300 series driver was to put in "GeForce 600 series".

Will those still work on my sad old 460 based system? I run in 1920x1200 most of the time and do get tearing on the older games I run, so I wanted to try it out, but I'd prefer not to ru-ru my system with the wrong drivers...
 
Went to the nVidia site to download it (much faster connection here at work :D ). Something odd - if I punch into the dropdowns "GeForce 400 Series" (Windows 7 64 bit), it brings up a 296 driver, not the 300 series one. The only way I could get to the 300 series driver was to put in "GeForce 600 series".

Will those still work on my sad old 460 based system? I run in 1920x1200 most of the time and do get tearing on the older games I run, so I wanted to try it out, but I'd prefer not to ru-ru my system with the wrong drivers...
It's a beta driver.
 
Went to the nVidia site to download it (much faster connection here at work :D ). Something odd - if I punch into the dropdowns "GeForce 400 Series" (Windows 7 64 bit), it brings up a 296 driver, not the 300 series one. The only way I could get to the 300 series driver was to put in "GeForce 600 series".

Will those still work on my sad old 460 based system? I run in 1920x1200 most of the time and do get tearing on the older games I run, so I wanted to try it out, but I'd prefer not to ru-ru my system with the wrong drivers...

That driver is limited to the GTX 680s, so it won't work. But there is a 301.24 beta driver for your card. When you go to the download page, just click beta and archived drivers on the left.
 
I experience some input latency with my GTX 480 with Adaptive V-sync on. Is it a deal breaker? Depends on the game. If I was doing competitive gaming it would be.

However, for the majority of other games it's actually pretty awesome. Kudos, Nvidia, way to stay ahead of the curve.
 
Nice article, but I still play with either Vsync On or Vsync Off. Adaptive is pretty much useless for me, even though I have 120Hz displays.

Why? Well, like some have said and even article mentions:

"Note that tearing can technically occur if the framerate doesn't exceed the refresh rate, but it is much less likely to be noticed."

Basically, here are two scenarios for a typical user:

1) If your frames are always lower than your monitor's refresh rate (for typical users it's <60fps), then Adaptive Vsync does absolutely nothing, so you might as well have it turned off, because you'll get plenty of screen tearing either way.

2) Your framerates are always higher than your refresh rate (>60fps). Adaptive Vsync works exactly like regular vsync. Basically you might as well run with vsync on.

Yeah, it's easier to notice tearing at higher fps with vsync off, but it's still very much there and very noticeable at lower fps in most games. I was very disappointed by this feature personally (because it sounds better than it really is). It's just a marketing gimmick really and doesn't solve anything at all for people who are sensitive to screen tearing.

For those who cannot notice the tearing at fps lower than the refresh rate, the option to cap the fps at just below the refresh rate has been there forever and imho is a much better option than this AVsync, since there will never be double buffering, which causes the lag.

The ultimate solution, like Carmack said, is to make smarter displays, which will refresh based on frame availability as signaled by the video card, instead of doing so at their own constant pace (refresh rate). Even if Carmack is not the best PC advocate lately, it doesn't mean he is wrong. He's pretty much spot on.
 
I have been using the Adaptive solution and have found it valuable in my gameplay.
 
1) If your frames are always lower than your monitor's refresh rate (for typical users it's <60fps), then Adaptive Vsync does absolutely nothing, so you might as well have it turned off, because you'll get plenty of screen tearing either way.

2) Your framerates are always higher than your refresh rate (>60fps). Adaptive Vsync works exactly like regular vsync. Basically you might as well run with vsync on.

I agree, the only time it really would seem to help would be in case # 3, where your frame rates are hovering right around the refresh rate - in that case you might want to get rid of the tearing above refresh, but not deal with the hassles of Vsync below. How often that case comes up, though, kind of depends.

However, setting Adaptive on globally does keep you from having to make changes depending on what game you play, so there is benefit there. If your game is always above refresh, then you get Vsync, and if not, you don't. So it helps from a usability standpoint. Now if they could only come up with a way to get the Vsync part without the input lag, we'd be all set.
 
I have been briefly playing some of my games to see how these beta drivers and A-Vsync affect performance. While I haven't made my mind up on the input lag kicking in (I can feel it, but I haven't played enough to know if it really bothers me yet), the overall experience is rather positive. However, in Deus Ex Human Revolution the stuttering seems to be even worse than before, making the gameplay experience rather unpleasant.

EDIT: Also experiencing a lot of tearing in RAGE.
 
You both kind of missed what I was saying, which is that tearing will not always be exactly the same with different monitors. I have looked at it side by side and have seen tearing look different from monitor to monitor. For example, I have seen a game on my CRT have a small, consistent tear in the same place, but a tear that looked like a wave in the same game on the same settings with my LCD.

I guess it ultimately comes down to user experience as I've seen both those kinds of tearing on LCD and CRT TV/monitors.

For example, as I mentioned earlier, on the same HDTV, GTA IV on the Xbox 360 has screen tearing only at the top of the screen, usually in the overscan area, or, with 1:1 pixel mapping, barely noticeable because of where it always occurs. On the other hand, Saints Row 2 on its default settings on the 360, i.e. with V-sync off (this is one of the few console games with such an option), exhibits horrendous 'wave'-like tears. Both games run at 30 fps; only one uses 'soft' v-sync to limit tearing and the other doesn't (though it does have a v-sync option if you can stomach the framerate drops).

I don't know how Rockstar managed to restrict screen tear in GTA IV to the top of the screen but if Adaptive V-Sync worked that way then I would use it.
 
I have been briefly playing some of my games to see how these beta drivers and A-Vsync affect performance. While I haven't made my mind up on the input lag kicking in (I can feel it, but I haven't played enough to know if it really bothers me yet), the overall experience is rather positive. However, in Deus Ex Human Revolution the stuttering seems to be even worse than before, making the gameplay experience rather unpleasant.

EDIT: Also experiencing a lot of tearing in RAGE.
Deus Ex Human Revolution can be a stutterfest in spots for anyone, so don't blame adaptive vsync for that. I stutter like crazy walking around in Detroit in DX9, but it's almost perfectly smooth on max DX11 settings. Really, just messing with random settings can impact the amount of stutter in that game.

And why would you use adaptive vsync in RAGE when it has its own similar smart vsync right in the game? I am pretty sure they recommended not trying to force vsync on or off from your control panel with this game and to just use the game's options.
 
Deus Ex Human Revolution can be a stutterfest in spots for anyone, so don't blame adaptive vsync for that. I stutter like crazy walking around in Detroit in DX9, but it's almost perfectly smooth on max DX11 settings. Really, just messing with random settings can impact the amount of stutter in that game.

And why would you use adaptive vsync in RAGE when it has its own similar smart vsync right in the game? I am pretty sure they recommended not trying to force vsync on or off from your control panel with this game and to just use the game's options.

To clarify -- I was one of those who were experiencing noticeable stutter in Deus Ex when it came out. When playing the game with the new drivers it has become worse than before.

Regarding RAGE, my settings were already done in the Nvidia CP. I just launched the game out of curiosity and those were the results. I usually do not create profiles for games; I like to just have one setting for everything and forget about it, which 99% of the time works fine. I guess this won't be the case with RAGE.
 
To clarify -- I was one of those who were experiencing noticeable stutter in Deus Ex when it came out. When playing the game with the new drivers it has become worse than before.

Regarding RAGE, my settings were already done in the Nvidia CP. I just launched the game out of curiosity and those were the results. I usually do not create profiles for games; I like to just have one setting for everything and forget about it, which 99% of the time works fine. I guess this won't be the case with RAGE.
Yeah, it's not a good idea to globally force settings, as it will cause issues in a few games or other apps. The only thing I globally force is high quality texture filtering, and I clamp the negative LOD bias. For most games I just use the in-game settings, but for a few games I do use FXAA, adaptive vsync and 16x AF. Again, it's best to never force those types of settings globally though. ;)
 
Yeah, it's not a good idea to globally force settings, as it will cause issues in a few games or other apps. The only thing I globally force is high quality texture filtering, and I clamp the negative LOD bias. For most games I just use the in-game settings, but for a few games I do use FXAA, adaptive vsync and 16x AF. Again, it's best to never force those types of settings globally though. ;)
Okay. Thanks for the advice :)
 
...Basically, here are two scenarios for a typical user:

1) If your frames are always lower than your monitor's refresh rate (for typical users it's <60fps), then Adaptive Vsync does absolutely nothing, so you might as well have it turned off, because you'll get plenty of screen tearing either way.

2) Your framerates are always higher than your refresh rate (>60fps). Adaptive Vsync works exactly like regular vsync. Basically you might as well run with vsync on...

You do realize that you are exactly making the argument FOR A-Vsync, and why it's useful and should be on?

Your scenarios:

1) Frames are lower than 60, but a-vsync is ON. A-vsync produces the SAME EFFECT AS TURNING IT OFF. So you are mistaken when you say "might as well have it turned off". It turns itself off (duh).

2) Frames are higher than 60, but a-vsync is on. A-vsync produces the SAME EFFECT AS TURNING VSYNC ON. Which is EXACTLY why you want it on.

And the net result of adaptive v-sync is that above 60 fps you get smooth, tear-free frames, and when it drops below you get the most performance with only occasional tearing. The overall effect is that frame rates are a lot smoother, the gameplay feels more fluid, and the skipping/stuttering/jittering is greatly minimized.

It's a great feature, turn it on.
 
It's a great feature, turn it on.

It's a nice feature, but it still doesn't get around the reason most people run with Vsync off, which is the input lag. I'd bet that the large majority of people (here at least) don't really care about the half-refresh issue with Vsync on, and instead disable it to eliminate Vsync-related input lag. I don't think I've ever heard anyone complain about the half-refresh issue.
 
I never use vsync in any game or program; it's the first thing I disable, and I disable it in the drivers.

The screen tearing and frame rate jumps do not bother me, and I'd rather the GPU just spit out frames as fast as it can and the machine run as fast as it can, not sync with the slow-ass monitor.


I might use this new vsync though and see if it's cool.
 
Triple buffering still seems to cause input lag and makes gaming unplayable.

I've been testing Vsync with double buffering and games are fine,

but I am guessing some games do not let you choose which one to use.
 
Vsync limits the display to multiples of 60?

I thought it was factors of the vertical refresh rate, meaning typically factors of 60... (11 pages later, nobody else thought this. Am I wrong? Do people even care about blatant inaccuracies?)
 
Vsync limits to whatever the display's refresh rate is, yes. It could be 60, 120, 75 or 78.7.
 
I haven't noticed input lag on either BF3 or ME3 using adaptive vsync. I am using a 60hz IPS monitor if that matters.
 
To clarify some things:

Hz = cycles per second.
FPS = Frames per second

These two are not the same comparison at all, so people claiming their 60Hz monitor is capped when their video card gets 60fps are not making a true statement.

I've seen video cards in the reports here where some of the FPS reported were well into the hundreds.

The monitor's cycle rate, a standard 60Hz, is how often that device refreshes the total screen picture being driven to it on a standard LCD monitor.

GPUs run at 900Hz or so, a lot more than the LCD, but they only push about 10-25% of that speed on average to equal the FPS the video card can do. Settings and complexity of the work determine the speed and load to draw.
 
Vsync limits the display to multiples of 60?

I thought it was factors of the vertical refresh rate, meaning typically factors of 60... (11 pages later, nobody else thought this. Am I wrong? Do people even care about blatant inaccuracies?)

I care, and you are right, although I suspect it was just the wrong choice of words. I think we all understand the 60 -> 30 -> 20 -> 15 downward spiral. Also, despite what the article says, with Adaptive V-sync enabled, if the game's frame rate drops below the refresh rate, a tear is still visible. Perhaps not as noticeable as the rapid tearing that strikes when you're well above the refresh rate with V-sync off, but it's still visible.
 
For what it's worth:

I just tested Adaptive V-sync on my GTX 285 with drivers 301.24 on a 120 Hz monitor.

EDIT: never mind. It works fine at 120 Hz. :)
 
I just did some testing with the game I happen to be playing right now: Morrowind, heavily modded with MSGO 2.0.1, which includes MGEXE 0.9.4. My monitor is 60Hz, and my video card is a GTX 680.

It took me a bit, but I figured out a protocol to see when both sides of AVsync are working. With all forms of Vsync forced off in both MGEXE and the Nvidia Control Panel, I found two areas: the first area is a small room where I never get less than 130FPS; the second area is the city of Caldera, where for some reason FPS drops down to between 50 and 57 FPS, depending on where I look. Perfect.

In both areas with all Vsync off, tearing was very apparent and annoying. Yes, there was tearing going on at 50FPS, so the myth that no tearing occurs below a monitor's refresh rate is false.

I tried using the form of Vsync found in MGEXE by setting VWait to 1, but that did not seem to be working.

I set the Nvidia Control Panel to force regular Vsync on for Morrowind. In the first area, FPS was limited to 60FPS and there was no tearing. In the second area, FPS again varied between 50 and 57 FPS, with no tearing. (I thought that if normal VSync is on, then FPS is either 60FPS or 30FPS with no in-between? Help me understand this.)

Next, I set the Nvidia Control to Adaptive VSync. In the first area, FPS was limited to 60FPS and there was no tearing, just like regular Vsync. In the second area, FPS varied between 50 and 57 FPS, and there was lots of annoying tearing.

tl;dr: Games can tear badly below the monitor refresh rate, and when they do, Adaptive Vsync sucks.
 