NVIDIA Adaptive VSync Technology Review @ [H]

FrgMstr

NVIDIA Adaptive VSync Technology Review - NVIDIA has developed a new technology called Adaptive VSync that is poised to improve the gameplay experience. We look at how this technology works, and see if we can graph the performance and compare it with traditional VSync methods. Check out how this technology is moving the "smoother" gameplay experience forward!
 
I really would like to see what some of you 400 and 500 series owners see when using this technology. Please share your experiences here. Thanks.
 
Just buy a 120Hz LCD and the problem is solved :) VSync off, no tearing. Been using one for the last year or so, love it.
 
Thanks for the good read, Kyle. I am running a 120Hz monitor with my 680 and it is smooth as silk. I keep adaptive on to keep the tearing gremlins away with all of this raw power!
 
I use it with my GTX 570 and it works great. In fact, I see no point in using regular vsync anymore: if I am exceeding 60 fps then it's just like regular vsync, and if I dip below that then some tearing is better than stuttering. Funny how this was introduced with the GTX 680 but has actually given me more reason to keep my GTX 570.

BTW, tearing can occur at ANY framerate, not just when you exceed the refresh rate. In fact, the games I have noticed the most tearing in over the years were the ones with the lowest framerates.
 
You know, since the point of your typical GPU previews is to do a "real world" evaluation, you actually should probably run them with vsync on, even though that would make graphs a bit more boring.
 
You know, since the point of your typical GPU previews is to do a "real world" evaluation, you actually should probably run them with vsync on, even though that would make graphs a bit more boring.

Not everyone runs vsync... up until I got the 680 with adaptive, I always forced it off.
 
I have a 560 on the way and plan on testing this. I wonder if it will help older cards from before the 400 series.
 
It actually works great; Adaptive VSync is a nice addition. It helps save on power consumption and heat as well. I actually removed one GTX 580 because it runs too hot already, LOL.
 
Thanks for the article. However, I do not understand the part regarding input lag. Or rather, I did not find a definitive answer to my question -- with Adaptive V-Sync activated, when the framerate exceeds 60 fps, does the input lag kick in?
 
One issue you guys didn't address was the input lag on/input lag off toggle that you get when going from >60 fps to <60 fps. Because v-sync is turning off and on again, in theory you should get different amounts of input lag. I haven't tried adaptive v-sync myself yet but I have a feeling this would really throw many people off, especially in shooters, and it's the reason I'm just running v-sync off.
 
One issue you guys didn't address was the input lag on/input lag off toggle that you get when going from >60 fps to <60 fps. Because v-sync is turning off and on again, in theory you should get different amounts of input lag. I haven't tried adaptive v-sync myself yet but I have a feeling this would really throw many people off, especially in shooters, and it's the reason I'm just running v-sync off.
I have spent hours using it and never noticed anything odd from vsync kicking off and on.
 
When using adaptive vsync, should in-game triple buffering be enabled or disabled? Traditionally I've always run with vsync + triple buffering when possible, but I'm anxious to try this new technology. Thanks!
 
When using adaptive vsync, should in-game triple buffering be enabled or disabled? Traditionally I've always run with vsync + triple buffering when possible, but I'm anxious to try this new technology. Thanks!
I don't see why you would need it. Triple buffering is helpful when you dip below the refresh rate with vsync on, but with adaptive vsync, vsync will disable below your refresh rate anyway.
 
I'm confused here. I thought the old issue of framerate jumping was just a relic of Windows XP being designed primarily for CRTs, because LCD monitors don't technically have "refresh rates" the same way CRTs do.

I've been using VSync for years on all my games in Vista and now Windows 7, with both nVidia (previously) and AMD cards (5870 currently), and neither FRAPS nor any in-game counter I've used (like Crysis) has ever showed any jumping at all. Back in XP, I had to use some program (I forget the name) to force triple buffering in DirectX to prevent the jumping, but again... not a problem in either Vista or Windows 7.

Soooooooo I don't know what "problem" there is for adaptive VSync to solve, because it hasn't been an issue for years.
 
I'm confused here. I thought the old issue of framerate jumping was just a relic of Windows XP being designed primarily for CRTs, because LCD monitors don't technically have "refresh rates" the same way CRTs do.

I've been using VSync for years on all my games in Vista and now Windows 7, with both nVidia (previously) and AMD cards (5870 currently), and neither FRAPS nor any in-game counter I've used (like Crysis) has ever showed any jumping at all. Back in XP, I had to use some program (I forget the name) to force triple buffering in DirectX to prevent the jumping, but again... not a problem in either Vista or Windows 7.

Soooooooo I don't know what "problem" there is for adaptive VSync to solve, because it hasn't been an issue for years.
Did you read the review? They clearly showed that using adaptive vsync when framerates fluctuate above and below the refresh rate is massively better than using vsync. The framerate being displayed by FRAPS does not always reflect exactly what is going on; if you look at the actual frame render times, you will see how bad the drops are when dipping below the refresh rate using vsync without triple buffering. And if you can't feel the little stutter or hitch when dropping below the refresh rate with vsync on in those cases, then you must be completely oblivious to it.
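
For anyone wondering where the 60/30/20/15 numbers come from, here is a rough sketch of the arithmetic (my own illustration, not something from the review): with double-buffered vsync a finished frame can only be shown on a vblank, so the render time effectively gets rounded up to a whole number of refresh intervals.

Code:
#include <cmath>
#include <cstdio>

// Rough sketch (my own illustration, not from the review): with double-buffered
// vsync and no triple buffering, a finished frame can only be displayed on a
// vblank boundary, so the render time is rounded UP to a whole number of
// ~16.7 ms refresh intervals. That is why render times of 18-33 ms all land at
// 30 fps on a 60 Hz display, 34-50 ms at 20 fps, and so on.
int main() {
    const double vblank_ms   = 1000.0 / 60.0;                // ~16.7 ms per refresh
    const double render_ms[] = {10.0, 18.0, 25.0, 34.0, 51.0};

    for (double r : render_ms) {
        double displayed_ms  = std::ceil(r / vblank_ms) * vblank_ms;
        double displayed_fps = 1000.0 / displayed_ms;        // 60, 30, 30, 20, 15
        std::printf("render %5.1f ms -> shown every %5.1f ms (%4.1f fps)\n",
                    r, displayed_ms, displayed_fps);
    }
    return 0;
}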
 
Is this different than just running a framerate limiter capped at your refresh rate?
 
Is this different than just running a framerate limiter capped at your refresh rate?

A limiter capped at the refresh rate will not do anything about screen tearing.

As to the article, I do have a question or two:

What are the practical differences between this and triple-buffered standard vsync? Are they just different means to the same end, or are there perceivable differences in game?
 
Did you read the review? They clearly showed that using adaptive vsync when framerates fluctuate above and below the refresh rate is massively better than using vsync. The framerate being displayed by FRAPS does not always reflect exactly what is going on; if you look at the actual frame render times, you will see how bad the drops are when dipping below the refresh rate using vsync without triple buffering. And if you can't feel the little stutter or hitch when dropping below the refresh rate with vsync on in those cases, then you must be completely oblivious to it.

Yes, cool guy, I read the review. And I just booted up a game of BF3 to see if I was tripping. No jumps... the frame rates were consistent regardless of where I was... around 50 in some areas, low 30s in others. None of this 60, 30, 20 stuff. I'm familiar with jumping because it annoyed me to no end... in Windows XP.

So, alright, maybe FRAPS doesn't show exactly what's going on... neither does Crysis' FPS counter, because I've never seen any jumping there either. And perhaps there's hitching that I'm totally oblivious to, which obviously means it's not exactly a problem if I can't actually, y'know, see or feel it and it doesn't show up in FRAPS. Maybe I'm just not as 733t as you are.

I mean sure, maybe adaptive VSync works a little better. Fine. But jumping? I haven't seen it in years. I used to use the D3D Overrider utility to force triple buffering, but... again, I haven't needed that utility in ages. I've always thought VSync jumping was a relic of an OS designed for CRT monitors.
 
Yes, cool guy, I read the review. And I just booted up a game of BF3 to see if I was tripping. No jumps... the frame rates were consistent regardless of where I was... around 50 in some areas, low 30s in others. None of this 60, 30, 20 stuff. I'm familiar with jumping because it annoyed me to no end... in Windows XP.

So, alright, maybe FRAPS doesn't show exactly what's going on... neither does Crysis' FPS counter, because I've never seen any jumping there either. And perhaps there's hitching that I'm totally oblivious to, which obviously means it's not exactly a problem if I can't actually, y'know, see or feel it and it doesn't show up in FRAPS. Maybe I'm just not as 733t as you are.

I mean sure, maybe adaptive VSync works a little better. Fine. But jumping? I haven't seen it in years. I used to use the D3D Overrider utility to force triple buffering, but... again, I haven't needed that utility in ages. I've always thought VSync jumping was a relic of an OS designed for CRT monitors.
I have only seen the framerate counter shoot down to 30 in a few cases too, if I could not maintain 60. AGAIN, the actual frame render times show that this does happen without triple buffering, regardless of what is showing on the screen at the time. And you should be happy that you don't notice the stutter when dropping below 60 fps with vsync on; heck, just panning the mouse around in game with vsync on is choppy if I am below my refresh rate. It's clearly a problem for most of us, and that's why adaptive vsync came about.
 
My question here would be: how did you manage to get the fps to drop to 30, 20, 15? This is something I have never seen in years - but maybe it's because I never use in-game vsync but force vsync through the control panel. I guess triple buffering is then always active, whether the game supports it or not.

Could you please elaborate on that?
 
We should all thank John Carmack and Rage for this!

He was rather annoyed that the consoles have had adaptive vsync for years, yet it had never been implemented in PC drivers despite the chips supporting it.

So he put some pressure on NVIDIA and ATI to add the feature to their drivers so Rage could make use of it.

Before the 680, both ATI and NVIDIA drivers had supported adaptive vsync for a while, but they relied on the lazy game developers to add the necessary code.

But it looks like NVIDIA are the first to allow gamers to turn it on for any game.

I imagine we will see a similar feature added to ATI drivers in the near future.

I have been waiting for this type of technology for years...
 
My question here would be: how did you manage to get the fps to drop to 30, 20, 15? This is something I have never seen in years - but maybe it's because I never use in-game vsync but force vsync through the control panel. I guess triple buffering is then always active, whether the game supports it or not.

Could you please elaborate on that?

Doesn't triple buffering add input lag?
 
I have only seen the framerate counter shoot down to 30 in a few cases too, if I could not maintain 60. AGAIN, the actual frame render times show that this does happen without triple buffering, regardless of what is showing on the screen at the time. And you should be happy that you don't notice the stutter when dropping below 60 fps with vsync on; heck, just panning the mouse around in game with vsync on is choppy if I am below my refresh rate. It's clearly a problem for most of us, and that's why adaptive vsync came about.

THIS:

My question here would be: how did you manage to get the fps to drop to 30, 20, 15? This is something I have never seen in years - but maybe it's because I never use in-game vsync but force vsync through the control panel.

I booted up Crysis Warhead with VSync enabled in-game. Sure enough, both FRAPS and the in-game counter showed a jump between 30 and 60 fps. But then I hopped over to the Catalyst Control Center, switched the VSync option to "On, unless application specifies", and my framerates returned to normal, averaging around 40 fps at DX10/enthusiast settings @ 1080p but hovering in the 30s most of the time.

So apparently forcing VSync at the driver level eliminates the jumping.
 
I've noticed that adaptive vsync only works on my GT 430 until the first time my frame rate dives below my refresh rate; then it's no sync at all. Could be the result of using these drivers in Windows 8, though. So, for now, it's basically useless to me.

Thanks for the article. However, I do not understand the part regarding input lag. Or rather, I did not find a definitive answer to my question -- with Adaptive V-Sync activated, when the framerate exceeds 60 fps, does the input lag kick in?
It "kicks in", but won't add one frame's worth of latency when it does so.

What are the practical differences between this and triple-buffered standard vsync? Are they just different means to the same end, or are there perceivable differences in game?
Adaptive vsync is tear-if-miss. Standard vsync is hold-if-miss: nothing is presented until a full frame is ready for the next swap interval. Thus, standard vsync will never exhibit tearing.
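
A minimal sketch of that per-frame decision, in case it helps; this is just my own illustration, not NVIDIA's driver logic, and the two helper functions are made up.

Code:
// Minimal sketch of the present decision described above. This is my own
// illustration, NOT NVIDIA's driver code; the two helpers below are hypothetical.
void wait_for_vblank() { /* block until the next vertical blank (hypothetical) */ }
void flip_buffers()    { /* swap front and back buffers right now (hypothetical) */ }

void present(bool adaptive_vsync, bool frame_ready_in_time)
{
    if (frame_ready_in_time) {
        // Frame finished within the refresh interval: both modes behave like
        // normal vsync -- flip on the vblank, no tearing, fps capped at refresh.
        wait_for_vblank();
        flip_buffers();
    } else if (adaptive_vsync) {
        // Missed the vblank: adaptive vsync flips immediately ("tear-if-miss").
        // Possible tear, but no stall, so the framerate degrades smoothly.
        flip_buffers();
    } else {
        // Standard vsync holds the frame for the next vblank ("hold-if-miss").
        // No tearing, but the frame time jumps to the next multiple of ~16.7 ms.
        wait_for_vblank();
        flip_buffers();
    }
}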

Doesn't triple buffering add input lag?
Depends on the method. If the present interval is simply increased, latency is also increased. That's the 'dumb' approach. The 'smart' approach does not add latency.
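
For what it's worth, here is how I understand the two approaches, as a rough frame-queue sketch of my own (not tied to any particular API):

Code:
#include <deque>

// Rough sketch of the two triple-buffering policies, using a toy frame queue
// of my own (not any real API). Assumes at least one completed frame is queued.
struct Frame { int id; };

// 'Dumb' render-ahead FIFO: every completed frame waits its turn, so a third
// buffer just makes each frame one refresh older by the time it hits the screen.
Frame next_frame_fifo(std::deque<Frame>& completed) {
    Frame f = completed.front();   // oldest completed frame
    completed.pop_front();
    return f;
}

// 'Smart' swap-newest: at each vblank, show the most recently completed frame
// and discard stale ones. The GPU never stalls, and latency does not grow.
Frame next_frame_newest(std::deque<Frame>& completed) {
    Frame f = completed.back();    // newest completed frame
    completed.clear();             // throw away anything older
    return f;
}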
 
Doesn't triple buffering add input lag?

Yes it does, but I find that more manageable than the constant jumping or the loss of so much performance (from 55 to 30, for instance). Also, you can almost completely get rid of the input lag by using a framerate limiter - even if the fps are below the limit. Don't know why, but it works.

@littledoc:
Interesting that you can confirm this :)
 
THIS:



I booted up Crysis Warhead with VSync enabled in-game. Sure enough, both FRAPS and the in-game counter showed a jump between 30 and 60 fps. But then I hopped over to the Catalyst Control Center, switched the VSync option to "On, unless application specifies", and my framerates returned to normal, averaging around 40 fps at DX10/enthusiast settings @ 1080p but hovering in the 30s most of the time.

So apparently forcing VSync at the driver level eliminates the jumping.
It's not doing that for me at all, so that must be an AMD issue. I have vsync enabled in game and not once has it shot from 60 to 30 looking at FRAPS on screen. Disabling in-game vsync and using Nvidia control panel vsync gives me the same framerate.

BTW, forcing vsync on from the CCC officially has no effect on DX games anyway; it even tells you that it is for OpenGL games. It's one of the reasons that I switched back to Nvidia.
 
Two things not mentioned, I think.

VSync may add horrible input lag. What about a-vsync?

I'm pretty sure you can get screen tearing at < 60 fps. In fact, many games on the consoles have tearing, and those games are targeted at 30 fps; I would hardly think they go over 60 fps that often...

Personally, vsync + triple buffering has worked very well for me, but I can definitely see the use of this in Rage since D3DOverrider doesn't work in OpenGL.
 
Two things not mentioned, I think.

VSync may add horrible input lag. What about a-vsync?

I'm pretty sure you can get screen tearing at < 60 fps. In fact, many games on the consoles have tearing, and those games are targeted at 30 fps; I would hardly think they go over 60 fps that often...

Personally, vsync + triple buffering has worked very well for me, but I can definitely see the use of this in Rage since D3DOverrider doesn't work in OpenGL.
If you notice lag with vsync on while still getting framerates equal to or better than your refresh rate, then you will still notice it with adaptive vsync when vsync kicks in at your refresh rate. I do not usually notice lag with vsync on, but I do notice the stuttering when dipping below the refresh rate with vsync on.

And yes, tearing can occur at any framerate when vsync is off. I have never understood why some people seem to think tearing only occurs above the refresh rate.
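
Quick back-of-the-envelope illustration of why (my own numbers, nothing from the review): a 60 Hz panel spends roughly 16.7 ms scanning each refresh from top to bottom, so any unsynchronized buffer swap that lands mid-scanout splits the image, whether the game is pushing 30 fps or 300 fps.

Code:
#include <cstdio>

// Back-of-the-envelope sketch (my own numbers): with vsync off, a buffer swap
// can land anywhere inside the ~16.7 ms scanout of a 60 Hz panel. The tear line
// shows up at whatever scanline is being drawn at that instant; the framerate
// only changes how often swaps happen, not whether they can land mid-scanout.
int main() {
    const double scanout_ms = 1000.0 / 60.0;   // ~16.7 ms to scan ~1080 lines
    const int    lines      = 1080;
    const double swap_offsets_ms[] = {2.0, 8.3, 14.0};   // swap time within scanout

    for (double t : swap_offsets_ms) {
        int tear_line = static_cast<int>(lines * t / scanout_ms);
        std::printf("swap %4.1f ms into scanout -> tear at roughly line %d\n",
                    t, tear_line);
    }
    return 0;
}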
 
I absolutely love it. Crysis Warhead looks fantastic. I've never seen it look that good. Not just the framerates...but the smoothness. I've always scoffed at vsync. No more.
 
I'd just like to add something for those people who never see tearing with vsync off: it's not your 120 Hz monitors, or other people's imaginations seeing tearing... The tearing example that Kyle visually presented is what I see all the time, almost every second, if I play anything with vsync off, low FPS or high. Why is that? I'm one of the few individuals who is not only colorblind but also sees fewer colours and possesses "primitive vision"; I'm wired to sense more detail and the slightest degree of motion. So I must have vsync + triple buffering on all the time... I have no choice. Adaptive VSync is a boon to me, regardless of the fact that I do not clearly understand its concept. ;)

The advantage of my type of vision, though, is that certain types of camouflage do not work on me. ;)

I have tested this on Skyrim, Borderlands 1, and the LOTRO MMO. There's still micro-tearing I've noticed with a-vsync, but very little; I'm guessing it happens when it switches its on/off state? For shiets and giggles I tried a monitor test program that does rapid colour and pattern changes: with vsync off you can see mega-tearing, with vsync on none, and with a-vsync a little bit - just to let you know, if you'd like to try it.
 
It's not doing that for me at all, so that must be an AMD issue. I have vsync enabled in game and not once has it shot from 60 to 30 looking at FRAPS on screen. Disabling in-game vsync and using Nvidia control panel vsync gives me the same framerate.

????????? If that's the case then I have no idea why you objected to my original post or why you care about adaptive VSync. It sounds like you're saying you don't get the jumping at all (even with the control panel set to application setting and in-game VSync on) regardless of your settings, which is precisely what adaptive VSync is supposed to fix.
 
????????? If that's the case then I have no idea why you objected to my original post or why you care about adaptive VSync. It sounds like you're saying you don't get the jumping at all (even with the control panel set to application setting and in-game VSync on) regardless of your settings, which is precisely what adaptive VSync is supposed to fix.
You still are not paying attention to what I am saying. For the THIRD time: just because FRAPS on screen does not always show it dropping all the way down does not mean it isn't. Again, for the THIRD time, the actual framerate logs are where it will show this.

And you are really starting to make yourself look silly now anyway. You just claimed the in-game vsync was making you go from 60 to 30. You then say using vsync from the CCC did not do that. I just told you that is because turning it on from the CCC does NOT actually work in DX games. That means you are running the game without vsync on if all you are doing is forcing it on from the CCC. If that is what you have been doing for all your games, then you are never seeing it go from 60 to 30 because you do NOT even have vsync on.
 
Give him a break. One would surely recognize whether vsync was on or not, either by tearing or by framerates in excess of 60 (or 120) fps.
 
BTW, tearing can occur at ANY framerate, not just when you exceed the refresh rate. In fact, the games I have noticed the most tearing in over the years were the ones with the lowest framerates.

I just came into this thread to post the very same thing, as I was surprised to read an article on this site that implied tearing doesn't occur when the framerate dips below the refresh rate of the display. It most certainly does, at least on every display I've ever used, CRT and LCD!

Adaptive V-Sync is nothing new anyway, not if you're a console gamer, as it was first introduced in Gears of War on the Xbox 360 back in 2006. This was a game with a capped 30 fps framerate that disabled v-sync whenever a frame took longer than 33.33 ms to render. When this happened there was screen tearing on my 60 Hz HDTV; sometimes it wasn't that noticeable, but sometimes it was, hideously so. Previous-generation games were rife with screen tearing, as anyone who's played God of War on the PlayStation 2 will tell you. That was a 60 fps capped game that frequently tore on 60 Hz TVs when it dipped below 60 fps, which it did frequently, even on my old 28" 100 Hz Panasonic CRT TV and my 21" Sony TV.

This use of 'Soft' V-Sync, as it's called on the consoles, is the very thing that drove me to buy multiformat games for my PC, as I cannot stand screen tearing. I'm very sensitive to it, to the point where I can notice even slight tearing, and it totally spoils a game for me. This is why I always use V-Sync + Triple Buffering (the latter enabled through D3DOverrider) to play all my games without sudden framerate dips and without any screen tearing whatsoever. The only side effects of using triple buffering are increased video RAM usage due to the extra buffer and additional input lag, but I can't say I've ever noticed the latter, and the former isn't a problem for me either as I game at 1920x1200 with a 2 GB graphics card.

I've always been puzzled as to why NVIDIA and AMD don't support Triple Buffering for DirectX games. The Triple Buffering option that exists currently in their graphics driver control panels is only for OpenGL. As such it means having to use a third-party tool from the no-longer-supported RivaTuner package in order to use triple buffering with DirectX games. Anyone know why it has never been supported by either manufacturer? Personally, I think Adaptive V-Sync is a step backwards from V-Sync+Triple Buffering, an inferior solution to an issue that existed on inferior console technology due to their limited video memory. Of course, if intermittent screen tearing doesn't bother you but input lag does and/or framerate dips do then it may well be the best thing ever.
 