NVIDIA Adaptive VSync Technology Review @ [H]

Give him a break. One would surely recognize whether vsync was on or not, either by the tearing or by framerates in excess of 60 (or 120) fps.
Well, surely he does not. No need to copy and paste everything he has said, but his comments prove that he does not understand what is going on.
 
Guys, I have a dumb question: why is regular Vsync capping the framerate at 30 FPS? I thought since the refresh rate of the monitor is 60 Hz, it should cap it at 60. I haven't been keeping up with new games and their graphics APIs.
 
With VSync enabled I have never seen it jump between 15, 20, 30, 60, 85, 98 FPS (matching the refresh rates I've been running my screens at) during the last 15 or so years, and that goes for both AMD/ATI and Nvidia. So I really wonder what's going on. Have all my systems, hardware, software and operating system combinations over all these years miraculously erred in such a way that they selectively produced something that is now to be known as A-VSync?

An up-to-date example: SWTOR. When in fleet watching all the other players, I get around 40 to 50 FPS. That should be capped to 30, but it ain't so. And that's on a Radeon 6970 with VSync forced on via either the in-game switch or ATI Tray Tools. I've seen this exact same behavior in two other MMOs I've been playing on a GeForce 8800 GTS G92: Tabula Rasa and Age of Conan.
 
Guys, I have a dumb question: why is regular Vsync capping the framerate at 30 FPS? I thought since the refresh rate of the monitor is 60 Hz, it should cap it at 60. I haven't been keeping up with new games and their graphics APIs.
It would only do that in a game that has no triple buffering where you can't average anywhere near 60 fps.
 
Just buy a 120hz LCD and problem solved :) vsync off, no tearing. Been using one for the last year or so, love it.

Where might I buy this 2560x1600 S-IPS American-warrantied 30" LCD monitor that runs 120hz, sir? I would love to get one! (shame it doesn't exist) Enjoy your TN quality, and the tearing that still occurs.
 
With VSync enabled I have never seen it jump between 15, 20, 30, 60, 85, 98 FPS (matching the refresh rates I've been running my screens at) during the last 15 or so years, and that goes for both AMD/ATI and Nvidia. So I really wonder what's going on. Have all my systems, hardware, software and operating system combinations over all these years miraculously erred in such a way that they selectively produced something that is now to be known as A-VSync?

An up-to-date example: SWTOR. When in fleet watching all the other players, I get around 40 to 50 FPS. That should be capped to 30, but it ain't so. And that's on a Radeon 6970 with VSync forced on via either the in-game switch or ATI Tray Tools. I've seen this exact same behavior in two other MMOs I've been playing on a GeForce 8800 GTS G92: Tabula Rasa and Age of Conan.
I have mentioned this numerous times in this thread, but here goes: if the game has triple buffering then it won't just drop to 30 fps. Even in games that don't have triple buffering, the on-screen framerate counter can sometimes hang out between 30 and 60 fps even though you are not getting 60 fps with vsync on. If you look at the actual frame time graphs, though, you will see that your real framerate was indeed pretty close to 30 fps in those situations, regardless of what the on-screen counter says. To be honest, I don't fully understand it either.
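
One plausible way to picture it (this is just a sketch with made-up numbers, not anything from the article): with double buffering a finished frame has to wait for the next vblank, so every displayed frame occupies a whole number of refresh intervals. If the render time hovers right around 16.7 ms, some frames make the vblank and some miss it, and a simple per-second counter averages the mix and reads somewhere between 30 and 60:

```python
# Sketch: why an fps counter can read between 30 and 60 with vsync on and
# no triple buffering (60 Hz display, made-up render times).
import random

REFRESH = 1.0 / 60.0  # one refresh interval on a 60 Hz screen (~16.7 ms)

def displayed_frame_time(render_time, refresh=REFRESH):
    # Double buffering: a finished frame waits for the next vblank, so each
    # displayed frame lasts a whole number of refresh intervals.
    intervals = int(render_time // refresh) + 1
    return intervals * refresh

random.seed(1)
render_times = [random.uniform(0.013, 0.020) for _ in range(600)]  # ~13-20 ms
frame_times = [displayed_frame_time(t) for t in render_times]

print(f"counter would read about {len(frame_times) / sum(frame_times):.0f} fps")
# Every individual frame is either a 60 fps frame or a 30 fps frame, but the
# averaged counter lands somewhere in between (roughly 40 with these numbers).
```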
 
P.S. It would have been interesting if this article had also compared the framerates from using Adaptive V-Sync and normal V-Sync + Triple Buffering (using D3DOverrider).

Also, wouldn't capping the framerate at the refresh rate of your display with V-Sync off effectively be the same thing as Adaptive V-Sync? I.e. if, on a 60 Hz display, it was capped at 60 fps, then as long as the game ran at 60 fps there would be no tearing (since each frame would be drawn within exactly one refresh of the screen), but if it dipped below 60 fps there would be intermittent screen tearing?

I know that NVIDIA already has a hidden framerate cap in its driver which, with the use of NVIDIA Inspector, lets you set the cap in multiples of 5 frames, so you could, for example, have a cap of 40 fps if a game tends to drop to that level. This in turn results in the game running smoother, without the huge variations in framerate that you'd get with v-sync off or even on. Adaptive V-Sync only lets you cap at the refresh rate of your display or half the refresh rate, e.g. 30 fps and 60 fps on a 60 Hz display or 60 fps and 120 fps on a 120 Hz display.
 
Two questions:
1. Will AMD likely bring this feature into their drivers? It sounds feasible from a software standpoint.
2. Will adaptive vsync work on laptop nvidia graphics like the 570m?

I am loving Deus Ex, except that the tearing is driving me crazy. My laptop runs a 570M; I might switch over to it for the rest of the game (about halfway through it now).
 
P.S. It would have been interesting if this article had also compared the framerates from using Adaptive V-Sync and normal V-Sync + Triple Buffering (using D3DOverrider).

Also, wouldn't capping the framerate at the refresh rate of your display with V-Sync off effectively be the same thing as Adaptive V-Sync? I.e. if, on a 60 Hz display, it was capped at 60 fps, then as long as the game ran at 60 fps there would be no tearing (since each frame would be drawn within exactly one refresh of the screen), but if it dipped below 60 fps there would be intermittent screen tearing?

I know that NVIDIA already has a hidden framerate cap in its driver which, with the use of NVIDIA Inspector, lets you set the cap in multiples of 5 frames, so you could, for example, have a cap of 40 fps if a game tends to drop to that level. This in turn results in the game running smoother, without the huge variations in framerate that you'd get with v-sync off or even on. Adaptive V-Sync only lets you cap at the refresh rate of your display or half the refresh rate, e.g. 30 fps and 60 fps on a 60 Hz display or 60 fps and 120 fps on a 120 Hz display.
It can help, but capping a game right at the refresh rate does not prevent tearing, as no game will ever be perfectly in sync with the display the whole time. It only takes a few minutes to test that out in a few games and see. Of course, results can vary wildly from game to game.
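
A quick back-of-the-envelope way to see why (my own sketch, with assumed numbers): with vsync off the buffer flip isn't tied to the vblank, so it lands partway through the scanout, and since no cap runs at exactly 60.000 fps the tear line just drifts slowly up and down the screen rather than disappearing:

```python
# Sketch: a framerate cap near the refresh rate does not remove tearing.
# With vsync off the flip lands at an arbitrary point in the scanout, and a
# cap that isn't exactly phase-locked to the display (59.97 fps vs 60.00 Hz
# assumed here) just makes the tear line drift slowly instead of vanishing.
REFRESH_HZ = 60.000
CAP_FPS = 59.970

refresh_period = 1.0 / REFRESH_HZ
cap_period = 1.0 / CAP_FPS

for frame in range(0, 3001, 600):  # sample roughly every 10 seconds
    t = frame * cap_period
    # Where in the current scanout the flip (and thus the tear) lands:
    # 0.0 = top of the screen, 1.0 = bottom.
    phase = (t % refresh_period) / refresh_period
    print(f"after ~{t:5.1f} s the tear sits about {phase:4.0%} of the way down")
```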
 
Where might I buy this 2560x1600 S-IPS American-warrantied 30" LCD monitor that runs 120hz, sir? I would love to get one! (shame it doesn't exist) Enjoy your TN quality, and the tearing that still occurs.

I'd buy one of those in a second. Really disappointed with today's display options.
 
It would only do that in a game that has no triple buffering where you can't average anywhere near 60 fps.

Why is Vsync capping at 30 FPS for no reason, though? It should be 60 FPS. I mean, when I play CS:S or BFBC2 it caps at 60 FPS, the way it is supposed to. Is this because of the new graphics engine or DX11.1 or something?
 
Why is Vsync capping at 30 FPS for no reason, though? It should be 60 FPS. I mean, when I use CS:S or BFBC2 it caps at 60 FPS, the way it is supposed to. Is this because of the new graphics engine or DX11.1 or something?
Well, I just said the reason why: you can't average anywhere near 60 fps, so without triple buffering it will stay around 30 fps with vsync on. Those other games are not demanding enough to keep you from getting 60 fps. Turn down your settings in Batman: AC if you want to get 60 fps with vsync on.
 
So does that mean normal Vsync has different "tiers" or something, and it only goes up to 60 if your GPU is rendering >60 FPS? And it only renders 20/30 FPS if it is rendering >20 or 30 FPS?

That makes sense I guess...

Adaptive Vsync is the perfect solution to the problem I never knew existed.
 
So does that mean normal Vsync has different "tiers" or something, and it only goes up to 60 if your GPU is rendering >60 FPS? And it only renders 20/30 FPS if it is rendering >20 or 30 FPS?

That makes sense I guess...

Adaptive Vsync is the perfect solution to the problem I never knew existed.

If you take the time to read the review, it explains precisely how vsync works: when you get 60 fps, when you get 30, and when you get less than 30.
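
For anyone who wants the short version of those "tiers": with plain double-buffered vsync a frame that misses a vblank has to wait for the next one, so on a 60 Hz screen the only rates you can hold are 60 divided by a whole number of refreshes: 60, 30, 20, 15, and so on. A tiny sketch of that rule (the render times are just examples):

```python
# The vsync "tiers" on a 60 Hz display with double buffering: a frame that
# misses a vblank waits for the next one, so sustained rates quantize to
# 60 / n for a whole number of refreshes n. Render times are illustrative.
import math

REFRESH_HZ = 60.0

def sustained_fps(render_time_ms, refresh_hz=REFRESH_HZ):
    refresh_ms = 1000.0 / refresh_hz
    refreshes_per_frame = math.ceil(render_time_ms / refresh_ms)
    return refresh_hz / refreshes_per_frame

for ms in (10, 17, 25, 34, 55):
    print(f"{ms:2d} ms per frame -> {sustained_fps(ms):.0f} fps with vsync on")
# 10 ms -> 60 fps, 17 and 25 ms -> 30 fps, 34 ms -> 20 fps, 55 ms -> 15 fps
```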
 
I believe Tom's wasn't overly excited about adaptive vsync because the sudden tearing they noticed when dropping below 60 fps was more distracting to them than the stutter you get from regular vsync.

I suppose it's a matter of personal preference, but there is still tearing going on below your refresh rate. I don't understand why, but I have always heard/read that tearing is more pronounced, or easier to notice, when your framerate is above your refresh rate. It doesn't seem like it should matter; either way you would still get part of an old frame and part of a new frame with a tear in between.
 
I believe Tom's wasn't overly excited about adaptive vsync because the sudden tearing they noticed when dropping below 60 fps was more distracting to them than the stutter you get from regular vsync.

I suppose it's a matter of personal preference, but there is still tearing going on below your refresh rate. I don't understand why, but I have always heard/read that tearing is more pronounced, or easier to notice, when your framerate is above your refresh rate. It doesn't seem like it should matter; either way you would still get part of an old frame and part of a new frame with a tear in between.
Every game is different. The tearing could be just as bad or even worse at low framerates in some games as at higher ones. At some framerates the tearing could be on one part of the screen, but at a different framerate be on another. In some games the tearing is hardly noticeable at all. Just like with any other graphical setting, it's best to take it on a game-by-game basis.
 
Every game is different. The tearing could be just as bad or even worse at low framerates in some games as at higher ones. At some framerates the tearing could be on one part of the screen, but at a different framerate be on another. In some games the tearing is hardly noticeable at all. Just like with any other graphical setting, it's best to take it on a game-by-game basis.

Screen tearing is a very subjective thing, though. You can prove it occurs; I've seen console game benchmarks where the percentage of screen tearing is measured and the tears are shown as red vertical lines in the graphs displayed during gameplay videos (such as the ones in Eurogamer.net's Digital Foundry articles).

However, whether the person actually sees it comes down to how perceptive they are and where the tear actually occurs. Tears at the top of the screen are far less noticeable than those that occur in the middle of the screen, where your eyes are focused, and tears that occur in the overscan (hidden) area of some older HDTVs are not noticeable at all but might still show up in these benchmarks (Grand Theft Auto IV on the Xbox 360 being a good example, as it only tore at the top of the screen, so it wasn't visible unless you used 1:1 pixel mapping on your HDTV, and even then you'd have to be looking for it). This issue has nothing to do with the type of display you're using either, as I've seen some people claim; tearing occurs just the same whether you're using a 14" black & white CRT portable or a 55" plasma or LCD screen (it would be less obvious on the smaller screen, though).

I'm very sensitive to screen tearing myself and can see even slight tearing, usually in the form of a tell-tale "wobble" as I'm turning the camera in a game. I sometimes wish I was as oblivious to this issue as so many others are, as it really does spoil my enjoyment of a game if I see it. It can take me right out of a game, in fact. It's why I've moved away from gaming on consoles, where 30 fps is the norm and screen tearing is depressingly noticeable even with "soft" v-sync.
 
I agree with most of that, but actually the display can play a role. I have compared it on different monitors and certainly noticed a difference. My CRT tore less than my LCDs at the same settings. A 120 Hz screen will actually tear a bit less than a 60 Hz screen. Even different 60 Hz LCDs can tear a bit differently.
 
Man, I just tested adaptive vsync, and the mouse lag issue with vsync is still there. Guess I'll stick with vsync off.
 
Quick skim, didn't see this asked,

Can Adaptive vsync be compared to a game engine that caps itself at your monitor's refresh rate? I know many UE3 games are like this, though it's a frame limit of 60-ish fps, not exactly matching the display's refresh. Side by side, do the current game engine implementations and adaptive vsync offer a similar experience at 60 Hz?
 
Quick skim, didn't see this asked,

Can Adaptive vsync be compared to a game engine that caps itself at your monitor's refresh rate? I know many UE3 games are like this, though it's a frame limit of 60-ish fps, not exactly matching the display's refresh. Side by side, do the current game engine implementations and adaptive vsync offer a similar experience at 60 Hz?
It was asked somewhere in this thread. A framerate cap does not stop tearing, so if tearing bothers you then you need vsync.
 
Would this be of any use for BF3 with 560 2GB SLI and a CRT? I am doing OK atm but am always looking for an edge. Right now the fps goes anywhere from 85 to 200, which is nuts.
 
Would this be of any use for BF3 with 560 2GB SLI and a CRT? I am doing OK atm but am always looking for an edge. Right now the fps goes anywhere from 85 to 200, which is nuts.
At what refresh rate? If it's 85 or below, then you can use vsync or adaptive vsync, but the results will be the same since you never go below your refresh rate. If you are just trying to keep consistent framerates and tearing does not bother you, then just use a framerate cap.
 
I can see future displays being able to display images non-isochronously. Refresh rates are a legacy from CRT monitors and aren't really necessary for LCD-type displays.

Tearing definitely happens using adaptive vsync. To minimize it I try to get my games running as close to 60 FPS as possible to keep vsync engaged. The upside is reduced memory use in memory-heavy games, as I no longer need three frames' worth of buffers in flight as I did when I had to use triple buffering.
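
To put a rough number on the memory part (my own back-of-the-envelope, assuming plain 32-bit color buffers and no MSAA): triple buffering adds one extra swap-chain buffer over double buffering, which works out to very roughly 8-16 MB at common resolutions, more with AA:

```python
# Back-of-the-envelope: VRAM cost of the extra back buffer that triple
# buffering adds over double buffering. Assumes a plain 32-bit (4 bytes per
# pixel) color buffer with no MSAA; real swap chains can cost more.
BYTES_PER_PIXEL = 4

for width, height in ((1920, 1080), (2560, 1600)):
    extra_mb = width * height * BYTES_PER_PIXEL / (1024 * 1024)
    print(f"{width}x{height}: the extra buffer is about {extra_mb:.1f} MB")
# 1920x1080 -> ~7.9 MB, 2560x1600 -> ~15.6 MB
```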
 
For those of you who say triple buffering is sufficient, John Carmack disagrees:
https://twitter.com/#!/ID_AA_Carmack/status/190111153509249025

And we all know that Carmack's biggest interest is making sure that PC gamers have a great experience. Oh wait....Carmack's opinion does not carry weight with me any more. He is only concerned with what makes him money and will throw the entire PC gaming community under the bus as we have witnessed first hand. Go post about Carmack in the console forum.
 
We should all thank John Carmack and Rage for this!

You must have missed the JC Fellatio booth on the way here....

If you want to point a finger at a company that really pushed this technology into the spotlight, you would have to look at LucidLogix. That company had a comparable technology that was released last June.

Virtual Vsync – Eliminate the Debate between Quality and Performance

Visual quality and performance is boosted even further with the introduction of Lucid's Virtual Vsync technology. A proprietary solution for Virtu Universal, Virtual Vsync enables games to run at higher FPS with faster user responsiveness, with no image tearing artifacts.

Enhanced performance – when system has dGPU/mGPU and IGP

Unlimited frame rates – some as much as 120 FPS and beyond with in-game Vsync on

User-defined frame rate and visual quality selection

Flawless visual quality without tearing

Up to 250% better responsiveness

The fact of the matter is that the tech sadly had a lot of other overhead associated with it and never gained any traction. What NVIDIA is doing is a much better solution. Interestingly, the NVIDIA folks that I was exposed to claimed to have never heard of Lucid's tech. :D
 
I must say this Adaptive Vsync is most impressive. Glad to see Nvidia firing on all cylinders again with their desktop GPUs.
 
I'm not overly sensitive to jitter/stutter and tearing (well, to tearing if it is obvious). However, one thing I am sensitive to is my electric bill every month. Reducing your power consumption while gaming will not go unnoticed. My question is: is this technology being pushed into tablets/laptops? How much more battery life will this give hardcore laptops? 15 minutes? 30 minutes?
 
Stupid question, but I am assuming Adaptive Vsync will work with 120 Hz monitors like it does with 60 Hz ones?

And I did read the article!
 
Stupid question, but I am assuming Adaptive Vsync will work with 120 Hz monitors like it does with 60 Hz ones?

And I did read the article!
I assume it works fine on 120 Hz monitors. There is also the adaptive vsync half refresh rate option, which would be nice for 120 Hz owners who are not getting remotely close to 120 fps in a game, as that would turn on vsync for them at 60 Hz.
 
I just tried the vsync half refresh rate option for Alan Wake to see how that would go, and I am very impressed. I can't average more than around 45 fps on the settings (highest, with 2x AA) that I am using and probably only get to 60 fps about 10-15% of the time. Adaptive vsync obviously can only stop the tearing when I hit 60 fps, so I gave the half refresh rate setting a spin. I figured 30 fps would be abysmal, but it was just fine for the 15 or so minutes that I tried it. Now, I am using a controller and this is not BF3 by any means, so 30 fps is fine for this type of game.

EDIT: I tried the half refresh rate option in Metro 2033 and it was horrible and choppy just panning the view around. Maybe it seemed okay in Alan Wake because the area where I tested was really dark. Anyway, the half refresh setting should only be a last resort on 60 Hz monitors, as 30 fps is probably going to look and feel like crap in most games.
 
Finally, a good solution! I have always hated screen tearing, which is why I leave Vsync on all the time, but it's often tricky to configure some games to reach the "sweet spot" where the framerate is steady without them looking like picmip has been turned up five mip levels, or geometry detail lowered, or the game run at a lower resolution. If the game has no console or doesn't support cvars to cap its internal framerate, it makes things difficult. I just wish they had done this several years ago.
 
Fantastic new tech by NV. Like many things, after the fact it seems like a "duh, why didn't we do that before?" moment. Kudos to NV for a nice innovation and to [H] for a very nice explanation of how it works.
 
So what does the GTX 680 do when Adaptive Vsync stops it from going above 60 fps?

Does that dynamic clock rate and voltage wizardry kick in and help it save even more power, or does it stay at the same settings and just save power through less load?
 
Fantastic new tech by NV. Like many things, after the fact it seems like a "duh, why didn't we do that before?" moment. Kudos to NV for a nice innovation and to [H] for a very nice explanation of how it works.

Back in the day I used to mimic this behavior in the Source engine by setting the "maxfps" (fps_max? I can't remember) game variable equal to my refresh rate and then forcing vsync off. It didn't work perfectly all the time, but it was pretty decent.
 
Zarathustra[H] said:
Back in the day I used to mimic this behavior in the Source engine by setting the "maxfps" (fps_max? I can't remember) game variable equal to my refresh rate and then forcing vsync off. It didn't work perfectly all the time, but it was pretty decent.

A lot of games let you do that. You could set a frame rate cap and the game would always keep the frame rate around that number. This allowed you to simply disable V-Sync and not experience tearing. Doom 3 and Quake 4 were two notable examples of this. I believe you could do it in Unreal Tournament 2004 as well.
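
If memory serves, the Source engine cvars being talked about are fps_max and mat_vsync, and the usual setup looked something like this in an autoexec.cfg (the values here are just an example for a 60 Hz screen, not a recommendation):

```
// example autoexec.cfg lines for the old Source engine trick
fps_max 60     // cap the framerate at the monitor's refresh rate (60 Hz assumed)
mat_vsync 0    // force vsync off so only the cap is doing the smoothing
```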
 