NVIDIA Adaptive VSync Technology Review @ [H]

THIS is the real solution, 120hz or go home
Theoretically, doubling the refresh rate doubles the number of tear lines in a given time interval. The tear lines are just visible for half as long as they would be otherwise.

I call that a wash.
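Here's the back-of-the-envelope arithmetic behind calling it a wash (just my own sketch, taking the claim above that tear count scales with refresh rate at face value):

```python
# Rough model: assume the number of tears scales with refresh rate (as claimed
# above) and each tear persists until the next refresh repaints that region.
def tear_exposure(refresh_hz, tears_per_refresh=1.0):
    tears_per_second = tears_per_refresh * refresh_hz   # doubled at 120Hz
    visibility_seconds = 1.0 / refresh_hz               # halved at 120Hz
    return tears_per_second * visibility_seconds        # total "tear time" per second

print(tear_exposure(60))    # 1.0
print(tear_exposure(120))   # 1.0 -- same total exposure, hence "a wash"
```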
 
When using adaptive vsync, should in-game triple buffering be enabled or disabled? Traditionally I've always run with vsync + triple buffering when possible, but I'm eager to try this new technology. Thanks!

If the game you are playing is a DX game, then I'm sorry, but there is no option for triple buffering in DX10.

Unless the game developer has specifically coded in the ability to do proper triple buffering, you will never see it. What Microsoft calls triple buffering is actually an in-order flip queue. That actually adds to latency.
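To illustrate the difference (this is just a toy model I threw together, not how the actual DX presentation path is implemented): with an in-order flip queue, every completed frame waits its turn behind the ones already queued, so when the GPU outruns the display the frame you see is a couple of refreshes old. With true triple buffering, the newest completed frame replaces the one waiting, so what you see is always recent.

```python
from collections import deque

REFRESH = 1 / 60        # 60 Hz display
RENDER  = 1 / 140       # pretend the GPU needs ~7 ms per frame (well above 60 fps)
QUEUE_DEPTH = 2         # two back buffers in the in-order flip queue

def average_latency_ms(true_triple_buffering, refreshes=600):
    """Average age of the frame shown at each refresh, in milliseconds."""
    queue = deque()     # completion times of frames waiting to be displayed
    gpu_free_at = 0.0   # when the GPU finished its previous frame
    latencies = []
    for k in range(1, refreshes + 1):
        vblank = k * REFRESH
        blocked = False
        while True:
            if not true_triple_buffering and len(queue) >= QUEUE_DEPTH:
                blocked = True            # flip queue is full: GPU has to wait
                break
            finish = gpu_free_at + RENDER
            if finish > vblank:
                break                     # frame won't be done before this refresh
            if true_triple_buffering:
                queue.clear()             # newest frame replaces the waiting one
            queue.append(finish)
            gpu_free_at = finish
        if queue:
            latencies.append(vblank - queue.popleft())
        if blocked:
            gpu_free_at = vblank          # GPU resumes once the display frees a slot
    return 1000 * sum(latencies) / len(latencies)

print("in-order flip queue  : %.1f ms average latency" % average_latency_ms(False))
print("true triple buffering: %.1f ms average latency" % average_latency_ms(True))
```

In this toy model the flip queue ends up roughly two refresh intervals behind, while true triple buffering stays within a few milliseconds, which is where the extra lag comes from.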

I do like the idea of adaptive V-sync for a few reasons.
First, the aforementioned lack of triple buffering in DX makes turning on V-sync in a DX game viable only if your frame rate exceeds 60 at all times, and even then you may still experience mouse lag.
Second, game developers' refusal to add a frame limiter to new engines because it isn't required on consoles.

Ultimately this is what we are talking about. With a proper frame limiter, visible tearing can be reduced to a non-issue to begin with.
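For what it's worth, the kind of engine-side limiter I mean is nothing exotic. A minimal sketch, where render_frame() is just a stand-in for the game's update/draw work:

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    """Stand-in for the game's update + draw work."""
    time.sleep(0.005)            # pretend a frame takes 5 ms

def run(frames=300):
    deadline = time.perf_counter() + FRAME_BUDGET
    for _ in range(frames):
        render_frame()
        # Sleep off whatever is left of this frame's budget so the engine never
        # pushes frames out faster than the display can show them.
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        deadline += FRAME_BUDGET

run()
```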

You still have the issue that with adaptive V-sync on, and running below 60 fps, screen tearing is a certainty. Whether you perceive it or not, it's happening.

I feel that the real issue here isn't touched on in the article at all.

Is adaptive V-sync better than engine-based frame limiting? Isn't that the real question? :confused:
 
I was going to say the same thing... and great article!

When will the WHQL 300 drivers be released?

So, according to the article... it overrides in-game VSync, so I'm assuming there's no point in using the in-game option then?
Would having it on in-game affect it in any way? Does Adaptive work well enough that there's still no tearing?

And, to put it in short: adaptive works by running as if vsync is just off completely, which gives you the smoothness of running much higher than 30fps, but when you hit 60fps it kicks in to keep it from going above 60fps. I can't imagine how fast AVSync has to work if you're always hovering between, say, 55-60fps, and it has to kick on and off in microseconds every other microsecond, ha ha, but that's really impressive.
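In other words, the driver is basically making a per-frame decision like this (a rough sketch of the behaviour the article describes, obviously not NVIDIA's actual code; present() is just a stand-in):

```python
REFRESH_INTERVAL = 1 / 60          # 60 Hz display

def present(wait_for_vblank):
    """Stand-in for the driver's present/flip call."""
    print("present with vsync", "on" if wait_for_vblank else "off")

def adaptive_present(last_frame_time):
    # If the previous frame took longer than one refresh interval, the GPU
    # can't hold 60 fps, so present immediately instead of dropping to 30 fps.
    # Otherwise sync to vblank so the frame rate is capped and tearing avoided.
    present(wait_for_vblank=(last_frame_time <= REFRESH_INTERVAL))

adaptive_present(1 / 90)   # GPU running ahead of the refresh -> vsync on
adaptive_present(1 / 45)   # GPU running behind the refresh   -> vsync off
```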

I could also imagine how it would make gameplay feel so much smoother.

It just seems like a ridiculously awesome technology, almost "stupidly obvious" to try to do, and I'm glad that nVidia came up with it and has apparently implemented it so well. I'd love to try it on my 580.

Could you show what it does through a video, using FRAPS to show FPS and just a camera to record your monitor's screen?
That would probably show people how it's working, no? Since graphs can't do it justice.

I'm wondering... would this be of help even with games that perhaps aren't optimized so well, for someone who always runs VSync?

Guild Wars 2 would be an example, which needs some serious work on optimization right now. When there's a lot on-screen, damn, my FPS tanks, and I'm running a high-end system. I'm getting better FPS in Crysis, lol ;).

I've got to think of a game that, on my system right now, tends to run under 60fps so I can test it on a 580.

Could you use FRAPS to benchmark your FPS? Wouldn't that work to really show how Adaptive is working, by tracking your min/max/avg?

P.S.
Anyone else test on the 580 yet? Is it working as well? I'm trying to figure out which game I don't run maxed at 60fps that I could test with. Hoping my FRAPS idea will work as well.

There are already WHQL R300 drivers - http:///drivers/results/42929
 
THIS is the real solution, 120hz or go home
When you can find me a 120hz IPS panel that's higher resolution than 1920x1080, please let me know. TN panels are trash and 1080p isn't enough resolution anymore.
 
Good read, thank you. I've been forcing adaptive vsync globally and it works great. One of the best features added in a while.
 
Thank you so much for the article. I had always assumed this new process was how V-Sync always worked. I never would have figured V-Sync affected framerates lower than refresh. (I thought it was just a cap)

This could explain why my 6870 was stuttering on Skyrim @ 1080p Ultra with V-Sync on my stock i3-550 but not when I had it on my OC'd i7.

I thought something was wrong with the i3 machine, but what it looks like now is the i7 was pushing the 6870 hard enough to keep the frames at 60, but the i3 can't, resulting in what I'm assuming to be V-sync 30/60 stutter.

This could also explain why some people here claim to never have experienced the stuttering. If you are using high-powered cards + OC'd CPUs at 1080p or lower resolutions, or run older-engine games, chances are you are pushing 60+ frames and won't get stutter with V-Sync.

But with mid-range cards, running 2560 resolution or Eyefinity/nV Surround, especially on newer engines at max quality settings, you will.
 
There are already WHQL R300 drivers - http:///drivers/results/42929

Hmmm... I don't see any official WHQL 300 drivers on nVidia's actual site though... weird.

Well, I can report first findings:

- on GTX 580 in Saints Row: The Third, there's absolutely no difference except one for the worse. Using Adaptive VSync actually causes very noticeable screen tearing.

I watched FPS closely using FRAPS and used the benchmark tool, and the results were basically exactly the same in terms of FPS: no change, no benefit. There should be a gain in FPS if this were actually working "as if not having vsync on at all" except when capped at 60fps.

So, not sure... maybe it doesn't work as well/at all with the 580, or with some games, who knows. I've got nothing to judge it against. But, if Adaptive were working as claimed, then general FPS should be higher, because I'm always under the 60fps mark with vsync on in Saints Row. I even tried turning off in-game vsync, no difference.

So, thus far, it's pointless for me on the 580 in one game. Actually caused screen tearing, which I'm very sensitive to in-game.
 
Maybe try a non-console port? Some ports have funny things happening with fps capping etc already.
(I'm not trying to troll, I'm just suggesting a possible explanation)
 
Maybe try a non-console port? Some ports have funny things happening with fps capping etc already.
(I'm not trying to troll, I'm just suggesting a possible explanation)

I don't think you're trolling, and it's a good point.

Problem is, with Saints Row: The Third, it wasn't ported; the PC version was developed specifically for PC (supposedly), not just ported over.

I'm not sure what else I could try, since I max everything at 60fps as it is with every game, almost never an fps drop, and if there is one, it's not enough that I think Adaptive VSync would make the gameplay look/feel any smoother or really increase the fps.

Disappointing, because I'd love to try out and see results from what seems like a really great implementation with this tech right now. Seems like [H] had great results and really noticed a difference. Guess it will really depend on the game you're playing then, perhaps?
 
I don't see why you would need it. Triple buffering is helpful when you dip below the refresh rate with vsync on, but with adaptive vsync, vsync will disable below your refresh rate anyway.
Ok thanks. So should I disable it in-game with adaptive enabled or would it make no difference anyway?
 
When you can find me a 120hz IPS panel that's higher resolution than 1920x1080, please let me know. TN panels are trash and 1080p isn't enough resolution anymore.

Can't do it; you know just as well as I do that it does not exist.

But come on now, it's not that bad. My Samsung S27A950D does look ok. I think the 120Hz outweighs its flaws.

But yes, it would be nice to see 120Hz moved to IPS panels.

I do not think the resolution issue will be resolved anytime soon, as I think greater than 1080p at 120Hz is out of spec for all cable/connector standards.

---

Someone a few posts back talked about seeing more tears at 120Hz. I do not know if there are more, but yes, at 120Hz you can still see them pretty clearly. That is clear proof that even 120Hz is not enough. It is also why I would like to be able to set the limit points for 120Hz use vs 60Hz use. In its 60Hz state, the GTX 680 feature is useless to me.
 
Can't do it; you know just as well as I do that it does not exist.

But come on now, it's not that bad. My Samsung S27A950D does look ok. I think the 120Hz outweighs its flaws.

But yes, it would be nice to see 120Hz moved to IPS panels.

I do not think the resolution issue will be resolved anytime soon, as I think greater than 1080p at 120Hz is out of spec for all cable/connector standards.

---

Someone a few posts back talked about seeing more tears at 120Hz. I do not know if there are more, but yes, at 120Hz you can still see them pretty clearly. That is clear proof that even 120Hz is not enough. It is also why I would like to be able to set the limit points for 120Hz use vs 60Hz use. In its 60Hz state, the GTX 680 feature is useless to me.
I know it doesn't exist; that's exactly my point. I play more RPG/Strategy/Adventure games than I do twitch FPS, so image quality is more important to me than refresh rate. Plus, I hate having bad viewing angles.

I've debated picking up an S27A950D to try out 120Hz because I hear that the newer TN panels do look better, but I've been running 1920x1200 for 2 years and now I have a 2560x1440 panel, and 1080p would be a huge downgrade in that respect. :\
 
This is exactly what I want: Vsync without the input lag. I have to go nVidia next time around. Thanks for the review!
 
This is exactly what I want: Vsync without the input lag. I have to go nVidia next time around. Thanks for the review!

You can use any GPU you want with Virtu MVP Virtual Vsync.

So really, you just need to go Ivy Bridge, at least for your motherboard anyway.
 
While this is a step forward (thanks nVidia), I would really like to see more of Lucid's Virtual Vsync in action.

It seems like that's a better way to get at the problem (no vsync input lag and also no tearing).

I'm sensitive to both... but I gotta compromise...
 
While this is a step forward (thanks nVidia), I would really like to see more of Lucid's Virtual Vsync in action.

It seems like that's a better way to get at the problem (no vsync input lag and also no tearing).

I'm sensitive to both... but I gotta compromise...

Lucid's implementation is pretty interesting, though I can certainly see how it could be quite buggy after reading up on how it works. I was going to try it out once I get my 3770K, but the game I want to try it on most is BF3, and that's one of the games that's buggy and causes stuttering. It would be nice if AMD/nVidia licensed the technology from Lucid and implemented it in their drivers.
 
For the authors: one quick nit - did you mean Batman: Arkham City, and not Arkham Asylum? Arkham Asylum never had support for DirectX 11 or tessellation.
 
You can use any GPU you want with Virtu MVP Virtual Vsync.

So really, you just need to go Ivy Bridge, at least for your motherboard anyway.

Hmm, I've never heard of that "Virtu MVP" you're talking about. I've been out of the loop for a while. I'm guessing it has to do with the new Intel Ivy Bridge CPU and associated mobo? Guess I should look into that.
 
You can use any GPU you want with Virtu MVP Virtual Vsync.

So really, you just need to go Ivy Bridge, at least for your motherboard anyway.

We're going to have to see that in action first - and a lot hinges on Lucid's drivers, which may be problematic.
 
We're going to have to see that in action first - and a lot hinges on Lucid's drivers, which may be problematic.

And for that reason, I don't think it will see widespread adoption. I just don't see Lucid staying on top of driver updates as often as they may need to be, and it brings yet another driver update for the end user to consider. It just seems to depend on too many variables.
 
Can't say I've ever noticed jumping between 30 and 60 before either. I often enable vsync to avoid tearing, and I know I've seen frame rates in the 40s and 50s before.

I'm going to have to pay more attention next time I play. Currently playing Metro 2033 with max settings.
 
You just claimed the in-game vsync was making you go from 60 to 30. You then say using vsync from the CCC did not do that. I just told you that is because turning it on from the CCC does NOT actually work in DX games. That means you are running the game without vsync on if all you are doing is forcing it on from the CCC. If that is what you have been doing for all your games, then you are never seeing it go from 60 to 30 because you do NOT even have vsync on.

Wrong. It's triple buffering, not VSync, that does not work in DirectX. Forcing VSync at the driver level works fine in DirectX and always has; in fact, that's how I've almost always done it, even when I used Direct3D Overrider with Windows XP. I'd be happy to post a video to YouTube with it on and with it off, from the driver control panel, so you can clearly see that a) forcing VSync in DX does actually work, and b) forcing it from the driver level eliminates jumping.


p.s. - and don't tell me that FRAPS doesn't show everything, because FRAPS can definitely distinguish between a steady frame rate in the 40s or 50s, and jumping from 60 to 30.



Can't say I've ever noticed jumping between 30 and 60 before either. I often enable vsync to avoid tearing, and I know I've seen frame rates in the 40s and 50s before.

I'm going to have to pay more attention next time I play. Currently playing Metro 2033 with max settings.


If you force VSync at the driver level instead of using in-game options, there's no jumping. That's why I was confused too at what Adaptive VSync is really supposed to be fixing.
 
Last edited:
I dunno... forcing VSync in AMD 12.3 did nothing for TERA Online and Dragon Nest... I had to use D3DOverrider in order to stop the tearing.
 
If you force VSync at the driver level instead of using in-game options, there's no jumping. That's why I was confused too at what Adaptive VSync is really supposed to be fixing.

That's only true if forcing it in the driver also turns on triple buffering. No matter where Vsync is enabled, double-buffered Vsync will lock you to integer fractions of the refresh rate (60, 30, 20, 15...). It's just the way it works.
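A quick back-of-the-envelope, assuming a constant render time per frame, of why double buffering snaps to those rates:

```python
import math

def double_buffered_vsync_fps(uncapped_fps, refresh_hz=60):
    # With only two buffers, a frame that misses a vblank leaves the GPU idle
    # until the next one, so the effective rate snaps to refresh / 1,
    # refresh / 2, refresh / 3, ... (60, 30, 20, 15, ...).
    refreshes_per_frame = math.ceil(refresh_hz / uncapped_fps)
    return refresh_hz / refreshes_per_frame

for fps in (75, 59, 45, 31, 29):
    print(f"{fps:2d} fps uncapped -> {double_buffered_vsync_fps(fps):.0f} fps with double-buffered vsync")
```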
 
I agree with most of that, but actually the display can play a role. I have compared it on different monitors and certainly noticed a difference. My CRT tore less than my LCDs at the same settings. A 120Hz screen will actually tear a bit less than a 60Hz screen. Even different 60Hz LCDs can tear a bit differently too.

I respectfully disagree with that, as I've played games across a wide variety of displays and I've seen bad tearing across all of them, even on my old 17" CRT monitor. LCDs in my experience are no worse than CRTs for showing screen tearing, but, of course, the amount of tearing and how obvious it is can vary not only between games but also in the same sections of the same game, so it is very difficult to quantify or prove.
 
For those of you who say triple buffering is sufficient, John Carmack disagrees:
https://twitter.com/#!/ID_AA_Carmack/status/190111153509249025

Again I disagree completely with that statement, as I've been using triple buffering in both OpenGL and Direct3D games for many years now, and all I've noticed is how much smoother the gameplay feels in comparison to double-buffered, v-synced games. As I cannot stomach screen tearing, even in minor amounts, I've only briefly played games with v-sync off, but the feel didn't seem any different from that of a triple-buffered, v-synced game; both ran fine without juddering/hitching, only the game without v-sync obviously had much inferior image quality!
 
Monitors do not tear. Frames tear as a result of a lack of synchronization between the GPU's output and the display's refresh. Thus, the type, make, model or frequency of the display has no bearing on which frames will be torn and which will not.

Exactly. I wish I'd written that!

I've seen screenshots of tearing in action that were captured with a piece of hardware directly from the frame buffer before the image was sent to the display, so that is proof that the display has no bearing on whether tearing happens or not.
 
Kyle, with your connections at AMD, would you happen to know if they'll be implementing anything like this in the not-too-distant future?

I am very sensitive to screen tearing and always play with VSync on if performance allows. I'd love to see this show up for Team Red... otherwise this would be one more reason for me to go with nVidia for my next upgrade.

This. I'm also quite sensitive to tearing so I'm jealous of this new feature... I won't be upgrading my two 6950s until next year tho, but this is something that would definitely carry weight when I do.
 
Yeah, I don't see how tearing could somehow be magically gone with fps less than the refresh rate either.

The whole point of vsync is to *synchronize* the frames the video card sends with the monitor's refresh/update. That's part of the reason you often have so much input lag with vsync, since it holds back frames, causing a "delay" between mouse/camera movement and the actual change in-game. Without vsync there is no synchronization; the video card just spits out frames as soon as it's made them. If a frame is only half-finished when the monitor does a refresh, then there will be tearing. I can see how the tearing would be worse at higher fps, but it should still be there even with fps below the monitor's refresh.

It is. The article is incorrect IMO, as many others have pointed out. If a game with v-sync disabled drops to, say, 50 fps on a 60 Hz display (as would happen with Adaptive V-Sync) and stays at that level for a few seconds (unlikely, but let's pretend it does for the sake of argument), then it will tear during that time, because 50 complete frames cannot be displayed in 60 refreshes of a screen; some images output to the screen will contain parts of other frames (60/50 = 1.2). How noticeable the tearing is depends on how much movement is occurring in the game; for example, in a racing game where you're moving forward, tearing would be less noticeable, at least until you get to a corner, and then it usually isn't!

I've tested Adaptive V-Sync with a number of games, and in all cases the games tore when the framerate dipped below 60 fps but were fine when the framerate stayed at 60 fps. The tearing was especially bad in Metro 2033, for example, where the framerate dipped below 60 fps more often than in the other games I tested (I was playing on DX11, Very High, 4xAA, DoF @ 1920x1200 settings). Adaptive V-Sync, in my view, is best used in games that maintain 60 fps (or 30 fps with the Half-Refresh option) more often than not.
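If it helps, here's that arithmetic as a little model (my own simplification: scanout takes the whole refresh interval, the buffer swaps the instant a frame finishes, and a tiny phase offset stands in for the GPU and display clocks not being locked):

```python
def torn_refreshes_per_second(fps, refresh_hz, seconds=10, phase=0.003):
    # Every buffer swap that lands mid-scanout splits that refresh between two
    # frames -- a visible tear line.
    swaps = [phase + i / fps for i in range(int(seconds * fps))]
    torn_refreshes = {int(t * refresh_hz) for t in swaps}
    return len(torn_refreshes) / seconds

print(torn_refreshes_per_second(50, 60))   # -> 50.0: ~50 of every 60 refreshes show a tear
```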
 
If you force VSync at the driver level instead of using in-game options, there's no jumping. That's why I was confused too at what Adaptive VSync is really supposed to be fixing.


I don't remember ever enabling it from the driver.
 
Wrong. It's triple buffering, not VSync, that does not work in DirectX. Forcing VSync at the driver level works fine in DirectX and always has; in fact, that's how I've almost always done it, even when I used Direct3D Overrider with Windows XP. I'd be happy to post a video to YouTube with it on and with it off, from the driver control panel, so you can clearly see that a) forcing VSync in DX does actually work, and b) forcing it from the driver level eliminates jumping.


p.s. - and don't tell me that FRAPS doesn't show everything, because FRAPS can definitely distinguish between a steady frame rate in the 40s or 50s, and jumping from 60 to 30.






If you force VSync at the driver level instead of using in-game options, there's no jumping. That's why I was confused too at what Adaptive VSync is really supposed to be fixing.
Again, you have no idea what you are talking about. You can claim that forcing it on from CCC has always worked, but you are flat-out wrong. Ever since Vista, forcing vsync from CCC has rarely worked, and that is a fact. Not only have I used ATI/AMD cards, but even a 10-second Google search will show you that. You are getting different framerates when using the in-game vsync because when using it from the CCC, vsync is not actually getting applied in most cases. You have to use something like D3DOverrider to consistently force vsync on for AMD if you are not using the in-game vsync.
 
I would imagine that AMD has taken notice of this and will probably develop a similar solution but I gotta hand it to Nvidia for blazing a trail. Looks like they did it right too.
 
Just buy a 120hz LCD and problem solved :) vsync off, no tearing. Been using one for the last year or so, love it.
I've got a Samsung 120Hz monitor and a couple of GTX 580s in SLI; in Crysis 2, I get more than 120 fps with VSync off, but I am limited to 50 fps in both "VSync on" and "Adaptive VSync" modes???

Thanks for your advice.
 
It is. The article is incorrect IMO, as many others have pointed out. If a game with v-sync disabled drops to, say, 50 fps on a 60 Hz display (as would happen with Adaptive V-Sync) and stays at that level for a few seconds (unlikely, but let's pretend it does for the sake of argument), then it will tear during that time, because 50 complete frames cannot be displayed in 60 refreshes of a screen; some images output to the screen will contain parts of other frames (60/50 = 1.2). How noticeable the tearing is depends on how much movement is occurring in the game; for example, in a racing game where you're moving forward, tearing would be less noticeable, at least until you get to a corner, and then it usually isn't!

I've tested Adaptive V-Sync with a number of games, and in all cases the games tore when the framerate dipped below 60 fps but were fine when the framerate stayed at 60 fps. The tearing was especially bad in Metro 2033, for example, where the framerate dipped below 60 fps more often than in the other games I tested (I was playing on DX11, Very High, 4xAA, DoF @ 1920x1200 settings). Adaptive V-Sync, in my view, is best used in games that maintain 60 fps (or 30 fps with the Half-Refresh option) more often than not.
As far as I understand, at 50 FPS your screen will draw the same frame 1/6th of the time, so there's no tearing.

Above 60 FPS, the card sometimes sends two frames during the same refresh cycle and that's why screen tearing occurs.
 
I agree with most of that, but actually the display can play a role. I have compared it on different monitors and certainly noticed a difference. My CRT tore less than my LCDs at the same settings. A 120Hz screen will actually tear a bit less than a 60Hz screen. Even different 60Hz LCDs can tear a bit differently too.


I respectfully disagree with that, as I've played games across a wide variety of displays and I've seen bad tearing across all of them, even on my old 17" CRT monitor. LCDs in my experience are no worse than CRTs for showing screen tearing, but, of course, the amount of tearing and how obvious it is can vary not only between games but also in the same sections of the same game, so it is very difficult to quantify or prove.


Kinda disagree too. I have used 120Hz LCDs and 200Hz monitors, and the observation of tearing is about the same as if it were a 60Hz setup. It's just as much of a problem at high refresh rates.
 
Kinda disagree too. I have used 120Hz LCDs and 200Hz monitors, and the observation of tearing is about the same as if it were a 60Hz setup. It's just as much of a problem at high refresh rates.
You both kind of missed what I was saying, which is that tearing will not always be exactly the same on different monitors. I have looked at it side by side and have seen tearing look different from monitor to monitor. For example, I have seen a game on my CRT have a small, consistent tear in the same place, but have a tear that looked like a wave in the same game on the same settings on my LCD.
 
Should have tested Rage, that game tears like a biatch without vsync on.

Seems like that would be a moot point though, as RAGE attempts to change visual settings on the fly to keep a constant frame rate, so you likely won't see framerates peak and dip, which is where adaptive vsync is the most useful.
 