NVIDIA Adaptive VSync Technology Review @ [H]

Yes, it comes down to the game, as some noticeably tear while others don't. Still, we have gamers who have been playing for years and yet act like tearing only happens above the refresh rate, and that drives me nuts.

To be fair, no one is born with that knowledge; everyone has to learn it somewhere, somehow. Even you had to, just as much as me and everyone else who is already aware of it. We're on this forum to learn as well as teach, after all. ;)
 
Honestly, I've never seen tearing under 60 FPS because I never play under 60 FPS; when my SLI setup drops under 60 FPS, I upgrade my PC :D
 
Faster? The AnandTech article seemed to indicate there was a small performance penalty, and it's very software dependent and doesn't work with every game (unlike NV's solution)... It might be more elegant on a technical level, but it looks like it's lagging in execution. I have two HD 6950s and I'd rather have NV's adaptive vsync, especially when it can be added at the driver level and doesn't require a new mobo. I hope AMD isn't banking on Virtu here...

I have one of these Virtu mobos, and here are my thoughts:

Virtu adaptive vsync: useless
nVidia adaptive vsync: useless

Why?
1. Test your game settings / look at a benchmark of your GPU.
2. Don't look at the average; look at the standard deviation: how much the FPS varies from a given value (30, 35, 40, 60, ...).
3. Choose your quality/performance ratio: maxed out at ~30 FPS, for example.
4. Turn on vsync.
5. Limit the FPS to ~33 using a console command, a video capture application, etc.
Result: you get optimum smoothness: minimal (near zero) variation in FPS and no tearing.

That's not my idea; consoles use this all the time.
Halo 3: 30 FPS
Forza 4: 60 FPS
The Hobbit movie: 48 FPS ^^
etc.

The control panel doesn't need adaptive vsync. It needs an FPS limiter (like racing simulators offer).
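
To be clear about what an FPS limiter buys you (and why it kills the FPS variation), here is a minimal sketch of the idea in Python, assuming a simple single-threaded game loop; the function names and numbers are hypothetical, not any particular limiter's implementation:

Code:
import time

TARGET_FPS = 33.0                  # slightly above the ~30 FPS quality target
FRAME_BUDGET = 1.0 / TARGET_FPS    # seconds allowed per frame

def render_frame():
    # Placeholder for the game's actual per-frame work (hypothetical).
    time.sleep(0.015)

for _ in range(100):               # short run, just for demonstration
    start = time.perf_counter()
    render_frame()
    # Sleep away whatever is left of the frame budget so every frame takes
    # about the same time; with vsync also on, presented frames land on
    # refresh boundaries and nothing tears.
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)

The point is that the limiter smooths frame times below the cap while vsync handles alignment with the display; adaptive vsync on its own does neither of those jobs below the refresh rate.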

Now: TXAA and a TDP that only needs two 6-pin connectors: that's why Kepler is the only current choice.
AMD needs to launch another card or cut prices.

But if the consumer needs more than two clicks to configure anything, he's going to get upset and switch brands.
Only nVidia understands this.
 
Adaptive, when it works properly, is the perfect solution. The problem for me is that it doesn't always work properly. In some of my games it works perfectly, and in other games it isn't consistently turning on vsync at framerates above the refresh rate, so I still see tearing. Nvidia still has work to do to perfect this technology.

Borderlands 2 is an example of it not working right. When I use Adaptive I see tearing all over the place. Turning it off and just using in-game vsync eliminates the tearing completely.
 
It has been said over and over again: of course you will get tearing with adaptive vsync on if you are below your refresh rate.
 
That doesn't make any sense. If you got tearing with Adaptive every time the fps dropped below the refresh rate, it would be pretty worthless.

Please don't tell me you actually think tearing only occurs above the refresh rate. I just don't get why people say something like that when you can fire up a game and see tearing at ANY framerate. If you don't notice it below the refresh rate, then how do you notice it at all?

Yes, that's exactly what I thought. I don't use FPS counters in games, so maybe that's why I thought the way I did about vsync. I must be a dumbass, because this flies in the face of everything I thought I knew about vsync. Why in HardOCP's article on Adaptive was this never mentioned at all (about tearing below the refresh rate)?
 
That doesn't make any sense. If you got tearing with Adaptive every time the fps dropped below the refresh rate, it would be pretty worthless.
You really need to read up on this, because that is exactly what happens. The whole point of it is to use vsync at your refresh rate and then disable it when you drop below the refresh rate so it does not stutter. So again, if you are below your refresh rate with adaptive vsync on, you will get some tearing, because vsync is not enabled at that point.
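
In code terms, the per-frame decision is roughly this; a minimal Python sketch of the concept only, not NVIDIA's actual driver logic, and set_vsync / current_fps are hypothetical placeholders:

Code:
REFRESH_RATE = 60  # display refresh in Hz

def adaptive_vsync(current_fps, set_vsync):
    # Sketch: keep vsync on only while the game can feed the display
    # a new frame every refresh; otherwise present immediately.
    if current_fps >= REFRESH_RATE:
        set_vsync(True)    # capped at the refresh rate, no tearing
    else:
        set_vsync(False)   # avoids vsync stutter, but tearing becomes possible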
 
It occurs less, and/or it's less noticeable, below your refresh rate; it can still happen, though.
 
Yeah, they say tearing is less noticeable at low framerates, but that hasn't been very accurate in my experience. The games with the lowest framerates are usually the ones I notice the most tearing in.
 
Why in HardOCP's article on Adaptive was this never mentioned at all (about tearing below the refresh rate)?

The article does mention it, on the first page:

"The consequence is called 'tearing,' and it is a very real visual anomaly that you will notice more as you play your games as the framerate exceeds the refresh rate. Tearing is described as a frame literally breaking in half, or sometimes even in three parts, and part of the frame lagging behind the other part of the same frame. The result is a visually distorted image that can bother gamers. Note that tearing can technically occur if the framerate doesn't exceed the refresh rate, but it is much less likely to be noticed."

http://www.hardocp.com/article/2012/04/16/nvidia_adaptive_vsync_technology_review

Some games don't tear at all below refresh rate, and Adaptive VSync is great for those titles. Some games tear a lot below refresh rate, and Adaptive VSync sucks for those titles.
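
To see why tearing below the refresh rate is even possible, here is a rough back-of-the-envelope sketch (Python, with made-up frame times): with vsync off, a frame is presented the moment it finishes rendering, and unless that moment happens to line up with the start of a ~16.7 ms scanout at 60 Hz, the swap lands mid-scan and the screen shows parts of two frames, regardless of whether the game is above or below the refresh rate.

Code:
REFRESH_HZ = 60
SCANOUT_MS = 1000.0 / REFRESH_HZ          # ~16.7 ms per refresh

# Hypothetical frame times for a game running around 40 FPS (below refresh)
frame_times_ms = [25.0, 24.0, 26.0, 25.5]

t = 0.0
for ft in frame_times_ms:
    t += ft                               # moment the frame is presented (vsync off)
    offset = t % SCANOUT_MS               # how far into the current scanout the swap lands
    mid_scan = 0.5 < offset < SCANOUT_MS - 0.5   # not aligned with a refresh boundary
    print(f"present at {t:6.1f} ms, {offset:4.1f} ms into scanout -> tear: {mid_scan}")

How visible that tear is depends on how different the two frames are, which is why some games seem immune below the refresh rate and others don't.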
 
I've got to pay more attention with an FPS counter (which I don't ordinarily use) to see whether I see tearing below the refresh rate. I'm one of those "show me" kind of guys, so I will have to see this for myself. I looked at Borderlands 2 again using the Precision X FPS counter, and now Adaptive is working fine; I see no tearing. The FPS counter confirmed what I already knew: this game does not go below the refresh rate on my system, so the tearing I was seeing was definitely not from below the refresh rate.

I don't know what the deal was before, when Adaptive didn't appear to be turning on for this game. I had vsync off in the game and Adaptive on in the Control Panel, and vsync just appeared to be off because of the severe tearing (and the FPS was above the refresh rate). I cycled through turning vsync on in game, turning off Adaptive, turning Adaptive back on, and turning vsync in game back off, and now Adaptive seems to be working fine (with no tearing). There are issues with how Adaptive is implemented; sometimes I have to fiddle with it like in the example above.

You guys are saying that with Adaptive on, you will get tearing when the FPS drops below the refresh rate. Fine. I will have to see it for myself to believe it. So far I haven't; any tearing I have seen to date has been above the refresh rate, not below it. I'm not saying anyone is wrong, I just haven't seen it. I was gaming for years on a CRT monitor with a 100 Hz refresh rate with vsync off and never once saw any tearing. What are the odds of that when you know the games were often running below 100 FPS? As soon as I got an LCD monitor (60 Hz), I began to see tearing immediately, and it was from above the refresh rate, not below it. So you can see I'm a bit skeptical.
 
I can't even begin to comprehend how some people claim to never see tearing below the refresh rate. We don't need a scientific study or an in-depth analysis; simply looking at the screen is all it takes to see that tearing does occur. There is nothing magical that happens below your refresh rate to stop tearing. If frames are out of sync with the display, they are out of sync, and tearing occurs. It varies widely from game to game: one game can tear insanely badly at 25 FPS, while another may hardly tear at all at 150 FPS.
 
Try comprehending a little less, dude; it's going to give you a headache. People see what they see, no matter how hard it is for you to accept. End of story. I'm done debating with the know-it-alls about what one person sees and another doesn't.
 
For the in-game settings, do we need V-Sync enabled/on for this to work? And of course set the NVIDIA Control Panel option to Adaptive.
 
The control panel setting overrides the game setting, but you can just leave in-game vsync off to make sure there are no conflicts.
 
I was wondering the same thing, but I recently ran into a post on the Nvidia forums where a mod stated that you must disable V-Sync in-game and enable it in the Control Panel in order for Adaptive V-Sync to work correctly.
 