Wait, shouldn't Vsync negate all microstutter with SLI/CF systems?

xebo

I've been in the market for a new system/video card setup for a while. One of the main things influencing my decision to go single or multi-GPU (Nvidia/ATI) has been microstutter.

If I'm understanding everything correctly:

1. Vsync makes your graphics card wait until your monitor is ready to refresh before sending in a new frame
2. Microstuttering is when your graphics card is putting out frames at varying/unequal intervals of time (frame A is on your screen for 20ms, frame B is on your screen for 60ms, etc.), which makes gameplay look choppy.
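
To put numbers on point 2, here's a rough sketch (Python, with made-up frame times) of how two runs with the same average FPS can feel completely different:

```python
# Made-up frame times in ms: same average FPS, very different pacing.
uneven = [20, 60, 20, 60, 20, 60]   # alternating short/long frames (microstutter)
even   = [40] * 6                   # perfectly paced frames

for name, times in (("uneven", uneven), ("even", even)):
    avg_fps = 1000 * len(times) / sum(times)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {max(times)} ms")
# Both average 25 fps, but the uneven run feels like its 60 ms frames.
```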

So, if that is correct, wouldn't turning Vsync on when your FPS is higher than your monitor's refresh rate completely remove all microstutter? Consider this incredibly scientific example:

[Attached image: hBMK4.png]


It seems like, with Vsync ON and FPS higher than your monitor's refresh rate, all microstuttering should go away completely. If this is correct, then I'd like to mention a few more things that could use some verification or debunking by the community here:

1. With FPS > refresh rate, input lag is nonexistent.
2. Triple buffering removes all (or just most?) of the performance loss when your FPS falls below your refresh rate with vsync on.

If the above 2 points and my assumption about microstuttering are correct, I see no reason not to go Crossfire with vsync on for gaming, and just ensure your FPS stays above your refresh rate.
 
For me the smoothest gameplay experience does come from situations where I either have VSync enabled or I'm getting 100+ fps with VSync off.

I don't really notice any input lag with VSync on.

There is still a performance dip with triple buffering; your FPS will fall to 45. With triple buffering disabled it would fall to 30.
 
I was wondering this as well but haven't been able to get a clear consensus on the subject. It does seem that if you're powering along at greater than 60 FPS (minimum), turning on vertical sync would smooth it all out. In actual practice I have no idea (and that's what's held me back from Crossfire).
 
Here is a talking point that I'd love to kick start, if anyone has the knowledge to discuss it:

[Attached image: kIZOD.gif]


I've marked where, on a frametime-vs-frame benchmark of a microstuttering system, a 120Hz monitor can be expected to refresh. Two notes: that benchmark shows 75 frames per second, and a 60Hz monitor would only refresh once per interval (using the top line, at around 16ms). If I turn on Vsync, would the game, in the 60Hz or 120Hz case, display the same frame 2+ times in a row?

In the 120Hz case, it looks like it would for the larger bars. In the 60Hz case (using only the top line as a reference), it seems like it would only show 1 frame per refresh, but it would delay the smaller bars quite a bit (keep them on the screen longer). What does it mean that the tops of the large bars are cut off by the highest refresh point? Do those extra parts accumulate over time, leading to a repetitive "hitch" every now and again?

The picture seems to suggest, at least to a layman, that (with vsync on) if FPS is higher than your refresh rate you'll experience a steady game with mouse lag, and if your FPS is lower than your refresh rate you'll experience microstutter.
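
If it helps, here's the toy model I have in my head (Python, with made-up frame completion times roughly matching the alternating pattern in that chart, so don't take it as how any driver actually works):

```python
# Toy model, not real driver behaviour: frames finish at made-up times
# alternating between a short and a long gap (~75 fps average, like the chart),
# and at each refresh a vsynced display shows the newest finished frame.
gaps = [6.0, 20.6] * 8                    # ms between finished frames
finish, t = [], 0.0
for g in gaps:
    t += g
    finish.append(t)                      # when each frame becomes available

for hz in (60, 120):
    period = 1000.0 / hz
    shown, t = [], period
    while t <= finish[-1]:
        ready = [i for i, f in enumerate(finish) if f <= t]
        shown.append(ready[-1] if ready else None)   # newest frame ready at this refresh
        t += period
    print(f"{hz} Hz shows frame indices: {shown}")
# Repeated indices at 120 Hz are refreshes where the same frame stays on screen.
```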

What do you guys think?
 
Vsync does help with microstutter, as does any FPS cap, as long as your FPS is higher than the limit. But if it drops below that limit, the stutter will be back, right when the FPS is lowest. So if you play games with Vsync on and your FPS never goes below 60, micro-stutter won't be a problem. Unfortunately, for setups like Eyefinity, having 60+ FPS at all times is impossible even with CF unless you drop settings. Also, since in most games FPS varies quite a bit, making sure CF stays above 60 would mean that in less demanding scenes performance would simply be wasted.

As long as AFR micro-stutter is around, CF/SLI is of limited use when playing at <60 FPS.
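
The cap works because it forces even spacing: the game just sleeps off whatever is left of each frame's time budget. A minimal sketch (Python stand-in, not what any particular limiter actually does):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    pass  # stand-in for the real game/render work

for _ in range(600):                      # run a few hundred frames
    start = time.perf_counter()
    render_frame()
    left = FRAME_BUDGET - (time.perf_counter() - start)
    if left > 0:
        time.sleep(left)                  # even out the frame interval instead of racing ahead
```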
 
Yes, this is why I play BF3 on my 120Hz Eyefinity setup at 120+ FPS. Zero micro-stutter. Takes a 5.3GHz 2700K and highly overclocked quad CrossFire though!
 
Yes, this is why I play BF3 on my 120Hz Eyefinity setup at 120+ FPS. Zero micro-stutter. Takes a 5.3GHz 2700K and highly overclocked quad CrossFire though!

Oh fantastic, someone with first hand experience. Could you go into detail regarding:

1. How do you know there is zero microstutter?
2. What settings are you using that require a 2700K? (2500K for medium?)
3. Are you getting any mouse lag/delay? Have you tested to confirm/deny this?
4. Triple buffering? Double buffering?
 
you lost me.

Vsync is a software setting. It makes the software ATTEMPT to render and display frames at the same rate the display refreshes. If the video card cannot keep up for any reason, the view will look skippy, and that is what microstutter will do regardless of vsync.
 
Here is a talking point that I'd love to kick start, if anyone has the knowledge to discuss it:
[Attached image: kIZOD.gif]

Aren't stutter and microstutter just low FPS that doesn't show up if you display the average frames per second?
In the previous example, half the frames are at 50 FPS and half at 166 FPS.
Having Vsync on isn't going to match the monitor's refresh rate of 120 or 60 when it is only 50 for half the frames.
I think having half of the frames running 2/3 slower would be noticeable to you, so I guess the theory is vsync would limit it to a maximum of 60.

I've seen benchmarks of dual cards where the maximum FPS is dramatically higher but the minimum FPS isn't any better than a single-card solution.
In an instance where some of the frames are rendered under 30 FPS and some are over 200 FPS, this could show an average FPS of over 100.
The question is how much of the 1-second interval has to be spent rendering at low frame rates for you to notice it by feeling mouse lag.
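
A quick back-of-the-envelope on that 50/166 split, just to show how the average hides it (the exact numbers are made up):

```python
# Made-up numbers: half the frames at 50 fps (20 ms), half at ~166 fps (6 ms).
frame_times_ms = [20, 6] * 50                      # 100 alternating frames
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
slow_time = sum(t for t in frame_times_ms if t >= 20) / sum(frame_times_ms)
print(f"average: {avg_fps:.0f} fps")               # ~77 fps, looks fine
print(f"share of time spent in 20 ms frames: {slow_time:.0%}")   # ~77%
```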
 
For me the smoothest gameplay experience does come from situations where I either have VSync enabled or I'm getting 100+ fps with VSync off.
I don't really notice any input lag with VSync on.
There is still a performance dip with triple buffering; your FPS will fall to 45. With triple buffering disabled it would fall to 30.

If your monitor had a refresh rate of 60 and you were getting 50 FPS with Vsync off, would Vsync give you 30 FPS with triple buffering disabled and 45 FPS with it enabled?
What would your FPS be with Vsync on, if you were slowing to 38 FPS with Vsync off?
 
I like the effect of Vsync (no tearing) but every time I use it it seems like there is horrendous mouse lag. Any way to get rid of that?
 
I like the effect of Vsync (no tearing) but every time I use it it seems like there is horrendous mouse lag. Any way to get rid of that?

I've heard it thrown around that there are methods to do Vsync without input lag, but I haven't seen it. I too always get awful input lag with Vsync. I just can't stand it so I deal with tearing.
 
I've heard it thrown around that there are methods to do Vsync without input lag, but I haven't seen it. I too always get awful input lag with Vsync. I just can't stand it so I deal with tearing.

Same here. :( Unfortunate.
 
Yes you can have Vsync without input lag. Force triple buffering in an application like D3DOverrider.
 
Never experienced microstuttering. 2 GTX 580s in SLI.

From what I've read, SLI configs are less prone to have it, and the stronger your cards the less likely you'll notice. The worst setups are lower- to mid-range Crossfire.
 
I guess I don't really understand frame buffering in games at a fundamental level. The user's movement is non-deterministic, so doesn't the GPU need to wait and see what is in the user's view at the time before rendering the scene? If I swipe my mouse back and forth randomly in an FPS, I am definitely changing the view faster than 60fps, so how is it the GPU can ever render two frames in advance without knowing what I'm going to be looking at on the 3rd frame?
 
It does wait, it's delaying frames, not predicting future ones.
 
It does wait, it's delaying frames, not predicting future ones.
So, at 40fps each frame is being shown for 25ms. With triple buffering on, does that mean the input lag is 50ms greater than with triple buffering disabled?
 
The exact number depends on a couple of things, but it does increase, yes, at least in current applications where triple buffering actually does render ahead. The linked Anandtech article goes into some detail.
 
No, it makes it worse. Have you read the last page of the article? Or is it actually possible today to do triple buffering that doesn't mean render ahead?

Uh no, render ahead is not triple buffering.

Triple buffering gives you all the benefits of double buffering with no vsync enabled in addition to all the benefits of enabling vsync. We get smooth full frames with no tearing. These frames are swapped to the front buffer only on refresh, but they have just as little input lag as double buffering with no vsync at the start of output to the monitor.

On my VSync setups, as soon as I enabled triple buffering it was immediately apparent and input lag virtually vanished.
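
Here's a toy sketch of the swap logic that quote describes (not real D3D or driver code, just the idea): the GPU keeps drawing into whichever back buffer is free, and on each refresh the display flips to the most recently completed frame, so stale frames never queue up.

```python
# Toy sketch of the idea in the quote above, not real D3D or driver code:
# two back buffers are drawn into alternately as fast as the GPU can go, and
# each refresh flips the newest completed frame to the front. Frames that get
# overwritten before a refresh are simply never shown, so no queue builds up.
class TripleBuffer:
    def __init__(self):
        self.back = [None, None]   # two back buffers
        self.draw_into = 0         # back buffer the GPU writes next
        self.newest = None         # index of the most recently completed frame
        self.front = None          # what the monitor is currently showing

    def gpu_finished_frame(self, frame):
        self.back[self.draw_into] = frame
        self.newest = self.draw_into
        self.draw_into ^= 1        # keep rendering into the other back buffer

    def on_vblank(self):
        if self.newest is not None:
            self.front = self.back[self.newest]   # flip the newest frame at refresh
        return self.front
```

That's the difference from a render-ahead queue, where finished frames wait in line and each one in the queue adds lag.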
 
Yes, this is why I play BF3 on my 120Hz Eyefinity setup at 120+ FPS. Zero micro-stutter. Takes a 5.3GHz 2700K and highly overclocked quad CrossFire though!

So did you actually get the quad CF and portrait Eyefinity working correctly?
 
Uh no, render ahead is not triple buffering.

Triple buffering gives you all the benefits of double buffering with no vsync enabled in addition to all the benefits of enabling vsync. We get smooth full frames with no tearing. These frames are swapped to the front buffer only on refresh, but they have just as little input lag as double buffering with no vsync at the start of output to the monitor.

On my VSync setups, as soon as I enabled triple buffering it was immediately apparent and input lag virtually vanished.

Wow, yeah, I just tried vsync + triple buffering (I usually ran with no vsync), and the difference is immediately obvious. EXCELLENT tip!
 
I guess I don't really understand frame buffering in games at a fundamental level. The user's movement is non-deterministic, so doesn't the GPU need to wait and see what is in the user's view at the time before rendering the scene? If I swipe my mouse back and forth randomly in an FPS, I am definitely changing the view faster than 60fps, so how is it the GPU can ever render two frames in advance without knowing what I'm going to be looking at on the 3rd frame?

That's why vsync gets the input lag, since it renders two frames to the buffers, so you're two frames behind what is actually happening in the game. Triple buffering drops that back to one since it employs three buffers that are filled and emptied on the fly so there should always be a frame to send. Of course vsync implementations in games can vary somewhat. Some games seem to drop to 30 FPS right after FPS drops below 60, while other games have better granularity, so vsync drops to 50 Hz etc. GPUs also have other things affecting input lag and frame balancing. There's a prerendered-frames setting in the Nvidia Control Panel, for example, that can balance frame output at the cost of input lag. Also, Nvidia claims they do frame balancing when the GPU sends the picture to the display, so it wouldn't show up in FRAPS. I haven't tested any SLI setups, so I don't know how well it works.

Here's a good article about vsync: http://www.anandtech.com/show/2794/1
And here's some CrossFire X tech talk, a bit dated though: http://techreport.com/articles.x/14284/2
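
For a rough feel of where the render-ahead lag comes from: input sampled when a frame enters the queue only reaches the screen about one display interval per queued frame later. The numbers below are just illustrative:

```python
# Rough illustration: at 60 Hz, each queued frame adds about one refresh
# interval of lag between sampling input and seeing the result on screen.
refresh_hz = 60
frame_ms = 1000 / refresh_hz
for queue_depth in (1, 2, 3):             # 3 is the commonly cited default flip queue
    print(f"queue depth {queue_depth}: ~{queue_depth * frame_ms:.0f} ms of added lag")
```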
 
That's why vsync gets the input lag, since it renders two frames to the buffers, so you're two frames behind what is actually happening in the game.
Unsynchronised rendering uses the same two buffers. The problem comes from the fact that vsync delays rendering.

Triple buffering drops that back to one since it employs three buffers that are filled and emptied on the fly so there should always be a frame to send.
The point of double buffering (synchronised or not) is making sure there's always a frame to send. The point of triple buffering is making sure there's always a vacant buffer to write to.

Of course vsync implementations in games can vary somewhat. Some games seem to drop to 30 FPS right after FPS drops below 60, while other games have better granularity, so vsync drops to 50 Hz etc.
That's just double vs. triple buffering. But if it's not following the typical sequence (60, 30, 20, 15...) then it should be continuous. I can't think of any example of (or any reason for) a different step size.
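
The step sizes fall straight out of the refresh rate: with double-buffered vsync, a frame that misses a refresh waits for the next one, so sustained rates land on refresh/n.

```python
# Sustained frame rates under double-buffered vsync land on refresh_rate / n.
refresh = 60
print([refresh / n for n in range(1, 5)])   # [60.0, 30.0, 20.0, 15.0]
```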
 
As stated in the Anandtech article, some games simply create a render-ahead queue instead of doing proper triple buffering.

Actually, I was thinking of testing how changing the flip queue would affect micro-stutter using CF. I think the flip queue is set to 3 by default (according to the article I referenced, that's because Microsoft set it to prevent Nvidia from using too-high render-ahead queues that caused input lag). I'm just not sure whether it's even possible to change it anymore with Radeons. Nvidia at least used to have the setting in their control panel, but flip queue was never in CCC. IIRC I could dig it up with ATT or RadeonPro though. Maybe it would help to increase it to 4 or more? I don't really mind a bit of lag in single-player games if it reduces micro-stutter.
 
With RadeonPro you can change it, that's for sure. Can't tell if it works, so good luck.
 
Uh no, render ahead is not triple buffering.

Triple buffering gives you all the benefits of double buffering with no vsync enabled in addition to all the benefits of enabling vsync. We get smooth full frames with no tearing. These frames are swapped to the front buffer only on refresh, but they have just as little input lag as double buffering with no vsync at the start of output to the monitor.

On my VSync setups, as soon as I enabled triple buffering it was immediately apparent and input lag virtually vanished.

As the Anandtech article stated, DirectX doesn't do triple buffering, but render ahead, and practically all games used render ahead for their triple buffering implementation. If you are seeing true triple buffering, then either games have finally started to support it, or you're playing with low(ish) fps.
 
From what I've read, SLI configs are less prone to have it, and the stronger your cards the less likely you'll notice. The worst setups are lower- to mid-range Crossfire.

On my brother's rig we have 560 Tis in SLI and still no microstutter.

I think for some it's a placebo effect because they hear about others that supposedly have it.
 
Personally, I'm more inclined to believe that people who claim they don't see it are either playing at 60+ FPS (possibly even with vsync on), or they just assume 30 FPS is supposed to feel like stuttering crap. After all, eyes tend to get used to lower FPS after a while.

But this is actually very easy to test with a game that has a built-in FPS cap command, like Crysis 2. Just up the detail level so FPS is <40 and then use sys_maxfps to set it a bit lower. Voilà, much smoother gameplay. :)
 
Personally, I'm more inclined to believe that people who claim they don't see it are either playing at 60+ FPS (possibly even with vsync on), or they just assume 30 FPS is supposed to feel like stuttering crap. After all, eyes tend to get used to lower FPS after a while.

But this is actually very easy to test with a game that has a built-in FPS cap command, like Crysis 2. Just up the detail level so FPS is <40 and then use sys_maxfps to set it a bit lower. Voilà, much smoother gameplay. :)

30 FPS does play like stuttering crap on single-card setups though... :confused:
 
I've had two 3870s in Crossfire, a 4870X2, and a 4870X2 + 4870 trifire, and I've honestly yet to deal with "microstuttering". Maybe I'm just a retard...
 
I've had two 3870s in Crossfire, a 4870X2, and a 4870X2 + 4870 trifire, and I've honestly yet to deal with "microstuttering". Maybe I'm just a retard...

No, you just don't suffer from gullibility and placebo effects. :) The phenomenon DOES exist, but it's so minor, and actually exists on single-card setups as well (uneven frame times), that it is ridiculously negligible.
 
I've used 2 SLI setups now and personally I've never noticed the phenomenon known as "microstutter" either. I don't doubt that it exists for some people (seemingly more CFX people like someone else said) but I've never had a problem.
 
As the Anandtech article stated, DirectX doesn't do triple buffering, but render ahead, and practically all games used render ahead for their triple buffering implementation. If you are seeing true triple buffering, then either games have finally started to support it, or you're playing with low(ish) fps.

I never said DirectX natively does triple buffering. That is why you need a program such as D3DOverrider running full-time to force the triple buffering. They don't call it D3DOverrider for nothing.

If anyone is using Vsync ON without D3DOverrider, they're doing it wrong, as there is a world of difference.
 
Much like the FXAA hack, D3D Overrider isn't 100% guaranteed to work either. Some Direct3D applications cannot be forced in this way, or other DLL hooks compete with each other resulting in one cancelling the other out (or something negative to that effect).

But D3D Overrider is pretty smart about what it's hooking into. So if the application already supports triple buffering natively or is not playing nicely, D3D Overrider lets the app go about its way sans the override.

In my own personal experience, D3D Overrider and 3rd party overlays like Xfire, Raptr or Steam play together weirdly and can make something like alt-tabbing a nightmare. But that's why game exception lists exist in those apps (I just disable the offending overlay outright).

A long time ago, Nvidia used to note in their control panel that forcing triple buffering only worked in OpenGL. I can see why this is a source of much confusion now. Why they (Nvidia) stopped mentioning that, I have no idea. Basically, D3D Overrider is the only (good) tool we've had for a long time that lets us force triple buffering on in Direct3D applications that do not natively support it. Buggy or not, it was better than nothing.

Come to think of it, does D3D Overrider work at all with a Direct3D 11 app? Unwinder hasn't updated D3D Overrider in quite some time.
 
For me the smoothest gameplay experience does come from situations where I either have VSync enabled or I'm getting 100+ fps with VSync off.

I don't really notice any input lag with VSync on.

There is still a performance dip with triple buffering; your FPS will fall to 45. With triple buffering disabled it would fall to 30.

Same here, somewhat: I find I get the smoothest gameplay when I get 90+ FPS with VSync off, but if I have VSync on I get input lag. Mind you, this is running two GTX 260 Core 216s in SLI. I always leave triple buffering on though; anyone know if turning it off will improve performance?
 