Okay, I posted this over at XS-forums as well, but I'd love to generate some awareness within the [H] community too. It seems to me that microstutter is a massively overlooked issue with multi-GPU setups, one which really makes a difference to gameplay. Since [H]ardOCP has always been at the forefront of testing games for playability, rather than just benchmark numbers, I thought it might be worth a cross-post.
Microstutter, for those that don't know, is the (rather crappy) name for irregular frame output - usually a result of multi-GPU setups operating in AFR (alternate frame rendering) mode. Since the eye judges smoothness by the gaps between frames, rather than the raw number of frames spat out, a game with irregular frame output will not look as smooth as one with uniform output.
It stands to reason, for example, that a game outputting two frames 0.1ms apart, followed by a 49.9ms gap until the next pair, would look like it was running at 20fps, even though the framerate counter would read 40fps (two frames every 50ms). This is the worst-case scenario for a two-GPU setup.
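Here's that arithmetic spelled out in a few lines of Python, using a synthetic frametime trace matching the example above (not real benchmark data):

```python
# Worst-case AFR pattern: frametimes alternate between 0.1 ms and 49.9 ms.
frametimes_ms = [0.1, 49.9] * 10  # 20 frames over 500 ms

total_s = sum(frametimes_ms) / 1000.0
counter_fps = len(frametimes_ms) / total_s    # what an FPS counter reports
perceived_fps = 1000.0 / max(frametimes_ms)   # limited by the longest gap

print(round(counter_fps), round(perceived_fps))  # -> 40 20
```

The counter happily reports 40fps, but with every second frame arriving almost on top of the previous one, what you actually see is one distinct frame every ~50ms.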
Aaaanyway. A while back I wrote a little program to quantify microstutter from FRAPS benchmarks. You can download it here. Basically it looks at the frame-by-frame variation from the local frametime, as a percentage of the average frametime (more details in the readme). From this you get a "microstutter index", which you can think of as a "percentage microstutter", 100% being the scenario I described above. It also shows an "effective framerate": the framerate a perfectly uniform output would need in order to look just as smooth, i.e. smoothness with microstutter taken into account.
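For anyone curious how an index like this might be computed, here's a rough Python sketch. The exact formula my tool uses is in the readme, so treat this as one plausible reading - the pairwise-deviation measure and the function name are my own choices for illustration:

```python
def microstutter_index(frametimes_ms):
    """Rough sketch of a 'percentage microstutter' measure: how far each
    consecutive pair of frametimes sits from an even split, averaged and
    expressed as a percentage of the mean frametime."""
    mean_ft = sum(frametimes_ms) / len(frametimes_ms)
    # Deviation of each adjacent pair of frametimes from their midpoint.
    devs = [abs(a - b) / 2.0 for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    return 100.0 * (sum(devs) / len(devs)) / mean_ft

print(microstutter_index([25.0] * 20))       # uniform output -> 0.0
print(microstutter_index([0.1, 49.9] * 10))  # worst case -> ~100
```

Perfectly uniform output scores 0%, and the 0.1ms/49.9ms worst case from above scores close to 100%, which matches how the index is meant to behave.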
Okay, now onto some examples. First I'll show you a snapshot of 35 consecutive frametimes taken from the middle of a Crysis benchmark on my GTX 480 SLI setup. Not all of the scene was as bad as this, but it gives you an idea of the problem:
In terms of the microstutter index, the runs that generated the plots above had the following results.
Single GPU (2560*1600, 4xAA):
SLI (2560*1600, 4xAA):
This isn't as bad as with my old 4870x2, but it certainly shows that microstutter is still with us.
Now, it's important to note that as the game becomes less GPU-limited, the microstutter effect reduces significantly. For example, the benchmark above had the GPUs running at near-100% load. At 1920*1200, 8xQAA, the GPUs sit in the region of 85-95% load for the most part, and the microstutter index drops significantly:
Okay, well those are my findings. What I'd love to see now is some results from ATI users with multi-GPU setups, so the two technologies can be compared. Please post any results you generate, but make sure you're really at, or close to, 100% GPU load; otherwise you will see an artificially low microstutter index. MSI Afterburner is a good way to check this.
My hope is that maybe, just maybe, if we can generate more awareness of this problem we can get ATI / nvidia driver teams to pay attention and do something about it. A 5% performance drop (say) would be a small price to pay for regular frame output. Of course, this will not happen until review sites start taking note of microstutter when reviewing cards!