A Way to Measure Micro-Stutter (?)

nsx241

There's a great article at Tech Report on the micro-stutter phenomenon.

http://techreport.com/articles.x/21516

It describes a metric that could be used to quantify it (the 99th percentile frame time). We all know average FPS hardly captures the whole picture, so it seems another metric like this could only help paint a better one.
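
For anyone who wants to poke at their own logs, here's a rough Python sketch of how you could pull the average FPS and that 99th percentile frame time out of a FRAPS frametimes dump. This is not TR's actual tooling, and the file name and two-column layout (frame number, cumulative time in ms) are just my assumptions about a standard FRAPS log:

```python
# Rough sketch, not TR's actual tooling. Assumes a FRAPS "frametimes" CSV
# with a header row and two columns: frame number, cumulative time in ms.
import csv

def load_frame_durations(path):
    """Return per-frame durations in ms from a FRAPS frametimes log."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                          # skip the header row
        stamps = [float(row[1]) for row in reader if row]
    # The log stores cumulative time, so difference consecutive timestamps.
    return [b - a for a, b in zip(stamps, stamps[1:])]

def percentile(values, pct):
    """Simple nearest-rank percentile; close enough for this purpose."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

durations = load_frame_durations("fraps_frametimes.csv")   # hypothetical file name
avg_fps = 1000.0 * len(durations) / sum(durations)
p99_ms = percentile(durations, 99)
print(f"Average FPS: {avg_fps:.1f}")
print(f"99th percentile frame time: {p99_ms:.1f} ms "
      f"(~{1000.0 / p99_ms:.0f} FPS equivalent)")
```

The idea behind the 99th percentile is simple: it's the frame time that 99% of your frames come in under, so a handful of long frames can't hide behind a high average.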

Interesting tidbit toward the end: both AMD and Nvidia are working on solving this problem, though apparently Nvidia has had a solution in their cards for a while now, called frame metering.
 
Fascinating article. Props to techreport for investigating the issue properly.
 
I always thought the older ATI cards had a weird micro-stutter that Nvidia cards didn't exhibit. This article really does show that micro-stutter is a real phenomenon and that the GPU vendors are well aware of it. Also pretty evil of them to keep selling their dual-GPU beasts without telling us about it.
 
I don't remember it being an issue in every game, but the HD3870 microstutter in FEAR was a damn outrage even with a single card. It made 70-80fps feel like 15. As soon as I upgraded to a 512MB HD4870 the stutter was completely gone, even at twice the detail level.
 
It's a great article. Saw it on /. and came here to see if it had been posted yet. Looks like it has.
 
Interesting article for sure, but there are a few things that still leave some questions in my mind.

-What they identify as "micro-stutter" seems to vary completely from game to game, with single-GPU cards regularly showing the same problem, and in some games showing it even worse. The author didn't seem to know why this was the case or even what was really going on.

-I'm not entirely convinced that Fraps is the best way to measure this issue. It doesn't take into account what impact running Fraps itself might have on the results, and even the author admits that it ignores additional processing that happens farther down the line.

-The author admits he didn't notice stutter while actually playing a game using one of the configs that supposedly measured among the worst:

The worst example we saw in our testing alternated between roughly six and twenty milliseconds, but it didn't jump out at me as a problem during our original testing. Just now, I fired up Bad Company 2 on a pair of Radeon HD 6870s with the latest Catalyst 11.8 drivers. Fraps measures the same degree of jitter we saw initially, but try as I might, I can't see the problem.
 
I know that on my 5870, micro-stutter always seemed to be related to refresh rates.
For instance, my TV's refresh rate is 60 Hz, but sometimes a game would run slightly off from that if there was no option to change it.
I know Half-Life 2 was a horrible offender, but with a console command tweak or two it could be fixed. Ditto with Far Cry 2. The real problem was with games that had no option to change or lock the refresh rate.
 
Everywhere else, benchmarks put 6870 CF above the GTX 580, but here, with individual frames analyzed, you can see that half the frames produced by 6870 CF have worse frame times than those from a GTX 580. This pretty much confirms micro-stutter is real.
 
Everywhere else, benchmarks put 6870 CF above the GTX 580, but here, with individual frames analyzed, you can see that half the frames produced by 6870 CF have worse frame times than those from a GTX 580. This pretty much confirms micro-stutter is real.

Except for the part where the author used that exact setup and didn't notice any problem during actual gameplay? I would say the only thing it "confirms" is that this issue is complicated at best.

Article said:
The worst example we saw in our testing alternated between roughly six and twenty milliseconds, but it didn't jump out at me as a problem during our original testing. Just now, I fired up Bad Company 2 on a pair of Radeon HD 6870s with the latest Catalyst 11.8 drivers. Fraps measures the same degree of jitter we saw initially, but try as I might, I can't see the problem.
 
The HD6870 CF and GTX560 Ti are clearly the worst-case scenarios.
What puzzles me, though, is the considerable difference between HD6950 CF and HD6970 CF, when by rights they should be almost identical.
 
Except for the part where the author used that exact setup and didn't notice any problem during actual gameplay? I would say the only thing it "confirms" is that this issue is complicated at best.

Because even at 20 ms the perceived frame rate is 50 fps, which is still quite acceptable. The problem is that the CrossFire setup did not provide any visible performance advantage over a single card despite the higher average FPS.
 
Indeed, it's the 50ms instances that become noticeable, and there were some of those, just not that many.
 
Definitely a real issue. I do wonder whether a lot of the differences that appear quite pronounced on the graph really matter that much to the eye.

This does lend credence to the idea that microstuttering becomes quite evident once you start to dip below 45 FPS with a multi-GPU setup.
 
That's one thing I've always liked about [H]ard; in addition to showing min FPS they show FPS over time in actual gameplay. I like to see how often a card dips low and how steadily it keeps FPS up.
 
To be fair, though, the Tech Report article goes further in depth: HardOCP only samples to the nearest second, whereas Tech Report is now looking at individual frames. As the article states, these discrepancies don't show up when sampling by the second.
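
Just to put some toy numbers on that (using the 6/20 ms alternation the article mentions, nothing measured by me), here's a quick Python illustration of how per-second averaging hides that kind of jitter:

```python
# Toy illustration: frames alternating between ~6 ms and ~20 ms (the worst
# case TR measured) still look fine when you only average over whole seconds.
frame_ms = [6.0, 20.0] * 500                 # 1000 frames of AFR-style jitter

total_s = sum(frame_ms) / 1000.0
print(f"Averaged per second: {len(frame_ms) / total_s:.0f} FPS")  # ~77 FPS, looks great
print(f"Slow frames alone:   {1000.0 / max(frame_ms):.0f} FPS")   # 50 FPS
print(f"Fast frames alone:   {1000.0 / min(frame_ms):.0f} FPS")   # ~167 FPS
```

The per-second number says ~77 FPS, while every other frame is effectively running at 50 FPS, and that's exactly the kind of thing that gets smoothed away in a by-the-second graph.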
 
The TR article is a great read; I recommend it to everyone I can!

I've done quite a few graphs myself, and the easiest one to read visually, for me personally, is a short take (500-1000 frames) of gameplay converted to FPS-equivalent numbers, like below.

[graphs: FPS-equivalent frame times for a single HD 6950 (upper) and 6950 CrossFire (lower)]
The upper one is with a single 6950, the lower with CrossFire. Unfortunately this was taken before I had installed the newest CAP, which improved CF scaling considerably but introduced much worse stutter than what you can see in this picture, which is quite mild in fact; in the bad case the whole graph is one big blue mess instead of a neat line. The data was captured with FRAPS frametime recording, and the frame time differences (in ms) were converted back to FPS numbers.
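
In case anyone wants to do the same, the conversion itself is trivial; something along these lines works (just a sketch, assuming matplotlib and a standard two-column FRAPS frametimes CSV, and the file name is made up):

```python
# Sketch of the conversion described above: per-frame durations (ms) from a
# FRAPS frametimes log, each plotted as its FPS equivalent (1000 / frame time).
import csv
import matplotlib.pyplot as plt

with open("fraps_frametimes.csv", newline="") as f:    # hypothetical file name
    reader = csv.reader(f)
    next(reader)                                       # skip the header row
    stamps = [float(row[1]) for row in reader if row]  # cumulative time in ms

durations = [b - a for a, b in zip(stamps, stamps[1:])]
fps_equiv = [1000.0 / ms for ms in durations[:1000] if ms > 0]  # short 1000-frame take

plt.plot(fps_equiv, linewidth=0.8)
plt.xlabel("Frame number")
plt.ylabel("FPS equivalent (1000 / frame time)")
plt.title("Per-frame FPS over a short gameplay take")
plt.show()
```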

-What they identify as "micro-stutter" seems to vary completely from game to game, with single-GPU cards regularly showing the same problem, and in some games showing it even worse. The author didn't seem to know why this was the case or even what was really going on.

Stutter is stutter, whatever causes it; in the end it's about frames not being output evenly. The AFR scheme in SLI/CF causes it, but some game engines handle frame delivery badly even with just one GPU: FC2 for example, and according to the article SC2 too. It depends on everything involved in getting the frame to the display (CPU, GPU, RAM, etc.), which is why it's so hard to test, pinpoint, and solve.
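
One crude way to put a number on "frames not being output evenly" is to look at how much each frame time differs from the one before it. Quick Python sketch, purely illustrative and not anything TR or the vendors actually use:

```python
def frame_to_frame_jitter(durations_ms):
    """Mean absolute difference between consecutive frame times, in ms."""
    deltas = [abs(b - a) for a, b in zip(durations_ms, durations_ms[1:])]
    return sum(deltas) / len(deltas)

smooth = [13.0] * 100                      # steady ~77 FPS, evenly spaced frames
afr_like = [6.0, 20.0] * 50                # same average FPS, alternating frame times
print(frame_to_frame_jitter(smooth))       # 0.0 ms -> perfectly even output
print(frame_to_frame_jitter(afr_like))     # 14.0 ms -> classic AFR-style jitter
```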

-I'm not entirely convinced that Fraps is the best way to measure this issue. It doesn't take into account what impact running Fraps itself might have on the results, and even the author admits that it ignores additional processing that happens farther down the line.

As stated in the article, Nvidia claims they do some extra processing after FRAPS has measured the frame time, and this supposedly helps alleviate the stutter. I'm not sure I believe this, but then I haven't done any tests with SLI. CF for sure follows the FRAPS measurements perfectly: if it shows up in the graph, I can see it in the game.
 