Micro-Stuttering And GPU Scaling In CrossFire And SLI

Final8ty

We've received many emails from readers asking about the phenomenon known as micro-stuttering and what it means for multi-GPU setups in CrossFire and SLI. After running plenty of benchmarks, we're ready to weigh in on what turns out to be a real issue.

Single GPU or multi-card setup? That's a question we face every time we start a System Builder Marathon series or evaluate the worth of a flagship GPU.

Do you want just one high-strung racehorse, or a pair, trio, or quartet of draft horses? Can a team of inexpensive cards perform the work of a pricier one and still come in at a lower cumulative cost?

Welcome to Groundhog Day. Due to recurring forum questions and direct requests by our readers, we decided that it's time to go beyond the usual performance-oriented benchmarks of CrossFire- and SLI-based systems, and shed some light on the underlying principles. Frame rates in and of themselves do not necessarily translate into a high-quality experience.

Yes, we're going to tackle the issue of micro-stuttering, which seems to keep so many sensitive eyes from investing in multiple cards running in concert. We'll also look at the scaling of two, three, and four GPUs. Where is the benefit? And at what point is actual added value really realized, or is churning out high (but ultimately useless) frame rates a self-defeating exercise? As we're sure you can imagine, at some point, the pursuit of performance can become a money pit and a power hog. At the same time, we've seen multi-card setups yield incredible value.

http://www.tomshardware.co.uk/radeon-geforce-stutter-crossfire,review-32256.html
 
Very interesting article. I had read this yesterday, and something that really stood out to me is their claim that when using two cards of different speeds, the faster one is NOT downclocked to the slower one's speed. This is news to me and contrary to everything I've heard from people. I would really love to hear from anyone who has used cards with different speeds. This has me totally re-thinking my CF'd 6870's (one's a BE and one is regular).
 
Why only test Call of Juarez? In fact, why test that game at all? Pretty fail article IMO. Their testing methodology, graphs, etc. all seem questionable.

Unrelated to the article (from the comments), but every time someone says "duel" gpu I cringe. It's dual damnit.
 
Their testing is a complete fail to begin with.

Like someone mentioned already, why use CoJ in the testing?

They should at least use some game that does not favor either side.
 
Interesting article. I really don't notice any micro-stutter with my 2x 4870x2 and my games seem just as silky-smooth as any single-GPU setup I've seen. Their results seem to confirm that there is very little micro-stutter using triple or quad GPU setups.

I had read this yesterday, and something that really stood out to me is their claim that when using two cards of different speeds, the faster one is NOT downclocked to the slower one's speed.

Clocks, etc. don't change with CrossFire. When using mismatched cards in CrossFire you may see a slight benefit from the one card being faster, but overall the setup will still be held back by the slower card; the faster one just ends up idle, with less work to do. A 6970 + 6950 will be faster than 2x 6950, but just barely.
 
I didn't really get much from that article. They did all of their testing in one game, without showing anything for the others except the average framerate, which tells us nothing. I've not noticed micro-stutter on my 6850 CrossFire setup. I would like to see graphs similar to their CoJ stuff for the other games that are more balanced performance-wise and more popular. As of late, everything I have seen on Tom's seems to be biased towards Nvidia (outside of their best-cards-for-the-money stuff) compared to the results from reviews here on [H]. I've almost totally quit reading Tom's and depend almost solely on [H] now. Their claims in this article just don't seem right compared to my personal CrossFire experiences either.
 
Indeed, testing just one game is not conclusive, but maybe they found that game to be the worst for the problem and used it to show a worst-case scenario.

Also, they should have done tri-SLI.
 
I ran 8800GTS 512 in SLI for a year and never had any micro stutter. I believe it's on a game-by-game basis.
 
I have run 5850's, 6870's, and unlocked 6950's in CrossFire and have never noticed an issue with micro-stutter. I have noticed occasional brief hitching, when frames drop low for a moment and recover, but generally turning on vsync tunes that out, and it was not an issue that presented itself all the time. Granted, I am sure the issue must exist with certain games, but I have not really noticed it.
 
They're saying that micro-stuttering happens when there are noticeable dips and peaks in the FPS, but in their test the game is already running well beyond the 100+ fps mark, and it doesn't go below that while in x-fire or SLI.

My question is, if Vsync is enabled so the game is played out at 60FPS at all times (I always turn it on because I can't stand screen tearing without it), will you still be able to notice the micro stuttering effect?

Thanks for sharing the article, OP; it really helped me understand a few things better.
 
Looks like Xfire did pretty badly. I've always been a fan of one powerful GPU over two, but to each their own. I wonder if this is something that can actually be fixed or if it's just a side effect of dual GPUs.

Then again, if you don't notice it, is it really there?
 
I've never seen this "micro-stuttering" phenomenon that everyone keeps talking about...
 
I had to add an edit to explain :); there are TWIMTBP games that are pro-AMD. This one was originally TWIMTBP, but due to the pre-release benchmark issues it got out*, and on the actual release it is very pro-Nvidia.

*Out of the Nvidia program "officially"; it still ended up being very favorable to their architecture.

Still, News at 11.
 
I've never seen this "micro-stuttering" phenomenon that everyone keeps talking about...

Once you know what you are looking for, it's hard to miss. It's not a stutter, and that is what I think confuses people. The best explanation that I can think of is from when I picked up a cheap 4870X2. Coming from a GTX 280, in Crysis, despite the higher framerate on the X2, the game just didn't feel any smoother.

I had experienced similar issues with perceived performance on my 6950 CrossFire setup. The difference there was that when it did happen (and it's not all of the time), there was not a single GPU on the market that would have run the game at those settings comfortably.

I will also add this: just because you don't notice it does not mean that it isn't there. It seems like most people don't notice the Gamebryo stutter in Oblivion, Fallout 3, and New Vegas, even though everyone with a 60 Hz display has the stutter in those games.

I had to add an edit to explain :); there are TWIMTBP games that are pro-AMD. This one was originally TWIMTBP, but due to the pre-release benchmark issues it got out*, and on the actual release it is very pro-Nvidia.

*Out of the Nvidia program "officially"; it still ended up being very favorable to their architecture.

Still, News at 11.

This isn't an AMD vs Nvidia article so stop trying to turn it into one.
 
Sighs, RTFA Baba.

They did a CrossFire vs. SLI article and praised SLI, and if you notice, the game and resolution they chose to show us is one in which a 560 Ti > 6950. So it's no surprise that SLI shows more stable performance in the chosen game, and that is the whole basis they use for how much or how little micro-stutter there is with each solution.
 
Glanced through the article. As someone who has used both ATI + NVIDIA dual GPU solutions (5870 X2, now 580 SLI) I didn't notice microstuttering on either setup. My 5870 X2 chugged a bit in Bad Company 2 at 5870x1200 unless I turned the settings down, but not sure if that was a micro-stutter issue or just a bad fps issue (I never bothered to benchmark the frame rate). And that was the only experience I have of it. 580 SLI has no issues. Hell, I may go TriFire or Tri-SLI next time I upgrade.
 
Very interesting article. I had read this yesterday, and something that really stood out to me is their claim that when using two cards of different speeds, the faster one is NOT downclocked to the slower one's speed. This is news to me and contrary to everything I've heard from people. I would really love to hear from anyone who has used cards with different speeds. This has me totally re-thinking my CF'd 6870's (one's a BE and one is regular).

I've never mixed different types of cards so I don't know first-hand, but I've always heard that the faster card downclocks to the slower card's speed as well. If that isn't happening, then the faster card completes its frame sooner, sends it to the buffer, and then sits idle while the slower card finishes its frame. I could see how this would be better than two equally powered cards, because it's 50% less time the system may spend waiting to send the frames out. Say you have a 6970 and a 6950 in CrossFire: the 6950 has frame 1 and the 6970 has frame 2. Frame 2 obviously completes first and is sent to the buffer to wait for frame 1 to complete on the 6950, which allows frames 1 and 2 to go out immediately once frame 1 completes, instead of two 6970s completing their frames at nearly the same time and then having to be organized into the buffer in the proper sequence (or taking turns accessing the buffer). Probably a rough, hacky kind of example, but I think it gets the point across...
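
That buffering/idle-time idea is easy to poke at with a toy model. Here is a quick sketch; the 14 ms and 16 ms frame times and the afr_present_times helper are made-up illustrations, not measurements from these cards:

```python
# Toy model of alternate-frame rendering (AFR) with two mismatched GPUs.
# The per-GPU render times are invented numbers for illustration only.

def afr_present_times(render_ms, n_frames):
    """Timestamps (ms) at which frames reach the buffer, alternating GPUs.

    Each GPU starts its next frame as soon as its previous one finishes,
    and a frame cannot be shown before the frame that precedes it.
    """
    gpu_free = [0.0] * len(render_ms)  # when each GPU finishes its last frame
    present, last = [], 0.0
    for i in range(n_frames):
        gpu = i % len(render_ms)         # frames alternate between the GPUs
        gpu_free[gpu] += render_ms[gpu]  # this GPU's next frame completes here
        last = max(last, gpu_free[gpu])  # keep presentation order intact
        present.append(last)
    return present

# A fast card (14 ms per frame) paired with a slower card (16 ms per frame):
times = afr_present_times([14.0, 16.0], 10)
gaps = [round(b - a, 1) for a, b in zip(times, times[1:])]
print(gaps)  # [2.0, 12.0, 4.0, 10.0, ...]: short/long frame-to-frame gaps
```

Those alternating short/long gaps are exactly the frame-time oscillation people in this thread call micro-stutter, even though the average framerate looks perfectly healthy.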
 
Very interesting article. I had read this yesterday, and something that really stood out to me is their claim that when using two cards of different speeds, the faster one is NOT downclocked to the slower one's speed. This is news to me and contrary to everything I've heard from people. I would really love to hear from anyone who has used cards with different speeds. This has me totally re-thinking my CF'd 6870's (one's a BE and one is regular).

When I ran a 6970 + unlocked 6950, the 6970 would be running at 93% usage while the 6950 was at 99%.
 
Source: the comments on the review page

alangeering 24/08/2011 12:19

I'd like to first thank Igor and Greg for a very insightful article and for discussing the not often talked about phenomenon of stuttering.

There's one thing I'd like to expand upon.

A few times in the article the observation is made that while dual-GPU scaling is good, the stuttering effect is bad. No real point is made that when scaling is poor, stuttering is less pronounced.

It's precisely because three cards aren't as efficient that stuttering is reduced.

Bear with me and I'll explain.


For the following thought experiment I've used the data from the Call of Juarez graph on the page called "Step 2: Crossfire with three GPUs".

Three situations:
A: 1 card @ 70 fps average
B: 2 cards @ 135 fps average
C: 3 cards @ 160 fps average

In other words:
A: The card takes an average of 14.3 ms (1000/70) to produce each frame.
B: Each card has 14.8 ms (2×1000/135) to produce its frame to maintain the average.
C: Each card has 18.8 ms (3×1000/160) to produce its frame to maintain the average.

Look again at the data from Call of Juarez. The lowest frame rate recorded for the single card is 60 fps, or 16.7 ms per frame.

That 16.7 ms overshoots the 14.8 ms budget needed to avoid delaying/stuttering the pipeline in situation B, but it is well within the 18.8 ms time frame for the three-card setup in situation C.

As frames are now arriving in time for use, the evidence of stuttering is reduced.

So efficiency is good; but inefficiency in scaling allows each card a little longer to provide its frame, and the eventual combined frame rate is less variable.

A quote from the article: "This phenomenon manifests itself even more seriously in CoJ. While CrossFire scales well under load, it becomes even more susceptible to micro-stuttering."

And another: "For some reason, the third GPU almost always eliminates micro stuttering and has a less-pronounced effect on performance."

You got so close; it just needed another jump of statistical thinking. Efficiency correlates with stuttering (NVIDIA and AMD) and there is a logical reason why.
alangeering 24/08/2011 12:37

The above post isn't trying to explain why micro-stuttering occurs, only why it's more pronounced as multi-GPU scaling efficiency increases (and less so as it decreases).

And on top of that, the scaling on the ATI 5xxx series is less efficient than on the AMD 6xxx series.
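
alangeering's frame-budget arithmetic is easy to check yourself: under alternate-frame rendering, the per-GPU budget is just n_gpus / fps. A quick sketch using the fps figures from his post (the helper function name is just for illustration):

```python
# Per-GPU frame budget under alternate-frame rendering: with n GPUs taking
# turns on frames, each GPU only has to finish one frame every n/fps seconds.

def per_gpu_budget_ms(n_gpus, avg_fps):
    return 1000.0 * n_gpus / avg_fps

for n, fps in [(1, 70), (2, 135), (3, 160)]:
    print(f"{n} GPU(s) @ {fps} fps: {per_gpu_budget_ms(n, fps):.1f} ms per frame per GPU")

# Output: 14.3 ms, 14.8 ms, and 18.8 ms. The article's 16.7 ms worst-case
# frame misses the two-card budget but fits inside the three-card one.
```

That's the whole statistical point: the three-card setup's extra slack absorbs slow frames that would visibly stall a two-card pipeline.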
 
I have used a 9800 GX2 (dual-chip card) and CrossFire 6950's for long periods of time.

If you cannot see micro-stuttering, then you most likely either don't know what to look for or are just ignorant. I can't stand micro-stuttering. Single-card performance just looks so much better IMO.

I will never use SLI or CrossFire again (at least until a breakthrough in driver technology makes this a non-issue).
 
I've noticed that in TriFire I can see micro-stutter any time I'm at 60 fps or under. Vsync also causes it to happen for me, since it limits the game to 60 fps. If I turn off vsync and the framerate bounces around above 60, I don't notice it.
 
Whatever. Dual GPU can cause micro-stuttering, and all of you who claim it doesn't are FOS.
 
Yeah, if you can't see microstuttering then lucky you, but I hate when people act like it doesn't exist. It's like claiming your LCD has no input lag...
 
Some LCDs have very, very small amounts of input lag that won't affect your gaming at all, though.

LCDs like the ASUS models are used in international fighting-game tournaments (e.g. SSFIV), and those games require frame-perfect timing.
 
No LCD has input lag so low that it can't affect your gaming. It all just depends on whether you have the sensitivity to notice it or not. All major tournaments must use LCDs now, and it has nothing to do with performance. The fact is, it is just way too impractical to buy CRTs and use them for competitive gaming anymore, since so few are made and the shipping costs are much higher.
 
I'm not arguing that micro-stutter doesn't exist. I'm saying that I feel the article was very poorly done, using only one game that isn't really an even playing field to begin with. Also, I think the effect of micro-stutter is exaggerated by most. Some people are sensitive to it, I understand that, but in general, most people won't notice it unless the game is already getting close to the playability threshold framerate-wise anyway.
 
I'm not arguing that micro-stutter doesn't exist. I'm saying that I feel the article was very poorly done, using only one game that isn't really an even playing field to begin with. Also, I think the effect of micro-stutter is exaggerated by most. Some people are sensitive to it, I understand that, but in general, most people won't notice it unless the game is already getting close to the playability threshold framerate-wise anyway.

It doesn't have anything to do with framerate. When I had my 4870X2 I was getting nearly 300 fps steady in Counter-Strike: Source, but I always had to disable one GPU to get rid of the micro-stutter problems.
 
It doesn't have anything to do with framerate. When I had my 4870X2 I was getting nearly 300 fps steady in Counter-Strike: Source, but I always had to disable one GPU to get rid of the micro-stutter problems.

I understand that. I wasn't saying that it had anything to do with framerate, just that as far as I can tell, most people don't tend to NOTICE it until they are getting into lower framerates. I personally rarely notice any kind of micro-stutter at all.
 
I understand that. I wasn't saying that it had anything to do with framerate, just that as far as I can tell, most people don't tend to NOTICE it until they are getting into lower framerates. I personally rarely notice any kind of micro-stutter at all.

Yeah, but what they are noticing at that stage is just stutter; it's not the micro-stutter that's being talked about here. And it was your statement that people who notice it are exaggerating the problem that annoyed me. I mean, how can you comment when you don't notice it at all?

It's just like phosphor trails on plasma TVs: some people can see them, others can't see them at all. They do exist, and to the people who see them, plasma TVs are really annoying to watch. If we were to follow your logic, then plasma manufacturers would never work on fixing the problem, because some people don't notice it and the ones that do are exaggerating it.

EDIT: annoyed me was a bit of a strong term, it didn't really annoy me!!
 
Three cards probably won't be the end-all solution; most likely it helped in this case because it shifted the bottleneck to the CPU, and the FPS was high enough in any case. The stutter is caused by uneven frame rendering, which is a combination of CPU and GPU processing, and those depend on various different things, so the combinations are almost limitless. This means some rigs stutter less, some more.

Limiting the framerate below the average with vsync, a console command, or a 3rd-party tool will alleviate the stutter, though all of these have some problems. Vsync usually varies depending on the fps you would otherwise get, so if your fps dips under 60 it'll drop to 30, which isn't really a good solution if you'd otherwise get 59 fps. Limiting the fps in other ways still means the FPS isn't as high as it could be at times, and if it drops lower than the limit you'll end up with stutter regardless, just when you need the smoothness the most.
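
To make the "limit the fps in other ways" option concrete, an fps cap is basically just a render loop that sleeps off the unused part of each frame's budget. A minimal sketch, with the 59 fps cap and the sleep-based pacing as illustrative assumptions rather than how any particular limiter tool actually works:

```python
import time

# Minimal fixed-fps limiter sketch. Unlike vsync it never falls back to
# 30 fps: a fast frame waits out its budget, a slow frame just shows late
# (which is where the residual stutter below the cap comes from).

CAP_FPS = 59                   # illustrative cap, just under a 60 Hz refresh
FRAME_BUDGET = 1.0 / CAP_FPS   # seconds allotted to each frame

def render_frame():
    time.sleep(0.005)          # stand-in for real render work (~5 ms)

deadline = time.perf_counter()
for _ in range(120):           # simulate about two seconds of frames
    render_frame()
    deadline += FRAME_BUDGET
    slack = deadline - time.perf_counter()
    if slack > 0:
        time.sleep(slack)                   # fast frame: burn off the budget
    else:
        deadline = time.perf_counter()      # slow frame: present late, resync
```

Note the last branch: once a frame takes longer than the budget, no cap can save it, which matches the point above that the stutter comes back just when you need smoothness the most.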

I've done quite a bit of testing on this, since I notice the stutter quite easily in some of the games I've played. Crysis, for example, was more playable with a single 6950 @ 5760x1080 than in CrossFire. The fps with a single card is around 20 and smooth (still too slow), while CrossFire is 30+ fps and completely unplayable.
 