Microstutter in latest-generation cards (GTX480 examples inside)

This simply isn't true. Games can't get away with fluid motion at 30FPS under most circumstances (it's contrast dependent; only extremely low-contrast motion will look acceptable at 30FPS).

Movies can do it at 24FPS because film captures motion blur. Each frame is exposed for a large fraction of its 1/24th of a second, so countless instants of motion over that span are effectively composited together into a single frame. This creates a higher effective framerate, smooths the transitions between frames, and creates fluid motion.

Games do not have this luxury. To get the same type of motion blur in a game, you would have to render thousands of frames every 1/24th of a second, then composite all those into a single frame and send it to the monitor. The game would have to run at thousands of frames per second in order to make 24FPS or 30FPS look as smooth as a film.
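In code terms, that compositing is just an average of many sub-frames spread across the exposure window. A minimal sketch of the idea (render_scene is a hypothetical callback standing in for the renderer, not any real engine API):

```python
import numpy as np

def render_with_motion_blur(render_scene, t_start, exposure_s, samples):
    """Approximate film-style motion blur by averaging many sub-frames.

    render_scene(t) is a hypothetical callback returning the scene at time t
    as an HxWx3 float array; exposure_s is the shutter-open time in seconds
    (1/24 s in the film example); samples is how many sub-frames to composite.
    """
    accum = None
    for i in range(samples):
        t = t_start + exposure_s * i / samples      # spread samples across the exposure
        frame = np.asarray(render_scene(t), dtype=np.float64)
        accum = frame if accum is None else accum + frame
    return accum / samples                          # one blurred frame for the display
```

Rendering enough sub-frames for this to look right is exactly the "thousands of frames per 1/24th of a second" problem described above.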

What games do to avoid this huge problem is take advantage of Persistence of Vision. They run at a high enough framerate, while presenting every single frame, that the human visual system merges the frames together and creates fluid motion. 60FPS is about the minimum for this effect to become solidly fluid under most circumstances. The larger the change in contrast between two frames, the higher the framerate must be to maintain fluid motion using this method (yes, those 120Hz monitors will make motion smoother under certain circumstances if your framerate is high enough to take advantage of them).

Game console hardware is too slow to reliably maintain 60FPS, so those games are locked at 30FPS in order to prevent stuttering due to framerate deceleration. They also tend to enable V-Sync to eliminate image tearing, which means if the game can't maintain 60FPS, it's forced directly to 30FPS. Triple buffering would allow some intermediate framerates, but consoles don't have the video RAM to spare for that, and it would re-introduce the framerate deceleration problem.

Without that 30FPS cap, the framerate would fluctuate up and down depending upon what's being rendered. As the framerate is forced to decelerate from 60FPS to 30FPS, there's a sudden jolt as the time between frames changes from roughly 16.7ms to 33.3ms. This jolt is apparent to players, so they cap the framerate low enough that such fluctuations are avoided. Unfortunately, this also means motion isn't as smooth due to the low framerate.
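A rough model of why that jump is so abrupt: with double-buffered vsync on an assumed 60Hz display, the on-screen frame interval can only be a whole number of refresh periods, so a frame that takes even slightly longer than one refresh is held for two.

```python
import math

REFRESH_INTERVAL_MS = 1000.0 / 60          # ~16.7 ms per refresh on a 60Hz display

def displayed_interval_ms(render_time_ms):
    """With double-buffered vsync, a finished frame waits for the next refresh,
    so the visible frame interval snaps to a multiple of the refresh period."""
    refreshes = max(1, math.ceil(render_time_ms / REFRESH_INTERVAL_MS))
    return refreshes * REFRESH_INTERVAL_MS

print(displayed_interval_ms(15.0))   # ~16.7 ms -> 60FPS
print(displayed_interval_ms(17.0))   # ~33.3 ms -> 30FPS: the jolt
```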

If console hardware were capable of outputting 60 FPS consistently in the titles you mention, believe me, they wouldn't be capping it. It all comes down to hardware limitations.

well it's sort of pointless getting very technical about how liquid 30 fps is. my point is that the frame rate provides a smooth gaming experience. of course 60 fps is a faster, more precise frame rate, so it's always a treat to be running at a locked 60 fps. but in no way should 30 fps be the stuttery choppy mess that multi-gpu solutions exhibit so often.
 
In my experience, multiGPU setups only exhibit any kind of "stutter" if they are taxed to the max, and then some.


Sort of like running Crysis, maxed at 1080p, on SLI 9600gt cards. They cannot keep up, and frames are no longer in sync with each other in terms of how long each one took to render.

Just my take on it.
 
well it's sort of pointless getting very technical about how liquid 30 fps is.
This entire thread is all about being technical...

Unless the difference between frames is very low contrast, 30FPS isn't enough for fluid motion (unless it has true motion blur, which isn't possible in games).

but in no way should 30 fps be the stuttery choppy mess that multi-gpu solutions exhibit so often.
An effective framerate (after compensating for microstutter) of 30FPS from a multi-GPU setup looks identical to 30FPS from a single-GPU. (They're both a stuttery mess for high-contrast motion).

If you failed to correct for microstutter, and instead were using the inaccurate framerate counter reading of 30FPS, then you need to go back and recheck your numbers. The framerate counter doesn't give a correct reading on multi-GPU systems, so if you attempt to cap the framerate based on this value you'll cap the multi-GPU system at the wrong framerate. You're forcing it to run slower than the single-GPU system because the framerate counter doesn't know it's reading incorrectly.
 
yes there is no frame rate variance with vsync on and fps of 60. but is this really fair? why should multi-gpu solutions require a frame rate of 60 when a single gpu can happily run at 30 fps to give a smooth gameplay experience? if this is not the definition of being ripped off, then i don't know what is.

Huh?:confused: Require 60 FPS? That's the whole point of a multi-GPU solution, to give better frame rates. So when a multi-GPU delivers the very thing it's supposed to and doesn't exhibit microstutter that's a rip off? This makes absolutely no sense.

You keep harping on this idea that 30 FPS is the Holy Grail of gaming and this simply isn't the case for all games and all gamers. It's all a matter of situation and perception just like microstutter.
 
Huh?:confused: Require 60 FPS? That's the whole point of a multi-GPU solution, to give better frame rates. So when a multi-GPU delivers the very thing it's supposed to and doesn't exhibit microstutter that's a rip off? This makes absolutely no sense.

You keep harping on this idea that 30 FPS is the Holy Grail of gaming and this simply isn't the case for all games and all gamers. It's all a matter of situation and perception just like microstutter.

you must have missed this quote:

but where it really counts is when you are playing the latest game and your single gpu cannot max it out. you are hurting at 18 frames per second. you decide to add another gpu in the mix and now you are at 30 frames per second but the gameplay is not smoother than the single card setup at 18 fps.

i would trade 60 fps for 30 fps for much better graphical fidelity and so would you.
 
This entire thread is all about being technical...

Unless the difference between frames is very low contrast, 30FPS isn't enough for fluid motion (unless it has true motion blur, which isn't possible in games).


An effective framerate (after compensating for microstutter) of 30FPS from a multi-GPU setup looks identical to 30FPS from a single-GPU. (They're both a stuttery mess for high-contrast motion).

If you failed to correct for microstutter, and instead were using the inaccurate framerate counter reading of 30FPS, then you need to go back and recheck your numbers. The framerate counter doesn't give a correct reading on multi-GPU systems, so if you attempt to cap the framerate based on this value you'll cap the multi-GPU system at the wrong framerate. You're forcing it to run slower than the single-GPU system because the framerate counter doesn't know it's reading incorrectly.

completely losing you here. what does contrast of the image have to do with frame rate? then you are saying that FRAPS is not an accurate frame counter or that there is no way to measure frame rates in multi-gpu solutions......

i will give you a quick example of when i first discovered micro stutter. it was in 2008 when i had my first SLI system, 8800gtx sli. i loaded up crysis with sli enabled. i was getting 35-40 fps in a particular spot and it did not feel like these frame rates. really pissed me off. i decided to disable sli and what do you know? the single gpu ran the game as smooth or even a tad smoother than sli. the only thing sli provided over the single gpu was a smoother mouse movement while in-game, for example, aiming the gun sights and looking around, etc. the contrast of the image did not change. the frame counter was accurate in both gpu modes and even using FRAPS and the built-in frame counter within crysis console.
 
what does contrast of the image have to do with frame rate?
It has everything to do with framerate...if you want fluid motion, anyway.

The sensitivity of human vision is contrast based. The higher the contrast of motion, the higher the framerate needs to be in order to maintain the appearance of fluid motion.

"Contrast" in this case means the difference between one frame and the next, not the dynamic range of the image.

then you are saying that FRAPS is not an accurate frame counter or that there is no way to measure frame rates in multi-gpu solutions......
I said the in-game framerate counter will not report the correct framerate on a multi-GPU system, and that in-game framerate caps will not work correctly either. A cap of 30FPS will cap a single-GPU system at 30FPS, but will cap a multi-GPU system at 18 to 27 FPS (depending on how bad the microstutter is). You'll need to set the cap higher in order to cap the multi-GPU system at the correct framerate.
 
This thread made me depressed. I game on 2560x1600, and I have a single 480, which I was planning on SLI-ing later.

Should I?
 
This thread made me depressed. I game on 2560x1600, and I have a single 480, which I was planning on SLI-ing later.

Should I?

Microstutter isn't something you should worry about. You're going to be better off with two GPUs than one, especially at 2560. All microstutter does is occasionally reduce the benefit you're getting from the second GPU. The big myth about microstutter is that you can "see" it. This is perpetuated by people who have other kinds of stuttering that are anything but micro.

Theoretical numbers here from theoretical frame rate counter in some game:

Single GPU says you're getting 40 fps. It "feels" like 40fps.
Multi-GPU says you're getting 80fps. It "feels" like 60fps.

Because the "stuttering" is on a frame-to-frame basis and is relatively consistent, there's no visible hanging or lagging. You're simply getting a high frame rate that feels like a lower frame rate.

It's still way better than trying to game at 2560 on a single GPU though. You lose nothing, you just gain less in certain micro-stutter prone games.
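To put illustrative numbers on that (the frame intervals below are assumed, picked to match the 80-reported/60-felt example):

```python
# Assumed AFR frame intervals in ms: the two GPUs deliver frames in a
# short/long pattern instead of evenly spaced ones.
intervals_ms = [8.3, 16.7] * 10                                 # 20 frames

counter_fps = 1000.0 * len(intervals_ms) / sum(intervals_ms)    # ~80 FPS reported
perceived_fps = 1000.0 / max(intervals_ms)                      # ~60 FPS, paced by the long gaps

print(round(counter_fps), round(perceived_fps))                 # 80 60
```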
 
This thread made me depressed. I game on 2560x1600, and I have a single 480, which I was planning on SLI-ing later.

Should I?

Yes, you should :)

As NKDietrich pointed out, adding a second GPU still improves real-world performance in almost all circumstances.

First, consider GPU-limited circumstances (which will be most of the time in demanding games at 2560 res). Adding a second GPU will generally net you a 70 to 90% increase in framerate. Microstutter will generally "eat away" 10 to 30% of performance. So, in practice you're still getting a fairly large bump in real-world performance.
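Purely illustrative arithmetic with numbers in those ranges:

```python
single_gpu_fps = 40            # assumed single-GPU baseline in a demanding scene

scaling = 0.80                 # assume ~80% raw scaling from the second GPU
microstutter_loss = 0.20       # assume ~20% of that "eaten" by uneven frame pacing

reported_fps = single_gpu_fps * (1 + scaling)            # 72 FPS on the counter
effective_fps = reported_fps * (1 - microstutter_loss)   # ~58 FPS as it actually feels

print(reported_fps, round(effective_fps))                # 72.0 58
```

Even in the pessimistic case, the effective framerate is still well above the single-GPU baseline.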

In CPU-limited scenes (and there will be more of these once you add a second GPU), microstutter all but disappears as the GPU output syncs to that of the CPU, which is regular.

Also, if you enable vsync, this will remove microstutter in all but the 'edge' cases (that is, where you are very close to the boundary between one vsync frame-rate step and the next). As each GPU must wait until the framebuffer is free before rendering the next frame, the "one GPU's output catching up with the other" effect, which leads to microstutter in non-vsync AFR mode, will disappear. Each GPU will begin its frame rendering workload at a time dictated by the monitor's refresh, and so frame delivery will become regular.


In short: Go ahead and add a second GPU. So long as you are aware that the real-world scaling you see from adding a second GPU won't always be the same as you would expect from benchmarks, you'll be just fine.
 
Micro-fluctuations. Stutter is missing frames or dropping to zero fps.

This whole calc is erroneous.

Um... :confused:

The name is not representative, I agree, and micro-fluctuations is a better way of describing the effect. I said in my original post that microstutter was a crappy name for the effect, as it is not really stuttering.

But micro-fluctuation is precisely what I am measuring (relative deviation from a localised average frametime), and is exactly what will result in an apparent reduction of game smoothness, for a given framerate.
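For what it's worth, here is a minimal sketch of that kind of metric; the window size and the exact formula are my assumptions, and the actual tool may compute it differently:

```python
import statistics

def microstutter_index(frametimes_ms, window=16):
    """Mean relative deviation of each frame time from a localised average, in percent.

    frametimes_ms is a list of per-frame intervals in ms (e.g. from a FRAPS
    frametimes log); window is an assumed neighbourhood size for the average.
    """
    deviations = []
    for i, ft in enumerate(frametimes_ms):
        lo = max(0, i - window // 2)
        hi = min(len(frametimes_ms), i + window // 2)
        local_avg = statistics.mean(frametimes_ms[lo:hi])
        deviations.append(abs(ft - local_avg) / local_avg)
    return 100.0 * statistics.mean(deviations)

print(round(microstutter_index([16.7] * 60)))        # ~0: perfectly even pacing
print(round(microstutter_index([8.3, 25.0] * 30)))   # ~50: strong short/long alternation
```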
 
In my experience, multiGPU setups only exhibit any kind of "stutter" if they are taxed to the max, and then some.


Sort of like running Crysis, maxed at 1080p, on SLI 9600gt cards. They cannot keep up, and frames are no longer in sync with each other in terms of how long each one took to render.

Just my take on it.

I agree entirely. I get annoyed as hell by that stutter, so I tweak my settings so it won't happen. Like in COD4 at 5760x1200, I set max frames to 90, and my game is butter smooth, because I have gpu overhead to survive those taxing scenes. The only time I'll experience any stutter is when all my gpu's are maxed out under 60 fps, which, even with 480 tri-SLI, will happen. All the smoke and grass and particulates can bring any system to its knees. The trick is to know this and set each game up accordingly, sometimes turning off settings like HBAO in BC2.
 
to prove that micro stutter can only be eliminated by achieving frame rates that match or exceed your monitor's refresh rate (most likely 60 hertz), you will need 60fps or greater and lock that in with vsync.

this video i did shows a 30fps run (that plays like 15 fps due to micro stutter) and then i intentionally lower the graphic fidelity to raise the fps to 60 or greater. since vsync is locked in, the frame intervals are matched up and the micro stutters disappear.

http://www.youtube.com/watch?v=S9-R3nZ9SlI

just remember my video was shot at 30 fps on an HD camcorder, so you can't even see the 60 fps that is shown, but you sure can visually identify how the stutters disappeared.

is this fair, to demand a multi-gpu solution to achieve such high frame rates in today's games and the future of tomorrow's games? i don't think so, especially when a frame rate of 30 should provide a solid gameplay experience, as i already described.
 
i don't think so, especially when a frame rate of 30 should provide a solid gameplay experience, as i already described.

Once again, this notion that 30 FPS is enough might be true for you but not for EVERYONE. 30 FPS is okay in some situations, in others it's not.

This whole subject matter is entirely subjective. If single monitor 30 FPS gaming works for you, awesome. All I know is that my SLI rigs have gotten better and better over the 5 years I've been using SLI. Does microstutter exist? Yes, and people observe it with single GPU setups as well.

Multi-GPU isn't perfect but for me the benefits FAR exceed the problems. That's what most multi-GPU users have concluded, and I'll be buying three of the 480's successor.

Only by seeing it IN PERSON, PLAYING A REAL GAME, can one conclude how much of an issue microstutter will be for them. You can't see it in videos and charts.
 
Would anyone here vehemently OPPOSE me in wanting to get a second 480 for 2560x1600?
 
Would anyone here vehemently OPPOSE me in wanting to get a second 480 for 2560x1600?

I would vehemently SUPPORT you!:D Really, no amount of talk or charts can say if you'll like SLI, but look at the repeat buyers and the money people spend on these rigs: my sig rig set me back $5000, and this is my 2nd 3x SLI rig. Digitalcaveman just put together an even more expensive setup with 3 30" monitors; no single card can drive that anyway, microstutter notwithstanding.

The cost of these rigs and the fact that people buy this stuff over and over tells me more than this thread ever could.
 
I want to get it, but I'm on an older chipset (mobo is p5b deluxe) and I don't know if it'll do SLI well with the new 480 cards.

How much do used/secondhand 480s go for now? I think the cheapest I've seen a few months ago were $375.

Also, I have my 480 overclocked (825 core); should I delete the profiles before installing the second 480 so it won't load the 825mhz profile (and potentially damage the second card)?
 
If the frames per second generated by the card don't match a division of the monitor's native refresh rate then you are going to get uneven motion on screen. If you're not synching to the monitor AND you're getting variations in the framerate then it's going to be even worse than being off by a constant amount.
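To make "a division of the native refresh rate" concrete, these are the only rates that space every frame evenly on an assumed 60Hz panel:

```python
REFRESH_HZ = 60

# Rates that divide 60 Hz evenly, i.e. every frame is shown for the same
# whole number of refreshes. Anything in between means some frames stay on
# screen longer than others -> uneven motion.
even_rates = [REFRESH_HZ / n for n in range(1, 7)]
print(even_rates)    # [60.0, 30.0, 20.0, 15.0, 12.0, 10.0]
```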

I agree that "microstutter" exists, but I think you're making a mountain out of a mole hill - any person who's capable of understanding the issue should also be capable of dealing with it. Sadly you can't always expect a better experience from adding a second card, and you showed that again.

Have you ever tried forcing a refresh rate of 30Hz and enabling vsync? That might be a good solution for you when your system isn't capable of maintaining 60FPS.

Also, not all reviews are bought and paid for... I see things like this mentioned quite often [H]ere
We are also experiencing much smoother performance with more consistent framerates, the gameplay isn’t as choppy as it was before.
http://www.hardocp.com/article/2010/09/16/nv_gtx_460_1gb_sli_vs_ati_hd_5870_cfx_redux
 
You should be fine. You'll probably want to retest both cards at that speed, just as I assume you did for the single card, and redo your profiles.
 
If the frames per second generated by the card don't match a division of the monitor's native refresh rate then you are going to get uneven motion on screen. If you're not synching to the monitor AND you're getting variations in the framerate then it's going to be even worse than being off by a constant amount.

I agree that "microstutter" exists, but I think you're making a mountain out of a mole hill - any person who's capable of understanding the issue should also be capable of dealing with it. Sadly you can't always expect a better experience from adding a second card, and you showed that again.

Have you ever tried forcing a refresh rate of 30Hz and enabling vsync? That might be a good solution for you when your system isn't capable of maintaining 60FPS.

Also, not all reviews are bought and paid for... I see things like this mentioned quite often [H]ere

http://www.hardocp.com/article/2010/09/16/nv_gtx_460_1gb_sli_vs_ati_hd_5870_cfx_redux

i tried but it would never work. the driver said the monitor does not support it. it was one of the "ideas" i had that seemed like it would be a "fix", because the frames would be vsynced and locked at that rate. but looking back at this i don't think any monitor supports running at 30 hertz.

and yeah i agree that all these review sites are bought out and simply scared to confront the micro stutter issue because they fear they will upset the gpu makers and no longer be sent free cards to review. look at this review guy on youtube. he is replying to a question about micro stutter. he first acknowledges that it is a problem but "he" does not have micro stutter on "any" games that he plays, LOL! so misleading. i wonder what games HE plays.

http://www.youtube.com/watch?v=jT1KDoX7i_Y
 
No microstutter so far here whether SLI or Single. Went through about nine GTX 480's, EVGA, MSI, PNY, Galaxy, Gigabyte, Asus. All reference.

Had a GTX 295 in the past, and the microstutter was terrible on it.
 
i tried but it would never work. the driver said the monitor does not support it. it was one of the "ideas" i had that seemed like it would be a "fix", because the frames would be vsynced and locked at that rate. but looking back at this i don't think any monitor supports running at 30 hertz.

and yeah i agree that all these review sites are bought out and simply scared to confront the micro stutter issue because they fear they will upset the gpu makers and no longer be sent free cards to review. look at this review guy on youtube. he is replying to a question about micro stutter. he first acknowledges that it is a problem but "he" does not have micro stutter on "any" games that he plays, LOL! so misleading. i wonder what games HE plays.

http://www.youtube.com/watch?v=jT1KDoX7i_Y



If you can connect with HDMI, try and check if it offers you the option of 1080i, for example. sure, 1080 could be a smaller res than you intend, but the interlaced version runs at 30hz (and that would be the solution for my setup if i wanted it, where i can't choose 60hz in a "normal" way but i can put it interlaced and vsync it over HDMI)
 
If you can connect with HDMI, try and check if it offers you the option of 1080i, for example. sure, 1080 could be a smaller res than you intend, but the interlaced version runs at 30hz (and that would be the solution for my setup if i wanted it, where i can't choose 60hz in a "normal" way but i can put it interlaced and vsync it over HDMI)

i no longer have a multi-gpu solution or the triple monitor setup. i went to a single 1080p monitor and single gtx480.
 
Would anyone here vehemently OPPOSE me in wanting to get a second 480 for 2560x1600?

If you can maintain 60 frames with Vsync on then it's smooth as butter. But drop below 50 fps and it's a stutter fest. I disabled my sli for 2560X1600 and couldn't be happier.
 
That is my experience. With EF and CF if I had less than 60 or so FPS (on all games I was playing) the stuttery-ness was awful. So bad I swapped my 5970 for a 5870, the average FPS went down but the experience was much better.

Would anyone here vehemently OPPOSE me in wanting to get a second 480 for 2560x1600?

Nope, but (if at all possible) try before you buy. My 5870 was my mate's; I sold my 5970 and he kept his other 5870. We are now both happy with a single card. Just note though, he later picked up an OC'd 5970 4GB and that was great, cos it could put out high enough frames that micro stutter wasn't noticeable... pity it died on him! So yeah, some people don't notice it, but once you do notice it you have to get rid of it, cos it is really, really enjoyment killing.
 
and yeah i agree that all these review sites are bought out and simply scared to confront the micro stutter issue because they fear they will upset the gpu makers and no longer be sent free cards to review.

:rolleyes:


Micro Stutter is generally not an issue unless you are trying to play games that your single card has trouble running. i can *cause* MicroStutter or make it completely unnoticeable - depending on the setting and resolution chosen.

Multi-GPU is not perfect, but it allows the user to select a higher resolution and especially higher detail levels and more filtering than with a single GPU. i have run CF since 2900XT days and MS is a non-issue for most of us - especially those of us who know how to set up our PCs
:cool:
 
:rolleyes:


Micro Stutter is generally not an issue unless you are trying to play games that your single card has trouble running. i can *cause* MicroStutter or make it completely unnoticeable - depending on the setting and resolution chosen.

Multi-GPU is not perfect, but it allows the user to select a higher resolution and especially higher detail levels and more filtering than with a single GPU. i have run CF since 2900XT days and MS is a non-issue for most of us - especially those of us who know how to set up our PCs
:cool:

your post is full of contradictions.

first you state that MS is not an issue unless you play at settings the single gpu can't handle. then you state multi-gpu allows you to play with higher settings.......
 
your post is full of contradictions.

first you state that MS is not an issue unless you play at settings the single gpu can't handle. then you state multi-gpu allows you to play with higher settings.......

Perhaps they are apparent contradictions to you. Let me give you an example.

If you are trying to play Crysis on 'Very High' with a single HD 5870 you will have a pretty bad experience at 2560x1600. Simply throwing HD 5870 CF at the same settings will give you a stuttery mess

Now if you play Crysis on 'very high' with a single HD 5870 at 1920x1200, you will have a fair experience. Adding a second HD 5870 in CF will turn this experience into a very playable one without noticeable MS and you will be able to add more AA
;)
 
Perhaps they are apparent contradictions to you. Let me give you an example.

If you are trying to play Crysis on 'Very High' with a single HD 5870 you will have a pretty bad experience at 2560x1600. Simply throwing HD 5870 CF at the same settings will give you a stuttery mess

Now if you play Crysis on 'very high' with a single HD 5870 at 1920x1200, you will have a fair experience. Adding a second HD 5870 in CF will turn this experience into a very playable one without noticeable MS and you will be able to add more AA
;)

definition of "Ripped Off"; A product or service that is overpriced or of poor quality.

example; the 2nd gpu you added to your system does not produce smoother gameplay than a single gpu solution.
 
definition of "Ripped Off"; A product or service that is overpriced or of poor quality.

example; the 2nd gpu you added to your system does not produce smoother gameplay than a single gpu solution.
So what experience do you have with Multi-GPU to blanket condemn it as a "rip off" ? i have been using CrossFire since the 2900XT days; i have experience with HD 4870-X3 TriFire and i am gaming currently on HD 5870 CF and i also have GTS 450 SLI and most recently (yesterday) GTX 480 SLI.
:rolleyes:

Adding the second GPU will generally increase your framerates while allowing you to add more detail or take the resolution up.
--SLI and CF allow you to take your gaming experience to a new and higher level; as long as you understand the limitations.
:cool:
 
So what experience do you have with Multi-GPU to blanket condemn it as a "rip off" ?

SLI and CF allow you to take your gaming experience to a new and higher level; as long as you understand the limitations.
:cool:

i posted video proof of micro stutter in this thread, and of how it is not smoother than a single gpu solution. see for yourself. the requirement of turning down graphical settings to eliminate micro stutter is kind of an "oxymoron". you don't add more gpu power in order to turn down the graphics settings.
 
Whatever micro-stutter is, I've never experienced the phenomenon and my eyes are quite sensitive to any miscellaneous anomalies. This has been my experience in single, CF or SLI configurations.

From my perspective, considering the 100s of articles and posts I've read regarding the subject, I find that there are way too many factors absent for me to even consider micro-stutter a truly plausible, identically re-creatable event. Until proven otherwise, this is how I view micro-stutter.
 
Whatever micro-stutter is, I've never experienced the phenomenon and my eyes are quite sensitive to any miscellaneous anomalies. This has been my experience in single, CF or SLI configurations.

From my perspective, considering the 100s of articles and posts I've read regarding the subject, I find that there are way too many factors absent for me to even consider micro-stutter a truly plausible, identically re-creatable event. Until proven otherwise, this is how I view micro-stutter.

what games do you play and at what frame rate?
 
i posted video proof of micro stutter in this thread, and of how it is not smoother than a single gpu solution. see for yourself. the requirement of turning down graphical settings to eliminate micro stutter is kind of an "oxymoron". you don't add more gpu power in order to turn down the graphics settings.

Really? "proof" .. where?
:confused:

Not in your first post. You have proof of nothing.
:rolleyes:

And you seem to have no understanding of what i am saying >> You add more GPU power to turn UP settings.

However, you *will* get MS if you exceed the capabilities of the two cards just like you will get crap results if you overload your single card.
 
Well, I have a single GPU setup. I was griping about Gothic 4 and the 'stuttering', yet I am getting 30-50 FPS with fraps. I used your program and it says I have an apparent frame rate of 11 with an average micro stutter index of 10%.

Again, with a single GPU.
 
Interesting, I ran this on my 5970 playing Civ5 for a little while and did some regular gameplay.

These are the results

[attached screenshot: capturefj.jpg]


I'm not sure a deviation of 8% is very significant.

How are you calculating this exactly? How do you work out what the average is for any given time? The reason I ask is because the frame rate can change drastically in game depending on what load the GPU is under; for example, sudden effects like explosions, which contain many particles, smoke, and shader effects and which make changes to the environment, can cause a sudden increase in frame render time, but only for short periods.

You can also have a sudden increase/decrease in the average over a long time. In Civ5, if I zoom in my average stays high because there isn't much on screen; if I zoom right out the frame rate might halve for the duration.

I'm not convinced this sort of statistic is really applicable to gaming. If games rendered with a steadier average, like when you stand still and nothing is changing in the scene, then this might be more accurate. But as a lot changes in the scene, even local averages can be inaccurate.

There's simply no way for the application to tell what is causing any given difference in frame time, whether it's the game varying the load on the GPUs or the GPUs being out of sync, and because the game's load on the GPU is dependent on player action, we'd need some kind of common benchmarking guidelines across people using the tool.
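One crude check that might help separate the two causes (an assumption on my part, not how the tool works): slow load swings move neighbouring frame times together, while AFR-style pacing shows up as a persistent short/long alternation between consecutive frames.

```python
import statistics

def alternation_ratio(frametimes_ms):
    """Median long/short ratio over consecutive pairs of frame intervals.

    Slow changes in GPU load move both frames of a pair together, keeping the
    ratio near 1.0; an AFR short/long pacing pattern pushes it well above 1.0
    no matter what the local average is doing.
    """
    ratios = [max(a, b) / min(a, b)
              for a, b in zip(frametimes_ms[0::2], frametimes_ms[1::2])]
    return statistics.median(ratios)

print(alternation_ratio([16.7] * 60))        # ~1.0: even pacing
print(alternation_ratio([8.3, 25.0] * 30))   # ~3.0: strong alternation
```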
 
v-sync is a viable option to control microstutter but it is a difficult pill to swallow for someone who has spent a small fortune on video cards. They want to see their framerates at 100+, even if only for an instant.
 