Poll: Micro stutter

Do you notice micro stutter?


  • Total voters
    118

Gorankar

[H]F Junkie
Joined
Jul 19, 2000
Messages
11,107
How many of you have noticed micro stutter on multi-GPU setups?

We have had SLI and Crossfire for a while now, and a lot of people around here have used multi-GPU setups: Nvidia's 6800, 7800, 7900, 8800, 9800, and now 200 series, and ATI's 1900, 2900, 3800, and now 4800 series. Yet I had almost never heard micro stutter talked about since the 6800 series launched. Suddenly it seems to have become a big deal. I'm just wondering why, and who really notices it.

I have extensively used Voodoo 2, Voodoo 5, 6800GT, 7800GT, and 8800GTS multi-GPU setups and never noticed it.
 
I didn't notice it on my 8800GTXs or on my current GTX280s.
I do play at 1920x1200 with Vsync and Triple buffering enabled though. :)
 
I didn't notice it on my 8800GTXs or on my current GTX280s.
I do play at 1920x1200 with Vsync and Triple buffering enabled though. :)
I thought SLI had issues with triple buffering. Also, how do you actually enable triple buffering in DirectX games?
 
When I'm playing BioShock on my SLI setup, every 5 minutes or so it will hiccup for a split second, like a 1/10 second or shorter freeze. Is that micro stuttering?

I did not notice it in Crysis, with Vsync on.

So what do people think about Vsync and triple buffering on multi-GPU: on/on, off/off, off/on, what?
 
When I'm playing BioShock on my SLI setup, every 5 minutes or so it will hiccup for a split second, like a 1/10 second or shorter freeze. Is that micro stuttering?

I did not notice it in Crysis, with Vsync on.

So what do people think about Vsync and triple buffering on multi-GPU: on/on, off/off, off/on, what?
I'm not really sure if there are issues with triple buffering in SLI or not, but I have heard that. Also, turning on triple buffering in the drivers only affects OpenGL, not DirectX games. The only way to force triple buffering in DirectX games is to use a third-party tool.
 
I never had the problem on any of the SLI configurations I've had except my 9800GX2 setup. I've never had the problem with my Crossfire configurations, though I've had far fewer of those than SLI systems.
 
When I'm playing BioShock on my SLI setup, every 5 minutes or so it will hiccup for a split second, like a 1/10 second or shorter freeze. Is that micro stuttering?

I did not notice it in Crysis, with Vsync on.

So what do people think about Vsync and triple buffering on multi-GPU: on/on, off/off, off/on, what?

My understanding of microstutter, which someone will correct if I am wrong, is that it's a continuous judder effect, so that even though you may be getting high framerates, the overall experience is not very smooth.

This video is probably the best demonstration of it given that the video feeds for all the different cards are shown side by side:

http://www.youtube.com/watch?v=DYnXxI1UjxE
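To put rough numbers on that: an FPS counter only shows the average, so a perfectly even sequence of frames and an AFR-style alternating sequence can report the exact same framerate. A quick C++ sketch (the 10/40 ms split is an invented example, not measured data):

```
// Same average FPS, very different smoothness: the uniform sequence
// spaces frames 25 ms apart; the "AFR" sequence alternates 10/40 ms.
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> uniform(10, 25.0);  // 25 ms between every frame
    std::vector<double> afr;                // alternating 10 ms / 40 ms gaps
    for (int i = 0; i < 5; ++i) { afr.push_back(10.0); afr.push_back(40.0); }

    for (const auto* seq : { &uniform, &afr }) {
        double total = 0.0, longest = 0.0;
        for (double ms : *seq) { total += ms; longest = ms > longest ? ms : longest; }
        printf("avg %.0f FPS, worst gap %.0f ms (~%.0f FPS perceived)\n",
               1000.0 * seq->size() / total, longest, 1000.0 / longest);
    }
}
```

Both sequences print a 40 FPS average, but the alternating one is paced by its 40 ms gaps, which is why it can feel more like 25 FPS.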
 
I thought SLI had issues with triple buffering. Also, how do you actually enable triple buffering in DirectX games?
I enable it in Control Panel. I also set the Texture filtering-Quality to High quality.
 
I enable it in Control Panel. I also set the Texture filtering-Quality to High quality.
lol, that's what I thought. If you enable it in the control panel then you aren't actually using triple buffering in DX games; enabling it there ONLY affects OpenGL games. You have to use a third-party tool to force triple buffering for DX games.
 
nHancer also shows it enabled. nHancer does have a force Triple buffer for OpenGL though.
 
nHancer also shows it enabled. nHancer does have a force Triple buffer for OpenGL though.
I'm not sure about nHancer since I don't use it, but ... Enabling Triple Buffering for OpenGL-based games such as Doom 3, Quake 4, Prey or ET:QW is very simple - go to your graphics card's control panel and enable it from there. However enabling Triple Buffering for Direct3D-based games - the bulk of modern games - actually requires more effort. You will need to use a utility called DXTweaker or D3DOverrider (which comes with RivaTuner) for Nvidia graphics cards - see the Advanced Tweaking section of my Nvidia Forceware Tweak Guide; or use ATI Tray Tools as covered under the Advanced Tweaking section of my ATI Catalyst Tweak Guide. Note that SLI users may have problems enabling Triple Buffering and will need to experiment with various SLI modes to get it to work properly.

http://www.tweakguides.com/Graphics_10.html
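For what it's worth, the reason the control panel setting can't reach Direct3D games is that the back buffer count is picked by the game itself when it creates its device, so a tool like D3DOverrider has to hook that call and change the value. A rough C++/Direct3D 9 sketch of the relevant field (just the setup struct, not a full program; needs the DirectX SDK):

```
// Where triple buffering lives in Direct3D 9: the game fills this
// struct at device creation, which is why the driver control panel
// can't force it and hook tools override it here instead.
#include <d3d9.h>

void fill_present_params(D3DPRESENT_PARAMETERS& pp, HWND hwnd) {
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed             = FALSE;
    pp.hDeviceWindow        = hwnd;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;
    pp.BackBufferCount      = 2;  // two back buffers + front buffer = triple buffering
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // Vsync on
}
```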
 
Uh, is this where you start shaking in one place for a brief split second, then go back to normal?

Anyway, I used to have a decent rig:
M2N SLI
2 x 8800GT
700 watt XVS Ultra

I had that problem and was so frustrated. I saw a 1k jump in 3DMark, but when playing Source engine games there was no FPS boost, and I would get the stuttering.

Anyway, I jumped to conclusions, sold the 8800GT for $80, and then found out about this news.
 
I'm not sure about nHancer since I don't use it, but ... Enabling Triple Buffering for OpenGL-based games such as Doom 3, Quake 4, Prey or ET:QW is very simple - go to your graphics card's control panel and enable it from there. However enabling Triple Buffering for Direct3D-based games - the bulk of modern games - actually requires more effort. You will need to use a utility called DXTweaker or D3DOverrider (which comes with RivaTuner) for Nvidia graphics cards - see the Advanced Tweaking section of my Nvidia Forceware Tweak Guide; or use ATI Tray Tools as covered under the Advanced Tweaking section of my ATI Catalyst Tweak Guide. Note that SLI users may have problems enabling Triple Buffering and will need to experiment with various SLI modes to get it to work properly.

http://www.tweakguides.com/Graphics_10.html
Yep, you were correct. I must have been thinking it still enabled it like in older Forceware versions.
 
When I play games, I don't load FRAPS and tune the settings so that I get XX FPS in that game; I change the settings so that I can play the game smoothly. If it doesn't play smoothly, I lower the settings. If it plays very well, I try a higher setting. That's why in games like Crysis I can still play even when the FPS is just around 30, while in some other games I need more than 50 FPS. After using several multi-GPU setups myself, I can say that adding another card would most of the time let me use higher settings.
 
It's a tough issue, as I think many people don't understand what micro-stuttering really is (and often confuse it with regular stuttering and hiccups).

Personally I've never used a multi-GPU setup; however, I had a broken Nvidia driver on my 8800GT once which would, oddly enough, cause micro-stuttering in Oblivion.

The best way to observe it is when panning the screen horizontally and focusing on some object in the foreground (e.g., trees or grass in Oblivion).

What you see is non-uniform motion. If the frame times remain consistent, every frame your reference object should move the same amount (assuming you use a steady pan). When microstuttering is happening, you will notice non-uniform, "jerky" movement (fast then slow, fast then slow, different speeds, etc.). This gives the appearance that it's "pausing" many times, very fast, i.e. stuttering. What is worth noting is that the average movement (and thus framerate) is fine, but when you watch closely it's not moving the same amount between frames.

It was hugely distracting in Oblivion when that particular driver was messing up (some sort of v-sync bug I'd imagine), but in that specific case I believe the frame times were much more erratic (rather than the steady back-and-forth you sometimes observe with dual-GPU solutions), so I am really curious to see how significant an issue it is in person, with Crossfire or SLI.

If it's as big a problem as people say, then it might be worth it to come up with some sort of V-sync-like feature for syncing AFR (so that the second GPU starts its rendering when the first is halfway done its frame; that way they are always evenly spaced).

As we've gone multi-core on the desktop, multi-GPU is likely the future of graphics cards too, so this issue might become more prominent in the future.
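To make that AFR syncing idea concrete, here's a toy C++ simulation (my own sketch, not anything from Nvidia or ATI): with two GPUs that each take 40 ms per frame, starting them almost together bunches the presents, while offsetting the second GPU by half a frame time spaces them evenly.

```
// Toy AFR timing model: print when each frame would be presented,
// first with the GPUs starting nearly together (2 ms apart, an
// arbitrary figure), then with a half-frame-time "metering" offset.
#include <cstdio>

int main() {
    const double frame_ms = 40.0;  // time each GPU needs per frame
    for (double offset : { 2.0, frame_ms / 2.0 }) {
        printf(offset < frame_ms / 2.0 ? "bunched: " : "metered: ");
        for (int frame = 0; frame < 6; ++frame) {
            int gpu = frame % 2;  // even frames on GPU 0, odd on GPU 1
            printf("%5.0f", (frame / 2) * frame_ms + gpu * offset + frame_ms);
        }
        printf("  (ms)\n");  // bunched: 40 42 80 82 ...; metered: 40 60 80 100 ...
    }
}
```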
 
I have had three different Crossfire rigs. X1950 Pros, HD3870s, HD4870s. Have never seen the micro-stutter.

-V
 
I'm running a GX2 now, and I can honestly say that I'm still not sure what all this talk about microstuttering is about. I'm just not noticing it. Maybe I would if I actually had a GTX280 card here to see the difference.
 
I noticed it fairly badly on my SLI 8800GT setup. With SLI GTX 280, it's still there, but much less frequent and much more tolerable when it does occur.
 
I'm running a GX2 now, and I can honestly say that I'm still not sure what all this talk about microstuttering is about. I'm just not noticing it. Maybe I would if I actually had a GTX280 card here to see the difference.

You'd only notice it if the framerate you feel you're running at is lower than the actual framerate being displayed by the game or FRAPS, or if you had a program that could measure the time delay between each frame, which FRAPS cannot do.
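Measuring those gaps yourself is straightforward if you control the render loop; a minimal C++ sketch (render_frame() here is just a stand-in for the real work):

```
// Log the time between consecutive frames - the data an average FPS
// counter hides. Uneven gaps at a steady FPS are the microstutter.
#include <chrono>
#include <cstdio>
#include <thread>

void render_frame() {
    // placeholder: pretend rendering takes roughly 16 ms
    std::this_thread::sleep_for(std::chrono::milliseconds(16));
}

int main() {
    using clock = std::chrono::steady_clock;
    auto prev = clock::now();
    for (int frame = 0; frame < 60; ++frame) {
        render_frame();
        auto now = clock::now();
        printf("frame %2d: %6.2f ms\n", frame,
               std::chrono::duration<double, std::milli>(now - prev).count());
        prev = now;
    }
}
```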
 
If it's as big a problem as people say, then it might be worth it to come up with some sort of V-sync-like feature for Syncing AFR. (So that the second GPU starts it's rendering when the first is half way done it's frame, that way they are always evenly spaced).
The problem is that the next frame could take significantly longer or shorter than the first, depending on what's different. I remember reading somewhere that Crysis is/was notorious for "breaking" AFR prediction heuristics in Nvidia's drivers.

Back when I had SLI 8800GT, I actually used a utility called fpslimiter (search for it if you're interested; I think I've mentioned it a couple of times here) that attempted to lock the framerate to whatever you specified. It actually did work for DX9 games. 30 fps in Crysis actually felt like 30 fps should in that game, instead of 20 fps.
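I'd guess a limiter like that helps because it forces frames onto a fixed grid so they can't bunch up. A rough C++ sketch of the idea (my own illustration, not fpslimiter's actual code):

```
// Cap the framerate by sleeping until the next slot on a fixed
// ~30 FPS grid. Advancing "next" by a constant step (rather than
// "now + step") keeps the long-run rate exact.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto step = std::chrono::microseconds(33333);  // ~30 FPS
    auto next = clock::now() + step;
    for (int frame = 0; frame < 300; ++frame) {
        // render_frame() would go here
        std::this_thread::sleep_until(next);  // wait out the remainder of the slot
        next += step;
    }
}
```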
 
Microstuttering != Stuttering

Just remember that: a lot of people saying they see MS might be seeing plain old stuttering.
 
The poll results speak volumes on how important this really is to gaming.
 
Personally I've never used a multi-GPU setup; however, I had a broken Nvidia driver on my 8800GT once which would, oddly enough, cause micro-stuttering in Oblivion.
That's actually a common problem with Oblivion's engine and isn't microstuttering. I don't know exactly why it can't render smoothly, but I assume there's just something severely wrong with that renderer.
 
You can't microstutter on single cards. That's a driver or GPU performance issue.

MS can only exist with AFR, which only happens in multi-GPU setups.
 
You can't microstutter on single cards. That's a driver or GPU performance issue.

MS can only exist with AFR, which only happens in multi-GPU setups.

There are multi-GPU cards out there. They can certainly experience micro-stuttering.
 
There are multi-GPU cards out there. They can certainly experience micro-stuttering.

Pretty sure he was referring to multi-GPU and not just multi-card setups.


You can't microstutter on a single GPU. That's a driver or GPU performance issue.

MS can only exist with AFR, which only happens in multi-GPU setups.

fixed
 
I'm running a GX2 now, and I can honestly say that I'm still not sure what all this talk about microstuttering is about. I'm just not noticing it. Maybe I would if I actually had a GTX280 card here to see the difference.

I'm not sure what this microstuttering is either; I don't notice any dents in my gameplay. Even if it does exist (which I am not denying), I haven't seen it, therefore I am not concerned about it. I don't care how many graphs, charts, and plain old rants some members have about this "microstutter"; if it ain't affecting my gameplay, I'm sorry, but I don't care. My 9800GX2 is smooth as butter.
 
I'm not sure what this microstuttering is either; I don't notice any dents in my gameplay. Even if it does exist (which I am not denying), I haven't seen it, therefore I am not concerned about it. I don't care how many graphs, charts, and plain old rants some members have about this "microstutter"; if it ain't affecting my gameplay, I'm sorry, but I don't care. My 9800GX2 is smooth as butter.

If it doesn't affect your gameplay then that's good; I'm glad you're happy with your card.

The reason I'm concerned with MS is that, looking at all the benchmarks for the 4870x2 in reviews, if the 4870x2 and GTX280 are, for example, both getting 40 fps in a game, you'd assume they are both performing the same. However, IF the 4870x2 is microstuttering (IF), then even though it displays 40 it's probably only actually getting 30-35 fps, making the GTX280 the better performing card. You still might not notice it, but effectively you may be playing on a worse performing system.

It doesn't concern me so much for the 4870x2; it concerns me for the approaching 4850x2, which I'm more likely to buy, and which will have performance in the 4870-GTX260-GTX280 range. If it's microstuttering, then that may make those single-GPU options superior.
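The 40-versus-30-35 figure is easy to sanity-check with back-of-envelope numbers (invented for illustration): if the 40 fps average is delivered as alternating fast/slow gaps, the slow frames set the pace. A tiny C++ check:

```
// Hypothetical alternating AFR gaps that still average out to 40 FPS.
#include <cstdio>

int main() {
    const double fast_ms = 18.0, slow_ms = 32.0;  // invented gap pattern
    printf("reported: %.0f FPS\n", 2000.0 / (fast_ms + slow_ms));   // 40 FPS
    printf("paced by slow frames: %.1f FPS\n", 1000.0 / slow_ms);   // ~31 FPS
}
```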
 
If it doesn't affect your gameplay then that's good; I'm glad you're happy with your card.

The reason I'm concerned with MS is that, looking at all the benchmarks for the 4870x2 in reviews, if the 4870x2 and GTX280 are, for example, both getting 40 fps in a game, you'd assume they are both performing the same. However, IF the 4870x2 is microstuttering (IF), then even though it displays 40 it's probably only actually getting 30-35 fps, making the GTX280 the better performing card. You still might not notice it, but effectively you may be playing on a worse performing system.

It doesn't concern me so much for the 4870x2; it concerns me for the approaching 4850x2, which I'm more likely to buy, and which will have performance in the 4870-GTX260-GTX280 range. If it's microstuttering, then that may make those single-GPU options superior.


Hmm, that's the thing: why are we only talking about it now? I've searched this forum looking for people talking or complaining about MS and came up almost empty until fairly recently. Almost no mention of it at all from the 6800 series up until a few months prior to the 200 and 4800 series launches.

Just wondering why the people who notice it are only telling us about it now?
 
Because of the 4870x2 :p

Also, I think the term itself, microstutter, is more recent. That doesn't mean it didn't exist before; it just means it didn't have a name, or no one thought to analyse an SLI/Crossfire setup frame by frame rather than by the simple framerate that FRAPS and most games display.
 
That's actually a common problem with Oblivion's engine and isn't microstuttering. I don't know exactly why it can't render smoothly, but I assume there's just something severely wrong with that renderer.

Agreed; while not technically microstuttering in name, the end result seemed to be about the same (non-uniform frame times), which is why I found it interesting.

Did early iterations of SLI use methods other than AFR? I'd imagine those wouldn't have this problem, which could explain why it's a more recently observed phenomenon.

For microstuttering to be a nuisance, the difference in frame times has to be noticeable. Once framerates get high enough, the frame times are so low that the differences don't result in any noticeable change in motion on the screen.

If you don't notice it, I wouldn't advise trying to look for it, as, like with MP3 artifacting, once you train yourself to see the difference you can never go back :/ A horizontal "pan" test across the screen (generating movement) using a target in the foreground seems like the best way to see if it's visible. If the object moves the same amount across the screen between frames (uniform frame times), the motion will seem smooth. If the object moves different amounts between frames (non-uniform frame times), the motion will appear jerky (not smooth). Depending on the framerate and such, you may have to pan at different speeds.
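Here's the pan test as arithmetic, in a short C++ sketch (pan speed and gaps invented): the step an object makes across the screen each frame is pan speed times frame time, so uneven frame times show up directly as uneven jumps.

```
// On-screen step per frame = pan speed x frame time. Uniform gaps
// give identical 12.5 px steps; AFR-style jitter gives 5/20 px steps.
#include <cstdio>

int main() {
    const double pan_px_per_ms = 0.5;              // steady pan speed
    const double gaps_ms[] = { 25, 25, 25, 25,     // uniform frame times...
                               10, 40, 10, 40 };   // ...then jittered ones
    for (double gap : gaps_ms)
        printf("object moved %4.1f px\n", gap * pan_px_per_ms);
}
```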
 
Agreed; while not technically microstuttering in name, the end result seemed to be about the same (non-uniform frame times), which is why I found it interesting.

Did early iterations of SLI use methods other than AFR? I'd imagine those wouldn't have this problem, which could explain why it's a more recently observed phenomenon.

For microstuttering to be a nuisance, the difference in frame times has to be noticeable. Once framerates get high enough, the frame times are so low that the differences don't result in any noticeable change in motion on the screen.

If you don't notice it, I wouldn't advise trying to look for it, as, like with MP3 artifacting, once you train yourself to see the difference you can never go back :/ A horizontal "pan" test across the screen (generating movement) using a target in the foreground seems like the best way to see if it's visible. If the object moves the same amount across the screen between frames (uniform frame times), the motion will seem smooth. If the object moves different amounts between frames (non-uniform frame times), the motion will appear jerky (not smooth). Depending on the framerate and such, you may have to pan at different speeds.

3dfx primarily used scan line interleave. Basically, each card rendered every other scan line. For load balancing the work those cards did at the time, it was excellent; the load on each GPU remained much, much closer than is reasonably possible with today's games and GPUs, with all the work they do. Micro stutter was not really possible, or rather, if/when it occurred it would show up as tearing. I never saw it though, or if I did, I thought it was regular tearing due to Vsync being off.

Pretty much all other companies used AFR or split-screen rendering, with AFR being the most common method.
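A toy C++ illustration of why that scheme balanced so well: both cards always work on the same frame, just different lines of it, so their per-frame timing can't drift apart the way AFR's can.

```
// 3dfx-style scan line interleave, reduced to its core: line N of
// every frame goes to GPU (N % 2), so both GPUs finish each frame together.
#include <cstdio>

int main() {
    const int height = 8;  // tiny "screen" for illustration
    for (int line = 0; line < height; ++line)
        printf("scan line %d -> GPU %d\n", line, line % 2);
}
```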
 
3dfx primarily used scan line interleave. Basically, each card rendered every other scan line. For load balancing the work those cards did at the time, it was excellent; the load on each GPU remained much, much closer than is reasonably possible with today's games and GPUs, with all the work they do. Micro stutter was not really possible, or rather, if/when it occurred it would show up as tearing. I never saw it though, or if I did, I thought it was regular tearing due to Vsync being off.

Pretty much all other companies used AFR or split-screen rendering, with AFR being the most common method.

So the question is: why don't they do that with Crossfire and SLI now?

Maybe it's because AFR gives them higher average framerates, while the actual playable framerate is lower than what is displayed, making their cards look better in reviews.
 
Every graphics card I have ever owned has stuttered sometimes when loading textures. Usually it was due to me setting the texture quality too high. I don't know what microstuttering is, but my 9800GX2 is the smoothest graphics card I have ever owned. It has exceeded my expectations, but it does occasionally stutter when loading new textures. However, it is not a consistent, terrible stutter. I have only owned the 9800GX2 for 4 months, but in that time I have played Oblivion, the Crysis demo, the BioShock demo, and Sins of a Solar Empire. I set all the settings to max and it works great.
 
Every graphics card I have ever owned has stuttered sometimes when loading textures. Usually it was due to me setting the texture quality too high. I don't know what microstuttering is, but my 9800GX2 is the smoothest graphics card I have ever owned. It has exceeded my expectations, but it does occasionally stutter when loading new textures. However, it is not a consistent, terrible stutter. I have only owned the 9800GX2 for 4 months, but in that time I have played Oblivion, the Crysis demo, the BioShock demo, and Sins of a Solar Empire. I set all the settings to max and it works great.


If you see it, then it's not microstuttering :) By definition, "micro" would imply something you can't see that affects you on a different level (i.e. the effective framerate being lower than what's displayed). Just like you can't see an electron orbiting a nucleus, yet it's the electrons flowing on the surface of a metal that makes it shiny.
 