Poll: Micro stutter

Do you notice micro stutter?


Total voters: 118
If you see it then it's not microstuttering :) By definition, "micro" would imply something you can't see, and it affects you on a different level (i.e. the effective framerate being lower than what's displayed). Just like you can't see an electron orbiting a nucleus, yet it's the electrons flowing on the surface of a metal that make it shiny.

If the main effect is a much lower fps than what is expected, then this reminds me of an SLI Anandtech article I read that said performance was better if vsync was left enabled. I leave the drivers at default and vsync enabled. I'm wondering if people experiencing microstutter are disabling vsync.

http://www.anandtech.com/showdoc.aspx?i=3275&p=4
 
If the main effect is a much lower fps than what is expected, then this reminds me of an SLI Anandtech article I read that said performance was better if vsync was left enabled. I leave the drivers at default and vsync enabled. I'm wondering if people experiencing microstutter are disabling vsync.

http://www.anandtech.com/showdoc.aspx?i=3275&p=4

It's not a lower framerate than what is expected, it's a lower framerate than what is displayed. You could have FRAPS or whatever telling you that you're getting 30fps, but because of the variation in rendering time from one frame to the next, it's effectively only 25fps... but because the average is 30fps, that's what FRAPS or whatever program you're using tells you.

So what you're being told is a certain value, but what you're "feeling" is actually less. If you actually feel stuttering, chances are it IS stuttering and not microstutter.
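To put rough numbers on that (just an illustration with made-up frame times, not real measurements), the sketch below shows how an average of 30fps reported by FRAPS can "feel" like 25fps when every second frame takes longer:

```python
# Made-up frame times (in seconds) alternating short/long, the way AFR tends to.
# The average works out to ~30fps, but the long gaps are what you "feel".
frame_times = [0.0267, 0.0400] * 30   # 60 frames over ~2 seconds

average_fps = len(frame_times) / sum(frame_times)   # what FRAPS reports
effective_fps = 1.0 / max(frame_times)              # pace set by the longest gaps

print(f"average fps:   {average_fps:.1f}")    # ~30.0
print(f"effective fps: {effective_fps:.1f}")  # 25.0
```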

EDIT: This is why I'm so interested in microstutter before I buy a 4850X2 (when it comes out, of course, assuming it's a reasonable price :p). In most games the 4850X2's framerate is equal to or higher than that of a GTX260 or 4870, and a lot higher with filters and resolution turned up. But if microstuttering exists for these cards, at the end of the day it may actually be no better for gaming than the GTX260 or 4870. It's a case of "what are the max playable settings?" as opposed to "what framerate are you getting at these specific settings?".
 
So the question is, why don't they do that with CrossFire and SLI now?

Same reason why SFR and SuperTiling are hardly used now: each card would have to render a lot of the same data (certain geometry data, certain vertex data, etc.), making things extremely redundant (and hampering performance), whereas back in the 3dfx days you didn't have this problem (thus AFR provides a much better performance increase nowadays).

If it's as big a problem as people say, then it might be worth it to come up with some sort of V-sync-like feature for syncing AFR.

Like Vsync (without triple buffering)...:p
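Just to illustrate the idea (this is purely hypothetical, not how any driver actually does it): a "sync AFR" feature would basically hold back frames that finish too soon, so the gaps between displayed frames come out evenly spaced. A rough Python sketch:

```python
import time

class FramePacer:
    """Hypothetical frame pacer: delays frames that arrive 'too early'
    so the gap between presented frames stays close to the running average."""

    def __init__(self):
        self.last_present = None   # time the previous frame was presented
        self.avg_gap = None        # smoothed estimate of the frame interval

    def present(self, show_frame):
        now = time.perf_counter()
        if self.last_present is not None:
            gap = now - self.last_present
            # Exponential moving average of the raw frame interval.
            self.avg_gap = gap if self.avg_gap is None else 0.9 * self.avg_gap + 0.1 * gap
            # If this frame finished much sooner than usual, wait so the
            # displayed gaps are even instead of short/long/short/long.
            if gap < self.avg_gap:
                time.sleep(self.avg_gap - gap)
        self.last_present = time.perf_counter()
        show_frame()
```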
 
If you see it then it's not microstuttering :) By definition, "micro" would imply something you can't see...

I thought micro just meant "small". :p

Seriously though, I'm curious to know what you mean by the "perceived" framerate being lower than the actual framerate? As someone who hasn't had any dual-GPU setups, I haven't been able to observe this phenomenon in its natural form (although I believe I have seen something similar).

If my understanding is correct, it's not so much a "lower than actual" framerate as it is a non-uniform framerate. Due to the non-uniform gap between frames it will appear extra "jerky" / "stuttery" / "choppy". However, every single frame is still being rendered and displayed, it's just that they don't feel as smooth as they should?

I'm not sure that 40 "microstuttering" fps should appear the same as 30 "uniform" fps, as the 30fps still gives a decently smooth sense of motion, while microstuttering makes the motion appear to jerk and stutter (albeit on a small scale).

Seeing as multi-GPU seems to be the future, this is something that should definitely be of interest to the enthusiast gamer.
 
If you see it then it's not microstuttering :) By definition, "micro" would imply something you can't see, and it affects you on a different level (i.e. the effective framerate being lower than what's displayed). Just like you can't see an electron orbiting a nucleus, yet it's the electrons flowing on the surface of a metal that make it shiny.

So how do you see the shininess then?
 
I thought micro just meant "small". :p

Seriously though, I'm curious to know what you mean by the "perceived" framerate being lower than the actual framerate? As someone who hasn't had any dual-GPU setups, I haven't been able to observe this phenomenon in its natural form (although I believe I have seen something similar).

If my understanding is correct, it's not so much a "lower than actual" framerate as it is a non-uniform framerate. Due to the non-uniform gap between frames it will appear extra "jerky" / "stuttery" / "choppy". However, every single frame is still being rendered and displayed, it's just that they don't feel as smooth as they should?

Pretty much. The problem is that many people's eyes will "latch" onto the longest gap between frames and perceive that as the actual framerate for the purposes of smoothness and playability. The term "microstuttering" is supposed to be used to denote uneven framerates due to AFR timing issues and not from an external factor like disk activity or extra CPU load bottlenecking the framerate.
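If anyone wants to check which of the two they're seeing in a FRAPS frametimes dump, here's a quick-and-dirty heuristic (entirely my own, nothing official): AFR microstutter flips between short and long gaps on almost every frame, while a disk or CPU hitch shows up as an isolated spike.

```python
def looks_like_afr_microstutter(frame_times_ms, ratio=1.3):
    """True if frame times alternate short/long (AFR-style), rather than
    containing the odd isolated spike (disk activity, CPU load, etc.)."""
    longer = [frame_times_ms[i] > ratio * frame_times_ms[i - 1]
              for i in range(1, len(frame_times_ms))]
    # Count how often the pattern flips between "longer" and "not longer".
    flips = sum(1 for a, b in zip(longer, longer[1:]) if a != b)
    return flips > 0.7 * len(longer)

print(looks_like_afr_microstutter([27, 40] * 30))          # alternating gaps -> True
print(looks_like_afr_microstutter([33] * 58 + [120, 33]))  # one big hitch   -> False
```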
 
Upgraded from an 8800GTX to a 9800GX2 and get mad stuttering in CSS. I want my single GPU back.
 
I have the 177.83s. Just haven't installed them yet.

P.S. I also have the 177.89s waiting to check out. ;)
 
So how do you see the shininess then?

I was about to write a long thing about how it works, but I found this site that explains it for me. :p

http://www.chemistry-react.org/go/Faq/Faq_3723.html

Basically the electrons give you the shiny surface, but you don't see the electrons... Same with microstuttering: microstuttering gives you a perceived framerate lower than the framerate displayed by FRAPS, but you don't actually see the microstutter itself (unless your framerate is so incredibly low that you can make out the variations in rendering time). ;)

As a side note, the prefix "micro" either means something x10^-6 (micrometre, microsecond, etc.) or can also mean something which is too small to see (which is usually the micrometre scale; it's pretty much impossible to see something that's a few micrometres across). If you were talking about something you can see, it'd be called "macro". My materials engineering lecturer always used these terms :p The macro stuff is the building itself, the micro stuff is what's actually holding the building together.
 
This is what has scared me away from dual cards. I think about SLI a lot, but then this makes me not want to do it. I know I'll be one of the people who can see it and it will piss me off to no end. I say,
 
This is what has scared me away from dual cards. I think about SLI a lot, but then this makes me not want to do it. I know I'll be one of the people who can see it and it will piss me off to no end. I say,

Well, if it's any consolation, I think that it's gotten better with the current generation of cards. I was very vocal here about my displeasure with the microstuttering that I saw in SLI 8800GT, but I'm very happy with my SLI GTX 280.
 
lol my point was they are still "seeing" the microstutter - just indirectly by your definition.

I'm probably taking what you first said the wrong way... but I don't feel like going back through the thread to check for such a tangential discussion.
 