From 2x 570s to 1x 7970. Could NOT be happier.

Again, that does not mean that microstutter actually affects every multi-GPU setup, or at least not to the same extent on every setup.
It affects every multi-GPU setup, as they all use AFR. However, the extent is dependent on the hardware setup, the application, and other software.
So if I run my 560 Ti SLI with vsync on my 60Hz monitor, my game is actually running at 30fps because I'm using SLI, and all SLI has microstutter whether I realize it or not?
No, not at all. You need to go back and re-read some of the explanations posted in this thread.
 
It affects every multi-GPU setup, as they all use AFR. However, the extent is dependent on the hardware setup, the application, and other software.

No, not at all. You need to go back and re-read some of the explanations posted in this thread.

I did read them. It was stated several times in here that microstutter is the game running at a lower fps than what is actually able to be read/displayed. Which is absolutely retarded.

I think the problem here is that people are confusing raw frames being pumped out vs. how smoothly those frames are being pumped out. For example, 120 fps on an SLI setup vs. 120 fps on a single GPU: the SLI may not be as "smooth", but that doesn't change the fact that it is still 120fps. This is the whole reason most people are not noticing any stutter, because really, there is none :p
Only people on systems that are actually being taxed are going to notice any kind of stutter.
 
I did read them. It was stated several times in here that microstutter is the game running at a lower fps than what is actually able to be read/displayed. Which is absolutely retarded.

Where was that said? Mostly I've seen people say it "feels" like a lower FPS than a single card, not that it actually *is* a lower FPS.
 
When I was running 6990+6970 in tri-fire, I usually ran with Vsync on.

Never did I once notice any microstuttering at all. The only problem I had was the HORRIBLE Crossfire support from AMD.

I think microstuttering is a myth. I've seen quite a few friends with Crossfire/SLI and not one ever noticed microstuttering.
 
I did read them. It was stated several times in here that microstutter is the game running at a lower fps than what is actually able to be read/displayed. Which is absolutely retarded.

So if I pump out 120 frames in half a second and do nothing for the other half a second.... it's still 120fps. But I think you'll notice. Is the game running at 120fps? Not for the other half second.

The time it takes to render each frame is crucial. If it takes longer to render every 3rd/4th/5th frame, the fluidity of the gameplay will be disrupted.
 
When I was running 6990+6970 in tri-fire, I usually ran with Vsync on.

Never did I once notice any microstuttering at all. The only problem I had was the HORRIBLE Crossfire support from AMD.

I think microstuttering is a myth. I've seen quite a few friends with Crossfire/SLI and not one ever noticed microstuttering.

Microstutter is less apparent with tri-fire and quad-fire. You do know that you can measure microstutter, right? There are quite a few articles on microstutter and what it does, a couple of which have been posted in this thread, if you want to educate yourself on the subject.

I didn't think that I had ever noticed microstutter either. I had noticed that my 4870X2 in Crysis didn't feel any smoother than my GTX 280 despite the higher framerates. Little did I know that's what microstutter was.
 
When I was running 6990+6970 in tri-fire, I usually ran with Vsync on.

Never did I once notice any microstuttering at all. The only problem I had was the HORRIBLE Crossfire support from AMD.

I think microstuttering is a myth. I've seen quite a few friends with Crossfire/SLI and not one ever noticed microstuttering.

Did you read any of the thread at all, or any of the explanations of microstutter?

Tri-fire setups reduce microstutter; this has been explained, and that's why you didn't notice it.

And both Nvidia and AMD have stated that it's a problem. So it's not a myth.

Go back and read the thread.
 
So if I pump out 120 frames in half a second and do nothing for the other half a second.... it's still 120fps. But I think you'll notice. Is the game running at 120fps? Not for the other half second.

The time it takes to render each frame is crucial. If it takes longer to render every 3rd/4th/5th frame, the fluidity of the gameplay will be disrupted.

If your cards are taking a full half a second to render any frames, you've got bigger problems. :eek:
 
Where was that said? Mostly I've seen people say it "feels" like a lower FPS than a single card, not that it actually *is* a lower FPS.

It was said several times in here, but this is the best example.

Again, you misunderstand the concept of microstutter. You can't "see" it. All it is is a lower actual framerate than what's being reported in FPS. If you're getting 90 FPS, for example, you may actually be only rendering at a framerate closer to 70FPS; however, that's still more than most monitors' 60Hz refresh rate, so you can't tell the difference. Or, you could be rendering 50FPS, but the actual framerate is at 40 FPS. Many people can't detect or feel the difference between 40FPS and 50FPS.
 
Did you read any of the thread at all, or any of the explanations of microstutter?

Tri-fire setups reduce microstutter; this has been explained, and that's why you didn't notice it.

And both Nvidia and AMD have stated that it's a problem. So it's not a myth.

Go back and read the thread.

And on top of that, you were using Vsync.
 
I did read them. It was stated several times in here that microstutter is the game running at a lower fps than what is actually able to be read/displayed. Which is absolutely retarded.

I think the problem here is that people are confusing raw frames being pumped out vs. how smoothly those frames are being pumped out. For example, 120 fps on an SLI setup vs. 120 fps on a single GPU: the SLI may not be as "smooth", but that doesn't change the fact that it is still 120fps. This is the whole reason most people are not noticing any stutter, because really, there is none :p
Only people on systems that are actually being taxed are going to notice any kind of stutter.
It was said several times in here, but this is the best example.
Like I said, you need to go back and re-read the explanations because you clearly don't understand them. Framerate is not the same as FPS. FPS is an acronym for frames per second, which is a common way of reporting framerate, but it's not the actual framerate. That's what you don't seem to understand.
 
When I was running 6990+6970 in tri-fire, I usually ran with Vsync on.

Never did I once notice any microstuttering at all. The only problem I had was the HORRIBLE Crossfire support from AMD.

I think microstuttering is a myth. I've seen quite a few friends with Crossfire/SLI and not one ever noticed microstuttering.

I think microstuttering is completely gone in 3-way CF (6990 counts as 2 :)).
 
Like I said, you need to go back and re-read the explanations because you clearly don't understand them. Framerate is not the same as FPS. FPS is an acronym for frames per second, which is a common way of reporting framerate, but it's not the actual framerate. That's what you don't seem to understand.

I think you need to re-read your own explanations because you clearly don't understand them yourself.

You said:

If you're getting 90 FPS, for example, you may actually be only rendering at a framerate closer to 70FPS

How am I not understanding? You're basically saying that if I'm getting 120 FPS, my framerate is actually lower and can only be perceived but cannot be measured in any real way, so therefore my game is not running at 120 FPS but is running at some other magical number that is much slower.

And yet, if I turn on vsync, then the problem suddenly goes away and my framerate is then fine?

I had microstutter once, but then I rebooted my machine :p
 
I think he's trying to say that because some frames take longer than others to render, you're not really getting the same overall number of frames in, say, a minute, as you would without microstutter.

However, the 90 vs 70 thing seems way overblown. Probably more like 90 vs 88 or something, if that.
 
I think you need to re-read your own explanations because you clearly don't understand them yourself.
Only ignorance breeds this kind of arrogance. Amazing.
How am I not understanding? You're basically saying that if I'm getting 120 FPS, my framerate is actually lower and can only be perceived but cannot be measured in any real way, so therefore my game is not running at 120 FPS but is running at some other magical number that is much slower.
The fact that you attempt to reiterate my points incorrectly, several times, indicates that you do not understand. 120FPS is a rendering rate, but rendering is not constant. Look at the FPS graphs in any review on [H]; you'll see that the FPS changes constantly per unit time. For example: http://www.hardocp.com/image.html?image=MTMzMDQ5NzEyMVA2S3lyMkdTeVhfNl8zX2wuanBn . However, these graphs have a resolution of one second. Within that second, many frames are rendered. For example, at a rate of 120FPS, 120 frames are rendered within that second. However, those frames are not rendered in sync at an even 8.33ms apart. In fact, there's a large variance over that second, even with a single GPU and definitely with multi-GPU. As I stated previously, in dual-GPU systems there's generally a quick, staccato pattern to rendering, where frames 1 and 2 are closer together (<8.33ms in this case) and frames 2 and 3 are further apart (>8.33ms). The "feel" or smoothness of the game actually comes from the lowest common denominator, in this case the gap that is greater than 8.33ms. Therefore, even though you technically are rendering 120 frames in that second, the game won't feel nearly as smooth as if those frames had been rendered at a perfect, consistent 8.33ms apart.

As it stands, single GPUs have some variance, but not nearly as bad as multi-GPU systems that use AFR rendering methods. It's a limitation of the technology, plain and simple. Now, a third GPU added to a multi-GPU setup helps a lot with the feeling of smoothness and with removing microstutter, because it actually eats into that "lag time" after the frames are rendered, which raises that lowest common denominator and therefore increases the smoothness of the system. Anyone who has used a multi-GPU system with 3 or more cards can attest to this.
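
To put rough numbers on what I mean, here's a quick Python sketch. The frame gaps are invented for illustration, not measurements from any real card: both cases deliver 120 frames in one second, but in the AFR-style case the long gaps are what set the pace.

```python
# Rough, invented frame gaps -- not measurements from any real card -- just to show the idea.
even_gaps = [1000.0 / 120] * 120       # perfectly paced: a frame every ~8.33ms
afr_gaps = [5.0, 11.67] * 60           # staccato AFR pattern: short gap, long gap, repeat

for name, gaps in (("even pacing", even_gaps), ("AFR-style pacing", afr_gaps)):
    fps = len(gaps) / (sum(gaps) / 1000.0)   # what an FPS counter reports over the second
    worst = max(gaps)                        # the long gap is what governs the "feel"
    print(f"{name}: ~{fps:.0f} FPS reported, worst gap {worst:.2f}ms "
          f"(paced like ~{1000.0 / worst:.0f} FPS)")
```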
And yet, if I turn on vsync, then the problem suddenly goes away and my framerate is then fine?
Because you're artificially imposing a synchronization on the rendering, which is the main problem with AFR methods. Basically, by limiting the maximum speed at which frames can be drawn (in the example provided, 120FPS, or every 8.33ms), you allow the system to catch up, and the lag time isn't as long, especially in comparison. However, vsync comes with its own set of problems (input lag, staggering frame drops, etc.).
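
And a hedged sketch of why the cap hides it, again with made-up numbers: if frames finish in uneven pairs but each one can only be put on screen at the next 60Hz refresh tick (and you're rendering above the refresh rate), the unevenness never reaches the display.

```python
import math

# Hedged sketch with made-up numbers: frames finish in uneven AFR-style pairs, but with vsync
# each frame is only put on screen at the next 60Hz refresh tick after it finishes.
refresh = 1000.0 / 60.0                                        # one refresh every ~16.67ms
finished = [t for i in range(12) for t in (i * refresh + 2.0, i * refresh + 6.0)]
displayed = sorted({math.ceil(t / refresh) * refresh for t in finished})   # newest frame per tick
gaps = [round(b - a, 2) for a, b in zip(displayed, displayed[1:])]
print(gaps)   # every on-screen gap is ~16.67ms, so the unevenness never reaches the display
```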
I think he's trying to say that because some frames take longer than others to render, you're not really getting the same overall number of frames in, say, a minute, as you would without microstutter.

However, the 90 vs 70 thing seems way overblown. Probably more like 90 vs 88 or something, if that.
No, 90 vs. 70 is actually being generous on my part; it's often much worse. 90 vs. 88 is something you'd see in a single-GPU system.

More reading for those interested:
http://hardforum.com/showthread.php?t=1317582
 
So if I run my 560 Ti SLI with vsync on my 60Hz monitor, my game is actually running at 30fps because I'm using SLI, and all SLI has microstutter whether I realize it or not?

No, that is not what I meant at all. I'm saying that just because someone is not as sensitive to microstutter as another person does not mean microstutter does not exist, as someone here is claiming. Some people claim man never landed on the moon, too.

Oh, and using a photo of a person with Down's syndrome to try and make someone look foolish is low.
 
I did read them. It was stated several times in here that microstutter is the game running at a lower fps than what is actually able to be read/displayed. Which is absolutely retarded.

It is a sync issue and not so much an FPS issue. Think about it: multi-GPU setups alternate frames between each GPU, so any slight deviation in sync has the potential to cause microstutter.
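
A rough illustration of that de-sync (the numbers are invented purely for the example): two GPUs that each need ~33ms per frame, with the second one starting only 8ms behind the first instead of the ideal half-interval.

```python
# Hypothetical AFR timeline (invented numbers): each GPU needs ~33ms per frame, but GPU 1
# starts only 8ms behind GPU 0 instead of the ideal half-interval (~16.5ms).
render_time = 33.0
offset = 8.0

finish = sorted([i * render_time for i in range(5)] +            # GPU 0 frame completion times
                [offset + i * render_time for i in range(5)])    # GPU 1 frame completion times
gaps = [b - a for a, b in zip(finish, finish[1:])]
print(gaps)   # alternates 8ms / 25ms instead of a steady ~16.5ms, even though throughput doubled
```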
 
No, 90 vs. 70 is actually being generous on my part; it's often much worse. 90 vs. 88 is something you'd see in a single-GPU system.

More reading for those interested:
http://hardforum.com/showthread.php?t=1317582

That link doesn't say anything to that extent. Basically, microstuttering is the variance in frame time per frame as a result of the GPUs having to "wait for" each other when rendering. The variance is on the level of like 10-70ms. With that kind of "lag" time per frame, how are you getting a 20 FPS drop? It simply doesn't make sense.

He did say that "theoretically" and "mathematically" it's possible for a game running at 60 FPS to "feel" like 30 FPS, but that's not what was actually experienced.

Also note that the link you posted says that if your framerate is greater than your refresh rate, you reduce or eliminate microstutter. For most games on an SLI setup it's not hard to get more than 60 FPS.
 
I used to have microstuttering, but then I took a card out of my case.... Like 3 people have posted articles proving the effects and existence of microstuttering. It's there if you have 2 cards.
 
Only ignorance breeds this kind of arrogance. Amazing.
The fact that you attempt to reiterate my points incorrectly, several times, indicates that you do not understand. 120FPS is a rendering rate, but rendering is not constant. Look at the FPS graphs in any review on [H]; you'll see that the FPS changes constantly per unit time. For example: http://www.hardocp.com/image.html?image=MTMzMDQ5NzEyMVA2S3lyMkdTeVhfNl8zX2wuanBn . However, these graphs have a resolution of one second. Within that second, many frames are rendered. For example, at a rate of 120FPS, 120 frames are rendered within that second. However, those frames are not rendered in sync at an even 8.33ms apart. In fact, there's a large variance over that second, even with a single GPU and definitely with multi-GPU. As I stated previously, in dual-GPU systems there's generally a quick, staccato pattern to rendering, where frames 1 and 2 are closer together (<8.33ms in this case) and frames 2 and 3 are further apart (>8.33ms). The "feel" or smoothness of the game actually comes from the lowest common denominator, in this case the gap that is greater than 8.33ms. Therefore, even though you technically are rendering 120 frames in that second, the game won't feel nearly as smooth as if those frames had been rendered at a perfect, consistent 8.33ms apart.

As it stands, single GPUs have some variance, but not nearly as bad as multi-GPU systems that use AFR rendering methods. It's a limitation of the technology, plain and simple. Now, a third GPU added to a multi-GPU setup helps a lot with the feeling of smoothness and with removing microstutter, because it actually eats into that "lag time" after the frames are rendered, which raises that lowest common denominator and therefore increases the smoothness of the system. Anyone who has used a multi-GPU system with 3 or more cards can attest to this.
Because you're artificially imposing a synchronization on the rendering, which is the main problem with AFR methods. Basically, by limiting the maximum speed at which frames can be drawn (in the example provided, 120FPS, or every 8.33ms), you allow the system to catch up, and the lag time isn't as long, especially in comparison. However, vsync comes with its own set of problems (input lag, staggering frame drops, etc.).
No, 90 vs. 70 is actually being generous on my part; it's often much worse. 90 vs. 88 is something you'd see in a single-GPU system.

More reading for those interested:
http://hardforum.com/showthread.php?t=1317582

The best part of this post is that you begin by telling me I'm ignorant and don't understand, but instead of refuting anything I said, you just explained how microstutter is an issue of smoothness and never once accounted for any drop in framerate. If you go back and look at what I posted previously, that is precisely what I said.

It's funny how you said you didn't say something, then when I quoted it you accused me of misunderstanding, then went on to essentially agree with what I said and never actually supported your original claim.
 
No, that is not what I meant at all. I'm saying that just because someone is not as sensitive to microstutter as another person does not mean microstutter does not exist, as someone here is claiming. Some people claim man never landed on the moon, too.

Oh, and using a photo of a person with Down's syndrome to try and make someone look foolish is low.

It has nothing to do with sensitivity; it has to do with people pushing their multi-GPU setups too hard, which isn't going to be alleviated by going to a single card unless that single card is more powerful than the multi-GPU setup.

I am not the creator of the pic.
 
Can't we just end this madness by saying that microstutter is there, and if you notice it, it sucks?
 
That link doesn't say anything to that extent. Basically, microstuttering is the variance in frame time per frame as a result of the GPUs having to "wait for" each other when rendering. The variance is on the level of like 10-70ms. With that kind of "lag" time per frame, how are you getting a 20 FPS drop? It simply doesn't make sense.

He did say that "theoretically" and "mathematically" it's possible for a game running at 60 FPS to "feel" like 30 FPS, but that's not what was actually experienced.

Also note that the link you posted says that if your framerate is greater than your refresh rate, you reduce or eliminate microstutter. For most games on an SLI setup it's not hard to get more than 60 FPS.
You realize that 10-70ms is a difference of about +/-40FPS (as in 20-100FPS) if your rendering average is 60FPS, right?
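
For anyone converting in their head, it's just the reciprocal of the frame time; a trivial sketch (the sample frame times are arbitrary):

```python
# Frame time to instantaneous FPS is just the reciprocal:
for ms in (10.0, 16.7, 70.0):
    print(f"{ms}ms per frame -> {1000.0 / ms:.0f} FPS at that instant")
# 10ms -> 100 FPS, 16.7ms -> 60 FPS, 70ms -> 14 FPS
```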
The best part of this post is that you begin by telling me I'm ignorant and don't understand, but instead of refuting anything I said, you just explained how microstutter is an issue of smoothness and never once accounted for any drop in framerate. If you go back and look at what I posted previously, that is precisely what I said.

It's funny how you said you didn't say something, then when I quoted it you accused me of misunderstanding, then went on to essentially agree with what I said and never actually supported your original claim.
Because you don't understand the concept. I'm not sure how many times or how many different ways I can break it down for you to understand it. Your framerate is changing constantly, frame by frame, and with multi-GPU, microstutter de-syncs frame-to-frame rendering. If you think microstutter should manifest itself with an FPS drop, then you haven't understood anything said in this thread.
 
You realize that 10-70ms is a difference of about +/-40FPS (as in 20-100FPS) if your rendering average is 60FPS, right?

Okay, but let's think about this for a second. If you look at that graph you can see how the variation is fairly cyclical (as in, first 10ms, then 70ms, then 10, etc.). If you're fluctuating between 20 and 60 FPS then your average is still going to be 40 FPS, whereas the single card is "stable" at 40 FPS (for example). Unless the frame render time is consistently higher than on the single card, you're not actually losing average FPS; it's just more wildly varied (yet at a rate which is nearly imperceptible to most people).
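
Roughly what I'm getting at, with made-up frame times: the average comes out the same, only the spread differs.

```python
# Made-up frame times: the average comes out the same, only the spread differs.
fluctuating = [10.0, 40.0] * 30     # frame times swinging up and down (ms)
stable = [25.0] * 60                # an evenly paced single card (ms)

for name, times in (("fluctuating", fluctuating), ("stable", stable)):
    avg_fps = len(times) / (sum(times) / 1000.0)
    spread = max(times) - min(times)
    print(f"{name}: average ~{avg_fps:.0f} FPS, frame-time spread {spread:.0f}ms")
```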

Because you don't understand the concept. I'm not sure how many times or how many different ways I can break it down for you to understand it. Your framerate is changing constantly, frame by frame, and with multi-GPU, microstutter de-syncs frame-to-frame rendering. If you think microstutter should manifest itself with an FPS drop, then you haven't understood anything said in this thread.

Actually, differences in average framerate DO affect how microstutter manifests, as referenced by that link you posted. The faster your cards are able to render above your refresh rate, the less likely (or perhaps even not at all) you are to get microstutter.
 
When I was running 6990+6970 in tri-fire, I usually ran with Vsync on.

Never did I once notice any microstuttering at all. The only problem I had was the HORRIBLE Crossfire support from AMD.

I think microstuttering is a myth. I've seen quite a few friends with Crossfire/SLI and not one ever noticed microstuttering.

I've had two Crossfire systems (6950s & 4850s) for a few years now.
I've had microstutter once.
1. Heroes of Newerth

I went to a LAN party, and instead of bringing a 3x1 Eyefinity triple monitor stand, I brought a single 17" display I had sitting around. The microstutter was SOOoOOOo bad. It looked like the game's pixels were being attacked by a sand monster of sorts.

I fixed it in 4 mins by using RadeonPro to force it to max anti-aliasing and post-processing. No issues after that.
 
Okay, but let's think about this for a second. If you look at that graph you can see how the variation is fairly cyclical (as in, first 10ms, then 70ms, then 10, etc.). If you're fluctuating between 20 and 60 FPS then your average is still going to be 40 FPS, whereas the single card is "stable" at 40 FPS (for example). Unless the frame render time is consistently higher than on the single card, you're not actually losing average FPS; it's just more wildly varied (yet at a rate which is nearly imperceptible to most people).
Not at all. The frame render time is consistently higher every render "cycle," and that's what is detected as microstutter. That lag time is what makes a game rendering at 60FPS feel like it's rendering at 40FPS (because it is, for a fraction of every render "cycle").
Actually, differences in average framerate DOES affect microstutter manifestation, as referenced by that link you posted. The faster your card is able to render over your refresh rate, the less (or perhaps even none) likely it is to get microstutter.
Right, and nowhere did I say anything contrary to that. What's your point?
 
Folks should quit calling it "microstutter" and call it "uneven frametime distribution". This way people can understand what it is. You still get 60 FPS with two cards; it's just that the first and second frames are very close together on the timeline, followed by a long pause, and then the third and fourth frames appear close to each other... Well, it feels like 30 FPS, and almost looks like 30 FPS when slowed down to one-tenth speed, even though it's technically still 60. Then the pause between frames swings from high latency to low latency and back, making it an up-and-down cycle and an overall miserable experience. It's more easily noticeable the lower the framerate is. So where you really need it most, like an expensive scene in Metro or Crysis, the game will feel incredibly "smoother" on a single card than on two cards... if the FPS is 20-40.
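
To put hypothetical numbers on that pairing pattern (invented, not measured from any particular setup):

```python
# Invented example of that pairing pattern at a nominal 60 FPS: frames land in tight pairs
# 2ms apart, with each pair followed by a ~31ms pause.
paired_gaps = [2.0, 31.33] * 30                  # 60 frames across ~1000ms
counter_fps = len(paired_gaps) / (sum(paired_gaps) / 1000.0)
paced_like = 1000.0 / max(paired_gaps)           # the long pause sets the perceived cadence
print(f"counter: ~{counter_fps:.0f} FPS, but the long gaps pace it like ~{paced_like:.0f} FPS")
```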
 
:confused:
What exactly are you trying to say here, then?
Read above. Pick out the parts you don't understand and I'll do my best to clarify.
Folks should quit calling it "microstutter" and call it "uneven frametime distribution". This way people can understand what it is. You still get 60 FPS with two cards; it's just that the first and second frames are very close together on the timeline, followed by a long pause, and then the third and fourth frames appear close to each other... Well, it feels like 30 FPS, and almost looks like 30 FPS when slowed down to one-tenth speed, even though it's technically still 60. Then the pause between frames swings from high latency to low latency and back, making it an up-and-down cycle and an overall miserable experience. It's more easily noticeable the lower the framerate is. So where you really need it most, like an expensive scene in Metro or Crysis, the game will feel incredibly "smoother" on a single card than on two cards... if the FPS is 20-40.
Exactly. :cool:
 