GPU "Microstuttering" FAQ

It's GPU-to-GPU sync. Alternate frame rendering across two GPUs is the cause of it. V-sync helps; the SLI AA modes and split-frame rendering *might* help too.

I was wondering why I never saw microstuttering on my SLI 7900GTs. I've been running vsync with triple buffering. I can't stand the tearing I get without vsync.
 
Is it worse at lower average FPS or at higher average FPS?


The way I understand it, any game that runs over, say, 60fps all the time (or 99.99% of the time) will be fine. A game that runs under 60fps will stutter, and a game that runs at 300fps but sometimes drops below 60fps will stutter when it drops below 60. The lower the FPS falls below 60, the more likely the game is to stutter, AFAIK.
 
The way I understand it, any game that runs over, say, 60fps all the time (or 99.99% of the time) will be fine. A game that runs under 60fps will stutter, and a game that runs at 300fps but sometimes drops below 60fps will stutter when it drops below 60. The lower the FPS falls below 60, the more likely the game is to stutter, AFAIK.

Well, it's more to do with your refresh rate. So if your refresh rate is 85Hz or 100Hz, then those are going to be your magic numbers.
 
Is it worse at lower average FPS or at higher average FPS?

When you get up above 50fps, the delay between frames can only be so big (well, 1/50th of a second at worst, lol), so the higher the FPS, the better the worst-case scenario is. As frame rates get higher, the difference in delay shrinks, so it becomes less noticeable.
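To put rough numbers on that, here's a tiny illustrative Python snippet (the FPS values are just examples). Even if the pacing were as bad as the AFR worst case, where two frames land together and then there's a long gap, that gap is bounded by roughly twice the average frame time, and the bound shrinks as FPS rises:

[code]
# Rough illustration (not measured data): how frame gaps shrink as average FPS rises.
def frame_time_ms(avg_fps):
    """Average time per frame at a given FPS."""
    return 1000.0 / avg_fps

for fps in (30, 50, 60, 85, 120):
    t = frame_time_ms(fps)
    # Even in a bad AFR case ("two frames together, then a long gap"),
    # the long gap is at most about twice the average frame time.
    print(f"{fps:>3} fps: avg frame time {t:5.1f} ms, worst AFR-style gap ~{2 * t:5.1f} ms")
[/code]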
 
When you get up above 50fps, the delay between frames can only be so big (well, 1/50th of a second at worst, lol), so the higher the FPS, the better the worst-case scenario is. As frame rates get higher, the difference in delay shrinks, so it becomes less noticeable.

Unless you have a card that has serious syncing issues. But then your overall FPS would drop.
 
To bring this back up... I really do not see this issue addressed by any review site. If a lot of people are seeing this issue, why are the sites not drilling NVIDIA about this obvious SLI issue?
 
If this is true then basically every review site except [H] is totally bogus. Much respect to Kyle.

Question for SLI/CF owners: Should I bother picking up a second 8800GTS or is SLI basically dead-on?

SLI = It's A Lie?
 
If memory serves me correctly, ATI had this exact same problem back when they had the Rage Fury MAXX.
 
Microstuttering is not that big of a deal; once you get the FPS above the refresh rate it should not be noticeable. It's a bigger psychological problem than a technical one.
 
We should all see psychiatrists and get rid of microstutter - brilliant! You obviously skipped a few pages (if not all) of the thread.
 
This is an interesting statement, because I'd say that frame rates closer to 60fps are what's considered to deliver acceptable gameplay, where fast, jerky movement is always smooth and areas where the frame rate dips still remain at a reasonable FPS.

In fact, do me a favour: apply the acceptable game settings you arrived at in Age of Conan, about a 30fps average in Old Tarantia, then head into the Noble District and do some of the Villa missions (these are available daily from level 40 upwards).

This is good proof that an average that sits borderline isn't ideal. I noticed you giving ATI special treatment in the Crysis benchmarks: you came out with a low 30fps acceptable frame rate with certain settings, but made an exception in the last level because it ran so badly.

Then again, my background is more geared towards pro gaming, where fluid frame rates are essential, and I've just come to expect that from my casual gaming. Still, it's important to realise that sitting borderline isn't good; these random pauses and slowdowns are more noticeable and can result in a frustrating gameplay experience if they occur at inconvenient times. Certainly, at frame rates I consider acceptable, microstutter isn't so much of a problem.


I don't see the microstuttering issue having much to do with frame rates. I just noticed the issue in CoD4 with 2-way GTX 280 SLI. In SLI the frame rates are through the roof, the minimum frame rate is around 70 and the average 110, but I'd rather play in single-card mode, at least in single player, where my average frame rate is 70 and the minimum is around 45. But that rate is high enough that at least I can't tell the difference between single mode and SLI mode anyway.
 
This is good proof that an average that sits borderline isn't ideal. I noticed you giving ATI special treatment in the Crysis benchmarks: you came out with a low 30fps acceptable frame rate with certain settings, but made an exception in the last level because it ran so badly.

So when the new Raptors came out, we should have thrown them all in the garbage because the firmware wasn't complete, and shut their doors? Is that really how you view life and reviews of hardware? There is no gray? There is no understanding of what is true and when not to believe what you see?

Learning to look at the big picture and making calls based on that is where [H] separates itself from pretty much every review site out there. If Kyle and crew were really favoring ATI, they never would have mentioned the last level of Crysis. Think about that for a moment.
 
I don't see the microstuttering issue having much to do with frame rates. I just noticed the issue in CoD4 with 2-way GTX 280 SLI. In SLI the frame rates are through the roof, the minimum frame rate is around 70 and the average 110, but I'd rather play in single-card mode, at least in single player, where my average frame rate is 70 and the minimum is around 45. But that rate is high enough that at least I can't tell the difference between single mode and SLI mode anyway.

It's more noticeable with lower frame rates and less noticeable with higher frame rates; it's about as directly correlated with frame rate as you can get!

Microstuttering isn't some bug that can be squashed; it's a natural problem with multiple GPUs sharing a workload that varies by the millisecond.

So when the new Raptors came out, we should have thrown them all in the garbage because the firmware wasn't complete, and shut their doors? Is that really how you view life and reviews of hardware? There is no gray? There is no understanding of what is true and when not to believe what you see?

Learning to look at the big picture and making calls based on that is where [H] separates itself from pretty much every review site out there. If Kyle and crew were really favoring ATI, they never would have mentioned the last level of Crysis. Think about that for a moment.

I think you're a little off base here. My point wasn't that ATI got special treatment; that's a whole other topic on its own. My point was that using 30fps as the acceptable frame rate number is close to the borderline between acceptable and unacceptable for many people, and because games vary in workload from area to area, this becomes an issue.

Crysis is an example of that issue, and there are others, although [H] doesn't mention them or may not be aware of them. AoC was my other example: they assumed the areas around Old Tarantia were the worst-performing in the game, and that's wrong by a long shot. Their magic 30fps in that area will net them more like an average of 10fps in the Villas and in other areas.

This leads back to my main point, which is that of course [H] found SLI/Xfire to be worse, because they're benchmarking so close to the acceptable standards that anything that veers slightly away from the average is noticeable, for example in an MMO when a lot of people happen to be on screen at once, or if 3 grenades all go off at the same time in a first-person shooter. If something like 50fps were the standard, or a rough minimum of 25-30fps were the standard, I'm willing to bet microstutter wouldn't be an issue.
 
For anyone that is actually experiencing this microstutter: does V-Sync help or fix the problem? It seems that V-Sync would fix it, assuming you were getting above 60fps, and even if it dipped lower, the performance loss would be the same with or without SLI. For example, if one video card could get 50fps in a game, then with V-Sync you would get a smooth 30fps. If two video cards could net you 80fps, then with V-Sync you should get a smooth 60fps (with no dips below 60fps). So in that borderline situation an SLI setup would net you double the performance of a single card with no microstuttering. Is that an accurate example? Because if that's true, it seems SLI might still be worthwhile.
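To put the reasoning in that question into code, here is a hedged Python sketch of how plain double-buffered V-sync (no triple buffering) snaps the displayed rate down to refresh/1, refresh/2, refresh/3, and so on; the function and numbers are purely illustrative:

[code]
# Illustrative sketch: double-buffered V-sync quantizes the output rate to
# refresh/n, so 50fps of raw rendering shows as 30fps on a 60Hz screen,
# while 80fps shows as a steady 60fps.
def vsync_rate(raw_fps, refresh=60):
    n = 1
    while refresh / n > raw_fps:   # drop to the next divisor we can sustain
        n += 1
    return refresh / n

print(vsync_rate(50))   # 30.0
print(vsync_rate(80))   # 60.0
[/code]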
 
I was wondering why I never saw microstuttering on my SLI 7900GTs. I've been running vsync with triple buffering. I can't stand the tearing I get without vsync.

I always run VSync and Triple Buffering as well. I've never understood how people could stand tearing either. I usually buy hardware that can handle running the games I play synced. On the chance that they don't though, I'd still rather see it drop to a much lower rate than see torn frames.
 
For anyone that is actually experiencing this microstutter: does V-Sync help or fix the problem? It seems that V-Sync would fix it, assuming you were getting above 60fps, and even if it dipped lower, the performance loss would be the same with or without SLI. For example, if one video card could get 50fps in a game, then with V-Sync you would get a smooth 30fps. If two video cards could net you 80fps, then with V-Sync you should get a smooth 60fps (with no dips below 60fps). So in that borderline situation an SLI setup would net you double the performance of a single card with no microstuttering. Is that an accurate example? Because if that's true, it seems SLI might still be worthwhile.

I always force VSync and Triple Buffering, and so far, I haven't seen this issue in anything. I probably won't have time to try turning them off today though to prove that I get it with these features off. If I get a few minutes I'll try it out.
 
For anyone that is actually experiencing this microstutter: does V-Sync help or fix the problem? It seems that V-Sync would fix it, assuming you were getting above 60fps, and even if it dipped lower, the performance loss would be the same with or without SLI. For example, if one video card could get 50fps in a game, then with V-Sync you would get a smooth 30fps. If two video cards could net you 80fps, then with V-Sync you should get a smooth 60fps (with no dips below 60fps). So in that borderline situation an SLI setup would net you double the performance of a single card with no microstuttering. Is that an accurate example? Because if that's true, it seems SLI might still be worthwhile.

If you're getting higher than 60fps then microstuttering shouldn't be an issue anyhow, since it happens faster than your display can show it (assuming a 60Hz refresh).
 
If you're getting higher than 60fps then microstuttering shouldn't be an issue anyhow, since it happens faster than your display can show it (assuming a 60Hz refresh).

It's dependent on the interval measured. OK, for 9 seconds out of 10 I get a total of 600 frames (avg 67fps). For one of those seconds I get 0fps. Average = 60fps. Do I have a problem? This is the analogy on a macro level; microstuttering happens at the micro level. Substitute fps for fps/1000. Re-read the FAQ.
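The same macro-level arithmetic written out, with illustrative numbers only:

[code]
# 9 seconds at roughly 67fps plus 1 second at 0fps still "averages" 60fps,
# even though one whole second was unplayable.
frames_per_second = [67, 67, 66, 67, 67, 66, 67, 67, 66, 0]   # 600 frames total
print(sum(frames_per_second))          # 600 frames over 10 seconds
print(sum(frames_per_second) / 10)     # 60.0 "average" fps
# Microstutter is the same effect at the millisecond scale: the average looks
# fine while the individual frame-to-frame gaps are wildly uneven.
[/code]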
 
I agree, having 60fps doesn't guarantee a lack of stutter; it's always going to stutter like that, and even single-GPU setups will stutter. The important thing is the amount it stutters. When you have a higher fps, the average time to render frames is decreased, and eventually, with a high enough frame rate, even the worst-case stuttering is going to seem transparent, like it does with a normal single GPU, which has exactly the same kind of irregularity in frame render times as SLI/Xfire, just to a much lesser extent.

As was pointed out, the theoretical extreme is 1/2 the frame rate. We could suppose that the 2nd card always renders the even frame numbers almost instantly after the odd frames, let's say 1ms afterwards. In that scenario you're getting 2 frames almost exactly at once, then a long pause, and then another 2 almost at once. If you work it out, you'll see that cards doing this at 60fps are going to result in a gap between the frames which appears to be 30fps, which most people seem to find acceptable.

That's the worst-case scenario, remember, but it's good theoretical proof that simply having higher frame rates masks the stutter.
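Here is a small illustrative Python simulation of that worst case, using the assumed 60fps average and 1ms offset from the post:

[code]
# Assumed numbers, for illustration: 60fps average, but the 2nd card delivers
# each even frame only 1ms after the odd frame, so gaps alternate ~1ms / ~32ms.
avg_frame_ms = 1000.0 / 60          # ~16.7ms per frame on average
pair_period = 2 * avg_frame_ms      # a pair of frames every ~33.3ms
timestamps = []
t = 0.0
for pair in range(5):
    timestamps.append(t)            # odd frame
    timestamps.append(t + 1.0)      # even frame lands 1ms later
    t += pair_period

gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
print([round(g, 1) for g in gaps])  # ~[1.0, 32.3, 1.0, 32.3, ...]
# The long gaps repeat about every 33ms, so the motion is paced like ~30fps
# even though a frame counter still reports 60fps.
[/code]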
 
Hey, I'm just wondering: if I run 2 8800GTs in SLI, will my CPU (Q9300 @ 2.5GHz, stock) hold them back?

If so, how much of a bottleneck will it be?

I didn't want to make another thread for such a small question.

Cheers.
 
Hey, I'm just wondering: if I run 2 8800GTs in SLI, will my CPU (Q9300 @ 2.5GHz, stock) hold them back?

If so, how much of a bottleneck will it be?

I didn't want to make another thread for such a small question.

Cheers.

All depends on the game but in most cases I don't think that it would be a problem.
 
Thanks for the reply.
Just for the record, I play at 1920x1200; do high resolutions usually minimise CPU limitation of the GPUs, even in SLI?
 
The higher the res, the lower the framerate, so that is your main concern when trying to eliminate microstutter.
If you can maintain above 60fps (or your refresh rate), you won't see any microstutter.
The new top-end cards in SLI have a good chance of succeeding in every game, but below them you may get it in some games.
 
OK, wow, I apologize in advance for the huge post.

To bring this back up... I really do not see this issue addressed by any review site. If a lot of people are seeing this issue, why are the sites not drilling NVIDIA about this obvious SLI issue?

Because the issue isn't blatant. Again, some people see it, some don't. For a gamer like me who can't stand dips into the 50fps range (I'm sorry, I played 1.6 for too long) it's not noticeable. For some people, even in the 30s it's not noticeable. It's a lot like extremely high-pitched noises: the 18-year-old and the dog in the room are going nuts trying to find this 23kHz noise, but the 60-year-old woman can't hear it. That's not to say that if you don't notice microstuttering your eyes are bad; it's just that some see it, some don't. If you don't, enjoy your SLI/Crossfire setup regardless of what we have to say.

Microstuttering is not that big of a deal; once you get the FPS above the refresh rate it should not be noticeable. It's a bigger psychological problem than a technical one.

We should all see psychiatrists and get rid of microstutter - brilliant! You obviously skipped a few pages (if not all) of the thread.

I dunno, again this doesn't seem like such a bad point to make. Some people are so sure microstuttering's there that they create it. There's a lot to be said for psychosomatic effects. Don't look for microstuttering. If you like the performance and how the game plays on your rig, again, don't let us snobs bring you down!

This is an interesting statement, because I'd say that frame rates closer to 60fps are what's considered to deliver acceptable gameplay, where fast, jerky movement is always smooth and areas where the frame rate dips still remain at a reasonable FPS.

That depends on the programming. Sometimes what you can't see off-screen is rendered up to the point of z-cull; sometimes it's not. It depends on the game. If the game has elected not to render what's directly behind you to save resources, jerky movements will cause slowdowns. If the game does render everything around you, jerky movements should have no impact on performance.

Frostex said:
In fact, do me a favour: apply the acceptable game settings you arrived at in Age of Conan, about a 30fps average in Old Tarantia, then head into the Noble District and do some of the Villa missions (these are available daily from level 40 upwards).

This is good proof that an average that sits borderline isn't ideal. I noticed you giving ATI special treatment in the Crysis benchmarks: you came out with a low 30fps acceptable frame rate with certain settings, but made an exception in the last level because it ran so badly.

I agree that [H] telling its readers what's acceptable is a little annoying, but I still believe their method of benchmarking is superior to all others, even if it's only for the amount of data collected. 90% of the time, in the situation you posed, the settings would scale back equally. Furthermore, if a villa is what I imagine it to be (something like the AH in WoW, with lots of players on screen), the load increases exponentially (literally: x is the number of players on screen, and the CPU workload grows to the power of x).

Then again, my background is more geared towards pro gaming, where fluid frame rates are essential, and I've just come to expect that from my casual gaming. Still, it's important to realise that sitting borderline isn't good; these random pauses and slowdowns are more noticeable and can result in a frustrating gameplay experience if they occur at inconvenient times. Certainly, at frame rates I consider acceptable, microstutter isn't so much of a problem.

Ok, dropping the formalities, I think when you say "my background is more geared towards pro gaming" you think you sound cool but really you sound even more nerdy than me (and that's hard to do). I played Cal-M 1.6 for two years and I'd never say something like that for fear of being labeled one of them "too cool for us mortals" snobs.

It's more noticeable with lower frame rates and less noticeable with higher frame rates; it's about as directly correlated with frame rate as you can get!

Microstuttering is a modifier to framerate. If you understand microstutter, you understand that an extra 5fps on SLI Setup A is not the same as an extra 5fps on Single Card Setup B. Yes, microstuttering is pretty directly related to framerate, but it's only true for that one scenario. You can't apply microstuttering data to all fps readings (particularly, obviously, those from a single card).

Microstuttering isn't some bug that can be squashed; it's a natural problem with multiple GPUs sharing a workload that varies by the millisecond.

Again, lol, I disagree. Microstuttering can be fixed and I hope it will be fixed. Whether it's a software or hardware implementation, microstuttering can be solved.

I think you're a little off base here. My point wasn't that ATI got special treatment; that's a whole other topic on its own. My point was that using 30fps as the acceptable frame rate number is close to the borderline between acceptable and unacceptable for many people, and because games vary in workload from area to area, this becomes an issue.

Crysis is an example of that issue, and there are others, although [H] doesn't mention them or may not be aware of them. AoC was my other example: they assumed the areas around Old Tarantia were the worst-performing in the game, and that's wrong by a long shot. Their magic 30fps in that area will net them more like an average of 10fps in the Villas and in other areas.

[H] tries to find the hardest areas possible; if an area isn't the hardest, it's probably pretty close, and if that's the case the decrease would probably be reflective of the cards' relative performance in the test.

Frostex said:
This leads back to my main point, which is that of course [H] found SLI/Xfire to be worse, because they're benchmarking so close to the acceptable standards that anything that veers slightly away from the average is noticeable, for example in an MMO when a lot of people happen to be on screen at once, or if 3 grenades all go off at the same time in a first-person shooter. If something like 50fps were the standard, or a rough minimum of 25-30fps were the standard, I'm willing to bet microstutter wouldn't be an issue.

Right, but that's your opinion, and it's actually kind of ironic. I bring this guy up a lot and I don't think he's that far from the norm: Jamie, a friend of mine, plays WoW on his 6600LE. Frequently his FPS hits the teens. That's acceptable to him (and since we're on the topic of "pro" gamers, this guy had the full-on Tier 3 set before the expansion... he has some wicked stuff now but I don't know exactly what. He's living proof you don't need the best hardware to be good. Hah, I've got a whole other story there about the roflcopter but I'll leave that one for some other time.) So what you describe as "borderline" isn't borderline for some people. I won't play a game at 30fps. Jamie will play a game at 15fps. You think the cutoff is ~30fps. This sounds like a good topic for a poll; why don't you create one?

And again, [H] does judge by the minimum FPS, which is marked in red in the right-hand column. You should pay as much attention to that as you do to the average (especially when you're building your argument for [H]'s pro-ATI bias, which is what it seems you think).

For anyone that is actually experiencing this microstutter: does V-Sync help or fix the problem? It seems that V-Sync would fix it, assuming you were getting above 60fps, and even if it dipped lower, the performance loss would be the same with or without SLI. For example, if one video card could get 50fps in a game, then with V-Sync you would get a smooth 30fps. If two video cards could net you 80fps, then with V-Sync you should get a smooth 60fps (with no dips below 60fps). So in that borderline situation an SLI setup would net you double the performance of a single card with no microstuttering. Is that an accurate example? Because if that's true, it seems SLI might still be worthwhile.

I always run VSync and Triple Buffering as well. I've never understood how people could stand tearing either. I usually buy hardware that can handle running the games I play synced. On the chance that they don't though, I'd still rather see it drop to a much lower rate than see torn frames.

V-sync fixes tearing, which I hate, and as such I always run V-sync. But it has nothing to do with microstutter. The most general reason I can give is that tearing deals with problems within a single frame (the use of old pixels in new frames), whereas microstuttering is an issue between whole frames.

I suppose disabling V-sync would increase your pFPS (partial frames per second), thus reducing microstuttering, but if the tradeoff is tearing, I'm not willing to accept it.

It's dependent on the interval measured. OK, for 9 seconds out of 10 I get a total of 600 frames (avg 67fps). For one of those seconds I get 0fps. Average = 60fps. Do I have a problem? This is the analogy on a macro level; microstuttering happens at the micro level. Substitute fps for fps/1000. Re-read the FAQ.

That's a really good example, but when you're dealing with the micro scenario, the monitor's laziness really does come into play. If you can pump the frames out at 70 or 80fps on a monitor with a refresh rate of 60Hz, it doesn't matter if the delay between frames is different, as the monitor just can't change that fast anyway. But yes, when you're under or around 60fps, that analogy works well.

I agree, having 60fps doesn't guarantee a lack of stutter; it's always going to stutter like that, and even single-GPU setups will stutter.

Uhh, no? Look at the chart. The single-GPU config is a flat line, and where it's not, there was a change in load or a bad bit of data recorded. People always seem to skip the fact that nothing is 100% accurate. Anand's testing data might be 95% accurate, in which case 1 or 2 of those points might be off. And they are. And it's easy to spot the two.

The important thing is the amount it stutters. When you have a higher fps, the average time to render frames is decreased, and eventually, with a high enough frame rate, even the worst-case stuttering is going to seem transparent, like it does with a normal single GPU, which has exactly the same kind of irregularity in frame render times as SLI/Xfire, just to a much lesser extent.

What?

As was pointed out, the theoretical extreme is 1/2 the frame rate. We could suppose that the 2nd card always renders the even frame numbers almost instantly after the odd frames, let's say 1ms afterwards. In that scenario you're getting 2 frames almost exactly at once, then a long pause, and then another 2 almost at once. If you work it out, you'll see that cards doing this at 60fps are going to result in a gap between the frames which appears to be 30fps, which most people seem to find acceptable.

That's the worst-case scenario, remember, but it's good theoretical proof that simply having higher frame rates masks the stutter.

Exactly correct, aside from another little jab at [H] at the end.
 
That depends on the programming. Sometimes what you can't see off-screen is rendered up to the point of z-cull; sometimes it's not. It depends on the game. If the game has elected not to render what's directly behind you to save resources, jerky movements will cause slowdowns. If the game does render everything around you, jerky movements should have no impact on performance.

My point is simply that certain things in games, your own actions included, can cause drops in frame rate which cause problems for the user by creating jerky movement, or stuttering, or whatever. By running at what I'd consider the more reasonable average of 50-60fps, you leave headroom for the frame rate to drop and still remain playable.

I agree that [H] telling its readers what's acceptable is a little annoying, but I still believe their method of benchmarking is superior to all others, even if it's only for the amount of data collected. 90% of the time, in the situation you posed, the settings would scale back equally. Furthermore, if a villa is what I imagine it to be (something like the AH in WoW, with lots of players on screen), the load increases exponentially (literally: x is the number of players on screen, and the CPU workload grows to the power of x).

The Villas are single-player areas with a very high level of detail: they've got many enemies inside, make heavy use of parallax mapping on all the surfaces, and also use many highly detailed models to decorate the interior. They run at least 3x slower than Old Tarantia does; seriously, if you're getting 30fps in the docks area you're going to get 10fps in the Villas, maybe even closer to 5fps.

I understand they can't realistically play the entire game since it's an MMO, but this highlights the issue of treating an average frame rate of 30fps as your expected minimum. I feel that needs to be higher, and if it were higher, microstuttering wouldn't be seen as an issue.

Ok, dropping the formalities, I think when you say "my background is more geared towards pro gaming" you think you sound cool but really you sound even more nerdy than me (and that's hard to do). I played Cal-M 1.6 for two years and I'd never say something like that for fear of being labeled one of them "too cool for us mortals" snobs.

I don't play in leagues, I don't even play in clans, but when I play, I play [H]ard. I expect high frame rates: if I'm feeling competitive and play something like Unreal Tournament, I'd expect an 85fps average and butter-smooth movement; most of the time I expect a 60fps average. I like to join pubs and own it up. Just because I'm not in a league or a clan doesn't mean I don't play to win, play to the best of my ability, and expect the best from my hardware. I can say, as someone who plays to win, that having a good frame rate is essential.

I don't think I sound cool, I'm not trying to sound cool, nor do I care how you or anyone else reads into that; it's irrelevant. Being a "snob" or being "pro" doesn't make you wrong, unless you have some supporting evidence to back that up?

Microstuttering is a modifier to framerate. If you understand microstutter, you understand that an extra 5fps on SLI Setup A is not the same as an extra 5fps on Single Card Setup B. Yes, microstuttering is pretty directly related to framerate, but it's only true for that one scenario. You can't apply microstuttering data to all fps readings (particularly, obviously, those from a single card).

This makes sense. It's not exactly a "modifier", but it could be viewed as one when comparing against single-GPU cards.

What you clearly don't understand, though, is that this is a general effect. It can be described as "variation in the time it takes to render a frame", and when we look at it like this, we can say absolutely that single cards suffer microstuttering, just to a much lesser degree (maybe picostuttering is a better way of putting it). The graph of time to render frames looks flat for a single GPU because it's not viewed at a resolution that would display the variance. Make no mistake: each and every frame rendered by a video card takes a different amount of time to render. There is ALWAYS variance; we just cannot feel/see that variance for a single GPU because it's so small.
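For anyone who wants to put numbers on that variance, here is a hedged Python sketch that summarizes a list of per-frame render times (the sample values are invented, not measured):

[code]
# Summarize per-frame render times (ms), e.g. from a frametime log.
from statistics import mean, pstdev

def describe(frame_times_ms):
    fps = 1000.0 / mean(frame_times_ms)
    jitter = pstdev(frame_times_ms)      # spread of the frame-to-frame times
    return round(fps, 1), round(jitter, 2)

single_gpu = [16.5, 16.8, 16.6, 16.7, 16.5, 16.9]   # tiny, invisible variance
afr_like   = [1.0, 32.3, 1.0, 32.4, 1.1, 32.2]      # same average fps, big swings
print("single GPU:", describe(single_gpu))
print("AFR-like:  ", describe(afr_like))
[/code]

Both traces report roughly 60fps on average; only the jitter number tells them apart, which is the point being made above.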

Again, lol, I disagree. Microstuttering can be fixed and I hope it will be fixed. Whether it's a software or hardware implementation, microstuttering can be solved.

It's a naturally occurring problem with this type of setup. It's not due to some inherent flaw in the system, just an unfortunate side effect. Stopping it entirely would involve regularly predicting what's going to happen in the future so you know how best to space the frames, and that's impossible to do with 100% accuracy. The situation could probably improve somewhat, maybe even to the point where the average person can't see/feel the stutter, but it can never be eliminated entirely.

[H] tries to find the hardest areas possible; if an area isn't the hardest, it's probably pretty close, and if that's the case the decrease would probably be reflective of the cards' relative performance in the test.

They clearly do not do that. They didn't apply that philosophy at all with Crysis and the end level of the game: they found the best solution for the majority of the game, then they made an exception at the end to make their review style fit.

Nor did they manage to find the worst-running areas in AoC. I'm not sure if that's an honest mistake, or if they knew about the terrible performance in the Villas and made the same type of exception. Either way the waters are muddy.

Right, but that's your opinion, and it's actually kind of ironic. I bring this guy up a lot and I don't think he's that far from the norm: Jamie, a friend of mine, plays WoW on his 6600LE. Frequently his FPS hits the teens. That's acceptable to him (and since we're on the topic of "pro" gamers, this guy had the full-on Tier 3 set before the expansion... he has some wicked stuff now but I don't know exactly what. He's living proof you don't need the best hardware to be good. Hah, I've got a whole other story there about the roflcopter but I'll leave that one for some other time.) So what you describe as "borderline" isn't borderline for some people. I won't play a game at 30fps. Jamie will play a game at 15fps. You think the cutoff is ~30fps. This sounds like a good topic for a poll; why don't you create one?

MMOs are VASTLY different from FPS games; you're comparing apples and oranges. What a pro gamer expects from their PC when playing CS:S in a CPL league is another world from what a gamer needs to do well in WoW. In fact, I'll go so far as to say that WoW doesn't really need any skill, no reflex skill at least; the "pro" players in WoW are the ones who have grinded all the top dungeons 1000 times to get all the best armour. It has nothing to do with skill and everything to do with persistence, and you do not need a good frame rate to play WoW. You should know this...

And again, [H] does judge by the minimum FPS, which is marked in red in the right-hand column. You should pay as much attention to that as you do to the average (especially when you're building your argument for [H]'s pro-ATI bias, which is what it seems you think).

No, it's nothing to do with BIAS. I addressed this in my last post: it has nothing to do with special treatment of ATI. The point I was making is that showing us numbers borderline to acceptable in the reviews brings up problems when playing the game, namely that if you go too close to the borderline you end up having to stop playing in graphically stressful situations and change your settings to re-acquire an acceptable frame rate. I used this point to link into another point I was making, which is that if you benchmark so close to acceptable standards you're likely to see microstuttering, whereas if you benchmarked at what I'd consider a decent average fps, such as 50-60fps, you'd account for all the in-game issues where the fps dips and causes problems, and it would also make microstuttering a moot problem.

Uhh, no? Look at the chart. The single-GPU config is a flat line, and where it's not, there was a change in load or a bad bit of data recorded. People always seem to skip the fact that nothing is 100% accurate. Anand's testing data might be 95% accurate, in which case 1 or 2 of those points might be off. And they are. And it's easy to spot the two.

I've discussed this above: the graph looks flat at that resolution. I've already said, several times, that the effect is much less on a single-GPU setup. You cannot see it on a graph like that; you have to view the timeline much closer up to see that there is variation.


Again, see above. Microstuttering is considered to be irregularities in the frame rate. No GPU renders each frame at a set interval; each frame takes a different amount of time to process because each frame is unique in what is being rendered, and it's transparent to us because the variation is so slight we cannot see/feel it when playing. What people consider microstutter is effectively this effect amplified so that the irregularities become noticeable. That's not physically how it occurs, and it's not a pure mathematical amplification of the same problem, the cause is different, but the effect we see is still essentially a variance in the time to render frames, just much greater than with a single GPU.

Exactly correct, aside from another little jab at [H] at the end.

[H] needs jabbing; it keeps them on their toes and at the top. Currently I think they have good intentions with their review methods, but there are several huge flaws with them. Having to make exceptions is bad: they're saying "these are the settings the card can run at, except", except nothing, [H]!! Your philosophy is to show us max playable settings, not approximate them and make exceptions to make your approximations fit. Anyhow, that's OT, I apologise.
 
Threading issues

As I understand it, there are a lot of reasons why a game will “stutter” or lag.
One scenario could be thread problems.

Bottlenecks will slow down performance. Bottlenecks are created when one part gets overloaded with too much work.

Here are some thoughts about the C2Q.
A C2Q is two C2Ds without internal communication; they communicate through the FSB. Intel's design also tries to use the FSB as little as possible, because it has latency issues (it is where the computer is at its slowest). It is vital that the cache is used well for a C2D or C2Q to be speedy, and that means the hit rate needs to be very high.
Take the C2Q, which has two C2Ds (I'll call them A and B here). Say one thread is positioned on a core in C2D-A, and another thread is located on a core in C2D-B. If the thread on C2D-A is then moved to the other core, on C2D-B, it will need to re-fetch all the data that was stored in C2D-A's L2 cache, and all of this data has to go through the FSB, which isn't that fast (high latency). Once the thread has moved, the cache hit rate goes down until the data has been re-fetched. This FSB traffic also has to handle graphics I/O, which may slow it down more; maybe switching in the Northbridge takes extra time as well. So for a fraction(?) of a second the C2Q is slowed, until the data in the cache is refilled and the FSB gets back up to speed because there is no longer a queue of data waiting to be sent or retrieved.
If this is the case, then microstuttering (or FSB-stuttering) could be a problem related to games that scale to a lot of cores and use a fair amount of memory.

All processors are more or less sensitive to having threads switched to other cores (of course it depends on how much memory they are using). I think that Vista is NUMA-aware and Phenom supports that; NUMA awareness is a technique that adds some intelligence about where threads are placed in order to optimize memory latency. I know there are some who say NUMA doesn't bring any advantages, but it is very hard to measure the performance gain from it, because it isn't that often that the non-NUMA case is hurt by issues like the one described. But this microstuttering (or I/O problem, in this case) could be something that is solved with NUMA. I think that Nehalem is going to have NUMA, and I don't think that Intel put it there for fun.
Also, neither Nehalem nor AMD's Phenom is as sensitive to un-optimized threading as the C2D and C2Q.
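One hedged way to test the thread-migration idea described above would be to pin a process to two cores that sit on the same die, so the scheduler can't bounce its threads across the FSB. This Python sketch assumes the third-party psutil package and uses a made-up PID; which core numbers actually share a die depends on the system:

[code]
# Illustration only: restrict a process to cores 0 and 1 (assumed here to share
# one C2D die / L2 cache) so its threads are not migrated across the FSB.
import psutil

game_pid = 1234                      # hypothetical PID of the game process
proc = psutil.Process(game_pid)
print("before:", proc.cpu_affinity())
proc.cpu_affinity([0, 1])            # pin to cores 0 and 1
print("after: ", proc.cpu_affinity())
[/code]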
 
A large part of the problem is that you need to essentially predict the future.

The cards alternate frames:

Frame1 = Card1
Frame2 = Card2
Frame3 = Card1

In this scenario we need to alter card 2 to place its frame equally in between frames 1 and 3, but we don't know where that point is until frame 3 is rendered. Leaving card 2 to render frame 2 as fast as possible more or less means you're going to see frame 2 appear very quickly after frame 1 and then have a (relatively) long delay before frame 3 is ready.

The rendering speed for several frames in a row is going to be quite close for the most part, so averaging the render time for the last 3 frames and delaying the "odd" frames so they only display after half of that time has passed would probably work.

I'm not sure what (if anything) is done at the moment to try to keep the frame rate in sync.
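As a rough illustration of the delaying idea above (not any driver's actual implementation), a frame-metering loop might track the average render time of the last few frames and hold back any frame that would otherwise appear too soon after the previous one; all names and numbers here are made up:

[code]
# Sketch of frame metering: aim for a gap of (average render time of the last
# 3 frames) / 2 between presented frames, delaying frames that arrive early.
import time
from collections import deque

recent = deque(maxlen=3)     # render times (seconds) of the last 3 frames
last_present = None

def present(render_time_s):
    """Record this frame's render time and sleep if it arrived too early."""
    global last_present
    recent.append(render_time_s)
    target_gap = (sum(recent) / len(recent)) / 2.0
    now = time.monotonic()
    if last_present is not None and now - last_present < target_gap:
        time.sleep(target_gap - (now - last_present))
    last_present = time.monotonic()
    # ...hand the frame to the display here...

# Example: a fast "even" frame right after a slow "odd" one gets held back.
for render_time in (0.030, 0.002, 0.030, 0.002):
    present(render_time)
[/code]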
 
Please explain what you mean, as that makes no sense.
What I mean is that nothing is stronger than its weakest part. The FSB is Intel's weakest part, and the CPU isn't designed for heavy multithreading.
 