Microstutter in latest-generation cards (GTX480 examples inside)

Arseface

Okay, I posted this over at the XS forums, but I'd love to generate some awareness with the [H] community as well. It seems to me that microstutter is a massively overlooked feature of multi-GPU setups, which really makes a difference to gameplay. Since [H]ardOCP have always been at the forefront of testing games for playability, rather than just benchmark numbers, I thought it might be worth a cross-post.



Microstutter, for those that don't know, is the (rather crappy) name for the irregular output of frames - usually a result of multi-GPU setups operating in AFR mode. Since the human eye judges smoothness by the gaps between frames, rather than by the raw number of frames that are spat out, a game with irregular frame output will not look as smooth as one with uniform output.

It stands to reason, for example, that a game outputting frames in pairs - two frames 0.1ms apart, then a 49.9ms gap before the next pair - would look like it was running at 20fps, even though the framerate counter would read 40fps. This is a 'worst case scenario' for a two-GPU setup.


Aaaanyway. A while back I wrote a little program to quantify microstutter from FRAPS benchmarks. You can download it here. Basically it looks at the frame-by-frame variation from the local average frametime, expressed as a percentage of that average (more details in the readme). From this you get a "microstutter index", which you can think of as a "percentage microstutter", 100% being the scenario I described above. It also shows an "effective framerate", which is a measure of smoothness with microstutter taken into account - the framerate a uniform output of equivalent smoothness would have.
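
If you're curious, the idea behind the index is roughly the following - a simplified Python sketch for illustration only (the real program does a bit more, see the readme):

Code:
# Rough idea only: measure how far each frametime (in ms) strays from the
# local average frametime, as a percentage, then average over the whole run.
def microstutter_index(frametimes):
    deviations = []
    for a, b in zip(frametimes, frametimes[1:]):
        local_avg = (a + b) / 2.0                # local average frametime
        if local_avg > 0:
            deviations.append(abs(a - local_avg) / local_avg)
    return 100.0 * sum(deviations) / len(deviations)

# Worst-case two-GPU pattern from above vs. a perfectly uniform run.
# Both average 40fps, but they score very differently:
print(microstutter_index([0.1, 49.9] * 20))   # ~100 (percent)
print(microstutter_index([25.0] * 40))        # 0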


Okay, now onto some examples. First I'll show you a snapshot of 35 consecutive frametimes, taken from the middle of a Crysis benchmark using my GTX480 SLI setup. Not all of the scene was as bad as this, but it gives you an idea of the problem:




In terms of the microstutter index, the runs that generated the plots above had the following results.

Single GPU (2560*1600, 4xAA):





SLI (2560*1600, 4xAA):





This isn't as bad as with my old 4870x2, but it certainly shows that microstutter is still with us.

Now, it's important to note that as the game becomes less and less GPU-limited, the microstutter effect reduces significantly. For example, the above benchmark had the GPUs running at near-100% load. At 1920*1200, 8xQAA, the GPUs are in the region of 85-95% load for the most part. Here the microstutter index drops significantly:





Okay, well, those are my findings. What I'd love to see is some results from ATI users with multi-GPU setups - I'd like to compare the two technologies and see how they stack up. Please post any results you generate, but please make sure you're really at, or close to, 100% GPU load, otherwise you will see an artificially low microstutter index. Afterburner is a good way to check this.

My hope is that maybe, just maybe, if we can generate more awareness of this problem we can get ATI / nvidia driver teams to pay attention and do something about it. A 5% performance drop (say) would be a small price to pay for regular frame output. Of course, this will not happen until review sites start taking note of microstutter when reviewing cards!
 

Probably because not everybody notices it? I cannot really say, since the only game I play a lot is Portal... on my HD5870 crossfire at 1080p (gotta be future proof).

What benches are you thinking about? I may be able to help, a little bit.
 

Well, it's not something that you can "just notice". It's not like tearing, or visual artifacts. Without two game scenes running side-by-side at the same framerate, you don't know it's there. It just reduces the effective framerate of the game. That is, the framerate your FPS counter is showing is no longer representative of the gaming experience. Of course, if you push the framerate up high enough you will still get a "perfectly" smooth gameplay experience. But then, if you have a high enough framerate anyway, what is the point of using a second GPU?

In short - the effect of microstutter is to *decrease the performance-value* of multi-GPU setups.


As for benchmarking, well, it seems that we need to be at or close to 100% GPU load for good results. I imagine that your 5870 x-fire setup doesn't go much above 50% in Portal? Any game, at settings where you can push the GPU loads close to 100%, would be fine. Crysis would be ideal (just for consistency).
 

Hmmm... Yeah, I noticed with my 9800GX2 (sold now) that only when the GPU was struggling would there be "stutter" - not microstutter, though... I think...

I got FRAPS and Vantage... I think I will try that.

EDIT:

This file is neither allocated to a Premium Account, and can therefore only be downloaded 10 times.
 
It seems to me that microstutter is a massively overlooked feature of multi-GPU setups,

Where have you been? This subject is dead horse material. The bottom line is that the actual effect of this issue on game play is simply dependent on the game, the hardware and the person. There's just not much else that can be said about this issue with any consensus. Yes, it exists - people see it with single-GPU setups as well.
 
This subject is dead horse material. The bottom line is that the actual effect of this issue on game play is simply dependent on the game, the hardware and the person.

The effect can be quantified - I did so in the program I posted above.

Clearly it varies depending on the game, and the hardware (what doesn't?), and also the GPU load, but the effect on the person is not going to change. As I stated earlier, it isn't something you either notice or don't - it's an apparent reduction of the framerate (and therefore a very real reduction in the performance of the hardware).



This file is neither allocated to a Premium Account, and can therefore only be downloaded 10 times.

Okay, I'll re-host [done - now on mediafire].
 
Where have you been? This subject is dead horse material. The bottom line is that the actual effect of this issue on game play is simply dependent on the game, the hardware and the person. There's just not much else that can be said about this issue with any consensus. Yes, it exists - people see it with single-GPU setups as well.

Hmmm.. running 3dMark Vantage "Jane Nash" I do see stuttering, not (IMO) microstuttering, but maybe that is a fine line that I don't know about. Either way:

@Arseface, I have uploaded my results to mediafire. (just the FRAPS stuff and an image showing MSI Afterburner + Vantage settings)
http://www.mediafire.com/?xgl5795ifpv4h4z
My setup:

i7 930, stock
6GB DDR3 (no OC, so I dunno what it is set to)
HD5870 <--two of them

HD5870 clocks: 725MHz core, 1000MHz memory

I underclocked and undervolted both HD5870s, since they would be a real waste of power otherwise.

EDIT: Catalyst 10.8, CAP 10.8a; all on top of Windows 7 Home Premium 64bit
 


Thanks :)

I ran that frametime log through the program, and it came out with an index of ~18 - similar to the GTX480s under full load in Crysis:





Worryingly, however, looking closer at the frametime log there seem to be long periods of virtually no microstutter, mixed in with some periods of truly horrible results. Take a look at the following:





In case you want to verify the results, just take the difference between consecutive frametimes in the file, then take 1000/(this quantity) to get the instantaneous framerate. There are several patches in that framelog where this kind of behaviour occurs... I'll try to grab Vantage and see if I get something similar with the GTXs.
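
If you'd rather script that check than do it by hand, something like this works (it assumes the usual FRAPS frametimes CSV layout - frame number, then cumulative time in ms - and "frametimes.csv" is just a placeholder name):

Code:
import csv

# Instantaneous framerate from a FRAPS frametimes log: diff the cumulative
# times of consecutive frames, then take 1000 / gap.
def instantaneous_fps(path):
    with open(path, newline="") as f:
        rows = [r for r in csv.reader(f) if r and r[0].strip().isdigit()]
    times = [float(r[1]) for r in rows]                   # cumulative ms
    gaps = [b - a for a, b in zip(times, times[1:])]      # per-frame ms
    return [1000.0 / g for g in gaps if g > 0]

fps = instantaneous_fps("frametimes.csv")
print(min(fps), "-", max(fps), "fps")   # a huge spread here means microstutter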
 
Yeah, I dunno why, lol. Some areas seemed fine, while others - even at a 30-40fps framerate - did exhibit stuttering (I dunno where the line is between stutter and microstutter).
 
The effect can be quantified - I did so in the program I posted above.

Clearly it varies depending on the game, and the hardware (what doesn't?), and also the GPU load, but the effect on the person is not going to change. As I stated earlier, it isn't something you either notice or don't - it's an apparent reduction of the framerate (and therefore a very real reduction in the performance of the hardware).
Okay, I'll re-host [done - now on mediafire].

you can graph what you think causes the effect but you cannot quantify it. the problem here is that it's completely subjective. some people notice it easily but others can't see it. that really makes it impossible to quantify. you might be able to quantify it to your perception but it will in no way match everyone else. I happen to be sensitive to it (then again, I can't stand blu-ray movies without smooth motion tech - they look very choppy to me) so I avoid multi-card setups. but what ruins the effect for me more often than not goes unnoticed by the vast majority of people. I appreciate the work you put into the above but surely you know this has all been done before, ad nauseam. the situation has improved over time but the effect does remain for those that experience it.

to be brief, the reason that it's massively overlooked is that the majority of people do not perceive it. this makes it largely a non-issue with the public.
 
you can graph what you think causes the effect but you cannot quantify it. the problem here is that it's completely subjective. some people notice it easily but others can't see it. that really makes it impossible to quantify. you might be able to quantify it to your perception but it will in no way match everyone else.

Ignore the graphs - they are just for illustration.

In the program I posted above, I have simply applied well-known statistics to the data, in order to produce a non-dimensional quantification of the framerate variation away from a local mean. This gives a perception-independent representation of the degree of microstutter over a benchmark. It depends only on the data.

I don't doubt that different users will "feel" the effect to different degrees - that is the case with everything, including framerate itself. But no-one is saying that framerate is a useless measure of performance, just because some people see 30fps as "smooth", and others require >100fps.
 
I'm hoping they'll either find a way to improve AFR mode, or switch away from it completely. This problem is what makes SLI/CF with anything but the best cards a potentially very bad idea, at least for graphically demanding games.

What you've picked here though are rather kind examples. It's usually at sub-30 fps where the microstutter gets really bad.
 
Hmmm... Yeah, I noticed with my 9800GX2 (sold now) that only when the GPU was struggling would there be "stutter" - not microstutter, though... I think...

Same here. Got a 9800GX2 some time ago. Stutter (and many other issues) was a nightmare especially in racing games. No more SLI for me.
 
I'm hoping they'll either find a way to improve AFR mode, or switch away from it completely.

I agree completely. I dislike the idea of AFR - irregular frame output is inevitable.

Unfortunately, it's the mode which requires the least communication between GPUs, and so offers the best multi-GPU scaling in average FPS benchmarks. Since this is what the performance of the cards is judged by, via review sites, I don't see ATI or nvidia moving away from it any time soon.

A real shame if you ask me...
 
Can't believe MC is still prevalent in these latest-gen cards. You would think they would have taken care of it by now...
 

The problem is, anything that can be done to reduce microstutter will mean at least a small reduction in average framerate (either switching to a "multiple GPUs, single frame" rendering method, or delaying output of frames in AFR mode).

Since it's the average framerate that sells cards (via review sites), they DO NOT want to consider it. Which is a real shame.
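
Just to illustrate what "delaying output of frames" would mean, here's a toy sketch of the frame-pacing idea - purely hypothetical, not anything the actual drivers do: hold each finished frame back until the recent average frame interval has elapsed since the previous one was shown, trading a sliver of average framerate for even spacing.

Code:
import time
from collections import deque

class FramePacer:
    """Toy frame pacer: present frames at the recent average interval
    instead of the instant they finish rendering."""

    def __init__(self, history=30):
        self.finish_times = deque(maxlen=history)   # recent render-finish times
        self.last_present = None

    def present(self):
        done = time.monotonic()
        self.finish_times.append(done)
        if self.last_present is not None and len(self.finish_times) >= 2:
            span = self.finish_times[-1] - self.finish_times[0]
            avg_interval = span / (len(self.finish_times) - 1)
            wait = (self.last_present + avg_interval) - done
            if wait > 0:
                time.sleep(wait)   # the small average-fps cost mentioned above
        self.last_present = time.monotonic()
        # ...actually flip/display the frame here...

In AFR terms, the GPU whose frame lands "too early" behind the other GPU's frame would simply sit on it for a few milliseconds.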
 
I don't think there's any grand conspiracy to ignore micro-stutter like you're implying. It's an issue that hasn't received much attention because it affects so few. Providing this tool to measure it is helpful and will shed some much-needed light on the issue.
 
Ignore the graphs - they are just for illustration.

In the program I posted above, I have simply applied well-known statistics to the data, in order to produce a non-dimensional quantification of the framerate variation away from a local mean. This gives a perception-independent representation of the degree of microstutter over a benchmark. It depends only on the data.

I don't doubt that different users will "feel" the effect to different degrees - that is the case with everything, including framerate itself. But no-one is saying that framerate is a useless measure of performance, just because some people see 30fps as "smooth", and others require >100fps.

I may not have explained myself well. you can do all that and you're still going to end up with a result that will mean nothing to 98%, and it will mean something different to every person out of the last 2%. that is what I mean.
 

well that's nice, but the rest of us - who know this is a real issue and aren't looking for reasons to defend our sli/crossfire setups to ourselves - appreciate this program.
 
I don't think there's any grand conspiracy to ignore micro-stutter like you're implying. It's an issue that hasn't received much attention because it affects so few. Providing this tool to measure it is helpful and will shed some much-needed light on the issue.
Yeah, the only time I notice *any* kind of stutter (I dunno if that _is_ microstutter, lol....) is when I max settings to the point where two HD5870s (my setup :)) would choke, and choke hard. Anything less, and twin HD5870s work exceptionally well - no stutter. Of course, there are the *other* problems of dual ATi cards... :mad:
well that's nice, but the rest of us - who know this is a real issue and aren't looking for reasons to defend our sli/crossfire setups to ourselves - appreciate this program.

Well, at least it helps quantify this in some way, :p
 
It's an obviously real problem and infinitely annoying to those who are keen enough to notice it. Here's a good example of it

Watch the 9800GX2: even though its framerate is often higher than the other cards', it also appears a lot choppier - a direct result of microstutter:

http://www.youtube.com/watch?v=DYnXxI1UjxE
 
Thanks for the support, butterflysrpretty :)

I don't think there's any grand conspiracy to ignore micro-stutter like you're implying. It's an issue that hasn't received much attention because it affects so few. Providing this tool to measure it is helpful and will shed some much-needed light on the issue.

Perhaps I am overreacting... I don't really believe that there is some big conspiracy not to address microstutter. But I do believe that it will take more awareness before anything is done about it - particularly because addressing it will require at least a small reduction in average FPS performance to regulate the frame output. I don't believe that anything will get done until big-name review sites (like [H], Toms, Anandtech etc) start to report it as a real issue. This is one of the main reasons I'm going to the trouble of compiling all these results.


Anyway... I think I've found a candidate for "worst-case scenario" as far as microstutter goes. The Heaven benchmark seems to really make the cards sweat (~100% load), and microstutter like crazy while they're at it.

Running at 1920*1200, with 4xAA, 16xAF, and starting a 60s FRAPS benchmark right as I start the benchmark in Heaven, I get the following:


Normal tessellation, render max 3 frames ahead:


Extreme tessellation, render max 3 frames ahead:







Choosing instead to render 5 frames ahead, I see an improvement, albeit very slight:

Normal tessellation:


Extreme tessellation:



I have to say, as I watched the benchmark unfold I KNEW the value was going to be high. I mean, it seemed pretty smooth overall, but for me 60fps is usually rock-solid (by eye anyway). In this, I saw the framerate counter at ~74, 75, and I could still tell that it wasn't perfectly smooth. As objects passed by they seemed to be just a little jerky. Looking more closely at the frametimes file, there is some pretty poor behaviour in there. Not quite as bad as the plot I posted above from the 5870 x-fire Vantage run, but still pretty disappointing.


As an addendum, I have some tri-fire (5970+5870) results for the Heaven benchmark, from another forum. To summarise, the microstutter indices for 'normal' and 'extreme' tessellation were 38% and 48% respectively. This is notably worse than the GTX480s, but possibly to be expected, since we have three GPUs working together rather than two.
 
I have never noticed microstutter in the 9 months I have had an SLI setup.

As I mentioned before, it's not something you can really "just notice". The difference between microstutter and a regular drop in FPS is, for the most part, indistinguishable. It's only when the framerate drops to a crawl (say <25fps), when you can catch individual frames being output, that you notice they are irregular. Only in this case would you really see the effect as "stuttering". (I think this is what you are seeing, jeremyshaw.)

To put it a different way... Say you are getting 40fps with your GTX260 setup, and assuming you have a fair degree of microstutter (like the benchmarks above). If you put a different computer next to it, running the same game at an identical framerate, but using only a single GPU (say a GTX480), the single GPU setup would look smoother. You would need a higher framerate in order to get the same smoothness.
 
Have to agree w/ the above... it's not that the microstutter-adjusted min fps is meaningless or subjective, rather it is that the displayed in-game FPS is meaningless on crossfire setups.
 
The problem is, anything that can be done to reduce microstutter will mean at least a small reduction in average framerate (either switching to a "multiple GPUs, single frame" rendering method, or delaying output of frames in AFR mode).

Since it's the average framerate that sells cards (via review sites), they DO NOT want to consider it. Which is a real shame.

Bingo!
 
i have been playing with this micro stutter BS since 2008 when i got my first multi-gpu machine, 8800 gtx sli. i first noticed this in crysis. a single card was smoother in a lot of cases. i had to achieve a frame rate reading of at least 45 fps to not notice the micro stutters as much.

i can tell you from personal experience that the effects of micro stutter are almost strictly driver related. when i upgraded from windows xp to windows 7 last year and still used the same 195.62 drivers, the micro stutter in crysis magically disappeared. i was literally jumping up and down and could not believe it! even though i do not play crysis any longer i still use it as the ultimate gpu benchmark program. the stutters came back with newer drivers, though. it is a never-ending battle with this garbage.

in another game, world of warcraft, sli really performed well with the same 195.62 nvidia drivers. in dalaran i would get 25 fps, for example, and it really felt like 25 and not like 7 fps due to micro stutter. it definitely is driver related more so than game related. we are at the mercy of the driver teams.

i just built a new computer with gtx 460 sli and a core i7 930. with the latest nvidia drivers, 258.96, the micro stutter is back full force. heaven benchmark and crysis stutter like there is no tomorrow. it is so disappointing that i often ask myself why the hell i went the multi-gpu route and didn't just get one gtx 480 or equivalent... i actually know why. it's because i need 2 gpus to run my triple monitor setup for surround gaming. maybe i should have gone with a single hd 5870 and the expensive displayport adapter to stay single-gpu and still run surround gaming.

i made a youtube video trying to draw attention to this matter.

http://www.youtube.com/watch?v=4g1vVJTGdtU

when i attempt to bring this subject up in other forums i usually get the same stupid replies: "30 fps is not smooth, only 60 fps is", "i don't see stuttering in the video you posted", "you don't have enough vram", "sli is not for you", and the list goes on and on from people who need to defend their sli products instead of acknowledging it.

the biggest reason this subject is not getting enough attention is that these paid and bought-for review sites never ever mention it. i question a reviewer's legitimacy if he cannot see the micro stuttering right in front of his face before glorifying how awesome the frame rates are because he added another card to the review machine. for example, the latest reviews for the gtx 460 (which are obviously using the same drivers as everyone else right now) state the heaven benchmark results even at 2560x1440 and higher but never say that the frame rate readings are not smooth.

i installed a gtx 470 in my system after taking out the gtx 460 cards and ran the heaven benchmark. the gtx 460s outperformed the single gtx 470 by a large margin, but the experience with the single gtx 470 was a smooth one compared to the micro stutter and choppy gameplay of the gtx 460 sli setup, even though the sli frame rates were 50-100% higher!
 
Thanks for the reply, Argh - it's good to see someone else trying to draw attention to the microstutter problem! I completely agree that the lack of attention from review sites is contributing to the issue being ignored or underestimated in the community. I think that if just one major review site did an analysis piece on microstutter then it would rapidly gain acceptance. Hell, I'd happily write the damn thing if I was given access to a few different multi-GPU setups! (Like that will ever happen...)

It's interesting that you say you see a big difference depending on drivers... This isn't something I have thoroughly checked yet, but I will look into it more closely. I'll try some older driver sets and get back to you with some numbers.

I have to say, it's difficult to see the effect with that youtube video you posted, but I think that's more down to the limited framerate of the video than anything else. It would be easier to see on something that can make the cards crawl down to the level where individual frames can be captured by the video (say <20fps). Maybe heaven benchmark on the three-screen setup would work?
 
the stutters start at 58 seconds into the video. 0-58 seconds is a single-gpu run with my ati 5870 card as a comparison. just pay attention to the foliage and the ground. it might give you seizures.
 
I just don't get it. No one is denying that the problem exists, at least I'm not, but there are simply too many variables. I think a Crysis video is being mentioned here, and if I'm referencing the correct video, I had an interesting experience: I didn't really see the stutter until I saw the side-by-side comparison of a 5870 and 460 SLI, and indeed at around 30 frames a second, yes, the stutter is pretty noticeable, but the game still looked plenty playable. I've seen worse - not all microstutter is created equal.

And so even my OWN perception in this particular instance changed within minutes of seeing the side-by-side comparison, and the tester was intentionally limiting the performance to 30 FPS to prove what I think is an odd point: that 30 FPS in a game should ALWAYS be smooth, when his own tests concluded that this is not the case. I guess his point, though, was that games usually are or should be smooth at 30 FPS, and I simply don't agree with the point. I can simply feel how much more responsive a game is at 100+ vs 30 FPS.

The thing about SLI and Crossfire is performance, and with more performance the less likely even people who pick up on this issue will see it - that seems to be at least something of a consensus. However, that doesn't seem to satisfy a number of folks who want the underlying issue fixed, and it's just not that easy, for a number of reasons - there are limits to technology, and I don't see why video cards would be different from any other technology. I just think we're running up against some limits that aren't even necessarily perceptible to a lot of people in the first place. Sure, you can empirically measure it, but that's still not the same thing as human perception.

So once again just too many variables.
 

try playing with this stutter in online matches - it will get you killed. yes, it is playable. even 14 frames per second is playable if you want to take it to that extreme. i say this because that is what this micro stutter feels like at 30fps.

but the problem is not whether it's playable. the problem is that your investment is not performing as it should. the frame rate reading is false. the addition of a second gpu is pointless if your game micro stutters like my video shows.

regarding a locked frame rate of 30 always being smooth? YES! it should always be smooth because it's a frame rate and nothing more. 30 fps has been the target frame rate of the gaming industry since its inception. at 30 frames per second, motion is perceived to be liquid. just look at the biggest games out there right now running at a locked 30 fps: halo franchise, gears of war franchise, BC2 on console, battlefield 1943 on console, etc etc. these games are locked at 30 fps.

another example: you are at 18 frames with your single-gpu setup. you decide to add another gpu to the machine and are now hitting 30 fps, but it feels like 18 again. do you know how frustrating it is to have blown your cold hard cash on something like this? you might as well have given the hundreds of dollars to a homeless bum on the street so he can buy more vodka instead.

the review sites are doing us an injustice by not being up front about this devastating symptom. the new generation of young gamers are uninformed and quite frankly uneducated when it comes to game play. they have been fed the lies of "60fps" and don't realize that even 30fps is an absolutely acceptable frame rate for gaming.

i am not arguing with you about 60 fps being better than 30. of course it's better. it is a quicker, more responsive frame rate. but by no means should anyone state that 30 is "not smooth", because i am getting sick of misinformed people spouting off BS.
 
Good post. The "min fps" never makes any sense. The frame variation is a much better way to gauge the problem. I am surprised most, if not all, of the review sites are too stupid to notice that.
 

they ain't stupid. they know this. but they will never admit it because it would mean nvidia and ati would stop sending them free cards to "review".

if you want the truth you really have to dig for it yourself. i dug up some of the truth for you guys already, and for the misinformed and downright stubborn ones.

http://www.youtube.com/watch?v=aQa97-ApWvc

the 5970 micro stutters and is not nearly as smooth at the same framerate as the single gtx 480.

http://www.youtube.com/watch?v=DYnXxI1UjxE&feature=channel

the 9800GX2 micro stutters. its 30 fps looks like 15 fps compared to the other 3 single-gpu solutions out there.
 
Micro-fluctuations. Stutter is missing frames or dropping to zero fps.

This whole calc is erroneous.
 
my 9800 gx2 sli setup only ever sucked in close-quarters combat in cod mw. other than that, i never noticed it.
 
regarding a locked frame rate of 30 always being smooth? YES! it should always be smooth because it's a frame rate and nothing more. 30 fps has been the target frame rate of the gaming industry since its inception. at 30 frames per second, motion is perceived to be liquid.
This simply isn't true. Games can't get away with fluid motion at 30FPS under most circumstances (it's contrast-dependent - only extremely low-contrast motion will be acceptable at 30FPS).

Movies can do it at 24FPS because film captures motion blur. Every frame is exposed for 1/24th of a second, which means you have thousands of samples over that short span of time composited together into a single frame. This creates a higher effective framerate, smooths the transitions between frames, and creates fluid motion.

Games do not have this luxury. To get the same type of motion blur in a game, you would have to render thousands of frames every 1/24th of a second, then composite all those into a single frame and send it to the monitor. The game would have to run at thousands of frames per second in order to make 24FPS or 30FPS look as smooth as a film.
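
To make that compositing idea concrete, here's a toy sketch - the render_scene function and all the numbers are made up purely for illustration - that accumulates many sub-frame samples into one output frame:

Code:
# Toy accumulation-style motion blur: sample the scene many times within one
# output frame's time slot and average the samples together.
def render_scene(t):
    # stand-in renderer: brightness of one pixel as an object sweeps past it
    object_position = 100.0 * t                   # units per second
    return 1.0 if 10.0 <= object_position < 11.0 else 0.0

def blurred_frame(frame_start, frame_duration=1.0 / 24, subsamples=1000):
    total = 0.0
    for i in range(subsamples):
        t = frame_start + frame_duration * i / subsamples
        total += render_scene(t)                  # one sub-frame
    return total / subsamples                     # composite of all sub-frames

# The moving object comes out as partial brightness - a smear - instead of
# popping between fully on and fully off from one frame to the next.
print(blurred_frame(0.1))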

What games do to avoid this huge problem is take advantage of Persistence of Vision. They run at a high enough framerate, while presenting every single frame, that the human visual system merges the frames together and creates fluid motion. 60FPS is about the minimum for this effect to become solidly fluid under most circumstances. The larger the change in contrast between two frames, the higher the framerate must be to maintain fluid motion using this method (yes, those 120Hz monitors will make motion smoother under certain circumstances if your framerate is high enough to take advantage of them).

just look at the biggest games out there right now running at a locked 30 fps: halo franchise, gears of war franchise, BC2 on console, battlefield 1943 on console, etc etc. these games are locked at 30 fps.
Game console hardware is too slow to reliably maintain 60FPS, so those games are locked at 30FPS in order to prevent stuttering due to framerate deceleration. They also tend to enable V-Sync to eliminate image tearing, which means if the game can't maintain 60FPS, it's forced directly to 30FPS. Triple buffering would allow some intermediate framerates, but consoles don't have the video RAM to spare for that, and it would re-introduce the framerate deceleration problem.

Without that 30FPS cap, the framerate would fluctuate up and down depending upon what's being rendered. As the framerate is forced to decelerate from 60FPS to 30FPS, there's a sudden jolt as the time between frames changes from 16ms to 33ms. This jolt is apparent to players, so they cap the framerate low enough that such fluctuations are avoided. Unfortunately, this also means motion isn't as smooth due to the low framerate.
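
The numbers behind that jolt are easy to model. Here's a simplified sketch of double-buffered V-Sync on a 60Hz display (real swap chains have more going on than this):

Code:
import math

REFRESH_MS = 1000.0 / 60.0   # one 60Hz refresh is ~16.7ms

# With double-buffered V-Sync a frame can only be shown on a refresh boundary,
# so the presented interval is the render time rounded UP to a whole number of
# refreshes - miss a single refresh and you drop straight from 60fps to 30fps.
def presented_interval(render_ms):
    refreshes = max(1, math.ceil(render_ms / REFRESH_MS))
    return refreshes * REFRESH_MS

for render_ms in (12.0, 16.0, 17.0, 25.0, 34.0):
    shown = presented_interval(render_ms)
    print(f"render {render_ms:4.1f}ms -> shown every {shown:4.1f}ms "
          f"({1000.0 / shown:.0f}fps)")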

If console hardware were capable of outputting 60 FPS consistently in the titles you mention, believe me, they wouldn't be capping it. It all comes down to hardware limitations.
 
This is why I just disabled SLI on my GTX480 setup. I've been running multiple GPUs since Nvidia's 8800 series and even ran AMD's HD4870X2... Every card has given me stutter below 52 fps. I was beginning to wonder why people thought 30+ fps was playable until I disabled SLI on my 480s and tried playing some games like Metro 2033 and STALKER: COP.

Well, I'm pretty happy with lower frames now. While I still notice some choppiness at lower framerates... I still get an overall smoother experience regardless of the lower fps counter.

I think I'm just going to stick with single cards from now on and save my money. I run at 2560x1600 and the latest single cards seem to run most games well enough.
 
Single GPU is not really an option for multiple monitor gaming. Most games run well enough for me, but there are a couple where the 'micro-stuttering' is pronounced (Crysis, AC2).

What if you turn on V-Sync and you can maintain an almost constant 60 FPS? Then there should be no framerate variation, right? Given the disparity between today's game engines and today's available horsepower, it's doable in most games.
 

yes there is no frame rate variance with vsync on and fps of 60. but is this really fair? why should multi-gpu solutions require a frame rate of 60 when a single gpu can happily run at 30 fps to give a smooth gameplay experience? if this is not the definition of being ripped off, then i don't know what is.

the only way i see a multi-gpu solution helping is if you are hitting a frame rate of 45 in the game you are playing and wish to be at the more precise frame rate of 60. then you can add another card to go multi-gpu, which will put you at this vsync-locked frame rate.

but where it really counts is when you are playing the latest game and your single gpu cannot max it out. you are hurting at 18 frames per second. you decide to add another gpu in the mix and now you are at 30 frames per second but the gameplay is not smoother than the single card setup at 18 fps.
 