microstutter / stutter 100% FIX!

Of course you can force triple buffering through the driver. Nvidia has been doing it in DirectX apps automatically for as long as I can remember. At least 6 years. You can even force it in OpenGL, but there you have to do it manually afaik. Don't know how it's handled on the Radeon side.

As I said, I use Dxtory, and if I disable vsync, I get tearing. Could be that it helps a little bit, but not much.
Only in OpenGL games, not DX
 
If my car was making funny noises and shaking at 60mph on the freeway, and my mechanic told me the "100% FIX" was to simply keep my speed below 30mph, I would not be happy.

This isn't a fix, it's trading one problem for another.
 
If your car was making funny noises and shaking violently at 60mph on the freeway, and your mechanic told you the "100% FIX" was to simply keep your speed below 30mph, would you be happy with that solution?

This isn't a fix, it's trading one problem for another.

This sums up what I feel as well.
 
lol, thanks for saving me the time, I'm still in disbelief that there is an actual conversation taking place about 30 fps being 'fluid' lol

I'm confused. So when you watch TV or movies, you don't think it's fluid?

 
I'm confused. So when you watch TV or movies, you don't think it's fluid?


Film and TV are deliberately not fluid. This gives things a 'film effect', which your brain expects to see when watching movies.

Go to an electronics store and find one of the TVs that interpolates frames to show everything at 60fps. It will look surprisingly different in a way that's hard to pin down at first, because it's so fluid. The difference is even more pronounced if you can watch a normal 24fps film right next to it.

Film and TV can get away with lower frame rates than video games because cameras can capture motion blur. The motion blur 'tricks' your eyes into being okay with the lower framerate. If you really want to see the difference, find a video camera, set the framerate to 24fps, but set the shutter speed to 1/500 or something tiny. Start panning around or filming some motion, and you'll immediately notice that it looks choppy. Or go watch some of the zombie scenes in 28 Days Later, which use this choppy effect to make everything appear odd.

You can't compare film frame rates to game frame rates. Apples and oranges.
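To hang some rough numbers on that shutter-speed example (my own back-of-the-envelope math, nothing measured): at 24fps a typical 180-degree shutter exposes each frame for about 1/48 of a second, while a 1/500 shutter exposes for only 2 ms, so the blur streak in each frame is roughly ten times shorter and panning starts to look stroboscopic.

#include <cstdio>

int main() {
    // Hypothetical panning shot sweeping at 90 degrees per second.
    const double pan_speed_deg_per_s = 90.0;

    const double exposure_180_shutter  = 1.0 / 48.0;   // 24fps film, 180-degree shutter
    const double exposure_fast_shutter = 1.0 / 500.0;  // the "choppy" 1/500 setting

    // Motion blur smeared across one frame = pan speed * exposure time.
    std::printf("blur per frame, 180-degree shutter: %.2f degrees\n",
                pan_speed_deg_per_s * exposure_180_shutter);    // ~1.88 degrees
    std::printf("blur per frame, 1/500 shutter:      %.2f degrees\n",
                pan_speed_deg_per_s * exposure_fast_shutter);   // ~0.18 degrees
    return 0;
}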
 
Only in OpenGL games, not DX

No, both. I should know, I've been using a GeForce since the 8800GTX. I'm often below 60fps and don't get just 30, 20, 15... fps but everything in between. In absolutely every game, no exception.
 
Of course you can force triple buffering through the driver. Nvidia has been doing it in DirectX apps automatically for as long as I can remember. At least 6 years.

Nope, driver-based triple buffering doesn't actually do anything unless it's OpenGL, lol. You have to use D3DOverrider.
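For what it's worth, my understanding of why a hook like D3DOverrider is needed: in Direct3D the back-buffer count is something the application asks for when it creates its device, so it isn't a simple driver toggle the way it is for OpenGL. A rough D3D9 sketch (the function name and resolution are just made up for illustration):

#include <d3d9.h>

// Sketch: how a D3D9 app itself requests triple buffering. 'd3d9' and 'hwnd'
// are assumed to come from the usual Direct3DCreate9 / window-creation boilerplate.
IDirect3DDevice9* CreateTripleBufferedDevice(IDirect3D9* d3d9, HWND hwnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = FALSE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferWidth      = 1920;
    pp.BackBufferHeight     = 1080;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;
    pp.BackBufferCount      = 2;                        // 2 back buffers + front buffer = triple buffering
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // vsync on
    pp.hDeviceWindow        = hwnd;

    IDirect3DDevice9* device = nullptr;
    d3d9->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                       D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device; // nullptr on failure
}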

And for the love of god, people, don't start talking about 30fps like there isn't a huge discernible difference between 35, 40, 50, and 60fps. It depends on the person; some people can detect higher framerates, and some people have crappy old-person eyes. Deal with it.

::edit::

This is the worst thread ever. Why did I even come in here?
 
I can't say whether or not this solution fixes stuttering (only using single cards, I have never personally seen it), but 30 fps for games like shooters or even racing is most definitely not playable; even when it goes down into the 40s, I can immediately tell. It probably just depends on the person and their eyesight.

As a side note, this thread definitely has the feeling of "troll" about it...
 
You see, that's weird, because TV is 30 fps and nobody complains that their A-Team reruns have a low framerate.
 
Well, this is the problem I've had with a couple of games. Right now it's Battlefield 3. With my setup I never dip below 60fps; hell, it goes up to 100fps sometimes. The problem is that it never feels smooth, like it's a much lower frame rate. I use MSI to check my fps and it's always well above 60. When I used this program to cap my frames at 60, the game became butter smooth. There is still screen tearing, but I don't really care as long as the game is smooth!

Is this called micro-stutter or uneven frames?
 
You see, that's weird, because TV is 30 fps and nobody complains that their A-Team reruns have a low framerate.

I covered this in a post above. Film and TV have motion blur, so they can get away with a lower framerate.

Your video games do not have motion blur, so they need a higher frame rate.

You can't compare TV/film to video game frame rates.
 
OK, thanks for clearing that up, sorry I wasn't paying attention above
 
I covered this in a post above. Film and TV have motion blur, so they can get away with a lower framerate.

Your video games do not have motion blur, so they need a higher frame rate.

You can't compare TV/film to video game frame rates.

QFT. People don't realize that 30 FPS might be acceptable on a TV, but it looks horrible with sharp, non-motion-blurred computer frames.

60 FPS used to be the minimum for "acceptable" computer gaming for me, but now I've bumped it up to a buttery smooth 120 FPS. And yes, it does make a difference.
 
No, both. I should know, I've been using a GeForce since the 8800GTX. I'm often below 60fps and don't get just 30, 20, 15... fps but everything in between. In absolutely every game, no exception.
It's because the game can call for triple buffering. If the game supports it, you can do it; if the game doesn't, you can't. Simple. I'm talking about the driver level, btw, not through D3D.
 
I'm interested in this program as long as it can limit things to 60 FPS instead of only 30.
I've definitely had stutter issues with games that render slightly more frames (61) than my TV's refresh rate, which causes stutter even at very high framerates.
 
I'm curious to try this. We can argue the semantics all day of whether this gets rid of true microstutter or not, but if this can give me a smoother experience at 60fps without introducing input lag, I'm sold.
 
Bandicam is better for just FPS capping, since the free edition of Dxtory is so annoying and its other features go unused. I've tried a bunch of different solutions for micro-stutter, but the best method is to use either an in-game console command to limit FPS or just use Bandicam to do the same. Testing this out is real easy with a game like Crysis 2 that's demanding enough that you're not getting 60+ FPS; you can set a cap with the console command "sys_maxfps" and immediately see whether it makes a difference or not.

Limiting FPS isn't a perfect solution, of course, since you need to set the cap so that FPS never goes below it, otherwise stutter re-emerges. Micro-stutter is mostly visible when FPS is <60, which happens mostly with Eyefinity and/or demanding games, and it can be a bit of a pain to set a working FPS cap since FPS tends to fluctuate quite a bit in games. It still gives the best results in terms of smoothness, and I've used FPS capping in every single game since I discovered it as a cure for all sorts of stuttering behavior.
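For anyone wondering what these limiter tools are actually doing under the hood, the basic idea is just to sleep away the leftover time each frame so frames come out at an even cadence. A minimal toy sketch of that idea (my own illustration; I have no idea how Dxtory or Bandicam implement theirs):

#include <chrono>
#include <thread>

// Toy frame limiter: call at the end of every frame to hold a steady cap.
// target_fps would be set just below your worst-case framerate, e.g. 58.
void limit_fps(double target_fps)
{
    using clock = std::chrono::steady_clock;
    static clock::time_point next_frame = clock::now();

    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / target_fps));

    next_frame += frame_budget;

    const auto now = clock::now();
    if (now < next_frame) {
        // Finished early: sleep off the slack so frames come out evenly spaced.
        std::this_thread::sleep_until(next_frame);
    } else {
        // A slow frame blew past the deadline: resync instead of "catching up"
        // with a burst of fast frames, which would look like stutter again.
        next_frame = now;
    }
}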

The bad side of this is that you really start wondering whether CF/SLI is worth it. Personally I'm going for a single GPU in the next gen, since it should offer a similarly fluid gaming experience to my current setup without the extra hassle of setting FPS caps and waiting for CrossFire profiles.

I'm kinda disillusioned that AMD/Nvidia will ever get around to fixing AFR stuttering. As long as benchmarks are done using FPS, nobody cares. Props to TechReport for stirring the nest a bit. ;)

For those with single GPUs this can be useful too. I remember a year or two ago, when I first came across the issue, I tested FC2 with my GTX275; it had considerable stuttering problems and setting a cap solved them. There aren't a lot of games that stutter with single GPUs, though.
 
No, both. I should know, I've been using a GeForce since the 8800GTX. I'm often below 60fps and don't get just 30, 20, 15... fps but everything in between. In absolutely every game, no exception.
nVidia definitely does not force triple buffering by default, and my GTX580 definitely did suffer from the 60->30->20fps issue without D3DOverrider running.

However, I gather from your post here that you're running SLI, and SLI is a different story. As soon as I plugged in a second GTX580, the vsync framerate steps disappeared. I'm not sure exactly how the frame buffers are laid out in AFR mode, but clearly there are enough of them to avoid this problem.
 
For those of you saying that 30fps (really 24) is fine for TV/movies... try convincing somebody who can see the separate frames when watching TV or movies.

I have to make myself not pay attention to the slow frame rate or it drives me crazy.

As for microstutter with CFX or SLI, I am starting to think it is purely because of latency between the two cards. I may try upping my PCI-E speed tonight to see if it makes a difference in Crysis 2.

I noticed that if I turn certain graphics settings to ULTRA, even though the frame rate only drops 3-5 FPS, it creates microstutter, and this is at 50-60+ FPS. Some of the settings that cause it don't drop the FPS at all.
 
nVidia definitely does not force triple buffering by default, and my GTX580 definitely did suffer from the 60->30->20fps issue without D3DOverrider running.

However, I gather from your post here that you're running SLI, and SLI is a different story. As soon as I plugged in a second GTX580, the vsync framerate steps disappeared. I'm not sure exactly how the frame buffers are laid out in AFR mode, but clearly there are enough of them to avoid this problem.

That's when load balancing kicks in with the cards.
The 580 is enough to crunch any frame; it doesn't matter if one frame is heavier than the other. What most people do is raise their resolution when adding another GPU, but they forget one thing: SLI or CrossFire is not one big engine. It's still seen as two cards, one doing odd frames and one doing even frames. What happens when a couple of frames take so much rendering that the GPU struggles to handle them? The time to finish those frames goes through the roof, and the load balancing the GPUs do struggles to keep them in sync.
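To put some made-up numbers on that (purely illustrative, not measured from anything): with AFR the two cards can hand frames in at uneven intervals, so the fps counter can look fine while the pacing looks much worse.

#include <cstdio>

int main() {
    // Invented AFR pacing: the two GPUs deliver frames 10 ms and 30 ms apart,
    // alternating. The numbers are made up purely to illustrate the effect.
    const double gaps_ms[] = {10, 30, 10, 30, 10, 30};
    const int    n = 6;

    double total = 0;
    for (double g : gaps_ms) total += g;

    const double average_fps = 1000.0 / (total / n);  // what the counter shows: 50 fps
    const double felt_fps    = 1000.0 / 30.0;         // the long gaps your eye latches onto: ~33 fps

    std::printf("fps counter: %.0f fps, but the pacing feels closer to %.0f fps\n",
                average_fps, felt_fps);
    return 0;
}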
 
Here's a question - it seems like these FPS limiter programs are a side effect of video recording software.
Are there any (universal) FPS limiters that do that and only that, sans any video recording?
 
nVidia definitely does not force triple buffering by default, and my GTX580 definitely did suffer from the 60->30->20fps issue without D3DOverrider running.

However, I gather from your post here that you're running SLI, and SLI is a different story. As soon as I plugged in a second GTX580, the vsync framerate steps disappeared. I'm not sure exactly how the frame buffers are laid out in AFR mode, but clearly there are enough of them to avoid this problem.

As long as your framerate stays above Vsync, you won't have the stepping - so most likely your SLI config is just able to keep the frame rate above Vsync.

And I thought it was pretty well known that Nvidia control panel Triple Buffering was OpenGL only - but I guess I was wrong to think that.
 
Here's a question - it seems like these FPS limiter programs are a side effect of video recording software.
Are there any (universal) FPS limiters that do that and only that, sans any video recording?

It would be nice if Fraps added this in, since I almost always have it running anyway.
 
nVidia definitely does not force triple buffering by default, and my GTX580 definitely did suffer from the 60->30->20fps issue without D3DOverrider running.

However, I gather from your post here that you're running SLI, and SLI is a different story. As soon as I plugged in a second GTX580, the vsync framerate steps disappeared. I'm not sure exactly how the frame buffers are laid out in AFR mode, but clearly there are enough of them to avoid this problem.

I've read up on the issue and it seems that triple buffering is on by default when using vsync (Nvidia). As I said, I have always experienced the full spectrum of fps when using vsync, not just the discrete values 30, 20...
Edit:
It might be something else, like "render ahead". To be honest, one cannot be sure from what Fraps displays as the framerate, because that is only the end result and we don't know what leads to it. Quite a mystery...

Btw, Nvidia will implement an fps limiter in their drivers real soon. There has been a petition for it and it has been granted. Just to let you know.
 
As long as your framerate stays above Vsync, you won't have the stepping - so most likely your SLI config is just able to keep the frame rate above Vsync.

And I thought it was pretty well known that Nvidia control panel Triple Buffering was OpenGL only - but I guess I was wrong to think that.

Yeah, that's easier to do for a card like the 580. Now take the 6870, for instance. If you run at 1080p trying to max everything out, the fps goes up and down as the card struggles in heavier scenes. When you have two of them, the same thing happens, but one GPU might get the heavier frames. It simply doesn't have the horsepower to take everything a game throws at it at 1080p or higher resolutions, simply because it's not one big rendering engine.
Triple buffering can only be done via DX with D3DOverrider, not the drivers. But it adds a lot of frame latency.
 
Here's a question - it seems like these FPS limiter programs are a side effect of video recording software.
Are there any (universal) FPS limiters that do that and only that, sans any video recording?

Not that I know of. I think there was something called fps limiter, but it was dated and didn't work with newer games. Bandicam is the most lightweight. There's also TommTi SSAAtool, which has an fps cap, but it's kinda hard to set up. And as was already said, Nvidia will implement a limiter in their drivers soon. Hopefully AMD follows up on that too.
 
You mean as long as my framerate is above 60, it's not below 60? Thanks for the insight.

But I'm talking about a steady 50fps.

No, as long as it stays above 60 it'll be 60, but as soon as it drops below 60 with Vsync on, it'll drop down to some fraction of 60 (not, say, 58) - hence the word "stepping" in my comment.
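That stepping falls straight out of double-buffered vsync: a finished frame has to wait for the next refresh tick, so the displayed rate snaps to 60 divided by a whole number of ticks. A little sketch of the arithmetic (assuming a 60Hz display and no triple buffering or render-ahead):

#include <cmath>
#include <cstdio>

int main() {
    const double refresh_hz = 60.0;
    const double refresh_ms = 1000.0 / refresh_hz;  // ~16.7 ms per refresh tick

    // With double-buffered vsync a finished frame waits for the next tick, so a
    // frame that takes even slightly longer than 16.7 ms is shown for two ticks
    // (30 fps), longer than 33.3 ms for three ticks (20 fps), and so on.
    const double render_times_ms[] = {15.0, 17.0, 25.0, 34.0, 51.0};
    for (double render_ms : render_times_ms) {
        const double ticks = std::ceil(render_ms / refresh_ms);
        std::printf("%4.0f ms render -> %2.0f fps displayed\n",
                    render_ms, refresh_hz / ticks);
    }
    return 0;
}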
 
No, as long as it stays above 60 it'll be 60, but as soon as it drops below 60 with Vsync on, it'll drop down to some fraction of 60 (not, say, 58) - hence the word "stepping" in my comment.
I think you missed my point...

Anyway, go and re-read my first comment. I know this happens with one card. It doesn't happen with two.
 
I can now play Skyrim at ultra high settings. The stutter is gone, but I can tell at 30 fps, when turning, that everything isn't as fluid. Doesn't seem to bother me much. Gonna try it on some other games. Stuttering bothers the crap out of me; I'm very sensitive to it.
 
I played around with a few settings in there and set the limit to 60 fps and bingo!!! Skyrim can now fluctuate between 28-60 fps without stutter. Witcher 2 at high settings is now smoother. Thank you OP for recommending this. Not all of us are uber-rich gaming nerds. This has helped me a lot. :)
 
I recall hearing about the possibility of an Nvidia driver framerate cap some time ago, going all the way back to when I first bought my GTX570. Any idea when this might really happen?
A lot of games have commands for this, but not all of 'em.
 
Film and TV are deliberately not fluid. This gives things a 'film effect', which your brain expects to see when watching movies.

Go to an electronics store and find one of the TVs that interpolates frames to show everything at 60fps. It will look surprisingly different in a way that's hard to pin down at first, because it's so fluid. The difference is even more pronounced if you can watch a normal 24fps film right next to it.

Film and TV can get away with lower frame rates than video games because cameras can capture motion blur. The motion blur 'tricks' your eyes into being okay with the lower framerate. If you really want to see the difference, find a video camera, set the framerate to 24fps, but set the shutter speed to 1/500 or something tiny. Start panning around or filming some motion, and you'll immediately notice that it looks choppy. Or go watch some of the zombie scenes in 28 Days Later, which use this choppy effect to make everything appear odd.

You can't compare film frame rates to game frame rates. Apples and oranges.
Thank you for the well written explanation!

Why can't video game programmers put this motion blur into video games so you don't require a super high frame rate? Well, I've seen motion blur, but it's generally terribly done and gives me a headache.

IMHO it isn't about numbers, it's about tricking the brain into believing you're seeing well. FPS... eh, we've been heading in the quantity-over-quality direction for a while, it seems.
 
30 fps is considered fluid in the video game industry and is the target frame rate of the biggest franchises like Gears of War, the Halo series, Battlefield, etc. So if 30 fps is unplayable for you, then it would not have been possible for you to play those games I briefly mentioned. If you have never experienced smooth 30 fps because the GPU marketers push high frames over the smoothness of frame output, I feel for you, and now the fix is here for everyone.

Never be misled by the marketing departments of GPU makers again. You don't need super high framerates and the newest cards to enjoy a smooth gaming experience.

30fps is considered the minimum acceptable in the gaming industry, not the gold standard. Your interpretation of it is very odd. A lot of companies believe 60fps is the max needed, but that was mostly when most people only had 60Hz LCDs and CRTs were in serious decline. And of course, a lot of companies that capped at 60, like id, took a major beating from their community.

I strive for 120Hz, and I like it, and it is definitely better.
 
While, yes, 60+ fps is definitely better, this program sure helps smooth out anything below that. Before, I couldn't play games that dropped below 50 fps because of bad stuttering. Now all my games are playable.

Just set the fps limiter in the program to your desired fps cap and watch it eliminate the stuttering in your games.
 
I recall hearing about the possibility of an Nvidia driver framerate cap some time ago, going all the way back to when I first bought my GTX570. Any idea when this might really happen?
A lot of games have commands for this, but not all of 'em.

Nvidia has only stated that the limiter will be added in future drivers, so it's probably at the end of a long list of fixes. You can go read the petition on the Nvidia forums, but that's the gist of it. They said they would add it. Some day.
 
Why can't video game programmers put this motion blur into video games so you don't require a super high frame rate?

NOOOOOOOOO

You don't understand: TV and movies don't have dynamic cameras jerking around at insanely high speeds, because there's no mouse. Lower framerates, especially below ~55fps, will always suck hard for first-person shooters or any other game that requires target tracking at high speeds while monitoring the detailed surrounding environment for other potential threats.

I vote exactly the opposite: make movies 60fps so we can actually see what's going on... digital post-processing workload be damned. Okay, that's my off-topic post, I'm done.
 
Thank you for the well written explanation!

Why can't video game programmers put this motion blur into video games so you don't require a super high frame rate? Well, I've seen motion blur, but it's generally terribly done and gives me a headache.

IMHO it isn't about numbers, it's about tricking the brain into believing you're seeing well. FPS... eh, we've been heading in the quantity-over-quality direction for a while, it seems.

Motion blur is so computationally expensive that you'd be better off just rendering at 60fps (or 120fps if your monitor supports it).
 
All these comparisons to video/film are pointless. Think how fast you can rotate your viewpoint 180 deg. in an FPS. A quarter of a second? Less? I check my 'six' constantly while playing Battlefield and other games; it's just a quick flick of the mouse back and forth. Now, at 30 fps I would see massive strobing as my view changed 180 deg. and only a handful of frames were drawn on the screen.

Back in CRT days we played on 19" monitors (trust me, they were gigantic back then!) at 100 to 120Hz with Voodoo cards that could output 150fps. Then 60Hz LCDs ushered in the dark ages for a few years, until 120Hz monitors reappeared.

If you spent any amount of time playing at 120 fps and then tried to go back to 60, you wouldn't like it. Individual mileage may vary, but for me anything over 90 feels smooth; below that I notice. And yes, I think 60 is playable for sure, it's just that the strobing is very noticeable. 30fps - no way. Third-person games on a PS3 at 30 are fine on a TV, but not for an FPS. FPS games are a special case that need high frames because of the unbelievably fast viewpoint changes that are possible and common.

I would always lower graphics settings to maintain a high frame rate in any FPS.
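To put numbers on that flick (again, just my simplified arithmetic): a 180-degree turn in a quarter of a second only gets you 7 or 8 frames at 30 fps, which works out to roughly 24 degrees of rotation between consecutive frames - hence the strobing.

#include <cstdio>

int main() {
    // Checking your six: 180 degrees in a quarter of a second (made-up but typical flick).
    const double turn_deg = 180.0;
    const double turn_s   = 0.25;

    const double rates[] = {30.0, 60.0, 120.0};
    for (double fps : rates) {
        const double frames_drawn  = fps * turn_s;            // frames shown during the turn
        const double deg_per_frame = turn_deg / frames_drawn; // how far the view jumps each frame
        std::printf("%5.0f fps: %4.1f frames, %4.1f degrees between frames\n",
                    fps, frames_drawn, deg_per_frame);
    }
    return 0;
}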
 