Do you frame limit to make games run smoother? (nvidia)

djoye

2[H]4U
Joined
Aug 31, 2004
Messages
3,114
A few weeks ago I was researching Deus Ex HR stuttering and found a post buried in a Steam forums thread saying to use nvidia Inspector to set a frame limit to smooth out Deus Ex, basically capping the frame rate at your refresh rate. I set mine to 58 via nvidia Inspector and it greatly reduced the choppiness in that game.

nvidia Inspector applies this setting globally, so you see it across all games, but if you update your drivers you need to re-apply it. Killing Floor ran a little smoother, some odd micro stutter in Team Fortress 2 disappeared (maybe my mouse software was contributing to that), either Skyrim's performance improved between the 1.4 and 1.5 betas or the frame limit is helping there, and I noticed Serious Sam 3 smooth out quite a bit. The games still feel snappy but they appear to be running better.

I always use vsync otherwise I get crazy tearing but the frame limiter seems to do a better job; perhaps it does it more efficiently.
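
For anyone wondering what a frame limiter actually does under the hood: it basically just sleeps off whatever is left of the frame budget after each frame. This isn't Inspector's code, obviously, just a minimal sketch of the idea:

```cpp
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double target_fps = 58.0;  // cap just under a 60 Hz refresh
    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / target_fps));

    auto next_frame = clock::now();
    for (int frame = 0; frame < 300; ++frame) {
        // update(); render();  // the game's actual work would go here
        next_frame += frame_budget;                 // when this frame's budget expires
        std::this_thread::sleep_until(next_frame);  // burn off the leftover time
    }
}
```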
 
I noticed in the Diablo 3 beta that enabling high quality settings and turning vsync on greatly increased performance.

Even though I was getting 60+ fps, there was some kind of stuttering going on. I'll have to try Inspector.
 
Doing this also removes the lag typically associated with vsync when it's enabled. I use Dxtory to do this all the time in games that don't have a built-in frame limiter (I'm running AMD). With DX:HR, it was a must.
 
I always try to use a fps limiter and have set it globally in inspector since that became available.

You can still get frame tearing, but it alleviates the issue somewhat since you cap the fps lower than your monitor's refresh rate. You still need vsync to completely get rid of tearing. On a related note, Kepler is rumored to feature dynamic vsync.

One other reason I use it is that the card essentially works less. I don't feel the need for games to run in excess of 60 fps on a 60 Hz monitor. This way the card draws less power, produces less heat, and the fan doesn't kick in pointlessly (to me) to push 90 fps or something in an older game.

Another reason is I prefer more consistent frame rates. Dropping from say 60 to 40 in a scene is less jarring than going from 90 to 40.

You can also force FXAA using nvidia inspector now. The fps limiter and non-game-dependent FXAA are supposed to eventually make their way into the actual control panel.
 
I love the idea of a frame rate limiter at the driver level. My monitor is 60Hz so it's nice to just cap my FPS at 60 and hopefully allow the GPU to save processing power.
 
No. Frame rate limiting and vsync work differently. Also, vsync does not exactly lower your minimum framerate. If you want to know what exactly is happening in more detail, see here (it also explains why a frame rate limit won't remove screen tearing) -
http://hardforum.com/showthread.php?t=928593
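
To put rough numbers on the tearing point: without vsync the buffer swap lands wherever the scanout happens to be, so a frame cap alone still tears, and if the cap is slightly off the refresh rate the tear line slowly crawls up or down the screen. A toy calculation (assuming a 60 Hz display and a 58 fps cap):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_ms = 1000.0 / 60.0;  // one 60 Hz scanout period
    const double frame_ms   = 1000.0 / 58.0;  // capped at 58 fps, vsync off
    double swap_time = 0.0;
    for (int frame = 1; frame <= 6; ++frame) {
        swap_time += frame_ms;
        // How far down the screen the scanout is when the swap happens,
        // i.e. roughly where the tear line shows up for that frame.
        double phase = std::fmod(swap_time, refresh_ms) / refresh_ms;
        std::printf("frame %d: tear line at ~%.0f%% of screen height\n",
                    frame, phase * 100.0);
    }
}
```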
 
I use it for some games. Started using it with Skyrim with nvidia inspector. Some games don't like it though so I have to turn it off for them.
 
Does it matter if you set it to 58 or 60? Nvidia inspector says 58 produces no input lag with vsync.
 
Ok, so I tried out the frame limiter. I set it to 60 since I have a 60hz monitor and played some BF3.

Two nice effects are that my GPU temperatures are significantly lower and my video card is likely consuming less power.

Now for the bad: I'm getting tons of screen tearing that didn't exist before I set the FPS cap. This is especially interesting because I was usually getting around 60-70 fps before the cap. Anyone know how to fix this?
 
I frame limit my games when possible. Max frames to render ahead at "1" and 125 fps on a 120 Hz monitor is the best combo I've found so far for minimal image tearing and input lag. TF2 feels really smooth now that I've got a CPU that can actually keep the framerate sky high.
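
For what it's worth, the in-game counterpart of that "max frames to render ahead" setting is the DXGI frame-latency cap. This is just a rough sketch of what a game (not the driver) would do, assuming D3D11 on Windows:

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;

    // Create a default hardware device (no swap chain needed for this demo).
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &context)))
        return 1;

    // Allow at most one frame to be queued ahead of the GPU -- roughly what
    // the driver's "max frames to render ahead = 1" forces from outside.
    IDXGIDevice1* dxgiDevice = nullptr;
    if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                         reinterpret_cast<void**>(&dxgiDevice)))) {
        dxgiDevice->SetMaximumFrameLatency(1);
        dxgiDevice->Release();
    }

    context->Release();
    device->Release();
    return 0;
}
```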
 
It's actually a really nice tool for older games as well. The original Unreal, for instance, will try to run at about a billion fps and ends up looking like it's running at 12.
 
I wonder if the new vsync mode that is supposed to come with Kepler will replace the need to use a frame limiter? Sounds like the same concept. Hopefully it'll work on older cards.
 
I still want to know how games like The Witcher 2, Mass Effect 2/3, Skyrim, BF3, etc. get away with what seems to be VSYNC on, but with zero input lag. Especially Mass Effect 2/3. Controls in those games are super crisp, with no screen tearing at all.

I always thought it was a compromise: VSYNC = Lag, No VSYNC = no lag, but tearing?
 
Ok, so I tried out the frame limiter. I set it to 60 since I have a 60hz monitor and played some BF3.

Two nice effects are that my GPU temperatures are significantly lower and my video card is likely consuming less power.

Now for the bad: I'm getting tons of screen tearing that didn't exist before I set the FPS cap. This is especially interesting because I was usually getting around 60-70 fps before the cap. Anyone know how to fix this?
You'd probably need to enable vsync on top of the frame limiter.

I played Serious Sam 3 with the frame limit set at 58 and vsync off in-game, and tearing was pretty bad. Vsync cleared that up and it still feels snappy.
 
I still want to know how games like The Witcher 2, Mass Effect 2/3, Skyrim, BF3, etc. get away with what seems to be VSYNC on, but with zero input lag. Especially Mass Effect 2/3. Controls in those games are super crisp, with no screen tearing at all.

I always thought it was a compromise: VSYNC = Lag, No VSYNC = no lag, but tearing?

I always turn VSYNC off, but instead run D3DOverrider on the desktop to force triple buffering sync (similar to VSYNC). I get the benefits of frame syncing (no tearing in games) with no loss of frames. Meaning, if I run D3DOverrider in place of VSYNC I get anywhere from 10-20 more FPS in most games.
 
"Triple buffering sync"? Unless I'm mistaken, that's simply vertical synchronization with triple buffering as opposed to standard double buffering.

I always thought it was a compromise: VSYNC = Lag, No VSYNC = no lag, but tearing?
Basically correct, yes.
 
I always turn VSYNC off, but instead run D3DOverrider on the desktop to force triple buffering sync (similar to VSYNC). I get the benefits of frame syncing (no tearing in games) with no loss of frames. Meaning, if I run D3DOverrider in place of VSYNC I get anywhere from 10-20 more FPS in most games.

You're just forcing VSync + Triple Buffering.

So far the best combo I've found (mentioned by another user but I can't remember who) is Vsync in-game (or forced, whatever) + Triple Buffering (forced by D3DOverrider) + Framerate limiter set to Refresh Rate - 1 (so, 59 for my 60 Hz setup).

This generally gives me smooth gameplay with zero tearing and very minimal input latency.

Basically, if you only use a framerate limiter you will still get tearing, and if you use VSync without Triple Buffering forced you will get input lag.
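
Rough numbers on why the triple buffering part matters with vsync: with only two buffers, the renderer has to sit idle until the next vblank whenever it misses one, so GPU work worth ~50 fps gets quantized down to 30; a third buffer lets it keep rendering. A toy simulation (assuming a 60 Hz display and 20 ms frames):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double refresh = 1000.0 / 60.0;  // ms between vblanks on a 60 Hz display
    const double work    = 20.0;           // ms of rendering per frame (~50 fps of GPU work)
    const int    frames  = 300;

    // Double buffering + vsync: the single back buffer can't be reused until
    // it's flipped at a vblank, so the renderer stalls after every frame.
    double t = 0.0;
    for (int f = 0; f < frames; ++f) {
        double finished = t + work;
        t = std::ceil(finished / refresh) * refresh;  // wait for the flip
    }
    std::printf("double buffering: ~%.0f fps\n", frames / (t / 1000.0));

    // Triple buffering + vsync: a spare back buffer means the renderer never
    // waits here (it's slower than the display, so buffers never back up).
    t = 0.0;
    for (int f = 0; f < frames; ++f)
        t += work;
    std::printf("triple buffering: ~%.0f fps\n", frames / (t / 1000.0));
}
```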
 
So does the triple buffering option in the nvidia Control Panel not work? I've always forced that in via the CP but wasn't sure if it was doing anything.
 
I'm not suggesting that it doesn't work, but I'm still waiting for a technical explanation on how the refresh-1 thing works. I can't imagine how it might reduce V-Sync lag...

TBH I'm not sure how it works. Maybe it's not necessary at all. Personally I run it because I don't force VSync in all games through the CP (I use it if it's in the game menu but you never know) and I don't need my framerates going through the roof for like Minecraft or something like that.
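
If I had to guess at the refresh-1 thing, it's about keeping the driver's frame queue from filling up: with vsync the display consumes exactly one frame per refresh, so if the game produces them at least that fast the queue stays full and everything you see is a couple of frames old. Cap slightly under the refresh and the queue drains a little every frame. Back-of-the-envelope numbers for a 60 Hz panel:

```cpp
#include <cstdio>

int main() {
    const double refresh_ms = 1000.0 / 60.0;  // display consumes one frame per refresh
    const double cap_ms     = 1000.0 / 58.0;  // a 58 fps cap hands one over slightly less often

    std::printf("frame produced every %.2f ms, consumed every %.2f ms\n",
                cap_ms, refresh_ms);
    std::printf("the queue drains by %.2f ms per frame, so it can't stay full\n"
                "and you aren't sitting a whole queue of pre-rendered frames behind.\n",
                cap_ms - refresh_ms);
}
```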

So does the triple buffering option in the nvidia Control Panel not work? I've always forced that in via the CP but wasn't sure if it was doing anything.

It only works for OpenGL applications, not DirectX. You have to force it with a program like D3DOverrider to get Triple Buffering in DX apps.

And I do mean force it. I tried D3DOverrider with the standard "normal" detection setting and a lot of the time it wouldn't work. I had to set it to High or whatever in order to make sure it was working.
 
So far the best combo I've found (mentioned by another user but I can't remember who) is Vsync in-game (or forced, whatever) + Triple Buffering (forced by D3DOverrider) + Framerate limiter set to Refresh Rate - 1 (so, 59 for my 60 Hz setup).

This generally gives me smooth gameplay with zero tearing and very minimal input latency.

Basically, if you only use a framerate limiter you will still get tearing, and if you use VSync without Triple Buffering forced you will get input lag.

Just tried this with Adaptive VSync, seems to be working well in Dead Space 2 (sticking to DX9 games right now due to 4-way SLI Surround driver issues). Anyone know how this works?
 
Just tried this with Adaptive VSync, seems to be working well in Dead Space 2 (sticking to DX9 games right now due to 4-way SLI Surround driver issues). Anyone know how this works?

Adaptive Vsync turns on Vsync when FPS is above the monitor's refresh rate. If you are using a frame limiter set to the monitor's refresh rate or less, Adaptive Vsync is irrelevant because it will never turn on.
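
For reference, this is roughly what the different sync modes look like from the application side, using SDL2's swap-interval API as an example (the driver-level Adaptive VSync toggle isn't something the game calls, this is just the same idea in code):

```cpp
#include <SDL.h>
#include <cstdio>

int main() {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window* win = SDL_CreateWindow("sync demo", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 640, 480,
                                       SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    //  0 = no vsync  (tearing, lowest latency)
    //  1 = vsync on  (no tearing, quantized fps, some latency)
    // -1 = "late swap tearing": sync normally, but if a frame misses the
    //      vblank, swap immediately instead of waiting -- similar in spirit
    //      to NVIDIA's Adaptive VSync. Not all drivers support it.
    if (SDL_GL_SetSwapInterval(-1) != 0) {
        std::printf("adaptive swap not supported, falling back to plain vsync\n");
        SDL_GL_SetSwapInterval(1);
    }

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```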
 
Adaptive Vsync turns on Vsync when FPS is above the monitor's refresh rate. If you are using a frame limiter set to the monitor's refresh rate or less, Adaptive Vsync is irrelevant because it will never turn on.
I see no tearing though :s
 
I use it in quite a few lower-end/older games. It keeps things cooler, and there's no real point rendering at 394030 fps at max temperatures when you can sit 20-30 degrees below (and I suppose save power).
 