A Program that Limits FPS?

have2p
I'm an avid L4D player and I use a couple auto-fire scripts that make use of the "wait" function, which waits one frame before executing the next command. Because the length of time spanned by the "wait" function depends on my framerate, the rate at which I can fire my weapon also depends on framerate. This is very annoying. I would like to cap my framerate at around 75 FPS so that it stays constant. Some older Valve games offer console commands like fps_max, but that is not the case here. I can limit my FPS to 60 by using Vsync, but I can't stand the input lag when running at 60 FPS. Is there any third party software I can use to limit my frames to 75 FPS? I've done quite a bit of googling and can't seem to find anything. Thanks for the help.
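Just to illustrate the problem with made-up numbers (nothing from the actual scripts): a wait that lasts one frame gets shorter as the framerate climbs, so the same script fires at a different rate depending on FPS.

Code:
// Rough illustration only: a delay measured in frames scales with framerate.
// A hypothetical 10-frame wait is ~133 ms at 75 FPS but only ~33 ms at 300 FPS.
#include <cstdio>
#include <initializer_list>

int main() {
    const double waitFrames = 10.0;                    // made-up script delay
    for (double fps : {60.0, 75.0, 150.0, 300.0}) {
        double delayMs = waitFrames / fps * 1000.0;    // one frame = 1/fps seconds
        std::printf("%3.0f FPS -> the wait lasts %.1f ms\n", fps, delayMs);
    }
    return 0;
}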
 
You could try setting your monitor's refresh rate to 60 as well, but I've still had input lag even in that scenario.
 
I'm an avid L4D player and I use a couple auto-fire scripts that make use of the "wait" function, which waits one frame before executing the next command. Because the length of time spanned by the "wait" function depends on my framerate, the rate at which I can fire my weapon also depends on framerate. This is very annoying. I would like to cap my framerate at around 75 FPS so that it stays constant. Some older Valve games offer console commands like fps_max, but that is not the case here. I can limit my FPS to 60 by using Vsync, but I can't stand the input lag when running at 60 FPS. Is there any third party software I can use to limit my frames to 75 FPS? I've done quite a bit of googling and can't seem to find anything. Thanks for the help.

input lag will not change regardless of FPS, it's a characteristic of the monitor
given that there's a very good chance you're using an LCD, the extra FPS aren't even capable of being displayed

in other words, use vsync, it does what you want, and will not hurt performance

Auto fire scripts?

you know, so you can TK more people faster :p

You could try setting your monitor's refresh rate to 60 as well, but I've still had input lag even in that scenario.

again, input lag is independent of framerate, it's a characteristic of the monitor itself, or the HIDs (it should never be an issue on this end, even with 1990's era junk)
 
Never understood why anyone would disable vsync for any reason other than benchmarking. It improves visual performance (no tearing), reduces the load on GPU, and has no downside.
 
Never understood why anyone would disable vsync for any reason other than benchmarking. It improves visual performance (no tearing), reduces the load on GPU, and has no downside.


$50 says this thread turns into "I CAN SEE 500 FPS IT IMPROVES GAMING IM A GAMER YOU DONT KNOW YOURE INFERIOR" in the next few posts

hence, yes, vsync is the devil, so is anything else you enable that doesn't let it run at 999999999 FPS (and somehow, you know, this is all displayed on a 60hz monitor over an interface that has bandwidth limitations that impose a hard limit of something like what? 240 FPS?)
 
Never understood why anyone would disable vsync for any reason other than benchmarking. It improves visual performance (no tearing), reduces the load on GPU, and has no downside.

It can noticeably increase input lag.
 
I notice a big difference between having vsync on and off. Fast paced games really suffer with it on.
 
$50 says this thread turns into "I CAN SEE 500 FPS IT IMPROVES GAMING IM A GAMER YOU DONT KNOW YOURE INFERIOR" in the next few posts

hence, yes, vsync is the devil, so is anything else you enable that doesn't let it run at 999999999 FPS (and somehow, you know, this is all displayed on a 60hz monitor over an interface that has bandwidth limitations that impose a hard limit of something like what? 240 FPS?)

Become informed before trolling.

While enabling vsync does fix tearing, it also sets the internal framerate of the game to, at most, the refresh rate of the monitor (typically 60Hz for most LCD panels). This can hurt performance even if the game doesn't run at 60 frames per second as there will still be artificial delays added to effect synchronization. Performance can be cut nearly in half in cases where every frame takes just a little longer than 16.67 ms (1/60th of a second). In such a case, frame rate would drop to 30 FPS despite the fact that the game should run at just under 60 FPS. The elimination of tearing and consistency of framerate, however, do contribute to an added smoothness that double buffering without vsync just can't deliver.

Input lag also becomes more of an issue with vsync enabled. This is because the artificial delay introduced increases the difference between when something actually happened (when the frame was drawn) and when it gets displayed on screen. Input lag always exists (it is impossible to instantaneously draw what is currently happening to the screen), but the trick is to minimize it.

-Anandtech, Triple Buffering: Why We Love It.
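To put rough numbers on the quantization described above (my own back-of-the-envelope, not from the article): with plain double buffering, render times get rounded up to whole refresh intervals.

Code:
// With double-buffered vsync on a 60 Hz panel, a finished frame waits for the
// next refresh tick, so render times round up to multiples of ~16.67 ms.
// A 17 ms frame therefore displays at 30 FPS instead of ~59 FPS.
#include <cmath>
#include <cstdio>
#include <initializer_list>

int main() {
    const double refreshMs = 1000.0 / 60.0;            // 60 Hz refresh interval
    for (double frameMs : {10.0, 16.0, 17.0, 25.0}) {  // hypothetical render times
        double ticks = std::ceil(frameMs / refreshMs); // refresh ticks consumed
        std::printf("%.0f ms render -> one frame every %.1f ms (%.0f FPS)\n",
                    frameMs, ticks * refreshMs, 1000.0 / (ticks * refreshMs));
    }
    return 0;
}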
 
Not being a serious gamer, that does make sense. I guess this is an issue with the game engine architecture. I would surmise that it's possible (and probably not all that difficult) to overcome these issues with a modified architecture, and probably some games do this already - but I see the issue.

Still, if that is the issue with vsync, would it not apply to limiting the framerate some other way as well?
 
I find turning on VSync in Source engine based games tends to add input lag, as the OP states, which personally made me a bit suspicious of turning on VSync for other games. *Shrug*
 

don't be a dick (in general, kthx), that's unrelated to the actual thread's question, but if you really want to argue temporal characteristics and persistence of vision

the monitor itself is not going to add or lose any input lag with vsync on or off, if it's 30 ms today, it will be 30 ms tomorrow, 30 ms the day after, and 30 ms the day after that (you'll of course argue this point, because obviously anything I say is "trolling" (son, you don't know what trolling is))

as far as anand's article:
first, most obnoxious point: they seriously ripped half of this from wikipedia, yet another reason I hate anand (now, this doesn't mean I'm not going to read it, consider it, and respond, I'm just saying "wow, this is seriously just a rip from wikipedia to create more content")

actual issues with the article/thoughts:
- it assumes TMB does not/cannot exist: it does, and can

- human response time is FAR greater than 3.3ms, to the point that you generally aren't gonna notice it (it's really just too minute given that we aren't talking DRAMATIC contrast per frame change (if it was alternation of random 180* contrasted colors or something similarly ridiculous, yeah you'd probably pick out a few of them, but frame to frame in a normal video game, not very much is changing (and anand does actually bring this up)))

- there is actually nothing wrong with vsync OR triple buffering, neither is somehow "inferior", and you aren't going to get "ZOMFG EVIL 90000 SECOND LAG THAT KILLS US ALL" (and the article actually points to vsync as providing higher quality)


so like I said:
$50 says this thread turns into "I CAN SEE 500 FPS IT IMPROVES GAMING IM A GAMER YOU DONT KNOW YOURE INFERIOR" in the next few posts

although I will go back and amend that, adding:
if not 500 FPS, some obnoxious whining argument about ~10ms (worst case scenario) having some overt and direct effect on gaming

oh, did I get to the part where the anand article is using a 300 FPS input, and not the average 60-70 FPS input that you'll likely have from real-time rendering, and where the performance hang-up is really only a problem if the system can't hit the "bar" that vsync sets (be it 30 or 60 FPS)? they do mention "delays added to effect synchronization", but never actually specify what kind of delay we're talking about, given the abilities of modern equipment, I'm gonna venture it's in the pretty-damn-quick arena (would the term "statistically irrelevant" mean anything to you?)

yeah, it's got a few sound points, but nowhere does it say that vsync will somehow destroy performance or create some "unplayable" level of input lag

it does mention that vsync improves visual quality, and that triple buffering is "the best of both worlds", it's a solid compromise

all of that aside (and here's the kicker):
this thread was actually looking for a way to cap FPS, not argue over what vsync may or may not do for your blessed, obviously superior eyes, which are somehow capable of detecting half-shade or quarter-shade differences on images displayed for less than 20ms (at absolute most) on a consistent level

vsync actually accomplishes the OP's needs, and looks fine, since the feature he wanted from the Source console is disabled, I'm guessing any sort of mod/add-on to the game would count as a hack or exploit (and it sort of is, knowing his usage scenario (I honestly don't doubt it's a useful feature))

this takes me to something I should've said in the first place:
you need to write code that uses a non-variable timing scheme as its base, because what you've got now is like games from the 1980's that assumed the CPU ran at exactly X MHz, and if it deviated from that, you got erratic performance
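to be clear about what I mean by a non-variable timing scheme, the usual shape is a fixed-timestep loop, something like this (a rough sketch of the idea only, obviously not anything you can drop into Source):

Code:
// Sketch of a fixed-timestep loop: game logic (including any fire-rate timers)
// advances in constant 10 ms steps no matter how fast frames are rendered.
#include <chrono>

static void updateGame(double /*dtSeconds*/) { /* advance logic, timers, etc. */ }
static void renderFrame()                    { /* draw the current state */ }

int main() {
    using clock = std::chrono::steady_clock;
    const std::chrono::milliseconds step(10);     // fixed logic tick, 100 Hz
    auto next = clock::now() + step;
    for (int i = 0; i < 600; ++i) {               // bounded so the sketch terminates
        while (clock::now() >= next) {            // catch up in fixed increments
            updateGame(0.010);                    // always the same dt
            next += step;
        }
        renderFrame();                            // render at 30 or 300 FPS, logic doesn't care
    }
    return 0;
}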

Not being a serious gamer, that does make sense. I guess this is an issue with the game engine architecture. I would surmise that it's possible (and probably not all that difficult) to overcome these issues with a modified architecture, and probably some games do this already - but I see the issue.

Still, if that is the issue with vsync, would it not apply to limiting the framerate some other way as well?

well, just "cutting off" every frame over a certain point (more or less what the OP wants) is going to look like crap, you'll get all of the tearing and none of the performance, it's like the worst case scenario

in order to cap the FPS, you have to clock to something, be it external like a genlock system, or internal, like vsync (which is technically external, in that the rendering system is clocking with the display device), this is going to incur some sort of delay, as has been mentioned (and this is apparently unacceptable under any circumstances :rolleyes:)

so the triple buffering option doesn't really lock you to a given FPS rate (anand is so wonderfully vague on that point as well, don't you love quasi-sensationalist writing?), although it may overcome some perceived issues with vsync (and there's nothing wrong with that, just don't argue with me that you're somehow superman because 2ms vs 4ms is "night and day" for you)

basically either OP goes with vsync or similar, or learns to write code that doesn't rely on an independent variable for its timing scheme (and since this code isn't required to exist, I couldn't care less if this is also taken as "unacceptable under any circumstances", you aren't even doing something trivial like a professional video display system, let alone something important like medical equipment)
 
I notice input lag on source games as well with vsync on. It's there, and it definitely goes away when I disable it.
 
I can't stand playing with Vsync on and yes, it's triple buffered. Yes, my monitor is locked at a refresh rate of 60Hz so no, I can't explain the input lag that I experience. But it's there, so again, using Vsync is not an option.

Now will someone please answer my original question??
 
I thought just triple buffering introduced lag, I assume I'm wrong but thx for the possible correction.
 
FYI to the OP: Source-based games still have the "fps_max" command. So just set "fps_max 60" or whatever you want it to be.

Unless, for some reason, this doesn't achieve your goal... :confused:
 
don't be a dick (in general, kthx), that's unrelated to the actual thread's question, but if you really want to argue temporal characteristics and persistence of vision

the monitor itself is not going to add or lose any input lag with vsync on or off, if it's 30 ms today, it will be 30 ms tomorrow, 30 ms the day after, and 30 ms the day after that...

And if I said the sky was blue, I'd bet you're the kind of person who would tell me that it is actually "azure." OK OK, you're smarter than me :rolleyes:

Go validate your fucking ego somewhere else. No one said they're any better than you at detecting infinitesimal temporal changes, they just said they notice a "difference" that negatively affects their twitch gameplay abilities when vsync is enabled. I highly doubt their EYES could tell the difference, they just FEEL a difference. Input lag due to vsync really can affect your aim, and it doesn't take some superhuman ubergamer to notice it. And, since you admittedly can't tell a difference, maybe you are inferior. Maybe go see a neurologist, hmmm?
 
input lag will not change regardless of FPS, it's a characteristic of the monitor
again, input lag is independent of framerate, it's a characteristic of the monitor itself, or the HIDs (it should never be an issue on this end, even with 1990's era junk)

+1 for you're screwed. ;)
 
Here you go: http://www.kn00tcn.net/FPSLimiter.rar

run the .jar, add an fps cap to a DX8/DX9/OGL game (like l4d, which dropped fps_max)

I used this program in Fallout when I unlocked the FPS, to keep it smooth without letting it run super fast, and in other games where my card squeals due to high FPS. This gets rid of that noise :D
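If you're curious how these limiters work, the core idea is just to burn off whatever is left of the frame budget before each present call, roughly like this (my own sketch of the general technique, not FPSLimiter's actual code):

Code:
// Sketch of a frame cap: sleep away the remainder of the frame budget each
// frame, so frames can never complete faster than the target rate.
#include <chrono>
#include <thread>

class FrameLimiter {
public:
    explicit FrameLimiter(double targetFps)
        : budget_(std::chrono::duration_cast<std::chrono::steady_clock::duration>(
              std::chrono::duration<double>(1.0 / targetFps))),
          deadline_(std::chrono::steady_clock::now() + budget_) {}

    // Call once per frame, e.g. right before the game's Present()/SwapBuffers().
    void waitForNextFrame() {
        std::this_thread::sleep_until(deadline_);
        deadline_ += budget_;
        auto now = std::chrono::steady_clock::now();
        if (deadline_ < now)                  // fell behind: reset instead of bursting
            deadline_ = now + budget_;
    }

private:
    std::chrono::steady_clock::duration   budget_;
    std::chrono::steady_clock::time_point deadline_;
};

Presumably the injected limiter does something along those lines once per frame, with whatever cap you enter (75 in the OP's case).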
 
I thought just triple buffering introduced lag, I assume I'm wrong but thx for the possible correction.

V-sync causes input lag by holding the completed frame in the buffer till the monitor says it is ready for another, then it sends the last completed frame, regardless of how many frames it was able to complete. Without vsync it just sends each frame when it is completed. All else being equal it adds input lag, even when you have a monitor panel that can only refresh at 60Hz. Some people are sensitive to it, and others are not.
 
V-sync causes input lag by holding the completed frame in the buffer till the monitor says it is ready for another, then it sends the last completed frame, regardless of how many frames it was able to complete. Without vsync it just sends each frame when it is completed. All else being equal it adds input lag, even when you have a monitor panel that can only refresh at 60Hz. Some people are sensitive to it, and others are not.

I don't think this is it. If the engine draws a frame every 5ms say (200fps), the frame that gets drawn is still a maximum of 5ms 'old' when it's sent to your monitor. Sending on average half of that frame and half of a 5ms newer one wouldn't seem to cause what people are describing as input lag. If you had made an input to the game in one of those intervening frames, it's going to be there in either case, the only time it will make a difference is if the event happens in the 5ms while the previous frame was being drawn. Otherwise you don't see your input until the monitor is ready for another frame anyway, so I don't see how you'd notice a difference, and even if you could, the situations where it would happen would be fairly rare (ie. 1/100 inputs or such).

My suspicion is that some game engines (like Source, apparently) use the age-old technique of synchronizing the whole engine to the output display. Sampling input, sending network packets and drawing frames all as part of the 'do for every frame' process. This would mean that in between frames the engine is effectively 'dead' and not accepting input at all, which means that in the worst case it could take up to 17ms to register an input and send it off to the network, which I definitely think could be noticeable in games like CS and TF2 (rough sketch of what I mean at the end of this post).

Way back in my CS 1.6 days I remember people running at 640x480 or 800x600 for exactly this reason. Higher FPS = more network updates = less lag between you and the server.

If that's accurate though, any way you limit the FPS, it's going to cause the same effect (though you can control the severity).
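To make the 'dead between frames' idea concrete, this is the kind of single-threaded loop I'm picturing (pure guesswork about Source's internals, just the classic shape of such an engine):

Code:
// If input is only sampled once per frame loop, an input that arrives just
// after the sample waits a full frame (about 16.7 ms at 60 FPS) before it is
// even seen, let alone sent to the server.
#include <chrono>

static void sampleInput()       { /* read mouse/keyboard state */ }
static void runGameLogic()      { /* movement, shooting, etc. */ }
static void sendNetworkUpdate() { /* pack and send a client update */ }
static void renderFrame()       { /* draw the scene */ }

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameTime = std::chrono::microseconds(16667);   // capped at ~60 FPS
    for (int frame = 0; frame < 600; ++frame) {
        auto frameStart = clock::now();
        sampleInput();            // inputs that arrived between frames wait until here
        runGameLogic();
        sendNetworkUpdate();      // so a lower framerate also means later packets
        renderFrame();
        // 'Dead time': nothing else is processed until the next frame begins.
        while (clock::now() - frameStart < frameTime) { /* wait out the frame */ }
    }
    return 0;
}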
 
And if I said the sky was blue, I'd bet you're the kind of person who would tell me that it is actually "azure." OK OK, you're smarter than me :rolleyes:

Go validate your fucking ego somewhere else. No one said they're any better than you at detecting infinitesimal temporal changes, they just said they notice a "difference" that negatively affects their twitch gameplay abilities when vsync is enabled. I highly doubt their EYES could tell the difference, they just FEEL a difference. Input lag due to vsync really can affect your aim, and it doesn't take some superhuman ubergamer to notice it. And, since you admittedly can't tell a difference, maybe you are inferior. Maybe go see a neurologist, hmmm?

yeah, call me inferior, win the argument, you're so pro (I also didn't say I'm good or bad at detecting input lag, although from most response tests, I'm about average (~245-255 ms response time, depending on how awake I am, if I've had my daily coffee or juice, etc))
and FYI I'm not validating my ego, I'm stating fact

one thing I meant to add, but couldn't (was called away from my desk):
the anand article assumes that the frames are actually 1/60th slices of time, in reality they're instantaneous, it's sort of like 1/0 (the easiest way I can describe it, and that isn't accurate), Beyond3D has about four articles on the topic (if you're interested, I can dig most of them up)

the point is, vsync will give you the FPS limiting, and shouldn't impact performance as dramatically as some are claiming, yes triple buffering is probably going to get you a solid 60 FPS or 120 FPS to sync with your monitor in an "easier" fashion, vsync is probably higher IQ, and shouldn't be killing you with input lag (again, it's like 10 ms, just because you "feel" something doesn't make it a tangible reality, placebo effect is widespread though)


I don't think this is it. If the engine draws a frame every 5ms say (200fps), the frame that gets drawn is still a maximum of 5ms 'old' when it's sent to your monitor. Sending on average half of that frame and half of a 5ms newer one wouldn't seem to cause what people are describing as input lag. If you had made an input to the game in one of those intervening frames, it's going to be there in either case, the only time it will make a difference is if the event happens in the 5ms while the previous frame was being drawn. Otherwise you don't see your input until the monitor is ready for another frame anyway, so I don't see how you'd notice a difference, and even if you could, the situations where it would happen would be fairly rare (ie. 1/100 inputs or such).

My suspicion is that some game engines (like Source, apparently) use the age-old technique of synchronizing the whole engine to the output display. Sampling input, sending network packets and drawing frames all as part of the 'do for every frame' process. This would mean that in between frames the engine is effectively 'dead' and not accepting input at all, which means that in the worst case it could take up to 17ms to register an input and send it off to the network, which I definitely think could be noticeable in games like CS and TF2.

Way back in my CS 1.6 days I remember people running at 640x480 or 800x600 for exactly this reason. Higher FPS = more network updates = less lag between you and the server.

If that's accurate though, any way you limit the FPS, it's going to cause the same effect (though you can control the severity).

higher FPS does not mean lower lag or have any correlation to network performance, if the network has a 40ms delay, it doesn't matter if your system gets 3 or 3000 FPS, it's gonna be waiting on the network (think about people with GTX 295's and crap DSL playing games, the online experience sucks, even though the system can hack it)

the point behind running lower resolution is to ensure you have a consistent framerate, if you drop to unplayable slideshow levels when you walk into a new cell, just because some lighting effect has to be loaded, that's kind of a worthless reason to die :)
 
Did you read and understand my post? I said that processing inputs once per frame and doing so at a slower rate delays them more before they hit the network at all, contributing additional lag. I didn't even mention network latency at all...

The folks I was referring to were hardcore gamers with heavy-duty rigs that could easily render 50-60fps or better all the time at much higher resolutions.
 
Did you read and understand my post? I said that processing inputs once per frame and doing so at a slower rate delays them more before they hit the network at all, contributing additional lag. I didn't even mention network latency at all...

The folks I was referring to were hardcore gamers with heavy-duty rigs that could easily render 50-60fps or better all the time at much higher resolutions.

the rendering output is independent of the network, generally speaking (it's complex)

running higher resolution, as long as it doesn't grind the system to a halt, will not influence performance negatively, if the system can do it

e.g: you get 60 FPS at 1920x1200 full max vs 120 FPS at 800x600 full minimum
either is going to be very enjoyable/playable, the 1920x1200 will just look a lot nicer, but it isn't gonna "lag more"

if you really nitpick about the theoretical half an ms here, go for it, but unless your selected settings are really crushing performance, there's no problem
 
I'm suggesting a possible mechanism behind the lag that people apparently experience, which is apparently real. I'm not trying to engage in empty 'its impossible, they're idiots because x' hand waving.

The architecture of every engine is obviously going to be different, but I know that the Quake 2 engine and derivatives (ie. Half-Life and mods) send network updates once per frame and basically operate with a single main loop that performs all tasks in a certain order. I imagine this architecture is probably still fairly common, and it wouldn't surprise me at all if the HL2 engine was extremely similar in basic architecture.

http://developer.valvesoftware.com/...rver_In-game_Protocol_Design_and_Optimization describes how the HL1 engine operates, and clearly indicates that there is a main 'frame loop' in which all processing is performed. Limiting the speed of this frame loop means you delay input processing by necessity as you wait for the next frame time to start. I believe this delay could probably be significant enough to be noticeable, on the order of 10ms @ 60Hz.
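For a rough sense of scale (my own arithmetic, not from the Valve article): if inputs are only picked up once per frame loop, the average added delay is about half a frame interval and the worst case is a full one.

Code:
// Extra input-processing delay when inputs are sampled once per frame loop:
// worst case is about one frame interval, average about half of one
// (assuming inputs arrive at random points within the frame).
#include <cstdio>
#include <initializer_list>

int main() {
    for (double fps : {30.0, 60.0, 100.0, 300.0}) {
        double frameMs = 1000.0 / fps;
        std::printf("%3.0f FPS loop: worst %.1f ms, average %.1f ms extra delay\n",
                    fps, frameMs, frameMs / 2.0);
    }
    return 0;
}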
 
although from most response tests, I'm about average (~245-255 ms response time, depending on how awake I am, if I've had my daily coffee or juice, etc)

My eyes are too fatigued to read through a thread of material with which I'm already quite familiar, but this caught my attention.

That time actually seems quite slow. My slowest friend gets about 225ms on average. I remember getting 140-190ms, with about 160ms mean (but I'm slightly better than average).

Maybe the test you're taking is more difficult, however. I've taken a few different types of tests, and on the worst of them I averaged about 180ms.

My times were measurably worse with the combination of a wireless mouse and older LCD.
 
My eyes are too fatigued to read through a thread of material with which I'm already quite familiar, but this caught my attention.

That time actually seems quite slow. My slowest friend gets about 225ms on average. I remember getting 140-190ms, with about 160ms mean (but I'm slightly better than average).

Maybe the test you're taking is more difficult, however. I've taken a few different types of tests, and on the worst of them I averaged about 180ms.

My times were measurably worse with the combination of a wireless mouse and older LCD.

yeah, I know there's different tests, not sure what the variance here is a result of, I've seen the average stated numerous times at 250ms, and I will average that 245-255 range

so I don't know, perhaps my response time is slower than the average "gamer", I'd be interested in seeing what you're looking at (test wise) though :)
 
i may have missed something...would not be the first time. But, is it not possible to lock your frame rate to the refresh rate of your monitor?
 
Hmm, I wonder how much the input lag between light hitting our retina and our brain processing the information affects game play :)
 
i may have missed something...would not be the first time. But, is it not possible to lock your frame rate to the refresh rate of your monitor?

:eek: hahahahaha


Hmm, I wonder how much the input lag between light hitting our retina and our brain processing the information affects game play :)

OH NOES, WE MIGHT BE LOSING SOME PERFORMANCE IF WE SIT TOO FAR AWAY, quick, everyone get an HMD or you'll be too sluggish in games
 