What's an acceptable frame rate?

Acceptable frame rate?

  • 100+ FPS: 23 votes (4.4%)
  • 60+ FPS, no dips below 60 FPS: 109 votes (20.7%)
  • 60 FPS, occasionally below 60 FPS: 122 votes (23.2%)
  • 45+ FPS, no dips below 45 FPS: 82 votes (15.6%)
  • 45 FPS, occasionally below 45 FPS: 77 votes (14.6%)
  • 30+ FPS, no dips below 30 FPS: 113 votes (21.5%)

  Total voters: 526
I find it funny how this type of thread simply turns into a bragging session for somebody's "god system". I for one run a 2800+ Athlon64 and a Radeon 9700 Pro, and I'm happy with it. Would I like to own an X800XT? Yes. Do I need to own an X800XT to have fun in games? Sometimes. Can I live with my 2 year old video card? Yes.
 
theelviscerator said:
well your system is slower than mine and i find your numbers a bit much.

I noticed you removed your system specs from your sig. I believe you had a slightly faster processor and a slightly slower video card, so results should be comparable, I would think....

Anyway, I ran some tests last night, and I determined that I am basically wrong, but my error was based on forgetting a couple of vital facts.

In UT I got 100 fps with mid-60s minimums because I was only playing a 1-on-1 deathmatch the other day when I was watching FRAPS. Obviously in 1-on-1 you get much better results. Last night I ran a couple of 3-on-3 DMs and got 69 fps avg. on one map and 83 fps avg. on another (rrajgir(sp?) and Corrugation respectively). I also got 104 on Idoma in a 1-on-1 game, so I was right about what I had been getting, but I also hadn't been performing a proper test, so the results were not typical. The game is still totally playable with max settings and 16x12x4x8 though.

In Far Cry with Medium shadows I ran many tests. The worst was when I spent most of the test in the water approaching the carrier (after driving a jeep over a cliff, YAHOO!); I only got about 40 fps average (27 min). The results were better in the rest of the tests. I tested 4 other times, 3 times on the carrier, and one other time in the training scenario. In those 4 tests I got averages between 45 and 55 fps (32 min, 18 with binoculars in use).

However, with Very High shadows I had about a 25 to 30% drop across the board and averaged between 35 and 42 fps (25 min, 15 with binoculars in use) in the same areas of the same levels (didn't redo the water one).

Anyway, I admit I was basically wrong about the overall FPS I could get with TRUE maxed settings. However, I am still able to play both games with ease at 16x12x4x8 with only 1 thing turned down in Far Cry and nothing turned down in UT, and that seems pretty damn awesome to me.

(BTW, I am not trying to brag about my system. However, I do have a fast one, and I want people to know what they can expect if they decide to build a system around the level of mine).
 
i chose the "45+ w/ occasional dips below"... 60+ would be wonderful, but really, i have lived with a crappy system for a long time now and there is nothing wrong with an fps lower than 60; it is perfectly playable IMO IMO IMO!!! ... And if you want the maxed out super crazy details and flashiness of games like Far Cry, you might just have to get used to sub-60 performance, because
1: it's not really that bad, you elitist bastards :)
2: Unless you're filthy rich and grab yourself some SLI action, there isn't much hope of you running ridiculously beautiful games like Far Cry (maxed out gfx) at 60+ fps all the time.


...however, i wonder if i will eat my words later, because some games have much more "efficient" gfx engines than others and may be able to produce killer graphics at high fps...we'll see ;)
 
Bad_Boy said:
guys, calm down, i think AcneBrain was joking lol.

I guess the caps lock and bad grammer wasn't enough. Here's the official sarcasm smiley. :rolleyes:
 
AcneBrain said:
I guess the caps lock and bad grammer wasn't enough. Here's the official sarcasm smiley. :rolleyes:

lol!

Personally I'm fairly happy with 45+... can I tell the difference between it and 60... well yes, but it just isn't that big of a deal for me... I'd rather run the game with more visual candy on a cheaper system than have the extra smoothness.
 
Higher is of course better, but I'm only unhappy when the framerate drops below 30.
 
i've dealt with my fair share of shitty ass graphic cards (and being too young to do anything about it), so anywhere between 30-45 is fine with me...

however I will of course try to get every last frame possible, but 30-45 is adequate
 
I think a lot of people are voting for what they want and not the lowest that they would accept.
 
30 fps since I would never want to sacrifice the IQ to the level I would need to be able to play at 60 fps.

Even if I can get 60 fps in Far Cry at 1280x960 with my future setup in my sig, I will still play it at 1600x1200 if I get over 30 fps at that resolution.
 
krizzle said:
Now this is what I always thought the vid card companies should figure out... Realistic Motion Blur for in-games. then, they could keep it at about 30fps and let the rest of the beast work on rendering other things, e.g. IQ settings.

Whenever Kyle brings up 3dfx in an article, it's always about their supposed FPS-mania legacy, but towards the end they were doing just what you describe. Through the use of their "T-Buffer" they added some functionality to Glide 3.x for motion blur, IIRC. The game had to be written for Glide and the object actually had to be tagged for it (like tagging for edge AA), but the point is that the concept is not at all alien. I imagine nVidia now owns that piece of IP...

Now, until someone comes up with a good general solution that can be applied to any 3D scene, it could probably be done using a pixel shader or equivalent.
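To make the accumulation idea concrete, here is a rough conceptual sketch in Python. It is not how the T-Buffer or any real driver/shader implements motion blur, and the toy "renderer" is invented purely for illustration: blend several sub-frames spread across one shutter interval, and fast-moving objects come out smeared.

Code:
# Conceptual accumulation-style motion blur: average several sub-frames
# rendered at slightly different moments within one shutter interval.
# The "renderer" below is a made-up stand-in, not any real engine.
import numpy as np

def render_subframe(t):
    """Stand-in renderer: a bright 'object' moving across a dark frame."""
    frame = np.zeros((64, 64), dtype=np.float32)
    x = int(t * 240) % 60            # object position depends on time t
    frame[28:36, x:x + 4] = 1.0
    return frame

def motion_blurred_frame(t, shutter=1.0 / 30, samples=8):
    """Blend `samples` sub-frames spread across the shutter interval."""
    subframes = [render_subframe(t + shutter * i / samples)
                 for i in range(samples)]
    return np.mean(subframes, axis=0)  # equal-weight accumulation

blurred = motion_blurred_frame(t=0.5)
print(blurred.max(), blurred.mean())   # the moving object ends up smeared and dimmed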
 
I don't know about you guys, but I would set my graphics options to MINIMUM if it took that much to run it at 60+ fps.
 
30FPS with no dips below. What annoys me more than a low sustained frame rate is the game freezing for a few seconds, or dropping down to like 5FPS for a short period of time. If it does that, I just can't play.

30FPS looks good for what I'm playing right now (Simcity 4). Simcity 4 has an added issue: if you build a large city, it begins to look like a slideshow, because it has to do a lot of processing for the extra people.

For Counter-Strike, I let it run at 100FPS. I tried dropping it down to 60FPS (fps_max 60.0), and it began to look terrible, even though my monitor is set at 60Hz. No clue why. :confused:

When HL2 comes out, I want to be able to do at least 40-45FPS. Any less, and I'd probably have difficulty enjoying it. All the eye-candy isn't worth it for me if it looks like a damn slideshow.

Just to help put this in perspective, I always run at 1024X768 (a lot of experimenting led me to use that) and right now I'm running at 60Hz. I haven't really screwed around with my refresh rate, though.
 
lorcani said:
Just to help put this in perspective, I always run at 1024X768 (a lot of experimenting led me to use that) and right now I'm running at 60Hz. I haven't really screwed around with my refresh rate, though.
You might want to run at a higher refresh rate. 60 Hz can be really bad on the eyes.
 
tranCendenZ said:
I hope HardOCP is looking at this poll :)

agreed
:)

obviously this thread has proved that many people have different tastes in minimum fps.
 
Well, the acceptable frame rate for a game depends a great deal on which game you're playing and at what level of play you're playing it.

If we were just talking about the basic smoothness of the graphics and movement, I think I would settle at around 85-125 fps in the games I play now.

All these people quoting that the human eye cannot see anything above 24 or 30 or 60 or whatever aren't taking into account that this is only a baseline for how many frames are required for the human mind to merge them into a moving image. It does not mean that more would not be better.

I have played Q2 and now Q3 for years, and there are certainly HUGE differences between low frame rates and high frame rates. Often it's just a 'feel' you get for a game at certain rates.

There is also the issue of having to play many games at extremely high fps for the engine to run 'better' at those fps. This again is the case with the Quake 3 engine (it may just be the case on certain mods), but at 125 fps in Quake 3 the engine seems at its smoothest, and due to known physics issues with the game engine, jumps can be made slightly easier, etc.

So tbh the most acceptable framerates for me are the ones which make the game its smoothest and where the engine performs the best. I do hope the rumour of the Doom 3 engine being locked to a fixed fps is true, as it's one other way to level the playing field - rather than having some 'secret tweaks' available to those in the know which make it easier for them to perform.
 
I think it depends a lot on the game and the engine. For instance, 30 FPS can be acceptable for many titles...Splinter Cell, Morrowind, etc. But it's nice to have 60+ in other games, especially fast-paced ones like UT. Add to that the fact that everyone has their own personal opinion about framerates, and you come to the conclusion that there is no one solution.

The idea that the human eye can only see 24 FPS is completely incorrect. It's possible for 24 to look fluid, but only because of some form of blurring (movies) or certain content (like cartoons) that seems natural in that state. Games (and monitors) tend to be much more noticeable (especially in peripheral vision). People also make the mistake (in North America anyway) of thinking that TV is at 30 FPS and hence the eye can only see that much. In fact, TV draws 60 interlaced fields per second (NTSC) or at least 50 (PAL).

Now, that makes a distinction between frames per second and refreshes per second. And, yes, the human eye can see 30 frames a second (which is why monitors typically start at 60 Hz, or 60 refreshes). But the human eye can also spot differences and movement, so having FPS in line with refreshes is optimal...hence where the idea of 60 FPS being ideal originated (not to mention V-Sync). And of course the human eye can recognize not two but up to three times its frames in refreshing (TAA comes into play here), or 90 Hz, and again having it in line means 90 FPS. This is because your eyes are not going to be perfectly in sync with the refreshes and hence can see partial refreshes up to 3 parts significantly.

But really, you will only technically perceive the amount of refreshes, not the FPS, so you're limited by your refresh rate (although w/o V-Sync you can perceive up to 180 FPS theoretically, but only 90 full changes per second). So people who say they can "see" more than 90 are not actually gaining any visual benefit, and people who say they can "see" FPS greater than their refresh rate are in the same boat. Eyes do differ from person to person, but only 2x-3x refresh and 1x-3x frames (60-90 Hz, 30-90 FPS). Ultimately, therefore, the ideal situation is 85 or 100 Hz with V-Sync enabled and maximum FPS.

*edit*
I should also mention that movement (as I mentioned above) can affect how smooth something looks, but it's all related. Hence changes in FPS can make something look less smooth, but that's an absolute effect (it is noticeable equally regardless of the FPS level).
 
In FPS games I try to stay at a constant 85 FPS since I run vsync. With games like CoH and other RPGs, anything over 20 FPS is OK for me.
 
kick@ss said:
You might want to run at a higher refresh rate. 60 Hz can be really bad on the eyes.

I thought that high refresh rates were bad on the eyes?
 
I thought that high refresh rates were bad on the eyes?

My understanding is that the higher the frame rate, the better. Last time I checked, the human eye worked at about 12 fps. Movies are shown at 24 fps to fulfill the Nyquist-Shannon Sampling Theorem. Below this, things will look choppy. But as the frame rate goes up from there, one's eye blends the images together to make a smoother image. 60Hz is the exception because it creates a beat frequency with the power coming out of the wall.

Am I to understand that about half of you out there wouldn't play a game if the frame rate dropped below 60 fps at any point in time? It seems to me that a lot of people are confusing the words acceptable and desired. To me, acceptable means the bare minimum or I don't play it. For me, if the frame rate drops below 30 fps every now and then, I need to lower the settings.
 
cabbage329 said:
My understanding is that the higher the frame rate, the better.

Interesting. Will it make a difference on how smooth things look if I bump up the refresh rate to 85Hz (the max supported by my monitor)?

As a side note, how safe is it to over-ride the max refresh rate of a CRT?
 
lorcani said:
Interesting. Will it make a difference on how smooth things look if I bump up the refresh rate to 85Hz (the max supported by my monitor)?
As long as that doesn't cause the AGP bus to be saturated by all the dynamic data sent to the vid card each frame; when that happens, your per-frame cursor position synchronization data will start causing lag and stuttering due to data transport contention.

Canned graphics sequences, as in typical benchmarks, need less dynamic data and will perform better. Flybys are usually best: completely hands-off, no dynamic data, with everything else stored in graphics memory, they are PERFECT..

For real games with real internet connections, real cursor positions, and real intensive sound it's not so hot, as all of them add more to the data transport contention.
As a side note, how safe is it to over-ride the max refresh rate of a CRT?
Not good for your monitor at all. The common symptoms for newer monitors are blanked-out screens and out-of-range errors; older ones usually die in short order, if not an immediate death.

You should read this pertinent link.
 
For all you people who think humans can't detect more than 24 FPS - or even 60 FPS....

WHAT A COMPLETE CROCK!

Personally, I can detect a flicker at 70 Hz (But not at 72Hz). Yes - that's refresh rate - not FPS - but the principle applies. Look - the limit of human perception is at or around 1/200th of a second - and probably lower.

Trained fighter pilots can not only identify an airplane with a single exposure of 1/200th of a second, but can also identify the TYPE of airplane in that 1/200th of a second.

So - you are DEAD wrong about 24 FPS or even 60 FPS as being the limit of detectability.

I have spent an awfully long time on this stuff - and you can read my page at http://planetdescent.com/d3help/framerate.shtml -- you might learn something.

My personal experience is that frame rates above 100 FPS provide the best gaming experience. Drops to 60 FPS are very noticeable (Even with 4x FSAA and V-Sync on) to me.

I don't seem to be able to detect differences in performance or game play (Response times to commands) above 100 FPS however.

Framerates of 30 FPS or 24 FPS are only acceptable if the source frames are very blurred (like TV), or the movement on screen is very slight, or very slow. Images rendered by a GFX card are pretty much rock solid - no image blurring of fast-moving objects.

This, to my mind, is going to be the next big advance in driver technology: motion blurring at low frame rates. Ideally, we want rock solid images, and lots of them, but it's acceptable to blur fast objects at low frame rates.

The whole concept is moot though - because AVERAGE frame rate means NOTHING. (Neither does the median frame rate, nor the maximum frame rate). The only important figure is MINIMUM FRAME RATE.

As the [H] guys so rightly say - max rate means diddly squat - what we want to know is - "How does the game play when there are 8 opponents on screen, with 12 rockets flying, a 300,000 polygon background, and 100ms lag?"

Remember, 30 FPS online adds roughly an extra 33 ms of machine lag (100 FPS adds just 10 ms extra). Also, the packet rate of the server affects lag - in the same ratio.
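To put rough numbers on that, here is a back-of-the-envelope sketch in Python; the only assumption is that the added machine lag is about one frame time, i.e. 1000/fps milliseconds.

Code:
# Back-of-the-envelope: the extra per-frame "machine lag" at a given
# frame rate is roughly one frame time, i.e. 1000 / fps milliseconds.
def frame_latency_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 100):
    print(f"{fps:3d} FPS -> ~{frame_latency_ms(fps):.1f} ms per frame")
# 30 FPS -> ~33.3 ms, 60 FPS -> ~16.7 ms, 100 FPS -> ~10.0 ms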

So, the REAL advances are going to come when the GFX drivers are customisable, and you can select a minimum frame rate to hold for each piece of eye candy you use.

You'll set the minimum at 60 FPS, and ask to have FSAA drop by a multiple when 60 FPS is reached, and aniso to be dropped when 60 is reached, and trilinear dropped to bilinear when 70 is reached....

See what I'm talking about?

When sitting in a sniping position, with very little happening on screen, and with very little motion, the drivers would heap on the eyecandy for you, but when you get into a nasty fight with massive polygons being pushed around, the drivers will chop the candy to maintain the frame rate. (Who needs 8x FSAA when you're in the middle of an 8-way rocket fight??)
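A minimal sketch of that idea in Python, purely hypothetical: no driver of the time exposed anything like this, and the setting names, quality ladder, and thresholds below are invented for illustration only.

Code:
# Hypothetical "shed eye candy to hold a minimum FPS" controller.
# Setting names, ladder steps, and thresholds are invented for illustration.
QUALITY_LADDER = [
    {"fsaa": 8, "aniso": 16, "filtering": "trilinear"},
    {"fsaa": 4, "aniso": 8,  "filtering": "trilinear"},
    {"fsaa": 2, "aniso": 4,  "filtering": "bilinear"},
    {"fsaa": 0, "aniso": 0,  "filtering": "bilinear"},
]

def adjust_quality(current_level, recent_fps, target_fps=60, headroom=15):
    """Step down the ladder when FPS dips below target, and step back up
    when there is comfortable headroom above it."""
    if recent_fps < target_fps and current_level < len(QUALITY_LADDER) - 1:
        return current_level + 1          # shed eye candy in heavy scenes
    if recent_fps > target_fps + headroom and current_level > 0:
        return current_level - 1          # quiet sniping spot: pile it back on
    return current_level

level = 0
for fps in (90, 72, 55, 48, 75, 80):      # made-up per-second FPS samples
    level = adjust_quality(level, fps)
    print(fps, QUALITY_LADDER[level])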

This will, of course, make it impossible to objectively benchmark graphics cards - and [H]'s policy of gameplay results will be further vindicated.

The REAL proof of the pudding will be, how much eye-candy your card can support when things get hot and heavy. And your MINIMUM SHOULD BE SET AT 60 OR MORE!
 
For scripted games 30+ is enjoyable, but for online versus-type play like Unreal or CoD anything less than 60 is appalling to me and I won't play long. I spend high dollar to try to keep all of my games running 60 plus. Far Cry just won't work with me though :) but I'm still running through the scripted stuff right now. It's a pretty hard game and that makes it fun, but the frame drops I get sometimes are frustrating. Personally I think game coders should find an fps and stick with it; I think 60 fps is a good start. I'd at least like to see them keeping up with monitor refresh rates, and 60Hz is about as low as you'd want to go.
 
Interesting. Will it make a difference on how smooth things look if I bump up the refresh rate to 85Hz (the max supported by my monitor)?

As a side note, how safe is it to over-ride the max refresh rate of a CRT?

The refresh rate is the rate at which the monitor is updated. Video games create a full frame before they start a new one. Thus the fps is limited by either the video card or the rest of the computer (but not the monitor). What you should do with your refresh rate depends on other factors. You should get a program like FRAPS and see what kind of fps you are getting. But no matter what others say, the real question is, "what are you happy with?". Start with the resolution and settings high, and lower them till you like what you get.
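Since the thread keeps coming back to minimum versus average frame rate, here is what such a log boils down to, as a tiny Python sketch; the frame times below are made-up examples, not real measurements.

Code:
# Per-frame times, as you would get from a FRAPS-style log; from these
# you can compute average and (more importantly) minimum FPS.
# The numbers below are invented for illustration.
frame_times_ms = [12.5, 13.1, 14.0, 33.0, 16.2, 12.8, 41.0, 13.3]

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = min(fps_per_frame)

print(f"average: {avg_fps:.0f} FPS, minimum: {min_fps:.0f} FPS")
# A ~50 FPS average can still hide dips into the 20s.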

The monitors that I've seen lose sync if you run them too fast. But if for some reason one does work over spec, I think it would shorten its life. If you like buying new monitors, go for it. Windows can auto-detect the max rates, and if you tell Windows a higher rate, it may still run at a lower one. If you are into spending money on your computer, getting a good monitor is worth the money. I've got an NEC FP2141SB and it's worth the money to me.
 
Anything less than 100FPS is a major pain in the ass. Anyone who plays online team based FPS games will attest to this.
 
About the refresh vs flicker... on my monitors I run 60Hz and don't see it. On many of my customers' monitors, I can see flicker up to 70Hz. It depends on the vid card, RAMDAC and monitor. In most of these situations changing either the vid card or the monitor cleared up the 60Hz issue, although with every customer we just bumped the refresh.

Most of the ones I've noticed were low end ATI cards, Xpert 98 and such.
 
I get the worst headaches at 60 Hz. I would suggest setting the refresh at 72, even if you don't "see" any problems. It doesn't really affect FPS, but it can save on eyeglass prescriptions.
 
I like 60 FPS constant if possible, but 30 fps with little or no variation in frame rate is good.
 
Alright. I've bumped it up to 85Hz. The cursor feels a bit smoother, and I'll try some games in the morning. I notice that my eyes have begun to hurt just looking at 85Hz, but it's probably just adjusting. If it continues, I'll drop it down a notch, maybe to 72Hz or something.

About flickering at 60Hz: I've never seen it, not even looking at it with my peripheral vision (IIRC, peripheral vision is better than full-frontal at detecting changes in brightness).
 
30 fps is no problem for me, even lower as long as it's single player. 60+ for multiplayer.
 
lorcani said:
Alright. I've bumped it up to 85Hz. The cursor feels a bit smoother, and I'll try some games in the morning. I notice that my eyes have begun to hurt just looking at 85Hz, but it's probably just adjusting. If it continues, I'll drop it down a notch, maybe to 72Hz or something.

About flickering at 60Hz: I've never seen it, not even looking at it with my peripheral vision (IIRC, peripheral vision is better than full-frontal at detecting changes in brightness).

This is strange. If I run at a lower refresh rate, I get headaches. The higher I set it the easier it is on my eyes.
 
It really depends on the game. For FP shooters I want 40 minimum, but for games like Homeworld, framerate dips into the teens don't bother me that much.
 
lorcani said:
Alright. I've bumped it up to 85Hz. The cursor feels a bit smoother, and I'll try some games in the morning. I notice that my eyes have begun to hurt just looking at 85Hz, but it's probably just adjusting. If it continues, I'll drop it down a notch, maybe to 72Hz or something.

In my experience, 72 Hz seems to be the "sweet spot". I know everyone's different, but that seems to do the trick for me as far as eye strain goes.
 
For first person shooters... like 3dfx said, a solid 60 fps is all you want/need.
 
I am used to not having a kickass card in my computer. I buy what I need at the time to get by. My last card was a Radeon 9000 Pro 64 meg. Right now I am running a GeForce FX 5200 128 meg. I think I am gonna go back to the Radeon, I like the drivers and the card a lot more, but I digress. All I care about is that my games are smooth. I like at least 30 frames. If I can get that then I am happy. That's what I got with my old Voodoo1 and my P133...enough to make me happy. Worms, Live 2004, and Battlefield are all I care about.
 
DNAlevelC:

You know you lose money that way right?
The most cost-effective way to upgrade is to buy the absolute 2nd-to-top-of-the-line card out (after picking your brand, NV or ATI) when your last one is just barely able to play the latest game you like; e.g., buying a GeForce FX 5900 when the 5950 has been out for a month and your GeForce 2 GTS just barely creeps along in Battlefield: Vietnam.

This way you pay $370-$470 every 3 years as opposed to paying $100 every 10 months... :confused:

You know what? You suck! :mad: ... *mumble, mumble* ... *gripe, gripe* ... ;) J/K

At least this way, you get to experience top of the line graphics 18 months out of the cycle for nearly the same price instead of average graphics your whole life.
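The rough arithmetic behind that comparison, as a quick Python sanity check; the prices are the ones quoted in the post above, not current market data.

Code:
# Rough arithmetic behind the upgrade-cost comparison, using the post's figures.
years = 3
frequent_upgrades = 100 * (years * 12 / 10)   # a $100 card every 10 months
one_big_purchase  = (370 + 470) / 2           # one ~$370-470 card per cycle

print(f"cheap card every 10 months: ~${frequent_upgrades:.0f} over {years} years")
print(f"one near-top-end card:      ~${one_big_purchase:.0f} over {years} years")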
 
"bout the refresh vs flicker... on my monitors I run 60Hz and don't see it. On many of my customer's monitors, I can see flicker up to 70Hz. Depends on the vid card, RAMDAC and monitor. In most of these situations changing either the vid card or monitor cleared up the 60Hz issue, although with every customer we just bumped the refresh."

Computers at school or friends houses or whereever, I can pick out 60hz right away if I think about it, otherwise it just looks 'wrong' and eventaully i go "A HA i know why".

75hz and higher all looks the same on other comptuers
On mine, when I played UT, I could pick out 85vs100hz every time... didn't really see a difference, the game felt different.

On the desktop after awhile I sometimes notice that its 75hz and not 85. 60hz I can see right away, theres a flicker to it and seems blurry almost. 75hz has the slightest flicker, not really noticeable until I switch to 85hz and then I can't see any at all... 85hz also seems brighter somehow.

I've played around with refresh rates, my 400PS does 1600x1200 at only 75hz... and it really bugs me, I always switch betwen 1600x1200x75 and 1280x960x85 cause 1600x1200 has a flicker to it.

In FPS always >60 is the minimum, noramly meaning 80avg... it's playable at less but I'd turn off the details and/or use a lower resolution before settling with 60fps with dips to 40s.

In most other games 40-60avg is fine... dips to below 40 might be pushing it but overall dips arn't as annoying in racing sims and stuff compared to FPSs.
 