What (average) FPS is acceptable to you in games?

  • less than 15 FPS - You must be really into slideshows!: 1 vote (0.5%)
  • 15 FPS: 1 vote (0.5%)
  • 25 FPS: 1 vote (0.5%)
  • 30 FPS: 19 votes (9.1%)
  • 40 FPS: 25 votes (12.0%)
  • 50 FPS: 18 votes (8.6%)
  • 60 FPS: 75 votes (35.9%)
  • 75 FPS: 22 votes (10.5%)
  • 100 FPS: 28 votes (13.4%)
  • 120 FPS: 10 votes (4.8%)
  • 144 FPS: 7 votes (3.3%)
  • more than 144 FPS - Explain yourself!: 2 votes (1.0%)

  • Total voters: 209
I meant at 120 Hz with and without gsync. Sorry for not making myself clear enough.
Oh! I wouldn't know, actually! All my high refresh rate monitors have had GSync, and I've always used it. I don't know that I'd miss it at a steady 120fps framerate (barring any tearing, of course).
 
it said "What AVERAGE FPS" and I'm thinking lots of folks were thinking "desired minimum FPS" when they cast their votes
 
I pose this question because we often hear that consoles are capped to 30fps in many games. Then I read an article on PC games where they say "the game is unplayable at this setting" while showing an average fps of 35 and a minimum of something like 25. That's already better than the console experience, so calling it unplayable is quite unfair IMO.

Those console games have their animations all designed around a 30fps refresh rate, and resources are loaded to hit that rate. The PC games that are "unplayable" are usually a stuttering mess at those framerates because the system is being overloaded whenever it slows down that far.
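
A rough back-of-the-envelope illustration of that (made-up frame times, nothing measured from an actual game): the same "average 35 fps" can come from a run full of long hitches, which is what reviewers usually mean by unplayable.

```python
# Made-up frame times in milliseconds, just to show why an "average 35 fps"
# run can feel worse than a locked 30 fps one.
locked_30 = [33.3] * 6               # steady 33.3 ms frames = locked 30 fps
stuttery = [15, 15, 50, 15, 60, 15]  # hypothetical PC run with hitches

def avg_fps(frame_times_ms):
    # average FPS = frames rendered / total time elapsed
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

print(round(avg_fps(locked_30)))  # ~30 fps, every frame identical
print(round(avg_fps(stuttery)))   # ~35 fps on paper, but the 50-60 ms hitches are what you feel
```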
 
As long as I get minimum framerates of 60fps, I'm happy.

I wouldn't mind a little more, but the monitor I'd need for that (40+" 4k 120hz) just isn't made yet.

So, these days I usually just vsync to 60hz, and expect the framerate to be a straight line, never dipping below 60fps.

In some really demanding games, since I use a TV as a screen and it has European 50hz modes, I play vsynced to 50hz. I only do this in single player titles though. In multiplayer, I'd consider it unacceptable.

Some games are playable at much lower framerates, like the Civilization series and its turn-based strategy. In those I could probably tolerate 30fps or maybe even 15, but the irony is that the types of games for which this kind of framerate would be acceptable usually aren't particularly demanding, and as such easily play at much higher framerates.
 
I'm not terribly concerned with my average FPS. As long as my minimums don't drop below 45 I'm pretty content.
 
Since my son let me have his BenQ 144hz, I guess 144hz is it. I'm not sure if I actually hit that, but at 1080p with a 1080 Ti it should be around there. Play games, have fun.
 
Since my son let me have his BenQ 144hz, I guess 144hz is it. I'm not sure if I actually hit that, but at 1080p with a 1080 Ti it should be around there. Play games, have fun.


I think I'd cap my games before I let them run free up to 144fps. I value less fan noise and heat more than I do crazy frame rates. A cap at 90-100 somewhere ought to be more than sufficient.
 
I think I'd cap my games before I let them run free up to 144fps. I value less fan noise and heat more than I do crazy frame rates. A cap at 90-100 somewhere ought to be more than sufficient.
I have the hybrid versions so no noise from them.
 
Yeah, but it is silent. The temp barely moves up and the rest of my system is much louder.

Same... it's silly that it's this easy to get a straight >30% overclock on the top-end consumer GPU with no noise versus the HSF versions. Won't buy another HSF GPU for my desktop.
 
Same... it's silly that it's this easy to get a straight >30% overclock on the top-end consumer GPU with no noise versus the HSF versions. Won't buy another HSF GPU for my desktop.
I actually haven't OC'd this card. I probably would if I went back to 4K.
 
For shooters I prefer to be around 80 or greater, so when explosions or something dumps work on the GPU, it has some room to dip while keeping me over or close to 60. For anything fast paced or high action like that, I prefer this.

For most everything else I usually can live with 40 to 60 without issues.

I just barely got GTA V, and since it supports ultrawide and multiple monitors out of the box, my NV Surround setup finally gets a workout. For most things in the city I'm at 37 to 45 FPS. Hitting the country I can see dips to 28 for moments, but it's usually around 30 to 35. Whenever I spend some real time out in the country I'll see if it stays that low, and if so, turn down something else. Right now AA is off and reflections are down a little bit. Still looks great at 5760 x 1200. It could play with everything at max, but the city was 35 FPS at best, and 20 to 25 in the country lol. In this case I'll eat the lower FPS to get the three screens looking purdy.
 
That's interesting. Even with my massive radiators, my Pascal Titan X and [email protected] (1.445v) take some fan speed to cool them down.

Most of that is probably the highly overvolted 32nm CPU though :p
Haha you must be in the "I won't upgrade until my chip dies" boat. I too will be pushing similar voltages, if not more, into my 6800k when I get the upgrade itch haha.
 
Haha you must be in the "I won't upgrade until my chip dies" boat. I too will be pushing similar voltages, if not more, into my 6800k when I get the upgrade itch haha.

I'll upgrade when it stops playing all the newest titles at better than 60fps :p (or when it dies)
 
Same... it's silly that it's this easy to get a straight >30% overclock on the top-end consumer GPU with no noise versus the HSF versions. Won't buy another HSF GPU for my desktop.
What is a HSF GPU? Google tells me: Han shot first :LOL:
 
What is a HSF GPU? Google tells me: Han shot first :LOL:

Exactly what you'd expect!

[to avoid potential confusion, I mean either the blower-style or open-air fan style coolers traditionally used on GPUs versus those that come with water blocks for custom loops or those that come fitted with closed loop water coolers like what Hagrid and myself are using]
 
For the FPS genre, one exception was the original Crysis. At 35-45 FPS it felt like 60 on other games at the time. Not sure what it was about that game but anything around 30+ FPS felt surprisingly smooth for what the actual frame rate was.
 
For the FPS genre, one exception was the original Crysis. At 35-45 FPS it felt like 60 on other games at the time. Not sure what it was about that game but anything around 30+ FPS felt surprisingly smooth for what the actual frame rate was.

It was something to do with its motion blurring. I wouldn't say it felt like the frame-rate was higher, but it did feel more smooth for sure. The thing is, if I remember correctly, you could turn that down or off, and gain a few frames per second. It's been a long time since I played the original though. My most recent play was either Warhead or Crysis 2. Never played the third one...
 
For shooters / racers 60 fps is the target (my max monitor freq), quality be damned. Low details are fine to achieve this, but not a reduced resolution.

I've recently forced myself to play RPGs at 30 fps, mainly because I was doing time-consuming resource gathering and wanted minimal CPU/GPU usage due to the hot summer weather. When it gets more action-y, the fps gets unlocked.
 
It was something to do with its motion blurring. I wouldn't say it felt like the frame-rate was higher, but it did feel more smooth for sure. The thing is, if I remember correctly, you could turn that down or off, and gain a few frames per second. It's been a long time since I played the original though. My most recent play was either Warhead or Crysis 2. Never played the third one...

I can't speak to Crysis specifically. I've actually never played it.

My take in FPS games in general - however - was that to me at approximately 30fps games LOOK as good and smooth as they are ever going to look.

However, when I grab the mouse and start moving things around, I can feel that something is wrong at 30fps. It feels laggy and disconnected. This feeling gradually goes down as the framerate increases, until about 60fps, where mouse movement doesn't feel laggy to me anymore.

There is a slight improvement above 60fps, but it is very, very subtle, and by 90fps - IMHO - it gets as good as it is going to get. There is IMHO no need for any refresh or framerate above that. I'd rather spend the extra GPU power at that point on more resolution, higher polygon counts, more AA or cooler looking effects.

I haven't played at above 60fps in a long time though. Back during the original Counter-Strike betas my GeForce 2 GTS and later GeForce 5 500Ti were very powerful for that game, and I had a 22" Iiyama Visionmaster Pro which supported 100hz at 1600x1200, so I played that game vsynced at 100hz.

I'd like to get back up above 60fps again, but I am unwilling to sacrifice my large 4k screen for that, and no one makes a 40+" 4k screen supporting above 60hz yet. 90-100fps is probably as high as I'll go though. I feel the kids these days going on and on about how important 120-144fps is are nuts. Totally in placebo territory.
 
I can't speak to Crysis specifically. I've actually never played it.

My take in FPS games in general - however - was that to me at approximately 30fps games LOOK as good and smooth as they are ever going to look.

However, when I grab the mouse and start moving things around, I can feel that something is wrong at 30fps. It feels laggy and disconnected. This feeling gradually goes down as the framerate increases, until about 60fps, where mouse movement doesn't feel laggy to me anymore.

There is a slight improvement above 60fps, but it is very, very subtle, and by 90fps - IMHO - it gets as good as it is going to get. There is IMHO no need for any refresh or framerate above that. I'd rather spend the extra GPU power at that point on more resolution, higher polygon counts, more AA or cooler looking effects.

I haven't played at above 60fps in a long time though. Back during the original Counter-Strike betas my GeForce 2 GTS and later GeForce 5 500Ti were very powerful for that game, and I had a 22" Iiyama Visionmaster Pro which supported 100hz at 1600x1200, so I played that game vsynced at 100hz.

I'd like to get back up above 60fps again, but I am unwilling to sacrifice my large 4k screen for that, and no one makes a 40+" 4k screen supporting above 60hz yet. 90-100fps is probably as high as I'll go though. I feel the kids these days going on and on about how important 120-144fps is are nuts. Totally in placebo territory.
Not really. My son showed me the difference and it is there. From 60 to 144 there is a definite difference that I could tell even at my old age.
 
Coming from 10+ years as a console gamer, I'd say my happy medium is 80-100. I've had some game time on some serious rigs and pushing 160+ is nice, but I get what I want in the 90 fps range @ 1440.
 
Not really. My son showed me the difference and it is there. From 60 to 144 there is a definite difference that I could tell even at my old age.

Pay more attention to what I wrote.

I didn't say there was no difference above 60fps. There is.

I said there was a small difference over 60fps, up until 90-100fps somewhere. Above 100 there is no perceivable difference at all to me.

Either way, the truth is that the gameplay experience is governed by the minimum fps you get, as that minimum almost always happens during the most important, twitchiest, high-action parts of a game. It makes little sense to run away with the framerate during low-action views of the horizon, or a wall.

Because of this, if I had a variable refresh rate screen, I'd benchmark my titles over a few matches and then cap the framerate (if supported) maybe 5-10% above the minimum framerate recorded.

This way you aren't running away with the framerate and creating wasted heat on low action scenes. It should also raise the minimum framerate a little as a cooler GPU can usually turbo boost more during the high action scenes when needed the most.
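
As a toy sketch of that arithmetic (the per-match minimums are hypothetical, and the 7% margin is just a pick from that 5-10% range):

```python
# Toy version of the idea above: benchmark a few matches, then cap the
# framerate a little above the worst minimum recorded. Match numbers are
# made up; the actual cap would go in the driver or an in-game limiter.
recorded_min_fps = [68, 72, 65, 70]  # hypothetical per-match minimum FPS

def suggested_cap(minimums, margin=0.07):
    # cap roughly 5-10% above the worst recorded minimum
    return round(min(minimums) * (1 + margin))

print(suggested_cap(recorded_min_fps))  # 65 * 1.07 -> cap around 70 fps
```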

That said, I have a fixed refresh screen, so I vsync it to 60 and adjust settings so the framerate looks as much like a flat line as possible, with minimal dips.
 
Pay more attention to what I wrote.

I didn't say there was no difference above 60fps. There is.

I said there was a small difference over 60fps, up until 90-100fps somewhere. Above 100 there is no perceivable difference at all to me.

Either way, the truth is that the gameplay experience is governed by the minimum fps you get, as that minimum almost always happens during the most important, twitchiest, high-action parts of a game. It makes little sense to run away with the framerate during low-action views of the horizon, or a wall.

Because of this, if I had a variable refresh rate screen, I'd benchmark my titles over a few matches and then cap the framerate (if supported) maybe 5-10% above the minimum framerate recorded.

This way you aren't running away with the framerate and creating wasted heat on low action scenes. It should also raise the minimum framerate a little as a cooler GPU can usually turbo boost more during the high action scenes when needed the most.

That said, I have a fixed refresh screen, so I vsync it to 60 and adjust settings so the framerate looks as much like a flat line as possible, with minimal dips.
I never tried that difference. I just set it all to max and play. I have no idea on anything else... :)
 
One thing I've noticed since replacing my monitor is that with its 75hz refresh rate, it gives games a little "breathing room" for when things get hectic.

Dropping into the mid 50's = sucks and can be quite jarring.

Dropping into the mid 60's = no biggie

I definitely recommend 75hz for the average joe.
 
After playing a lot of console games (Playstation exclusives mostly) lately, I've learned that frame consistency is just as important (if not more so) than frame rate, provided of course it's at least 30 FPS and locked to it instead of unlocked and very inconsistent (introducing noticeable juddering). For example, Horizon: Zero Dawn only runs at 30 FPS, but it's locked at it and has pretty much perfect frame pacing throughout the entire game regardless of what's on screen, so there's no juddering or dropped frames. Throw in a modest motion blur effect and it still looks quite smooth despite being a relatively low frame rate.

This is pretty much the reason why console games are typically locked to either 30 or 60 FPS instead of allowing an unlocked frame rate to bounce around anywhere in between and cause severe juddering; a game bouncing around 40-50 FPS can look/play worse than if it was locked at 30 FPS with perfect frame pacing.

But it's not cut and dried either, as some console games give you an option between a higher unlocked frame rate at lower resolution/IQ, or a locked frame rate at higher resolution/IQ. Both Nioh and God of War on PS4 (Pro at least, not sure about base PS4) give you these options, and while I couldn't stand the unlocked frame rate mode in Nioh because it just varied way too much, I opted to play God of War at an unlocked frame rate because it was at least a bit more consistent in its performance.

Anyways, all this to say that I care less about "average" FPS because I want it to be as consistent as possible; ideally your min/avg/max frame rate would all be the same in a game IMO. Since I have kind of moved away from first person shooters in the last couple years, I'm good with 30/60 FPS in any of the games I've played. If I was still into playing twitchy games like Battlefield, CoD, and all that stuff, then the higher the better of course.
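
For what it's worth, if you log frame times on PC you can put a rough number on that consistency instead of just looking at the average. A quick sketch with invented values, assuming a frametime capture in milliseconds (tools like CapFrameX or OCAT export similar per-frame data):

```python
import statistics

# Invented frametime capture in milliseconds; real tools export similar
# per-frame data, just with thousands of samples instead of eight.
frame_times_ms = [16.7, 16.6, 16.8, 40.0, 16.7, 16.7, 33.0, 16.6]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# "1% low": average FPS of the slowest 1% of frames (here just the single worst frame)
worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
low_1pct_fps = 1000 / statistics.mean(worst)

# frame-time spread: the closer to zero, the more consistent the pacing
spread_ms = statistics.pstdev(frame_times_ms)

print(f"average: {avg_fps:.0f} fps, 1% low: {low_1pct_fps:.0f} fps, spread: {spread_ms:.1f} ms")
```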
 
After playing a lot of console games (Playstation exclusives mostly) lately, I've learned that frame consistency is just as important (if not more so) than frame rate, provided of course it's at least 30 FPS and locked to it instead of unlocked and very inconsistent (introducing noticeable juddering). For example, Horizon: Zero Dawn only runs at 30 FPS, but it's locked at it and has pretty much perfect frame pacing throughout the entire game regardless of what's on screen, so there's no juddering or dropped frames. Throw in a modest motion blur effect and it still looks quite smooth despite being a relatively low frame rate.

This is pretty much the reason why console games are typically locked to either 30 or 60 FPS instead of allowing an unlocked frame rate to bounce around anywhere in between and cause severe juddering; a game bouncing around 40-50 FPS can look/play worse than if it was locked at 30 FPS with perfect frame pacing.

But it's not cut and dried either, as some console games give you an option between a higher unlocked frame rate at lower resolution/IQ, or a locked frame rate at higher resolution/IQ. Both Nioh and God of War on PS4 (Pro at least, not sure about base PS4) give you these options, and while I couldn't stand the unlocked frame rate mode in Nioh because it just varied way too much, I opted to play God of War at an unlocked frame rate because it was at least a bit more consistent in its performance.

Anyways, all this to say that I care less about "average" FPS because I want it to be as consistent as possible; ideally your min/avg/max frame rate would all be the same in a game IMO. Since I have kind of moved away from first person shooters in the last couple years, I'm good with 30/60 FPS in any of the games I've played. If I was still into playing twitchy games like Battlefield, CoD, and all that stuff, then the higher the better of course.


Yeah, I'd argue that "Average FPS" is a completely meaningless measure. It just does not matter at all.

What is really important is minimum FPS and consistency.

The lack of consistency is less jarring if you have an adaptive refresh rate screen though.

I'm not sure I could ever find 30fps acceptable in a first or third person game though. It might be less noticeable on consoles, as controllers are a much less precise control device. With a mouse I'd notice and find it awful instantly.
 