What (average) FPS is acceptable to you in games?

  • less than 15 FPS - You must be really into slideshows!

    Votes: 1 0.5%
  • 15 FPS

    Votes: 1 0.5%
  • 25 FPS

    Votes: 1 0.5%
  • 30 FPS

    Votes: 19 9.1%
  • 40 FPS

    Votes: 25 12.0%
  • 50 FPS

    Votes: 18 8.6%
  • 60 FPS

    Votes: 75 35.9%
  • 75 FPS

    Votes: 22 10.5%
  • 100 FPS

    Votes: 28 13.4%
  • 120 FPS

    Votes: 10 4.8%
  • 144 FPS

    Votes: 7 3.3%
  • more than 144 FPS - Explain yourself!

    Votes: 2 1.0%

  • Total voters
    209

M76

[H]F Junkie
Joined
Jun 12, 2012
Messages
14,006
I know it depends on the game, but you can still have a general idea of what frame rate you can enjoy a game at. Of course, let's exclude strategy games, where fluid movement is not as important. For example, I could play XCOM2 on an old laptop with a 6670 in it. It was annoying but still playable. So let's focus instead on action and simulation games, where FPS does matter.

I pose this question because we often hear that consoles are capped to 30 FPS in many games. Then I read an article on PC games that says "the game is unplayable at this setting" while showing an average FPS of 35 and a minimum of something like 25. That's already better than the console experience, so calling it unplayable is quite unfair IMO.

I remember playing F1GP2 at 15 FPS in 1996; later I slightly raised the bar and usually aimed for a whopping 25 FPS average through the mid-2000s.

I still don't expect that much, as I find 40-45 FPS perfectly playable in most if not all games.
 
45+ FPS in a shooter, 30+ in an RPG/non-action game is my happy place. Obviously I aim for 60 FPS, but when I'm on the road with my work laptop I take what I can get, lol (the laptop only has a GTX 860M).
 
It's actually kind of sad: in the past 5 years I've been spoiled and can now immediately notice any FPS drop whatsoever.

I shoot for 60 whenever possible but there are exceptions....

GTA V is a good example. There is no way I can run that at a stable 60 FPS with decent settings. But if I set it to half-vsync with settings pretty maxed out, it stays a super smooth 37.5 FPS (half of my 75 Hz).

It's weird because I had tried half-vsync on my old 60hz monitor and couldn't stand it. Apparently the difference between 29-30 and 37-38 is easily perceptible....
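That perceptibility claim is easier to appreciate as per-frame time, which is closer to what you actually see. A quick sketch (plain arithmetic, nothing game-specific):

```python
# Per-frame time for the refresh/cap rates mentioned above. The jump
# from 30 to 37.5 FPS shaves ~6.7 ms off every frame, which may be
# why half-vsync on a 75 Hz panel feels smoother than on a 60 Hz one.
for fps in (30.0, 37.5, 60.0, 75.0):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:5.1f} FPS -> {frame_time_ms:4.1f} ms/frame")
```

Going from 30 to 37.5 FPS shortens every frame by about 6.7 ms, while 37.5 to 60 saves another 10 ms, which fits the diminishing-returns pattern people describe further down the thread.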
 
60+. I have a 60hz monitor. Any less and I'm not too happy. I suspect when I one day get a 100+hz monitor with G Sync my opinion will change due to the G Sync factor.
 
If you picked a very high FPS, can you actually tell the difference, or is it more like a placebo effect? Do you think you could see the difference between 100 and 120 in a blind test? I doubt that.
 
When I build a PC, I aim for 50-60 FPS at max settings, whatever I have to do to get there. With consoles, I really don't care; 30 is basically the minimum I'm seeing, and it's totally playable/fine.
 
Last edited:
80, minimum, but I try to shoot for 100. These days, things start feeling sluggish below 70.
M76 said:
I know it depends on the game, but you can still have a general idea of what frame rate you can enjoy a game at. Of course, let's exclude strategy games, where fluid movement is not as important. For example, I could play XCOM2 on an old laptop with a 6670 in it. It was annoying but still playable. So let's focus instead on action and simulation games, where FPS does matter.

I pose this question because we often hear that consoles are capped to 30 FPS in many games. Then I read an article on PC games that says "the game is unplayable at this setting" while showing an average FPS of 35 and a minimum of something like 25. That's already better than the console experience, so calling it unplayable is quite unfair IMO.

I remember playing F1GP2 at 15 FPS in 1996; later I slightly raised the bar and usually aimed for a whopping 25 FPS average through the mid-2000s.

I still don't expect that much, as I find 40-45 FPS perfectly playable in most if not all games.
No, it really doesn't depend on the game. Higher FPS with the refresh rate to go with it means better input response, and every game benefits from that. This includes any game that requires pointing and clicking with a mouse cursor.

And I find a lot of console games are completely unplayable because of the 30 FPS cap. Some games like Forza Horizon and Witcher 3 work some voodoo where the game is fine on consoles, but it's still better on PC. For Forza it may have to do with the underlying physics engine still running at 360 Hz on the console. Then you have games like Dark Souls 3 which are completely unplayable on consoles due to the framerate. It makes me wonder if the people who say it's too hard have never played it on a decent PC. I quit playing it on console even before I got through the Road of Sacrifices.
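For what it's worth, running physics at a fixed high rate underneath a much slower render loop is a standard fixed-timestep accumulator pattern. The sketch below is only a generic illustration of that idea; the 360 Hz and 30 FPS figures just mirror the numbers above, and none of this is Forza's actual code:

```python
PHYSICS_DT = 1.0 / 360.0  # fixed simulation step (the claimed 360 Hz)
RENDER_DT = 1.0 / 30.0    # wall-clock time per rendered frame at a 30 FPS cap

def simulate(frames: int) -> int:
    """Return how many fixed physics steps run across `frames` rendered frames."""
    accumulator = 0.0
    steps = 0
    for _ in range(frames):
        accumulator += RENDER_DT          # real time elapsed this frame
        while accumulator >= PHYSICS_DT:  # consume it in fixed-size steps
            steps += 1
            accumulator -= PHYSICS_DT
    return steps

print(simulate(30))  # one second of 30 FPS rendering -> ~360 physics steps
```

The accumulator carries leftover time across frames, so the simulation advances in exact 1/360 s steps no matter how slow or uneven rendering is, which is one plausible reason a 30 FPS racer can still feel responsive.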

That's rough that you could only run GP2 at 15 FPS. I consider the PC I was able to play on at the time a potato and I could still play it at 30 FPS.
 
Last edited:
Everyone's going to say 60+ fps.

Reality: It depends on the game!

First Person Shooter? 60fps.

Real Time Strategy? I'll deal with 30-40fps.

Something like Solitaire or Freecell? Yeah, 20fps will do.
 
I voted 50, but really more like 55. 55-60 is where I need it to be. Regular dips under 50 are not acceptable.
 
I honestly have no idea because I've never paid attention. So I'm gonna say anywhere between 25 and 60?

I'm sure if someone put two monitors playing the same game at different rates right next to each other I'd probably see a difference, but I switch between console and PC stuff pretty regularly and I've never noticed anything.
 
Using a 144 Hz monitor, I prefer higher FPS. Always achievable? No, but I'd turn down graphics to get 120-140 and run the monitor at 120/144 all day. So smooth.
 
It's funny how used to lower FPS you can get. I went from playing Destiny 2 on my PS4 to playing on my PC, and it was trippy how different the game felt/looked running that much faster. I went from 30 FPS on the PS4 to 150 on my main PC (both at 1080p), and it took a while to get used to it, lol.
 
I try to go for 100 FPS minimum. When I first got my 144 Hz monitor, I'll admit that I couldn't notice a difference in games, and only noticed the smoothness in desktop usage. Unfortunately, a downside is that I've become accustomed to it. Now, even watching TV and movies, I just see a series of still images moving fast rather than smooth motion.
 
I always turn up the eye candy until I'm around 30 FPS. Do I notice higher? Definitely. Is it worth turning off the shiny? Nope.
 
Regular monitor? 60 minimum, I've gotten sensitive to framerate over the years.

Since I got a freesync monitor? Anything in the range is good.
 
I voted for 60 because of variations in frame rates during gameplay. Actually, if a game were 100% rock solid at 30 FPS, in most cases that would be fine. But that is rarely possible, and 30 FPS ends up being 25-35 FPS, which is not good enough.
 
I have spent several years now with 4 different high refresh LCD monitors and consider myself to be pretty sensitive to motion clarity. The point of diminishing returns for me is around 110 FPS, so that’s where I try to keep my minimums. Of course with VRR I’m not exactly getting a bad experience if I drop into the 80’s, but I can generally notice the sub-90 slowdowns when they occur.
 
For most games 60 is fine, especially if it is a AAA graphically intense game. But For competitive shooters, I greatly prefer 144+, and I will gladly sacrifice in game graphics settings to get there.
 
With very few exceptions I hate playing anything that isn't locked at 60 FPS with vsync enabled. More is certainly better, but that's the point of diminishing returns IMO. It's also the cap for the majority of monitors and almost all TVs.
On the PC I will spend way too much time adjusting settings until I can keep things at 60fps and never dropping below.
 
Hmmmm.... Really depends on the kind of game. Also, I've found it has changed now that I have a G-Sync display. Previously I had a 60 Hz monitor, so a solid 60 FPS was my target. With G-Sync I find that in games like MMOs, I'm much more tolerant of frame drops below 60. I still like higher rates better, but lower FPSes are acceptable. That said, if I can pull it off I like 100 FPS+ (my current screen is 144 Hz). It does get noticeably more fluid and smooth, which is really nice.
 
More than 144 with a refresh rate limiter in-game. I wish to always be in excess and turn down my card so it's not working as hard, causing it to run quieter and cooler.
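An in-game limiter is conceptually just sleeping off the unused frame budget so the GPU can idle. A minimal sketch of the idea (the 144 target and function names are illustrative, not any particular game's API):

```python
import time

TARGET_FPS = 144
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~6.9 ms per frame

def run_limited(n_frames, render=lambda: None):
    """Render n_frames, sleeping away leftover budget each frame."""
    for _ in range(n_frames):
        start = time.perf_counter()
        render()                          # draw the frame
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)          # cap the rate, let the GPU idle
```

Real limiters typically busy-wait the last fraction of a millisecond because OS sleep granularity is coarse, but the principle is the same: cap the rate, cut the load, and the card runs quieter and cooler.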
 
Being on 144 Hz changed a lot for me. 60 Hz/FPS isn't the same anymore. For some older games locked under 60 FPS, or even 30, I can deal, but with newer games anything less than around 75 FPS becomes choppy. I don't mind lowering settings to keep the frame rates high, either. I made the switch from high resolution to mid-range resolution and a higher frame rate when video card prices started to get out of control, and I've also always wanted a high refresh rate monitor since the CRT days.
 
If I am cruising at 50, there is very little to complain about, so I also voted 50. Of course 60 is better, even noticeably better. Still, 50 is playable without moaning about it.

A couple of years ago I was playing Rune on a 60 Hz CRT monitor. Rune was locked at 60 FPS. Man, did that look sweet! That is when I understood that having the FPS match the monitor's refresh rate is truly ideal.
 
I like my average fps to be 80 or above. This makes it so my gameplay is smoother during those intensive spots. I'm generally ok with fps going as low as 40-50.
 
I aim for a minimum of 60 FPS. I like to keep my averages at 70-80, but depending on the game I like to keep it at 120-144 FPS. I'm extremely sensitive to FPS; I can certainly tell the difference between 144 and 120, and 100 and below.

Some games below 60 FPS are entirely unplayable to me. Normally I'm comfortable with a range of 75-90-120 FPS (75 being the minimum, 90 the average, and 120 the cap); this comes from using a 120 Hz panel for years, and that's what I got used to.

Only a single game made me play at 60 FPS, and that was Fallout 4; it was painful at first, though.

People can really get used to very high refresh rates and never look back. When I see my cousins playing on their consoles at 30 FPS, it really feels sickening and painful to me.
 
Regular monitor? 60 minimum, I've gotten sensitive to framerate over the years.

Since I got a freesync monitor? Anything in the range is good.
FreeSync was terrible for me; I could see the screen pulsating as the FPS hit the lower threshold. I don't mind lower FPS, but my eyes mind a low refresh rate on a screen. It was as if the old CRT monitors came back to haunt me when I enabled FreeSync.
 
I picked 75. If you're getting 60 AVERAGE, you're going to have a bad time.

Average = some of the time 40, some of the time 80. At 75 avg you'll probably not fall TOO much below 60 FPS.

If this was worded as "minimum acceptable FPS", my answer would be 60.

Now if you have a 165 Hz screen, that's a different story.
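The average-vs-minimum distinction is easy to illustrate: two captures with an identical 60 FPS average can feel completely different. A toy sketch using the made-up 40/80 split from the post above:

```python
# Two runs that both "average 60 FPS": one steady, one swinging 40/80.
steady = [60] * 10
swingy = [40, 80] * 5  # "some of the time 40, some of the time 80"

for name, fps_samples in (("steady", steady), ("swingy", swingy)):
    avg = sum(fps_samples) / len(fps_samples)
    print(f"{name}: avg {avg:.0f} FPS, min {min(fps_samples)} FPS")
```

(Averaging per-sample FPS like this ignores frame-time weighting, but it is enough to show why a 75 average tends to keep minimums near 60 while a 60 average does not.)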
 
I would select multiple frame rates if I could. I enjoy my FPS games but am not great at them; I'd say at least 60 FPS with graphics turned up for shooters. In open-world areas I prefer higher-quality graphics and at least 45+ FPS. For closed-world or dungeon games I'm good with 30+. If I can get above 25 in a 4K game fully maxed out for visual flair, I'm okay with that. I don't need a super high FPS, although if I am benchmarking I do like seeing the higher numbers; otherwise it's not needed.
 
60fps because I use 60Hz displays.
I will play a game that dips lower (stares hard at Kingdom Come) but I buy the fastest card to try and not drop below 60fps.
Average 60 FPS because I use vsync, so minimum 60 FPS too.
 
Last edited:
60 for SP games. For MP shooters I like around 100-120. Other games where quick reaction times are not as important, 60 is fine as well. But I'll gladly take more.

Sometimes a game will drop to 50 or so and I will be okay with it, but I do want the average to be around 60. Dropping into the 40s is noticeable and I try to avoid that... I hope the GTX 1070 can handle these newer games a bit longer.
 
Where's the option more the better?

AAA/Console Ports are fine at 60FPS, shooters 1440p and over 100FPS is mandatory.
 