What (average) FPS is acceptable to you in games?

  • less than 15 FPS - You must be really into slideshows!

    Votes: 1 0.5%
  • 15 FPS

    Votes: 1 0.5%
  • 25 FPS

    Votes: 1 0.5%
  • 30 FPS

    Votes: 19 9.1%
  • 40 FPS

    Votes: 25 12.0%
  • 50 FPS

    Votes: 18 8.6%
  • 60 FPS

    Votes: 75 35.9%
  • 75 FPS

    Votes: 22 10.5%
  • 100 FPS

    Votes: 28 13.4%
  • 120 FPS

    Votes: 10 4.8%
  • 144 FPS

    Votes: 7 3.3%
  • more than 144 FPS - Explain yourself!

    Votes: 2 1.0%

  • Total voters: 209
40-ish FPS, on the assumption there aren't any significant frame drops. I tend to notice changes in FPS more than the FPS itself.

Basically, as long as it's smooth.
 
You know what I find fun? Sitting FPS elitists down at a system locked to 30 and they don't know it.

Never once heard a bitch about it.

It's all in your head, IMO. Anything 24 FPS+ is fine by me. That's the same FPS movies are shot at (or used to be shot at). Have you ever heard someone bitch about low FPS in a movie?
 
I love my visuals, so I'm fine with 30 if it gets me great visuals. After all, that's what most console games run at.
 
You know what I find fun? Sitting FPS elitists down at a system locked to 30 and they don't know it.

Never once heard a bitch about it.

It's all in your head, IMO. Anything 24 FPS+ is fine by me. That's the same FPS movies are shot at (or used to be shot at). Have you ever heard someone bitch about low FPS in a movie?


A bit of an odd argument, that. When I'm playing Destiny 2 at 60 fps and it drops to 30 fps for the cinematics, I instantly notice the change.
 
You know what I find fun? Sitting FPS elitists down at a system locked to 30 and they don't know it.

Never once heard a bitch about it.

It's all in your head, IMO. Anything 24 FPS+ is fine by me. That's the same FPS movies are shot at (or used to be shot at). Have you ever heard someone bitch about low FPS in a movie?

Movies do seem better optimized; I don't notice the input lag as much.
 
You know what I find fun? Sitting FPS elitists down at a system locked to 30 and they don't know it.

Never once heard a bitch about it.

It's all in your head, IMO. Anything 24 FPS+ is fine by me. That's the same FPS movies are shot at (or used to be shot at). Have you ever heard someone bitch about low FPS in a movie?
I do bitch about low FPS in films all the time. Do you know why movies seem smooth? Motion blur. Pause a movie in the middle of an action sequence and what do you see? A giant blurry mess. 4K and 8K are completely useless gimmicks until we raise the FPS to at least 48 (preferably 96) to preserve detail during motion.
The 30 FPS console games are also perceived as smooth because they have motion blur tacked onto them, which I absolutely despise; I hate artificial motion blur even more than I hate the natural motion blur introduced in film by extended shutter times.

In a 24 fps movie the shutter speed has to be about 40 ms for motion to seem smooth. That's absolutely awful. Even with a 1 ms shutter speed you'd get an inch of motion blur on fast-moving objects with an ultra-wide-angle lens.
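A rough back-of-the-envelope check of those numbers, assuming blur length is simply apparent speed multiplied by exposure time (the 25 m/s object speed below is an assumed illustration value, not something from the post):

# Back-of-the-envelope motion-blur estimate: blur length = apparent speed * exposure time.
# The 25 m/s object speed is an assumed illustration value.

def blur_length_m(speed_m_per_s, exposure_s):
    """Distance an object travels during one exposure, in metres."""
    return speed_m_per_s * exposure_s

full_frame_exposure = 1 / 24    # ~41.7 ms, a 360-degree shutter at 24 fps (the "about 40 ms" case)
half_shutter_exposure = 1 / 48  # ~20.8 ms, the classic 180-degree shutter

speed = 25.0  # m/s, roughly 90 km/h across the frame

print(f"{blur_length_m(speed, full_frame_exposure) * 100:.1f} cm of blur at ~40 ms exposure")
print(f"{blur_length_m(speed, half_shutter_exposure) * 100:.1f} cm of blur at ~21 ms exposure")
print(f"{blur_length_m(speed, 0.001) * 100:.1f} cm of blur at 1 ms exposure")  # ~2.5 cm, about an inch

With those assumed numbers you get roughly a metre of smear at a full-frame 24 fps exposure, about half that with a 180-degree shutter, and around 2.5 cm (an inch) even at 1 ms.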
 
I do bitch about low FPS in films all the time. Do you know why movies seem smooth? Motion blur. Pause a movie in the middle of an action sequence and what do you see? A giant blurry mess. 4K and 8K are completely useless gimmicks until we raise the FPS to at least 48 (preferably 96) to preserve detail during motion.
The 30 FPS console games are also perceived as smooth because they have motion blur tacked onto them, which I absolutely despise; I hate artificial motion blur even more than I hate the natural motion blur introduced in film by extended shutter times.

In a 24 fps movie the shutter speed has to be about 40 ms for motion to seem smooth. That's absolutely awful. Even with a 1 ms shutter speed you'd get an inch of motion blur on fast-moving objects with an ultra-wide-angle lens.
So much this. The Hobbit series was a pleasure to watch in 48 FPS.
 
As long as it doesn't dip under 40. I don't need constant 60; I've accepted that since moving to 4K. There are some exceptions in fast, twitch-type games where higher FPS can be a huge advantage, but I'm not really in the competitive arena anymore, so it matters less and less to me every year, and high-res, ultra-detail immersiveness means much more, albeit sometimes at the cost of performance.
 
Well, ultimately more FPS means lower input lag, which is a good thing no matter how you put it. My recollection of your post history, M76, tells me you care more about the story. Nothing wrong with that, of course, but my perception is that you're also uneasy with people who play games and get good at them at the same time.

I've had my current 120 Hz BenQ TN panel for 6 years now and so far I've been happy with it. Recently I tried a curved 144 Hz IPS (G-Sync, not that it matters at high fps) and it was an obvious improvement over what I have.

And that was with CS:GO at 144 Hz. I'd imagine Quake would feel a lot better at 200 and beyond on a low-persistence panel. It may not matter to you, but not everyone plays exactly the same way. At some point a host of little things - that you may consider RGB-like - scale up and make things better for everyone. Give me 1000 FPS and 120K resolution, I say. Let's push it.

To me, playing Doom at an almost constant 120+ FPS even in the craziest fights felt nothing like enjoying significantly better graphics in simulated 4K using DSR at 60 fps. The latter looks better due to the higher render resolution but plays like shit. No thank you.
 
I voted '60 fps average', but for me, it's 60 fps minimum.

I then frame cap to 60, vsync off. This is at 4K, so 60 Hz is all there is, but these days I'm fine with that.

5 years ago I was all into 120 Hz+, but I just don't care anymore. :oldman:
 
I have a very good rig and I still play with all of my graphics settings on low. I really can't stand low FPS or anything less than 144 Hz anymore.
 
60 ideally, but honestly I'll settle for 45-50 in most non-competitive games as long as it stays smooth and has plenty of eye candy in trade-off.
 
I think there are a lot of things to consider when you ask "What average FPS is acceptable?" Not all games need to be played at maxed resolution; for instance, I see no real difference between 1080p and 4K in Battlefield 4. That being said, I could lower the resolution to increase the FPS. I play on a 4K monitor which is capped at 60 Hz, so anything above say 90 FPS I don't really notice a difference. I could play at 4K (60-75 FPS) or at 1080p (120 FPS). I don't really see a difference between the two, so I leave that particular game at 1080p because it "feels" smoother.

Games like Fortnite need a high resolution to see into the distance. My 4K monitor is 43", so playing at max resolution is important, but the fluidity of the game isn't really noticeable over 60 FPS.

I think the question should be "What average FPS is acceptable at a given resolution: 4K, 1440p, 1080p, or lower if you're on a really old monitor?" The obvious answer for most is "As high as I can afford."
 
I think there are a lot of things to consider when you ask "What average FPS is acceptable?" Not all games need to be played at maxed resolution; for instance, I see no real difference between 1080p and 4K in Battlefield 4. That being said, I could lower the resolution to increase the FPS. I play on a 4K monitor which is capped at 60 Hz, so anything above say 90 FPS I don't really notice a difference. I could play at 4K (60-75 FPS) or at 1080p (120 FPS). I don't really see a difference between the two, so I leave that particular game at 1080p because it "feels" smoother.

Games like Fortnite need a high resolution to see into the distance. My 4K monitor is 43", so playing at max resolution is important, but the fluidity of the game isn't really noticeable over 60 FPS.

I think the question should be "What average FPS is acceptable at a given resolution: 4K, 1440p, 1080p, or lower if you're on a really old monitor?" The obvious answer for most is "As high as I can afford."

It's true that it's really dependent on a number of factors. The difference in resolution really depends on screen size, distance from the monitor, native game texture resolutions, and up/down-scaling tech; FPS can also contribute to clarity. The perception of FPS smoothness/clarity depends on the sum of all the contributing factors that go into pixel response time and pixel transition rate.

With the right ratio of transition to response, most FPS levels can be presented in a smooth manner. But without a standard to follow for achieving that in a way acceptable to all (personal preference, or a performance goal limited by the current level of technology), rendering techniques are often used to simulate motion blur and create smooth transitions. Pre-rendered scenes can often run at low FPS and still be perceptibly smooth/clear, because all the heavy rendering used to create them isn't being done on the fly and doesn't need to account for unpredictable changes (so things like proper levels of motion blur won't look wonky if they're set up properly).

It all boils down to how our [analog] senses perceive digital output and how that output can be augmented to simulate an analog reality. Basically, if rendering techniques are properly applied at a given FPS level (and the hardware is capable of handling it), the viewer can be fooled into thinking the digital output is analog, and higher FPS starts to become less important.
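To make the "rendering techniques to simulate motion blur" point a bit more concrete, here is a minimal accumulation-style frame-blending sketch; it is a generic illustration, not any particular engine's technique, and the 0.3 blend weight is an arbitrary assumed value:

import numpy as np

def blend_motion_blur(prev_output, new_frame, blend=0.3):
    """Cheap accumulation-style blur: mix the freshly rendered frame with the
    previously displayed one. A higher `blend` keeps more of the old frame,
    smearing motion and helping hide a low frame rate."""
    return blend * prev_output + (1.0 - blend) * new_frame

# Toy usage: a bright dot moving one pixel per frame across a 1x8 "image".
output = np.zeros(8)
for step in range(4):
    frame = np.zeros(8)
    frame[step] = 1.0                      # the moving object this frame
    output = blend_motion_blur(output, frame)
    print(np.round(output, 2))             # a fading trail is left behind the dot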
 
Try playing Witcher 3 at 30 fps... It's borderline unplayable during fights, etc.

That's probably the only game I've ever seen that LOVES hyper-threading.
 
I've played things at 30 fps and they seemed fine. I've played things where my system would hit well over 100 fps and they sucked and felt choppy. It really depends on the gameplay, how input is handled, and how big the FPS swing is. Some game styles are really punishing on input lag; some are less so. Some input handling feels pretty locked in, if slightly less awesome, at lower frame rates, and some feels like a floaty, skatey mess if you aren't at a high framerate. Some feel like that at any frame rate because the input handling is just shitty. Some games are playable at lower frame rates, but your latency has to be reasonably stable; if it's all over the place at given settings, it's not going to work.
 
I have gotten used to a 144 Hz monitor and G-Sync. It is certainly noticeable now when dipping below 100 FPS, or so.
 
I have gotten used to a 144 Hz monitor and G-Sync. It is certainly noticeable now when dipping below 100 FPS, or so.

Isn't the whole point of G-Sync to synchronize FPS with refresh rate so you don't notice dips/low FPS?

30 FPS is fine for anything except shooters, imo.
I personally find 120+ FPS pointless, especially when it comes at the expense of image quality.
 
Isn't the whole point of G-Sync to synchronize FPS with refresh rate so you don't notice dips/low FPS?

30 FPS is fine for anything except shooters, imo.
I personally find 120+ FPS pointless, especially when it comes at the expense of image quality.
The point of G-Sync and other VRR is to eliminate tearing when the frame rate does not equal the refresh rate. You can still tell when the frame rate gets low. I start to notice it when a game drops below 80, while below 60 is obvious.
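A simplified way to picture that: with a fixed 60 Hz refresh and vsync, a finished frame waits for the next scanout; with VRR the panel refreshes whenever the frame is ready. The sketch below is a toy model that ignores low-framerate compensation and the panel's refresh-range limits:

import math

SCAN_INTERVAL_MS = 1000 / 60   # a fixed 60 Hz panel scans out every ~16.7 ms

def display_time_vsync(frame_ready_ms):
    """Fixed refresh + vsync: a finished frame waits for the next scanout boundary."""
    return math.ceil(frame_ready_ms / SCAN_INTERVAL_MS) * SCAN_INTERVAL_MS

def display_time_vrr(frame_ready_ms):
    """VRR (G-Sync/FreeSync style): the panel refreshes as soon as the frame is done,
    so nothing waits and nothing tears - but a slow frame is still a slow frame."""
    return frame_ready_ms

# A GPU delivering a frame every 25 ms (40 fps) on that 60 Hz panel:
for ready in (25, 50, 75, 100):
    print(f"frame ready at {ready} ms -> shown at "
          f"{display_time_vsync(ready):.1f} ms with vsync, {display_time_vrr(ready):.1f} ms with VRR")

Either way a 25 ms frame is still a 25 ms frame, which is why low frame rates remain noticeable even with G-Sync.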
 
For competitive games I shoot for 250+ FPS even though my monitor displays 144 Hz. Lower input delay, and certain games like CS:GO, PUBG, and old-school Quake games like Q3 love crazy-high FPS. Also: frame time.

For single player I prefer 120+; I'll do 60 fps for ultra, maxed graphics. If the game isn't particularly fast-paced, I'll do anything that doesn't bother my eyes horribly, which is a minimum of 40 FPS.
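For reference, the frame-time budgets those targets imply; plain arithmetic, nothing game-specific:

# Frame-time budget implied by a few common FPS targets: frametime_ms = 1000 / fps.
for fps in (30, 60, 120, 144, 250):
    print(f"{fps:>3} fps -> {1000 / fps:6.2f} ms per frame")
# 250 fps leaves only 4 ms per frame versus ~6.9 ms at 144 Hz, which is part of
# why a high frame-rate cap can still shave input delay even on a 144 Hz panel.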
 
If you picked a very high FPS, can you actually tell the difference, or is it more like a placebo effect? Do you think you could see the difference between 100 and 120 in a blind test? I doubt it.
When I got my first 144 Hz monitor, I discovered something. I don't know if it's the same for everyone, but it's my personal experience.

Somewhere around 90-100 fps is where the "magic" happens for me. Below that, my brain doesn't register onscreen motion as actual movement. It knows that it's a screen, that it's displaying a sequence of images very quickly, and that it's not motion.

Above that threshold is when my brain is tricked into seeing onscreen movement as actual movement. The instincts that respond to movement in my field of view are as present when watching a 100+ fps screen as they are when watching actual events.

I'm not a competitive enough or practiced enough gamer to say that the extra frames give me any sort of "edge," although I've been using high refresh-rate monitors long enough now that I can easily spot lower ones. I can adjust to 60 fps (as I sometimes game on my travel PC), but the first few minutes of gameplay are always a bit jarring until I become reacclimated.

High refresh rates are a very subtle thing, I think, but they bring enormous value to a gaming experience for me.
 
I like mine synced. So 60. Or 120. As long as the card can keep up. This is why I'll continue to buy new cards, even though I only game at 1920x1080. I want to keep up with effects being cranked up, and quality settings on new games, but I want my frame rate to stay put. I can't stand tearing, stuttering, and while I can handle an OCCASIONAL dip in frames per second, I am happiest when it's solid. Not faster, not slower. (unless faster is also solid) :D
 
Picked 120 FPS, but really I mean sub-8.3 ms frametimes, because 'average' involves arbitrary highs and lows, and I'm not a fan of lows :).
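That "averages hide the lows" point can be made concrete with frame-time percentiles. A minimal sketch with made-up frame times; the 1%-low style metric here is a generic illustration, not any specific benchmark tool's exact method:

import statistics

# Hypothetical frame times in ms: mostly fast, with a few hitches sprinkled in.
frame_times = [7.0] * 95 + [25.0] * 5

avg_fps = 1000 / statistics.mean(frame_times)
p99_frame_time = sorted(frame_times)[int(len(frame_times) * 0.99)]  # ~99th percentile frame time

print(f"average: {avg_fps:.0f} fps")                     # ~127 fps, looks great on paper
print(f"99th percentile frame time: {p99_frame_time} ms "
      f"(~{1000 / p99_frame_time:.0f} fps during the hitches)")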
 
The same thing it does below 100 fps.

It draws a new frame on your monitor as soon as your GPU gets done with it, instead of waiting for your monitor's next vertical scan.

I get that, but have you noticed an improvement at, say, 120 Hz?
 
I don't really notice dips under 60 - at least not unless it starts to tear. I do notice if it dips so much that it starts to stutter/hiccup - that's usually under 30 fps.

I do also notice big jumps in framerate. If I'm cruising along at 60+ and all of a sudden it drops to 30, that's very jarring as well. It's most noticeable on view changes - like panning left/right where new items are coming into view - rather than, say, moving just forward where it's only the perspective changing. Whereas with a constant 30 FPS, I can often live with it.

I can notice a difference in FPSes on console vs. PC (30/60 fps), but it isn't a huge deal for me - the controller vs. KB/M is a much bigger deal, and I honestly don't know how much of it is that I'm limited to 30 FPS and how much is that the controller has a limited range of control.
 
I can get by with 25 fps in the strategy and builder games I typically play. Simulation speed is my metric, as opposed to just fps.

For all other games, above 35 fps with no dips.
 
You know what I find fun? Sitting FPS elitists down at a system locked to 30 and they don't know it.

Never once heard a bitch about it.

It's all in your head, IMO. Anything 24 FPS+ is fine by me. That's the same FPS movies are shot at (or used to be shot at). Have you ever heard someone bitch about low FPS in a movie?

That would never work on me. Yeah, I'm an FPS elitist, and I would see the low fps straight away.
 
I think for me, as I have been using 144 Hz now for about 5 years, I always set my graphics options to get me at least 75-80 FPS minimum.

To me, a 144 Hz monitor vs 60 Hz is a night-and-day difference. I notice straight away if the game dips below 60-ish.

Any game that requires camera movement, especially first-person shooters - yeah, for me 80 is the minimum.
Racing or other types of games, I can live with 60 and above, though.
 
I get that, but have you noticed an improvement at, say, 120 Hz?
When I got my first 144 Hz monitor, I discovered something. I don't know if it's the same for everyone, but it's my personal experience.

Somewhere around 90-100 fps is where the "magic" happens for me. Below that, my brain doesn't register onscreen motion as actual movement. It knows that it's a screen, that it's displaying a sequence of images very quickly, and that it's not motion.

Above that threshold is when my brain is tricked into seeing onscreen movement as actual movement. The instincts that respond to movement in my field of view are as present when watching a 100+ fps screen as they are when watching actual events.

I'm not a competitive enough or practiced enough gamer to say that the extra frames give me any sort of "edge," although I've been using high refresh-rate monitors long enough now that I can easily spot lower ones. I can adjust to 60 fps (as I sometimes game on my travel PC), but the first few minutes of gameplay are always a bit jarring until I become reacclimated.

High refresh rates are a very subtle thing, I think, but they bring enormous value to a gaming experience for me.
 

I meant at 120 Hz with and without G-Sync. Sorry for not making myself clear enough.
 