Anyone here NOT care if a game is 60 FPS or not?

Anything below about 50 fps and I start adjusting settings. How people play with 30 fps is baffling to me.
 
> I don't think good enough motion blur is physically possible. The brain is too capable of seeing the difference between stuttery blur and actual smooth motion. Like a panning shot in a movie: no matter how well they do it, it looks stuttery.

Which might be desirable for certain types of games. Like I said, it would depend both on the game and on the person (what the person wants in terms of motion). If you wanted a game to have the look and feel of film, you might want to cap it to 24 Hz and use the right type of filmic effects to achieve that.

Perfection (with respect to the limits of the human eye) probably isn't achievable at 24 fps, but you can actually synthesize smoother motion than current film is able to capture by leveraging certain optical tricks.
 
The way the game feels is more important to me than eye candy.

60fps or go home.
 
Smoothness in gameplay is all I care about. I've never measured frame rate for any game I've played. I don't own consoles. I had enough of frame rate monitoring during my flight sim days.
 
60 is the bare minimum for SP games. Multiplayer is all about 120. Been like that since... well, Quake 3. I am very picky when it comes to fps and latency.
 
Minimum FPS is far more important, by a lot! I would rather have a steady 40-50 FPS than 60 or 100 that drops down to 20 at points.
 
Some games you can play fine at 40-50 fps and they're still smooth enough. 60 is desirable for most, obviously.
 
I don't play shooters much and don't play PvP online at all, and I really don't care much about 60 fps most of the time. Mostly I only worry about minimum FPS. 30 is generally fine for my RPGs, and real time strategy games can go lower. Then there's Civ 5... I need what, like 10 fps for that?
 
I've never cared about 60 FPS... as long as I hit a minimum of 30, I'm good. 60 is ideal, but 30 is very playable. I'm more concerned about image quality and maxing out my graphics at 30 FPS versus hitting 60.
 
I don't really look at framerate - just whether or not the game plays smoothly.

That said, it annoys me more when the game I'm playing is competitive in nature.

In single player games, I wouldn't mind the game hiccuping every now and then.
 
> I've never cared about 60 FPS... as long as I hit a minimum of 30, I'm good. 60 is ideal, but 30 is very playable. I'm more concerned about image quality and maxing out my graphics at 30 FPS versus hitting 60.


Yeah but see, I can detect when it flips from 30 to 60, or vice versa, or even 60 to 45 and so on. It sucks, I guess some people are just more prone to noticing.
 
> Yeah but see, I can detect when it flips from 30 to 60, or vice versa, or even 60 to 45 and so on. It sucks, I guess some people are just more prone to noticing.

Ditto. Though I find that when I'm really sucked into the game, the immersion overrides the frame dip. I guess that says more about how good the game is than anything. Far Cry 3's bumpy framerate bothered me not in the least, especially once I got into it and started busting caps in some pirate ass. :cool:
 
I wonder if there's a correlation between those gamers on a limited budget willing to accept 30 and those of us always having two or three of the top cards unwilling to accept less than 60 and aiming for 120?

I don't play games online. I get headaches below 60. I've aimed for 100-120 since the days of the 21" Sony CRT.
 
> I wonder if there's a correlation between those gamers on a limited budget willing to accept 30 and those of us always having two or three of the top cards unwilling to accept less than 60 and aiming for 120?
>
> I don't play games online. I get headaches below 60. I've aimed for 100-120 since the days of the 21" Sony CRT.

Probably. People who spend more on their systems should expect to get much better performance than those who do not spend as much.
 
> Probably. People who spend more on their systems should expect to get much better performance than those who do not spend as much.

I don't exactly have a slouch of a PC (even by [H] standards), but as I've stated above, I'm one of those people who can tolerate 30 fps. And honestly, many people I know with decent gaming systems want the gaming system for the graphical quality over being able to play at 100+ fps.

I'm not going to deny that there is a difference between 30 fps and 60 fps (it's been shown people can distinguish a single frame within several hundred frames in a second), but I wonder how many people here who claim to need 60 fps minimum would really need it if the game had a consistent frame rate, even as low as 30 fps, but with no way to actually judge it (i.e., without FRAPS running)? I honestly think much of this is psychological.
 
> I wonder if there's a correlation between those gamers on a limited budget willing to accept 30 and those of us always having two or three of the top cards unwilling to accept less than 60 and aiming for 120?
>
> I don't play games online. I get headaches below 60. I've aimed for 100-120 since the days of the 21" Sony CRT.

Nope, not at all.

Games with top graphics have crap gameplay and are about immersion; it's more like a movie than a game. Since the gameplay sucks so people can see all the graphics, who the fuck cares what your FPS is? It's silly. View it as a movie: 24 fps works there. For most modern RPGs, and even stuff like Metro, this works just fine. There's no need for more. It's about the amount of detail you can chuck at it; the point is to watch the game, not to play it. This is what PC gaming is all about.

On the other hand, if it's old school stuff like Quake Live or CS 1.6, you know, games, not graphics fests... I bust out the Trinitron, go for 120 or bust, and strip every last bit of detail out of it. Because these games are for playing, not for watching.

It's pretty basic: the better the graphics, the less frame rates and controls count. Games you play to look at are not games you play to play.
 
For me, I'm willing to spend the extra money just to ensure I'm at 60 fps. I certainly don't want to, but at this point I guess I'm spoiled.
Luckily I'm only on a 1080p TV, so my resolution isn't too insanely high, and I'm only shooting for 60 instead of 90 or 120.
Something to also keep in mind with console games is that a LOT of people have motion smoothing enabled on their television, even if they don't know it. That essentially "fakes" higher framerates by inserting extra frames into games/TV/etc. While it cripples precise inputs like a mouse, it's barely noticeable with analog stick aiming. Short of fighting games I tend to enable it with console games, too. That's one other reason a console game at 30fps often looks better than a PC one at the same framerate.
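To make the "inserted frames" idea concrete, here's a minimal sketch of the naive version in Python, assuming NumPy image arrays. Real TVs do motion-compensated interpolation, which is far smarter than this simple blend; the function names here are illustrative only.

```python
import numpy as np

def blend_frames(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Naively synthesize an in-between frame by linear blending.

    frame_a, frame_b: H x W x 3 uint8 arrays. Real motion smoothing
    estimates per-pixel motion vectors; this crude blend just shows
    where the "extra" frames come from.
    """
    mixed = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return mixed.astype(np.uint8)

def double_framerate(frames):
    """Yield each real frame plus one synthetic midpoint frame (30 -> 60 fps).

    Note the latency cost: a midpoint frame can't be shown until the
    *next* real frame exists, which is why smoothing hurts mouse input.
    """
    for a, b in zip(frames, frames[1:]):
        yield a
        yield blend_frames(a, b)
    yield frames[-1]
```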
 
I also find it really depends on the game. I'm fine hovering above 30 on slower paced games but the faster the action, the more fps I crave.
 
> Nope, not at all.
>
> Games with top graphics have crap gameplay and are about immersion; it's more like a movie than a game. Since the gameplay sucks so people can see all the graphics, who the fuck cares what your FPS is? It's silly. View it as a movie: 24 fps works there. For most modern RPGs, and even stuff like Metro, this works just fine. There's no need for more. It's about the amount of detail you can chuck at it; the point is to watch the game, not to play it. This is what PC gaming is all about.
>
> On the other hand, if it's old school stuff like Quake Live or CS 1.6, you know, games, not graphics fests... I bust out the Trinitron, go for 120 or bust, and strip every last bit of detail out of it. Because these games are for playing, not for watching.
>
> It's pretty basic: the better the graphics, the less frame rates and controls count. Games you play to look at are not games you play to play.

That's an awfully jaded view. I certainly don't count games like Dead Island, Skyrim, BL2 and so on as just 'watching' games...
 
> I'm not going to deny that there is a difference between 30 fps and 60 fps (it's been shown people can distinguish a single frame within several hundred frames in a second), but I wonder how many people here who claim to need 60 fps minimum would really need it if the game had a consistent frame rate, even as low as 30 fps, but with no way to actually judge it (i.e., without FRAPS running)? I honestly think much of this is psychological.

Sometimes it is, but sometimes framerates are also deceiving. I can get GTA4 to run at a steady 60 fps... but it never actually "looks" like it. No matter what I do, it still looks closer to 30 than 60, even though everything I have confirms it's at 60. It's just not very smooth.
Other games can look great running in the 40s and you'd have no idea you weren't at 60, especially when a ton of stuff is going on.
There's definitely something else in play, but *usually* 60 is the sweet spot.
To me, when something is running around 30 fps, there's almost a little bit of "flicker" to the screen, like something just isn't quite right. To me, 50ish is right where panning around and basic movement become very smooth and eye-catching.
 
> I don't exactly have a slouch of a PC (even by [H] standards), but as I've stated above, I'm one of those people who can tolerate 30 fps. And honestly, many people I know with decent gaming systems want the gaming system for the graphical quality over being able to play at 100+ fps.
>
> I'm not going to deny that there is a difference between 30 fps and 60 fps (it's been shown people can distinguish a single frame within several hundred frames in a second), but I wonder how many people here who claim to need 60 fps minimum would really need it if the game had a consistent frame rate, even as low as 30 fps, but with no way to actually judge it (i.e., without FRAPS running)? I honestly think much of this is psychological.

Well, the advantage to me (at least) is that when you're running a higher framerate, you're seeing more detail, so to speak, assuming the use of a 120 Hz display.

For fast-paced multiplayer games especially, turning is a biggie. If it takes you (depending on mouse sensitivity) 0.25 seconds to do a 180° turn, running the game at 30 FPS will only show you roughly 7-8 frames of information. Likewise, running it at 60 FPS will show you 15 frames, and 120 FPS 30 frames, in the same amount of time. It doesn't sound like much, but it can make all the difference in the world when it comes to spotting movement off the side of the screen or at a distance.

On top of that, it just makes animation so buttery smooth, even for non-multiplayer games... at least in most titles. Some games just have terrible animation no matter the refresh rate and framerate. But in most titles, it does make things look/feel significantly better.

I can deal with lower than 60 in some cases (50 or so), but 30 is just too low IMO.
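For what it's worth, the turning numbers above are easy to sanity-check; here's a quick sketch in Python (the 0.25 s flick-turn duration is the assumption from the post, not a measured value):

```python
# Frames rendered during a 0.25-second 180-degree flick turn.
TURN_TIME_S = 0.25  # assumed turn duration from the post

for fps in (30, 60, 120):
    frames_during_turn = fps * TURN_TIME_S
    print(f"{fps:>3} fps: ~{frames_during_turn:.1f} frames during the turn")

# Output: 30 fps: ~7.5, 60 fps: ~15.0, 120 fps: ~30.0 -- matching the
# rough 7-8 / 15 / 30 counts above.
```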
 
I don't mind as long as I don't have stuttering and screen tearing. Getting high-priced computer components has never been a high priority for me, so if I expected 60 fps, I would never be able to play any game.

However, high fps is definitely a big deal for some game types, although I can adjust.
 
> Well, the advantage to me (at least) is that when you're running a higher framerate, you're seeing more detail, so to speak, assuming the use of a 120 Hz display.
>
> For fast-paced multiplayer games especially, turning is a biggie. If it takes you (depending on mouse sensitivity) 0.25 seconds to do a 180° turn, running the game at 30 FPS will only show you roughly 7-8 frames of information. Likewise, running it at 60 FPS will show you 15 frames, and 120 FPS 30 frames, in the same amount of time. It doesn't sound like much, but it can make all the difference in the world when it comes to spotting movement off the side of the screen or at a distance.
>
> On top of that, it just makes animation so buttery smooth, even for non-multiplayer games... at least in most titles. Some games just have terrible animation no matter the refresh rate and framerate. But in most titles, it does make things look/feel significantly better.
>
> I can deal with lower than 60 in some cases (50 or so), but 30 is just too low IMO.

I think it's actually a good exercise to play at low fps sometimes. I had to get used to 24 fps or less for WoW since I had a shitty laptop, and for UT2004 as well, so I learned to predict what my opponent would do and be ready with a DoT or flak cannon blast.
 
> That's an awfully jaded view. I certainly don't count games like Dead Island, Skyrim, BL2 and so on as just 'watching' games...

None of those have remotely good gameplay on any level, as is typical of PC games or games that cross onto the PC platform. You're there to watch the game as you play through it and look at sparkly shiny things. This has always held true; take Doom... as soon as it became about GFX, the gameplay went to shit. It's the universal truth in gaming: the better the GFX, the shittier the gameplay. It's why hardcore gamers avoid fancy graphics like herpes and stay the hell away from anything that starts talking about what sort of visuals it pumps out. If they're talking about the graphics, it's because the gameplay sucks and the game is made for a four-year-old who will be impressed by the visuals. You can also tell someone is a casual gamer when they talk about graphics as being important; it's a sure sign of a pure casual.

Don't get me wrong, I like and buy those games, but I'm not stupid enough to think they are any more of a game than watching a Blu-ray is. It's fun, but it's casual as hell.

Good games don't feel the need to hide behind fancy visuals; the same's true of movies as well. Once they start screaming visuals, you know it's to hide a steaming turd.
 
> I wonder if there's a correlation between those gamers on a limited budget willing to accept 30 and those of us always having two or three of the top cards unwilling to accept less than 60 and aiming for 120?
>
> I don't play games online. I get headaches below 60. I've aimed for 100-120 since the days of the 21" Sony CRT.

What good is higher FPS when it comes at the price of latency? Also, given the argument you make, I think there's a chance you're deluding yourself. I don't know for sure, though, and I'm not claiming it, so take it easy pls :D
 
As others have said: micro-stutter. That's what bugs the hell out of me. FPS isn't an issue for me as long as it's a fairly constant fps. Even then, sometimes it goes undetected unless I have FRAPS running. I think this is similar to the 3D discussion, where many claim to hate it and others swear by it. Purely subjective.

[H]i by the way.
 
> Well, the advantage to me (at least) is that when you're running a higher framerate, you're seeing more detail, so to speak, assuming the use of a 120 Hz display.
>
> For fast-paced multiplayer games especially, turning is a biggie. If it takes you (depending on mouse sensitivity) 0.25 seconds to do a 180° turn, running the game at 30 FPS will only show you roughly 7-8 frames of information. Likewise, running it at 60 FPS will show you 15 frames, and 120 FPS 30 frames, in the same amount of time. It doesn't sound like much, but it can make all the difference in the world when it comes to spotting movement off the side of the screen or at a distance.
>
> On top of that, it just makes animation so buttery smooth, even for non-multiplayer games... at least in most titles. Some games just have terrible animation no matter the refresh rate and framerate. But in most titles, it does make things look/feel significantly better.
>
> I can deal with lower than 60 in some cases (50 or so), but 30 is just too low IMO.

Well, if you had a consistent 30 fps, it should still appear buttery smooth. As other people have already stated, it's more likely microstutter that's occurring. You could be running at 1000 fps, but if there are pauses greater than 40 ms anywhere in those frames, it will still appear jerky. Rather than just counting how many frames you get in a second, I'd like to see FRAPS give a better indicator, such as perceived frames per second. I'm sure there are already fps overlays that do this (though I'm unaware of what they are). (*EDIT* Maybe a perceived minimum frame rate, which takes the longest gap between frames within a timespan and inverts it. Once you can get a perceived minimum frame rate of 30 fps, it should appear smooth.)
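A minimal sketch of that perceived-minimum idea, assuming you log per-frame times yourself; the function name is made up and this is not an actual FRAPS feature:

```python
def perceived_min_fps(frame_times_ms):
    """Invert the single worst frame time in a window.

    One long hiccup caps the perceived rate no matter how many frames
    were rendered overall -- exactly the microstutter effect described
    above.
    """
    return 1000.0 / max(frame_times_ms)

# 99 frames at 5 ms plus a single 40 ms stutter: ~5.4 ms average frame
# time (roughly 185 fps on a counter), but the worst gap feels like 25 fps.
frame_times = [5.0] * 99 + [40.0]
print(perceived_min_fps(frame_times))  # 25.0
```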

As far as the information goes, unless you have a very tight FoV, I doubt you'll be much better at interpreting those extra 7 frames in that time span. Unless another player pops in at the edge of the screen in that one frame, you won't be missing much. There would probably be auditory cues anyway.

In terms of why better players turn down the graphics: yes, I agree, some of it has to do with getting a consistently high frame rate. Consistency is the key here, though. But also, some older engines had bugs which gave higher fps a speed advantage, and lower graphical quality makes players stand out more.
 
I don't care about FPS, I care whether it's smooth. FPS doesn't tell you everything; just look at AMD and CrossFire stuttering.
 
I think a "steady" FPS count is preferable, no matter what it is. If your framerate spikes between 30-100 and averages 60, I think most would prefer an average of 40 that fluctuates from 35-45.
 
Above 30 and I'm OK. My internet (3-5 Mbps) isn't the best anyway, and honestly, if I spent that much time adjusting graphics settings to the game type (32 players vs. 64) and maps, I would never play my games.
 
> It depends on the game, but for the most part, I don't care. I'm much more sensitive to "micro-stutter" than I am to high frame rate. If I can get a game locked at around 30 fps that never stutters, I'm good.

I second that.
IMO, the 60 FPS barrier is generally overrated. Smoothness is the key.
 
I don't pay attention to frame rates; I simply set a high resolution (5990x1200 when available), then adjust image settings if I experience stuttering. I don't care if it's 30 fps or 150 fps, as long as it's playable.


 
Try to play BF3 on the outdoor maps at less than 60 fps with all hell breaking loose. 'nough said
 
Yeah, not having at least 60 FPS does feel somewhat sluggish, but if it's a steady 50, 40, or even 30, then it isn't too bad. The worst thing is fluctuating FPS, like people have mentioned; going from 10 to 40 to 60 to 30 and back to 10 really does suck. So I say as long as it's above 30 FPS and stays consistently where it should be, it's fine. Obviously I still prefer a steady 60 FPS or higher, though.
 