Anyone here NOT care if a game is 60 FPS or not?

I would sit through a slideshow just so I could play an FPS game at a higher resolution back in the '90s. I can live with a 20 fps low point if the game is good. I notice all manner of slowdown; it just doesn't annoy me unless it's an MP match and it costs me a kill/round.
 
It depends heavily on the game. If it's something where quick timing matters and/or it's competitive (FPS, racing), then 60 fps or better is important. However, for a third-person, single-player action game (think Tomb Raider, GTA, etc.), it really doesn't bother me as much. It still has to be at playable levels, 30 fps at the absolute minimum with averages above 40, but I don't strive to stay above 60 at all times.

True for me too... couldn't have said it better.
 
None of those have remotely good gameplay on any level, as is typical of PC games or games that cross onto the PC platform. You're there to watch the game as you play through it and look at sparkly, shiny things. This has always held true; take Doom: as soon as it became about GFX, the gameplay went to shit. It's the universal truth in gaming: the better the GFX, the shittier the gameplay. It's why hardcore gamers avoid fancy graphics like herpes and stay the hell away from anything that starts talking about what sort of visuals it pumps out. If they are talking about the graphics, it's because the gameplay sucks and the game is made for a four-year-old who will be impressed by the visuals. You can also tell someone is a casual gamer when they talk about graphics as being important; a sure sign of a pure casual is that they care about it.

Don't get me wrong, I like and buy those games, but I'm not stupid enough to think they are any more of a game than watching a Blu-ray is. It's fun, but it's casual as hell.

Good games don't feel the need to hide behind fancy visuals; the same is true of movies. Once they start screaming visuals, you know it's to hide a steaming turd.

Your opinions interest me. I am curious: to you, what are the best and worst games of the last few years, and why? I just want to see where this inverse relationship between great games and great graphics holds or falls down.
 
I think 60+ is nice, but as long as it's above 40 I'm happy. Under 40 I can notice the choppiness and it bugs the crap out of me.
 
Above 40 fps is good. I'll generally trade higher resolution and fancy effects (except AA) for fps, as long as fps stays at about 30 or above. It can depend on the game, of course.
 
Don't care about the framerates as long as there is no microstutter and the game plays smoothly in all situations and all load ranges.

I know some people in here say they turn down eye candy for good framerates. I would never do that since, in my opinion, that defeats the purpose of playing games on a PC. If I wanted dumbed-down graphics for consistent frame rates, I would just get a console.

If you can't play the games you want with all settings maxed and maintain comfortable framerates, it's time to save your milk money and upgrade.
 
Well, if you had a consistent 30 fps, it should still appear buttery smooth. As other people have already stated, it's more likely microstutter that's occurring. You could be running at 1000 fps, but if there are pauses greater than 40 ms anywhere among those 1000 frames in that second, it will still appear jerky.
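
Just to illustrate the point, a minimal sketch (all numbers invented, nothing from a real engine): the average fps can look great while a single long frame still reads as a hitch.

```python
# Minimal sketch (invented numbers): average fps can hide stutter.
def frame_report(timestamps, stutter_ms=40):
    """Given frame timestamps in seconds, return the average fps and
    any frame times that exceed the stutter threshold."""
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg_fps = len(deltas) / sum(deltas)
    hitches = [d * 1000 for d in deltas if d * 1000 > stutter_ms]
    return avg_fps, hitches

# 99 frames at 1 ms each plus one 50 ms pause: the average is still
# ~670 fps, but that one pause is a visible hitch.
timestamps, t = [0.0], 0.0
for i in range(100):
    t += 0.050 if i == 49 else 0.001
    timestamps.append(t)
print(frame_report(timestamps))
```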

I'm aware of what microstutter is, and that's not what I'm describing at all. There is an absolute difference in how smooth V-synced 30 FPS looks versus 60, and especially 120.

Hell... I can even tell a difference on my desktop doing work in Office or browsing the internet (sometimes my 120 Hz gets set back to 59 Hz and I have to manually set it back to 120).

As far as the information goes, unless you have a very tight FoV, I doubt you'll be much better at interpreting those extra 7 frames in that time span.

You'd be surprised. After getting the 120 Hz monitor, my kill/death ratio improved dramatically. It's not just spotting the movement of other players in the background, either; it's also the improved accuracy of the mouse for aiming. It's the equivalent of having your mouse set to 300 DPI versus 1200 DPI (30 FPS versus 120 FPS).

I'm sure other 120 Hz users can back me up on that. It's something you have to use to understand, I think.
 
Don't care about the framerates as long as there is no microstutter and the game plays smoothly in all situations and all load ranges.

I know some people in here say they turn down eye candy for good framerates. I would never do that since, in my opinion, that defeats the purpose of playing games on a PC. If I wanted dumbed-down graphics for consistent frame rates, I would just get a console.

That's the strength of the PC, IMHO. You can tailor just about any game to your personal preferences. To me, at least, gameplay > lens flare and SSAO
(not that they're not nice effects... but when you're using somewhat older hardware, you have to make sacrifices)

Besides, good luck getting a console game to run at anything over 60FPS :p
 
That's the strength of the PC, IMHO. You can tailor just about any game to your personal preferences. To me, at least, gameplay > lens flare and SSAO
(not that they're not nice effects... but when you're using somewhat older hardware, you have to make sacrifices)

Besides, good luck getting a console game to run at anything over 60FPS :p

Some of the best games on PCs aren't even that demanding on the computer. FTL comes to mind...
 
I remember when CS came out; I played it at about 10 fps and keyboard only.

I have no idea how I could play like that back then.
 
It depends on the game. Some games are playable at 30 fps, but having it around 60 fps is always a plus. I for one don't care about fps as long as it is playable.
 
I remember when CS came out; I played it at about 10 fps and keyboard only.

I have no idea how I could play like that back then.

That's how I played Quake and the original Team Fortress. I was confused by how people used the mouse to aim back then.
 
Well, if you had a consistent 30 fps, it should still appear buttery smooth.
Given very convincing motion blur, yeah, 30 fps will be extremely smooth. Without motion blur, though, most anyone can very easily detect a difference between 30 fps and 60 fps. The sample rate just isn't high enough. Not everyone can detect a difference between 60 and 120, but the difference between 30 and 60 is marked.
 
I hate motion blur for most games. It works well in racing games and the like, but in anything else it drives me nuts. I can't play the 360 version of Fable 3 just because it's so bad.
 
I hate motion blur for most games. It works well in racing games and the like, but in anything else it drives me nuts. I can't play the 360 version of Fable 3 just because it's so bad.

Word. Alan Wake is really bad as well.
 
Given very convincing motion blur, yeah, 30 fps will be extremely smooth. Without motion blur, though, most anyone can very easily detect a difference between 30 fps and 60 fps. The sample rate just isn't high enough. Not everyone can detect a difference between 60 and 120, but the difference between 30 and 60 is marked.

I can't. If you show me a website that says "here is 30 fps and here is 60 fps," then yeah, I can point it out looking at the same objects next to each other.

However, if I am in the middle of a firefight in BF3 or some shit and you ask me, "Hey, is this particular moment at 60 fps or 30 fps?" I would probably be wrong more than half the time.

That is the problem: 60 fps is just a marketing term these days.
 
I can't. If you show me a website that says "here is 30 fps and here is 60 fps," then yeah, I can point it out looking at the same objects next to each other.

However, if I am in the middle of a firefight in BF3 or some shit and you ask me, "Hey, is this particular moment at 60 fps or 30 fps?" I would probably be wrong more than half the time.

That is the problem: 60 fps is just a marketing term these days.

You should be able to tell pretty easily when aiming down the sights. Maybe some people are just more sensitive to it than others?

Anyone remember CRT flicker at various refresh rates/resolutions? It never bothered some people, while it tended to give others headaches. I'm guessing it's the same sort of deal.

Can you tell the difference between the 24 fps and 48 fps trailers of The Hobbit?
 
However, if I am in the middle of a firefight in BF3 or some shit and you ask me, "Hey, is this particular moment at 60 fps or 30 fps?" I would probably be wrong more than half the time.
My wording was specific. I said "most anyone can very easily detect a difference between 30 fps and 60 fps." If you're asked on the spot whether what you're looking at is 30 fps or 60 fps, you aren't discerning differences between the two: you're making an evaluation based on your experience.
 
I imagine 60 fps would help out in a lot of movies that have that terrible "shaky cam" effect. For example, every movie Michael Bay has ever made.

Or how every fight scene has too much motion blur.

http://frames-per-second.appspot.com/ This is a good demonstration of how motion blur can affect an image terribly at lower fps, while at higher fps motion blur has a more realistic appearance.
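
As a rough sketch of the idea behind that demo (illustrative numbers only, not the site's actual code): per-frame motion blur smears an object across the distance it travels within one frame, so lower fps means a longer, soupier smear.

```python
# Rough sketch (all numbers invented): per-frame motion blur smears a
# moving object across the distance it covers within a single frame.
def smear_per_frame(fps, speed_px_per_s):
    """Pixels of blur one frame accumulates for an object moving at a
    constant speed, assuming the blur covers the whole frame interval."""
    return speed_px_per_s / fps

# A 600 px/s object smears 20 px per frame at 30 fps but only 10 px at
# 60 fps, which is why blur looks heavy at low frame rates and subtler,
# more realistic, at high ones.
for fps in (24, 30, 60, 120):
    print(fps, "fps ->", smear_per_frame(fps, 600), "px of smear")
```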
 
My wording was specific. I said "most anyone can very easily detect a difference between 30 fps and 60 fps." If you're asked on the spot whether what you're looking at is 30 fps or 60 fps, you aren't discerning differences between the two: you're making an evaluation based on your experience.

It's a baseless claim even if it is specific. Who is your "most anyone"?

I am in the field of technology and I can't even really tell. So how many more idiots out there probably don't even know what FPS is?

I am not saying we all don't on this forum, but what I am saying is: have you ever seen a study like the one I described in my post?

Probably not. You know why? Because most people don't notice or care as long as their game runs.
 
I am in the field of technology and I can't even really tell. So how many more idiots out there probably don't even know what FPS is?
It makes no difference whether people know the terminology or not, nor does it matter how much experience people have in the field of technology. Again, we're talking about people being able to discern differences between the two frame rates. I didn't say that most people could look at two examples of motion and say "this one is 30 Hz and this one is 60".

This isn't "I can tell what frequency this is". This is "I can tell that this frequency is different from this other one".
 
People might not know the hows and whys (or even care), but I bet most can at least verify that those two framerates look very different.
 
It makes no difference whether people know the terminology or not, nor does it matter how much experience people have in the field of technology. Again, we're talking about people being able to discern differences between the two frame rates. I didn't say that most people could look at two examples of motion and say "this one is 30 Hz and this one is 60".

This isn't "I can tell what frequency this is". This is "I can tell that this frequency is different from this other one".

If people don't know the terminology, then people won't actively seek out the difference between the copies. Hence why you have all these developers talking about 60 FPS, but the packaging never really indicates that.

If I went to my wife and sat down to play a racing game with her, say Need for Speed on the Xbox, and I said to her, "Man, this game looks so much better at 60 fps on my PC," she isn't going to be like, "I know, dude." She is going to say, "What is fps?"

I could go to a website and show her a side-by-side picture, but I doubt that after looking at that picture she would be able to point out the difference in game.

That is what you are ignoring. You guys are actively looking for something like 60 FPS, and then you are conditioning yourself to say "I have to tweak the game settings for 60 FPS or I vomit, blah blah," when all along, in an A/B blind test, you might not be able to pick out the difference.
 
People might not know the hows and whys (or even care), but I bet most can at least verify that those two framerates look very different.

Not in the moment. Sure, on a website maybe, but sit them down for 15 minutes with a custom demo of, say, BF3.

Then overlap sections of 30 and 60 fps gameplay. Have them press a button each time there is a segment of 30 fps gaming vs. 60 fps gaming.

Report back the results. I will be waiting, and I will be right: most people won't notice a difference, even your hardcore tweakers who don't have the FPS counter in front of them.
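
For what it's worth, that test could be scored something like this (a hypothetical sketch; `play_segment` is a made-up stand-in for real playback plus a real viewer, stubbed with a coin flip here):

```python
# Hypothetical sketch of the blind test described above.
import random

def play_segment(fps):
    """Pretend to show a demo segment at `fps` and return True if the
    viewer flagged it as the 30 fps one. Stubbed with a coin flip; a
    real trial would wire this to the game and a button."""
    return random.random() < 0.5

def run_blind_trial(num_segments=20):
    schedule = [random.choice([30, 60]) for _ in range(num_segments)]
    hits = sum(play_segment(fps) == (fps == 30) for fps in schedule)
    # 0.5 is chance level; a score well above it would mean the
    # difference really is visible in the moment.
    return hits / num_segments

print(run_blind_trial())
```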
 
Not in the moment. Sure, on a website maybe, but sit them down for 15 minutes with a custom demo of, say, BF3.

Then overlap sections of 30 and 60 fps gameplay. Have them press a button each time there is a segment of 30 fps gaming vs. 60 fps gaming.

Report back the results. I will be waiting, and I will be right: most people won't notice a difference, even your hardcore tweakers who don't have the FPS counter in front of them.

I would think that both of them would need to be pre-rendered to ensure that all other factors are the same; the only difference would be the framerate. Considering that 60 is evenly divisible by 30, I would postulate that it is more difficult to tell the difference between 60 and 30 fps than it would be to tell the difference between 25 or 35 and 60.
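
One way to get that (a sketch under the assumption you render once at 60 fps; the frame lists here are stand-ins, not a real video pipeline): derive the 30 fps clip from the 60 fps one by holding every other frame, so the content is identical and only the frame rate differs.

```python
# Sketch (hypothetical frame lists): build the 30 fps comparison clip
# from a single 60 fps render, so nothing but the frame rate changes.
def decimate_to_30(frames_60):
    """Drop every other frame of a 60 fps sequence, then double each
    remaining frame so the clip still plays back on a 60 Hz display."""
    held = frames_60[::2]                      # keep frames 0, 2, 4, ...
    return [f for f in held for _ in (0, 1)]   # show each one twice

frames_60 = list(range(12))                    # stand-in for rendered frames
print(decimate_to_30(frames_60))               # [0,0,2,2,4,4,6,6,8,8,10,10]
```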
 
I would think that both of them would need to be pre-rendered to ensure that all other factors are the same; the only difference would be the framerate. Considering that 60 is evenly divisible by 30, I would postulate that it is more difficult to tell the difference between 60 and 30 fps than it would be to tell the difference between 25 or 35 and 60.

Agreed, it's just that 30 and 60 are such buzz terms or standards around game development, I guess.

You could run your test at a lot of different frame rates if you wanted to prove a number of different things.
 
I would think that both of them would need to be pre-rendered to ensure that all other factors are the same; the only difference would be the framerate. Considering that 60 is evenly divisible by 30, I would postulate that it is more difficult to tell the difference between 60 and 30 fps than it would be to tell the difference between 25 or 35 and 60.

Well, it's fairly easy to tell the difference between 15 fps and 30 fps, even though they're both divisible by 15. And for me, it's actually easier to tell the difference between 24 fps and 30 fps than it is to tell the difference between 30 fps and 60 fps. (You can use that website posted above to check for yourself.) Not that I can't tell the difference, but as the gap between frames becomes smaller, the noticeable effect becomes less. It's similar to how the difference between 720p -> 1080p looks a lot bigger to me than 1080p -> 4K.

The problem with a frame rate of 30 fps right now is that in that second, you probably get half a second at 45 fps and half a second at 15 fps. It's those minor dips which cause people to say "OMG, this looks horrible." I still think the best approach would be a better measure of perceived frame rate, because 30 fps isn't necessarily 30 fps, even in the same game. And for me personally, it would help clean up this myth that 30 fps is unplayable.
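
The arithmetic behind that point, with the numbers from above:

```python
# Illustrative arithmetic: the same "30 fps" can be smooth or lumpy.
frames = 0.5 * 45 + 0.5 * 15   # 22.5 + 7.5 = 30 frames in that second
print(frames, "frames -> reported as 30 fps")

steady_ms = 1000 / 30          # 33.3 ms per frame if perfectly even
fast_ms = 1000 / 45            # ~22 ms per frame in the good half
slow_ms = 1000 / 15            # ~67 ms per frame in the bad half
print(round(steady_ms, 1), round(fast_ms, 1), round(slow_ms, 1))
# Those 67 ms frames are what people actually notice and complain
# about, not the 30 fps average itself.
```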
 
If people don't know the terminology, then people won't actively seek out the difference between the copies.
This isn't of any relevance to anything I've said.

Considering that 60 is evenly divisible by 30, I would postulate that it is more difficult to tell the difference between 60 and 30 fps than it would be to tell the difference between 25 or 35 and 60.
I can't think of any reason why it would make a difference, assuming the timestep is still correct.
 
For me, I don't require a solid 60+ fps; I just aim for at least 40 fps.
 
I liked the visuals and immersion over fps when I was single-monitor/single-player. But when I tried NVIDIA Surround with the 120 Hz monitors, I went apeshit and added the third 780 to get the max frame rate I could. Right now I'm playing Firefall a bit, and outside the cities I'm pushing 120 fps on all monitors when not running in 3D. In a city it can drop to the 40s sometimes, and that is where, in my opinion, you can see a marked difference in the frames/gameplay...
 
This isn't of any relevance to anything I've said.


I can't think of any reason why it would make a difference, assuming the timestep is still correct.

It's because the human eye/brain is more sensitive to rhythm than to raw frequencies. When you double it up, it maintains a similar rhythm, but if it's not divisible, it becomes more noticeable. I mean, have you ever taken a video of a PC monitor while something is running on it? You'll see the flickering of the screen refresh because it's not synchronized. The human eye can tell the difference more when more factors are thrown into the mix.

Even if the viewer can't quite articulate why it's different.
 
People might not know the hows and whys (or even care), but I bet most can at least verify that those two framerates look very different.

I agree. I have Precision running to check my frames when I notice something odd about my movement... Lo and behold, it's less than the normal frame rate. Once out of the chaos, everything is smooth again.

Some ideas to help would be to lower your settings for smoother gameplay, buy more power to push your games at higher settings, or overclock your current system.

But to the OP: yes, I like to be above 80 to stay smooth during fps drops.
 
It's because the human eye/brain is more sensitive to rhythm than to raw frequencies. When you double it up, it maintains a similar rhythm, but if it's not divisible, it becomes more noticeable. I mean, have you ever taken a video of a PC monitor while something is running on it? You'll see the flickering of the screen refresh because it's not synchronized. The human eye can tell the difference more when more factors are thrown into the mix.

Even if the viewer can't quite articulate why it's different.

A cute theory, but bonkers.
 
You'll see the flickering of the screen refresh because it's not synchronized.
You see the flickering because most video is recorded at 30 Hz. That's well below the typical CFF (critical flicker fusion threshold). Even if the video capture were synchronized with the blanking interval, you would still observe flicker.
 