Do people really need 60+ FPS?

BlackTigers91 said:
To game? I've picked up BF2 on my Geforce 6150, and, well, yeah. I've never seen the light of 40FPS. But, one thing I did notice...


I can game perfectly fine at around 25FPS. Sure, I get dips into the single digits (smoke), but that's the only time it feels weird. So, I guess my question is, do people really need 60 FPS to play the game properly? Or is it just something nice? Can higher frame rates help you game?



I have no idea, this 6150 is the best card I've ever seen.

(Oh, and if this turns into flamefest, I'll just lock it. I want serious discussion.)
As quite a few have said, yes.

Once you play games all the time at over 40FPS, anything less is insufficient, and quite frankly bothersome.

I wish I could be happy with 25FPS. Then I wouldn't need better video cards all the time, haha.


For example:

My girlfriend, who plays a lot of GuildWars, notices the difference between my x800Pro, which got around 30-40FPS in the game, and my 7950, which is obviously well over 100FPS. It plays smoother, looks better, etc.

So to that end, it is even more important to have a better video card for image quality alone. Granted, games CAN be played at 25FPS, but I wouldn't want to.



supastar1568 said:
Isn't Madden 06 for the Xbox 360 at 30 fps?

I could be wrong

Depends on whether you are playing at 720p or not; progressive scan is 60FPS.
 
I can see the difference between 30 and 60; I consider 30 the minimum playable frame rate. At 60+ I can just feel the difference.
 
Let's mention that the FPS you need can depend on the game.

A slower RTS game like Age of Empires needs a lot fewer FPS to play smoothly.

For a fast-paced first-person shooter, you would want higher.
 
kc7hif said:
The human eye can detect up to 75 fps, so once you hit that mark, shoot for quality and you'll have the best gaming experience you can get.
Actually there's been a lot of controversy around that lately: "how many frames per second CAN the human eye see?" I personally always believed it was around 24 frames per second or something, but maybe that's just to see motion... I've come to the conclusion that our eyes can see 24+, 100+, and so forth, but that it's our brains that can't capture and understand every frame we see, and merely discard the rest. The point of 60fps is a consequence of the limits of our hardware... just food for thought.
 
The 60fps cap especially for multiplayer online gaming sucks! That would be great for Quake 1.. Why limit the whole thing to 60fps when it could have been 120?

I guess we could blame it on current LCD technology; so many people have 60Hz monitors that John Carmack decided to cap it at 60fps in order to ensure more accurate, matching online play (so that one person jumps in exactly the same frame that is supposed to be sent to other computers)...

Heck, it should be increased to 120 fps right away, regardless! What do we have 500Hz or 1000Hz laser mice for? Why did we make a fuss over increasing the PS/2 rate of mice a few years ago?

It would be nice to see Jonathan Fatal1ty Wendel reply to this post with his own 2 cents, eh? Of course, he would appreciate a 120fps cap... greatly!
 
No you don't. People are just whiny b1tches who complain because their $250-600 video card is not getting them 60 fps in every game. I run every game at 20-25 fps; do you see me complaining and saying "Does anyone know how to unlock the 100 fps lock in BF2?" No you don't. People spend $1k-5k on PCs then complain about everything. And that's why America is so fucked up, little Johnny ;) !
 
xtasyindecay said:
No, America is fucked up because people like you and the OP try to attack things they can't have. People make these "justification" threads for lack of personal satisfaction with their unbalanced or crap-ass system. Sorry you are a poor piece of shit. Maybe Bush will hand out debit cards for poor fools to upgrade from a 6150 on-board and suddenly 60 fps will seem nice.

Stupid thread.
Excuse me?

I am not trying to attack anything, whatsoever. Hell, I even posted that I will be getting an x850 in the near future. I really do not appreciate you calling me a 'poor piece of shit' because I can't afford a decent video card.

All I wanted was some opinions...
 
It's not that I don't want a top of the line GPU, it's just that my sense of logic and reasoning tells me I can't tell the difference between 150fps in CSS and 102fps in CSS, while at the same time, my wallet could tell the difference between a $200 expense and a $600 expense.

The card I'm getting (x1800GTO2) is around $200, has 512MB of GDDR3, and kicks butt like no other ~$200 card.
Yes I'm poor.
Yes I'm cheap.
No, that doesn't mean I can be pwned.

(Well, not necessarily. ;))
 
Betatester said:
Actually there's been a lot of controversy around that lately: "how many frames per second CAN the human eye see?" I personally always believed it was around 24 frames per second or something, but maybe that's just to see motion... I've come to the conclusion that our eyes can see 24+, 100+, and so forth, but that it's our brains that can't capture and understand every frame we see, and merely discard the rest. The point of 60fps is a consequence of the limits of our hardware... just food for thought.
Thank goodness. Finally someone who knows what the fuck they are talking about. The 60 fps thing is bullshit.
 
As long as it NEVER drops below 60FPS, I don't mind not having any more than 61FPS.
 
Now on the other hand... at some point all those fps are wasted.

I know a guy on another forum who plays CS:S, makes a good living, likes having a good computer. His current rig is an X2 4800+ OC'd, dual 7900GTX 512 cards in SLI. Monster benchmark scores.

He plays CS:S at 1024x768.

Wasted.
 
I only accept 'dips' to around 25-30 FPS, but I like a constant average of 55-75, so a 6150 won't do it for me, especially now that I have a new LCD. When you use a CRT you can scale down to 1024x768 and use a cheaper card. I would still pick up a 6600 or X1300 Pro or something similar even if I were a cheap gamer.
 
Remember Halo? It's got an option to cap FPS at 30. When it came out it was a tough game for most cards. I had a Ti 4200 that could just about keep up with it. The 30fps cap frankly made the game seem smoother for me.
 
We had a MASSIVE thread here last year about 60fps and what the human eye can and cannot detect, which ended in a "locked status" after a bit-o-trolling evolved after page 10 or so.

There was no substantial proof either for or against that hypothesis. For me personally, I can absolutely see a difference in gameplay from 60 to 75fps. It gets significantly more difficult at 75+ fps. The biggest thing that I think we all strive for is consistency. If you consistently have about 50-60 fps, then you adapt your gameplay to that. Significant changes will require you to alter/adjust the way you play. That's what this is about... consistency.

Yes, if you average 100+fps then the game is like "butta baby". However, if that 100fps drops to 20-30 in high-intensity gunfights, smoke, or, like in BF2, changes in terrain from water to ground in a helicopter (or jet), you don't have time to adapt, and your ability to game gets flushed down the toilet...

I have yet to see anything, and I do mean anything, that proves/disproves "the human eye can only detect 60fps". No, the TV does not count; it's a different format/wavelength. Please don't ask me for any further details, because my knowledge after that is minimal. I'm willing to assume that there is a difference in rendering on a CRT vs. an LCD, and that's why somebody could say that "30fps is smooth", because I am lost at that point :) . For me, I can't stand any current LCDs because they really seem to smear/tear in high-intensity, high-movement situations. Hopefully this will change as technology evolves.

Not sure if this helps.. but it is my .03 worth...

:)
 
Even if we completely throw away the argument of how many fps the eye can see, there are other benefits to having 60+, 70+, or even 100+.

The best example of this is CS:S, Counter-Strike: Source. Because Source is weak in the hitbox/registration category, you need every FPS you can get. A default CS:S server runs at a 33 tickrate. That means the server is really only simulating the game 33 times per second. Most people can run their game at over 33 fps, so the server is the bottleneck. Hence you're going to get some shots that aren't hitting where they are supposed to be hitting. That's why you upgrade your server from 33 to 66 or the ideal 100 tickrate. If you have a server running at 100 tickrate, things are looking good, except for when you get people who can't run their game at 100 frames per second. Then the whole problem starts again. So besides providing a superficial benefit in prettiness, for some games, at least CS:S, more fps is necessary for accurate gameplay.
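
Just to put rough numbers on that mismatch, here's a toy Python sketch (my own back-of-the-envelope illustration, not anything from the Source engine) comparing how many frames a client draws for every tick the server actually simulates:

```python
# Toy comparison of server tickrate vs. client frame rate (illustration only,
# not Source engine code). The server only simulates the world `tickrate`
# times per second no matter how many frames the client renders, and a client
# running below the tickrate samples input less often than the server can use.

def frames_per_tick(tickrate_hz: float, client_fps: float) -> float:
    """How many client frames are drawn for each simulated server tick."""
    return client_fps / tickrate_hz

for tickrate in (33, 66, 100):
    for fps in (25, 60, 100):
        ratio = frames_per_tick(tickrate, fps)
        print(f"{tickrate:>3}-tick server, {fps:>3} fps client: "
              f"{ratio:.2f} frames per tick")
# ratio > 1: extra frames just re-show or interpolate the same server state
# ratio < 1: the client misses whole ticks' worth of updates, which is where
#            the "shots not landing where they should" feeling comes from
```

Either way, whichever side updates slower sets the ceiling, which is why a 100-tick server only really pays off if your fps keeps up with it.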
 
You can get the same CPU, yet some overclock better than others; similarly, some people have better eyes, ears, etc. Get the point?

They say the -->**average**<-- person can see up to 60fps... but it really ranges from person to person. They don't mean average as in "normal people can see only 60fps"... they mean that if you take a bunch of people and rate their eyes, it averages out to about 60fps, or 60Hz, or roughly a 16.7ms frame time.

Now that I've said that, let's go into folding and aliasing.
Taken from Wikipedia:
The sun moves east to west in the sky, with 24 hours between sunrises. If one were to take a picture of the sky every 23 hours, the sun would appear to move west to east, with 24 × 23 = 552 hours between sunrises.
Folding is what happens when you sample too slowly and fast motion folds back into slower (or even reversed) apparent motion; it's really just another name for aliasing.

The Nyquist rate probably doesn't tie into this subject too much, but it kind of explains why people try to get as many fps, as much Hz, and as fast a response time as possible... it all pretty much translates to people not wanting to miss any action if at all possible. You want to catch every bit of action that happens over the network and be able to respond accordingly.
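
For anyone who wants to see the aliasing effect in numbers, here's a small Python sketch (my own toy example in the spirit of the Wikipedia sun illustration, not taken from that article) showing how sampling a fast rotation too slowly makes it appear to crawl, or even run backwards:

```python
import math

# Sample a 24 Hz rotation at various rates and compute the apparent motion
# per sample. Below the Nyquist rate (48 Hz here) the motion aliases: it can
# look much slower than it really is, or appear to spin the wrong way.

def apparent_step_degrees(rotation_hz: float, sample_hz: float) -> float:
    """Apparent angular step per sample, wrapped into the (-180, 180] range."""
    true_step = 360.0 * rotation_hz / sample_hz   # real degrees per sample
    return math.remainder(true_step, 360.0)       # what the observer infers

rotation_hz = 24.0
for sample_hz in (25.0, 30.0, 50.0, 100.0):
    step = apparent_step_degrees(rotation_hz, sample_hz)
    direction = "forwards" if step >= 0 else "backwards"
    print(f"sampled at {sample_hz:5.1f} Hz: looks like {abs(step):5.1f} deg/frame, {direction}")
# Sampling a 24 Hz spin at 25 Hz looks like a slow backwards crawl -- the same
# trick as photographing the sun every 23 hours.
```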

So... in the end... if you're in doubt, try other machines with different setups to judge whether it's better and worth spending money on... because even if they have a 20'' LCD, an ATI X1900XT and some super computer, their LCD might have some ultra-high response time... and that will ultimately screw everything else up.


Oh, and to answer your original question: some need 60+ fps, some don't :D It all depends on their genetic make-up... do YOU need 60+ fps? I'd say... better safe than sorry.
 
If you play a game with hitscan weapons then you really want >= 60fps, in my opinion. Trying to play deathmatch-style games in UT2004 is really tough with low fps. It's just too jerky to be able to place your mouse pointer accurately on a few-pixel-high figure moving in the distance. You can get away with lower fps in game types without true hitscan weapons, e.g. BF2 sniping would be tough against moving targets with low fps, but using the game's many machine guns, which are just point and spray, doesn't matter half as much (you can't be that accurate with them even if you wanted to).
 
I really can't play any shooter unless it's at 800x600 or 640x480 with around 100+ FPS. It feels a lot smoother to me than 60 and I can definitely tell a difference between the two.

If anyone remembers Quake 2 or playing Quake 3, there were "magic" FPS numbers that would affect gameplay. The FPS you were getting would impact how high your character could jump.
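
For what it's worth, here's a simplified Python sketch of why frame rate can change jump height at all (my own toy integrator using Quake-style numbers, not the actual Quake 2/3 code; the real Quake 3 quirk came from frame times being rounded to whole milliseconds, which made specific caps like 125 and 333 fps come out ahead, and this sketch doesn't reproduce that part):

```python
# Toy model: integrate a jump one frame at a time with timestep 1/fps.
# Because position is advanced with each frame's velocity before gravity is
# applied, the apex depends on the timestep -- i.e. on the frame rate.

def jump_apex(fps: float, jump_speed: float = 270.0, gravity: float = 800.0) -> float:
    """Simulate a jump in 1/fps steps and return the highest point reached."""
    dt = 1.0 / fps
    height, velocity, apex = 0.0, jump_speed, 0.0
    while velocity > 0.0 or height > 0.0:
        height += velocity * dt     # move with this frame's velocity...
        velocity -= gravity * dt    # ...then apply gravity for the next frame
        apex = max(apex, height)
    return apex

analytic = 270.0 ** 2 / (2 * 800.0)   # ~45.56 units with perfect integration
for fps in (30, 60, 125, 333):
    print(f"{fps:>3} fps -> apex = {jump_apex(fps):.2f} units (ideal {analytic:.2f})")
# Every frame rate overshoots the ideal apex by a slightly different amount,
# which is the general reason jump height ends up tied to FPS in such engines.
```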
 
Aelfgeft said:
So yeah. Frame rates PAST 60 are purely performance since most human eyes can't detect the difference. However, up to and including 60 is the sweet spot.

I seem to be one of the only people in the world who can tell the difference between 60, 75, 85, 90 and 100Hz on a CRT. Does that make me weird? lol
 
Nah. I can see a difference up to around 90Hz on my CRT and in fps, but it doesn't mean that if a single frame flashed somewhere in there that I'd be able to read it. The difference becomes a lot more obvious when you're waving around the mouse and making quick side to side movements. Lower fps gives jerkier movements and they're noticeable even above 60fps. About 90fps is when it becomes perfectly smooth to me.
 
EnFoRcEr!! said:
I seem to be one of the only people in the world who can tell the difference between 60, 75, 85, 90 and 100Hz on a CRT. Does that make me weird? lol

Easier to pick up flashing lights than it is to pick up dark color changes in, say, a dark alien world or vent. ;)
 
EnFoRcEr!! said:
I seem to be one of the only people in the world who can tell the difference between 60, 75, 85, 90 and 100Hz on a CRT. Does that make me weird? lol

Me too.. I can tell the differences without knowing the refresh rates, especially on my 24" Sony GDM-FW900 CRT.

Well, it can be somewhat difficult to tell the difference between 90 and 100 Hz since the screen flickering is almost *completely* gone at 90 Hz already. The screen tearing with Vsync disabled is noticeably reduced at 100 Hz, though.

Also, I could say that the gameplay is smoother at 100 Hz than at 90 Hz, that is, if the games themselves are not capped at, say, 60 or 85 fps. Doom 3 and Quake 4 are capped at *only* 60 fps, while online UT2004 is capped at a nicer 85 fps.
 
Ranari said:
Do we really need cars that can drive faster than 70mph?

Yes


Aelfgeft said:
Do we really need speakers that can go louder than 80 decibels?

Hell Yes

And by the way, Halo 2 on the Xbox proved we don't need more than 30FPS. What really matters is the minimum. Halo 2 never really dipped below 30 so it was fine, but higher numbers aren't really needed.
 
The reason the arguments about "how many FPS can the eye see" fail so often is because no one ever explains what they mean properly.

You have to take a lot of factors into consideration: how long is each frame displayed, for instance? Is it displayed until the next one is ready and then instantly swapped with no perceivable delay, or does it slowly fade from the screen while the next frame is drawn before it has totally faded? Does the image you're looking at include motion blur? The list goes on and on and on.

Examples like TVs, light bulbs and monitors all need very different criteria to judge by. Monitors have a much higher refresh rate than a TV, but TV mostly displays film, which contains motion blur. Monitors typically display rendered scenes without motion blur (although we may have this soon).

You have to consider the input device: are you controlling the movement of the images on the screen? In that case you have the latency from input to screen movement to take into consideration. It might look smooth from a non-interactive point of view, but as soon as you need to use that screen to carefully position or orient an object, that same frame rate might not be enough.

Even different types of monitors have different ways of displaying. CRTs typically need at least 75Hz to produce a steady enough image that you don't get a headache after 30 minutes, but you'll only find 60Hz on an LCD monitor, and that's all it really needs.

There are even more complex issues, like what we mean when we claim we can perceive the difference between frames: are we talking about being able to see flicker from a screen, are we looking for totally smooth motion in the image, or maybe something else? You can have a smooth-looking TV output on a PC monitor running at 25 FPS and have it look fine, but still be able to see flickering if the refresh rate is low (60Hz).

If you're going to make outrageous claims about what is factually perceivable frame-rate-wise, then you're going to have to include all of the above information and more. If you're not clear and accurate it turns into a flame war; these threads are notorious for it because people blurt out one-liners like "your eye can only see 25 FPS max so 60FPS is not worth it".

hope that helps :D
 
I like to have at least 80FPS most of the time. The more FPS, the better the shot registration.
 
NickS said:
I like to have at least 80FPS most of the time. The more FPS, the better the shot registration.

I prefer the high FPS because it helps me control my mouse. In fast shooters like UT2004, where twitch aiming is the name of the game, you need to be able to turn large amounts and stop dead on your enemy. This is easier with a higher frame rate, since you're seeing a more accurate picture of what your mouse is doing; fewer frames = more guesswork on your part to place your crosshair.
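
To put a number on that guesswork, here's a rough Python sketch (my own made-up flick speed, not measured from any game) of how big the angular jump between rendered frames is during a fast turn at different frame rates:

```python
# During a quick flick, how many degrees does the view move between frames?
# Bigger steps per frame mean the crosshair effectively teleports further each
# frame, so stopping it dead on a target involves more guesswork.

def degrees_per_frame(turn_degrees: float, turn_seconds: float, fps: float) -> float:
    """Angular step the player sees per rendered frame during the turn."""
    return turn_degrees / (fps * turn_seconds)

turn_degrees, turn_seconds = 180.0, 0.2   # a 180-degree flick done in 200 ms
for fps in (25, 60, 100, 150):
    step = degrees_per_frame(turn_degrees, turn_seconds, fps)
    print(f"{fps:>3} fps: the view jumps ~{step:4.1f} degrees per frame")
# 25 fps -> ~36 degree jumps, 150 fps -> ~6 degree jumps: the higher rate gives
# a much finer-grained picture of where the mouse has actually put the view.
```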
 
Exactly! Strangely, though, with my Nvidia video cards it seems much more laggy at, say, 45 or even 55 fps than on my Radeon cards. With Nvidia cards, I have to keep it at 75 fps or higher in UT2004 to avoid any kind of lag. 85 minimum is ideal.
 
I play CS 1.6 on my Pentium III at an average of about 22 FPS, and I can get top scores at that... so there.
 
Here's a little self-test for those who claim the human eye can't see over 25fps:
1. Set your max fps in game X to 25. Do a fast 360º turn.
2. Set your max fps in game X to 50. Do a fast 360º turn.
3. Set your max fps in game X to 75. Do a fast 360º turn.
4. Set your max fps in game X to 100. Do a fast 360º turn.
If you can't see any difference, contact your local optician right away. (A rough sketch of what the fps cap is actually doing is below.)
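
And just so it's clear what that max-fps setting does, here's a minimal frame limiter in Python (a generic illustration, not any particular engine's code):

```python
import time

# A capped render loop: after "rendering" each frame, sleep until the next
# frame slot so the loop never runs faster than the target rate. Lowering the
# cap means each 360-degree turn is shown in fewer, coarser steps.

def run_capped(target_fps: float, frames: int = 10) -> None:
    frame_time = 1.0 / target_fps
    next_deadline = time.perf_counter()
    for _ in range(frames):
        # ... render one frame of the spin here ...
        next_deadline += frame_time
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)   # wait out the remainder of the frame slot
    print(f"capped at {target_fps:>3} fps -> one frame every {frame_time * 1000:.1f} ms")

for cap in (25, 50, 75, 100):
    run_capped(cap)
```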

I can play third-person games like Gothic 3 at 25fps, but in first-person games I need over 50fps to feel that the game is running smoothly.
 
My eyes are complete garbage. Even in close range, I can sometimes have a very hard time discerning a lot of detail, unless I study it for a long time. This has happened to me with age. My eyes have always been bad, but my glasses have always been able to "fix" it for me. So, no, I can't tell the difference between 40 and 60 and 100 fps, but I can very easily tell when it goes under about 34 fps in some games.

Because of the deterioration of my eyes, I have mostly moved away from twitch FPS games like UT04 and Q3, and now I play slower-paced games like Civ IV and Oblivion. I do still enjoy the faster-paced games like UT04 and BF2, but I don't do as well as I used to. In BF2 and DesertCombat, I tend to take up support roles, so that I'm not as involved in fighting.

Just my thoughts.

-Mark
 
I can deal with frame rates of 30 or more, but what I can't stand is FPS jumping all over the place. I can handle 30 or 45FPS but I want it to be as consistent as possible.

The main reason to have 100+ FPS is to ensure it never drops below 30, or better yet, below a detectable level of performance loss. This way the game feels the same all the time.
 
The human eye can't see past 20. Movies go at 24... and those 4fps allow for special effects.
Although in computer gaming, in theory, if you get 20fps... and the other player gets 80fps... that player should in theory be able to use those extra 60fps to help his aim on you by having a smoother response.

I have to say... the eye might not notice it... but we feel it when we use our mouse in gaming. It seems like the other player has 60fps more accuracy on you.
If you guys follow my logic.
 
factory81 said:
The human eye can't see past 20. Movies go at 24... and those 4fps allow for special effects.
Although in computer gaming, in theory, if you get 20fps... and the other player gets 80fps... that player should in theory be able to use those extra 60fps to help his aim on you by having a smoother response.

I have to say... the eye might not notice it... but we feel it when we use our mouse in gaming. It seems like the other player has 60fps more accuracy on you.
If you guys follow my logic.

The human eye can see the difference between 20 and 30FPS. 20FPS isn't smooth and 30 is.

Think the human eye can't see more? Well tell me how it is that when your FPS drops from 147FPS to 37FPS, it can be felt? How exactly do you perceive this change? It is because you perceive the difference between what your hands and eyes are doing and seeing. There is no tactile feedback from the mouse, so it must be felt as a discrepancy between your eyes and your hands. Eyes being the key word here.

That's my theory anyway. I've never seen any hard evidence to support that humans can't see the difference over 20FPS. I simply do not believe that.
 
factory81 said:
The human eye can't see past 20. Movies go at 24... and those 4fps allow for special effects.
Although in computer gaming, in theory, if you get 20fps... and the other player gets 80fps... that player should in theory be able to use those extra 60fps to help his aim on you by having a smoother response.

I have to say... the eye might not notice it... but we feel it when we use our mouse in gaming. It seems like the other player has 60fps more accuracy on you.
If you guys follow my logic.

You're all wrong.

Man, how naive... excuse my language... it's just hard for me to tolerate this kind of stupidity.

The human eye can see past 40, and that's something I'm willing to bet my house on.

I can see past 60 fps myself. Movies are usually made in a special way so that we do not notice the choppiness. It's as if you were only walking very slowly in a straight line in a first-person 3D game: things move slowly around you, so you do not see much of the choppiness. That's how most movies are made, with slow camera movement.

Maybe you are basing your opinion on the fact that when we quickly wave our hands in front of a bright CRT screen in a dark room, we notice the choppiness of our hands.

Why is that so?

That is because of the screen's refresh rate itself (CRTs only). It does not apply to LCD screens: when we wave our hands in front of an LCD, the hand moves smoothly in front of our eyes.
 
Well, I dared to repeat what I learned in biology class or something.

I myself like to think I can notice the difference between 20 and 80fps... but I will also say that I notice it mainly in mouse movement and feedback between me and the game. Visually, the only thing I notice is that it runs smooth until it drops into the teens.

http://www.100fps.com/how_many_frames_can_humans_see.htm

Just Google the term: human eye 24fps

And you get a plethora of results.
I won't say that it is cut and dried that the human eye = 24fps. It is just what I was taught by a school system that was perfected by George W. Bush. God bless No Child Left Behind
:rolleyes:


That one website, 100fps.com, says this though, which is very interesting:
If you could see your moving hand very clear and crisp, then your eye needed to make more snapshots of it to make it look fluid. If you had a movie with 50 very sharp and crisp images per second, your eye would make out lots of details from time to time and you had the feeling, that the movie is stuttering.



I think that the whole frames-per-second issue is really a "technology by technology" thing, because when you learn about authoring DVDs and so on, you learn that each interlaced frame is actually two fields, so 30fps turns into 60 fields per second, plus deinterlacing, interlacing... blah blah blah.
I will definitely say that the higher the FPS for games the better, and the more stable the FPS the better.
The 24fps theory is turning into an old wives' tale.
Too many factors....
Bottom line that I will fight for....
You play better if you have a smooth-running game, because your mouse has that many more frames per second to move to the target accurately.
30-40fps is sufficient in my opinion and works well for me... anything more and I feel it's overkill.
 