Is this true? Can the human eye only see 66 fps?

WTF? As far as I'm concerned it's always been 24 fps, unless you are some kind of superhuman :eek:
 
The answer varies depending on hundreds of different factors; you'd have to be vastly more specific with your question to expect a sensible answer, or one that didn't end in a large flame war.

I'll give you a hint to get started, let us know:

What display type
Does the refresh of the display differ from the frame rate of the media
Are we talking flicker/smoothness
Are we talking the peripheral vision or direct vision
Are we talking about computer generated images or film
Does the media contain motion blur

The list really does go on and on but these are the basics.
 
Turn FRAPS on, go into a game, and if you see the little yellow numbers go above 66, you're seeing that many frames per second. If you couldn't see above 66, then every time it got above it, you wouldn't see the two little numbers. Right?

Anyway, I really don't think it matters too much; you can't tell the difference much once it gets past 60 anyway.
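For what it's worth, a counter like FRAPS just measures how many frames the game presents each second, not how many your eye picks up. A minimal sketch of that kind of counter in Python (the "render" and its timings are made up purely for illustration):

[code]
# Rough sketch of what an FPS overlay measures: frames presented per second.
# The "render" here is a stand-in sleep; numbers are illustrative only.
import time

def fake_render_loop(run_seconds: float = 3.0) -> None:
    frames = 0
    window_start = time.perf_counter()
    end_time = window_start + run_seconds
    while time.perf_counter() < end_time:
        time.sleep(0.005)                      # pretend to render one frame (~5 ms)
        frames += 1
        now = time.perf_counter()
        if now - window_start >= 1.0:          # report once per second
            print(f"{frames / (now - window_start):.0f} fps")
            frames = 0
            window_start = now

fake_render_loop()
[/code]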
 
I ran FRAPS on my occipital lobe and I was getting between 19 and 33 FPS depending on how many objects I was looking at. I tried overclocking but it didn't make a noticeable difference.
 
Turn FRAPS on, go into a game, and if you see the little yellow numbers go above 66, you're seeing that many frames per second. If you couldn't see above 66, then every time it got above it, you wouldn't see the two little numbers. Right?

Anyway, I really don't think it matters too much; you can't tell the difference much once it gets past 60 anyway.

No, that's wrong. You would still see the image, but your eyes would only be registering 66 FPS and the rest would not make a difference, as the eye wouldn't be sensitive enough to notice them. But as stated above, it's a complex question, not something that a simple number can be slapped onto.
 
Another topic on this?

Eyes do not see in "fps".
It's the fluctuation in fps that's detectable.
 
Oh come on guys, the question certainly can be answered.

If you're talking movie media, then yes you'd notice a difference in a movie filmed at 24 fps and one filmed at 60.

Remember The Bourne Supremacy? The annoying fight scenes that were shaky, blurry, and hard to make anything out of? That's what a movie at <24 fps would feel like, while the rest would be 60. (Just so that we're clear, 24 fps is the standard frame rate for theatrical film.)

If you're talking video games, then it again depends on the type of game. You're not going to see crap for difference between Tetris at 10 fps and Tetris at 200 fps. There's not much happening on the screen and your eyes easily lock on within those 10 fps.

If you're talking Counter-Strike: Source, with potentially 50 people on the screen, cabinets flying, bullet casings kicking out of your weapon, and a partridge in a pear tree, then 10 fps will not be enough and you'll need those 60 fps for it to look smooth.

I think a lot of you guys are referring to the tests that they did on fighter pilots.

Sadly I gotta go because my kid is beating the hell out of my keyboard. More after she goes to sleep.
 
I ran FRAPS on my occipital lobe and I was getting between 19 and 33 FPS depending on how many objects I was looking at. I tried overclocking but it didn't make a noticeable difference.

Scientific studies say it's between 17 and 30 FPS... but that's beside the point. That's 17-30 FPS as a constant flow; games being rendered are based on an average (with highs and lows), therefore something like 60+ FPS is needed to ensure the lows don't dip below 30 FPS... at least that's the idea.
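To illustrate the average-versus-lows point: an average FPS figure can look healthy while individual frames still dip well below 30. A quick back-of-envelope sketch in Python (the frame times are made-up numbers, purely for illustration):

[code]
# Why an *average* FPS figure can hide ugly dips.  Frame times are invented.
frame_times_ms = [10, 12, 11, 45, 50, 12, 11, 10, 40, 9]   # per-frame render times

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000.0 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps, worst frame: {worst_fps:.0f} fps")
# prints: average: 48 fps, worst frame: 20 fps
[/code]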
 
Whatever framerate it is, it is certainly below 66.

But that is irrelevant. The reason FPS players need high frame rates is so you can see the other player's direction and aim when you're aiming. In CS 1.6 my kill-to-death ratio drops by roughly one kill for every 10-15 fps lost, down to around 40 fps, where it just bombs down to about 1:1. I don't see how people can play shooters with low fps; I NEED my 100+ fps for twitch-aiming, especially if I'm sniping.

<awpwhore. But my ratio is usually 15+:1 at LANs, always better online (pubbing, I stopped clanning because it got lame). So anyone who doesn't like snipers can kiss my ass :p
 
wow...

Yeah, I'm pretty sure we don't see in frames... more of a constant input.

/closed and noted that there are a bunch of idiots lurking here
 
The answer varies depending on hundreds of different factors; you'd have to be vastly more specific with your question to expect a sensible answer, or one that didn't end in a large flame war.

I'll give you a hint to get started, let us know:

What display type
Does the refresh of the display differ from the frame rate of the media
Are we talking flicker/smoothness
Are we talking the peripheral vision or direct vision
Are we talking about computer generated images or film
Does the media contain motion blur

The list really does go on and on but these are the basics.

I'm talking during gaming, on an LCD display.
If we can only see 24 fps then why do people complain when they have low fps like 60?
 
Oh come on guys, the question certainly can be answered.

If you're talking movie media, then yes you'd notice a difference in a movie filmed at 24 fps and one filmed at 60.

Remember The Bourne Supremacy? The annoying fight scenes that were shaky, blurry, and hard to make anything out of? That's what a movie at <24 fps would feel like, while the rest would be 60. (Just so that we're clear, 24 fps is the standard frame rate for theatrical film.)

If you're talking video games, then it again depends on the type of game. You're not going to see crap for difference between Tetris at 10 fps and Tetris at 200 fps. There's not much happening on the screen and your eyes easily lock on within those 10 fps.

If you're talking Counter-Strike: Source, with potentially 50 people on the screen, cabinets flying, bullet casings kicking out of your weapon, and a partridge in a pear tree, then 10 fps will not be enough and you'll need those 60 fps for it to look smooth.

I think a lot of you guys are referring to the tests that they did on fighter pilots.

Sadly I gotta go because my kid is beating the hell out of my keyboard. More after she goes to sleep.

Even this is far too simplified.

With TV media you get motion blur naturally, captured by a camera with a finite shutter speed. It means the light of whatever is in motion is captured within one frame; essentially each frame is more than just a snapshot of an instant, it's a length of time over which something might move.

This causes motion blur on the image, and when played back it helps the image appear smoother and eliminates the jerkiness of only-24fps playback. I doubt you'd see much difference between 24 and 60 fps on film being played back on a normal TV; the added motion blur fills in the blanks for such a low frame rate.

On the flip side of the coin, 24 is much too low a frame rate for a video game without motion blur, and we can see the difference all the way up to 60 fps.

There are many unspecified factors here; the list is truly massive. Covering the basics I asked for in a previous post would guide us in the right direction. To even get close to an "answer" we'd need a whole lot more information and an incredibly more detailed question.
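To put the shutter-integration idea above into something concrete: one blurred frame is roughly the average of many sharp sub-frames captured while the shutter is open. A toy sketch of that, assuming NumPy is available and using a 1-D "image" of a moving dot, purely for illustration:

[code]
# Toy model of camera motion blur: a frame is the average of sub-frames
# captured while the shutter is open.  Arbitrary sizes, illustration only.
import numpy as np

width, subframes = 12, 6
stack = np.zeros((subframes, width))
for i in range(subframes):
    stack[i, 2 + i] = 1.0              # a bright dot moving one pixel per sub-frame

sharp_frame = stack[0]                 # instantaneous snapshot: one crisp pixel
blurred_frame = stack.mean(axis=0)     # finite shutter: energy smeared along the path

print(np.round(sharp_frame, 2))
print(np.round(blurred_frame, 2))
[/code]

That smear is exactly what fills in the gaps at 24 fps playback; a game rendering crisp frames gets no such help.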
 
wow...

Yeah, I'm pretty sure we don't see in frames... more of a constant input.

/closed and noted that there are a bunch of idiots lurking here

Yes, it's a constant input...

I'm talking during gaming, on an LCD display.
If we can only see 24 fps then why do people complain when they have low fps like 60?

24 is barely playable. Usually 30 and above is good enough to be enjoyable, but I think the ideal number is whatever your monitor's refresh rate is, correct? Anything higher (e.g. 100, 150) is just a pissing-contest number.
 
I'm talking during gaming, on an LCD display.
If we can only see 24 fps then why do people complain when they have low fps like 60?

A little better.

The simple answer is that on an LCD display we can see many more changes per second than 24, much higher in fact, and due to the nature of LCD displays we do not have to worry about flicker too much.

During gaming we notice lower frame rates more easily because lower frame rates not only cause a visual stutter, we also "feel" that delay when we give input with the mouse/keyboard/joystick or whatever, and we feel that our inputs are only being read at long intervals.

Other things we need to consider: if we have a CRT monitor, the refresh rate (which is totally different from frame rate) can be a problem if it's low. We need to consider that some games are starting to implement motion blur, which makes things appear more fluid than they really are; we will see less visual stutter but still feel our inputs through the mouse/keyboard being less smooth.
Some games use a fixed time scale, so low frame rates have little impact on the smoothness of the virtual world and the things in it; other games have a timeline that's tied to the frame rate, so a slow frame rate means that things in game can actually slow down and appear jerky (example: Doom 3 was capped at 60 fps because that's what the engine ran at; your display could exceed that but the additional frames were just duplicates of previous ones). A minimal fixed-timestep sketch follows this post.
Do we have tearing or not? This is a massive factor in how games appear on monitors.
How stable is our frame rate? As mentioned before, we pick up changes in frame rate easily. If our average frame rate is 24 but our minimum is really 5 and max is 50 then it's going to look, and play, really awful; if it's a steady 24 all the time it appears smoother to us.
Does our game change much? If the game scenery changes quickly and regularly then we may need a higher frame rate for it to appear smooth. Maybe we're playing a flight sim and only make tiny adjustments to levelling and turning, which means the horizon barely changes, in which case we can deal with a lower FPS and maintain apparently smooth motion. Whereas in a game like UT2004 Instagib, where you regularly do 180-degree spins to face opponents behind you, you need a much higher FPS not only to look smooth but also to keep the input device (mouse) reading smoothly.

There really are many, many factors to all of this; very few people really comprehend exactly how complex it all is.
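On the fixed-time-scale point above, this is roughly what a fixed-timestep game loop looks like. A minimal Python sketch; the function names are placeholders, not any particular engine's API:

[code]
# Minimal fixed-timestep loop: the simulation advances in constant steps, so a
# slow or uneven frame rate changes how often we draw, not how fast the world moves.
import time

SIM_STEP = 1.0 / 60.0            # the world always advances in 1/60 s increments

def update_world(dt: float) -> None:
    pass                         # placeholder physics / game logic

def render_frame() -> None:
    pass                         # placeholder draw call

def game_loop(run_seconds: float = 2.0) -> None:
    accumulator = 0.0
    previous = time.perf_counter()
    end_time = previous + run_seconds
    while time.perf_counter() < end_time:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= SIM_STEP:   # however long the frame took,
            update_world(SIM_STEP)       # step the simulation in fixed chunks
            accumulator -= SIM_STEP
        render_frame()                   # drawing happens whenever it happens

game_loop()
[/code]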
 
Yes, it's a constant input...



24 is barely playable. Usually 30 and above is good enough to be enjoyable, but I think the ideal number is whatever your monitor's refresh rate is, correct? Anything higher (e.g. 100, 150) is just a pissing-contest number.

Heh, I get like 200+ sometimes in CSS.
 
Yes, it's a constant input...

Maybe, but I'm fairly sure your brain works in fairly discrete steps as your neurons fire. It's not streamed data but discrete values arriving very quickly; but then so are light and photons, so there we go: look small enough and things are almost always discrete, energy, light etc.

I digress.

24 is barely playable. Usually 30 and above is good enough to be enjoyable, but I think the ideal number is whatever your monitor's refresh rate is, correct? Anything higher (e.g. 100, 150) is just a pissing-contest number.

Entirely depends on the game, and the monitor.

For games, 24 FPS might be fine for a non-stop flight from London to New York in a commercial airliner, because we're not going to see things change a lot, and if they do it won't be fast (unless we crash >.<). If we're playing UT2004 we might be better off with 60 fps, although most people playing competitively often expect MORE than 60 fps; most of the UT2004 pro gamers stick to 85 fps, which is the cap on the engine when online.

My left monitor is a 22" LCD flatscreen (wide 16:10) and runs at 60 Hz.
My right monitor is a 19" CRT which runs at a refresh rate of about 150 Hz even at 1600x1200, and even higher at lower resolutions.

When our frame rate is in sync with our monitor's refresh rate we eliminate tearing; this can be done by turning vsync on, and it results in a more desirable output.
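A back-of-envelope way to see what vsync does (and what it costs): with plain double-buffered vsync and no triple buffering, a frame that misses a refresh waits for the next one, so the effective frame rate snaps to the refresh rate divided by a whole number. A rough Python sketch, illustrative numbers only:

[code]
# With simple double-buffered vsync: effective fps = refresh / (whole refreshes waited).
import math

def vsynced_fps(render_time_ms: float, refresh_hz: float = 60.0) -> float:
    refresh_interval_ms = 1000.0 / refresh_hz
    intervals_waited = math.ceil(render_time_ms / refresh_interval_ms)
    return refresh_hz / intervals_waited

for t in (10, 17, 20, 34):
    print(f"{t} ms per frame -> {vsynced_fps(t):.0f} fps on a 60 Hz display")
# 10 ms -> 60 fps, 17 ms -> 30 fps, 20 ms -> 30 fps, 34 ms -> 20 fps
[/code]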
 
A good read, if a little long:
http://amo.net/NT/02-21-01FPS.html
A Movie theatre film running at 24 FPS (Frames Per Second) has an explanation. A Movie theatre uses a projector and is projected on a large screen, thus each frame is shown on the screen all at once. Because Human Eyes are capable of implementing motion blur, and since the frames of a movie are being drawn all at once, motion blur is implemented in such few frames, which results in a lifelike perceptual picture.
The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment a picture of an aircraft was flashed on a screen in a dark room at 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220 of a second, but the ability to interpret higher FPS.
 
The fps you can detect often differs: some games require 60+ fps (e.g. FEAR) so you don't get a lag feeling, others 15-30 (C&C3 requires 30 to be perfect).
 
WTH... you guys are all wrong, human eyes can only see 10 FPS but we have a 0.000001 ms refresh rate, that's why we don't see any ghosting or streaking at all (except when playing really fast with glow sticks in the dark).
HAHA pwned.
 
If it's faster than you can see, that's your framerate for your very own unique pair of eyeballs.

The end.
 
Um... yeah, in terms of FPS we can perceive, I'd say the max is 72 FPS; beyond that you cannot tell a difference. I can tell between 45 FPS and 60 FPS sometimes, but it is damn impossible for me to tell the difference between 60 FPS and anything higher. (I'll let my refresh rate go up sometimes to like 85.)
 
I think a lot of people are missing the point here...

It really comes down to,

If you had an LCD monitor showing a moving video (of anything) and you steadily upped that video's FPS, at what point would you stop noticing the difference?

Say you did it in 5s: start off at 1 FPS, then go to 6, then 11 and so on, noting every time you can tell there is a difference. Eventually you wouldn't be able to; wherever that point is, that would be the perfect FPS, regardless of what you're looking at.
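That experiment is easy enough to rig up yourself. A rough sketch using pygame (assuming pygame is installed; the square, its speed, and the 5 fps step size are arbitrary choices) that moves a square across the screen and lets you bump the frame rate with the space bar:

[code]
# Step the display rate up in 5 fps increments and watch when the motion stops improving.
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 200))
clock = pygame.time.Clock()
font = pygame.font.Font(None, 32)

fps_steps = list(range(1, 121, 5))      # 1, 6, 11, ... 116 fps
step, x, speed = 0, 0.0, 300.0          # speed in pixels/second, independent of fps

running = True
while running and step < len(fps_steps):
    fps = fps_steps[step]
    dt = clock.tick(fps) / 1000.0       # cap the loop at the current fps
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN and event.key == pygame.K_SPACE:
            step += 1                   # press space to move to the next fps step
    x = (x + speed * dt) % 640
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (int(x), 80, 40, 40))
    screen.blit(font.render(f"{fps} fps", True, (255, 255, 0)), (10, 10))
    pygame.display.flip()

pygame.quit()
[/code]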
 
When I was at university in '99 I could not stand it when CRT monitors were set at 70 Hz. I always had to change it to 85 Hz or above for me not to see the screen flicker. I absolutely couldn't stand it when I went to the computer lab, and I don't know why it felt like I was the only person who noticed this out of 300 computers. Sometimes I wondered if I was the crazy one. How does that compare to fps?

I CAN see the difference between 60 fps and 100 fps in first-person shooter games; it's blatantly obvious to me, more fluid-like.
 
... and a partridge in a pear tree.....

LOL you guys are all bickering about what Frame rate the eye can see. Some guy says 24, another guy juggles 60. Someone throws out a 72 and then we have mayhem.
 
I think 24fps is the figure that is quoted for the lowest framerate at which the human eye stops experiencing a 'slideshow' and experiences 'video'. It's video, but 24Hz still looks flickery when it goes dark between each frame: that's why a normal television has a different kind of PHOSPHOR COATING, so that each frame glows for at least 1/24s, until the next frame is scanned onto the coating. The phosphor coating on CRT monitors has far less persistence, which is why 60Hz is still horrid and flickery, and 85Hz is generally fine.

LCD monitors are of course fine at 60Hz, because the image persists brightly and reliably until the next frame comes. Apart from when gaming, I don't think we'd notice if we brought the refresh rate of a TFT down to 15Hz, because it doesn't go dark between frames and 1/15 of a second is too quick to notice jerkiness when all you're looking at is a flashing cursor. At most, the mouse might look jittery.

I'm talking about refresh rate rather than frame rate, though.
 
I thought we as humans interpreted motion (or light and color for that matter) as light waves being deflected off whatever object we're focused on (an orange chair is every color but orange, sort of thing), a form of bio-illuminating stimulation, as opposed to processing "frames" (still imagery) per second like a high-speed camera would... but what the hell do I know...

Anyway, all the numbers that have been stated thus far are wrong. Reason? Do any of you know what epilepsy is? Well, if you do, you would know that roughly only 5% of the 50+ million people diagnosed with epilepsy suffer from episodic attacks that are triggered by "flashing lights".

What you may not know is that everyone has a point at which they will start to exhibit neurological seizures like an epileptic who suffers from the "flashing light" symptoms would.

If they are exposed to a radically high number of images over an extended amount of time, their neurological buffer becomes saturated. This in turn will cause parts of the brain to "skip/lapse" over what they're viewing. In extreme cases it will cause seizures and possible neurological damage (used in military prisoner conditioning programs, AKA brainwashing).

It has to do with a neurological processing buffer, among other factors, and when the buffer is oversaturated it will cause these types of seizures when pushed well past its processing level for extended periods of time (other factors play a part in this too, like physical/emotional stress levels, e.g. the basic five-level "breakdown", undernourishment, sleep deprivation, etc.).

This buffer is different for everyone though. Good news is you don't possess the equipment to perform this on yourselves or your friends ;).
 
I see fluid motion at 20 FPS, 30 FPS is perfect, and 60 FPS is perfection with brilliant input response time.

Anything lower than 20 FPS = really noticeable lag. Under 10 FPS = Slideshow.
 
I thought we as humans interpreted motion (or light and color for that matter) as light waves being deflected off whatever object we're focused on (an orange chair is every color but orange, sort of thing), a form of bio-illuminating stimulation, as opposed to processing "frames" (still imagery) per second like a high-speed camera would... but what the hell do I know...

Actually, that chair "absorbs" every color except orange.
 
LCD monitors are of course fine at 60Hz, because the image persists brightly and reliably until the next frame comes. Apart from when gaming, I don't think we'd notice if we brought the refresh rate of a TFT down to 15Hz, because it doesn't go dark between frames and 1/15 of a second is too quick to notice jerkiness when all you're looking at is a flashing cursor. At most, the mouse might look jittery. I'm talking about refresh rate rather than frame rate, though.

LCD pixels do hang on for longer and thus the refresh rate doesn't have to be as high. However, LCDs have more of a problem with motion than CRTs do. Newer ones are better, but they still have the problem.
 
Personally, I don't know why everyone says 24 or 60 or even 72. It doesn't make sense. You can't compare the human eye to an electronic device that outputs images. I play Quake 3 competitively and I can notice a difference between 100 FPS and 120 FPS. I can even notice the difference between 120 FPS and 180 FPS. Even 240 FPS. For some reason, others can't see the same. :(
 
[RIP]Zeus said:
I am going to repeat you on this.

As I am sick and tired of stupid people saying we can only see 24 FPS...

http://amo.net/NT/02-21-01FPS.html

And I will make it a third quote - http://amo.net/NT/02-21-01FPS.html


And here is an important quote from that article:

Even if you could put motion blur into games, it would be a waste. The Human Eye perceives information continuously, we do not perceive the world through frames. You could say we perceive the external visual world through streams, and only lose it when our eyes blink. In games, an implemented motion blur would cause the game to behave erratically; the programming wouldn't be as precise. An example would be playing a game like Unreal Tournament, if there was motion blur used, there would be problems calculating the exact position of an object (another player), so it would be really tough to hit something with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned, that is the object wouldn't exist at exactly coordinate XYZ. With exact frames, those without blur, each pixel, each object is exactly where it should be in the set space and time.

The overwhelming solution to a more realistic game play, or computer video has been to push the human eye past the misconception of only being able to perceive 30 FPS. Pushing the Human Eye past 30 FPS to 60 FPS and even 120 FPS is possible, ask the video card manufacturers, an eye doctor, or a Physiologist. We as humans CAN and DO see more than 60 frames a second.
 
Well that's just great! We are all superhumans. :p

Can you display fps in STALKER or BF2142? How?
 