How Many Frames Per Second Can The Human Eye Really See?

24 fps is death in any online fps.

I agree that after 100 fps you can't feel a difference.

Disagree? See you on the battlefield!

I used to play True Combat: Elite at 25-30 fps, no problem whatsoever. I still prefer 60+, but it wasn't a problem.

 
It depends on a great many things, most importantly motion blur. Movies that run at 24fps seem smooth because of motion blur.
It might seem smooth, but stop the movie and the still image in any action scene looks like a smear. There is no information there. The whole point of FullHD and 4K loses its meaning then and there.

That's why I'd have preferred moving from FHD@24FPS to FHD@96FPS and no motion blur but clear crisp picture. Where all the information is there, and it's not left to your brain to fill in the gaps when movement is happening. It'd be much less straining.

That's why the first post effect I turn off in games is always motion blur. Because it's fake, it doesn't add anything, and the screen actually loses information because of it.

And idiots are already planning 8K screens. Do 4K@60fps first at least, you morons.

In this regard old analog interlaced screens were better, because they had more motion information: each field was captured at a different moment in time. It was great, because when there was no motion it was equivalent to full progressive. And during motion it lost some resolution but more than made up for that with fluid motion and clearer details, compared to crappy 25 or 29.97 fps depending on your region.

And then came the de-interlacing methods that added great interpolation algorithms that made the drawbacks even less significant.
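For anyone curious what the two classic starting points actually do, here's a minimal sketch in Python/NumPy (the field arrays are hypothetical stand-ins; real deinterlacers layer motion-adaptive blending on top of these):

```python
import numpy as np

def weave(top_field, bottom_field):
    """Weave: interleave the two fields line by line. Perfect for still
    scenes (full vertical resolution), but 'combing' appears during
    motion because the fields were captured at different moments."""
    h, w = top_field.shape[0] * 2, top_field.shape[1]
    frame = np.empty((h, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines from the first field
    frame[1::2] = bottom_field   # odd lines from the second field
    return frame

def bob(field):
    """Bob: show each field on its own, doubling its lines to restore
    frame height. Keeps the extra temporal samples (fluid motion) at
    the cost of halved vertical resolution."""
    return np.repeat(field, 2, axis=0)
```

The motion-adaptive deinterlacers mentioned above essentially pick weave per-pixel where nothing moved and bob (plus interpolation) where something did.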
 
Outside of gaming, though, does the average user need anything better than 60 FPS? I don't think so. Sure, there are exceptions out there, but you don't need a 165FPS monitor to read web pages or work in Excel. More than likely, the bigger considerations are resolution, screen size, or color accuracy rather than FPS. My personal preference is 27" IPS monitors with a resolution of at least 1920x1080, although the 2560x1440 resolution on the ViewSonic XG2703-GS was really nice. Too bad a few pixels died and I had to return it for replacement yesterday.

You have not lived until you have scrolled through a document at 144 fps. Just reading through forums was a huge improvement when I got my first 144Hz display.
 
If you test with people who are totally ignorant of frames per second and refresh rates, there will be no placebo effect. i.e. You don't even need to tell them it's an "experiment". I just thought that it didn't need to be said since it can be deduced pretty easily. It's just like when the Hobbit went to 48 fps and lots of people complained of how horrible it looked. They didn't know that the film was shot/played at 48 fps, all they knew was that their movie watching experience was different.

So you are arguing that it doesn't need to be a blind test because you can make the test blind?

Here is the issue with your approach: you have a forced outcome.
Let's say you show them the different fps like you said, i.e. blind (they don't know), and since the tester is not there you could almost argue that you are in fact arguing for a double-blind test. But anyway.
Then you ask them afterwards which movie/game looked the best. Most people who might not even see a difference will answer one or the other, because they are now under the assumption that there is a difference, and boom, your placebo effect comes into play again.

With a correctly done ABX test there is no preference involved at all, no subjective measuring, and no placebo. It's one of the most objective ways (if not the most objective way) to measure a human's sensory ability.


When you can have people experience/report a difference between two identical situations just by asking them which is the better one, then no, you can't use your method of simply planting them in front of two situations (60fps and 120fps) and asking them which one is best. You need to do the test properly and acknowledge the null hypothesis.
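For the record, a bare-bones ABX loop is simple enough to sketch (a minimal sketch in Python; present_clip and ask_subject are hypothetical stand-ins for however you actually show the stimuli and collect answers):

```python
import random
from math import comb

def abx_session(present_clip, ask_subject, trials=20):
    """Run ABX trials. A and B are the two conditions (say 60fps and
    120fps); X is secretly a re-showing of one of them. The subject is
    never asked which looks 'better', only whether X matched A or B,
    so preference and suggestion never enter into it."""
    correct = 0
    for _ in range(trials):
        x_is_a = random.choice([True, False])
        present_clip("A")
        present_clip("B")
        present_clip("A" if x_is_a else "B")  # shown to the subject as "X"
        answer = ask_subject()  # subject answers "A" or "B"
        if (answer == "A") == x_is_a:
            correct += 1
    # One-sided binomial p-value: the chance of scoring this well by
    # pure guessing. A small value rejects the null hypothesis that
    # the subject cannot tell the conditions apart.
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials
```

With 20 trials, 15 or more correct gives p of about 0.021, which is the kind of hard number that "which looked best?" questioning can never produce.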
 
It's not about best, it's about noticing a difference. Just like how people watching the Hobbit at 48 fps didn't think 48 fps was better, they just knew it was different. But you're right, not everyone will see a difference; not everyone is as sensitive. But that is an acceptable response to the experiment and wouldn't create a null-hypothesis situation.

I am not sure how the statement "most people who might not even see a difference will answer one or the other, because they are now under the assumption that there is a difference, and boom, your placebo effect comes into play again" is the placebo effect. The placebo effect can only come into play in an experiment like this if I told the people beforehand that one screen they are looking at has a higher refresh rate, or if I said something like "higher refresh rates can be linked to better-looking displays". But if I do not give the subject ANY background information, and just tell them to look at these different displays, then there can be NO placebo effect, because they do not know what the variable is. Placebo can only happen if the subjects know, or think they know, what an expected outcome should be; if they don't know, then no placebo.
 
Sometimes I switch my 144hz display to 120hz to stream to my Chromecast, if I don't my display turns to static and I have to alt+f4 until I get my desktop back. I'll then launch a game like CS for example, notice something isn't quite right, and switch back to 144. 60hz is almost unbearable for me, but I've had extremely sensitive eyes since I was a child. I remember going to the dentist and stacking 4 pairs of sunglasses and STILL uncontrollably tearing up from the light.
 
I used to play True Combat: Elite at 25-30 fps, no problem whatsoever. I still prefer 60+, but it wasn't a problem.




But when "you used to play" was it against people who had 165 Hz monitors with video cards that can kick ass ? Or was it back when everyone was doing 60 - 85 Hz ?
 
You have not lived until you have scrolled through a document at 144 fps. Just reading through forums was a huge improvement when I got my first 144Hz display.

I don't doubt that, and I saw that when I had my 144Hz display (a ViewSonic XG2703-GS which I had to return due to dead pixels). But, that makes it "nice to have", not "need".
 
I don't doubt that, and I saw that when I had my 144Hz display (a ViewSonic XG2703-GS which I had to return due to dead pixels). But, that makes it "nice to have", not "need".
But that's a stupid way of looking at things. By that logic we don't need more than minimum hardware specs for anything.
 
I don't doubt that, and I saw that when I had my 144Hz display (a ViewSonic XG2703-GS which I had to return due to dead pixels). But, that makes it "nice to have", not "need".

I feel like I can work longer in front of a high framerate display without taking my eyes off it. I get much more eyestrain using a 60hz in my office than my 144 at home. Do I NEED it to be productive? Not really, but it helps.
 
But that's a stupid way of looking at things. By that logic we don't need more than minimum hardware specs for anything.

True. Unfortunately, under the criteria of the situation, that's what you have to work with. If money is a constraint, compromises have to be made. :(

Just for fun, I decided to look up the price of a 27" monitor (which I use at work) with a resolution of 1920x1080, and don't care about the panel type. This is what I found based upon lowest price at NewEgg:
60Hz - Acer G6 Series G276HLGbd Black 27" 6ms Widescreen LED - $150. Only has D-Sub and DVI.
75Hz - DELL SE2717Hx 27" Black IPS LCD/LED Monitor - $155. D-Sub and HDMI.
120Hz - Only two monitors, both over $500, and both out of stock.
144Hz - Acer GN276HL - $279. D-Sub, DVI, HDMI.

I can probably get the 75Hz monitor approved since it is only a few dollars more. But the difference between 75Hz and 144Hz (a 69Hz difference) is $124.

Since the current generation of video cards uses DisplayPort, let's add that as a constraint. Again, lowest-priced monitor at NewEgg.

60Hz - AOC i2769Vm - $180.
75Hz - None listed.
120Hz - Only one monitor, it's over $500, and it's Out Of Stock.
144Hz - AOC G2770PQU - $289.

So, per NewEgg, the difference between a 60Hz monitor and a 144Hz monitor is $109. ¯\_(ツ)_/¯ If you are a gamer, sure, you can justify the increased price. If you are in a typical office environment, the higher refresh rate is "nice to have", not a "need".

And, note, I said "lowest priced", not good quality or well-reviewed. I don't know the quality of the monitors, how well-reviewed they are, or such. None have any adaptive sync technology.
 
That's why more people should push for 144 Hz minimum and drive it mainstream to lower the prices and raise the standard. We were so far ahead with CRT regarding motion it is sad to be stuck with such a lackluster standard today.
 
I'm a fucking scientist, of course I distrust articles posted on a topic that has literally been beaten to death. These articles sprout up all the time and try to defend the false claims presented, and I'm tired of it, having used high-refresh monitors since the 90s. You can believe all you want that your reactions to blips don't happen, while I live in a world of fully firing brain functions. If we couldn't react to a change in perception, then in the past animals would have eaten us. Reactions don't exist on a sine wave while we wait for a loop to complete like on a CPU; we function in linear real time and react to stimuli accordingly.

As a scientist, you are familiar with the straw man and what a specious argument is?
 
I'm surprised 60Hz screens are even allowed any more, as they always seem to cause a large number of people pain.
 
I'm surprised 60Hz screens are even allowed any more, as they always seem to cause a large number of people pain.

Well, as I sit at work on a Government computer that I spend most of 8 hours a day using, the 22" displays (3 of them) are all set at 1680x1050 @ 60 Hz.
 
A buddy of mine is a retinal neuroscientist. He has the following to say on the subject.

It's actually a bit complicated, depending upon many factors. But the simple take-home is that humans max out at around 60Hz in the absence of any other cues. Birds? 120Hz.
The lighting in my office flickers at 120 Hz. It doesn't matter if I'm staring at a monitor or a blank wall, I will get an ocular migraine if I don't cover my cubicle to keep the light out.
I did read the article, but it was my own fault for not saying that I wasn't responding to it. I was replying specifically to the "A buddy of mine is a retinal neuroscientist" post.
The article actually does a very good job of explaining that the visual system is a SYSTEM. If you want me to address the article itself, here it is. I realize that the "climax" of the article, if I can use the term very loosely, is when the scientist says, “Just because you can see the difference, it doesn’t mean you can be better in the game,” and “After 24 Hz you won’t get better, but you may have some phenomenological experience that is different.” I WILL BET MY LIFE SAVINGS that in a fast-paced game, any average or above gamer will perform better at 60 fps than at 24 fps.



Sorry sir, but I took quite a few neuropsych courses in university, and while my major was NOT the visual system (it was language and neural networks, and yes, I did participate in quite a few labs; I know the scientific method and how experiments work), I did do a significant amount of work on it. I was just oversimplifying the fact that a lot of people can tell the difference between 60 fps and 120 fps, and you don't need a double-blind test or a scientist to prove it.
Case study of myself and Dark Souls 3. On the same 60 Hz display, I get eye pain and have difficulty playing on the Xbox ONE version (to the point I quit the game after getting only about 10% through it), where the framerate throughout is 30 FPS or worse. On the PC version where it runs well above 60 FPS I have no issues and have completed the game many times. Had the same experience with Arkham Knight on the PS4 and PC.
 
"Let’s go to 24hz, which is movie industry standard. But I don’t see any point going above that"

Fuck that! I do see each individual frame in a 24fps movie. It does seem like motion, but it is definitely not smooth and natural; it's just a fast, stuttering series of images. That's why I have to use motion interpolation. I just have to keep it on low values to avoid artifacting as much as possible, but still strong enough to make motion look somewhat lifelike.
 
Show me a human who can successfully distinguish between 70, 75, and 80 FPS set ups of the same moving environment, and I will show you my pet unicorn.

Been hearing for years people claiming they can see past 100fps, it's amazing how easily we can fool ourselves.

NASA did studies on this, apparently; the absolute peak pilots topped out at 72 fps and could not discern between higher frame rates.

I always thought companies like Valve used this number for max frame caps because of this
 
The lighting in my office flickers at 120 Hz. It doesn't matter if I'm staring at a monitor or a blank wall, I will get an ocular migraine if I don't cover my cubicle to keep the light out.

Case study of myself and Dark Souls 3. On the same 60 Hz display, I get eye pain and have difficulty playing on the Xbox ONE version (to the point I quit the game after getting only about 10% through it), where the framerate throughout is 30 FPS or worse. On the PC version where it runs well above 60 FPS I have no issues and have completed the game many times. Had the same experience with Arkham Knight on the PS4 and PC.
pretty sure your office lights flicker at 60, not 120.
 
I'll believe it when I see it.

The thing is, a sudden burst of light (i.e. something suddenly appearing) is perceived very differently from motion.

When following movement on a screen where the next frame is predictable, the human brain doesn't process the images quite as quickly. A game where things are constantly, unpredictably changing is different.
 
Go to http://testufo.com/
Print out a picture of the UFO. Slide it across the screen at the same rate that the UFOs are drawn.
What looks smoother? Your monitor or the piece of paper?


Yes, there's a pretty big difference between 30, 60, and 144. And 144 still isn't perfect, not by a long shot.

I think that John Carmack said that the eye can see into the 1000s. Of course, that's well beyond the point of diminishing returns.
 
I'll believe it when I see it.
haha I see what you did there.

Yeah, we can discern super high fps, nearly 300 FPS, when we are trying to pick out a single flashed image, but in moving images you're not seeing anything past 70, unless of course you're someone who's also a hypochondriac.
 
haha I see what you did there.

Yeah, we can discern super high fps, nearly 300 FPS, when we are trying to pick out a single flashed image, but in moving images you're not seeing anything past 70, unless of course you're someone who's also a hypochondriac.

Bull freakin horseshit

oh god, dgz the exhumator :/
 
I'm banging out an XB240H@144Hz@1ms, 333fps on my cod1, coduo, cod2, codmw1 and codwaw.

The reasoning behind high fps is that you can jump higher and get on ledges etc.; "you're not so heavy in the gameplay but light as a feather".

144Hz or higher is where it's at.

I used to bang out a 75Hz 17" monitor @ 125-333fps on my cod games.

I really don't care what the eye can see or not. Higher fps is the best gameplay.

Edit:

Screw gsync, ulmb and amd sync technology.

I'm up to 149Hz now :D
http://www.overclock.net/t/1493866/guide-overclocking-your-monitor
 
It makes me laugh, people comparing darkened movie theaters with gaming. I, personally, can't perceive FPS differences above 70Hz, or rather, FPS rates above 70. A constant 70fps, after a few minutes, will be much smoother than FPS fluctuating between 74 and, say, 90fps, due to frame-time artifacts. The human eye is much better at seeing differences than consistencies. This is why tearing, hitching, frame-time differences etc. are so aversive.
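To put rough numbers on that (a quick back-of-the-envelope sketch in Python; the fps values are just illustrative):

```python
def frame_time_ms(fps):
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

# A locked 70 fps delivers a new frame every ~14.3 ms, like clockwork.
print(frame_time_ms(70))   # 14.29 ms, every single frame

# Fluctuating between 74 and 90 fps means frame times swinging between
# ~11.1 ms and ~13.5 ms. That ~2.4 ms of jitter reads as stutter even
# though every individual frame arrives faster than at a locked 70.
print(frame_time_ms(90))   # 11.11 ms
print(frame_time_ms(74))   # 13.51 ms
```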

I have a small parrot, and I just considered what this might be doing to him/her... It used to HATE my old LCD TV, but has no issues hanging out with (and actually really enjoys some of) the programming that plays on my old Sony Plasma.

For most of you in this thread so far: have you considered joining together and creating a test group that travels the globe and disproves decades of scientific studies and industry design? I mean, think about it, the chances of all of you coming together right here on this forum and in this thread... it can't be coincidence. This is destiny.

It must be impossible for you guys living in this new world where LEDs provide most light. I can't imagine what it must be like to have to look at 60Hz flicker all day and night.

Have you ever heard about a capacitor? For smoothing driver input? You can build an LED supply for a few dollars that switches well beyond the perceivable refresh of the eye... The same way we can use switching power supplies in audio systems (and measure their major harmonic frequencies) without hearing (MOST) switching noise.

LED strings, like Christmas lights, without capacitors, actually do make me throw up when I drive by. It took me into my mid-20s to really piece it together.

pretty sure your office lights flicker at 60, not 120.

Most commercial LED drivers double the 60Hz pulse to form 120Hz, with smoothing capacitors on the outside of the MOV output to further smooth it. An improperly sized capacitor, or multiple diodes where one diode is failing or out of spec, can make people sick, as the diode discharges the smoothing capacitors at different rates depending on its junction temperature (and remember, LEDs are diodes, so the relationship between power consumption and Tj is positive, and can run away).
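For a rough sense of the sizing involved, here's a back-of-the-envelope sketch using the standard full-wave ripple approximation V_ripple ≈ I / (f × C); the load current and capacitor values are just illustrative:

```python
def ripple_voltage(load_current_a, ripple_freq_hz, capacitance_f):
    """Approximate peak-to-peak ripple of a capacitor-smoothed
    rectifier: V_ripple ~ I / (f * C). Full-wave rectification of
    60 Hz mains is what produces the 120 Hz ripple frequency."""
    return load_current_a / (ripple_freq_hz * capacitance_f)

# Hypothetical 350 mA LED string fed from a 120 Hz full-wave rectifier:
print(ripple_voltage(0.35, 120, 470e-6))   # ~6.2 V of ripple with 470 uF
print(ripple_voltage(0.35, 120, 4700e-6))  # ~0.62 V with 4700 uF
```

Which is the whole point about sizing: undersize that capacitor and the 120 Hz flicker depth gets large enough for sensitive people to notice.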
 
Can't directly compare fluorescent lighting flicker to the refresh rate of a game; different tricks make their respective fields work. I seem to remember reading something about light reflecting/refracting off various surfaces for a split second after an off pulse. Our eyes could also be more sensitive to light or motion, reducing the time our eyes see nothing during the off cycle.
 