How Many Frames Per Second Can The Human Eye Really See?

Megalith

This is a pretty relevant topic, as high-refresh monitors become increasingly popular and some of us are on the fence as to whether we should upgrade to one or not. Beyond doing the obvious and just scoping one out in person, you can read about what some experts have to say about motion. One guy even claims you cannot see over 20 Hz, although the actual implication seems to relate to gameplay and performance, not visuals. There is also an argument there that resolution and contrast ratios are considerably more important than refresh rates. Many thanks to Gigantopithecus for sharing this one.

“Certainly 60 Hz is better than 30 Hz, demonstrably better,” Busey says. So that’s one internet claim quashed. And since we can perceive motion at a higher rate than we can a 60 Hz flickering light source, the level should be higher than that, but he won’t stand by a number. “Whether that plateaus at 120 Hz or whether you get an additional boost up to 180 Hz, I just don’t know.”

“I think typically, once you get up above 200 fps it just looks like regular, real-life motion,” DeLong says. But in more regular terms he feels that the drop-off in people being able to detect changes in smoothness in a screen lies at around 90Hz. “Sure, aficionados might be able to tell teeny tiny differences, but for the rest of us it’s like red wine is red wine.”

Chopin looks at the subject very differently. “It’s clear from the literature that you cannot see anything more than 20 Hz,” he tells me. And while I admit I initially snorted into my coffee, his argument soon began to make a lot more sense.
 
One major weakness in these studies is that they seem to mostly ignore variation between people. We know some people hear better than others, in that some people can hear frequencies below 20Hz and/or above 20kHz, and some people are more responsive to certain frequencies than others. We also know some people are able to taste and smell chemicals that others can't, and sometimes this is the result of simple genetics (like PTC tasters vs non-tasters). It's certain some of us can see refresh rates faster than others, and perceive finer contrast ratios than others, too.
 
Pretty good article, actually.

It boils down to this: latency is what we really want, not 1000 FPS. As a 120+ FPS nut (yeah, Quake played a huge part), I say that's largely true. Though some of the arguments are kind of bullshit.

“You can see this without a camera. Given all the studies, we’re seeing no difference between 20hz and above. Let’s go to 24hz, which is movie industry standard. But I don’t see any point going above that.”

Ok, professor. OK.

That said, I could see CRTs kind of flicker at 100 Hz. I also used to work with a person who didn't mind coding on a 60 Hz CRT back in the day. It was baffling, especially for a guy in his 30s. It varies GREATLY from person to person.
 
Bottom line is: the fps on the monitor cannot be the weakest link. We need the highest possible fps and let our eyes/brain sort out the rest.
And like most gamers, I can spot the difference between 30, 60, 90, and more fps.
 
That said, I could see CRTs kind of flicker at 100 Hz. I also used to work with a person who didn't mind coding on a 60 Hz CRT back in the day. It was baffling, especially for a guy in his 30s. It varies GREATLY from person to person.

Oh, the CRT days. I couldn't use any monitor until I switched it to at least 75Hz, which most monitors supported, or higher if possible. I couldn't understand how others didn't see the terrible flickering.
 
A fascinating read in and of itself, thanks for sharing.
 
The environment in which we are viewing something is also a factor. Depending on lighting (light level, type(s) used in the viewing space, etc.), the refresh rate needed may differ from person to person, among many other factors. It isn't exactly a one-size-fits-all situation that can be answered with a single solution.
 
Going from a 60hz monitor to 144hz in CSGO was night and day. Also, when my fps goes from 299 down to the 150s I can tell. Although tbh I'm not sure if that's because there may be some tearing at those fps, but it looks choppy to me. I'd try a 200+hz monitor if they made them, as I bet there'd be a slight performance advantage over 144.
 
The key takeaway, for me anyways, was the separation between the ability to perceive a difference (low vs. high framerates) and how effectively a person can react to the information being presented at different FPS.

It almost seemed to say that yes, you might be able to see a difference on the screen above 20 fps, but you can't react any faster, even with that ugly, annoying flicker/tearing in the video.

That does seem to miss another problem. Maybe the reason 24hz movies are OK is that we don't interact with them? Higher fps isn't just about having a better image (for me anyways); I have always found that, to a point, higher fps also means a more responsive system.
 
After more careful reading, I think Adrien Chopin is onto something. As I understand it, he's arguing the brain only takes 7-13 samples a second to process all the environmental and tracking data in a typical shooter. We just can't react fast enough. Good to know I guess, though new questions arise: which parts of the nervous system are slowing things down, and by how much? It's cool when science confirms my religious beliefs by telling me how things work on a divine level.
 
Personally I stop noticing a difference around 100Hz. The first time I jumped from a 60Hz monitor to a 120Hz monitor the difference was night and day for me - noticed it right away with just the mouse moving across my desktop. But since then I've played on 144Hz and 165Hz monitors and honestly don't notice any difference compared to the 120Hz monitor I played on years ago.
 
Good to know I guess, though new questions arise: which parts of the nervous system are slowing things down, and by how much?

Thinking of attempting an overclock of the CNS? I think that's called methamphetamine salts. Careful, if you try to run at those speeds without an adequate power supply and cooling, your system will crash .. but the fucking framerate is phenomenal.

:)
 
Pretty good article, actually.

It boils down to this: latency is what we really want, not 1000 FPS. As a 120+ FPS nut (yeah, Quake played a huge part), I say that's largely true. Though some of the arguments are kind of bullshit.

“You can see this without a camera. Given all the studies, we’re seeing no difference between 20hz and above. Let’s go to 24hz, which is movie industry standard. But I don’t see any point going above that.”

Ok, professor. OK.

That said, I could see CRTs kind of flicker at 100 Hz. I also used to work with a person who didn't mind coding on a 60 Hz CRT back in the day. It was baffling, especially for a guy in his 30s. It varies GREATLY from person to person.

Good quote. It is baffling that those people cite the industry standard of 24fps. That only makes the article look ridiculous when you consider that the 24fps standard was not set in the name of science but because of financial and technical limits. That is, the industry standardized on 24fps because it made economic sense and was a speed that was achievable with the tech of the day.

That these researchers think there is no reason to go above that implies they really have no clue. lol. What if the industry had stuck to the speed of silent films, which ran as low as 12fps?
 
For most of you in this thread so far, have you considered joining together and creating a test group that travels the globe and disproves decades of scientific studies and industry design? I mean, think about it: the chances of all of you coming together right here on this forum and in this thread... it can't be coincidence. This is destiny.

It must be impossible for you guys living in this new world where LEDs provide most of the light. I can't imagine what it must be like to have to look at 60hz flicker all day and night.

Or is it possible you just notice a change in FPS and assume that it's your golden eyes able to see something that mere mortals cannot? Like a propeller blade: do you really see the RPM it's turning at, or do you see the change in RPM?
 
Thinking of attempting an overclock of the CNS? I think that's called methamphetamine salts. Careful, if you try to run at those speeds without an adequate power supply and cooling, your system will crash .. but the fucking framerate is phenomenal.

:)

who told you to look into my personal life, spidey? who sent you?
 
Personally I stop noticing a difference around 100Hz.

One person quoted in the article estimates the threshold at about 90Hz and that sounds pretty solid to me. Back in the CRT days I could always tell up to 85Hz on any monitor, and on the same monitor I could notice a small difference when going up to 100Hz, so I knew the threshold was somewhere in the 85-100Hz range. Certainly people can see higher than that and that fact isn't disputed in the article, but he states that it becomes an audiophile-like experience at that point. Personally I think 90-100 should be good enough for most people.
 
At 60Hz, flicker is detectable by me on a CRT; so is 75Hz, but at 90Hz I no longer perceive the flickering. How does that translate to an LCD? Not sure, but I can certainly tell via input lag if a game is locked at 30fps, whereas I do not notice the input lag at or above 60.
The variation between individuals will be pretty big as well. The type of game you are playing will also affect your perception of this. If taken all by itself, ignoring the other factors that refresh rate affects, yeah, people may have a hard time telling.

In VR I can immediately detect input lag on the controllers when it slips into reprojection.
 
Yes, but the operating natures of LCD and CRT are very different. Your eyes focus on the two differently (hence the bloodshot eyes back in the CRT days). Again, you may have been seeing something else and just associated it with FPS directly.
 
I was a semi-pro FPS gamer who played at several ESL Cologne LANs. I can definitely tell the difference between 30, 60, 120, and 144hz monitors. Haven't had the pleasure of trying 180 or 200hz though.
 
Back when I had a CRT I had to set it to 85hz or the flickering gave me headaches when I tried to do any work on it. With LCDs, things have been different and 60hz has been fine. I bought a new 144hz display a while back and haven't noticed too much of a difference. Games appear smoother for sure, but that might just be the G-sync. As far as work in general, everything is about the same as it was on a 60hz LCD.
 
100Hz on an LCD is very different from 100Hz on a CRT and brain processing/reaction time has fuck all to do with motion smoothness.
 
CRTs and LCDs work differently. With CRTs, because each point on the screen was lit only for a brief instant as the electron beam swept past it, the screen had to be constantly "flashing" to refresh the image, thus the flicker. Since an LCD isn't "flashing" (the pixels hold their state between refreshes), there is no flicker. I'm sure you can find articles on how CRTs and LCDs work using Google.

Having said that, I will state that there is a noticeable difference between 30 FPS and 60 FPS. Will you notice a difference between 60 FPS and 120 FPS? Yes, but not as dramatic. It becomes a game of diminishing returns the higher you go. (Same with RAM). It helps when you have an adaptive sync monitor.
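To put rough numbers on the diminishing returns point, here's a quick back-of-the-envelope sketch in Python (just an illustration, not from the article): each step up in refresh rate shaves less time off every frame.

# Rough sketch (numbers illustrative): frame interval at common refresh rates,
# and how much time each step up actually saves per frame.
rates = [30, 60, 120, 144, 165, 240]
prev = None
for hz in rates:
    frame_ms = 1000.0 / hz  # how long one refresh lasts
    if prev is None:
        print(f"{hz:>3} Hz: {frame_ms:6.2f} ms per frame")
    else:
        print(f"{hz:>3} Hz: {frame_ms:6.2f} ms per frame ({prev - frame_ms:.2f} ms saved vs previous step)")
    prev = frame_ms

Going 30 to 60 buys you about 16.7 ms per frame, 60 to 120 buys 8.3 ms, but 120 to 144 only buys about 1.4 ms, which is why the returns feel like they flatten out.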

Outside of gaming, though, does the average user need anything better than 60 FPS? I don't think so. Sure, there are exceptions out there, but you don't need a 165FPS monitor to read web pages or work in Excel. More than likely, the consideration will be higher resolution, screen size, or color accuracy rather than FPS. My personal preference is 27" IPS monitors with a resolution of at least 1920x1080, although the 2560x1440 resolution on the ViewSonic XG2703-GS was really nice. Too bad a few pixels died and I had to return it for replacement yesterday.
 
I pretty much doubt that gamers can discern 120Hz from 144Hz on a high-end TN panel in A-B testing.
That is equivalent to saying that humans can confidently discern between 6.9ms and 8.3ms time intervals; no-fucking-one has an internal clock that precise. Prove me wrong, please: that would be a video worth a few million views for a YouTuber.
I do understand that, on average, the gamer's reaction time can be up to 1.4ms faster on the 144Hz panel, but this is not the same as the gamer being able to discern the faster frame rate in A-B testing.

In slower panel techs there are overdrive hints that can help, but as long as the pixel behavior is much faster than the refresh rate, A-B testing can be performed to debunk such myths.
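For what it's worth, here's a quick sketch of where a figure like that ~1.4ms comes from (Python, assumptions and numbers mine): if an input lands at a random point in the refresh cycle, the extra wait for the next refresh averages half a frame interval and is at most a full one.

# Sketch (assumptions mine): added display latency from waiting for the next refresh.
for hz in (120, 144):
    interval_ms = 1000.0 / hz
    print(f"{hz} Hz: interval {interval_ms:.2f} ms, "
          f"average wait {interval_ms / 2:.2f} ms, worst case {interval_ms:.2f} ms")
# Worst-case difference: 8.33 - 6.94 = ~1.4 ms; average difference: ~0.7 ms.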
 
Outside of gaming, though, does the average user need anything better than 60 FPS? I don't think so.

You can easily feel the difference in just moving the mouse and windows around.
 
It's amazing the amount of effort many put into telling others what they can and cannot perceive.
Probably more of a study paid for by industry, so they can decide if it's economically feasible to produce stuff above a certain "threshold" when only 4% are going to recognize it. Then again, bigger numbers are bound to make a good number of people think it's better and hence buy it.
 
Upon reading three words of the title, I present to you: me rolling my face on the keyboard:

we34s2drft
 
I was a semi-pro FPS gamer who played at several ESL Cologne LANs. I can definitely tell the difference between 30, 60, 120, and 144hz monitors. Haven't had the pleasure of trying 180 or 200hz though.

I can perceive the difference between 30, 60 and 120. I just got a new set of 144hz monitors to replace my 120hz ones, and I'm not perceiving the difference there. That is the smallest gap in the progression at the highest end, so it's not surprising to me.
 
For most of you in this thread so far, have you considered joining together and creating a test group that travels the globe and disproves decades of scientific studies and industry design? I mean, think about it: the chances of all of you coming together right here on this forum and in this thread... it can't be coincidence. This is destiny.

It must be impossible for you guys living in this new world where LEDs provide most of the light. I can't imagine what it must be like to have to look at 60hz flicker all day and night.

Or is it possible you just notice a change in FPS and assume that it's your golden eyes able to see something that mere mortals cannot? Like a propeller blade: do you really see the RPM it's turning at, or do you see the change in RPM?
You're being a bit snarky, but uh, yes, some lights do have a flicker that gives me a headache. Though I'm not one to say 60Hz is the culprit. TVs and such have never bothered me. Honestly I'd say 9/10 times with lights it's the circuit they're on being inadequate, or perhaps low-quality fixtures.
 
V-sync isn't the worst thing in the world. Hell, run a game at a hard 30 fps without screen tearing and with consistent frame timings and you'll find people can play it just fine, because the information is consistent. The human brain is pretty good at guessing and compensating; reaction time means so little compared to game knowledge and muscle memory for common situations.
 
Oh, the CRT days. I couldn't use any monitor until I switched it to at least 75Hz, which most monitors supported, or higher if possible. I couldn't understand how others didn't see the terrible flickering.
THIS! 60Hz would drive me bonkers. 75Hz was completely and totally tolerable. At around 90Hz it made no difference. It was a hair better than 75Hz, but just a hair. To me, the jump from 60 to 75 was immense, but beyond that? Meh. However, that's CRT tech. The 55" 4K LCD Sammy on my desk is perfectly fine at 60. It's NOTHING like the old CRT days, so I'm not sure that's a fair comparison.
 
When I upgraded from a 60hz monitor to a 144hz monitor, it was the single best graphical upgrade I've made, _ever_.

Nobody can convince me otherwise.

It's easy for me to see the difference between 60 and 120. I was on 120hz monitors for over six years until I went from 120 to 144, and I can't see the difference. For me, I think 120 is getting near the limit of perception for most people, at least going from 120 to 144.
 
The question isn't how many Hz a person can see; it's that we can detect when the framerate of a game doesn't match what the monitor is displaying.

60 fps may look smooth at 60Hz but it looks choppy as shit at 120Hz.
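A rough toy model of that mismatch (Python, assumptions mine, numbers illustrative): frames finishing at roughly 60 fps with a little render jitter, shown on a vsynced 120hz display, don't all stay on screen for the same number of refreshes, and that unevenness is what reads as choppiness.

# Toy model (assumptions mine): ~60 fps frames with render jitter on a vsynced 120 Hz display.
# Each refresh shows the newest finished frame; uneven persistence counts read as judder.
import random

random.seed(1)
refresh_ms = 1000.0 / 120                          # ~8.33 ms per refresh

frame_done = []                                     # completion time of each rendered frame
t = 0.0
for _ in range(30):
    t += 1000.0 / 60 + random.uniform(-3.0, 3.0)    # ~16.7 ms per frame, plus jitter
    frame_done.append(t)

shown = [0] * len(frame_done)                       # refreshes each frame spends on screen
latest = -1
for r in range(1, int(frame_done[-1] / refresh_ms)):
    now = r * refresh_ms
    while latest + 1 < len(frame_done) and frame_done[latest + 1] <= now:
        latest += 1
    if latest >= 0:
        shown[latest] += 1

print(shown)  # mostly 2s, but the 1s and 3s mixed in are the visible stutter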
 
THIS! 60Hz would drive me bonkers. 75Hz was completely and totally tolerable. At around 90Hz it made no difference. It was a hair better than 75Hz, but just a hair. To me, the jump from 60 to 75 was immense, but beyond that? Meh. However, that's CRT tech. The 55" 4K LCD Sammy on my desk is perfectly fine at 60. It's NOTHING like the old CRT days, so I'm not sure that's a fair comparison.
It was highly dependent on your monitor. With some resolution/frequency settings you could tell the monitor was struggling and had to back it down a notch. On my CRT (KDS VS-19sn), 85hz was noticeably easier on the eyes (clearer text, contrast, less noise and flicker).

As others have already mentioned, LCDs behave very differently. While higher refresh rate support does bring improvements, you are still dealing with behaviors unique to LCDs, particularly with changes in framerate and mismatches between monitor refresh rate and a given framerate, frame drops, motion issues, input lag, etc. It is hard to judge on an LCD what you can really see because its behavior is in no way linear and varies with each model.

Personally, I get more eye strain (red eye) on LCDs.
 
A buddy of mine is a retinal neuroscientist. He has the following to say on the subject:

It's actually a bit complicated, depending upon many factors. But the simple take-home is that humans max out at around 60Hz in the absence of any other cues. Birds? 120Hz.
 
What all these studies don't factor in, which really surprises me, is that the human visual system is just that, a SYSTEM. It is not a simple camera that takes snapshots of "frames" and then processes them. It is a COMPLEX SYSTEM with many things going on. Vision is our primary sense. You need to sit your retinal neuroscientist friend in front of a 60 fps game and then a 120 fps game.
 