What's the new "The human eye cannot see more than 30fps" mantra being preached today?

I remember for years, on many different online communities, people would proclaim "the human eye cannot see more than 30fps, and it's all in your head!" Nowadays I'm wondering: what's the new mantra the lemmings preach today that makes as little sense as the old claim that you can't see above 30fps? I ask because there is so much disinformation being paraded around as factual proof. People often don't reason that well, and then they use shoddy "evidence" to back up statements that are actually erroneous.
 
The new one is "your eyes don't see in 4K." All I have to say to that is, "No shit, Sherlock." Our eyes don't see in digital, pixelated resolutions?

But the real argument is that you don't need resolutions higher than 1080p because you can't tell the difference as our eyes blend it all together. It's just as stupid, reaching, and unscientific as the 30 FPS argument.
 
I can definitely see the difference between 1080 and 4K. However, I will say we are getting to the point where it's harder to tell because the pixels are literally so small our eyes can't see them individually.

There are other arguments around MP3 vs Lossless, and a lot of that may depend on the equipment you use (which is a whole other argument). Honestly, I'm not sure I've been able to tell a difference between 320 Kbps MP3 and FLAC, though I don't exactly have uber-high-end audio equipment.
 
I know it's not scientific, but I'll compare it to women and their purses/makeup. Everyone is different, with different needs and wants. Some don't care and some are anal.
I can tell the difference from 30fps to 60fps, and from 60fps to 120fps as well. 480p to 720p, and 720p to 1080p. But 1440p to 4K I can barely tell.

My son can spot the difference from 60fps to 50fps, but cannot tell the difference between 480p and 720p, nor 720p and 1080p, in movies.
My wife and daughter can't tell the difference between 30fps and 60fps, nor can they tell 480p from 1080p in movies.

Some people just don't care enough to notice the quality and speed differences between resolutions and frame rates.
Maybe they can tell the difference, but aren't bothered enough by it to actually admit it.

I love inviting a console player over to my house to play the same game on my PC and watching them get blown away by the difference in frame rates.
I know that when they go back home I've just ruined their console experience, and it gives me a warm fuzzy feeling.
 
I can actually tell the difference between 10% FPS increments. I know this because I've been in scenarios where a monitor or application was limited and I knew something was up.

However, I'd take 70-100 FPS with FreeSync over 100-144 Hz with plain fixed-refresh sync. It makes a huge difference.
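
For anyone wondering why that is, here's a rough toy model in Python (my own back-of-the-envelope sketch, not from any spec; the function name and frame times are made up) of what fixed-refresh vsync does to frame pacing compared to adaptive sync:

import math

def displayed_intervals_ms(frame_times_ms, refresh_hz=144, adaptive=False):
    # With fixed-refresh vsync, a finished frame waits for the next refresh
    # tick, so the time it actually spends on screen gets rounded up to a
    # whole number of refresh intervals. With adaptive sync (FreeSync/G-Sync),
    # the panel refreshes when the frame is ready, within its supported range
    # (range limits ignored here for simplicity).
    refresh_ms = 1000.0 / refresh_hz
    if adaptive:
        return frame_times_ms
    return [math.ceil(t / refresh_ms) * refresh_ms for t in frame_times_ms]

# Hypothetical GPU frame times hovering just above the 144 Hz refresh period
frames = [6.8, 7.5, 6.5, 9.0, 6.9]

print(displayed_intervals_ms(frames, 144, adaptive=False))
# fixed vsync: alternates between ~6.9 ms and ~13.9 ms -> visible judder
print(displayed_intervals_ms(frames, 144, adaptive=True))
# adaptive sync: the panel simply follows the frame times -> even pacing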
 
"You can't see 4K" is probably the most common. But I heard the same thing back in 2003, only then it was "Why do you need High Definition? DVD is more than enough!"

Another persistent piece of bullshit: "You need to sit farther back if you buy a larger TV!" I remember some people claiming that you need to sit at least 5 m away from a 40" TV, otherwise your brain will fry!

And this might not be something people say, but do: vertical videos. FFS, we're at the point where I've seen professionally edited vertical videos! The freak takes the effort to edit the video but still records it vertically. WTF?!

And the joker of them all, which never ceases to amaze: "I don't find X useful, therefore it's not useful." You can substitute anything for X, and someone has probably said it at some point.

because the pixels are literally so small our eyes can't see them individually.
That's actually another misconception: that resolution stops mattering when you can't discern individual pixels. You can't discern individual pixels on an HD screen either unless you lean extremely close to it.
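
If anyone wants to put rough numbers on it, here's a quick back-of-the-envelope calculation in Python. The ~1 arcminute acuity figure is the usual 20/20 benchmark, and the 55" example size is just an assumption for illustration:

import math

def pixel_resolve_distance_m(diagonal_in, horiz_res, aspect=(16, 9),
                             acuity_arcmin=1.0):
    # Distance beyond which a single pixel subtends less than the given
    # visual acuity, i.e. where individual pixels stop being separable.
    # Simplified geometry: square pixels, 20/20 vision ~= 1 arcminute.
    w, h = aspect
    width_m = diagonal_in * 0.0254 * w / math.hypot(w, h)
    pixel_pitch_m = width_m / horiz_res
    return pixel_pitch_m / math.tan(math.radians(acuity_arcmin / 60.0))

print(round(pixel_resolve_distance_m(55, 1920), 2))  # ~2.18 m for a 55" 1080p panel
print(round(pixel_resolve_distance_m(55, 3840), 2))  # ~1.09 m for a 55" 4K panel

By that yardstick you'd have to be within roughly 2 m of a 55" 1080p TV to pick out single pixels; whether extra resolution still helps from farther away (aliasing and so on) is the separate argument.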
 
I had a salesman at Microcenter today try to convince me, and another nearby customer, that our eyes can't see the difference between 1080p and 4K and that any perceived difference is placebo.
 
That's actually another misconception: that resolution stops mattering when you can't discern individual pixels. You can't discern individual pixels on an HD screen either unless you lean extremely close to it.

I do think that at a certain point you won't be able to tell the difference, regardless, given your eyes can't make out infinitesimally small variations. We may not be there quite yet, but I think eventually we'll get there.
 
I do think that at a certain point you won't be able to tell the difference, regardless, given your eyes can't make out infinitesimally small variations. We may not be there quite yet, but I think eventually we'll get there.
For movies, I think 8K is that point (since they have natural anti-aliasing). For games, however, I don't think even 32K would do it. For example, if you play at 4K with 4x SSAA, you're effectively rendering the equivalent of about 8K without AA, and you can easily spot the difference between 4K with AA and without it.
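
To make the supersampling arithmetic concrete, here's a tiny sketch (my own, assuming the common convention that "n× SSAA" means n samples per pixel, i.e. roughly √n times the resolution on each axis):

import math

def effective_render_resolution(width, height, ssaa_samples):
    # Ordered-grid SSAA with n samples per pixel renders roughly sqrt(n)
    # times the resolution on each axis before downsampling to the target.
    factor = math.sqrt(ssaa_samples)
    return int(width * factor), int(height * factor)

print(effective_render_resolution(3840, 2160, 4))   # (7680, 4320)  -> "8K"
print(effective_render_resolution(3840, 2160, 16))  # (15360, 8640) -> "16K"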

But I think FPS is more important than resolution beyond 4K. I'd take 4K 60 FPS for movies over 8K 30 any day.
 
I remember for years on many different online communities that people would proclaim "the human eye cannot see more than 30fps, and it's all in your head!" Nowadays, I'm wondering what's the new mantra that the lemmings preach about today that makes no sense like how people used to say you can't see above 30fps. I ask this because there is so much disinformation being paraded around as factual proof. People often don't reason that well, and then they sometimes use shoddy "evidence" to make a factual statement that is actually erroneous.

It's really more that people have a confirmation bias toward their own setup, or toward whatever their ideal setup would be. It doesn't just apply to this but to any number of topics, and not just tech-related ones either. But with technology there's the added aspect of purchase justification: you need to convince yourself you made the best purchase and are on the so-called right side.

This is why I think you should generally be wary of taking commentary for granted. Look at the whys behind what people say and don't focus just on the whats, basically.
 
I would say the common ones I hear now:

1) Don't use OLED TV for your computer because of "burn-in".
2) You need to upgrade your CPU to play games today at high resolution/FPS.
3) You need to drain your battery before you charge it for the first time.
4) Macs don't get viruses / Macs are more secure than Windows.
 
resolution stops mattering when you can't discern individual pixels
aliasing effects might beg to differ

the 30fps thing comes from film/video where the camera's shutter speed is such that it blurs the image enough to look smooth. with faster cameras now there is a more obvious reason to go higher fps

with video games of course it's different because there is typically no motion blur so you need more frames for that to appear smooth. there was one such study done with folks and I think above 120hz almost nobody could tell or feel the difference, and then at 240hz is where it stopped mattering... if anyone finds this study please link it?
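
On the film motion-blur point, here's a toy calculation (mine, with a made-up pan speed and the classic 180° shutter assumption) of how much smear a single film frame carries versus a game frame that's effectively a sharp instant:

def motion_blur_px(speed_px_per_s, fps, shutter_angle_deg=180.0):
    # Exposure time per frame = (shutter angle / 360) / fps; the blur trail
    # is simply how far the object moves while the shutter is open.
    exposure_s = (shutter_angle_deg / 360.0) / fps
    return speed_px_per_s * exposure_s

# An object panning across a 1920 px wide frame in 2 seconds (960 px/s).
print(motion_blur_px(960, 24, 180))  # ~20 px of smear per 24 fps film frame
print(motion_blur_px(960, 60, 0))    # 0 px: a rendered game frame with no motion blur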
 
aliasing effects might beg to differ

You're trying to make me look bad or what? Quoting my mention of the misconception out of context, as if I wasn't trying to debunk that notion but promote it?
 
aliasing effects might beg to differ

the 30fps thing comes from film/video where the camera's shutter speed is such that it blurs the image enough to look smooth. with faster cameras now there is a more obvious reason to go higher fps

with video games of course it's different because there is typically no motion blur so you need more frames for that to appear smooth. there was one such study done with folks and I think above 120hz almost nobody could tell or feel the difference, and then at 240hz is where it stopped mattering... if anyone finds this study please link it?

Hahahaahh. Another one I found.
 
The difference between 4K and 1080p should be very easy to see if you have a 4K monitor/TV. Just toggle the resolution. If you can't tell the difference, you either have a terrible monitor or you need an eye exam. That doesn't mean it's more important than framerate (I don't think it is), but it should still be a pretty obvious difference.
It feels like the people spouting "4K doesn't matter" are the same people who used to go on and on about how one console is better and the others aren't even worth owning.

I don't know if I've heard all that many head-scratchers lately beyond "4K looks the same as 1080p." The Apple vs. PC vs. Android chatter has mostly died down, and people are shockingly reasonable about the different platforms these days. I can't remember a time when gamers have been more indifferent to the "console wars," either.

We need a divergence of VR tech to rile people up ;)
 
You're trying to make me look bad or what? Quoting my mention of the misconception out of context, as if I wasn't trying to debunk that notion but promote it?
no

If you can't tell the difference, you either have a terrible monitor or you need an eye exam.
or the content itself isn't high enough res for it to make any difference

We need a divergence of VR tech to rile people up ;)
Yes, probably the next one will be about the width of the foveated rendering cone or something... "beyond 5 degrees is useless," said the partially blind person :D
 
or the content itself isn't high enough res for it to make any difference

True. That's especially the case with film. When you factor in film grain, inconsistent source material, 30% of your screen being unused (letterboxing), etc., it can be difficult to tell unless you're doing a head-to-head comparison. Ditto with older games that lack the high-resolution textures and effects that benefit from it.
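
That "30% of your screen unused" figure is in the right ballpark for scope-ratio films; here's a quick sketch of the math (mine, in Python) for anyone curious:

def letterboxed_fraction(screen_aspect, film_aspect):
    # When a wider film aspect is fitted to the screen's width, the picture's
    # height shrinks by screen_aspect / film_aspect; the rest is black bars.
    return 1.0 - (screen_aspect / film_aspect)

print(round(letterboxed_fraction(16 / 9, 2.39), 2))  # ~0.26 for a "scope" (2.39:1) film
print(round(letterboxed_fraction(16 / 9, 1.85), 2))  # ~0.04 for a 1.85:1 film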
 