[NBCNews.com] Have we surpassed the limits of human vision? Agree or Disagree?

Have we reached the limit of display resolution that our eyes can actually discern?

  • Agree

    Votes: 2 10.5%
  • Disagree

    Votes: 17 89.5%
  • Other (Share opinion below.)

    Votes: 0 0.0%

  • Total voters
    19

octoberasian

2[H]4U
Joined
Oct 13, 2007
Messages
4,082
http://www.nbcnews.com/technology/e...phones-surpass-limits-human-vision-2D11691618

"There's going to be some density beyond which you can't do any better because of the limits of your eye," said Don Hood, a professor of ophthalmology at Columbia University, in a phone interview with NBC News.

Manufacturers like Sony and Samsung tout their new 4K TVs as a revolution in imaging. Sony's website describes its displays -- which range in price from $3,000 to $25,000 -- as "four times clearer than HD." Samsung's $40,000 85-inch TV promises "a new form of fulfillment" with its "simply breathtaking resolution."


"Sony believes that the 4K picture quality difference is evident when seen in person, and we invite consumers to see and experience the difference for themselves because seeing is believing," Sony said in an email to NBC News in response to experts who questioned the practicality of the company's 4K displays. Samsung did not respond to similar inquiries.
Personal opinion:
I would love to have a higher-resolution display, and my next monitor will definitely be 1440p until 4K displays come down below $1,000 USD (and the sub-$1,000 4K Dell isn't it just yet; it needs to be lower). Then again, it's getting to the point where movies start to look like they're being performed live on stage, and you get to see how imperfect actors, actresses, and newscasters really look on TV.

Higher resolution also means I can see more in games like Civilization 5, or keep multiple windows open and do more than one thing on the screen at the same time.

The only things I don't like are the current prices, which will take time to come down but are getting lower each year, and that 60 FPS movies kind of irk me. I've shared my thoughts on the latter in another forum post.

What are your thoughts? Is there a point beyond which our eyes simply can't see the benefit of a higher-resolution display?
 
I don't think 4K is a big deal at all for 5-60" screens; I think it is much better suited to jumbo screens, 80 inches and up.


I don't want to see a flood of 4K TVs that are only 40-50 inches; manufacturers need to suck it up and start making wall-sized displays.

Why can't I have a cheaper wall display that is not projector based? What's with this 55" crap?
 
It all depends on the size of the device and your distance from it. You can see the individual pixels of a 4K 50" TV if you stick your eyes right up to the screen, a couple of inches away, but you'd be hard pressed to do the same on a 4K phone-sized device. 4K on smaller devices is pretty much past the point of diminishing returns for pixel perception and, honestly, wasteful on the battery. If you get far enough away from that same 4K 50" TV, you could swap it for a 1080p set and the two would be indistinguishable.

edit: To answer your question, we have not reached the limits of the human visual system, but 8K seems to be a good number, unless it's for super-massive screens.
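To put rough numbers on the distance argument above, here's a minimal sketch, assuming the common (and, as later posts argue, debatable) 1-arcminute acuity rule of thumb and square pixels; the 50" size comes from the post, everything else is illustrative.

```python
import math

def pixel_pitch_in(diagonal_in, width_px, height_px):
    """Physical size of one pixel in inches, assuming square pixels."""
    return diagonal_in / math.hypot(width_px, height_px)

def max_resolvable_distance_ft(diagonal_in, width_px, height_px, acuity_arcmin=1.0):
    """Distance beyond which a single pixel subtends less than `acuity_arcmin`."""
    angle_rad = math.radians(acuity_arcmin / 60.0)
    # Small-angle approximation: pixel pitch ~= distance * angle
    return pixel_pitch_in(diagonal_in, width_px, height_px) / angle_rad / 12.0

# Under the 1-arcminute assumption, a 50" 1080p panel stops showing individual
# pixels at roughly twice the viewing distance of a 50" 4K panel.
print(round(max_resolvable_distance_ft(50, 1920, 1080), 1))  # ~6.5 ft
print(round(max_resolvable_distance_ft(50, 3840, 2160), 1))  # ~3.3 ft
```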
 
I still like that DLP hides the dots even at 1080p in television sizes, giving a smoother, more film-like look. With a flat panel, you get a big one for home theater but then can't sit close enough to it for the home-theater effect because of the screen door. In this regard, I think 4K is an important advance...
 
First, "4K" or 3840x2160 is not "four times clearer than HD".
Study resolution and what it has meant from the beginning: pixel count is not resolution and never was. There's a relevant pastebin of quotes I collected from people who got it right.

sqrt((3840*2160)/(1920*1080)) = 2 = 200% of 1080p's resolution

Resolution is a measure of detail per unit of length, not area. That's why a 200 PPI display can only look (up to) twice as sharp as a 100 PPI display of the same size, even though the total pixel count is quadrupled.
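A quick sketch of that arithmetic (the 28" monitor size is just an illustrative assumption): quadrupling the pixel count at a fixed physical size only doubles the PPI, i.e., the linear resolution.

```python
import math

def linear_scale(w1, h1, w2, h2):
    """Linear resolution ratio between two pixel grids of the same physical size."""
    return math.sqrt((w2 * h2) / (w1 * h1))

def ppi(diagonal_in, width_px, height_px):
    """Pixels per inch along the diagonal, assuming square pixels."""
    return math.hypot(width_px, height_px) / diagonal_in

print(linear_scale(1920, 1080, 3840, 2160))  # 2.0 -> "200% of 1080p's resolution"
print(round(ppi(28, 1920, 1080)))            # ~79 PPI on a 28" monitor
print(round(ppi(28, 3840, 2160)))            # ~157 PPI: 2x as sharp, 4x the pixels
```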

As for the limitations of the human eye, I believe there are enough studies that dispel the "1 arcminute myth". In my educated opinion, the real upper limit is closer to 0.3-0.5 arcminutes.
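To see what those angular limits would demand of a display, here's a rough sketch converting an acuity figure into the PPI needed at a given viewing distance (the 24-inch desktop distance is my assumption; the 0.3-0.5 arcminute values are the poster's estimate, not established fact).

```python
import math

def ppi_for_acuity(acuity_arcmin, viewing_distance_in):
    """PPI at which a single pixel subtends `acuity_arcmin` at the given distance."""
    pixel_pitch_in = viewing_distance_in * math.radians(acuity_arcmin / 60.0)
    return 1.0 / pixel_pitch_in

# At a typical 24-inch desktop viewing distance:
print(round(ppi_for_acuity(1.0, 24)))  # ~143 PPI for the 1-arcminute rule of thumb
print(round(ppi_for_acuity(0.5, 24)))  # ~286 PPI
print(round(ppi_for_acuity(0.3, 24)))  # ~477 PPI
```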
 
Published data from my lab (before I joined it) has shown the ability to discriminate (static) test patterns that deviate by as little as 2 arc seconds, although that is an extreme case. Look into the hyperacuity research.

Last year I published data that showed that observers could detect changes between two moving patterns whose trajectories differed by as little as 15 arcseconds (0.25 arcminutes).

So for both moving and static patterns, we can perceive in the "hyperacuity" range.
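As a back-of-the-envelope illustration of how small those thresholds are (my arithmetic, not figures from the studies), here's the lateral offset that 2 or 15 arcseconds corresponds to at an assumed 1-meter viewing distance.

```python
import math

def offset_mm(arcsec, distance_m=1.0):
    """Lateral offset subtending `arcsec` arcseconds at `distance_m` (small-angle)."""
    return distance_m * math.radians(arcsec / 3600.0) * 1000.0

print(round(offset_mm(2.0), 4))   # ~0.0097 mm (about 10 micrometers) at 1 m
print(round(offset_mm(15.0), 4))  # ~0.0727 mm at 1 m
```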
 
Last edited: