8K Broadcasting Begins in Japan

Again, even in that sector, the innovation is coming from the independent orgs. MKBHD, LTT, and many other YouTubers are already shooting in 8K and uploading in 4K, and they don't have the bankroll of major "traditional" broadcasters or content creators. Seriously awesome cameras are the cheapest they've ever been: you can get a rad 4K camera for $2k-$10k depending on what you need, as opposed to past decades, when $200k-$500k got you maybe a 1080p camera.

Sorry, I meant by traditional content providers.
 
FYI, humans have on the order of 1 arcminute of resolution, not 1 arcsecond. You physically could not have 1 arcsecond resolution unless your pupil were about 140mm wide; that's a hard diffraction limit set by the laws of physics, not something "better eyesight" can overcome.

1 arcsecond would be like resolving a separation of 1mm at a distance of 200 meters, or 5 microns at 1 meter, which is about the size of a large bacterium.
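
For anyone who wants to sanity-check that 140mm figure, here's a quick back-of-the-envelope sketch in Python using the Rayleigh diffraction criterion (theta ≈ 1.22 λ/D), assuming ~550 nm green light; it also reproduces the separations quoted above:

```python
import math

ARCSEC = math.radians(1 / 3600)  # one arcsecond in radians

# Rayleigh criterion: theta ~ 1.22 * wavelength / aperture.
# Solve for the aperture needed to resolve 1 arcsecond at ~550 nm:
wavelength = 550e-9  # metres (green light, an assumption)
aperture = 1.22 * wavelength / ARCSEC
print(f"aperture needed: {aperture * 1000:.0f} mm")  # ~138 mm, i.e. ~140 mm

# Sanity checks on the separations quoted above:
print(f"1 arcsec at 200 m: {200 * ARCSEC * 1000:.2f} mm")  # ~0.97 mm
print(f"1 arcsec at 1 m:   {ARCSEC * 1e6:.1f} microns")    # ~4.8 microns
```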

You are correct, I meant arcminute. Apologies, I am an astrophotographer and used to referring to arcseconds all the time ;)
 
They are 720p@60fps instead of 1080p@30fps. 720p is actually significantly better for sports because of the increased framerate.

Yea, but it looks jagged; I'd rather have 1080p at 60fps. I have a 75" 4K TV in the living room, and when I watch with family or friends it tends to look less than great.
 
8K does not have to be that big. More like my attachment.



For home usage, yes, but for theaters there are already a couple that have migrated to 8K LED walls. The great thing about those is that while the industry is still working on a standard 8K distribution method, any such standard would only require upgrading the backend, not the display itself, since the two are inherently decoupled.


Based on the scientifically established norm that the human eye has a 1 arcminute resolution, limited by the cornea, we wind up with a calculated, objective distance/size/resolution chart like this:

[attachment: tv-size-distance-chart.png]


The problem with 8K is that in order to discern the detail differences you have to be so close to a large screen that the screen is larger than your field of view.

There are certainly applications for this: if you are using a very large screen to provide detailed imaging in the center and using the surroundings for peripheral vision (this is what I do when I sit within 2 ft of my 48" 4K screen while gaming), or if you have applications where you want the viewer to turn their head (or move their eyes) and look around at different parts of the screen. But for traditional viewing, I'd argue it is of limited use, home or otherwise.

Yes, movie theaters have very large screens, but you also generally sit much further away from them than you do from your TV at home, and even if you didn't, you'd wind up in the situation where the screen is bigger than your field of view, limiting its usefulness.
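
The geometry behind that kind of chart is easy to reproduce. Here's a rough Python sketch, assuming the 1-arcminute figure and a 16:9 panel (illustrative numbers, not a substitute for the actual chart); it also shows how wide the screen looms in your field of view once you sit close enough for 8K to matter:

```python
import math

ARCMIN = math.radians(1 / 60)  # one arcminute in radians

def max_sharp_distance_in(diagonal_in, vertical_px, aspect=16 / 9):
    """Farthest distance (inches) at which one pixel still subtends 1 arcminute."""
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)
    pixel_pitch_in = height_in / vertical_px
    return pixel_pitch_in / math.tan(ARCMIN)

def horizontal_fov_deg(diagonal_in, distance_in, aspect=16 / 9):
    """Horizontal angle (degrees) the screen spans at a given distance."""
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    return math.degrees(2 * math.atan(width_in / 2 / distance_in))

for label, v_px in [("1080p", 1080), ("4K", 2160), ("8K", 4320)]:
    d = max_sharp_distance_in(75, v_px)
    print(f'75" {label}: resolvable out to {d / 12:.1f} ft, '
          f"where the screen spans {horizontal_fov_deg(75, d):.0f} deg")
```

On a 75" panel this works out to roughly 9.8 ft for 1080p, 4.9 ft for 4K, and 2.4 ft for 8K, with the screen spanning about 31, 58, and 96 degrees respectively: the "bigger than your field of view" problem in numbers.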
 
They are 720p@60fps instead of 1080p@30fps. 720p is actually significantly better for sports because of the increased framerate.

That's kind of a misconception. 720p is WORSE for sports than 1080p, but in the tradeoff between resolution and refresh rate, people tend to choose higher refresh for sports. If 1080p60 were an option, that would clearly be superior.

Furthermore, interlaced modes just need to die a horrible, horrible death. There is no acceptable use for them in 2018, and there hasn't been for decades.
 
So we already have so much mainstream 4K content and in-home screen penetration that we need 8K? I mean, I'm sure 8K will be great, but uh... did I miss the 4K revolution?
 
Based on the scientifically established norm that the human eye has a 1 arcminute resolution, limited by the cornea, we wind up with a calculated, objective distance/size/resolution chart like this:

[attachment: tv-size-distance-chart.png]

The problem with 8K is that in order to discern the detail differences you have to be so close to a large screen that the screen is larger than your field of view.

There are certainly applications for this: if you are using a very large screen to provide detailed imaging in the center and using the surroundings for peripheral vision (this is what I do when I sit within 2 ft of my 48" 4K screen while gaming), or if you have applications where you want the viewer to turn their head (or move their eyes) and look around at different parts of the screen. But for traditional viewing, I'd argue it is of limited use, home or otherwise.

Yes, movie theaters have very large screens, but you also generally sit much further away from them than you do from your TV at home, and even if you didn't, you'd wind up in the situation where the screen is bigger than your field of view, limiting its usefulness.

“The Chart” is known to be BS, if you didn't know. I sit 11 ft from my 75" and can clearly, easily see 1080p vs 2160p. Clear as day, and my eyesight isn't perfect...

That chart has been circulating for years and was debunked long ago...
 
That's kind of a misconception. 720p is WORSE for sports than 1080p, but in the tradeoff between resolution and refresh rate, people tend to choose higher refresh for sports. If 1080p60 were an option, that would clearly be superior.

Furthermore, interlaced modes just need to die a horrible, horrible death. There is no acceptable use for them in 2018, and there hasn't been for decades.

1080p@60fps wasn't an option. All the production equipment is built around NAB standards, which are either 720p@60 or 1080i@60 / 1080p@30.
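
Rough arithmetic (ignoring blanking intervals and compression) shows why those formats ended up paired: their raw pixel throughput is nearly identical, while true 1080p60 would roughly double it. A quick sketch:

```python
# Raw pixel throughput of the common broadcast formats. 1080i60 carries
# 60 fields/s of 1920x540, so its rate matches 1080p30, while 1080p60
# would roughly double the load on the whole production chain.
formats = {
    "720p60":  1280 * 720 * 60,
    "1080i60": 1920 * 540 * 60,   # interlaced: half-height fields
    "1080p30": 1920 * 1080 * 30,
    "1080p60": 1920 * 1080 * 60,  # the format that wasn't on the menu
}
for name, rate in formats.items():
    print(f"{name}: {rate / 1e6:6.1f} Mpx/s")
```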
 
“The Chart” is known to be BS, if you didn't know. I sit 11 ft from my 75" and can clearly, easily see 1080p vs 2160p. Clear as day, and my eyesight isn't perfect...

That chart has been circulating for years and was debunked long ago...


I'd argue that is mostly placebo.

The 1 arcminute capability of the eye is well established by scientific studies. The rest from there on is simple geometry and trigonometry. This is objective, not subjective.

There are close parallels to the audiophile world.

You are probably responding more to image settings, backlighting, color, contrast, etc. of newer screens, giving a perceived higher level of fidelity and attributing it to resolution, just like how in audio higher volume tricks our brain into thinking higher quality.
 
You are correct, I meant arcminute. Apologies, I am an astrophotographer and used to referring to arcseconds all the time ;)
You're forgiven :) I understand; when you constantly use one particular unit, reaching for it becomes almost a knee-jerk reflex whenever you're talking about similar kinds of quantities.
 
“The Chart” is known to be BS, if you didn't know. I sit 11 ft from my 75" and can clearly, easily see 1080p vs 2160p. Clear as day, and my eyesight isn't perfect...

That chart has been circulating for years and was debunked long ago...

Yes and no... and picking one specific subset of that chart to argue against in order to debunk the whole thing is disingenuous. The reason why you see the difference between 1080p and 2160p is that your brain has enough time to process enough frames to fill in the details. PERIOD. We are biological organisms, and thus the capacity of certain organs of our body can be discretely and ACCURATELY measured. On static images our eyes are quite amazing, but once you start moving things around, well, the answer is definitely limited by your brain's ability to process the delta changes. If the scene moved fast enough, I would bet you $100 that you could not tell the difference between a 4K and a 2K video.

So yes, for a rapidly changing image, the chart is a damn good place to start. It isn't absolute, though. Arguing about static image resolution on a projector or TV designed for content that is meant to be moving is academic at best. And most of the 2K vs 4K comparisons out there have such bullshit data that it's amazing in its own right.
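
As a rough illustration of that point (numbers are illustrative only), the distance an object travels between frames quickly dwarfs the single-pixel detail that separates one resolution from the next:

```python
# How far does a moving object jump between frames? Take an object
# panning across the full screen width in a fixed time at 60 fps:
def pixels_per_frame(width_px, cross_time_s, fps):
    return width_px / (cross_time_s * fps)

for width, label in [(1920, "1080p"), (3840, "4K")]:
    step = pixels_per_frame(width, 2.0, 60)  # a 2-second pan at 60 fps
    print(f"{label}: object moves {step:.0f} px between frames")
```

At 16 px per frame on 1080p and 32 px on 4K, the per-frame motion is an order of magnitude larger than the extra detail the higher resolution adds, which is why fast-moving scenes wash out the resolution advantage.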
 
I'd argue that is mostly placebo.

The 1 arcminute capability of the eye is well established by scientific studies. The rest from there on is simple geometry and trigonometry. This is objective, not subjective.

There are close parallels to the audiophile world.

You are probably responding more to image settings, backlighting, color, contrast, etc., giving a perceived higher level of fidelity and attributing it to resolution, just like how in audio higher volume tricks our brain into thinking higher quality.

He is "half right" and you are "half right". Our brain has the ability to interpolate data very because our eyes do not take static snapshots. This is the same technique used in modern cameras to give us much higher pixel accuracy and in case density because you can actually figure out what the real answer is when each cone/rod doesn't have a "perfect data".
 
I'm not sure about that. My girlfriend just got a new 4K 55" Samsung NU8000 Ultra HD TV to replace a 43" 1080p set from just five years ago. The difference in the detail one can see is noticeable. Watching a 4K movie or episode is quite different from a 1080p or 720p episode of Elementary on the old TV.

Even at a 10-foot viewing distance and with only a 55" screen, the difference is clear. 4K is worth it for clarity, color, and contrast over 1080p TVs of even five years ago.
To be fair, Elementary was only available on DVD.
 
I'd argue that is mostly placebo.

The 1 arcminute capability of the eye is well established by scientific studies. The rest from there on is simple geometry and trigonometry. This is objective, not subjective.

There are close parallels to the audiophile world.

You are probably responding more to image settings, backlighting, color, contrast, etc. of newer screens, giving a perceived higher level of fidelity and attributing it to resolution, just like how in audio higher volume tricks our brain into thinking higher quality.
LOL - Nah, no placebo here. Been doing this far too long for any placebo in this regard. Nice try though!
You can tell yourself what you are capable of seeing. You cannot tell someone else what they are capable of seeing.

The chart is well known to be BS.
 
LOL - Nah, no placebo here. Been doing this far too long for any placebo in this regard. Nice try though!
You can tell yourself what you are capable of seeing. You cannot tell someone else what they are capable of seeing.

The chart is well known to be BS.

You can't train yourself out of placebo with experience. It is part of the human brain. It is part of all of us. It is pervasive. It manifests itself in everything any of our senses are exposed to throughout our lives. You can never tell the difference between placebo and objective truth, and you can never get rid of it.

The human brain is not rational. We can make rational decisions and choices in life by holding ourselves to external processes and believing them more than our own eyes, but that takes extreme discipline.

Only the ignorant trust their own senses.
 
You can't train yourself out of placebo with experience. It is part of the human brain. It is part of all of us. It is pervasive. It manifests itself in everything any of our senses are exposed to throughout our lives. You can never tell the difference between placebo and objective truth, and you can never get rid of it.

The human brain is not rational. We can make rational decisions and choices in life by holding ourselves to external processes and believing them more than our own eyes, but that takes extreme discipline.

Only the ignorant trust their own senses.
Wow, you're dense. Have you ever heard of A/B testing? I can take a static native 4K image and its exact 1080p counterpart and flip back and forth between them with the arrow keys on my HTPC, and guess what... voila! A/B testing, plain as day. "Look at all that lost detail in the 1080p!" Not rocket science, man, just the facts. The chart is BS.
 
Wow, you're dense. Have you ever heard of A/B testing? I can take a static native 4K image and its exact 1080p counterpart and flip back and forth between them with the arrow keys on my HTPC, and guess what... voila! A/B testing, plain as day. "Look at all that lost detail in the 1080p!" Not rocket science, man, just the facts. The chart is BS.


You don't seem to understand what it takes to do a true blinded A/B study.

Just like with audio, where you need to "volume match" the two sources using sensitive instrumentation, in the case of TVs this has more to do with backlight settings, saturation settings, sharpness settings, etc., to make sure everything is exactly the same except for the resolution. You also cannot do the comparison on the same TV, as the interpolation that makes the 1080p content fit the 4K screen will throw off the results. You need to compare an identically sized 1080p panel and 4K panel with identical settings (which is almost impossible to achieve, even with sensitive specialized equipment).

Then the comparison needs to be done on the exact same content: the exact same scene of the exact same movie. And the content has to be moving (as no one cares if you can tell the difference in a static image).

Beyond this, you can't do the switching yourself, because if you know which is which, your brain will intervene and make the one you expect to look better actually look better to you. If you want to do it right, you can't even have another person who knows which is which switch the samples for you, as they might give it away through subconscious cues.
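
For a sense of what the scoring side of such a protocol could look like (a minimal sketch; the clip presentation itself is assumed to be handled by a separate, blinded operator or script), a two-alternative forced-choice run can be checked against chance with an exact binomial test:

```python
import random
from math import comb

# Randomized, label-hidden presentation order for 20 trials. Hand this
# list to the blinded operator (or script) that plays the clips; the
# viewer only answers which of the two they think was the 4K source.
order = [random.choice(["4K-first", "1080p-first"]) for _ in range(20)]

def binomial_p(correct, trials):
    """One-sided exact binomial p-value against pure guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# E.g. if the viewer identifies the true 4K clip in 16 of 20 trials:
print(f"p = {binomial_p(16, 20):.4f}")  # ~0.006: very unlikely to be chance
```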

Quite frankly, I'll believe scientific studies and calculations over anyone's subjective opinions on pretty much anything, especially when they come from someone who claimed they didn't have to worry about bias because they have lots of experience. In fact, those who claim to be the most experienced and knowledgeable about a thing are usually the ones with the most subconscious bias, much more than those with no knowledge of the subject. I'd argue that you are the one who is dense here, or at least very ignorant about innate human bias and the placebo effect.

Unless someone comes up with an alternative objective scientific study that proves the first one wrong, I'm sticking with this one, and if someone does, then I'll have to compare the two on their merits and methods.

Subjective anecdotal experiences mean absolutely nothing.
 
LOL, ok man. Or you could just look at my 75" screen from 11 ft and realize the chart is BS, along with every other person who knows it's BS. But keep believing something else.
I think posting it on a forum and actually having proof are two different things; that's what they could be getting at.
 
LOL, ok man. Or you could just look at my 75" screen from 11 ft and realize the chart is BS, along with every other person who knows it's BS. But keep believing something else.


I don't doubt that you, and many other home theater/TV enthusiasts, THINK you are seeing a difference.

That's the way the human brain works. Any time you expect something to be better, your brain will subconsciously manipulate your senses so that you experience it that way. This is why many people are miraculously cured by sugar pills.

You can be looking straight at something and seeing something that isn't there. That's the way our primitive monkey brains work, all of us. Combine that with internet forum echo chamber reinforcement where everyone "has seen it with their own eyes" and this becomes even more exacerbated.

We all see bias with our own eyes and hear bias with our own ears; it colors all of our senses. As Obi-Wan said to Luke in A New Hope: "Your eyes can deceive you; don't trust them."

I'll take any results of a well documented study over my own eyes or ears, and so should you.
 
I am a home theater "enthusiast": I can tell the original AC3 from the AAC rip within 5 seconds of listening. But I know that the only way a video comparison can seem to violate the 1 arcminute rule is if the two sources under comparison differ in more than resolution (e.g. compression, sharpness, colors, etc., which is quite common between two different streams), if the displays are different, or if the display is optimized for the higher resolution. I ran into that last one recently while visiting a friend who had bought a brand-new 75" 4K TV: with FHD content his TV was quite fuzzy, not even close to what you get on a comparable FHD set, while with 4K content everything became crystal clear, so you could see the difference even from 15-20 ft away. No idea what was up with that TV.
There have actually been experiments showing that our brains can sometimes combine information from both eyes and extrapolate to discern details finer than 1 arcminute, but those are edge cases; the chart is pretty much on point as long as the only difference is the resolution.
 