Why 8K TV is a non-starter for PC users

Unless you're inches away from the screen, most people wouldn't be able to tell if it's 4K or 8K. You have to get to massive screen sizes to even tell the difference, and then you need actual non-upscaled content to really appreciate it.
 
It's not the resolution that is the problem, it's the equipment and bandwidth needed to push it at acceptable refresh rates and fps. And forget streaming...

More isn't always better.
 
If 8k does become standard, it'll be the last increase in resolution for anything but niche applications like VR.
 
That's just plain incorrect.

Aliasing does not go away at higher pixel densities. It is still there but smaller.

AA algorithms do not make aliasing go away. It's still there but blurrier. Even without 8k monitors, rendering stuff at higher res, then down-sampling the image to match the screen resolution, produces better results than post-process AA algorithms.
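To make that concrete, here's a minimal sketch of the idea (it assumes Pillow is installed; the resolutions and the drawn shape are just illustrative, not anyone's actual renderer): draw at 2x the target resolution, then filter down to the target, and the stair-stepping gets averaged away rather than just blurred after the fact.

```python
# Supersampling sketch: render oversized, then downsample (assumes Pillow).
from PIL import Image, ImageDraw

TARGET = (320, 180)   # the "screen" resolution
SCALE = 2             # render at 2x in each axis, i.e. 4x the pixels

def render(size):
    """Draw a thin near-horizontal line, a classic aliasing victim."""
    img = Image.new("RGB", size, "black")
    draw = ImageDraw.Draw(img)
    draw.line((0, 0, size[0] - 1, size[1] // 3),
              fill="white", width=max(1, size[0] // 160))
    return img

native = render(TARGET)                                        # visible jaggies
supersampled = render((TARGET[0] * SCALE, TARGET[1] * SCALE))  # 640x360 render
downsampled = supersampled.resize(TARGET, Image.LANCZOS)       # edges averaged out

native.save("native.png")
downsampled.save("supersampled.png")
```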
 
Not exactly. AA basically blurs away artifacts in an image caused by low sample frequencies. Depending on the image, not even 8K will be a high enough sample frequency to avoid those artifacts, hence the intelligent AA blur algos will still be useful.

AA algos also have the benefit of not wasting performance. When you render at 8K you’re throwing away a lot of resolution on parts of the image that don’t need it.

See my reply to [Z]
 
I have a Sony 75" 4k TV that upscales and is about 10' away. Can't tell the difference between 1080 and 4k now. If you walk up to the TV point-blank you can see the pixels. I'd argue resolution needs to increase until you can't tell the difference between what's on the screen and the actual object in real life. No pixels detected at point-blank range by above-average human eyes.


No, you've already proven that, for your viewing distance, 1080p is already satisfactory. Are you seriously going to view your giant light bulb of a 75" TV from 5 feet away just to prove a point?

From normal viewing distances, 1080p video/gaming is indistinguishable from 4k on a TV. Even though you view your monitor at about half the distance you view a TV (based on size vs. viewing distance), there is still a finite resolution beyond which you will not notice a difference. That resolution is going to be around 4k for most users.

You may be able to highlight some corner cases, but for 95% of content consumers' use cases, it's not noticeable. To the market, that makes 8k worthless.
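For what it's worth, you can roughly sanity-check these distance claims with the usual one-arcminute acuity figure (a sketch, not a vision-science result; the 75" size is just taken from the post above):

```python
import math

ACUITY_ARCMIN = 1.0  # ~20/20 vision: one pixel per arcminute is the usual cutoff

def pixel_blend_distance_in(diag_in, w_px, h_px):
    """Distance (inches) beyond which individual pixels can't be resolved."""
    ppi = math.hypot(w_px, h_px) / diag_in
    pixel_pitch = 1.0 / ppi                           # inches per pixel
    theta = math.radians(ACUITY_ARCMIN / 60.0)        # acuity as an angle
    return pixel_pitch / math.tan(theta)

for label, res in [("1080p", (1920, 1080)), ("4K", (3840, 2160)), ("8K", (7680, 4320))]:
    d = pixel_blend_distance_in(75, *res)
    print(f'75" {label}: pixels disappear beyond ~{d / 12:.1f} ft')

# Roughly 10 ft for 1080p, 5 ft for 4K, 2.5 ft for 8K -- which lines up with
# "can't tell 1080p from 4K at 10 feet" and "5 feet away just to prove a point".
```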
 
Heck, from a TV perspective most people can't even tell the difference between 1080p and 4k at normal screen sizes and viewing distances :p

Considering how much of the source material is upscaled over-compressed crap, that's understandable. With stuff mastered for it, on my OLED 4k HDR panel, you can instantly spot the difference between 2k & 4k from 30' away.
 
Heck, from a TV perspective most people can't even tell the difference between 1080p and 4k at normal screen sizes and viewing distances :p
That's because most people are watching streaming content typically at 1080p or worse on their 4K TV. I've established many times that the bandwidth currently common in "4K" streaming is barely enough for an acceptable 1080p stream. If you watch cable TV it's an even worse comparison because most "HD" channels still broadcast at 720p "half HD." Then there are those who buy an upscaling DVD player, watch their old DVD movies, and complain there is no difference.

I guess that kind of proves your point, but not for the reason you were probably thinking. Most people can't tell a difference because they are ignorant about their content consumption, simply eating up what is advertised and what the sales person tells them.
 
The content I consume comes from Dish and streaming services. Maybe that's the problem, but I pay $27 a month for all my TV and $3-6 per new movie. Personally, there's no way I'm paying $30 for every movie to get a better 4k picture. Maybe my absolute favorites, but ignorance is bliss. I don't think my Blu-ray player has been turned on in years.
 
Resolution will never be a substitute for AA.


Higher resolutions do not obviate the need for AA, and even if they did it would be much more computationally efficient to use an AA technique.

Boosting resolution ad nauseam is just foolish and inefficient, wasting resources that could be better used elsewhere (like higher-geometry models)
I agree. 1080p is fine for most desktop monitors for gaming. You lose 3x the processing power on making lines less jaggy, when other visual improvements would have a much bigger impact, as others have said.
 
What about, say, in 10 years? Should we get consumer graphics hardware which runs 8K at 60Hz or even 120Hz and maintains a steady 60fps+ in-game, it's always going to be 4X more GPU-resource hungry than the same quality in 4K, thus 4X less detail than would be the case if the GPU rendering time were put to better use on actual rendering quality (ray tracing, lighting, shadow quality, polygon count) rather than being squandered on resolution.

8K has a strong future for VR headsets (which is perhaps also the future of the cinema experience itself) and that’s what I’ll be covering in a blog post tomorrow. Stay tuned.

https://www.eoshd.com/2019/11/why-8k-tv-is-a-non-starter-for-pc-users/
Are these your words or are you just posting what the link says?
 
I normally love Digital Foundry but everything about that video annoyed the fuck out of me. I remember gaming on CRTs, I GREATLY prefer my high res flatscreens and especially my 4k ones.

I do not miss CRTs at all. I remember back in the LAN days, lugging my 50LB+ 1280x1024 18inch 75hz CRT around.
 
I think 4k resolution is enough for even big (40"+) screens on your computer desk. I've used a 55" 4k TV which was sitting very close to me and had no problems with pixels.

VR, however, I'm sceptical that it will look good even with 8k.
Index does a good job with the screen door effect (depending a lot on the game), but the visual quality is miles behind my 4k TV.
 
I normally love Digital Foundry but everything about that video annoyed the fuck out of me. I remember gaming on CRTs, I GREATLY prefer my high res flatscreens and especially my 4k ones.

I do not miss CRTs at all. I remember back in the LAN days, lugging my 50LB+ 1280x1024 18inch 75hz CRT around.

Heh, if you want to read about a technology killed by greed and fuckery that would have been a best-of-both-worlds alternate reality, you should read about SED TV. Imagine if your CRT could be thin and low-power like an LCD, with all the benefits of CRT* picture quality / motion resolution.

Chances are that it would lose against quantum dots but it would have been way better in the meantime :-p

EDIT: * sorry, I initially wrote LCD on autopilot, it's way too early.
 
The goal is ever improving picture quality. 8K is wasteful now, but current AA is a compromise and comes with a loss of PQ in some areas. SSAA or native high pixel count is visually superior and the goal should be to achieve performance that would make it possible. Everything else is just another form of "good enough for now" "cheating" .

I forgot to share this link to provide the basis for my "it is a waste" statement:

https://michaelbach.de/ot/lum-hyperacuity/index.html

You can read it and try the interactive test to see for yourself how hyperacuity is indeed a 5-10x higher level of visual acuity in very specific spots/situations vs. regular acuity. That's why "smart resolution" is my proposed solution, since 5-10x higher resolution across the full screen is a massive increase in required everything.

Until we get to something like that, a mix of AA will always be better.
 
Yeah, PPI/DPI, which differs greatly with display size, is what matters more than resolution. Currently our phones have 400-700 PPI, which is several times more than any 4K TV's PPI; even at a smallish 55", a 4K TV is only about 80 PPI. So even 8K at 55" (160 PPI) isn't anywhere near the PPI we already have on our phones. You can bet large format display (TV) resolutions will keep bumping for the foreseeable future. I would definitely love to see a large display or TV at 400+ PPI though; bet it would be amazing to see at any distance and would be perfect for a monitor.
And that's only about 70% more than a 24" 1200p monitor, which I bought 12 years ago.
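If anyone wants to check those figures, PPI is just the diagonal pixel count divided by the diagonal size (the phone panel below is a typical ~460 PPI spec picked for illustration, not a specific model anyone here mentioned):

```python
import math

def ppi(diag_in, w_px, h_px):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(w_px, h_px) / diag_in

examples = [
    ('6.1" phone (2556x1179)',   6.1, 2556, 1179),
    ('55" 4K TV',                 55, 3840, 2160),
    ('55" 8K TV',                 55, 7680, 4320),
    ('24" 1920x1200 monitor',     24, 1920, 1200),
]
for label, d, w, h in examples:
    print(f"{label}: {ppi(d, w, h):.0f} PPI")
# ~461, ~80, ~160, and ~94 PPI respectively.
```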
 
I think 4k resolution is enough for even big (40"+) screens on your computer desk. I've used a 55" 4k TV which was sitting very close to me and had no problems with pixels.

VR, however, I'm sceptical that it will look good even with 8k.
Index does a good job with the screen door effect (depending a lot on the game), but the visual quality is miles behind my 4k TV.
I can see them on my 65" monitor if I'm within 5', so I'm not so sure 8K is unneeded, but we'll see in a few years.
 
!remindme 10 years
I think some people's attitudes will be a lot different then.
Makes me think of Bill Gates saying 640K is more memory than anyone will ever need.
 
At the current time, 8K is a blip on the distant horizon. Current hardware is, in my opinion, inadequate to support 8K for gaming purposes. Professional graphics artists, office productivity? Sure, for those who feel they need to have it.

For me, 4k or 3440 x 1440 is plenty of resolution for my purposes. 8K would be a want, not a need, and even then I really don't see, at this time, any need I may ever have that would require 8K.
 
At one point I thought 1280x1024 was the highest resolution I'd ever need. Today I won't go back from 4k.
I thought the same, only I used a normal 4:3 1280x960 monitor. Then I got a 1600x1200 monitor for a steal and realized how wrong I was. I say bring on the pixel invasion.
 
omg this is so low res, needs 512k at least

[image]
 
The issue with 8k at the moment has actually been cable, storage, and port technology. Once we get cables that can actually carry 8k over a single link, it'll become more useful as the "next" big thing, just like 480p, 720p, 1080p and 4k.

https://www.digitaltrends.com/home-theater/hdmi-explained-everything-you-need-to-know-news-specs/

The next few rounds of cables are in the technology phase and will likely be available to consumers in a couple of years.

For gaming we really need whatever comes after HDMI 2.1 so we can have 8K @ 120Hz or faster.
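For a rough sense of why the cable is the bottleneck, here's a back-of-the-envelope sketch (active pixels only; blanking intervals and link encoding overhead are ignored, so real requirements are even higher):

```python
# Uncompressed video data rate vs. the HDMI 2.1 link rate (rough figures).
HDMI_2_1_GBPS = 48  # HDMI 2.1 maximum link rate

def raw_gbps(w, h, hz, bits_per_pixel):
    return w * h * hz * bits_per_pixel / 1e9

for label, bpp in [("10-bit RGB/4:4:4", 30), ("10-bit 4:2:0", 15)]:
    need = raw_gbps(7680, 4320, 120, bpp)
    verdict = "fits" if need <= HDMI_2_1_GBPS else "does not fit"
    print(f"8K @ 120Hz, {label}: ~{need:.0f} Gbps -> {verdict} in {HDMI_2_1_GBPS} Gbps")
# Both exceed 48 Gbps, which is why 8K120 today leans on DSC compression
# and why a faster successor link is needed to carry it uncompressed.
```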
 
The REAL question is:

How much higher can we go before it all becomes just a bunch of useless hogwash.... whether in k's of resolution, GHZ of speed, TFlops of bandwidth, etc etc....

We should just go ahead and turn ourselves into machines with infinite capabilities & be done with it...
 
No, you've already proven that, for your viewing distance, 1080p is already satisfactory. Are you seriously going to view your giant light bulb of a 75" TV from 5 feet away just to prove a point?

From normal viewing distances, 1080p video/gaming is indistinguishable from 4k on a TV. Even though you view your monitor at about half the distance you view a TV (based on size vs. viewing distance), there is still a finite resolution beyond which you will not notice a difference. That resolution is going to be around 4k for most users.

You may be able to highlight some corner cases, but for 95% of content consumers' use cases, it's not noticeable. To the market, that makes 8k worthless.

Yeah but:

[image]
 
How much higher can we go before it all becomes just a bunch of useless hogwash.... whether in k's of resolution, GHZ of speed, TFlops of bandwidth, etc etc...
I'm not sure, but I think the race for printer DPI has kinda settled now, so something similar might happen with screen size/res... it's why folks bring up AR/VR, because of course that simplifies things in terms of predicting where it'll end, as it's more of a fixed screen size and known retinal-resolution target.
 
The bigger issue is that we're approaching the cusp of 'resolutionless' display output.

That is, the resolution of the display and the resolution of output rendering are no longer tightly coupled, with pixels smaller than human eyes can distinguish and seamless scaling implemented all around.

While desktop operating systems are pretty behind here, we're already seeing it fairly well implemented in mobile devices and to different extents in consoles, with the seeds on desktops in games with dynamic resolution scaling and variable rate shading.
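As a toy example of that decoupling, here's a sketch of a dynamic-resolution-scaling heuristic (purely illustrative; no particular engine or console works exactly like this): the output stays at native resolution while the internal render target gets nudged to hit a frame-time budget.

```python
TARGET_FRAME_MS = 16.7           # aiming for a steady 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # never drop below half of native per axis

def next_render_scale(scale, last_frame_ms):
    """Shrink the render target when over budget, grow it back when there's headroom."""
    if last_frame_ms > TARGET_FRAME_MS * 1.05:
        scale *= 0.95
    elif last_frame_ms < TARGET_FRAME_MS * 0.85:
        scale *= 1.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# The display always receives a native 3840x2160 image; only the 3D render
# target changes, e.g. a 0.7 scale means rendering at 2688x1512 and upscaling.
scale = 1.0
for frame_ms in (14.0, 19.0, 22.0, 17.5, 15.0):
    scale = next_render_scale(scale, frame_ms)
    print(f"frame {frame_ms:4.1f} ms -> render scale {scale:.2f}")
```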
 
I'm not sure, but I think the race for printer DPI has kinda settled now, so something similar might happen with screen size/res... it's why folks bring up AR/VR, because of course that simplifies things in terms of predicting where it'll end, as it's more of a fixed screen size and known retinal-resolution target.

You pretty much need to define viewing distance for any particular display device, and normalize that to your common extreme cases (a 55" TV used as a desktop monitor, a phone used 6" away from the user's face...) in conjunction with human vision, until you get to the point where increasing resolution no longer adds utility.
 
I normally love Digital Foundry but everything about that video annoyed the fuck out of me. I remember gaming on CRTs, I GREATLY prefer my high res flatscreens and especially my 4k ones.

I do not miss CRTs at all. I remember back in the LAN days, lugging my 50LB+ 1280x1024 18inch 75hz CRT around.

My first monitor (excluding the TV for my VIC20) had a dot pitch of .43. I don't often miss those days but when I do, I use a VT220 type font! :oldman: http://sensi.org/~svo/glasstty/
 
1080p @ 60hz for life!

For most desktop applications, this is actually a decent stopping point. While I've farmed them out to family members, I did have a double-stack of 24" IPS panels that at normal viewing distances had great text sharpness, color, and viewing angles.

I'd have taken higher refresh rates for quality of life if I could, but that's about it, unless they were say 4k with 200% scaling.
 
I remember talking to some camera guys and one of them said the human eyes resolved about 50 megapixels at the sharpest point (or something like that, I can't remember exactly).

4K is about 8 megapixels ... so it looks like we still have a helluva way to go before we hit the "No more improvements" ceiling.
 
I remember talking to some camera guys and one of them said the human eyes resolved about 50 megapixels at the sharpest point (or something like that, I can't remember exactly).

4K is about 8 megapixels ... so it looks like we still have a helluva way to go before we hit the "No more improvements" ceiling.
At what size are we talking about, though? 9600x5400 (50MP in 16:9) at 138" is still 80 PPI. On a 55' movie screen it would be 17 PPI.
 
I remember talking to some camera guys and one of them said the human eyes resolved about 50 megapixels at the sharpest point (or something like that, I can't remember exactly).

4K is about 8 megapixels ... so it looks like we still have a helluva way to go before we hit the "No more improvements" ceiling.

That 'sharpest point' qualifier is going to come with a stack of caveats I think. Being somewhat into photography myself... actually getting your eyes to that point isn't something you do a lot, and it depends significantly on the content itself, as your brain doesn't interpret what individual rods and cones sense the same way that camera pixels do, or even the way photosensitive crystals do with film.

4K is about 8 megapixels ... so it looks like we still have a helluva way to go before we hit the "No more improvements" ceiling.

8K is 32MP, and since that's available in the consumer space now, we already have examples that are 'close'.

Now, to provide a bit more context to the discussion: we should really be talking about pixels per degree, which is a function of both pixel size and viewing distance.
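A quick sketch of that, since it's easy to compute (the panel sizes and distances below are illustrative, not anyone's measured setup):

```python
import math

def pixels_per_degree(diag_in, w_px, h_px, distance_in):
    """How many pixels fit inside one degree of visual angle at a given distance."""
    pitch = diag_in / math.hypot(w_px, h_px)                  # inches per pixel
    deg_per_pixel = math.degrees(2 * math.atan(pitch / (2 * distance_in)))
    return 1.0 / deg_per_pixel

print(pixels_per_degree(27, 3840, 2160, 24))   # 27" 4K at 24":  ~68 PPD
print(pixels_per_degree(55, 3840, 2160, 96))   # 55" 4K at 8 ft: ~134 PPD
print(pixels_per_degree(75, 3840, 2160, 60))   # 75" 4K at 5 ft: ~61 PPD
# ~60 PPD corresponds to the one-arcminute / 20-20 benchmark, so the same
# panel can be well past "retina" or right at the edge depending on distance.
```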
 