Why 8K TV is a non-starter for PC users

erek

What about, say, in ten years? Even if we get consumer graphics hardware that runs 8K at 60 Hz or even 120 Hz and maintains a steady 60+ fps in-game, it will always be 4x more GPU-resource hungry than the same quality at 4K, and thus deliver 4x less detail than if the GPU's rendering time were put to better use on actual rendering quality (ray tracing, lighting, shadow quality, polygon count) rather than squandered on resolution.

8K has a strong future for VR headsets (which is perhaps also the future of the cinema experience itself) and that’s what I’ll be covering in a blog post tomorrow. Stay tuned.

https://www.eoshd.com/2019/11/why-8k-tv-is-a-non-starter-for-pc-users/
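
The 4x figure is just pixel-count arithmetic; a quick Python sanity check:

# 8K UHD has exactly four times the pixels of 4K UHD, so a purely
# fill-rate-bound workload costs roughly 4x the GPU time.
uhd_4k = 3840 * 2160  # 8,294,400 pixels
uhd_8k = 7680 * 4320  # 33,177,600 pixels
print(uhd_8k / uhd_4k)  # 4.0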
 
Meh. I think 4K is as much as we'll ever need. 8K is a total waste of pixels and processing power.

That's like saying 640x480 or 1024x768 (or any arbitrary resolution) is all we'll ever need. The argument is ludicrous. 8K is only a non-starter today because we don't have the GPU power to make use of it, and SLI is effectively worthless at this point, so that can't change anytime soon.
 
That's like saying 640x480 or 1024x768 (or any arbitrary resolution) is all we'll ever need. The argument is ludicrous. 8K is only a non-starter today because we don't have the GPU power to make use of it, and SLI is effectively worthless at this point, so that can't change anytime soon.


Disagree. With previous resolutions there were obvious shortcomings. Pixelation and all that.

With 4K, anything that fully fits in the human field of view is going to outresolve the human eye. 8K is completely pointless: something for marketing people and the audiophile types who love to waste their money on useless stuff :p
 
Among other things, as long as any kind of AA is needed, we don't have enough pixels.

Resolution will never be a substitute for AA.


Higher resolutions do not obviate the need for AA, and even if they did it would be much more computationally efficient to use an AA technique.

Boosting resolution ad nauseam is just foolish and inefficient, wasting resources that could be better used elsewhere (like higher-detail geometry).
 
Disagree. With previous resolutions there were obvious shortcomings. Pixelation and all that.

With 4K, anything that fully fits in the human field of view is going to outresolve the human eye. 8K is completely pointless: something for marketing people and the audiophile types who love to waste their money on useless stuff :p

Except that you're wrong. On a 28" monitor, fair enough. On a 49" TV used as a computer monitor, the pixel pitch isn't all that good. 4K isn't enough.
 
If you're sitting close enough that 4K isn't enough, then I'm guessing it's because the monitor is exceeding your field of view.

Nope. I used a Samsung KS8500 @ about 2.5-3' away from me and it filled my entire field of view nicely. I used the thing for work and I could see the difference in text on it vs. something like my 34" 3440x1440 display. The pixel pitch is way worse on the 49".
 
More resolution is always better, but you need to weigh the pros and cons. 8K for me would be indistinguishable from 4K at some smaller sizes (like 27 inches) but for large-format displays, like a 50-60 inch curved display, 8K would be awesome. I'd just need literally 10 times the graphics horsepower I have now....
 
Yeah, PPI/DPI, which varies greatly with display size, matters more than raw resolution. Our phones currently have 400-700 PPI, several times more than any 4K TV: even at a smallish 55", 4K works out to only about 80 PPI. So even 8K at 55" (~160 PPI) isn't anywhere near the PPI we already have on our phones. You can bet large-format display (TV) resolutions will keep climbing for the foreseeable future. I'd definitely love to see a large display or TV at 400+ PPI, though; I bet it would be amazing at any distance and would be perfect for a monitor.
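
For anyone who wants to check those figures, PPI is just the diagonal pixel count over the diagonal size in inches. A minimal Python sketch (the sizes are the examples from this thread):

import math

def ppi(w_px, h_px, diag_in):
    # Pixels per inch = diagonal resolution / diagonal size.
    return math.hypot(w_px, h_px) / diag_in

print(round(ppi(3840, 2160, 55)))   # ~80  PPI: 4K at 55"
print(round(ppi(7680, 4320, 55)))   # ~160 PPI: 8K at 55"
print(round(ppi(2560, 1440, 5.5)))  # ~534 PPI: a 5.5" 1440p phone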
 
Resolution will never be a substitute for AA.


Higher resolutions do not obviate the need for AA, and even if they did it would be much more computationally efficient to use an AA technique.

Boosting resolution ad nauseam is just foolish and inefficient, wasting resources that could be better used elsewhere (like higher-detail geometry).

That's a lot of wrong in a single post, not to mention the rest of your output in this thread. Higher resolution is the best form of AA ever.
 
That's a lot of wrong in a single post, not to mention the rest of your output in this thread. Higher resolution is the best form of AA ever.

Not exactly. AA basically blurs away artifacts in an image caused by low sample frequencies. Depending on the image, not even 8K is a high enough sample frequency to avoid those artifacts, hence intelligent AA blur algorithms will still be useful.

AA algorithms also have the benefit of not wasting performance. When you render at 8K you're spending a lot of pixels on parts of the image that don't need them.
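
To make that concrete: SSAA boils down to rendering several samples per output pixel and averaging them, flat sky and detailed foliage alike. A minimal NumPy sketch of 2x2 ordered-grid supersampling:

import numpy as np

def ssaa_downsample(img, factor=2):
    # Box-filter an image rendered at factor x the target resolution.
    # img: (H*factor, W*factor, 3) float array.
    h, w, c = img.shape
    return img.reshape(h // factor, factor,
                       w // factor, factor, c).mean(axis=(1, 3))

# Every output pixel costs factor^2 shaded samples, even in flat
# regions a smarter AA algorithm would leave untouched.
hi_res = np.random.rand(1080 * 2, 1920 * 2, 3).astype(np.float32)
lo_res = ssaa_downsample(hi_res)  # shape (1080, 1920, 3)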
 
Among other things, as long as any kind of AA is needed, we don't have enough pixels.

Due to how our eyes work, with tiny vibrations at roughly 120 Hz, we can notice the borders of things (changes in contrast) beyond the level at which we can truly resolve a pixel; this is known as hyperacuity. The thing is, it's honestly a better, more efficient use of resources at that point to use AA instead of more pixels, since only a subset of the pixels in any object will present problems, and at least at the moment we don't have an easy way to increase resolution just around the borders.

Maybe in the future we could get a solution similar to video encoding, able to dynamically change the resolution of the screen beyond what is currently proposed as foveated rendering; then a "dynamic 8K" would be worth it, but for a long time it would be a waste of resources.
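
A toy Python sketch of that idea, in the style of variable-rate shading (the angle thresholds are made up for illustration, not from any shipping implementation):

def shading_rate(angle_deg):
    # Shade coarser the further a pixel sits from the gaze point.
    # Returns the edge length, in pixels, of one shaded block.
    if angle_deg < 5:      # fovea: full resolution
        return 1
    elif angle_deg < 20:   # near periphery: one shade per 2x2 block
        return 2
    else:                  # far periphery: one shade per 4x4 block
        return 4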
 
I would love an 8K panel for work and games.
There has been a Chimei 384-zone 8K 55" VA panel out for years. Someone on here was interested in hacking the scaler etc. to make it 60 Hz...

8K at 55" would be the ultimate stockbroker/video-editing/etc. screen for the next 5 years: the size of your desk, and you can keep everything open as needed.
Want to game? Run it at a lower res in a boxed window if need be, so you don't have to turn your head.

AMD did a study on this. They found 16K x 16K @ 240 Hz to be the limit of human visual acuity, the point where you cannot distinguish VR from reality.
We have a very, very long way to go.

I for one think VR might be the way around this, via lithography-based VR panels; those are already at 1 kHz at very high resolutions... that is where the most interesting advancements are happening.
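
For a sense of scale, here's the raw pixel throughput that target implies (back-of-the-envelope, assuming 16K means 16384 and ignoring color depth and stereo):

target = 16384 * 16384 * 240  # ~64.4 gigapixels/s
today = 3840 * 2160 * 60      # ~0.5 gigapixels/s for 4K60
print(round(target / today))  # ~129x today's 4K60 throughput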

 
If you're sitting close enough that 4K isn't enough, then I'm guessing it's because the monitor is exceeding your field of view.

When the kids are on the PCs and all I have left is the TV I sometimes stand 2’ away. I find 4k to be more than adequate.

Personally I think we're into "diminishing returns" territory. The vast majority of people are likely perfectly happy with 4K. I imagine there's a small niche that would prefer higher, but personally I'd prefer more color depth and higher Hz before 8K.
 
Due to how our eyes work, with tiny vibrations at roughly 120 Hz, we can notice the borders of things (changes in contrast) beyond the level at which we can truly resolve a pixel; this is known as hyperacuity. The thing is, it's honestly a better, more efficient use of resources at that point to use AA instead of more pixels, since only a subset of the pixels in any object will present problems, and at least at the moment we don't have an easy way to increase resolution just around the borders.

Maybe in the future we could get a solution similar to video encoding, able to dynamically change the resolution of the screen beyond what is currently proposed as foveated rendering; then a "dynamic 8K" would be worth it, but for a long time it would be a waste of resources.
The goal is ever-improving picture quality. 8K is wasteful now, but current AA is a compromise and comes with a loss of PQ in some areas. SSAA or a native high pixel count is visually superior, and the goal should be to achieve performance that makes it possible. Everything else is just another form of "good enough for now" cheating.
 
I have a question: is PPI/DPI connected directly to resolution? As mentioned above, I think 4K is nice; it's just the horrible PPI/DPI.
 
I have a question: is PPI/DPI connected directly to resolution? As mentioned above, I think 4K is nice; it's just the horrible PPI/DPI.
PPI is connected to resolution versus screen size.

If you have a 25mm/1"x1" VR panel at 1440p it'll be 1440 PPI... (raw, without optics etc. between your eye and the panel; you can't resolve that pixel density at all, so it's obviously not a usefully high PPI, but it's an example for you...)

On a 27" 1440p screen it'll be about 109 PPI, which for me is a good basis for a screen to be acceptable for long periods of work and crisp when gaming or looking at photos.

Or a 5.5" 1440p screen, e.g. a phone, would be ~534 PPI.

4K at 43" is around 102 PPI and would be the next logical step for me.

Edit: play with this: https://www.sven.de/dpi/
 
I use a 32" 4k for photo editing and really like the size/res combo.
 
The thing is...is it even marketable? Does someone need it, or does someone even want it?

I'm sure a segment of consumers will bite, but at least for me, I'm more interested in content than eye candy. I'd take a game or show with a great plot and dialogue on DVD/1080p over a turd on Blu-ray/4K any day (let alone 8K).
 
I'm sure a segment of consumers will bite, but at least for me, I'm more interested in content than eye candy. I'd take a game or show with a great plot and dialogue on DVD/1080p over a turd on Blu-ray/4K any day (let alone 8K).
That's a false dichotomy. Everything, both the turds and those with great plot and dialogue, will be available in 8K. A lot of movies are already shot at 8K because that's what high-end RED sensors capture at.

https://www.red.com/dsmc2

I'm personally hoping I can hold off on a TV upgrade so my next TV can be 8K; I can immediately tell the difference in resolution, and for the content I consume it'll be wonderful. Can't wait to watch my BBC nature documentaries in 8K.

Though for PC gaming, I'm hoping 4K 144 Hz becomes more feasible.
 
Adding more native resolution... still need AA.

Adding more dynamically scaling resolution... good use of power. But is it better than MSAA? Probably depends on the application.
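
Dynamic resolution is usually just a feedback loop on frame time. A minimal Python sketch, with illustrative numbers rather than any particular engine's tuning:

def update_render_scale(scale, frame_ms, budget_ms=16.7,
                        lo=0.5, hi=1.0, step=0.05):
    # Nudge the internal render scale toward the frame-time budget.
    if frame_ms > budget_ms * 1.05:    # over budget: drop resolution
        scale -= step
    elif frame_ms < budget_ms * 0.85:  # lots of headroom: raise it
        scale += step
    return max(lo, min(hi, scale))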
 
Personally, I doubt most people will even see the difference. Working on display tech is more important than how many pixels it displays. Just going from a normal 4K LCD to a 4K OLED was more of a difference than going from a 1080p LCD to a 4K LCD.


Heck, from a TV perspective most people can't even tell the difference between 1080p and 4k at normal screen sizes and viewing distances :p
 
Needed (or not needed) doesn't matter; marketing will make sure 8K sells once cost comes down.

That being said, bring on the 8K-ready GPUs! My anu... wallet is ready.
 
That's a lot of wrong in a single post, not to mention the rest of your output in this thread. Higher resolution is the best form of AA ever.

That's just plain incorrect.

Aliasing does not go away at higher pixel densities. It is still there but smaller. You could probably go up to 64k resolutions and still see it, based on some of what I have seen on high PPI phone screens.

There might be some theoretical insane resolution where this is no longer the case, but it would be so high as to be completely impractical both today and in any future scenario.

Resolution is just not the solution to anti-aliasing. At the point where resolution solves the aliasing problem, you'll get indistinguishable results for a fraction of the processing power using various AA techniques.

There is simply not now, and never will be, any practical use for PPI above 100 on the desktop.

With a PPI of 100, 4K works out to a ~44" screen, and unless you want to go bigger than that at desktop viewing distances (which seems highly impractical), 4K is the final necessary resolution.

Unless we are talking mobile technology (where you hold the device closer to your face), VR headsets, or some sort of futuristic direct-on-the-eye rendering technology, going above 100 PPI is just a foolish waste of GPU and CPU cycles that could be put to better use elsewhere.
 
Disagree. With previous resolutions there were obvious shortcomings. Pixelation and all that.

With 4K, anything that fully fits in the human field of view is going to outresolve the human eye. 8K is completely pointless: something for marketing people and the audiophile types who love to waste their money on useless stuff :p

Yeah, 640x480 is a strawman comparison.
 
That's just plain incorrect.

Aliasing does not go away at higher pixel densities. It is still there but smaller. You could probably go up to 64k resolutions and still see it, based on some of what I have seen on high PPI phone screens.
In a practical (visible) sense it goes away, and it would do so at a much lower resolution than that. A native 8K screen with minimal post-process AA would look superior to today's 4K with any AA method. I don't see why we should settle for today's lackluster results. The only great method today that doesn't lower PQ is SSAA on 4K screens, and at that point you might as well output at the processing resolution with a small touch-up, as mentioned.
 
What does 8k TV have to do with my pc?

If 8k tv's come out and there are movies etc in 8k also, great.

My PC doesn't get a new display very often at all, and I'm going to want high refresh, 144 Hz or better. That's a ways off unless some new breakthrough display tech is about to go into mass production.
 
That's a false dichotomy. Everything, both the turds and those with great plot and dialogue, will be available in 8K. A lot of movies are already shot at 8K because that's what high-end RED sensors capture at.
I suppose I could have explained my point better. 8K (or even 4K) doesn't have enough of a return, IMO, to push me to invest my money in newer equipment when there is plenty of current content I'm happy with at a lower res. It's a diminishing return... telling me it's over 9000! doesn't get me to buy when it's early tech that's overpriced for what's visually noticeable.
 
I have a Sony 75" 4K TV that upscales, about 10' away. I can't tell the difference between 1080p and 4K now. If you walk up to the TV point blank you can see the pixels. I'd argue resolution needs to increase until you can't tell the difference between what's on the screen and the actual object in real life: no pixels detected at point blank by above-average human eyes.
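
That threshold is easy to estimate: 20/20 vision resolves about one arcminute, so the PPI at which pixels disappear depends only on viewing distance. A rough Python sketch using that textbook figure:

import math

def ppi_at_acuity_limit(distance_in, arcmin=1.0):
    # PPI at which one pixel subtends `arcmin` at `distance_in` inches.
    pixel_in = distance_in * math.tan(math.radians(arcmin / 60))
    return 1 / pixel_in

print(round(ppi_at_acuity_limit(120)))  # ~29 PPI is enough at 10 feet
print(round(ppi_at_acuity_limit(24)))   # ~143 PPI needed at 2 feet

A 75" 4K panel is about 59 PPI, which is why it outresolves your eye at 10' but not point blank.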
 
I told my doctor that 4K content was hurting my eyes. He agreed and warned me that watching any content under 8K could permanently damage my vision.

I don't think we're really going to make progress in this area until people see it as a medical issue and address it as such.
 