Why 8K TV is a non-starter for PC users

You pretty much need to define viewing distance for any particular display device, normalize that against the common extreme cases (a 55" TV used as a desktop monitor, a phone held 6" from the user's face...), and weigh it against the limits of human vision. At some point the utility of increasing resolution simply stops advancing.
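One way to put numbers on that is pixels per degree of visual angle; 20/20 acuity works out to roughly 60 ppd (one arcminute per pixel). A rough sketch, using the 55"-TV-as-monitor case with an assumed 30" viewing distance:

```python
import math

def pixels_per_degree(h_res, screen_width_in, distance_in):
    """Horizontal pixels per degree of visual angle at a given viewing distance."""
    fov_deg = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return h_res / fov_deg

# 55" 16:9 panel: width = diagonal * 16 / sqrt(16^2 + 9^2) ~= 47.9"
width = 55 * 16 / math.hypot(16, 9)

# Used as a desktop monitor at ~30" viewing distance:
print(round(pixels_per_degree(3840, width, 30)))  # 4K: ~50 ppd, under the ~60 ppd acuity limit
print(round(pixels_per_degree(7680, width, 30)))  # 8K: ~99 ppd, past it
```

So at desk distance a 55" 4K panel is still resolvable by a 20/20 eye while 8K overshoots; push the same panel back to couch distance and even 4K overshoots.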
Do you think screen walls or tables will ever catch on? If they get cheap enough it could be a shift away from the "wobbly screen on a desk" setup?
 
Do you think screen walls or tables will ever catch on? If they get cheap enough it could be a shift away from the "wobbly screen on a desk" setup?
It's inevitable eventually.
I've seen some conceptual stuff of entire kitchen surfaces that were glass screens where you could just drag UI windows from your refrigerator to your kitchen counter to your sink.
Pretty cool stuff.
 
No, but for desktop PC users the potential for a screen to fall over in an office space, for example, is a safety risk, so a more secure wall- or table-like screen may be better and give more working surface. I realize most of us here probably have more secure mounting; I've just seen more than a few screens flop over, usually the cheap ones businesses go for.
 
No, but for desktop PC users the potential for a screen to fall over in an office space, for example, is a safety risk, so a more secure wall- or table-like screen may be better and give more working surface. I realize most of us here probably have more secure mounting; I've just seen more than a few screens flop over, usually the cheap ones businesses go for.
Never had a screen fall over, or even remotely be at risk of falling over. Do you live on a slanted hill where the builders forgot to level the home?
 
If your screen is wobbly while it's sitting on your desk then you're doing something wrong.

like buying cheap monitors. The biggest complaint I have about the various cheap non-VESA-mount monitors I've run into at work is that the single-stick stand type always has major wobble at the attachment point.
 
like buying cheap monitors. The biggest complaint I have about the various cheap non-VESA-mount monitors I've run into at work is that the single-stick stand type always has major wobble at the attachment point.

I just buy cheap monitors with VESA mounts. US$99/ea a few years back for a pair of Acer 24" 1080p IPS monitors with VESA mounts that I put on a solid vertical stack stand. They did come with dinky stands themselves, but once mounted were perfect, and actually pretty great monitors. My mother has one now, on an arm, and my wife has the other, which is still on the top position of the stand so that it sits above her laptop.
 
Do you live on a slanted hill where the builders forgot to level the home?
No, I'm more talking about companies that buy the cheap crap by the truckload, employees that don't give a shit, liabilities, and such...
 
Yeah ... that’s not right. I don’t think in the history of owning monitors that they’ve ever been wobbly on my desk.

Depends on how much you shake it, or how flimsy your desk is. My desk is fairly stable and my monitor is an Acer Predator with the large adjustable stand, 27", 2560x1440. It still wobbles if I bump into the desk. If I quickly press the delete key it wobbles ever so slightly, but in general, for normal use, it isn't much of an issue.
 
That 'sharpest point' qualifier is going to come with a stack of caveats I think.
Yup, we don't even focus colours the same... green (555 nm) is king for response with daylight-adjusted eyes and is typically the sharpest. Red and blue get harder to focus on the further they sit from the peak of the sensitivity range.
730 nm red is visible... as is 380 nm near-UV (to me).

Do you think screen walls or tables will ever catch on? If they get cheap enough it could be a shift away from the "wobbly screen on a desk" setup?
https://news.samsung.com/global/sam...the-worlds-first-modular-microled-146-inch-tv
Modular 146" MicroLED. I'm routinely around ones that are bigger than most cinema screens (real pro stuff, though)... Dot pitch has been decreasing over the last 8 years; they're getting pretty impressive.
Too bad they are often driven with shitty old content at painfully low res, 1280px etc...
 
Yup, we don't even focus colours the same... green (555 nm) is king for response with daylight-adjusted eyes and is typically the sharpest. Red and blue get harder to focus on the further they sit from the peak of the sensitivity range.
I can endorse this.
My Emotiva amplifier has a blue LED light around the power button when it's on, and the older I get, the more I see a slightly off-centre triangle-to-hexagon of blue circles (plus the original blue circle), even while I'm strongly focused on the power button.
Very strange.
 
1080p @ 60hz for life!
27" 1440p 144Hz for FPS gaming; the monitor is about 14" from my face, so good enough. For me, until 4K 144Hz monitors drop considerably in price I'll be sticking with what I've got. To be honest, at this viewing distance any bigger screen would mean I have to move my head too much to see everything, and I'm too lazy lol
I do have a 4K 65" Sony A9F, which is currently the love of my (movie viewing) life, but I haven't had a chance yet to hook up my PC; kids keep getting in the way!
 
I'm still super confused. As was said, we have cell phones with high PPI/DPI that look good even though the resolution is low. Why can't we get larger screens with higher PPI/DPI? It just seems like there are too many diminishing returns once we hit 4K. 4K is nice, but compared to 1440p with a high PPI/DPI, could you honestly make out the extra detail?
 
If you want to show your digital images like you used to show your slides, YOU NEED 8K.
8K screens show ~33-Mpixel pictures, which are now quite standard. There was already a smartphone, made by Nokia in 2012, that delivers more than 40 Mpixels.
Good slide films would resolve an equivalent of more than 100 Mpixels, like Kodachrome 25. The best Leica slide scanners would only get 96 Mpixels out of a slide, though.
So 8K is not even up to the task of replacing those old slides. We need 16K screens for that, and much better cameras.
You only see 8 Mpixels on a 4K TV, which is less than any cheap smartphone of today can do.
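The pixel counts in question are straightforward arithmetic:

```python
# Megapixel counts for common display resolutions (width x height / 1e6)
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")
# 1080p: 2.1 MP, 4K: 8.3 MP, 8K: 33.2 MP
```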
 
More pixels can be beneficial even on smaller screens. Higher resolutions should allow for better motion resolution, something LCDs suck at in general.
 
If you want to show your digital images like you used to show your slides, YOU NEED 8K.
8K screens show ~33-Mpixel pictures, which are now quite standard. There was already a smartphone, made by Nokia in 2012, that delivers more than 40 Mpixels.
Good slide films would resolve an equivalent of more than 100 Mpixels, like Kodachrome 25. The best Leica slide scanners would only get 96 Mpixels out of a slide, though.
So 8K is not even up to the task of replacing those old slides. We need 16K screens for that, and much better cameras.
You only see 8 Mpixels on a 4K TV, which is less than any cheap smartphone of today can do.

Have you ever had to sit through a 2 hr slide show of shitty vacation pictures? That takes a lot of whiskey.

If I ever walk into a friend's house and they proceed to show a slideshow on a screen (any screen), I'll lock myself in the bathroom with a bong for the duration.
 
If we had the GPU horsepower to push 8K @ 120/144, I think I'd rather use that same GPU horsepower to push 1440p @ 60+ with full ray tracing...in modern games (not 20 year old game remakes). But that's just me.
 
If you want to show your digital images like you used to show your slides, YOU NEED 8K.
8K screens show ~33-Mpixel pictures, which are now quite standard. There was already a smartphone, made by Nokia in 2012, that delivers more than 40 Mpixels.
Good slide films would resolve an equivalent of more than 100 Mpixels, like Kodachrome 25. The best Leica slide scanners would only get 96 Mpixels out of a slide, though.
So 8K is not even up to the task of replacing those old slides. We need 16K screens for that, and much better cameras.
You only see 8 Mpixels on a 4K TV, which is less than any cheap smartphone of today can do.

You get it.
I'm all in for imaging and all in for video. I'll gladly dip my toes into a large 8K as the next 'stop' when it has individually addressable HDR (OLED/etc) and at least 120Hz for local driving (games/etc). 4K is also not a bad compromise when the aforementioned is met.
I can't wait for higher-resolution film standards plus lossless compression for the expected streaming method to get there; the best and latest visually lossless 4K/120 is well under 3 Gbps, which still requires a serious CDN + WAN, or a pro application on a network. I can't tell you the exact number, but that data rate can consequently do 8K30, which is a good start for movies. I think this is much more viable as a downloadable, storable format, since doing it in real time isn't realistic yet.
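The 4K/120 → 8K30 equivalence holds because the raw pixel rate is identical:

```python
# 4K @ 120 Hz and 8K @ 30 Hz push the same number of pixels per second,
# so a link or codec budget sized for one can in principle carry the other.
px_4k120 = 3840 * 2160 * 120
px_8k30 = 7680 * 4320 * 30
print(px_4k120 == px_8k30)            # True
print(f"{px_4k120 / 1e6:.0f} Mpx/s")  # ~995 Mpx/s either way
```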
 
If we had the GPU horsepower to push 8K @ 120/144, I think I'd rather use that same GPU horsepower to push 1440p @ 60+ with full ray tracing...in modern games (not 20 year old game remakes). But that's just me.

We shouldn't have to make that compromise, going forward, with better hardware and better optimization (and just better RT integration), but if we do, it'll be nice to have the choice!

Good slide films would resolve an equivalent of more than 100 Mpixels, like Kodachrome 25.

This I'll need a reference for, while also pointing out that actually resolving 100MP is extremely difficult. I don't think you'd get even half that out of a family album unless it was shot as pure landscapes or in studio, with the very best equipment available.

You only see 8 Mpixels on a 4K TV, which is less than any cheap smartphone of today can do.

8MP on a smartphone really isn't 8MP of information except in the very best conditions. It's a very poor comparison to make, as in any conditions other than the very best, phones are generating 'detail' based on what they actually can capture.

when it has individually addressable HDR (OLED/etc) and at least 120Hz for local driving (games/etc).

This is my minimum for upping resolution now. 4K isn't really there yet, unfortunately.
 
This is my minimum for upping resolution now. 4K isn't really there yet, unfortunately.

I'm hoping next year's TVs will deliver us to the hallowed ground of better-than-CRT in every way... or at least close enough for those requirements to be met. The PC monitor market can currently die in a fire; TVs are going to make it obsolete if they diversify their sizes to take that market... only a few companies are in both markets.
Next-gen consoles and VRR/latency-related marketing are a real help for our cause.
 
I told my doctor that 4K content was hurting my eyes. He agreed and warned me that watching any content under 8K could permanently damage my vision.

I don't think we're really going to make progress in this area until people see it as a medical issue and address it as such.
Are you serious or joking? Is that a real thing?
 
Have you ever had to sit through a 2 hr slide show of shitty vacation pictures? That takes a lot of whiskey.

If I ever walk into a friend's house and they proceed to show a slideshow on a screen (any screen), I'll lock myself in the bathroom with a bong for the duration.
People who take shitty vacation pictures are probably just posting them on Facebook. But maybe even they have a few pictures that are worth looking at, and they show them on their TV... it definitely wouldn't take 2 hours. If I pushed it and decided to show you 100 images, it wouldn't take 2 hours. At most it'd take about 15 minutes.

I personally like looking at them on a big screen to decide if I want to print, and if so how big, but my 5K TV can't come close to displaying a 36 MP image.
 
People who take shitty vacation pictures are probably just posting them on Facebook. But maybe even they have a few pictures that are worth looking at, and they show them on their TV... it definitely wouldn't take 2 hours. If I pushed it and decided to show you 100 images, it wouldn't take 2 hours. At most it'd take about 15 minutes.

I personally like looking at them on a big screen to decide if I want to print, and if so how big, but my 5K TV can't come close to displaying a 36 MP image.

Easily 2 hrs, as no vacation slide show is complete without a narration.
 
Will an 8K video game look more lifelike than Days of Thunder or Top Gun at 1080p? Do pixels matter? Or is there another way to make games look more lifelike?
 
That's like saying 640x480 or 1024x768 (or any arbitrary resolution) is all we'll ever need. The argument is ludicrous. 8K is only a non-starter today because we don't have the GPU power to make use of it, and SLI is effectively worthless at this point, so that can't change anytime soon.

Except there is a limit to what the average human can actually see. The human eye is still the human eye. There is a limit to the maximum resolution at which a human (not our cats) can see the difference. Although our analog eyes are technically capable of seeing very fine detail up close, the farther away things are, the less detail we can resolve. The truth is no one can see the difference between 4K and 8K at a reasonable TV viewing distance and at reasonable screen sizes. (In other words, few people are going to be gaming on 80+ inch screens... and no one is sitting a couple of inches from their 80+ inch 8K monitor so they can make out a bit more detail in their game.)

I think it's pretty realistic to say that for games... we don't need 8K. All that processing power will be much better utilized making things like ray tracing a reality, along with other GPU-intensive tricks that give us even more realistic simulations of reality.

8K makes sense for motion pictures... because they get projected on massive IMAX-size screens. The idea that anyone at home with an 80" TV is going to be able to tell the difference between 4K and 8K is laughable. Given the field of view and detail resolution of a human with perfect 20/20 vision, 4K is only marginally better than 1080p... I'm not saying it's not better, but the diminishing returns are real. A great number of people with less than 20/20 vision can't tell the difference between 1080p and 4K. Between 4K and 8K... it's just not possible, again unless you're talking about a massive screen where the eye is drawn to specific smaller parts of the image at a time. Or, as some have said, perhaps in VR, where the screens are right in front of your eyes. (I have read that perfect human eyes could probably tell the difference in VR up to around "20K", but I have no idea how accurate that is.)
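A quick back-of-the-envelope check on the 80" claim, assuming square pixels and the usual one-arcminute limit for 20/20 vision:

```python
import math

def acuity_distance_in(diag_in, h_res, aspect=(16, 9)):
    """Distance (inches) beyond which a 20/20 eye (~1 arcminute) can no
    longer resolve individual pixels. Rough sketch: square pixels assumed."""
    width = diag_in * aspect[0] / math.hypot(*aspect)
    pitch = width / h_res
    return pitch / math.tan(math.radians(1 / 60))

print(round(acuity_distance_in(80, 3840)))  # 4K at 80": ~62" (~5 ft)
print(round(acuity_distance_in(80, 7680)))  # 8K at 80": ~31" (~2.5 ft)
```

Past roughly five feet, even the 4K pixel grid on an 80" set falls below the acuity limit, so 8K's extra detail only matters if you sit closer than most living rooms allow.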
 
On a 49" screen on your desk as a monitor, you probably can tell the difference between 4K and 8K. I can see the pixel pitch of 4K and it's not fantastic.
 
On a 49" screen on your desk as a monitor, you probably can tell the difference between 4K and 8K. I can see the pixel pitch of 4K and it's not fantastic.

Is that 4K, though, or the quality of the 4K screen you're looking at? I take your point, though: at 50 inches+ you may start seeing the difference for very fine text. To be fair, I'm not sure how practical 50+ inch screens are as monitors. Of course you're not going to get the crisp 384 ppi you can get on a smaller 4K screen... still, even a large 4K should easily give you 192 ppi, the newer Windows standard.

Of course, if we are really talking about the pixel pitch of 4K monitors... the question becomes, IMO, what are you using the screen for? Sure, perhaps you can see the difference in very fine print between a 4K and an 8K screen at or over 50 inches. But can you really tell the difference while playing any type of game? I am sure, as with audiophile stuff, there will be those who claim so... but I have a feeling that if you put them in an A/B pick-the-8K-screen test they would score right around chance. Still, we may all be able to pick the 8K screen out if you fill the screen with 10pt font. At that point you are limiting your field of vision and focusing, and sure, the math says, as I pointed out with the research on VR... at very, very close distances with our focus locked, it would probably be around the 20K-ish point where we would stop seeing any difference at all. (Although diminishing returns, and the simple fact that very few people have razor-sharp 20/20 vision, mean the majority would probably stop seeing any difference well before that.)

And you know, as I type that out, who knows... perhaps 8K is the sweet spot. In 10 years driving 8K may seem trivial, and at that point the diminishing returns probably do make going any higher, resolution-wise, pointless. 8K per eye in VR would be an effective 16K worth of pixels and darn close to the theoretical limit of human vision.
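For reference, the pixel density of a 16:9 panel at the sizes being discussed (a rough sketch, square pixels assumed):

```python
import math

def ppi(diag_in, h_res, aspect=(16, 9)):
    """Pixels per inch of a flat panel, square pixels assumed."""
    width = diag_in * aspect[0] / math.hypot(*aspect)
    return h_res / width

print(round(ppi(27, 3840)))  # 27" 4K: ~163 ppi
print(round(ppi(49, 3840)))  # 49" 4K: ~90 ppi
print(round(ppi(49, 7680)))  # 49" 8K: ~180 ppi
```

Which is why a 49" panel needs 8K just to beat the density a 27" panel already gets from 4K.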
 
On a 49" screen on your desk as a monitor, you probably can tell the difference between 4K and 8K. I can see the pixel pitch of 4K and it's not fantastic.

Hmm, I think that Chad missed the qualifiers in your post (*49" + desk*). That's a massive diagonal viewing angle for a desk monitor; a 27" one would make everything much better, I bet.

Honestly, I have no idea how people use screens that massive as a desk solution. I guess 8K will be a blessing, letting you get an effect equivalent to a wall of individual monitors, like in a security master booth.
 
On a 49" screen on your desk as a monitor, you probably can tell the difference between 4K and 8K. I can see the pixel pitch of 4K and it's not fantastic.
I think that as resolution increases, it becomes less relevant. Color reproduction/accuracy, contrast, HDR, etc are becoming more important than just resolution.
 
Yes, it is. This is a definite medical issue that is underreported. Hopefully the industry bans 4k as soon as 8k can be commonplace in our living rooms.
Have any kind of medical research to back that up? No offense, but sounds like another "wifi and cell radiation causes headaches and even cancer" type of deal to me.
 