Why 8K TV is a non-starter for PC users

Using the 187MP high-res mode on my Panasonic S1R, I am very much looking forward to an 8K desktop monitor. Hopefully the next latest-and-greatest Nvidia card supports HDMI 2.1 so it will be worth it to upgrade. I would probably have one of the 8K QLED Samsungs already if there were a way to drive them at higher frame rates. If I were more into video work and had a Panasonic S1H with its 6K video mode, I would probably have an 8K monitor already.
 
Easily 2 hrs, as no vacation slide show is complete without a narration.
Not if I'm doing it. I'd just say, "this is at <place>" rinse repeat. If someone asks a question, answer it, otherwise move on.
 

Yeah, I don't see how resolution has anything to do with it; staring at a phone screen a foot away from your face might, though. A few years ago I developed a habit of only looking at my phone in the morning and at night through my right eye, with my left eye closed. Not a scientific study by any means, but a few years later my prescription in my right eye is stronger than my left, and my distance vision in my left eye is noticeably better than the right when I'm not wearing glasses.
 
I dunno. I have a 43" 4K monitor and it's awesome but almost overwhelming. I can't see pixels from 2 feet away (the depth of my desk). I don't know what 8k would improve.

OTOH I'm sure I said the same thing with my first VGA monitor... then 1024x768 monitor... then 1080p... then 1440p.... now 4k. I'm sure I will adapt and "need" it if it becomes awesome enough.
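For what it's worth, the classic (and admittedly simplistic) 20/20 rule says a pixel stops being individually resolvable once it subtends less than one arcminute. A quick sketch for a 43" 4K like the one above:

```python
import math

def acuity_distance_in(width_px, height_px, diagonal_in):
    """Distance (inches) at which one pixel subtends 1 arcminute,
    the classic 20/20 threshold. A simplification: aliasing on
    diagonals and fine detail can be visible well past this."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    one_arcmin = math.radians(1.0 / 60.0)
    return (1.0 / ppi) / math.tan(one_arcmin)

print(f'43" 4K: ~{acuity_distance_in(3840, 2160, 43):.0f} in')  # ~34 in
print(f'43" 8K: ~{acuity_distance_in(7680, 4320, 43):.0f} in')  # ~17 in
```

By this rule a 43" 4K only becomes "retina" beyond roughly 34 inches, yet in practice pixels can still go unnoticed at 2 feet; the one-arcminute figure is a rough bound, not a perceptual law.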
 
More resolution is always better, but you need to weigh the pros and cons. 8K for me would be indistinguishable from 4K at some smaller sizes (like 27 inches) but for large-format displays, like a 50-60 inch curved display, 8K would be awesome. I'd just need literally 10 times the graphics horsepower I have now....

Absolutely. I have a 28" 4K monitor, and I can't see myself being able to observe the difference with 8K at that size. Even if we can see it to some extent, I doubt it would justify the price and performance penalty anytime soon. As I said, at 49" from 2-3' away, I think it makes some sense. The pixel pitch (dot pitch, whatever) there is slightly worse than it is on a 30" 2560x1600 monitor. It's certainly worse than my 34" 3440x1440 monitor.
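For the curious, those pixel-pitch comparisons are easy to check with a quick script (panel specs taken from the posts above):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Sizes mentioned in the posts above
for name, spec in {
    '49" 4K':        (3840, 2160, 49),
    '30" 2560x1600': (2560, 1600, 30),
    '34" 3440x1440': (3440, 1440, 34),
}.items():
    print(f"{name}: {ppi(*spec):.0f} PPI")
```

That works out to roughly 90, 101, and 110 PPI respectively, which matches the ordering above: the 49" 4K sits slightly below the 30" 2560x1600 and clearly below the 34" 3440x1440.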
 
Until diagonal lines and strands of hair stop looking pixelated to me in games the resolution isn't high enough. 4k isn't enough.
 
Absolutely. I have a 28" 4K monitor, and I can't see myself being able to observe the difference with 8K at that size. Even if we can see it to some extent, I doubt it would justify the price and performance penalty anytime soon. As I said, at 49" from 2-3' away, I think it makes some sense.

Have a 31.5" 4k panel, and getting close enough to read unscaled text is just a little too close, while any scaling option other than turning it basically into a 1080p monitor looks pretty bad.

This is why I've been interested in the ~40" 4k size, if and when that becomes a thing with strong gaming and color accuracy support.

The pixel pitch (dot pitch, whatever) there is slightly worse than it is on a 30" 2560x1600 monitor. It's certainly worse than my 34" 3440x1440 monitor.

Still have my HP ZR30w; I need to give it a test run and see where it's at. That's honestly my ideal size / pixel pitch for a desktop monitor.
 
Have a 31.5" 4k panel, and getting close enough to read unscaled text is just a little too close, while any scaling option other than turning it basically into a 1080p monitor looks pretty bad.

I've got a 5K, and I have it scaled to 150%. I could live with 125% in most cases, but for this forum I find that's a bit small (for Windows itself, I think it's fine). If this were 5 years ago, I could have easily done 100%, but my eyes ain't what they used to be. I think I'll try sticking with 125% and see how it goes.
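For reference, here's a quick back-of-the-envelope of the workspace a 5K (5120x2880) panel gives at common Windows scaling factors, assuming standard DPI virtualization (effective size = native resolution / scale):

```python
# Effective workspace of a 5K (5120x2880) panel at common Windows
# scaling factors, assuming standard DPI virtualization
# (effective size = native resolution / scale factor).
NATIVE_W, NATIVE_H = 5120, 2880

for scale in (1.00, 1.25, 1.50, 2.00):
    w, h = round(NATIVE_W / scale), round(NATIVE_H / scale)
    print(f"{scale:.0%}: {w} x {h}")
```

So 150% leaves a 3413x1920 effective workspace and 125% a 4096x2304 one; the step between those two is exactly the "a bit small" difference described above.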
 
We shouldn't have to make that compromise, going forward, with better hardware and better optimization (and just better RT integration), but if we do, it'll be nice to have the choice!



This I'll need a reference for, while also pointing out that actually resolving 100MP is extremely difficult. I don't think you'd get even half that out of a family album unless it was shot as pure landscapes or in studio, with the very best equipment available.



8MP on a smartphone really isn't 8MP of information except in the very best conditions. It's a very poor comparison to make, as in any conditions other than the very best, phones are generating 'detail' based on what they actually can capture.



This is my minimum for upping resolution now. 4K isn't really there yet, unfortunately.
There are smartphones that are bad at taking pictures, but as I said, any low-end smartphone does capture 8MP. Good smartphones now capture 20MP with good differentiation between pixels.
I have a Nokia N8 from 2010, and it captures 12MP; I have checked that it delivers every one of the 12MP it claims in its photos, with a little better accuracy than good lenses on my Nikon reflex, which is also 12MP.
A friend of mine who had a Nokia PureView 808 showed me his 36MP pictures from the same test, and the detail was all there, in the middle and at the edges. That's even more than what you will get from an 8K TV.
And I would really like to have something at least half on par with the quality of the slide projector that my parents have used since the '60s, now in 2020. Not too much to ask from today's technology, I think, though too much of it is aimed at an undemanding audience.
 
There are smartphones that are bad at taking pictures, but as I said, any low-end smartphone does capture 8MP. Good smartphones now capture 20MP with good differentiation between pixels.
I have a Nokia N8 from 2010, and it captures 12MP; I have checked that it delivers every one of the 12MP it claims in its photos, with a little better accuracy than good lenses on my Nikon reflex, which is also 12MP.
A friend of mine who had a Nokia PureView 808 showed me his 36MP pictures from the same test, and the detail was all there, in the middle and at the edges. That's even more than what you will get from an 8K TV.

Given proper exposure, focus, and limited dynamic range in the scene in question, sure, a smartphone camera can approach the per-pixel quality of larger sensors. That's mostly a product of not screwing up the optics. But there are also limits here.

A friend of mine who had a Nokia PureView 808 showed me his 36MP pictures from the same test, and the detail was all there, in the middle and at the edges. That's even more than what you will get from an 8K TV.

...and that's beyond the limit. Yes, the phones have software processing to 'fill in' the details; this happens on every digital camera regardless in the demosaicing stage of processing. But the challenge is that much of that detail isn't real, no different than on the very best camera phones today.

And I would really like to have something at least half on par with the quality of the slide projector that my parents have used since the '60s, now in 2020. Not too much to ask from today's technology, I think, though too much of it is aimed at an undemanding audience.

Unless your parents were top-notch photographers, which they may be / have been, you're lucky to get 1MP of real detail from slides. Especially with the lenses of that time, you run into real optical limitations that cap resolution quite quickly.

Even today, if you want to resolve detail, the photographic process -- especially for landscapes with average or greater depth and dynamic range! -- is pretty involved. Exposure bracketing and focus stacking are the name of the game, as small pixels need wider apertures and as far as digital sensors have come in terms of noise performance, we're still an order of magnitude or more away from capturing what our own eyes can see, let alone what's available in nature.


Now, try and do all of this in video for an 8K production :)
 
I'm pretty happy with my 43" 4K/120Hz Asus monitor. I don't demand HDR and OLED. I haven't tested OLED, but I don't really like HDR. However, going from 60Hz to 120Hz was noticeable in games that are not as demanding. I can't wait to upgrade to next-gen cards and see most games hit 90-120Hz in 4K.
 


When talking of future VR, please remember that eye-tracking and foveated rendering will be part of the equation: there's no need to render 100% quality across the whole screen. Outside of something like 10 degrees, the quality can be 480p-like, and even at a lower frame rate.
 
For all you futurists, I would like to remind you: for every new "8K, can't fight progress, old man" bit of fortune-telling, there are a hundred 1950s "kitchens of the future." That's the thing about... the next big thing. You can't predict it. I wonder how many Google Stadias are already sitting in a closet somewhere...
 
Given proper exposure, focus, and limited dynamic range of the scene in question, sure, a smartphone camera can approach the per-pixel quality of larger sensors. That's mostly a product of not screwing up the optics. But there's also limits here.
On the Nokia smartphones I mentioned, real Zeiss optics were used, not the kind that is merely Zeiss or Leica "approved," which is what you get on recent Nokia smartphones (which aren't really Nokia anymore). Also note that Nokia made a 40MP Windows Phone (the Lumia 1020) a year later, with a smaller, optimized sensor and OIS but without the dedicated photography chip, so that phone is much slower at taking photos; there you could say you have all the tweaks and enhancements made for recent smartphones, which look better but are false and insufficient on dynamic range.
...and that's beyond the limit. Yes, the phones have software processing to 'fill in' the details; this happens on every digital camera regardless in the demosaicing stage of processing. But the challenge is that much of that detail isn't real, no different than on the very best camera phones today.
Again, that's not true of the models I mentioned. There's no smartphone on par with those today. I'm expecting maybe Sony to go that way, but I'm unsure, since they don't want to mix in tech from their cameras.
Unless your parents were top-notch photographers, which they may be / have been, you're lucky to get 1MP of real detail from slides. Especially with the lenses of that time, you run into real optical limitations that cap resolution quite quickly.
This is a joke! My flatbed scanner made for slides, with all the anti-dust features and SilverFast software, at my office, is far from able to scan all the detail in those slides, and it should be able to scan more than 10 megapixels of detail according to optical tests. In fact, I've run some of the slides through a Nikon slide scanner that can handle more than 20MP, and that's still not enough, depending on the kind of film. Shame that Nikon, Minolta (not Sony), and Canon don't make those slide scanners anymore. There are still the professional ones made by Leica, but at an unconventional price, and the best of those can reach 100MP as far as I remember (and I'm not even sure that was for 24x36 rather than 6x6).
Stay away from the $100 slide scanners sold everywhere now. Maybe that's where you get your 1MP. :rolleyes:
My parents had a bunch of Canon SLRs and made vacation photos plus architectural photos. I also shot plenty as a kid. 25 ASA films were capturing the equivalent of 100MP, no doubt about that. When I project the slides on a 2x3m screen, I can go very close and look (from the side, to avoid my shadow), and I can make out a whole bunch of detail. This is so much better than digital: you could print a letter-size page out of a crop of that 2x3m picture and it would look sharp.
I also tested the old Canon FD 85mm f/1.2 that my father bought at the end of the '70s on a Sony full-frame mirrorless with an FD adapter, and it delivers all the detail on the 40MP sensor. You need to capture raw and use software to correct the pixel-level chromatic aberrations. Use good-quality prime lenses and you can get your 100MP of detail on film (since there is no such sensor today). Lenses of that time were great, carefully hand-made. And since you can use them with an adapter, they have kept their value.
Even today, if you want to resolve detail, the photographic process -- especially for landscapes with average or greater depth and dynamic range! -- is pretty involved. Exposure bracketing and focus stacking are the name of the game, as small pixels need wider apertures and as far as digital sensors have come in terms of noise performance, we're still an order of magnitude or more away from capturing what our own eyes can see, let alone what's available in nature.
Now, try and do all of this in video for an 8K production :)
This is just what I mentioned previously in the thread. 8K is nice for static images; for action movies, your eye is very much okay with Full HD. Right now there is no way to share your sharp static images with people, including those taken by your smartphone (and good camera phones should show all the pixels they advertise). That is why 8K is very much needed.
 
On the Nokia smartphones I mentioned, real Zeiss optics were used, not the kind that is merely Zeiss or Leica "approved," which is what you get on recent Nokia smartphones (which aren't really Nokia anymore).

"Real Zeiss" lenses are made in Germany by Zeiss themselves, but these days that's almost entirely limited to their cinema lenses that cost more per unit than I want to invest in a car. Any Zeiss lens you see now is either made in Japan or China by someone else under some form of license. Sony is a pretty big partner of Zeiss, but all of their 'Sony Zeiss' lenses are designed and produced by Sony themselves -- and not all of those are even good, let alone class-leading as the Zeiss name would seem to imply.

Anything on a phone is likely produced in China. That doesn't make it bad, and generally speaking the lenses on smartphones are surprisingly good; however, the limitations of the format with respect to optical science, starting with diffraction, absolutely apply.

Also note that Nokia made a 40MP Windows Phone (the Lumia 1020) a year later, with a smaller, optimized sensor and OIS but without the dedicated photography chip, so that phone is much slower at taking photos; there you could say you have all the tweaks and enhancements made for recent smartphones, which look better but are false and insufficient on dynamic range.

Really the biggest issue in photographic processing is the introduction of 'false detail' at so many stages, including on the sensor itself during capture. This is why there's still a decent market for medium format backs that color their captures very little and use higher bit depths (16bpp as opposed to 14bpp or even 12bpp for 24x36 cameras) to capture 'true' detail. Phone cameras are running in the exact opposite direction, using software as much as possible to fill in the 'gaps' to provide output that is most pleasing, not most correct.
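For scale, the difference between those bit depths compounds quickly, since each extra bit doubles the number of tonal levels per channel:

```python
# Tonal levels per channel at the bit depths mentioned above.
# Each extra bit doubles the number of distinguishable levels.
for bits in (12, 14, 16):
    print(f"{bits}-bit: {2 ** bits:,} levels per channel")
```

16bpp gives four times the levels of 14bpp and sixteen times those of 12bpp, which is part of why the medium-format backs hold up better in smooth gradients and shadow recovery.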

This is a joke! My flatbed scanner made for slides, with all the anti-dust features and SilverFast software, at my office, is far from able to scan all the detail in those slides, and it should be able to scan more than 10 megapixels of detail according to optical tests. In fact, I've run some of the slides through a Nikon slide scanner that can handle more than 20MP, and that's still not enough, depending on the kind of film. Shame that Nikon, Minolta (not Sony), and Canon don't make those slide scanners anymore. There are still the professional ones made by Leica, but at an unconventional price, and the best of those can reach 100MP as far as I remember (and I'm not even sure that was for 24x36 rather than 6x6).

Well, the primary utility of these scanners was derived from high-quality film, and that's all but completely gone outside of token demand from the hipster crowd (of all ages).

You want to do high-res film scans these days, you grab the highest resolution camera you can afford -- could be 50MP with Canon or 60MP with Sony for 24x36, 100MP with Fuji at 44x36 or an MF back at 54x40 -- and a flat-field macro lens and appropriate lighting adapter and go to town.

My parents had a bunch of Canon SLRs and made vacation photos plus architectural photos. I also shot plenty as a kid. 25 ASA films were capturing the equivalent of 100MP.

This... is a no. The film crystals don't support that level of detail, and again, you still have to deal with the limitations of the lenses, the subjects, and the photographers themselves. Granted that many of the planar-style lenses (50mm lenses ranging from about f/1.4 to f/2.8) were capable of about 50MP when stopped to f/5.6, you still had to get the rest of the capture chain lined up to get it on film, and film tops out at about 8MP of useful subject information on average; the best cinema captures on various systems from Super35 on up are good for 4k conversions (see Alien and Aliens!), but that's about the limit.

Use good-quality prime lenses and you can get your 100MP of detail on film (since there is no such sensor today).

There are actually quite a few -- Fuji's GFX100, with its (err, Sony's) 44mm x 33mm sensor mounted on an IBIS system, and using the lenses they've released for it, has no problem with 100MP stills, supposing the composition and shooting technique are present. Larger MF backs are also available, with even more dynamic range and lens flexibility.

But those aren't shooting 100MP video ;).

Lenses of that time were great, carefully hand-made. And since you can use them with an adapter, they have kept their value.

Variation was hell -- it still is, if you ask Sony shooters paying the Sony tax and still getting assed-up glass. They've gotten better of course, but Canon is really leading here -- their cheapo 50/1.8 available at any Best Buy has less variation than Sony is seeing across their range these days.

If you got a very good copy of a 'vintage' lens and it managed to avoid calamity over the years, yes, they're still good today and even provide images with a 'look' that can stand out. And some designs when stopped down a bit had crazy resolving power, shot appropriately, but in general, there are very real limitations present in older glass that still haunt lens design and production today.

I'd say that 8k / 32MP isn't a stretch for a good chunk of older glass, again shot appropriately and on a sensor with at least that resolution, but that's likely about it except for designs that shot sharply at wider apertures due to diffraction.
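For a rough sanity check on that figure, here's a back-of-the-envelope estimate using the common rule of thumb that equates useful pixel pitch with the Airy-disk radius; the 550nm wavelength and the rule itself are simplifying assumptions, not a hard optical cutoff:

```python
# Back-of-the-envelope diffraction estimate for a 24x36mm frame.
# Rule of thumb (an approximation, not a hard cutoff): useful pixel
# pitch ~ Airy disk radius = 1.22 * wavelength * f-number.
WAVELENGTH_MM = 550e-6  # green light, 550 nm

def diffraction_limited_mp(f_number, width_mm=36.0, height_mm=24.0):
    pitch_mm = 1.22 * WAVELENGTH_MM * f_number
    return (width_mm / pitch_mm) * (height_mm / pitch_mm) / 1e6

for n in (2.8, 5.6, 8, 11, 16):
    print(f"f/{n}: ~{diffraction_limited_mp(n):.0f} MP")
```

By that crude measure a 24x36 frame is good for roughly 61 MP at f/5.6 and about 30 MP at f/8, so the ~32MP / 8K ceiling suggested above is plausible for glass that needs stopping down.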

This is just what I mentioned previously in the thread. 8K is nice for static images; for action movies, your eye is very much okay with Full HD. Right now there is no way to share your sharp static images with people, including those taken by your smartphone (and good camera phones should show all the pixels they advertise). That is why 8K is very much needed.

The challenge when applying the concepts used to classify current video technology to human vision is simply that human eyes don't work that way. Increase the resolution and whether or not your average human would perceive it depends on quite a few factors, not the least of which are what the human in question has been trained to see and the specific content used to test; the reality is that we're pretty far from hitting the limits of human perception with current technology.

I'll agree with you wholeheartedly about the need to keep pushing resolution, and with current technology that's at least possible :).
 

When talking of future VR, please remember that eye-tracking and foveated rendering will be part of the equation: there's no need to render 100% quality across the whole screen. Outside of something like 10 degrees, the quality can be 480p-like, and even at a lower frame rate.
Agreed, but foveated rendering is more a GPU technology than a display technology. The display still needs to be 4k, 8k, 16k, whatever. The pixels don't just get up and move to where you're looking :p .

Although bandwidth could potentially be reduced.
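As a back-of-the-envelope illustration of that bandwidth point, here's a sketch with made-up but plausible numbers; the FOV, fovea size, and peripheral scale are all assumptions for illustration, not specs of any headset:

```python
import math

# Back-of-the-envelope foveated-rendering savings. All numbers here
# are illustrative assumptions, not specs of any headset: a square
# 110-degree FOV, a 10-degree full-res fovea, and a periphery
# rendered at 1/4 linear resolution (1/16 the pixels).
FOV_DEG = 110.0
FOVEA_DIAMETER_DEG = 10.0
PERIPHERY_LINEAR_SCALE = 0.25

total_area = FOV_DEG ** 2
fovea_area = math.pi * (FOVEA_DIAMETER_DEG / 2) ** 2
fovea_frac = fovea_area / total_area

# Fraction of full-resolution pixel work actually performed
work_frac = fovea_frac + (1 - fovea_frac) * PERIPHERY_LINEAR_SCALE ** 2

print(f"Fovea covers {fovea_frac:.1%} of the view")
print(f"Rendered pixel work: {work_frac:.1%} of full resolution")
```

Under those assumptions, less than 1% of the view is rendered at full quality and total pixel work drops to about 7%, which is why the GPU load and link bandwidth can shrink dramatically even though the panel itself still needs all of its physical pixels.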
 
Unless your parents were top-notch photographers, which they may be / have been, you're lucky to get 1MP of real detail from slides. Especially with the lenses of that time, you run into real optical limitations that cap resolution quite quickly.
Definitely get more than 1MP from the slides I've scanned. They were mostly Kodachrome from the '50s and '60s. But they did require work after the fact (as has pretty much every negative I've ever scanned). No doubt, though, that we've long since surpassed film with DSLRs.
 
I have no doubt 8K will be a thing in the near future, but it's a good example of one technology outpacing its complementary counterparts.
 
I have no doubt 8K will be a thing in the near future, but it's a good example of one technology outpacing its complementary counterparts.
Isn't that almost always the case with TVs and monitors?

Going with TVs, HDTVs existed several years before there was any content (other than broadcasts of nature videos that I only saw in AV stores). 1080p existed years before Blu-ray. 4K existed well before there were UHD discs, and who knows if we'll get 4K broadcasts. If it were me, I'd want to make sure my 4K setup could easily switch to 8K (or, alternatively, that TVs could downsample 8K to 4K).

For gaming, 4K monitors were around for years before playing in 4K was practical, given the cost of GPUs. I'm a bit behind on it, so I'm not sure what it'd cost for a GPU that could do 4K with all the bells and whistles; my guess is it's very pricey.

I think it's pretty much always display then content.
 
Isn't that almost always the case with TVs and monitors?

Going with TVs, HDTVs existed several years before there was any content (other than broadcasts of nature videos that I only saw in AV stores). 1080p existed years before Blu-ray. 4K existed well before there were UHD discs, and who knows if we'll get 4K broadcasts. If it were me, I'd want to make sure my 4K setup could easily switch to 8K (or, alternatively, that TVs could downsample 8K to 4K).

For gaming, 4K monitors were around for years before playing in 4K was practical, given the cost of GPUs. I'm a bit behind on it, so I'm not sure what it'd cost for a GPU that could do 4K with all the bells and whistles; my guess is it's very pricey.

I think it's pretty much always display then content.

Ya, somewhat. Blu-ray came at a weird time when streaming was getting started and Sony was giving the finger to customers on Blu-ray disc tech to keep prices artificially inflated. Since most consumers were not tech-savvy enough to know the difference between 4K and 1080p, it was essentially a shoulder-shrug thing.

We really have not even conquered scaling for HiDPI monitors yet, so 8K is mostly just going to make things worse initially. At least IMO.
 
We really have not even conquered scaling for HiDPI monitors yet, so 8K is mostly just going to make things worse initially. At least IMO.

It's going to take real innovation from OS vendors on the desktop. So very many targets to account for. HDR has the same problem.

But essentially, we'll get to the point where the OS is informed of the capabilities of the display and processes output accordingly, where the applications don't know by default and for the most part don't care, with it all calibrated to the environment around the display and the user in question.

Someday.
 
Ya, somewhat. Blu-ray came at a weird time when streaming was getting started and Sony was giving the finger to customers on Blu-ray disc tech to keep prices artificially inflated. Since most consumers were not tech-savvy enough to know the difference between 4K and 1080p, it was essentially a shoulder-shrug thing.

We really have not even conquered scaling for HiDPI monitors yet, so 8K is mostly just going to make things worse initially. At least IMO.

I kinda disagree. Just like with 4K now, DVD prices were initially high (by current standards, but not when compared to VHS). The only reason DVDs seemed cheap compared to Blu-ray prices was the insane discounts you could get with coupons from virtually every etailer (and god, there were so many DVD sellers with coupons back then).

Now, 4K can be pricey compared to the older stuff, but again, people forget that DVDs were selling (without coupons) for 25-30 bucks in stores... though again, etailers regularly sold them at a loss.
I look at 4K today and prices generally start at 30 bucks, but a few months later I see those discs for much, MUCH less. FFS, you can buy virtually all of the recent Disney and Marvel releases in a steelbook for 15-20 bucks at Best Buy.

As I recall, Blu-ray prices came down fairly quickly. I know by 2009 I was getting great deals on new releases. Yes, they were 20 or so bucks, but adjusted for inflation that was super cheap (and unadjusted, it wasn't much more than DVDs were without a coupon in '99 or 2000).
 