Why OLED for PC use?



As far as we know this would only stop ASBL from kicking in during HDR scenes whose APL doesn't change much. I don't think it would fix the ASBL problem in desktop use, like when you have spreadsheets open and no moving content.
 
If I were to buy a screen right now, it would most likely be a 77" LG G3 for the media/home theater room. The 83" G3 doesn't have MLA this year.

. .

Sounds really nice. Premium price though. Even the 55" is ~ $2500 at release I think.

Ignoring price and size considerations (on either end, larger or smaller, depending on your room/usage scenario), the micro lens array sounds like the only big reason to jump to a G3, or even to upgrade from previous gens of OLED in the near timeframe imo. I wouldn't bother going to a C3 (C, not G3) from a CX, C1, or C2 myself.

https://www.whathifi.com/reviews/lg-g3-oled-tv-c3-oled-tv

OLED Dynamic Tone Mapping Pro allows the G3 and C3 to break the image up into 20,000 blocks, or zones. By contrast, the previous Alpha 9 Gen 5 processor found in last year's C2 model could only manage 5000 blocks.

Resolution increases sound good to me wherever they happen . . display rez/PPD, FALD lighting resolution, fps+Hz motion rez and image clarity resolved, and apparently tone-mapping grid resolution too according to that, which I wasn't aware of.

These new processing features pale into insignificance when compared to the hardware upgrade that LG is bringing with the G3 (but not the C3). Called Micro Lens Array, it's basically a layer of tiny lenses that help improve the focus of the light that is being emitted by the OLEDs, essentially meaning that more of the light that's generated actually makes it to your eyes, apparently resulting in a big increase in brightness over last year's already-very-bright G2. Like the G2, the G3 also boasts a brightness-boosting heatsink that its C-series sibling has to do without.

. .


[image: LG Display OLED MLA diagram]



. .

https://tftcentral.co.uk/articles/a...technology-and-micro-lens-array-mla-explained


While the animated marketing image simulating the increased brightness almost looks like the light output is scattering, the lenses supposedly keep the output denser and less scattered, focusing and amplifying the light.

As an example from one of their new panels, LG Display's third-generation 77" 4K OLED TV panel based on META technology has a total of 42.4 billion micro lenses, approximately 5,117 micro lenses per pixel, which work to emit even the light that would otherwise be lost to internal reflections, producing the clearest and most detailed pictures. Keep in mind this number of MLA lenses will vary depending on panel size and pixel density.

. .

I'd actually be happy with a boxy housing and more aggressive active cooling if it gave higher performance. I'm also not that concerned with power usage, except that better energy efficiency means less heat, which matters once brightness hits levels that trigger ABL.

Keep adding layers of tech to the front or the back (QD, Meta/MicroLensArray, Heatsinks, etc.) as long as it improves performance without major side effects. Extra toppings may cost extra. (y)



"
 
As far as we know this would only fix the ASBL from kicking in during HDR scenes with APL that doesn't change much, I don't think it would fix the ASBL problem on desktop use like say you have spreadsheets open or something and no moving content.

That's more applicable to the original question of this thread for sure. That would still be nice for media and gaming though. It famously darkens scenes in a few popular shows where it shouldn't, which is annoying. If you skip back a step in the player/service and then resume, it will be bright again, but having to do that to kickstart the brightness, even in very rare scenes . . sucks.
 

It's even worse when using it on the desktop, which is why I disabled it in the service menu within a few days; I couldn't stand it anymore.
 
Using 60 PPD 4k as a reference point for what I consider a minimum here (optimally, not always possible obviously):

. . . . . . . . . . . . . . . . . . . . . .

98" 4k screen at ~ 69" away has the same PPD and viewing angle and looks the same as

80" 4k screen at ~ 56" away

77" 4k screen at ~ 54" away (60PPD, 64deg viewing angle) ~ > 4.5'

65" 4k screen at ~ 45" away

55" 4k screen at ~ 39" away ~ > 3ft 3"

48" 4k screen at ~ 34" away

42" 4k screen at ~ 29" away

31.5" 4k screen at ~ 22" away

27" 4k screen at ~ 19" away

. . . . . . . . . . . . . . . . . . . . . .
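For anyone who wants to check other size/distance combos, here's a rough sketch of the math behind the list above (the usual approximation: PPD = horizontal pixels divided by the horizontal viewing angle; the function and the example pairs are mine, just for illustration):

```python
import math

def ppd(diag_in, dist_in, px_w=3840, px_h=2160):
    """Pixels per degree and horizontal viewing angle for a flat 16:9 screen."""
    width = diag_in * px_w / math.hypot(px_w, px_h)           # physical width from the diagonal
    fov = 2 * math.degrees(math.atan(width / (2 * dist_in)))  # horizontal viewing angle, degrees
    return px_w / fov, fov

for size, dist in [(77, 54), (55, 39), (42, 29), (27, 19)]:
    density, fov = ppd(size, dist)
    print(f'{size}" 4k at {dist}" away: ~{round(density)} PPD, ~{round(fov)} deg')
```

Every pair in the list lands at roughly 60 PPD and a ~64 degree viewing angle, which is the point.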


Just something to consider with very large screens if you value the tighter pixel grid look most people think of when they think of a 4k screen. That said, people used 720p and 1080p TVs for years, so it could depend on how much you value picture quality pixel-density-wise.

There will probably be more 8k TVs starting to hit the market in 2024-25 too, even though they stalled out on those for 2023.

Most people with a 70"+ screen are probably not setting it up closer than 5' away but you might be surprised.

I very much see 8k as a useless gimmick.

If you get close enough to a 4k screen that it fills your field of view, I very much doubt blockiness is an issue.

I have vision corrected to 20/20+ and sit ~18-24" (depending on seating position) from my 42" 4k screen and have no issues at all with pixel density. I mean, sure aliasing is present in rendered titles, but that's kind of a red herring, as aliasing will never be solved by higher resolution. You need some sort of smoothing or anti-aliasing effect to eliminate that.

I used to sit at the same distance from a curved 48" 4k screen, and that was admittedly a little much.

The only way I can see 4k having insufficient pixel density is in corner cases where people sit close enough that they focus on only a small portion of the screen. This may be a real use case, but for movies and games it probably isn't.
 
Those obsessed over pixel density would almost certainly cringe at me being ~1m away from a 50" display.

There's nothing wrong with it, honestly. I would be on a 43" monitor but every single 43" LCD panel is flawed so I went up instead of down. Pixel density doesn't cross my mind when either gaming or working.

Not everything has to achieve the pitch of my phone. It's frankly a waste of resources to do so, especially at 40"+.
 

It's just me but I truly do not understand the obsession with super high PPI displays. I have a 12.9" iPad Pro, so yes, I do know what a good high-PPI display looks like. Does it look super crisp and clear? Absolutely. Would I say it is a definite must-have for my desktop monitor? Not even close; it just feels like quite a waste of resources indeed.
 
It's mostly an issue with MacOS. MacOS's scaling system is complete garbage where integer scaling is really the only thing that does not cause extra blur. On Windows 4K @ 150% ("looks like 2560x1440") looks nice and sharp, on MacOS it's a little bit more blurry. I can easily test this out just by swapping inputs between my Mac and PC. MacOS instead seems to just naively do "target resolution * 2" so 150% scaling renders at 5120x2880 which the display then downscales to native res.

To combat the blur, I just ended up turning the "sharpness" feature on my display up a couple of notches and that makes it close enough to how things look on my Macbook Pro's 16" display without introducing any noticeable artifacts.

On top of that I recently found that my 4738.99 euro Macbook Pro 16" M2 Max can't fuckin' do 4K 144 Hz @ 150% scaling with HDR. The HDR toggle just disappears. It does work at 60 Hz, it does work if I use integer scaling (200% or 1080p) or native resolution. So the issue is entirely that MacOS scaling is somehow tied to DP/HDMI 2.1 bandwidth limitations, which is insane because it should always just output 4K to the display. Meanwhile on the same display my 13600K's integrated GPU has no trouble with 4K 144 Hz at any scaling level in Windows, with HDR.
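That theory at least lines up with back-of-envelope bandwidth math: if macOS were pushing its 2x backing store (5120x2880) out the port rather than the already-scaled 4K image, 144 Hz 10-bit would blow past HDMI 2.1's 48 Gbps FRL ceiling even before blanking overhead, while plain 4K fits (comfortably so with DSC). A rough sketch, raw pixel data only:

```python
# Raw (uncompressed, no blanking overhead) video data rate in Gbps.
def raw_gbps(w, h, hz, bits_per_channel=10, channels=3):
    return w * h * hz * bits_per_channel * channels / 1e9

print(f"3840x2160 @ 144 Hz 10-bit: {raw_gbps(3840, 2160, 144):.1f} Gbps")  # ~35.8 - fits 48 Gbps
print(f"5120x2880 @ 144 Hz 10-bit: {raw_gbps(5120, 2880, 144):.1f} Gbps")  # ~63.7 - does not
```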

So all this bullshit leads to Apple's 5K and 6K displays being the best option for Mac users - they just look nicer, while being overpriced and underperforming. My current Mac has, apart from resolution, a superior HDR display to the 6K XDR display with more dimming zones while keeping similar brightness capabilities - over 1000 nits sustained, 1600 peak brightness.

Of course, the larger your display, the more you want resolution if viewing distance needs to be reasonably close. On the LG CX 48" I always felt it would be a lot better if it was 6K or 8K. But in its intended purpose - as a living room media/gaming display it looks just fine at 4K because my sofa is further away.
 
Going to be honest, I don't know how either Windows or Mac behave at anything other than 100% scaling. I had an old PowerBook G4 (remember those?) with a 1024x768 display (remember that? ;)) and I switched my desktop away from Windows to Linux in 2007.

I have, though not particularly intentionally, always stuck with displays that don't need scaling.

And yeah, I've also inevitably used phones and tablets with high PPI displays over the years and they look great. I don't see the immediate need to waste GPU power on that for my large desktop displays though.
 

. . .

If you get close enough to a 4k screen that it fills your field of view, I very much doubt blockiness is an issue.

I agree with that sentiment for 4k, but only as it fills your field of view just so - where the methods we use to compensate for how blocky the pixels really are can still compensate fully*.
When any 4k screen of any size is far enough away that it -just fills- your 60 to 50 degree human viewing angle, it gets 64 to 77 PPD, which is nice. When you sit closer than that, spilling the screen outside that 60 to 50 degree viewing angle, no. While you can scale the text at the cost of desktop real-estate when sitting closer, text scaling does nothing for the pixel size in games or in 2D desktop graphics and imagery. Viewing a 42" 4k screen at ~ 24" is like 1500p desktop-monitor pixel sizes, and if the text is then scaled accordingly it's probably back to 1500p-like desktop real-estate too.

While 4k at a 60 to 50 degree viewing angle (~ 64 to 77 PPD) looks decent for gaming and text, that's because we are masking how bad the PPD really is wherever we can:

Text subpixel rendering and graphics anti-aliasing to smudge the edges.

*2D desktop graphics and imagery typically get no such masking.

*Non-standard subpixel layout issues, DLSS and frame amplification edge artifacts, etc. are all minimized, literally, by even higher PPD .. smaller perceived pixels. Larger perceived pixels, larger problems (like the effect of non-standard subpixel layouts on text, which is a frequent complaint when people cram a larger gaming TV directly onto a desk).

. . .

I agree with you though: 4k at a 60 to 50 degree viewing angle, where the screen fits into your human FoV ("fills your field of view" as you put it), looks quite good once the masking methods are applied - rather than overfilling your field of view, where the screen surface gets pushed outside your human viewing angle and the pixels look larger. 64 to 77 PPD (down to 60) is pretty nice for sure.

...

That's not to say that 1600p, 1500p, 1440p, or even 1080p screens at desktop distances - or larger screens at similar perceived pixel sizes relative to view distance - are not usable. They just won't give the fine pixel density most people expect of a 4k screen, which a 4k screen only delivers at the optimal viewing angle for the viewing distance. So they will fringe more on contrasted edges, since the methods we use to mask how bad the pixel sizes actually are can no longer compensate fully. (It will also exacerbate other artifacts: non-standard subpixel layouts, DLSS/frame amplification edge artifacts, etc.)

===============================

Personally I'd still love an 8k 55" 1000R screen someday as a wall of screen real-estate outside of games. On a 1000R screen the 1000mm radius, or focal point of the curve, is around 40"; using that as your view distance, a 4k panel would "only" be around 61 to 62 PPD. That's not too bad as a minimum for RGB, but especially with non-standard pixel structures and the uncompensated 2D desktop graphics and imagery, a higher PPD would be better. For comparison, I prefer 70 PPD or more on a 4k OLED if I can get it (a 55 deg viewing angle, which is a 35" view distance on a 42" 4k, or ~ 40" on a 48" 4k).

With an 8k 55" screen you'd get something like 48" wide x 27" tall. That would be like having a quad of four 28" 4k screens. I'd love that kind of thing for desktop/app + media use outside of gaming, and with games I could put whatever size resolution 1:1 (or AI upscaled) within it that I wanted. Anything from full-screen 4k upscaled to 8k via DLSS quality AI upscaling (and perhaps frame amplification tech as it matures) - DLSS benefits from higher starting resolutions as a foundation to work from and has its own AA - or running a 32:10 or 21:10 3840x-based resolution for some games AI upscaled across the whole screen width (to 7680x2400 / 7680x3200), or even running a 1:1 pixel 4k, 5k, or 6k screen space, letterboxed or tiled/windowed. It would be like having both a big 8k screen useful for some things full screen (with AI upscaled 4k content) and a wall of resolution and high PPD to be tiled or letterboxed however I wanted, instead of using a monitor array with bezels.

I don't see the immediate need to waste GPU power on that for my large desktop displays though.
You're not wrong, but neither were the people using 1080p screens when they said that about 1440p ones, or 1440p screens when they said that about 4k ones. Hell, probably 800x600 and 1024x768 too :LOL:

AI upscaling from a decently high starting resolution is pretty good already and will probably get even better. GPUs are only going to get more powerful, and frame amplification tech has a lot of room to mature going forward too. We've only just landed on good 4k performance more or less, but I'm always looking to the horizon.

 
InnoCN 32M2V is the way to go. I replaced my LG CX with this as my primary display and for the price it's as good as it gets. It went as low as $799 on Amazon, which matches the historical low of the LG 42 C2. For $799 you will not find a better 32" monitor out there, period. It isn't OLED so there's no fear of burn-in, and it's 4K at 32 inches with RGB subpixels so text clarity is AMAZING. Build quality actually feels rather excellent, but of course there will always be haters out there who don't even own the monitor but just wanna rag on the fact that it's a Chinese brand, so it must be built poorly and cheaply by default, lol, as if displays from Samsung, Acer, etc. don't have any build quality problems themselves. :rolleyes:
Isn't that a 60hz display?
 
Heh, no doubt. Resolution increases for me have been about more screen real estate though, not an attempt to get a pixel density equivalent to a phone/tablet.

I've found a legitimate usefulness increase as screens have gotten bigger and I can fit more work (or game!) on the screen at once.

Higher PPD than what I have doesn't increase usefulness for me, personally.
 
For me the equivalent of dual 2560x1440 screens (or 5120x1440) without scaling is basically as much desktop space as I really need for anything I do.

But my complaint with that setup was that it could be sharper for text, which is why I'm looking forward to the Samsung 57" 7680x2160 or dual 4K screen. I'll just end up scaling it to the equivalent of 5120x1440.

I read enough text that it started to bug me that reading on my 12.9" iPad Pro or 16" Mac was nicer than doing it on my desktop systems.
 

I find that I can never have enough desktop screen real estate. I work with massive spreadsheets, and combining data from many different windows/sources. There is no level at which I'd say no to more desktop space.

I lose so much focus and efficiency when switching between windows, that I want ALL windows open on my desktop and visible at the same time so I don't have to keep trying to find that window that I just had open.

On the flipside, I am perfectly happy with traditional ~100ppi desktop pixel density.

My computer is not a phone. I don't hold the screen right up to my face, so I really don't need higher pixel density.
 
PPI is nice, but not the most important thing IMO, and the scaling can be an issue.

I was completely planning on going 4K @ 32" for my new upgrade since my computer can now generally handle it in games, but after not being happy with the 32" options I tried, I'm actually fine with 2560 x 1440, especially at 27". I had to use scaling at 4K even on the bigger monitor, whereas I can leave it at 100% here. It's still enough real estate for me, though more was definitely one nice thing the 4K res offered. Maybe next monitor, something that fits me well will become available. Until then, as long as burn-in doesn't become an issue, this OLED has been perfect for my needs so far.
 
Wow... and here I've been since 2014 using 4k at 28" most of the time (having also tried 24, 27, and 32 inch screens), always thinking 8k (or anything higher than 4k) would be great! More ppi = sharper image, scaling works fine, and if I want more desktop room I could always buy a larger 8k screen then if desired.

I'd love 8k at 32" ideally, as I sit about 24" from my screen on my desk. I'm surprised the thought of that is eliciting yawns here.

My 24" 4k was by far the best due to sharpness, but the physical size was just too small for me :(.
 

Perhaps calling 8K useless or a gimmick is overly harsh; sure, there are definitely benefits to higher resolution, but is it really worth it? Just how clear do you want text to be? I get it, sharp crisp text looks nice, but pushing over 32 million pixels for that alone just seems totally not worth it IMO. 8K is probably an even less noticeable upgrade in gaming/movies. There are simply too many diminishing returns past a certain point. Oh, and let's not forget: the smaller the pixels, the harder it is to push more brightness for HDR... so wanting better HDR but also smaller pixels sounds kinda counterintuitive. I'll take the bigger pixels with better HDR performance over the opposite.
 
Just how clear do you want text to be?
4k is fine for text at 28" for me... It's the detail in games I want, especially distance detail. I also like the higher ppi for video and photo editing, as well as sculpting and hand painting textures (I'm an indie game dev by trade).

Regarding the rest, I've yet to try HDR, so I can't really comment on that. I'll have to soon enough though so I can master content in it ;). Going from 1680x1050 to 2560x1600 to high refresh 1440p to 4k have all been upgrades for me.

I want to finally go high-refresh HDR 4k at some point soon, but OLED fringing and the lack of smaller 4k panels have me hesitant. I may just get an OLED for mastering and gaming, then an M28U or similar for normal use and coding. I hate that there's no all-in-one display...
 
IMO for 32", 6K is more than enough and 8K won't improve the situation much. 8K would be more useful in the 42-55" sizes. It's entirely pointless on something like a 70+ inch TV when the viewing distance for that needs to be so high that you won't be able to tell 4K from 8K anyway.

8K would have better integer scaling options for gaming though.
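The integer-scaling point is easy to verify: 7680x4320 divides cleanly by 2, 3, and 4, so the common game resolutions map to exact pixel blocks with no interpolation blur. A quick sketch:

```python
# 8K (7680x4320) integer-scales cleanly to several common game resolutions.
for k in (2, 3, 4):
    print(f"{7680 // k}x{4320 // k} rendered as exact {k}x{k} pixel blocks")
# -> 3840x2160 (4K), 2560x1440 (1440p), 1920x1080 (1080p)
```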
 
Yea an 8k 50" would be awesome.
 

Just how clear do you want text to be?

Ultimately I'd want the pixels so small that we didn't have to rely on edge-smudging hacks anymore to hide how bad the pixel structure really looks, but that's asking too much currently. The 2D desktop's graphics and imagery typically get no edge-smudging compensation either. People doing graphics/photo/art work always want the highest PPD they can get because they are working in 2D imagery. The higher the PPD, the less staircase fringing on contrasted edges, the less non-standard subpixel layouts affect how things look to your eyes, the less obvious DLSS/frame amplification edge artifacts will be, etc. Larger perceived pixel sizes, larger problems.

But for now I was talking about something like a 55" 1000R 8k dream screen of mine, which would "only" be around 122 PPD. That would look sweet, but it's nowhere near overkill PPD-wise imo. You can test this by taking a current 4k screen and stepping back from it far enough; that gives an idea of the pixel sizes, though obviously not the viewing angles and screen real-estate. You'd probably scale the default text size up a bit to get back to around "magazine article size" to your eyes, which would give the edges more resolution/pixels to round with - a much finer piece of graph paper to color in squares on to make your fonts. You'd probably have to redo your text subpixel rendering customization too, since the edges would be different.

42" 4k screen at 65" view distance = 55" 8k screen at 40" view distance = ~ 122 PPD
48" 4k screen at 74" view distance = 55" 8k screen at 40" view distance = ~ 122 PPD
55" 4k screen at 85" view distance = 55" 8k screen at 40" view distance = ~ 122 PPD


My previous laptop was a 15.6" 4k screen. At 18" view to 24" view it was 90 to 121 PPD depending on what I was doing with it. Looked great.



the smaller the pixels, the harder it's going to be to push more brightness for HDR...so anyone who wants better HDR but also wants smaller pixels

The QD LED FALD LCD Samsung 8k screens can do 2000+ nits right now, but both they and the 4k versions that do 2000+ nits suffer from aggressive ABL due to heat, even though they use LEDs and not OLEDs. If they used boxier housings and heatsinks with active cooling they'd probably sustain it better. Just mentioning this because there is already a 2000+ nit 8k FALD screen on the market.

Micro OLED screens in upcoming gens of VR headsets can supposedly go extremely bright too, and they are tiny. In VR their PPD is low, however, since the screen is right near your eyeballs, even at 8k per eye/lens. Just saying there can be very tiny pixels, even on some types of OLED tech, that can go extremely bright (even after subtracting the effect of the VR screens looking brighter from being so close to your eyes).
. .
 

And if it was 4K instead of 8K it would be brighter or require less power draw, the latter of which seems to be a problem for future TVs with energy regulations. You simply aren't going to push the same brightness on 8K as you would on a 4K screen without more work.
 
I find it silly that I can run a hot tub or a pool filter (I own neither of those, just saying) . . multiple high-powered GPU rigs with inefficient and overclocked GPUs between desktop and laptop, surround systems and amplifiers, etc. I can buy multiple screens for a multi-monitor array instead of one, but they don't want me to be able to buy a single screen with double the power use. I don't live in Europe, but it may affect the manufacturers' offerings.
 
Yeah it's dumb but it is what it is. And again, the point is: for any example you show of an 8K OLED doing high brightness, a 4K version with the same specs would just be even brighter. Sure, a 1000hp car that weighs 5000lbs is pretty fast, but a 1000hp car that weighs half that would be even faster (given that traction isn't a problem). Take a 55" 8K OLED that can do 1000 nits, and a 4K one would just be brighter as long as it's not artificially restricted for burn-in prevention. And I would happily take the brighter 4K over the sharper 8K any day of the week. 8K also goes against the pursuit of higher frame rates, being much harder to drive than 4K - another reason why I don't want it. Sure, we could just upscale 4K to 8K for the performance savings, but then you've pretty much bought an 8K screen only to play at 4K in the end, and I'm sure it would barely look better than native 4K with DLAA. Of all the things to want from future displays, 8K is at the very bottom of my list. Instead, gimme better HDR and more Hz. 4K 1000Hz with that reprojection stuff Blur Busters keeps talking about would probably look amazing in games, and on the desktop you would always be seeing that 1000fps.
 
They've gone after a number of appliances already: vacuum cleaners, ovens, microwaves, etc. Overall it's just one approach to reducing energy usage - if manufacturers are given free rein then they might never prioritize power efficiency on stuff like this.

It seems that TV manufacturers are kind of getting around the regulations by simply having more eco modes that are on by default but that the user can disable.
 
Yea an 8k 50" would be awesome.
I'd be curious to see what this looks like. I'd almost certainly scale everything to 200% for the 4k equivalent real estate but the sharpness would be a sight to see. It is legitimately the resources that are the concern though. If I'm content with 4k at 50", is it worth pushing four times the number of pixels for the sake of sharpness? That, and the sharpness itself would only work for vectors - raster elements would be softly scaled (unless nearest neighbour scaling can be forced) and I'm not a fan of that.

I reserve final judgement until I see it in action for myself. ;)
 

I'm sure it's going to look great. But again, is it worth it just for the sake of "muh sharpness"? If I had 2 options for my next display - one that uses DP 2.1 to keep 4K resolution but push refresh rates to 500Hz, while the other opts for 8K 144Hz instead - and the 8K display suffers either lower HDR brightness or the same HDR brightness at much higher heat and power consumption, I know which display I would go for. There will be those who go for the other option of course, just giving my 2 cents.
 
Ya it would look nice but until we have GPUs 4 times as fast as a 4090 I'm not going near 8k lol.
 

It would look just like a quad of 28" 4k screens with no bezels, sitting 35" to 40" away (a 40" view distance, at the radius of a 1000R curve, would make your eyes equidistant from all points on the screen).

AI upscaling looks good, and better the higher the base rez you start from. If you start from 4k, 8k will look great full screen. Any edge artifacts from AI upscaling and/or frame amplification would be that much tinier at ~ 122 PPD as well. 4090s and the next few GPU gens after should run 4k well enough frame-rate-wise to be upscaled. You could also run other resolutions on it 1:1 - full-width ultrawides and letterboxed/tiled/windowed 4k, 5k, 6k, etc. - sitting closer when desired. Best of both worlds, and a ton of relatively high-PPD real estate.


"With a 8k 55" screen you'd get something like 49" wide x 28" tall. That would be like having a quad of four 28" 4k screens. I'd love that kind of thing for desktop/app + media use outside of gaming, and with games I could put whatever size resolution 1:1 (or AI upscaled) within that I wanted. Anything from full screen 4k upscaled via DLSS quality AI upscaling to 8k (and perhaps frame amplification tech as it matures) - DLSS which benefits more from higher starting resolutions as a foundation to work from and has it's own AA .. or running a 32:10 or 21:10 3840x based resolution for some games AI upscaled (to 7680x 2400 / 7680x3200 across the whole screen width) or even running a 1:1 pixel 4k screen space, or 5k, or 6k, letterboxed or tiled/windowed. It would be like having both a big 8k screen useful for some things full screen (with AI upscaled 4k content) + a wall of resolution and high PPD to be tiled or letterboxed however I wanted to instead of using a monitor array with bezels."
 
Oh yeah I'm in complete agreement, as things stand I think the downsides of the idea outweigh the positives and I'm perfectly content with what I've got.

AI upscaling looks good and better the higher the base rez you start from.
For games and videos, yes. I'm not sure I want AI upscaling (or any sort of scaling that isn't vector or nearest neighbour) on the desktop though.

I think it's going to be a fair old while until I move on from what I have. Everything out there has compromises and I have, at least for now, found the least annoying set of compromises to work with.
 

Understandable. And I agree with everyone who is just fine with 4k, since we only just got to the point where we can drive 4k adequately - much like when I had a 1440p gaming screen at 100fps average or so and some people were moving to 4k.

8k HDR brightness
----------------------------
If we're talking FALD for HDR brightness, an 8k screen would just have more pixels per zone, while the actual onscreen objects stay exactly the same size in media and games, just at higher (even if AI upscaled, sharpened, and DLSS-AA'd) resolution. So the FALD zones would be exactly the same brightness, like they are on both Samsung's 2000+ nit 4k and 8k models currently. Similarly, you could run a 1080p HDR movie upscaled (AI upscaled by an nvidia shield, for example) on a 4k FALD right now at the same brightness levels as 4k content. OLED is another story, sure, but there are better OLED isolating technologies and printing techs in the works that keep the emitters from degrading as fast. If you had to have the brightest, FALD is about zones, not rez really, so you could go that route.
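To make that concrete, here's a toy comparison (the zone count below is made up for illustration, not any specific model's spec): the dimming grid belongs to the backlight, so swapping the panel resolution only changes how many pixels each zone covers, not how bright a zone can go.

```python
# Same hypothetical FALD backlight behind a 4K panel vs an 8K panel.
ZONES = 1344  # illustrative zone count, not a real model's spec

for name, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
    print(f"{name}: {w * h // ZONES:,} pixels per dimming zone")
# The zone grid (and per-zone brightness) is identical either way;
# only the pixel count inside each zone quadruples.
```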

8k Hz
----------
Hz-wise: some 4k screens could run 1440p upscaled at 120hz before 4k 120hz was a thing across the board on gaming TVs, so you could probably run 4k upscaled (on the TV end) at a higher Hz than the native 8k rez could manage. You can run 4k 120hz now on a 60hz 8k screen, like Samsung's 2000+ nit 8k one, for example. I think it might be best if they started doing AI upscaling on the screen itself, like a g-sync module, to get around port and cable bandwidth limitations, if that were possible. I'm not saying it would be as fast as the top gaming screens if something came out at 500hz on DP 2.1 earlier, so your point is valid; it could come down to some tradeoff in Hz unless a big 8k gaming screen came out.

GPU vs 8k rez
---------------------
A big 8k gaming screen with DP 2.1 and an AI upscaling module on the screen itself would really be something, but that's wishful thinking at this point. GPU power vs. fps, though: you could run 4k, 5k, or 6k windowed/tiled/letterboxed on a big (~ 55") 8k screen if you wanted to, or 3840x-based ultrawide resolutions upscaled across the whole width of the screen, or 4k AI upscaled to 8k fullscreen - then go back to the full 8k of desktop resolution outside of your game. The following GPU gens are only going to get more powerful, and AI upscaling (which is already good from a high starting rez on a high-PPD screen) and frame amplification tech should mature along the way as well.


. . . .

So if you want just as much HDR brightness on an 8k screen, probably go the FALD route. Then it will be the same.

4k will be capable of 240Hz - 500Hz before 8k for sure, but there might be hope for an AI upscaling module on the screen itself eventually.

That 1/2-of-8k screen due out sounds interesting too: the Neo G9, 57" diagonal, 7680 x 2160. Could put an 8k screen above it in the meantime :whistle:

https://www.techpowerup.com/302913/...ces-worlds-first-7-680-x-2-160-dp-2-1-monitor
 
https://www.rtings.com/tv/reviews/samsung/s95c-oled#test_608

Aside from fringing and no Dolby Vision support, this is the best gaming TV until the A95L comes around.

If you decouple from the desk - mounting it separately on a stand or wall mount, etc. - and have the room, the upcoming 55 inch LG G3 with its heatsink and micro lens array will be competitive performance-wise if not surpass it, though it will be expensive at like $2500 USD at release.

https://www.hdtvtest.co.uk/n/LGs-G3-to-use-MLA-tech-but-only-in-55-65-and-77-inch-models

"the G3 can hit a peak brightness of 1,470 nits."

"The ability of the G3 to surpass 1,000 nits brightness is key, because that is more or less the level at which most modern HDR movies are mastered for home viewing. So any TV that can display images at over 1,000 nits should be able to reproduce the source material exactly as the director intended, without compromising its brightest scenes."


The improved brightness in the LG G3 OLED TV is due to the introduction of Micro Lens Array technology, which incorporates a layer of miniscule lenses that direct more of the light created by the OLED pixels towards the viewer. So, it gives LG a way to increase brightness without squeezing more out of its OLED panels.

LG Display, the subsidiary of LG that produces its OLED displays, has already confirmed that MLA is being used with its third-generation OLED panels, which it calls “META” OLED. However, until now, LG Electronics had not confirmed if the G3 specifically would be using that panel. Now, it has done so, joining Panasonic and Philips, who will also use META panels in their flagship OLED TVs (the Panasonic MZ2000 and Philips OLED908).

However, there’s a caveat, as not all G3 OLED TVs will come with the META panels. "The hardware that's in each size of the G3 differs, and in some sizes, MLA is part of that hardware solution,” Seperson explained.


Seperson didn’t officially state which sizes would get the META panel, but he didn’t really need to either. What he said is that the 55-, 65- and 77-inch G3 TVs will be “70% brighter” than last year’s model.

..

https://www.forbes.com/sites/johnar...icing-and-availability-for-its-2023-oled-tvs/
 
The problem with WRGB is that the higher its brightness gets, the more its colours get washed out. You can notice it clearly in a side-by-side comparison against Quantum Dot based IPS/VA/OLED displays, and I don't think MLA is a solution for that. The goal for this year's META + MLA WRGB is to beat QD-OLED in brightness just so they can say they have the brightest OLED TV in 2023 - that department only. They don't put much consideration into other factors like colour coverage and such.
 

WOLED seems to have reached its peak with the 2016 C6. All they've done after that is make it more resistant to burn-in and then boost the pure white brightness on the G2 and G3.
 