Why OLED for PC use?

Based on HUB's review of the M1 MacBook Pro display from 2021, its performance is on par with the Asus PG32UQX, even slightly surpassing its brightness at small window sizes, and that's for sustained brightness! So my M2 MacBook Pro should be quite similar. HDR brightness measurements are timestamped below.



So I tried out the MacBook Pro against my LG CX 48" OLED, and while the MacBook Pro is clearly better for HDR, I still find the LG CX 48" OLED pretty satisfying in real-world scenes and games. Can it be better? Absolutely: higher brightness would help bring out more detail in HDR content. I could spot clear differences in many videos, and it wasn't just in very bright details. Even my gf noticed that things looked better.

I tried a number of videos off YouTube: 4K HDR walkarounds from Japan in light and dark conditions, footage from various games in 4K HDR, nature videos, etc.

But I still don't agree at all with kramnelis that the OLED is somehow terrible. Is it as good as the best of the best mini-LED? No, but let's be realistic: it doesn't cost nearly as much, and it's not some weird subpar tech like kramnelis claims. I guess my eyes just "don't want to see better images."

What this does make me wish is that these 10K+ zone mini-LED backlights would start appearing in desktop-size displays. In real content I really could not spot blooming; I had to pull up a starfield-type video to see the issue. It's actually pretty weird how a laptop display can outperform TVs and desktop displays in this area. If they were able to fix the abysmal pixel response times Apple has, this tech could be very good.

PS. macOS is still an absolute piece of shit for external display support. I could not get 4K 120 Hz with HDR working; the HDR toggle only shows up at 4K 60 Hz on this latest-and-greatest M2 Max 16" MacBook Pro. At least my desktop monitor finally runs at 4K 144 Hz, even though it also refuses to go into HDR.
 
Lower-end WOLED probably can't compete as well with high-end mini-LED, but the Samsung S95C has already shown it can keep up quite well in that comparison test against the Hisense U8H. It's too bad no monitor-sized 4K QD-OLEDs exist, and even if one did, the pixel size would just hinder the brightness anyway.
 
Then what's the 4th image? You won't see better images in dim sRGB. The range will only go up. What's the point of buying an OLED just to see sRGB? When the time is ready, even SDR can look like HDR1000.
 
Sounds good. That MacBook is also a lot higher PPD, depending on how far away you sit from it. That also makes things look a lot better, and will easily look better to a casual observer. I used to have a 15.6" 4K glossy laptop and it looked very fine compared to a lot of other screens. At a 1.5' to 2' viewing distance, a 16" ~4K screen (the M1 is 3456x2234) is around 84 to 100 PPD, which is very high compared to most 4K setups. It will look noticeably better than the ~60 PPD and less you get where people tend to put a 48" on a desk, down to even ~51 PPD. 8K will be a big leap in quality when we get there, doubling the PPD vs 4K, especially on larger screens rather than laptop screens.
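For anyone who wants to check those numbers, the PPD math is simple; here's a quick Python sketch (assuming a flat panel and distance measured to the screen center, so the figures are approximate):

```python
import math

def ppd(h_pixels: int, v_pixels: int, diagonal_in: float, distance_in: float) -> float:
    """Approximate pixels per degree across the horizontal field of view."""
    aspect = h_pixels / v_pixels
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # panel width from diagonal
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))  # horizontal FOV
    return h_pixels / fov_deg

print(ppd(3456, 2234, 16.2, 18))  # 16" MacBook Pro at 1.5 ft: ~84 PPD
print(ppd(3456, 2234, 16.2, 24))  # ... at 2 ft: ~109 PPD
print(ppd(3840, 2160, 48.0, 27))  # 48" 4K on a desk at ~27": ~51 PPD
```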

All displays in a given tech type are not equal, like you said. Different numbers of zones and different algorithms get different results, and perhaps even different screen sizes do too. Some FALD panels are IPS and some are VA as well, I think. FALD is always non-uniformly brightening/dimming areas of the screen, even when it's not outright blooming like Tinkerbell in every scene, but if it looks good to you and you like it, that's great. There are several different OLED generations and technologies now too, with varying performance, and with a new one on the horizon.

10,000 FALD zones is a lot more on the Mac, up from ~1,300. 1,300 zones is around a 45x25 lighting resolution; 10k zones is around 134x75 zones, so 3x the density row-wise and 3x the density column-wise. That's a good jump, but there is still a lot of room for improvement: OLED emitters, quantum dot layers, and META-tech micro lenses are smaller than pixels.

Vincent was very clear when he said the UCX has blooming outside of bright areas and distracting, fluctuating black levels in dynamic content due to its FALD array, and I believe him and his videos showing it (as well as some other reviews online). The UCX in Vincent's review was not just affecting areas around the brighter light sources in dark-backdrop scenes; it was even blooming/flaring into the letterboxing while the media field showed dynamic content, even when bright areas weren't directly adjacent to it at times. He said outright that it blooms worse than the Apple XDR's FALD. The Samsung QD-LED FALDs also bloom and cause non-uniform areas in media, but a lot of people find that acceptable for the benefits. In game mode they change a wider number of zones with slower transitions, so the blooming and FALD artifacts are worse there. They all have tradeoffs.
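Since nobody publishes the exact grids, here's the kind of rough estimate I'm doing, just spreading the zone count across the panel's aspect ratio (my 45x25 and 134x75 figures above are the same sort of ballpark):

```python
import math

def zone_grid(zones: int, aspect: float) -> tuple[int, int]:
    """Estimate a FALD backlight's (columns, rows) from zone count and aspect ratio."""
    rows = math.sqrt(zones / aspect)
    return round(rows * aspect), round(rows)

print(zone_grid(1300, 16 / 9))        # ~ (48, 27) for a 1,300-zone 16:9 panel
print(zone_grid(10000, 3456 / 2234))  # ~ (124, 80) for the 10k-zone MacBook panel
```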


What this does make me wish is that these 10K+ zone mini-LED backlights would start appearing in desktop-size displays.

Yeah, there are microOLEDs too, but they are so small that they are going into VR headsets. At least VR will be all per-pixel emissive in the coming gens.
 
Even with blooming, FALD mini-LED is way more accurate than OLED, which loses tons of brightness and ends up with less accurate color on every pixel. That's why HDR on mini-LED looks better. Per-pixel dimming is useless without brightness.
 
What this does make me wish is that these 10K+ zone mini-LED backlights would start appearing in desktop-size displays. In real content I really could not spot blooming; I had to pull up a starfield-type video to see the issue. It's actually pretty weird how a laptop display can outperform TVs and desktop displays in this area. If they were able to fix the abysmal pixel response times Apple has, this tech could be very good.

I'm not a fan of FALD usually because the zones are way too large, but if you make them small enough (as Apple has apparently done for the MacBook Pro?) then the technology is promising.

I'd have to see it in person though. Even 10k zones seems a little low on a screen with ~8.3 million pixels (if it is 16:9 4K). We are still talking some 830 pixels per zone, but it's a shit ton better than your typical FALD display. I'm guessing the bloom is small enough that it isn't obtrusive, but I'd have to see it in person.

For a 4K screen, I'd think ~2 million zones would be more like it. At least that would give you one zone per 2x2 pixel grouping.
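If you want the pixels-per-zone arithmetic laid out, here's a quick sketch (the 2-million-zone config is hypothetical, just the 2x2 grouping I mean above):

```python
# Pixels per dimming zone for a few 4K-class backlight configurations.
configs = {
    "typical 1,300-zone FALD (3840x2160)":    (3840 * 2160, 1300),
    "Apple 10k-zone XDR (3456x2234)":         (3456 * 2234, 10000),
    "hypothetical 2M-zone panel (3840x2160)": (3840 * 2160, 2_000_000),
}
for name, (pixels, zones) in configs.items():
    print(f"{name}: ~{pixels / zones:.0f} pixels per zone")
# ~6380, ~772 (or ~830 if you assume a full 8.3M pixels), ~4 (one zone per 2x2 block)
```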
 
What this does make me wish is that these 10K+ zone mini-LED backlights would start appearing in desktop-size displays.

Say it's 16 inches at 10,000 zones, a ~134x75 lighting resolution. That's about 14 inches wide and 8 inches tall, so roughly 10 zones per inch of width and 9 zones per inch of height.

A 55 inch screen would be around 48 inches wide by 27 inches tall, so roughly 480 zones wide by 243 zones tall (actually less; these are just rounded figures). Those are very crude numbers and I could figure it out more closely, but using them it would be something like 116,640 zones if it used the same small FALD cells as the MacBook, so 100,000 should fit, barring heat and power concerns and any other barriers to implementing them (including cost or profit).

At 100,000 zones, a 4K screen would be down to around 83 pixels per zone, instead of the up to 6,000-7,000 pixels per zone we have on today's up-to-1,300-zone FALD (or the ~830 pixels per zone on Apple's 10,000-zone display). That would probably be a huge difference. Don't forget that more than one zone gets activated across areas to compensate, but still, that would be a difference of magnitudes if they ever released something like that. You can get 64 PPD and a 60 degree viewing angle on a 55 inch 4K screen at a 3.5 foot viewing distance, so a 55 inch screen is not undoable setup-wise using a simple rail TV stand. The radius/focal point of a 1000R curve is around a 40" distance too, so it would work well for that if curved.
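Here's the same back-of-the-envelope done with exact panel areas instead of my rounded inch figures; it lands around 108k zones, same ballpark:

```python
import math

def panel_area(diagonal_in: float, aspect: float) -> float:
    """Panel area in square inches, from diagonal and aspect ratio."""
    height = diagonal_in / math.hypot(aspect, 1)
    return height * aspect * height  # width * height

def scaled_zones(src_zones: int, src_diag: float, src_aspect: float,
                 dst_diag: float, dst_aspect: float) -> int:
    """Zone count for a target panel at the same zones-per-square-inch density."""
    density = src_zones / panel_area(src_diag, src_aspect)
    return round(density * panel_area(dst_diag, dst_aspect))

# MacBook Pro XDR zone density applied to a 55" 16:9 TV-sized panel
print(scaled_zones(10000, 16.2, 3456 / 2234, 55, 16 / 9))  # ~108,000
```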
 
For a 4K screen, I'd think ~2 million zones would be more like it. At least that would give you one zone per 2x2 pixel grouping.

Yeah, that's practically microLED. Dual-layer LCD did something similar with a 1080p monochrome LCD as the backlight of a 4K panel, so one 1080p-sized backlight pixel per four 4K pixels. Those are expensive and power hungry, run very hot, and have very slow response times, among other issues.
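As a toy sketch of the idea (real dual-layer controllers do a lot more, like blurring the backlight image to hide seams; driving each zone with its block's max is just an assumption for illustration):

```python
import numpy as np

frame = np.random.rand(2160, 3840)  # stand-in for a 4K frame's linear luminance

# The monochrome 1080p layer sits behind the color panel, one backlight
# pixel per 2x2 block of 4K pixels. Drive each one with its block's max
# luminance so no color pixel is starved of light.
backlight = frame.reshape(1080, 2, 1920, 2).max(axis=(1, 3))
print(backlight.shape)  # (1080, 1920)
```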

Some year everything will be per-pixel emissive.

However, I wouldn't rule out some kind of active directional layer or other clever tech-advancement hacks being invented to improve FALD's zone accuracy or to break its zones down further somehow, someday, but it's been pretty stagnant in that regard so far, other than quantum dot color. Maybe they could combine a clear OLED layer with a FALD LCD to refine its zones or something, canceling out lifted areas relative to the content, etc.
 
Dual-layer LCDs are like $20-30k and unfortunately not suited for consumer use.

Yeah, the viewing angles and motion are absolute garbage and it uses way too much energy; just terrible for consumers. But it's great for mastering, because the still images viewed straight-on are good.
 
Yep. I didn't emphasize just how expensive, but I was indicating that they were ruled out for all of their drawbacks in what I wrote. I just brought it up because it mapped to what he was describing: one light per 4 pixels.

However, microLED is super expensive right now too as far as pricing goes. LG's first consumer 55 inch OLED TVs were 1080p and $10,000 USD in 2013 (~$13,000 now). So just because something starts out expensive doesn't mean it can't fall into the enthusiast price range years later, with performance enhancements gained over the years and even more optimal ways of achieving the same tech (like OLED has progressed).

MicroLED should arrive in relatively smaller consumer displays some year, but maybe in the meantime some clever innovations could still happen. I threw in the idea of a clear OLED overlay on top of FALD to counter its bleeding, overshadow its darkness, and generally complement it, but who knows what clever applications they might come up with to get some appreciable gains (and more cheaply) some time before microLED.
 
MicroLED in monitor format is probably like 10 yrs away lol.

We're still using DP 1.4 as the premier connectivity option, from 2016.

PC users really get shafted with monitor tech; it's way, way behind TVs.
 
it uses way too much energy

Flashback to my saltwater reef-keeping days of giant Ghostbusters-trap metal-halide lamp ballasts, large metal pond pumps, heaters, etc. 🐠

Home theater surround systems can use some juice too, and so do gaming PCs.

MicroLED in monitor format is probably like 10 yrs away lol.

PC users really get shafted with monitor tech; it's way, way behind TVs.

Yeah. Just the way it is. Coming from a 48 inch setup, I'm fine with even a 55 inch gaming TV at this point if it has a big leap in tech. I'd just use a slim-spined TV stand and sit at a desk with my eyeballs 3.5 feet away. There's also a chance I could still get a glossy option in that sphere, where that is much more unlikely in monitors.

Other than PPD-wise, PC monitors are way behind VR display tech and VR frame-amplification tech too: eye tracking, foveated rendering, varifocal display tech (varifocal lenses), etc., and microOLED with pancake lenses should be very bright in VR HDR.
 
I mean, what is a TV if not a monitor with an HDMI port, a tuner and speakers? Why not just use a TV as your monitor? They even support VRR these days.
 
HDR isn't "better" than SDR in every situation; it's just a different way of representing something, and some people may not prefer it, desktop use being a perfect example. It's more advanced for sure, and it requires more capable tech to achieve. That doesn't make it automatically "better".
HDR is technically better, as you can display SDR on an HDR monitor while the reverse is not true.
Sure, you can make an HDR monitor basically display SDR as HDR, and that may well look punchier than the HDR version of the same video/game, but at that point it's not SDR anymore; it's completely outside of any spec.
 
I think HDR can look great in games and films that are designed specifically for it.

I absolutely HATE what my screen looks like when I enable HDR in Windows and sit on an SDR desktop. It's practically unusable. It's so bad it hurts my eyes.
 
Nothing to do with HDR itself, though.
It's the monitor/TV's fault, because it doesn't do everything that could be done to make HDR look good.

On my 27" IPS monitor HDR doesn't look very good because this monitor lacks capabilities of maxing out LCD panel contrast range and backlight blast at highest possible levels as soon as something happens in one of 16 zones. Windows doesn't make #FFFFFF as bright as brightest HDR highlights so this in turn results in low contras. I can reduce emulated dsktop brightness but black level remains the same so it reduces contrast even more. I can also reduce backlight brightness but lower contrast ratio still remains and it defeats purpose of having HDR mode enabled in the first place.

On 48" OLED monitor HDR looks somewhat worse than SDR but only because my particular monitor uses native white color for HDR to get best possible brightness vs hardware calibrated SDR profile I can use in SDR mode so whites have higher temperature than 6500K I use in SDR mode. Otherwise image looks very good with no such issues as those IPS monitor exhibits.
That said white color can be corrected in AMD control panel (not sure on NV or Intel). Not sure if Microsoft made color calibration more robust for it to be possible to properly calibrate (or rather in this case profile) HDR mode but if they did it should be possible to get identical SDR@HDR as monitor's own hardware calibrated SDR mode.

In other words, the issues you have with the SDR desktop in HDR mode are due to the fact that your HDR-capable monitor is far from implementing the specs just right.
A monitor manufacturer could set a release goal like "when enabling HDR mode in Windows, the desktop can easily be made to look the same as it does in SDR mode," even for LCD monitors, and even the way Windows handles things right now should not make that impossible to achieve. That was, however, not what the manufacturer wanted to achieve. They wanted the image to be "good enough," and in these cases good enough is not good enough for you or me or anyone else, for that matter, except the manufacturer.
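To make "SDR in HDR mode" concrete, here is a minimal sketch of the signal math, assuming the common BT.2408 choice of 203 nits for sRGB paper white (Windows exposes roughly this as its SDR content brightness slider):

```python
def srgb_to_linear(c: float) -> float:
    """sRGB EOTF: non-linear [0,1] channel value -> linear [0,1] light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> [0,1] signal."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

PAPER_WHITE = 203.0  # nits assigned to sRGB white (an assumed, adjustable choice)

def sdr_in_hdr(srgb: float) -> float:
    """Place one SDR channel value inside an HDR10-style PQ signal."""
    return pq_encode(srgb_to_linear(srgb) * PAPER_WHITE)

print(sdr_in_hdr(1.0))  # #FFFFFF lands at ~0.58 PQ, leaving headroom for highlights
```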
 
I sorta agree with Kram on making the image look better by stretching the original SDR presentation into a fake HDR; I mean, that's what AutoHDR does. You are basically shoehorning some bootleg HDR into a game that was never intended to look that way in the first place, and yet it can deliver excellent results at times, depending on the game and monitor you are using. So the whole idea of "better images" despite everything just being stretched beyond its intended color space and dynamic range: yeah, I can agree with that, as I'm an AutoHDR user. But I don't agree with him saying that OLED displays aren't capable of delivering those better images. AutoHDR on OLED can definitely look great and much better than the original SDR sRGB presentation. Also, I strongly do not recommend anyone try his "SDR400" trick of cranking brightness up to the max and forcing wide color gamut; that seriously just makes games look ridiculously oversaturated, and 400+ nits of sustained fullscreen brightness is really UNCOMFORTABLE to look at for more than a few minutes. Instead, just use Special K. Shadow Warrior 3 now looks amazing with Special K's HDR. Highlights like the sun now have the pop that they should, without blowing out and oversaturating the rest of the image.
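For a feel of what this kind of SDR-to-HDR stretching does, here's a toy sketch (not AutoHDR's actual algorithm, which isn't public; the paper-white, peak, and knee numbers are made-up illustrative parameters): keep everything below a knee at its SDR intent, and ease only the top end toward a higher peak.

```python
def expand_sdr(linear: float, paper_white: float = 203.0,
               peak: float = 800.0, knee: float = 0.6) -> float:
    """Toy inverse tone map: linear [0,1] SDR value -> output nits.

    Below the knee the image tracks its SDR look (scaled to paper white);
    above it, values ease toward the HDR peak so highlights gain pop
    without lifting the overall picture.
    """
    if linear <= knee:
        return linear * paper_white
    t = (linear - knee) / (1.0 - knee)  # position within the highlight range
    return knee * paper_white + (t ** 2) * (peak - knee * paper_white)

for v in (0.25, 0.5, 0.9, 1.0):
    print(f"{v:.2f} -> {expand_sdr(v):.0f} nits")  # 51, 102, 503, 800
```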

A lot of people really like AutoHDR, and Windows 11's HDR implementation has come a long way from where it was even in Windows 10. While it does stretch things, it does so in a way that largely preserves color accuracy, etc. That's a lot different from just viewing SDR in an improper color space, and it doesn't seem like that's what he's been talking about lately.

I don't use AutoHDR because it does a few things that are undesirable for me personally (too-bright menus/text, and in the game I primarily tried it with, lifting some darker scenes a tad too much brightness-wise, so I preferred the more muted regular SDR look), BUT I can certainly see why some people like it and how it might work extremely well for certain games. Even in my case the negatives were fairly minor, so I can see why the added pop would be more desirable to some. Special K seems like neat tech as well.

My main contention has always been his suggestion that all sRGB/SDR content should only be viewed in a wide color gamut or HDR, and that everyone should prefer it that way or they're not seeing "better images" like he does. We all have preferences in how we view content. (For example, I don't personally like motion smoothing either, but I can understand why other people do.) His way isn't objectively better; it's just personal preference he's passing off as "the way".
 
I like how you use your overblown photo to demonstrate blooming while in real scenes FALD easily destroys OLED monitors

The best part of this is that the original HDR version of the hole-in-the-rocks image looks fantastic with HDR enabled on my OLED, as has virtually every image you've used to try to "own" OLED. No, I don't have a FALD right next to it to compare, and I'm sure if I did, the FALD would be brighter, but looking at just my OLED in isolation, it looks great to me: vibrant colors, plenty of pop, plenty of detail, and as bright as I would want. If that's tone mapping, works for me! It looks MUCH closer to the image on the left of your example comparison in HDR than it does the one on the right. *shrugs*

On the other hand, with the UCG, I saw a LOT of blooming, especially in desktop use. The only remedy would have been turning off local dimming, which of course would have resulted in much poorer contrast on the desktop than OLED can do. Blooming is very real, and it's not a serious argument to say you'll never see it on FALD, because you absolutely will in certain content. Similarly, things like pinpoint starfields will get crushed somewhat on FALD. Both of these undesirable things are resolved on OLED, with HDR brightness being the tradeoff, and that's still a tradeoff I'll easily take at the moment based on my usage, especially when I have yet to be disappointed by anything I've tried in HDR, including your images.
 
It's a 16.2-inch (diagonal) display with a 3456x2234 native resolution at 254 pixels per inch. So it's kinda close to 4K, but using a roughly 16:10 aspect ratio.

I can notice the blooming if I put a mouse cursor on a black background, but even then the blooming is pretty faint rather than distractingly obvious. In real content it's just not noticeable unless you have a lot of couple-of-pixels-sized, starfield-type objects on screen. I think it's more than good enough.

For desktop use, things get interesting. When I was testing all the HDR stuff and had the backlight at max brightness, I eventually got used to it, and if I then switched to my 400 nit G70A, it looked extremely dim in comparison. But if you set the backlight on the laptop back to a level I would normally use, they look pretty similar and normal. That's how your eyes adapt to brightness: it can feel blinding at first, then your eyes adjust, like going from inside to a bright sunny day outside.

I think that sort of thing can really change the perception of the image, where the brighter, more contrasted image is often perceived as better. When I was testing the MBP against my LG OLED TV, my gf remarked that, e.g., green trees looked more vibrant on the Mac, and when I started looking closer, I agreed. I ended up adjusting the Color parameter on the LG OLED higher to make it more saturated, and that brought them perceptually closer together, though it's clear the factory calibration on the MacBook Pro looks more accurate, and the display probably handles a wider gamut better too. This is probably an area where QD-OLED would do better than LG WOLED.
 
For me, on the ProArt I tried, I could notice a lot of blooming on dark backgrounds, especially in SDR. It was especially noticeable in things like Discord and even my calendar (there was a bright "spot" around all the dates). I use dark themes on a lot of content, so the only solution would have been to turn off local dimming for desktop use. (I could have used day mode too, but I prefer working on dark backgrounds if I spend a lot of time there.)
 
HDR is technically better, as you can display SDR on an HDR monitor while the reverse is not true.
From a technical point of view, absolutely.

And I'm also of the opinion that properly mastered HDR content is better than SDR. I'm an avid gamer, and playing games that support HDR on a display capable of doing it decently is fantastic. Leaps and bounds over the old 2012 1080p Sony TV I had before.

But saying HDR is objectively better in all situations is what is being rammed down our throats here, and it just isn't the case. Some may prefer the attempts to convert SDR into HDR, with varying levels of success, or simply stretching it out and oversaturating everything. I'd rather keep content as close to the source as possible, but it's all personal preference and opinion. Something we're not allowed to have, apparently.
 
I was running my Dell IPS for a few days while tweaking my 13900K (my MSI BIOS doesn't like 4K) and it was awful. The OLED is so easy on my eyes; I don't get terrible eye strain looking at it. Maybe I'm a pleb, but I don't give a single solitary rat turd about brightness curves and HDR vs SDR content, etc. I use whichever is comfortable for me and looks pleasing. I quite like how my FO48U looks in HDR mode in Windows 11, and that's how I run it.
 
Yeah, when AutoHDR first came out it was quite hit or miss; I'd say it was 50/50 for me. But nowadays, especially after the Windows HDR Calibration app came out along with improvements to AutoHDR itself, it feels like a lot more hit than miss, maybe 80/20 now. I don't claim to have any understanding of how exactly AutoHDR works, but I agree that it seems to preserve color accuracy pretty well and doesn't result in a ridiculously oversaturated picture like forcing wide color gamut over sRGB would.
 
I still haven't "upgraded" to Windows 11. Would you say it's worth it for this AutoHDR feature?
 
I thought Microsoft brought AutoHDR over to Windows 10 now, no? But even if they haven't, you can just use Special K instead of moving over to Windows 11. From the little digging around I've done, it seems like Special K is better than AutoHDR anyway; it just requires more tweaking to get set up vs. a simple ON switch with AutoHDR. I don't even do much in-depth tweaking myself; I just spend a few seconds adjusting some parameters and it already looks great. I don't know what kind of witchcraft Special K does, but all I can say is that so far it works great! The latest game I've injected HDR into using Special K is Scars Above, and yup, all I can say is "it just works" lol.
 

[Attachment: 20230316_193555.jpg]
Oh nice, how's the inno treating ya?
 