Why OLED for PC use?

Oh nice, how's the Inno treating ya?

I'm really enjoying it! Especially for the price; at $850 it was an absolute no-brainer. Sure, it's not perfect, but the only other 1152-zone FALD monitors out there cost well over $2000. I'm more than happy with this; my old X27 does some things better, but the Innocn feels like an upgrade overall.
 

Yeah, it's amazing value for the price, incredible really, far superior to the edge-lit experience.
 
and everything I've tried it with looked weird.

When was the last time you tried it? And do you use the HDR Calibration app? Because without that, Windows doesn't know what the capabilities of your display are and defaults to a 2000-nit value, which looks like absolute dogshit on most displays. Both my Acer X27 and my LG CX with HGiG enabled had everything become clip city because they just could not display details at that high a brightness.
 
~A month ago. Yes. It says mine's 1500, Hisense says 1000...
 

Ah OK, not sure what else is missing then. Here's how bad AutoHDR used to look. Clip city!

[screenshot: AutoHDR clipping]

It's still not perfect today but it's much better than it was back when it first launched.
 
Meanwhile, we Linux folk don't really have HDR-on-desktop options yet, but having been away from Windows for 16 years, I refuse to return. My reasons for avoiding it have only strengthened over the years, and HDR doesn't even come close to countering them.

I rely on the PS5 for HDR gaming, along with the Steam Deck and Switch for everything handheld.
 
I too do a lot of HDR on my PS5 and like how standardized it is with a good calibration. It's rare I've had a poor HDR experience.

On PC it's a lot more hit or miss. It's not rare to have a game ship with completely broken HDR that only gets fixed a lot later. It is nice that things like Auto HDR have been evolving and improving, even though I still don't personally use it, and some of these mods like Special K are really neat.

I too really appreciate great HDR, but it still needs a lot of work as far as standardization, quality control, good adjustments across a variety of displays, and ease of use for people who aren't really technically savvy. Some games have all of these and others seem to be pretty lacking. I know quite a few people who aren't very interested in HDR purely because it's so confusing and not at all intuitive to them. Consoles are probably their best bet so far, but even then you have to know how to use different TV modes, etc., which set-it-and-forget-it people can be turned off by. It's all getting better but still a juggling act between SDR, HDR, and the various standards (HDR10, HDR10+, Dolby Vision, HGiG).

I think HDR is great and appreciate it when the implementation is there, but it does have quite a ways to go before it's mature. It's one of the reasons good SDR is my primary consideration so far, but I am excited about the advancements in panel technology allowing for brighter HDR down the line (hopefully with both brightness and perfect contrast eventually).
 

For sure, it still has a ways to go. Funny thing is, when I bought my X27 back in 2018 it wasn't even for HDR use at all. I picked it up because it was the first 4K 144Hz option at the time, and local dimming greatly improved SDR contrast and almost completely eliminated IPS glow vs the model that did not have FALD. So it was pretty much an expensive high-refresh 4K SDR monitor whose full potential I never even bothered to unleash until years later, due to just how plain awful the HDR experience on Windows was. Heck, even to this day I still complain pretty often about HDR on PC lol.
 
And I'm also of the opinion that properly mastered HDR content is better than SDR. I'm an avid gamer, and playing games that support HDR on a display that can do it justice is fantastic. Leaps and bounds over the old 2012 1080p Sony TV I had before.
HDR gaming is the kind of difference that I would describe as a "new quality in gaming".
Much more impactful than, e.g., increasing resolution or even adding new effects.

In movies too, perhaps, but so far the only movie I've watched in HDR was Interceptor, so my impressions are rather limited 🙃

But saying HDR is objectively better in all situations is what is being rammed down our throats here and it just isn't the case. Some may prefer the attempts to convert SDR into HDR with varying levels of success, or simply stretching it out and oversaturating everything. I'd rather keep content as close to source as possible but it's all personal preference and opinion. Something we're not allowed to have, apparently.
The issue with keeping content "close to source" is that you might get something that is inaccurate to begin with and just assume it was the creator's intent.
Then even if there was some intent, it might not make any sense and might just distract you. Often the only benefit of not bothering with tweaking is the effort saved on tweaking...

...but wait!
How do you know your settings are correct if you assume that what you get is correct and that it's someone's intent for it to look the way it does?
Most people here do not really know that movies are not made for sRGB and assume that if they calibrate, or set gamma to 2.2, then they see the correct image. They do not.
IMHO it's best to forgo the illusion of "close to source" and just use personal preference. If you do that, you will likely find that for the vast majority of content your tastes are in line with the specs (and even more so than what you might think the specs are now!), and that not all content, unfortunately, is.

Anyhoo, when it comes to wide-gamut SDR, it depends on the game and the monitor. On my IPS, wide gamut is nicer than on OLED, but both are somewhat similar in that they roughly preserve hues and just oversaturate things. In that case some games look quite nice with a wider gamut. In other cases just reducing saturation might make things look better; of course it's best to use an sRGB mode then, but unfortunately on some displays this results in more input lag. If I had to choose, in a game, between even one frame of input lag and slightly less accurate colors, I would choose the latter, simply because lag matters more and colors in games are usually not very realistic to begin with.
 
HDR gaming is the kind of difference that I would describe as a "new quality in gaming".
Much more impactful than, e.g., increasing resolution or even adding new effects.

Maybe it's just because my Asus XG438Q's DisplayHDR 600 isn't the best HDR implementation out there, but when I have played games with good HDR support (like Cyberpunk 2077), my impression has been that it was a nice-to-have, but in the grand scheme of things didn't make a huge difference.

All I did was enable HDR and start the game though.

I spent some time in the menus trying to get the HDR to look good, but to be honest I didn't really know what I was doing, so I don't know if I did it right.

I turned "maximum brightness" to 600 nits (because that is the spec of my screen) I set paper white to about 200 nits, because that is what looked good to me. I had no idea what to do with the Tone Mapping Midpoint, so I just kind of left it at stock, which was 2.

As mentioned, there was a noticeable difference with it on compared to with it off, but I'd be lying if I said I thought it was a huge deal, or transformative. If I accidentally turned it off, I'm not even sure I'd notice unless I was thinking about it.
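For what it's worth, here's a rough Python sketch of how those two settings typically interact in a game's HDR tone mapper; this is a generic curve of my own, not Cyberpunk's actual implementation, with "paper white" deciding where UI/diffuse white lands and "maximum brightness" being the ceiling highlights roll off toward.

```python
# Toy HDR tone curve (illustrative only, NOT Cyberpunk's actual math).
# paper_white: nits assigned to plain "SDR white" (UI, documents, diffuse white).
# peak: the brightest value the curve will ever ask the display for.

def tonemap_nits(scene: float, paper_white: float = 200.0, peak: float = 600.0) -> float:
    """Map a linear scene value (1.0 = diffuse white) to output luminance in nits."""
    l = scene * paper_white                  # where the pixel would land with no compression
    if l <= paper_white:
        return l                             # SDR-range content passes through unchanged
    head = peak - paper_white                # highlight headroom the display offers
    return peak - head * head / (l - paper_white + head)   # smooth roll-off toward peak

if __name__ == "__main__":
    for scene in (0.5, 1.0, 2.0, 5.0, 10.0):
        print(f"{scene:4.1f}x white -> {tonemap_nits(scene, 200, 600):6.1f} nits @ 600 peak, "
              f"{tonemap_nits(scene, 200, 1000):6.1f} nits @ 1000 peak")
```

With paper white at 200 nits and a 600-nit ceiling there is only about 1.5 stops of headroom left for highlights, which may be part of why the difference felt subtle.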
 
The issue with keeping content "close to source" is that you might get something that is inaccurate to begin with and just assume it was the creator's intent.
Then even if there was some intent, it might not make any sense and might just distract you. Often the only benefit of not bothering with tweaking is the effort saved on tweaking...

...but wait!
How do you know your settings are correct if you assume that what you get is correct and that it's someone's intent for it to look the way it does?
Most people here do not really know that movies are not made for sRGB and assume that if they calibrate, or set gamma to 2.2, then they see the correct image. They do not.
IMHO it's best to forgo the illusion of "close to source" and just use personal preference. If you do that, you will likely find that for the vast majority of content your tastes are in line with the specs (and even more so than what you might think the specs are now!), and that not all content, unfortunately, is.

Anyhoo, when it comes to wide-gamut SDR, it depends on the game and the monitor. On my IPS, wide gamut is nicer than on OLED, but both are somewhat similar in that they roughly preserve hues and just oversaturate things. In that case some games look quite nice with a wider gamut. In other cases just reducing saturation might make things look better; of course it's best to use an sRGB mode then, but unfortunately on some displays this results in more input lag. If I had to choose, in a game, between even one frame of input lag and slightly less accurate colors, I would choose the latter, simply because lag matters more and colors in games are usually not very realistic to begin with.
I spend a reasonable amount of time doing color work for film, but I'll also say it's as a freelancer. So I'll say "I'm an amateur", even though I have spent a lot of time understanding color pipelines and doing color work.

The very short version is: if you're watching anything that has been shown on TV or as a film and has had a professional colorist touch it, what you're describing does not happen. All images have to pass through a lot of hands on the way to the final product. And you could say "mastering" is exactly what it sounds like. Are all colorists equal? Can they all grade like Jill? Obviously not. But working in any color house in any professional environment is going to require a basic understanding of:

Scene Referred Workflows and Color Management
Input Color Spaces
Camera Log Formats/Camera Gammas/Gamuts (if any, ARRI doesn't have a Gamut)
Color Space Transforms
Output Color Spaces

Because they work inside what is called a "scene-referred workflow", all of their grading is already designed to fit into the color space as intended. They see exactly how everything is going to fall as they're doing it. But what if they need to master something for two different color spaces (let's say an HDR color space and an SDR one for a Blu-ray film)? In that case the output color space is set to the larger color space first and mastered, then shrunk down to the smaller one and mastered again. Because of display transforms (and the incredible math behind them), generally speaking, though changes have to be made, a lot of the shrinking of gamuts happens automatically and the colorist just has to check through and ensure that there are not any problems.

They're able to do this for two reasons: 1) their working color space is always massive, and it's in fact much larger than any output color space (yes, bigger than any HDR color space; in fact what a camera "sees" is still far greater than what we are capable of displaying on any monitor). And 2) the whole purpose of display transforms in the first place is to be able to mathematically solve this "shrinking container" issue, in effect moving the white point, black point, and everything in between in a photometric way. Using these tools is how they get an HDR and an SDR master to have the same 'look' despite the big difference in color space and luminosity.
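To make the matrix half of that "shrinking container" step concrete, here's a toy Python sketch converting linear Rec.2020 values to linear Rec.709 through XYZ. A real display transform also handles tone mapping and perceptual gamut compression rather than a bare clip, so treat this purely as illustration.

```python
import numpy as np

# Published RGB -> XYZ matrices for the two colour spaces (both D65 white point).
M_2020_TO_XYZ = np.array([[0.636958, 0.144617, 0.168881],
                          [0.262700, 0.677998, 0.059302],
                          [0.000000, 0.028073, 1.060985]])
M_709_TO_XYZ = np.array([[0.412391, 0.357584, 0.180481],
                         [0.212639, 0.715169, 0.072192],
                         [0.019331, 0.119195, 0.950532]])

# Chain them: Rec.2020 RGB -> XYZ -> Rec.709 RGB.
M_2020_TO_709 = np.linalg.inv(M_709_TO_XYZ) @ M_2020_TO_XYZ

def rec2020_to_rec709(rgb_linear):
    """Convert one linear-light Rec.2020 triplet to linear Rec.709.

    Colours outside the smaller gamut come back with components below 0 or
    above 1; clipping them is the crude version of the 'problem spots' a
    colorist checks for after the automatic transform.
    """
    raw = M_2020_TO_709 @ np.asarray(rgb_linear, dtype=float)
    return raw, np.clip(raw, 0.0, 1.0)

if __name__ == "__main__":
    # A saturated Rec.2020 green has no exact Rec.709 equivalent:
    raw, clipped = rec2020_to_rec709([0.10, 0.90, 0.10])
    print("raw Rec.709 :", np.round(raw, 3))      # red channel goes negative = out of gamut
    print("clipped     :", np.round(clipped, 3))
```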

This baseline of technical knowledge of course is on top of any and all artistic knowledge, such as color theory, how the colorist's eye is developed, knowledge of references (say from painters or actual film), etc. People who are more on the technical side of mastering and grading are going to spend time working through "color science", which basically means they understand the makeup of images on a mathematical level. Let's just say those people can bend images in ways that standard 1D tools cannot. (And as another aside, some artistic intent can also just suck. The grading on Solo was pretty horrible, even objectively. Not sure what that colorist was trying to do.)

They're also going to be sitting in front of $10k worth of equipment minimum, and their equipment for sure will be calibrated.

The major issue is what color space something is mastered into and whether you're properly looking at the correct color space. There was a huge thread (where we also had to deal with a certain someone who doesn't know about these things repeatedly saying the same incorrect statements over and over) in which we talked about HDR color spaces, the two most common being Rec2020 and Dolby Vision. My short answer for the layperson is that this is why there is now a mode on TVs called "Filmmaker Mode", so that all of those kinds of settings don't get in the way of the intended viewing experience.

If you're watching SDR, the only two color spaces anything would/could be mastered in are Rec709/Gamma 2.4 or DCI-P3. To my knowledge, only if you're watching Blu-ray content are you going to run into DCI-P3. It's also used in theaters, which is why it's a standard in the first place, but it's not used nearly as much on consumer-level stuff. YouTube, as an example, doesn't allow DCI-P3 content to be displayed differently; content creators need to use Rec709 to display their content correctly. Again, for the layperson, Filmmaker Mode is the answer.


My tl;dr: it is in fact very easy to know the artistic intent of any/every TV show and film.
Only YouTube content is a crapshoot, because you're dealing with people who rarely have the technical understanding (and perhaps not even the artistic understanding) of the post color workflow craft.

What you're doing is akin to using an EQ when listening to music. This could be because you don't have speakers that can reproduce what they can in a studio environment (that's generally the best reason to do this). And if you want, you are welcome to do so. But for sure, displaying things in "whatever color space you want" and "making the image look like whatever you want", just like fiddling with an EQ in music, is absolutely "not the intent of those that made it". And in both cases, with proper setup and equipment, it is possible to see and hear exactly how something was intended, or at the very least to the limit of what your equipment is capable of reproducing from the source material.
 
That, and I find having HDR enabled in Windows 10 to be plain awful on the desktop.

Like, so bad it is unusable IMHO. I wish there were a setting that kept it off, and the machine in SDR mode, until a game, a full-screen video, or something else that can take advantage of it launches. It's kind of annoying to keep turning it on and off all the time, and honestly, I often forget to turn it on when I go to start my games.

Here is what I am talking about:

[screenshot: Windows HDR/SDR brightness balance slider]

With that HDR/SDR brightness balance slider all the way to the right, things generally look OK, but the screen becomes so bright (presumably outputting at full 600 nits for white windows) that it literally (in the literal sense, not literally in the figurative sense as some morons use the word these days) causes eye pain, and makes me want to look away from the screen.

I cannot handle looking straight at that 600nit output for more than a short period of time. It is just too much.

If I move the slider to the left, the brightness becomes less painful, but the contrast also goes to shit, where everything looks faded and washed out.

The only way to make it look good is to turn HDR off again; then the contrast returns to normal and the brightness becomes reasonable again, so I'm not in pain.

There is simply no good way to use HDR on the desktop, which forces me to always turn it off when I am not actively using it, and then I often forget to turn it back on.
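For a bit of context on why that slider feels so blunt: with HDR on, Windows composites the desktop in a linear scRGB-style space where 1.0 is nominally 80 nits, and SDR content is essentially just scaled up to whatever SDR white level the slider selects. The sketch below is my own simplification of that idea, not Microsoft's actual compositor code.

```python
# Simplified model of SDR content composited onto an HDR desktop.
# In the scRGB-style space Windows uses for HDR, 1.0 is nominally 80 nits; the
# "SDR content brightness" slider effectively chooses the nit level that plain
# SDR white is scaled to.  (My simplification, not the real compositor.)

def srgb_to_linear(c: float) -> float:
    """Standard sRGB decode for a single 0..1 channel value."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def sdr_pixel_to_nits(srgb_value: float, sdr_white_nits: float) -> float:
    """Approximate luminance an SDR pixel ends up at on the HDR desktop."""
    return srgb_to_linear(srgb_value) * sdr_white_nits

if __name__ == "__main__":
    for sdr_white in (80, 200, 400):   # example slider positions, low to high
        white = sdr_pixel_to_nits(1.0, sdr_white)
        grey = sdr_pixel_to_nits(0.5, sdr_white)
        print(f"SDR white target {sdr_white:3d} nits -> white window {white:5.1f} nits, 50% grey {grey:5.1f} nits")
```

One commonly cited reason the result also looks washed out is that Windows decodes SDR with the piecewise sRGB curve rather than the roughly 2.2 gamma most monitors apply in SDR mode, which lifts the shadows; the slider only changes the white scaling, not that curve.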

Maybe this is just because my Asus ROG Strix XG438Q's DisplayHDR 600 implementation sucks (but it did review well, I don't know), but to me HDR is just not usable except for full-screen video (game or movie) content.
 
The XG438Q just doesn't have the range for HDR, as it sounds like you've suspected. No offense. It's not giving you a valid perspective on what real HDR performance looks like (even allowing for the tradeoffs between OLED and FALD).

From TrustedReviews:

The XG438Q’s HDR performance is also a little disappointing. Positively, using this screen’s HDR modes ramps up the brightness to 650 nits, which surpasses VESA DisplayHDR 600’s requirements. However, in HDR modes the panel’s black level sat at 0.32 nits.

Those results meant that the Asus produced a contrast level of 2030:1 in HDR mode.

That's very poor contrast. There are edge-lit VA TVs that do 6100:1 in SDR (full field, not checkerboard/ANSI). If you turn the FALD off on the Samsung QD-LED FALD LCDs via the service menu they still get 3800:1 contrast, and even the lifted "leopard spots" on FALD screens with FALD active are still 3800:1 to 5000:1 contrast (though non-uniform with the rest of the scene/screen's extremes). The black depth accompanying that 2030:1 contrast on the XG438Q is also very bad. OLEDs do "infinite" contrast down to oblivion black depths.
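The arithmetic behind those figures is just peak luminance divided by black level; a quick sketch for anyone following along (the 650-nit and 0.32-nit numbers are from the review quoted above, the rest are the comparison points mentioned):

```python
# Contrast ratio = peak white luminance / black level, both in nits (cd/m^2).

def contrast(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

if __name__ == "__main__":
    print(f"XG438Q in HDR (650 / 0.32): {contrast(650, 0.32):,.0f}:1")   # roughly the 2030:1 review figure
    # What black level a 650-nit peak would need to hit the contrast figures above:
    for target in (3800, 6100):
        print(f"{target}:1 at 650 nits needs a black level of {650 / target:.3f} nits")
```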

You really need an OLED or FALD to see appreciable HDR. Edge-lit screens can't really do it justice, as they are flashlighting from the sides. I have a 600-nit-capable non-FALD laptop, and bright areas in HDR look very pale/blown out unless I moderate the brightness to tone it down considerably. It still gets a bit more colorful on some light sources after doing that, but at the cost of the surrounding areas paling a little in contrast even after I tone it down. The HDR looks like crap if I leave it at the full 600-nit flare, which should be for obvious reasons if you think about it. If I want real HDR impact I'll output to my OLED TV, since the laptop has HDMI 2.1.

You really need zones (optimally a multitude of very, very tiny ones) or per-pixel emissive to do HDR even at 600 nits, but 800-1200-nit highlights are probably more of an acceptable minimum currently, in a compressed curve in a TV's firmware.

 
That, and I find having HDR enabled in Windows 10 to be plain awful on the desktop.

As elvn pointed out, a lot of it is probably the monitor, but one thing I'd also add is that only Windows 11 supports Microsoft's Windows HDR Calibration app (a separate app downloadable from the Microsoft Store), and that makes a pretty substantial difference in my opinion. So Win10's HDR support is definitely not as good, for whatever reason.
 

It requires an app? That's a shame.

I don't install apps, so I guess I will never use that feature.

I don't need Microsoft creeping into my phone too.

There's one way to guarantee that I will never use something. Make it require an app, or an account signin.

It doesn't really matter how "awesome" some feature is that is being offered at that point. If it requires an app or an account, or any kind of cloud integration, I'm out.
 

Uhhh no, Windows HDR doesn't actually require the app. And what do you even mean by Microsoft creeping into your phone? It isn't actually a mobile app, if that's what you're thinking; it's literally just a regular Windows program. And the only thing the "app" even does is let Windows know your display's HDR capabilities.

https://www.pcworld.com/article/1338717/how-to-use-microsoft-windows-hdr-calibration-app.html
 

Ahh. I heard the term "app" and assumed it was something that ran on my phone.

Since when do we call PC programs "apps"?

To me an app is a scaled down program with limited functionality that runs on a mobile device.
 

Ok yeah I can definitely see that now. I always thought of "app" as just short for "application" or something but not necessarily referring to mobile. Microsoft has an app store for Windows and that's where you would find the HDR Calibration Tool.
 

My bad. Makes sense now. Thanks for clarifying.
 
Maybe it's just because my Asus XG438Q's DisplayHDR 600 isn't the best HDR implementation out there, but when I have played games with good HDR support (like Cyberpunk 2077), my impression has been that it was a nice-to-have, but in the grand scheme of things didn't make a huge difference.
Definitely not the best HDR monitor out there, for sure. Eight edge-lit zones are not enough to do anything even if implemented optimally, and of course we do not live in a world where displays are made optimally, so we need better tech.

I did my HDR testing on OLED mostly with one game, Ghost of Tsushima, and for this game in particular HDR looks gorgeous whereas SDR looks normal and ordinary. After playing it in HDR it became obvious that "normal and ordinary" means lots of highlights getting a ridiculous amount of compression, compression which normally isn't really noticed when all you watch on your display is range-compressed content. Therefore, uncompressing it to reveal the high dynamic range is the new quality.

On LCD, without proper* workarounds to get a better contrast ratio, we cannot uncompress highlights without losing range near black, and this is arguably even worse than compressing highlights.

*) I do not consider FALD with 500 or 1000 zones a proper workaround. We need more, much much more.
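A quick sketch of that compression in numbers, using a generic Reinhard-style shoulder (purely illustrative, not any particular game's tone mapper): in a 100-nit SDR container a 1000-nit and a 4000-nit highlight end up only a few nits apart, i.e. visually flattened, while a 1000-nit container keeps them clearly separated.

```python
# Where do bright scene values land after being squeezed into containers with
# different peak luminance?  Generic Reinhard shoulder, purely illustrative.

def reinhard(scene_nits: float, peak_nits: float) -> float:
    """Soft-compress scene luminance so the output never exceeds peak_nits."""
    return peak_nits * scene_nits / (scene_nits + peak_nits)

if __name__ == "__main__":
    scene_values = (100, 1000, 4000)   # diffuse white, a bright lantern, sun glinting off metal
    for peak in (100, 600, 1000):
        mapped = ", ".join(f"{s} -> {reinhard(s, peak):6.1f}" for s in scene_values)
        print(f"{peak:4d}-nit container: {mapped} (nits)")
```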
 
Sorry for the "app" confusion - apps on Windows PCs started with Windows 8 (IIRC) when Microsoft launched their separate app store for Windows, but yeah, I meant (and should have specified) a PC app, which is distinct from a regular program, but it's always seemed weird to me they separate things out that way. I rarely use apps and prefer "regular" programs too, but as far as I know, that's not an option (though I don't know if a 3rd party option exists). Either way though, this app is small and unobtrusive, and since it does calibrate to the upper and lower limits for things like white level and black level for the monitor, it does tend to improve the HDR picture/experience. I'd imagine games without many HDR settings (Dead Space, for example, just has one screen to make sure it's not too dark) use it - seemed dead on at the defaults for me (to be fair, though, I didn't have the game before running that calibration). I don't know how it interacts with games like Cyberpunk with more specific settings though. I just know on both monitors I tried it with (the FALD I returned and now the OLED) it did improve HDR quite a bit; it's a shame it's not supported under Windows 10. (It's a lot like a game console's HDR calibration if you've ever run that.)
 
I just know on both monitors I tried it with (the FALD I returned and now the OLED) it did improve HDR quite a bit; it's a shame it's not supported under Windows 10.

Well, they have to get you to "upgrade" somehow :p

(It's a lot like a game console's HDR calibration if you've ever run that.)

Last console I used was one of these:

[image: NES console]

It did not have any HDR settings :p
 
Haha I just built a Lego one of those. It also does not have HDR =oP

Yeah, I totally think the HDR tool COULD run in Windows 10 - wouldn't surprise me if it is an 11 upgrade gimmick.
 
"App" is short for "application", so regular desktop application programs (applications are programs designed for end users) are still apps. When talking with someone about a Windows/desktop topic I assume "app" means a desktop app.

Hehe I'm in the strange position where ideally, I completely agree with you, but practically, these mean different things to me.

Since the Microsoft Store, if someone says "app" in relation to Windows, I pretty much assume they mean something from the Microsoft Store, whereas most people I know use the word "program" (or sometimes the full word "application", though it's relatively rare, at least in my case) for non-Store 32- or 64-bit software. This may depend on the person, though - that's just my anecdotal experience (I fix PCs for work).

At this point I'd never say "app" personally unless I meant something from the Microsoft Store (or a phone app, of course). Microsoft's kind of been moving in an Apple direction with these, where some new computers ship in "S Mode", in which third-party programs downloaded from the internet won't run (for "safety") but "apps" from the Microsoft Store do, and it's just odd. (That said, I know precisely nobody who hasn't turned off S Mode right away lol.)

Anyways, it's kind of weird that the "Windows HDR Calibration" is an app and not either a built-in OS feature (which would make the most sense IMO) or a downloadable program that's not an app.
 
Ahh. I heard the term "app" and assumed it was something that ran on my phone.

Since when do we call PC programs "apps"?

To me an app is a scaled down program with limited functionality that runs on a mobile device.
"App" is literally just short for "application" and has nothing to do with mobile. Just another word for "program". I think it's more likely to people talk about "mobile apps" or "iOS app" or "Android app" when they want to point to mobile.

I do think the HDR calibration tool should be part of Windows 11 though. I guess eventually it will be but they don't want to push it as something accessible from Display settings yet or something.
 
I spend a reasonable amount of time doing color work for film, but I'll also say it's as a freelancer. So I'll say "I'm an amateur", even though I have spent a lot of time understanding color pipelines and doing color work.

The very short version is: if you're watching anything that has been shown on TV or as a film and has had a professional colorist touch it, what you're describing does not happen.
I just assume you didn't even read what I said, had a speech prepared in your head, and jumped at the first inkling of an occasion to write it. It's a common discussion-board phenomenon: people will say that everything someone said is wrong without really reading what they are replying to, or checking whether they actually agree with it, because they have to write their train of thought quickly before they feel it might disappear; especially when the essay they have prepared is a reply to someone who actually did say something wrong.

From experience I also know that trying to argue with such a person only makes things worse. People assume all their past statements were valid and made with a lot of effort, and if they told you that you were wrong about something that was obviously true, they will keep assuming it was false, and the longer the discussion goes the more they think they are correct. It can get to the point where simple things get rewritten as their opposite in their head. Even if at first someone held the correct view, after a heated discussion with someone trying to convince them that true is true, they end up with a strong, emotionally charged conviction that true is indeed false and false is true. That's how the human psyche works, and it's better to avoid discussions with people who say everything you said is all wrong.

Or do you actually think color-conversion issues never ever happen, or that there are no people who calibrate their displays to sRGB and think Rec.709 movies look the way they do because it's all artist intent?
 
There are two parts to what you said.

User error and “doing your own thing”.

As far as calibrating your display to sRGB, I would say it's impossible to say with 100% certainty what 100% of people will do. However, if you've bought, as an example, an X-Rite colorimeter, it leads you down the "correct" path without your having to have much knowledge.

I find it unlikely that there are people with an interest in display technology or film who spend $200+ on a colorimeter, actively choose the wrong profiles, and have never read or figured out even the most basic information about the color spaces of what they're watching. This is especially true in light of the fact that sites such as RTINGS constantly talk about the color space accuracy of TVs/displays and name the color spaces they're striving for accuracy in.

I also disagree with the idea that because it's possible for people to do things incorrectly, they should then simply do whatever they want rather than strive for accuracy. Though I covered that above. To retread: people of course can choose to do whatever they want, like using an EQ. But I would say that's suboptimal.
 
The best part of this is that the original HDR version of the hole-in-rocks image looks fantastic with HDR enabled on my OLED, as has virtually every image you've used to try to "own" OLED. No, I don't have a FALD right next to it to compare, and I'm sure if I did, the FALD would be brighter, but looking at just my OLED in isolation, it looks great to me - vibrant colors, plenty of pop, plenty of detail, and as bright as I would want. If that's tone mapping, works for me! It looks MUCH closer to the image on the left of your example comparison in HDR than it does the one on the right. *shrugs*

On the other hand, with the UCG, I saw a LOT of blooming, especially in desktop use. The only remedy would have been turning off local dimming, which of course would have resulted in much poorer contrast on the desktop than OLED can do. Blooming is very real and it's not a serious argument to say you'll never see it on FALD, because you absolutely will in certain content. Similarly, things like pinpoint starfields will get crushed somewhat on FALD. Both these undesirable things are resolved on OLED, with HDR brightness being the tradeoff, and that's still a tradeoff I'll easily take at the moment based on my usage, especially when I have yet to be disappointed by anything I've tried in HDR, including your images.
Then what's the 4th image? You think sRGB at 80 nits can look close to the 4th image?

OLED not only looks worse in HDR, it looks worse in SDR, as it can only display sRGB.


[attached comparison image]
 
Just to touch on a few recent discussions, I'm also one of the "close to source" people, but I have no issue if people prefer wider gamut, just as I know some people prefer cooler color temperatures or more vivid modes than calibrated displays sometimes offer. My only contention is when people claim their way to be the only way to watch. We all have preferences/compromises in various areas, and that's okay as long as we enjoy it. I myself prefer slightly brighter than reference when it comes to SDR and prefer a gamma of 2.2, even for Rec.709 material.

Personally I do not like displaying things in a wider color gamut as the colors being thrown off bothers me, even if it appears to look nice. For example, I played around with a wider gamut in game when I first got the monitor, and oranges were downright reddish, etc. Did the saturation look nice on a certain level? It did, but I knew things weren't supposed to be that way, and it looked a lot more natural to me in sRGB.

Most of the time, it doesn't seem that difficult to determine "creator intent" for a source. As far as SDR goes, virtually all TV shows/movies are Rec.709 as far as I'm aware (which is fairly equivalent to sRGB). As far as HDR, it depends on the format, but those are all wide-gamut. Games are the same story - virtually all SDR games are intended for sRGB and HDR for a wide gamut.

As to input lag, I'm fortunate in that I can't really detect any difference between the calibrated sRGB mode and Gamer mode as long as VRR is on. (In HDR, Gamer mode works, so it's a moot point.) There is a DAS feature that is disabled on modes other than Gamer, but it doesn't appear to make any appreciable difference. Even if there was a small one, it'd be worth it to me, though, as I mainly play immersive games and don't do anything competitive, etc., so as long as input lag is reasonably small, it's a non-issue. Though it seems extremely good on the monitor I'm using at the moment.
 
Ah OK, not sure what else is missing then. Here's how bad AutoHDR used to look. Clip city!

[screenshot: AutoHDR clipping]

It's still not perfect today but it's much better than it was back when it first launched.

This is not a problem with AutoHDR. AutoHDR is automatically matched to the dynamic range; it will map the sky to max brightness. AutoHDR doesn't know about ABL. The sky will not be clipped if the monitor can hold its max brightness.
Your X27's max brightness is 1200 nits, so that should be a 1200-nit sky, but the X27 can only hold a 10% window before dropping to 900 nits at 20%, so you see 900-nit clipping.
 
Then what's the 4th image? You think sRGB at 80 nits can look close to the 4th image?

OLED not only looks worse in HDR, it looks worse in SDR, as it can only display sRGB.

I think an HDR game looks best in native HDR. As I've said, none of your images have any problem displaying very nicely on my OLED in HDR, nor do games - they look fantastic. Apart from that, I'm not sure what you're trying to get at. I believe the Horizon game you're showcasing has native HDR, so I'd just use that if I played it. As far as the images you've posted, they're apparently custom ones you've made? Custom images you make yourself don't mean a whole lot without context.

OLED can do wide gamut just like any modern monitor, including in SDR, so I have no idea where you're getting the "can only display sRGB" assertion, since that's quite obviously false, and you already know this. I just don't use the wider gamut as I want to see SDR in sRGB for the accuracy, as I've explained ad nauseam. That's my preference, yes I think it looks better and more accurate, and that's not going to change.
 
It's like a pseudo-SDR effect with overblown saturated colors, and I have a pretty good HDR grading display.
I don't know if you only calibrate sRGB or you calibrate all of sRGB, Adobe RGB, and DCI-P3. If the color is overblown it means the brightness is not enough. And you need to calibrate wide-gamut SDR as well, to prevent it from reaching beyond Rec.2020, which is designed for 10-bit 1000 nits.
Grading is putting content into a different color space with a different gamma. The transition between different color spaces is smart enough that it won't look off unless it's not calibrated.
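On the "10-bit, 1000 nits" point: HDR10 signals store luminance with the SMPTE ST 2084 (PQ) curve, which places code values on an absolute scale up to 10,000 nits. A small sketch of the encode side is below; the constants are the spec values as I remember them, so double-check before relying on this.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 0..1 signal,
# then quantised to a full-range 10-bit code value for illustration
# (real video streams typically use limited range instead).
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Luminance in cd/m^2 (0..10000) -> PQ signal value in 0..1."""
    y = max(nits, 0.0) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1 + C3 * ym)) ** M2

if __name__ == "__main__":
    for nits in (0.1, 100, 1000, 4000, 10000):
        signal = pq_encode(nits)
        print(f"{nits:8.1f} nits -> PQ {signal:.3f} -> 10-bit code ~{round(signal * 1023)}")
```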
 
I think an HDR game looks best in native HDR. As I've said, none of your images have any problem displaying very nicely on my OLED in HDR, nor do games - they look fantastic. Apart from that, I'm not sure what you're trying to get at. I believe the Horizon game you're showcasing has native HDR, so I'd just use that if I played it. As far as the images you've posted, they're apparently custom ones you've made? Custom images you make yourself don't mean a whole lot without context.

OLED can do wide gamut just like any modern monitor, including in SDR, so I have no idea where you're getting the "can only display sRGB" assertion, since that's quite obviously false, and you already know this. I just don't use the wider gamut as I want to see SDR in sRGB for the accuracy, as I've explained ad nauseam. That's my preference, yes I think it looks better and more accurate, and that's not going to change.
Then what's the 4th image? Your OLED cannot even show native HDR where it shoots up to 2000+ nits. You can see a 300-nit sun.

It's not like I cannot see sRGB. I can see much better than that in SDR, so why settle for just sRGB?
 
This is not a problem with AutoHDR. AutoHDR is automatically matched to the dynamic range; it will map the sky to max brightness. AutoHDR doesn't know about ABL. The sky will not be clipped if the monitor can hold its max brightness.
Your X27's max brightness is 1200 nits, so that should be a 1200-nit sky, but the X27 can only hold a 10% window before dropping to 900 nits at 20%, so you see 900-nit clipping.

No, it is a problem with AutoHDR and not ABL. An ABL issue would mean I'd have at least seen the detail first, before ABL kicked in and clipped it, but I never saw the detail to begin with. The moment I started panning towards the clouds they already had zero detail; it was clipped right from the start, and not because of ABL. That's because by default AutoHDR was using a 2000-nit curve, so it was setting the highlights up to 2000 nits, which the X27 is not capable of displaying, period. Back then, maybe even today, Windows did not know the capability of your HDR monitor, so it would just set it to a 2000-nit peak, and that's why everything looked awful. The way to get around this was to use CRU, but now we have the calibration tool, so there is no need for that method anymore.
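To illustrate the point being argued: if the tone curve targets a peak the panel can't reach, every highlight between the panel's real limit and that target collapses to the same flat white, which is the "clip city" look. A toy sketch (illustrative only, nothing like Windows' actual AutoHDR code):

```python
# Why targeting a 2000-nit curve on a ~1000-nit-class panel clips: every
# highlight placed above the panel's limit becomes the same flat white.
# Toy illustration only, not Windows' actual AutoHDR implementation.

def display_output(requested_nits: float, panel_peak: float) -> float:
    """What the panel actually shows for a requested luminance."""
    return min(requested_nits, panel_peak)

if __name__ == "__main__":
    cloud_highlights = (900, 1100, 1400, 1800, 2000)   # detail a 2000-nit curve spreads out
    for panel_peak in (1000, 2000):
        shown = [display_output(h, panel_peak) for h in cloud_highlights]
        print(f"panel peak {panel_peak:4d} nits: {cloud_highlights} -> {shown} "
              f"({len(set(shown))} distinct levels survive)")
```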
 
MicroLED in monitor format is probably like 10 years away lol.

Still using DP 1.4 as the premier connectivity, from 2016.

PC users really get shafted with monitor tech; it's way, way behind TVs.
Monitors are much more advanced than TVs. TVs are not accurate. The world's first HDR1000 monitor with a QD layer came out in 2018; back then no TV could do those specs. The QD layer is already on monitors. You cannot do anything serious on TVs.
 
No, it is a problem with AutoHDR and not ABL. An ABL issue would mean I'd have at least seen the detail first, before ABL kicked in and clipped it, but I never saw the detail to begin with. The moment I started panning towards the clouds they already had zero detail; it was clipped right from the start, and not because of ABL. That's because by default AutoHDR was using a 2000-nit curve, so it was setting the highlights up to 2000 nits, which the X27 is not capable of displaying, period. Back then, maybe even today, Windows did not know the capability of your HDR monitor, so it would just set it to a 2000-nit peak, and that's why everything looked awful. The way to get around this was to use CRU, but now we have the calibration tool, so there is no need for that method anymore.
Have you checked that it's supposed to be 2000 nits? If you capture the heatmap, the max brightness is exactly the brightness of your display. AutoHDR corresponds to the dynamic range of the monitor, not some sudden 2000 nits beyond its specs.

And Windows knows very well how much brightness your display has, just like the details the VESA DisplayHDR Test tool reads from the firmware.
 

Windows may know today, but I'm talking about BACK THEN. Look at the date of the original post; this was a known issue where Windows WILL default to a really high peak brightness if it detects no HDR certification. You literally said it yourself: AutoHDR works with the dynamic range of the monitor, so what if AutoHDR has no idea what the dynamic range of the monitor is? Then it goes to 1500 nits. I thought it was 2000, but it seems 1500 is the default value.

https://www.techpowerup.com/forums/threads/hdr-peak-brightness.288568/

And again, the only way to fix this back then was to use CRU:

"Use custom resolution utility, open the CTA-861 extension block and look for an HDR Static Metadata block. If there isn't one then your monitor isn't telling Windows through EDID what the max/avg/min luminance is in which case it will default to 1499 nits. You can actually add metadata manually to trick windows into showing custom luminance values. I believe it uses this to scale auto-hdr, though HDR isn't really worth using on that monitor."

So yes it IS/WAS an AutoHDR problem and not an X27 ABL issue.
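Put differently, the behaviour described above boils down to something like the sketch below; this is a paraphrase of the quoted workaround and the calibration tool's role, with hypothetical names of my own, not actual Windows source.

```python
from typing import Optional

# Paraphrase of the behaviour described above (NOT actual Windows code): the
# AutoHDR curve is scaled to whatever peak luminance Windows believes the
# display has, and with no HDR static metadata or calibration profile it
# falls back to an unrealistically high default.

FALLBACK_PEAK_NITS = 1499.0   # the default mentioned in the quoted post

def autohdr_peak_target(edid_max_luminance: Optional[float],
                        calibration_peak: Optional[float]) -> float:
    """Pick the peak the highlights get stretched toward (hypothetical helper)."""
    if calibration_peak is not None:      # Windows 11 HDR Calibration app profile
        return calibration_peak           # (assumed to take precedence, for illustration)
    if edid_max_luminance is not None:    # HDR Static Metadata block in the EDID / CTA-861 extension
        return edid_max_luminance
    return FALLBACK_PEAK_NITS             # nothing reported: assume a very bright display

if __name__ == "__main__":
    print(autohdr_peak_target(None, None))       # 1499.0 -> clipped highlights on most panels
    print(autohdr_peak_target(1200.0, None))     # EDID metadata added via CRU, as in the quote
    print(autohdr_peak_target(1200.0, 1000.0))   # calibration app supplies measured values
```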
 
Then what's the 4th image? Your OLED cannot even show native HDR where it shoots up to 2000+ nits. You can see a 300-nit sun.

It's not like I cannot see sRGB. I can see much better than that in SDR, so why settle for just sRGB?

We're never going to agree on this. I have no idea how you generated the 4th image, nor do I care about images you grade yourself - I care about playing games in HDR, and those look great so far. Most games will tone-map to the monitor's capabilities, making this a non-issue. As has been said several times, yes, higher brightness capabilities would be even better, but given that's not a reality, and I prefer the OLED over FALD for 80%+ of what I do since I work mostly in SDR, I think the HDR capabilities it does have are great. Do they match FALD? No. Are they good enough to play games and have an enjoyable HDR experience? Absolutely, and that'll only get better as technology advances.

I view sRGB as sRGB because it's the most natural/accurate way and looks better to me because of those factors (I've tried wider gamut and I've tried AutoHDR; both look less natural to me, regardless of monitor - others may like it and that's cool, but it's not for me). Your "better" is simply NOT better to me, my preferences, and my valuations. What looks better to me is better to me, and what looks better to you is better to you - unless we're comparing objective things like accuracy, that's all subjective.
 