Oh nice, how's the Innocn treating ya?
I'm really enjoying it! Especially for the price; at $850 it was an absolute no-brainer. Sure, it's not perfect, but the only other 1152-zone FALD monitors out there cost well over $2000. I'm more than happy with this. My old X27 does some things better, but overall the Innocn feels like an upgrade.
and everything I've tried it with looked weird.
When was the last time you tried it? And do you use the HDR Calibration app? Because without that, Windows doesn't know what your display's capabilities are and it defaults to a 2000-nit value, which looks like absolute dogshit on most displays. Both my Acer X27 and my LG CX with HGiG enabled had everything become clip city because they just could not display detail at that high of a brightness.
~A month ago. Yes. It says mine's 1500, Hisense says 1000...
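If you're wondering why a wrong peak assumption wrecks things so badly, here's a minimal sketch of the hard-clip case. It's not Windows' actual tone-mapping pipeline, and the highlight values are made up, but it shows what happens when the OS targets 2000 nits on a ~1000-nit panel:

```python
import numpy as np

ASSUMED_PEAK = 2000.0   # what Windows falls back to without calibration data
PANEL_PEAK = 1000.0     # what the panel can actually reach (assumed here)

# Hypothetical highlight detail in a bright sky, mastered between 900 nits
# and the assumed peak
highlights = np.linspace(900.0, ASSUMED_PEAK, 12)

# The panel hard-clips anything it cannot reach
displayed = np.minimum(highlights, PANEL_PEAK)

for target, shown in zip(highlights, displayed):
    marker = "  <- clipped" if target > PANEL_PEAK else ""
    print(f"mastered {target:7.1f} nits -> displayed {shown:7.1f} nits{marker}")

# Every step above 1000 nits prints the same 1000.0: distinct cloud detail
# collapses into one flat white value, i.e. "clip city".
```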
I too do a lot of HDR on my PS5 and like how standardized it is with a good calibration. It's rare I've had a poor HDR experience.
On PC it's a lot more hit or miss. It's not rare for a game to ship with completely broken HDR that only gets fixed much later. It is nice that things like Auto HDR have been evolving and improving, even though I still don't personally use it, and some of the mods like Special K are really neat.
I too really appreciate great HDR, but it still needs a lot of work as far as standardization, quality control, good adjustment across a variety of displays, and ease of use for people who aren't really technically savvy. Some games have all of these and others seem to be pretty lacking. I know quite a few people who aren't very interested in HDR purely because it's so confusing and unintuitive to them. Consoles are probably their best bet so far, but even then you have to know how to use different TV modes, etc., which can turn off the set-it-and-forget-it crowd. It's all getting better, but it's still a juggling act between SDR, HDR, and the various standards (HDR10, HDR10+, Dolby Vision, HGiG).
I think HDR is great and appreciate it when the implementation is there, but it does have quite a ways to go before it's mature. It's one of the reasons good SDR is my primary consideration so far, but I am excited about the advancements in panel technology allowing for brighter HDR down the line (hopefully with both brightness and perfect contrast eventually).
HDR gaming is the kind of difference which I would describe as "new quality in gaming".

And I'm also of the opinion that properly mastered HDR content is better than SDR. I'm an avid gamer, and playing games that support HDR on a display capable of it in a decent way is fantastic. Leaps and bounds over the old 2012 1080p Sony TV I had before.
Issue with keeping content "close to source" is that you might get something which is inaccurate to begin with and just assume it was the creator's intent.

But saying HDR is objectively better in all situations is what is being rammed down our throats here, and it just isn't the case. Some may prefer the attempts to convert SDR into HDR, with varying levels of success, or simply stretching it out and oversaturating everything. I'd rather keep content as close to source as possible, but it's all personal preference and opinion. Something we're not allowed to have, apparently.
HDR gaming is the kind of difference which I would describe as "new quality in gaming".
Much more impactful than e.g. increasing resolution or even adding new effects.
I spend a reasonable amount of time doing color work for film, but I'll also say it's as a freelancer. So I'll say "I'm an amateur", even though I have spent a lot of time understanding color pipelines and doing color work.

Issue with keeping content "close to source" is that you might get something which is inaccurate to begin with and just assume it was the creator's intent.
Then even if there was some intent, it might not make any sense and just distract you. Often the only benefit of not bothering with tweaking is the effort spared on tweaking...
...but wait!
How do you know your settings are correct if you just assume that what you get is correct and that someone intended it to look the way it does?
Most people here do not really know that movies are not made for sRGB, and assume that if they calibrate, or set gamma to 2.2, then they see the correct image. They do not.
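For anyone curious what that gamma point means in practice: Rec.709 material assumes the BT.1886 EOTF, which on a display with a non-zero black level diverges from a flat 2.2 power curve, most visibly in the shadows. A minimal sketch, assuming an illustrative 100-nit white and 0.1-nit black (the constants are from the ITU-R BT.1886 definition):

```python
import numpy as np

Lw, Lb = 100.0, 0.1   # white and black luminance in nits (illustrative)

# BT.1886 EOTF constants, per the ITU-R BT.1886 definition
gamma = 2.4
a = (Lw ** (1 / gamma) - Lb ** (1 / gamma)) ** gamma
b = Lb ** (1 / gamma) / (Lw ** (1 / gamma) - Lb ** (1 / gamma))

V = np.linspace(0.0, 1.0, 11)            # normalized video signal
bt1886 = a * np.maximum(V + b, 0.0) ** gamma
pure22 = Lw * V ** 2.2                   # what a flat "gamma 2.2" setting shows

for v, ref, shown in zip(V, bt1886, pure22):
    print(f"signal {v:.1f}: BT.1886 {ref:8.3f} nits vs gamma 2.2 {shown:8.3f} nits")
# The two curves meet at black and white but disagree in between, most of
# all in the shadows, which is exactly the detail people argue about.
```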
IMHO it's best to forego the illusion of "close to source" and just use personal preference. If you do, you will likely find that for the vast majority of content your tastes line up with the specs (and even more than what you might think the specs are now!), and that, unfortunately, not all content does.
Anyhoo, when it comes to wide-gamut SDR, it depends on the game and the monitor. On my IPS, wide gamut is nicer than on OLED, but both are somewhat similar in that they roughly preserve hues and just oversaturate things. In that case some games look quite nice with the wider gamut. In other cases just reducing saturation might make things look better; of course, there it's best to use an sRGB mode, but unfortunately on some displays that results in more input lag. If I had to choose between even one frame of input lag and slightly less accurate colors in a game, I would choose the latter, simply because responsiveness is more important and colors in games are usually not very realistic to begin with.
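To make the "roughly preserve hues and just oversaturate" behavior concrete, here's an illustrative sketch. The matrices are the standard sRGB and Display P3 (D65) primary conversions; the comparison shows what happens when sRGB values are sent to a wide-gamut panel unconverted:

```python
import numpy as np

# Linear RGB -> XYZ matrices for the standard sRGB and Display P3 (D65) primaries
SRGB_TO_XYZ = np.array([
    [0.4123908, 0.3575843, 0.1804808],
    [0.2126390, 0.7151687, 0.0721923],
    [0.0193308, 0.1191948, 0.9505322],
])
P3_TO_XYZ = np.array([
    [0.4865709, 0.2656677, 0.1982173],
    [0.2289746, 0.6917385, 0.0792869],
    [0.0000000, 0.0451134, 1.0439444],
])

def srgb_in_p3(rgb_linear):
    """Correct handling: re-express linear sRGB in the P3 primaries."""
    return np.linalg.solve(P3_TO_XYZ, SRGB_TO_XYZ @ rgb_linear)

pure_red = np.array([1.0, 0.0, 0.0])   # linear sRGB red

print("converted:  ", srgb_in_p3(pure_red).round(4))   # ~[0.8225, 0.0332, 0.0171]
print("unconverted:", pure_red)
# The converted value pulls red inward toward where sRGB red actually sits;
# sending [1, 0, 0] straight through instead lights up the panel's far more
# saturated red primary -- same hue direction, overcooked saturation.
```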
The XG438Q’s HDR performance is also a little disappointing. Positively, using this screen’s HDR modes ramps up the brightness to 650 nits, which surpasses VESA DisplayHDR 600’s requirements. However, in HDR modes the panel’s black level sat at 0.32 nits.
Those results meant that the Asus produced a contrast level of 2030:1 in HDR mode.
That, and I find having HDR enabled in Windows 10 to be plain awful on the desktop.
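(For reference, the contrast figure in that excerpt is just the measured peak divided by the measured black level:)

```python
peak_nits, black_nits = 650.0, 0.32   # HDR peak and black level from the review
print(f"{peak_nits / black_nits:.0f}:1")   # -> 2031:1, reported as roughly 2030:1
```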
As elvn pointed out, a lot of it is probably the monitor, but one thing I'd also add is that only Windows 11 supports Microsoft's Windows HDR Calibration app (a separate app downloadable from the Microsoft Store), and that makes a pretty substantial difference in my opinion. So Win10's HDR support is definitely not as good, for whatever reason.
It requires an app? That's a shame.
I don't install apps, so I guess I will never use that feature.
I don't need Microsoft creeping into my phone too.
There's one way to guarantee that I will never use something: make it require an app or an account sign-in.
At that point it doesn't really matter how "awesome" the feature on offer is. If it requires an app or an account, or any kind of cloud integration, I'm out.
Uhhh, no, Windows HDR doesn't actually require the app. And what do you even mean by Microsoft creeping into your phone? It isn't actually a mobile app, if that's what you're thinking; it's literally just a regular Windows program. And the only thing the "app" even does is let Windows know your display's HDR capabilities.
https://www.pcworld.com/article/1338717/how-to-use-microsoft-windows-hdr-calibration-app.html
Ahh. I heard the term "app" and assumed it was something that ran on my phone.
Since when do we call PC programs "apps"?
To me an app is a scaled down program with limited functionality that runs on a mobile device.
Ok yeah I can definitely see that now. I always thought of "app" as just short for "application" or something but not necessarily referring to mobile. Microsoft has an app store for Windows and that's where you would find the HDR Calibration Tool.
Definitely not the best HDR monitor out there, for sure. Eight edge-lit zones is not enough to do anything even when driven optimally, and of course we do not live in a world where displays are made optimally, so we need better tech.

Maybe it's just because my Asus XG438Q's DisplayHDR 600 isn't the best HDR implementation out there, but when I have played games with good HDR support (like Cyberpunk 2077) my impression has been that it was a nice-to-have, but in the grand scheme of things didn't make a huge difference.
I just know on both monitors I tried it with (the FALD I returned and now the OLED) it did improve HDR quite a bit; it's a shame it's not supported under Windows 10.
(It's a lot like a game console's HDR calibration if you've ever run that.)
Sorry for the "app" confusion
Since when do we call PC programs "apps"?
"App" is short for "application", so regular desktop application programs (applications are programs designed for end users) are still apps. When talking with someone about a Windows/desktop topic, I assume "app" means desktop app.
"App" is literally just short for "application" and has nothing to do with mobile. Just another word for "program". I think it's more likely to people talk about "mobile apps" or "iOS app" or "Android app" when they want to point to mobile.Ahh. I heard the term "app" and assumed it was something that ran on my phone.
Since when do we call PC programs "apps"?
To me an app is a scaled down program with limited functionality that runs on a mobile device.
I just assume you didn't even read what I said and had a speech prepared in your head, jumping at the first inkling of an occasion to write it. It's a common discussion-board phenomenon: people will declare that everything someone said is all wrong without reading what they are replying to, or checking whether they even agree with it, because they have to get their train of thought out quickly before it disappears, especially if the essay they have prepared is a reply to someone who said something actually wrong.

I spend a reasonable amount of time doing color work for film, but I'll also say it's as a freelancer. So I'll say "I'm an amateur", even though I have spent a lot of time understanding color pipelines and doing color work.
The very short version is: if you're watching anything that has been shown on TV or in film and has had a professional colorist touch it, what you're describing does not happen.
There are two parts to what you said.

I just assume you didn't even read what I said and had a speech prepared in your head, jumping at the first inkling of an occasion to write it. It's a common discussion-board phenomenon: people will declare that everything someone said is all wrong without reading what they are replying to, or checking whether they even agree with it, because they have to get their train of thought out quickly before it disappears, especially if the essay they have prepared is a reply to someone who said something actually wrong.
From experience I also know that trying to argue with such a person only makes things worse. People assume all their past positions were valid, arrived at with great effort, and if they said you were wrong about something that was obviously true, they will keep assuming it was false; the longer the discussion goes, the more convinced they become that they are correct. It can get to the point where simple things are rewritten as their opposite in their head: even if someone started out with the correct view, a heated discussion with someone trying to convince them that true is true can leave them with a strong, emotionally charged certainty that true is false and false is true. That's how the human psyche works, and it's better to avoid discussions with people who say everything you said is all wrong.
Or do you actually think color-conversion issues never happen, or that there are no people who calibrate their displays to sRGB and think Rec.709 movies look the way they do because it's all artist intent?
Then what's the 4th image? You think sRGB 80 nits can look close to the 4th image?

The best part of this is that the original HDR version of the hole-in-the-rocks image looks fantastic with HDR enabled on my OLED, as has virtually every image you've used to try to "own" OLED. No, I don't have a FALD right next to it to compare, and I'm sure if I did, the FALD would be brighter, but looking at just my OLED in isolation, it looks great to me: vibrant colors, plenty of pop, plenty of detail, and as bright as I would want. If that's tone mapping, works for me! It looks MUCH closer to the image on the left of your example comparison in HDR than it does the one on the right. *shrugs*
On the other hand, with the UCG, I saw a LOT of blooming, especially in desktop use. The only remedy would have been turning off local dimming, which of course would have resulted in much poorer contrast on the desktop than OLED can do. Blooming is very real and it's not a serious argument to say you'll never see it on FALD, because you absolutely will in certain content. Similarly, things like pinpoint starfields will get crushed somewhat on FALD. Both these undesirable things are resolved on OLED, with HDR brightness being the tradeoff, and that's still a tradeoff I'll easily take at the moment based on my usage, especially when I have yet to be disappointed by anything I've tried in HDR, including your images.
Ah ok not sure what else is missing then. Here's how bad AutoHDR used to look. Clip city!
[screenshot: AutoHDR clipping example]
It's still not perfect today but it's much better than it was back when it first launched.
Then what's the 4th image? You think sRGB 80 nits can look close to the 4th image?
OLED not only looks worse in HDR, it looks worse in SDR, as it can only display sRGB.
I don't know if you only calibrate sRGB or you calibrate all of sRGB, Adobe RGB, and DCI-P3. If the color is overblown it means the brightness is not enough. And you need to calibrate wide-gamut SDR as well to prevent it reaching beyond Rec.2020, which is designed for 10-bit 1000 nits.

It's like a pseudo-SDR effect with overblown saturated colors, and I have a pretty good HDR grading display.
Then what's the 4th image? Your OLED cannot even display native HDR where it shoots up to 2000+ nits. You can see a 300-nit sun.

I think an HDR game looks best in native HDR. As I've said, none of your images have any problem displaying very nicely on my OLED in HDR, nor do games; they look fantastic. Apart from that, I'm not sure what you're trying to get at. I believe the Horizon game you're showcasing has native HDR, so I'd just use that if I played it. As far as the images you've posted, they're apparently custom ones you've made? Custom images you make yourself don't mean a whole lot without context.
OLED can do wide gamut just like any modern monitor, including in SDR, so I have no idea where you're getting the "can only display sRGB" assertion, since that's quite obviously false, and you already know this. I just don't use the wider gamut because I want to see SDR in sRGB for the accuracy, as I've explained ad nauseam. That's my preference, yes I think it looks better and more accurate, and that's not going to change.
This is not a problem with AutoHDR. AutoHDR automatically matches the dynamic range; it will assign max brightness to the sky. AutoHDR doesn't know about ABL. The sky would not be clipped if the monitor could hold its max brightness.
Your X27's max brightness is 1200 nits, so that should be a 1200-nit sky, but the X27 can only hold that in a 10% window before dropping to 900 nits at 20%, so you see 900-nit clipping.
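Whoever is right about the cause here, the ABL mechanism itself is easy to sketch: the sustainable peak depends on how much of the screen is bright. The 10% and 20% window numbers below come from this exchange; the full-field value is my assumption:

```python
import numpy as np

window_pct = np.array([0.10, 0.20, 1.00])     # fraction of screen that is bright
peak_nits  = np.array([1200.0, 900.0, 600.0]) # X27-like; full-field value assumed

def sustained_peak(window: float) -> float:
    """Interpolate the panel's sustainable peak for a given bright-window size."""
    return float(np.interp(window, window_pct, peak_nits))

# A 1200-nit sky filling 20% of the frame can only be driven to ~900 nits,
# so everything mastered between 900 and 1200 nits lands on the same level:
for w in (0.05, 0.10, 0.20, 0.50):
    print(f"bright window {w:4.0%}: panel sustains ~{sustained_peak(w):6.1f} nits")
```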
Monitors are much more advanced than TVs. TVs are not accurate. The world's first HDR1000 monitor with a QD layer came out in 2018; back then no TV could do those specs. The QD layer is already on monitors. You cannot do anything serious on TVs.

MicroLED in monitor format is probably like 10 years away lol.
Still using DP 1.4 as the premier connectivity from 2016.
PC users really get shafted with monitor tech; it's way, way behind TVs.
Have you checked that it's supposed to be 2000 nits? If you capture the heatmap, the max brightness is exactly the brightness of your display. AutoHDR corresponds to the dynamic range of the monitor, not some sudden 2000 nits out of spec.

No, it is a problem with AutoHDR and not ABL. An ABL issue would mean I'd have at least seen the detail first before ABL kicked in and clipped it, but I never saw the detail to begin with. The moment I started panning towards the clouds they already had zero detail; they were clipped right from the start, and not because of ABL. That's because by default AutoHDR was using a 2000-nit curve, so it was pushing highlights up to 2000 nits, which the X27 is simply not capable of displaying. Back then, and maybe even today, Windows did not know the capabilities of your HDR monitor, so it would just assume a 2000-nit peak, and that's why everything looked awful. The way to get around this was to use CRU, but now we have the calibration tool, so there is no need for that method anymore.
Have you checked that it's supposed to be 2000 nits? If you capture the heatmap, the max brightness is exactly the brightness of your display. AutoHDR corresponds to the dynamic range of the monitor, not some sudden 2000 nits out of spec.
And Windows knows very well how much brightness your display has, just like how VESA's DisplayHDR Test tool reads the details from the firmware.
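This is actually testable either way, because HDR signals are PQ-encoded: every nit level maps to a fixed code value, which is what heatmap capture tools read back. A minimal sketch using the ST 2084 constants, with the nit levels that came up in this thread:

```python
# ST 2084 (PQ) inverse EOTF: luminance in nits -> normalized signal value.
# The constants are from the SMPTE ST 2084 definition.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = nits / 10000.0                  # PQ is referenced to 10,000 nits
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

for nits in (300, 900, 1000, 1200, 2000):
    v = pq_encode(nits)
    print(f"{nits:5d} nits -> PQ {v:.4f} (10-bit code {round(v * 1023)})")
# If the sky in a capture tops out at the code value for ~900 nits rather
# than 2000, that tells you what the pipeline actually sent.
```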
Then what's the 4th image? Your OLED cannot even display native HDR where it shoots up to 2000+ nits. You can see a 300-nit sun.
It's not like I cannot see sRGB. I can see much more in SDR, so why settle for just sRGB?