But which display can show pictures of Spider-Man?!
The truth:
PG35VQ HDR vs AW3423DW HDR
View attachment 547880
View attachment 547881
PG35VQ SDR vs AW3423DW HDR
View attachment 547882
View attachment 547883
That's why you can only see SDR on OLED.
Still would take ABL drawbacks over shit blacks, haloing and motion issues.
Yup, everything about OLED is dim, especially HDR. You should see my QN90B in HDR with FALD and Contrast Enhancer both on high - holy shit, it's jaw-dropping at full brightness, in all its glory. It's a sight to behold.
Any actual HDR image on OLED always looks this bad. With a little EOTF tweak, a FALD LCD can have as much black as OLED.
SDR on OLED is only 250 nits at the top, while SDR on a FALD LCD is 500-600 nits, similar to HDR 400. There is more pop between SDR sRGB 80 nits and HDR 400, but there isn't much difference between SDR Adobe 400 nits and HDR 400.
Why would I want SDR above 120 nits?
On a true HDR monitor, 400 nits SDR is basically HDR400 compared to the original sRGB 80 nits, while true HDR1000 is nothing a monitor like the AW3423DW is capable of.
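For a rough sense of scale, taking the 80, 400, and 1000 nit figures above at face value, the jump in peak white works out to:

\[
\frac{400\ \mathrm{nits}}{80\ \mathrm{nits}} = 5 \;\Rightarrow\; \log_2 5 \approx 2.3\ \text{stops},
\qquad
\frac{1000\ \mathrm{nits}}{80\ \mathrm{nits}} = 12.5 \;\Rightarrow\; \log_2 12.5 \approx 3.6\ \text{stops}.
\]

Peak white is only one side of the range; the black level a panel can actually hold sets the other.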
Perhaps the sun is shining on your monitor, or maybe you are on a crusade to save humanity from the evil OLED by showing them the divine light of the FALD LCD - halos and all.
SDR is SDR. It will never not be SDR. SDR on an HDR monitor shouldn't be any different from SDR on an SDR monitor. What is the difference between your imaginary "SDR Adobe 400nits" and SDR on a regular old-fashioned 400 nits capable SDR monitor?
It means every SDR image I see has at least 4x the range of the original sRGB 80 nits, which won't look anywhere near as good.
What?
These monitors can turn SDR into HDR.
Very true. No monitor can do that, since it is literally impossible: there is no hidden information stored in an SDR image that can be magically revealed or enhanced.
OLED cannot do this.
Do what?
OLED doesn't even have the Adobe color space. It only has DCI-P3.
Right... Which is a huge problem... Or maybe not...
If only you could see what I see with an actual HDR monitor that makes SDR similar to HDR. So the images are always 4x better than the original.
Yeah, if only... However, I am afraid that no one can see what you see.
Get a true HDR monitor first.
Perhaps I will consider it if you answer my questions. You would of course have to give me the definition of what a True HDR Monitor is first as well. Also, if your True HDR Monitor turns SDR into HDR, what does that make actual HDR viewed on this True HDR Monitor?
I don't do imagination.
That's funny, because it seems like it's all you do. If it's not imagination, then where do you obtain this magic Adobe SDR 400 nits material?
When the actual HDR1000 shows up, OLED is dwarfed by true HDR monitors with 3x-5x higher range side by side. This is the actual HDR, with 10-bit color (1024 shades) lit by at least 1000 nits.
In certain material, of course. It's a real shame that there is no True HDR Monitor that can do dark scenes without major artifacts.
It's more like you are the one doing imagination, because you don't even know that most HDR is graded from just sRGB 80 nits. The color can be stretched.
Your pictures are absolutely meaningless btw. They could just as well be two identical SDR monitors at different brightness.
What am I imagining? I am not the one coming up with ridiculous terms such as "Adobe SDR 400 nits". Please do give me a source on this HDR grading from "sRGB 80nits", just so I can see what on earth you're actually trying to say. Of course you won't provide any external sources for any of your claims about anything.
And you don't know SDR can have infinite contrast and zero black as well. That's why I said most people don't even realize that what they are seeing on OLED is in fact SDR.
I don't? That's news to me. But good thing there are all-knowing beings, such as yourself, that can enlighten me on these things.
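For reference, the "infinite contrast" wording is just the usual contrast-ratio definition taken to its limit: a panel that can hold a true zero black gives an unbounded ratio, while any nonzero black level gives a finite one.

\[
\mathrm{CR} = \frac{L_{\mathrm{white}}}{L_{\mathrm{black}}}, \qquad L_{\mathrm{black}} \to 0 \;\Rightarrow\; \mathrm{CR} \to \infty .
\]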
Imagination is a powerful tool.
I believe that you have in fact proven this, yes.
I like how you deny the pictures as if they are not true.
Where? I said they are meaningless. They are useless. They can't be used to prove anything.
You don't want me to turn the light on and show the stand of the AW3423DW as proof, because there is no polarizer on that OLED. It can look grey/red under the light.
I don't?
If you've bought the remastered UHD version of an old movie such as Jaws (1975), there is a section that specifically explains how the old Rec.709/sRGB footage was regraded to HDR1000 for better images than what the director had originally seen.
Sorry, but mapping SDR to HDR does not always work well or look good. It CAN, but sometimes (I'd even say often) it just doesn't, no matter the monitor tech. I tried Auto HDR (after Windows HDR Calibration) on a $3K, very well reviewed FALD LCD, and in some games things just did not look the way they were supposed to. For example, in one game (that doesn't natively support HDR), a cutscene looked much more washed out with Auto HDR than it did under SDR, where it looked a lot more natural. I've seen similar things on my FALD TV, which is great at HDR, but I always try to watch SDR as SDR because I think it legitimately looks better and more natural than trying to blow SDR out into an HDR image.
HDR has a lot more information than just being stretched to fit a certain amount of nits. It knows the sun is supposed to be X number of nits. And sparks or headlights a different value. A lot of games and applications also allow customizing the tone-mapping to your specific monitor so you can set the ceilings/floors and it sticks within that range. There's badly done native HDR, to be sure, but that just shows that HDR is very dependent on being mapped properly to look its best and having proper support for setting the appropriate limits on the display.
Also, it's just factually incorrect to say running a game in native HDR on an OLED is the same as an SDR image. It's simply not. While they look good in both modes, games like Dead Space or Cyberpunk 2077 look a world different in native HDR vs their SDR version, especially as far as the highlights. And it's something you just simply couldn't do in SDR. (Sure, you could crank up the brightness in SDR on a FALD, but then you're just asking for eyestrain - learned that with my FALD TV when it was set a little too bright initially for SDR.) There is no contest that higher brightness panels like FALD ones have more impactful HDR. It's a fact they can get closer to what HDR is graded as (1000 nits or what-have-you). It's also a fact they have their own issues, like blooming in certain situations or dimming certain things down too much because of reliance on FALD algorithms, just as OLED has certain limitations like ABL or the possibility of burn-in. But that does NOT mean OLEDs can't still have impactful, good-looking HDR that differs significantly from SDR, especially with modern tone-mapping.
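Since the post above mentions games that let you set tone-mapping ceilings/floors for your specific display, here is a minimal sketch of the general idea. It assumes nothing about how any particular game or Auto HDR actually does it; the extended-Reinhard curve and the function name are just illustrative.

Code:
# Illustrative sketch only - a generic "extended Reinhard" style roll-off, not the
# curve any particular game or Windows Auto HDR actually uses. The point is just
# that the same scene-referred value ends up at a different output level depending
# on the peak ("ceiling") you tell the tone mapper your display can reach.

def tonemap_nits(scene_nits: float,
                 content_max_nits: float = 10000.0,
                 display_peak_nits: float = 1000.0) -> float:
    """Compress scene-referred luminance (nits) into the display's range.

    The curve is built so that content_max_nits lands exactly on
    display_peak_nits; everything below it is compressed progressively less.
    """
    n = scene_nits / display_peak_nits        # input relative to the display ceiling
    w = content_max_nits / display_peak_nits  # value that should map to the ceiling
    out = n * (1.0 + n / (w * w)) / (1.0 + n) # extended Reinhard curve
    return out * display_peak_nits

# Same content values, two different display ceilings:
for peak in (400.0, 1000.0):
    mapped = [round(tonemap_nits(x, display_peak_nits=peak), 1)
              for x in (100.0, 1000.0, 4000.0, 10000.0)]
    print(f"{peak:.0f}-nit display:", mapped)

Swapping the ceiling changes where highlights land without touching the source data, which is roughly what per-display HDR calibration is adjusting.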
The Samsung QN900B has amazing peak brightness in SDR. Scenes with small areas of bright lights get extremely bright, and overall, it can easily overcome glare even in a well-lit room. Unfortunately, it has a fairly aggressive automatic brightness limiter (ABL), so scenes with large bright areas, like a hockey rink, are considerably dimmer.
Thanks to its decent Mini LED local dimming feature, blacks are deep and uniform in the dark, but there's some blooming around bright highlights and subtitles.
Local dimming results in blooming around bright objects.
Like most Samsung TVs, the local dimming feature behaves a bit differently in 'Game' mode. There's a bit less blooming, but it's mainly because blacks are raised across the entire screen, so the blooming effect isn't as noticeable.
The image inside sRGB 80nits never has a natural or realistic look. It is a compromise to the technology. sRGB 80nits looks lifeless now.
When a monitor can display extended range from sRGB 80nits to Adobe 400nits, it's already halfway decent as it is a higher range.
HDR needs both contrast and color. If AutoHDR looks washed out, either that cutscene isn't supported by AutoHDR or the monitor doesn't provide enough color when the brightness gets higher.
This video has 3 stages. The 1st is sRGB 80nits. The 2nd has only the boosted contrast but keeps the same sRGB color. The 3rd is HDR1000 with both boosted contrast and extended Rec.2020 color.
And there are already monitors that can do 1000nits SDR. If you apply the Rec.2020 colorspace, they look the same as HDR1000. That's why I said many average OLED images at 100nits with a few highlight dots are not exactly HDR. They are inside the range of 8-bit 400nits SDR.
Kramnelis is a zealot for lifting the whole standard scene's mids a lot more than the default, not just increasing the high end's highlights and light sources - which is his preference and opinion - so that might explain part of why he is especially aggressive about the lower range of OLED compared to, say, his 1500-nit ProArt FALD LCD. While I disagree with a lot of his opinions on that (as I said, unlike Kram, I don't think the whole scene needs to be lifted brighter, at least not "unnecessarily"), Kramnelis isn't wrong across the board by any means, at least as far as his base concept that brighter and brighter HDR peaks on the high end - like future 4000-nit and 10,000-nit HDR screens someday, with longer sustained periods - are a goal worth pursuing for more realistic and colorful images. It's just that doing it with (FALD) LCD panels is a huge tradeoff or downgrade of many facets of picture quality. OLED, even if not the most pure color at the high end, and even if not achieving the highest current ranges or sustained periods (though compared to HDR 4000 and HDR 10,000, even 1500 nits is low), is starting to climb to 1000-nit peak highlights and perhaps a bit over in the meantime. Heatsink tech, QD-OLED tech, and perhaps some other advancements may improve this over generations. There are tradeoffs either way, for sure. We do need much higher peaks eventually, especially for highlights and direct light sources. But for many of us, after seeing OLED, (FALD) LCD is not the way to do it. OLED needs to be replaced by a better per-pixel emissive display technology someday for sure, but it's not LCD.
Am I the only one who finds this an extremely annoying compromise? I'm not using OLED anywhere but my phone, but I've tried auto-hiding my taskbar before and just found losing what is basically a permanent HUD with valuable info on my desktop to be an anti-feature.
Huh? If something is made in SDR (whether TV, movie, or a game), it's generally mastered in sRGB and meant to be displayed in sRGB. If you're going for creator's intent, it's the most accurate. I'm also not sure where you're getting 80 nits as being optimal - that's pretty dim, especially for most home rooms. 100 is often the standard for SDR, even in a dark room... Based on personal preference and varying light conditions, I calibrate SDR to closer to 160 nits, as I like a bit of added brightness without pushing things over the top or making things too inaccurate (I don't mind things a bit brighter than reference). And I'm very pleased with how content looks. Yes, SDR to me looks more natural than Auto HDR. Auto HDR doesn't necessarily do a bad job - in some circumstances, it can look quite good. But since there's no manual grading or conversion, you're trusting an algorithm to figure out what's supposed to be bright vs dim and, more importantly, HOW bright, and that doesn't always work as expected. For me, the unintended effects aren't worth it - I'd rather see SDR as it was mastered to be seen. For those that prefer an HDR look, that's fine, but it's a personal preference, and has nothing to do with "lifeless"ness. It's the standard showing you what was always meant to be shown.
"Supported by Auto HDR" isn't so much a thing (if the game supports DirectX 11 or 12, it supports Auto HDR). It's going to try to convert any game content to HDR using an algorithm [that cutscene is done in-engine anyways, not video], and depending on that content, is going to have mixed results. By the way, just for giggles, I just fired it up on my OLED in both SDR and Auto HDR. The Auto HDR version looked quite good and much closer to SDR than when I saw it on the FALD monitor I had. Definitely not *as* washed out. BUT, SDR still looked slightly better and more contrasty. Colors looked good in both. Menu text and such were too bright in HDR. As I said, you're trusting an algorithm to do more than an algorithm can realistically do perfectly.
This is different than, say, when a movie is remastered for HDR (like the Jaws example earlier, I'd imagine, though to be fair, I haven't seen it, and I don't know all the details of the remastering). From what I can tell, it was graded and enhanced by PEOPLE, not by simply letting a piece of equipment expand the range. That's necessary and crucial if you want an accurate image without the anomalies that a technology like Auto HDR can add. Again, if you prefer the look of Auto HDR, there's not a thing wrong with that. But SDR definitely has its advantages and is the most accurate, unless something is hand-enhanced to HDR.
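Just to put numbers on the 80 / 100 / 160 nit talk above: SDR code values only become nits once a reference white is chosen, which is why the same content can be shown at different brightness without changing what's encoded. A minimal sketch, using the standard sRGB decode and a user-chosen white level; the function names and sample values are mine.

Code:
# Minimal sketch: decode an 8-bit sRGB grey level with the standard sRGB EOTF,
# then scale by a chosen reference-white luminance. The 80/100/160 nit values
# are just the levels discussed in this thread, not recommendations.

def srgb_to_linear(v: float) -> float:
    """Standard sRGB decode: encoded value in 0..1 -> linear light in 0..1."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def srgb_code_to_nits(code: int, reference_white_nits: float) -> float:
    """8-bit sRGB grey level -> luminance, for a given SDR white level."""
    return srgb_to_linear(code / 255.0) * reference_white_nits

for white in (80.0, 100.0, 160.0):   # SDR calibration targets mentioned in this thread
    print(f"white = {white:.0f} nits: code 255 -> {white:.0f} nits, "
          f"code 128 -> {srgb_code_to_nits(128, white):.1f} nits")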
The sRGB standard is made for all monitors. It's the lowest standard. Creator's intent has no commercial benefit if the image cannot be displayed, so they master the sRGB image in a way that all monitors can show it. But these sRGB 80nits images are never good. They are common and lifeless. These creators are never satisfied with sRGB 80nits. This is why HDR exists at a higher range. Now there are monitors capable of a higher range that let you see better images than what they intended within limited sRGB.
Windows AutoHDR is a test of a monitor's HDR capability, since it is done automatically. If a monitor is capable of accurate HDR, Windows AutoHDR will look good: it means the monitor has more color and more accurate EOTF tracking. If something is off, Windows AutoHDR will look bad, such as being washed out. Of course, AutoHDR is not as good as manual HDR, but compared to sRGB it is better on a competent HDR monitor.
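Since "EOTF tracking" keeps coming up: for HDR10-style signals the reference curve is the SMPTE ST 2084 (PQ) EOTF, and "tracking" just means a display's measured output follows it. Here is a small sketch of the reference curve itself; the function name and sample signal levels are mine, and a real check would compare these numbers against meter readings.

Code:
# SMPTE ST 2084 (PQ) EOTF: maps a normalized HDR10 signal level (0..1) to an
# absolute luminance in nits. "EOTF tracking" means a display's measured output
# stays close to this reference curve.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(signal: float) -> float:
    """ST 2084 EOTF: normalized signal (0..1) -> luminance in cd/m^2 (nits)."""
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

for s in (0.25, 0.50, 0.75, 1.00):   # arbitrary sample signal levels
    print(f"PQ signal {s:.2f} -> {pq_to_nits(s):.1f} nits")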
Don't know if it's that big of an issue. The auto-dimming to me is the bigger issue in terms of general productivity use outside of gaming.
I do notice the auto-dimming on my LG, but only very occasionally on full-white screens (such as the stock Google homepage or a blank page in a word processor). I'd estimate that even if a fifth of the page has some other color/content on it, it doesn't happen; it's only when the screen is almost all white. It's not so distracting that it really bothers me 98% of the time, and it's certainly less distracting than the blooming I found on the FALD IPS panels I tried, which was really distracting in productivity and day-to-day tasks. ABL certainly might be an issue if you're particularly sensitive to it, though. In the few circumstances it happens to me, I generally just notice it briefly and keep working. The picture, even with slightly limited brightness, still looks pretty good and just as readable, so it's not drastic enough that I feel it impacts my work at all.
You just downplay the importance of brightness as if it doesn't matter. If there is no brightness, then there is no color. Realistic images have much higher mids to showcase 10-bit color. A higher range is always better.
The best HDR monitor in the world will still look odd and have anomalies (menus are one example) with SDR content in Auto HDR. sRGB is still more accurate, and aside from that, myself and many others find it looks better and more natural to our eyes. It's perfectly fine if you prefer Auto HDR - many people do - but if you like accuracy, sRGB will get you closest.
Accuracy in limited sRGB doesn't mean it will give a better or more natural image. Accuracy is intended for mass distribution; it's more important for the creators. These creators might not have the intention or budget to implement HDR for better images. Even when they implement native HDR, the UI can still shine at max brightness, just like every game made by Supermassive. These anomalies are easy to fix.
Personally, I'll save HDR for content natively mastered in (or professionally converted to) HDR, and from what I've tested so far, I'm very happy with the few titles I've tried out on my OLED, even if the brightness can't get quite as high. I'm sure that in these titles a FALD monitor with higher brightness can look unquestionably even better, and I don't dispute that, but the many tradeoffs in other areas weren't worth it to me, especially when I get enough impact to satisfy me from the HDR picture on OLED.