OLED monitor news

I agree. Especially because plasma and CRT are gone too, and they also did other things right.
 
Uh... not trying to inflame anyone, but did you just say abuse regarding people who do nothing more than display images on their display, which is precisely its main function? It doesn't matter if you display noise 24/7, or blocks of black/white that never, ever move. Displays are built for displaying. So, no sort of displaying of images can possibly constitute abuse. That is literally their only purpose. Writing off some forms of displayed image as abuse tells me you're simply excusing a very real - however infrequent - grave flaw in the technology.

This kind of argument perplexes me. If you use any sort of consumer product outside of what is considered normal operating conditions and expected use (and they all come with them) -- yes, that is abuse.

I do nothing more than drive my car, which is its main function -- to be driven. To use your words, that is literally its only purpose. Are you really telling me that if I act like a speed demon, have a lead foot, slam the brakes, and hit the accelerator that the car should last 150,000 miles? I'd say good luck with that. People would probably tell me I'm a shitty driver, too.

Take your second point: 14% of image issues, whether 7% burn-in or 7% image retention. Even if we just think of the 7% burn-in, that's unacceptably high. If LG sells 1 million TVs, that's 70,000 people with burn-in. That's a lot of burn-in! For a product whose only function is to display images, having 7% of the production fail at its main and only function is really, really bad!

Right now, OLED is still too high-risk, because buying a product that has one main function, where 7% of the production is quite likely to fail at it, is a bad value proposition. There is definitely a luck component, too, because you don't see OLEDs advertised as "great for everything - unless you only watch cable news all day, in which case don't buy this!" They're displays. They must be able to display everything, perfectly, to the best of their ability, until they die. Burn-in is a very real situation that proves they fail at their main function, without any abuse of any sort.

Again, not trying to antagonize anyone, just wanted to point out that those 2 arguments, abuse and % of product failures, are simply unacceptable. At least, they are to me... and I'd think they should be to anyone forking over more than one thousand dollars for a display.

It's not 7% of the production that's a failure. It's 7% of people who engage in user error, because burn-in is entirely preventable based on how you use the display and what you use it for. It's not a panel lottery. We're not talking about backlight bleed here.
 
Uh... not trying to inflame anyone, but did you just say abuse regarding people who do nothing more than display images on their display, which is precisely its main function? It doesn't matter if you display noise 24/7, or blocks of black/white that never, ever move. Displays are built for displaying. So, no sort of displaying of images can possibly constitute abuse. That is literally their only purpose. Writing off some forms of displayed image as abuse tells me you're simply excusing a very real - however infrequent - grave flaw in the technology.

Nonsense. Almost every machine made by humans won't last as long pushing it to 100% of its capabilities versus a lower percentage. Running a bright red stationary CNN logo for 20 hours per day at 100% brightness for months on end is not a normal viewing experience. ALL display types can get burn-in if abused. I don't think I've ever seen an airport LCD that didn't have burn in.
 
This kind of argument perplexes me. If you use any sort of consumer product outside of what is considered normal operating conditions and expected use (and they all come with them) -- yes, that is abuse.

I do nothing more than drive my car, which is its main function -- to be driven. To use your words, that is literally its only purpose. Are you really telling me that if I act like a speed demon, have a lead foot, slam the brakes, and hit the accelerator that the car should last 150,000 miles? I'd say good luck with that. People would probably tell me I'm a shitty driver, too.

You'd make a very good point, if it weren't for the fact that these are very different scenarios. My car can go up to 140mph - or whatever, I've never really looked at the end of that dial. However, it was designed to drive between 10-100mph; anything out of that range would be pushing it.

TVs, including OLED, are designed to work within a spec; let's say the HDR spec. They are expected to display things properly, not abusively, within that spec. Displaying black and white values is certainly not out of spec.

Let's further specify: HDR1000. That means you should be able to display super bright whites for short periods of time; after that, the set is allowed to dim those highlights, because that much white time would be out of spec. If you're watching cable news 24/7, first of all you're watching it in SDR - is there any HDR cable news signal yet? But let's assume it's HDR. That white text is not going to be 1000 nits. It'll be 100 nits, purely within expected spec. Let's assume the signal is garbage and for some reason they code that text at 1000 nits. Your OLED is supposed to, within spec, hold it as long as it can at 1000 nits, then drop to whatever value it's comfortable with. That remains within spec. Going out of spec, away from what it was designed for, would be to display 1000-nit white text nonstop until it burns the pixel. It's not designed to do that, so it'd burn, understandably. That, however, is not how these panels work.

(I'm focusing on the HDR spec because it's more taxing than the SDR spec. Nothing should burn in on SDR because it doesn't push anything - we've been using it for the past 30 years, after all - and yet most TVs getting burn-in are doing so while displaying SDR. That's unacceptable.)
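To put rough numbers on that "bright briefly, then dim" behavior, here's a tiny sketch (my own illustrative numbers and logic, not LG's actual limiter):

# Toy model of time-limited peak brightness, illustrative only.
def allowed_nits(requested_nits, seconds_static, peak=1000, sustained=600, hold_seconds=5):
    # Brief or modest highlights are shown as requested, up to the panel's peak.
    if requested_nits <= sustained or seconds_static <= hold_seconds:
        return min(requested_nits, peak)
    # A bright element that sits still past the hold window gets dimmed to a safer sustained level.
    return sustained

print(allowed_nits(1000, 2))    # brief 1000-nit highlight -> 1000
print(allowed_nits(1000, 60))   # static 1000-nit text for a minute -> 600

Either way, the panel never lets a static element sit at peak indefinitely; the firmware keeps you inside the envelope.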

So. TVs behave within spec, and you should expect no burn-in while you're within spec, which is all the time, because you can't override that stuff on TVs (unless, maybe, you find the code to the service menus? In that case, that'd certainly constitute abuse, and if you get burn-in, you deserve it for going out of spec. If you are even able to). Going back to your car metaphor, driving at 140mph is out of spec, because the car wasn't designed to be driven at that speed. It can, but it wasn't supposed to. While cars allow you to abuse them, TVs don't. They're designed to be abuse-proof. OLEDs can't display something they weren't supposed to, because firmware won't let them, so they can't, effectively, be abused. It's as if your car had a limiter at 90mph so you can never go out of spec (which I wish was a thing, honestly, to eliminate d-bags from the road).

This means OLEDs work within spec, yet may suffer burn-in without any sort of abuse (it's not a guarantee, just a risk I'd avoid taking). I wanted to buy an OLED, I was going to buy an OLED, and then 6 reps in different stores in 3 states (AL, OH, IL) told me not to buy one because burn-in is a very real risk. And that comes from customers who returned their TVs due to the issue, and their own experience with store models that are playing videos (no permanent logos) and still had burn-in. Case in point: you know that famous LG OLED demo with the colored pencils? I've seen several of its frames burned in (I specifically remember the frame at 0:18 seconds in the video), and those are frames that are only displayed for 2 seconds or less, multiple times a day. That should not result in burn-in. And yet it does. This is the demo I'm talking about, check at 18 seconds for reference:



Nonsense. Almost every machine made by humans won't last as long pushing it to 100% of its capabilities versus a lower percentage. Running a bright red stationary CNN logo for 20 hours per day at 100% brightness for months on end is not a normal viewing experience. ALL display types can get burn-in if abused. I don't think I've ever seen an airport LCD that didn't have burn in.

Read my post above. OLEDs work within spec, so you're not pushing them to 100%. You're using them as they're supposed to be used. In fact, you cannot "push" one, because firmware will restrict you to the spec. 100% brightness on your CNN logo implies 100 nits or, if more, still an SDR value. That's a far cry from burning in anything. If you did so at 1000 nits in HDR, then sure, you'd scorch that thing, but OLEDs (and the HDR1000 standard) are designed to drop the brightness value after a certain time limit. That CNN logo goes down in brightness after a few seconds, so it's not that bright, and you're not pushing anything. You're just displaying mild white. Within spec. No excuse for burn-in.
 
Nonsense. Almost every machine made by humans won't last as long pushing it to 100% of its capabilities versus a lower percentage. Running a bright red stationary CNN logo for 20 hours per day at 100% brightness for months on end is not a normal viewing experience. ALL display types can get burn-in if abused. I don't think I've ever seen an airport LCD that didn't have burn in.

FYI - you were likely looking at plasma displays. Tons of airports were using them (for whatever reason, never did figure it out). The screens at the airport I've seen were clearly plasma displays with phosphor burn. Again... Not sure who the genius was who picked that display tech for that particular scenario (displaying bright static images) but there you go.
 
You'd make a very good point, if it weren't for the fact that these are very different scenarios. My car can go up to 140mph - or whatever, I've never really looked at the end of that dial. However, it was designed to drive between 10-100mph; anything out of that range would be pushing it.... (other good stuff)

Pretty sure that if I calibrated my (fictitious) OLED display to 115 cd/m2 (this is what I have my LCD set to right now), I wouldn't really have any burn-in issues. I may take it down to CRT levels (85cd/m2) so that I can finally get that "flat, crt" image. :)
 
SDR is a limited (narrow) band; having a 1000 nit HDR display doesn't push that same rendered band 2x to 3x higher like ramping up the brightness on an SDR screen would. SDR brightness is relative; HDR uses a different system for brightness which uses absolute values. It opens up the spectrum so content's highlights, shadows, etc. can go into a much broader band (1000 nit peak to .05 black depth). When SDR goes "outside" of its narrow band, it crushes colors to white, and muddies dark detail to black. HDR will show the actual colors at higher brightness highlights (gleaming reflections, edges) without crushing to white. HDR shows the same content at the same brightness when that content falls within a calibrated SDR range; it does not scale up the brightness of the whole scene like turning the brightness of an SDR screen up would.

If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.
For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display. <---- Which will show full color gradations across the spectrum in bright highlights instead of crushing to white after hitting the SDR ceiling.
HDR enables far more vivid, saturated, and realistic images than SDR ever could.
High-brightness SDR displays are not at all the same thing as HDR displays.

https://www.lightillusion.com/uhdtv.html

Referring to PQ as an 'absolute' standard means that for each input data level there is an absolute output luminance value, which has to be adhered to. There is no allowance for variation, such as changing the gamma curve (EOTF), or increasing the display's light output, as that is already maxed out.
(This statement ignores dynamic meta-data, more on which later.)

One of the often overlooked potential issues with PQ based HDR for home viewing is that because the standard is 'absolute' there is no way to increase the display's light output to overcome surrounding room light levels - the peak brightness cannot be increased, and neither can the fixed gamma (EOTF) curve.

As mentioned above, with PQ based HDR the Average Picture Level (APL) will match that of regular SDR (standard dynamic range) imagery. The result is that in less than ideal viewing environments, where the surrounding room brightness level is relatively high, the bulk of the PQ HDR image will appear very dark, with shadow detail potentially becoming very difficult to see.
To be able to view PQ based 'absolute' HDR imagery environmental light levels will have to be very carefully controlled. Far more so than for SDR viewing. This really does mean using a true home cinema environment.

Or, the PQ EOTF (gamma) has to be deliberately 'broken' to allow for brighter images
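To make the "absolute values" point concrete, here's a small sketch of the ST 2084 (PQ) transfer function that HDR10 uses, with a naive hard clip standing in for whatever tone mapping a real TV applies (my own illustration, not from the article above):

# SMPTE ST 2084 (PQ) EOTF: maps a normalized 10-bit code value to an absolute luminance in nits.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    n = code / (2 ** bits - 1)          # normalized signal, 0..1
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def shown_on(panel_peak_nits, code):
    # Naive hard clip; real sets tone-map, but below the panel's peak the request is an exact value.
    return min(pq_to_nits(code), panel_peak_nits)

for code in (512, 767, 1023):           # roughly 92 nits, 980 nits, 10,000 nits
    print(code, round(pq_to_nits(code)), round(shown_on(500, code)), round(shown_on(1000, code)))

A mid-level code (~50%) decodes to roughly 92 nits and looks identical on a 500-nit or 1000-nit panel; a near-peak code asks for ~1000 nits, which the 500-nit panel has to clip or roll off.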
 
Uh... not trying to inflame anyone, but did you just say abuse regarding people who do nothing more than display images on their display, which is precisely its main function? It doesn't matter if you display noise 24/7, or blocks of black/white that never, ever move. Displays are built for displaying. So, no sort of displaying of images can possibly constitute abuse. That is literally their only purpose. Writing off some forms of displayed image as abuse tells me you're simply excusing a very real - however infrequent - grave flaw in the technology.

Take your second point: 14% of image issues, whether 7% burn-in or 7% image retention. Even if we just think of the 7% burn-in, that's unacceptably high. If LG sells 1 million TVs, that's 70,000 people with burn-in. That's a lot of burn-in! For a product whose only function is to display images, having 7% of the production fail at its main and only function is really, really bad!

Look, I get it. OLEDs look wonderful. Ignoring their other problems (comparatively low brightness at full white, gradients not up to par with LCD screens), they're a really attractive option. But we gain nothing from excusing their very real flaws. A few months ago I had to choose: get a 55" OLED or a FALD LCD. I ended up getting a Vizio P55-F1 because it looks wonderful and only cost me $800. It's not as striking as an OLED, but it's pretty damn close for much less money. And it has none of the flaws that OLED has (it does, like most LCDs, have other flaws, but that's not what we're talking about here). Right now, OLED is still too high-risk, because buying a product that has one main function, where 7% of the production is quite likely to fail at it, is a bad value proposition. There is definitely a luck component, too, because you don't see OLEDs advertised as "great for everything - unless you only watch cable news all day, in which case don't buy this!" They're displays. They must be able to display everything, perfectly, to the best of their ability, until they die. Burn-in is a very real situation that proves they fail at their main function, without any abuse of any sort.

Don't take it from me. The market has pretty much agreed with this perspective. Or do you think MicroLED is being aggressively developed just because? MicroLED has the same benefits as OLED, with none of the drawbacks of organic components. If OLED were so great and risk-free, there would be no point in developing MicroLED. And yet, the advantages are obvious to the key market players. Let's not ignore the reasons why that is happening.

Again, not trying to antagonize anyone, just wanted to point out that those 2 arguments, abuse and % of product failures, are simply unacceptable. At least, they are to me... and I'd think they should be to anyone forking over more than one thousand dollars for a display.

Don't bother. There are so many shills and apologists floating around it's disgusting. This is basically why monitors blow today. People tolerate complete crap.
 
Uh, more 27" nonsense. Can we please get more 32"+ monitors, preferably around 40", in the very near future? 27" is starting to feel very small, especially with 4K. I'm not looking forward to the spate of 27" 8K monitors you know we're going to get.
Yes. For 4K, anything less than 32" is just too small and cramped. Probably need a minimum of 40" for 8K.
 
I dunno what all this nonsense is about 'displaying images isn't abuse'. Parameters of use are defined by the manufacturer, and LG clearly states in their manual and other support documentation that leaving static images up can cause retention and burn-in.

If you aren't willing to take any care whatsoever about how you use your display, that's fine, but there's a reason that literally every TV reviewer for the past several years has always stated 'just buy an OLED' if you want the best TV. Of course, we would all love self-emissive displays that are as good or better than OLED with infinite durability and no possibility of burn-in, but it's unclear that micro LED will be able to provide a buyable product any time soon - or, even if it does, that it will be anything more than an incremental upgrade on current FALD backlighting.
 
I'm VERY curious if the eSports version of the JOLED will have a low-persistence rolling scan mode.

Theoretically, rolling scan can be lagless -- by scanning in realtime off the cable -- unlike global strobe (which has to scanout to LCD in the dark, before flashing the backlight). Less lag than ULMB!
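A rough back-of-envelope model of why (my own simplification; it ignores pixel response time and strobe width):

# When does a given row get lit, measured from the moment that row arrives over the cable?
frame_ms = 1000 / 120                 # 120 Hz refresh -> ~8.33 ms of scanout per frame
rows = 1080

def arrival_ms(row):                  # rows arrive progressively during scanout
    return frame_ms * row / rows

def lit_global_strobe(row):           # panel scans out in the dark, backlight flashes after the full frame
    return frame_ms

def lit_rolling_scan(row):            # each row is illuminated right after it is driven
    return arrival_ms(row)

for row in (0, 540, 1079):
    print(row,
          round(lit_global_strobe(row) - arrival_ms(row), 2),  # added lag with global strobe, ms
          round(lit_rolling_scan(row) - arrival_ms(row), 2))   # added lag with rolling scan, ms

Under that model the top of the screen waits almost a full frame with a global strobe, while a rolling scan adds essentially nothing anywhere.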
 
it's unclear that micro LED will be able to provide a buyable product any time soon - or, even if it does, that it will be anything more than an incremental upgrade on current FALD backlighting.

Uhm... No. Read up on it. You clearly don't understand what microLED is if you think it's just "incremental".

While you're watching 24/7 CNN I'll be enjoying HDR movies and games with true blacks.

Please don't act like OLEDs are the 2nd coming of Christ. You have nicely deep blacks, but your highlights are garbage. Enjoy your 500nit whites and full screen dimming during daylight viewing while I get 1000+ nits. Both technologies are great, you need to buy depending on when you view content. OLED ain't better, it simply has different advantages. MicroLED will have the best of both worlds: pure blacks and bright highlights.
 
Uhm... No. Read up on it. You clearly don't understand what microLED is if you think it's just "incremental".

I'm well aware of what it is; you're assuming that RGB per-pixel microLED is going to be commercialized any time soon, but all indications are that this is 5-10 years away if not longer. The LEDs are still too big for use in anything sub-100 inch and they are having substantial difficulties making them any smaller. The only possible use of microLED in the next couple of years is improved backlighting for LCDs.
 
Uhm... No. Read up on it. You clearly don't understand what microLED is if you think it's just "incremental".



Please don't act like OLEDs are the 2nd coming of Christ. You have nicely deep blacks, but your highlights are garbage. Enjoy your 500nit whites and full screen dimming during daylight viewing while I get 1000+ nits. Both technologies are great, you need to buy depending on when you view content. OLED ain't better, it simply has different advantages. MicroLED will have the best of both worlds: pure blacks and bright highlights.

lol no
 
I'm well aware of what it is; you're assuming that RGB per-pixel microLED is going to be commercialized any time soon, but all indications are that this is 5-10 years away if not longer. The LEDs are still too big for use in anything sub-100 inch and they are having substantial difficulties making them any smaller. The only possible use of microLED in the next couple of years is improved backlighting for LCDs.

https://www.avforums.com/news/samsung-microled-tvs-to-release-in-2019.15141

Between 1 and 9 years. Considering Samsung desperately needs MicroLED to compete with OLED, because their QLED is an embarrassment, you can bet we'll have MicroLED available for purchase sooner rather than later.


Right back at you. Who knows what you're laughing at - whether it's MicroLED's per-pixel local dimming or OLED's brightness limitations. I guess you can choose to remain ignorant and act as a troll.
 
FYI - you were likely looking at plasma displays. Tons of airports were using them (for whatever reason, never did figure it out). The screens at the airport I've seen were clearly plasma displays with phosphor burn. Again... Not sure who the genius was who picked that display tech for that particular scenario (displaying bright static images) but there you go.

FYI - I know what the difference is between a plasma and an LCD. LCD does and will get burn in after a sufficient amount of time.

Right from the horse's mouth:

The HD Guru spoke to Bob Scaglione, Senior Vice President of Marketing for Sharp Electronics USA. He acknowledged that pixels can get “stuck” on its LCD HDTVs, leaving a retained image.
 
FYI - I know what the difference is between a plasma and an LCD. LCD does and will get burn in after a sufficient amount of time.

Right from the horse's mouth:

The HD Guru spoke to Bob Scaglione, Senior Vice President of Marketing for Sharp Electronics USA. He acknowledged that pixels can get “stuck” on its LCD HDTVs, leaving a retained image.

Thanks for clarifying. EDIT: So I'm guessing that the polarizer "switches" or whatever they're called just get stuck in a certain position?

DLPs can have that issue too, where the micromirror gets stuck and a "burn-in" appearance can happen. I didn't realize LCD can have that too.
 
I'm VERY curious if the eSports version of the JOLED will have a low-persistence rolling scan mode.

Theoretically, rolling scan can be lagless -- by scanning in realtime off the cable -- unlike global strobe (which has to scanout to LCD in the dark, before flashing the backlight). Less lag than ULMB!

I haven't been following the conversation, but I think that Sony's OLED PVM and BVM monitors do a rolling scan. I kinda wish all OLEDs would do that, to be honest. It would pretty much dethrone CRT from everything.
 
Please don't act like OLEDs are the 2nd coming of Christ. You have nicely deep blacks, but your highlights are garbage. Enjoy your 500nit whites and full screen dimming during daylight viewing while I get 1000+ nits. Both technologies are great, you need to buy depending on when you view content. OLED ain't better, it simply has different advantages. MicroLED will have the best of both worlds: pure blacks and bright highlights.
The infinite OLED contrast gives a far better "HDR" image feel than having a thousand nits blasted at your face with BLB and hazy blacks.

Besides, the current 2018 OLED models are like 700-800 nits, more than enough. 2019 models coming out will have something like 1500 nits I hear, if you really want to kill your eyes.
 
The infinite OLED contrast gives a far better "HDR" image feel than having a thousand nits blasted at your face with BLB and hazy blacks.

Besides, the current 2018 OLED models are like 700-800 nits, more than enough. 2019 models coming out will have something like 1500 nits I hear, if you really want to kill your eyes.

100% correct. But the post I was responding to wasn't talking about contrast. It was talking about black levels. If black levels are good but highlights are garbage, your contrast is not that great, which shows in OLEDs' widely acknowledged poor gradients. As for LCDs, you get better highlights, but the lack of true black doesn't allow the display to get as punchy (also, "hazy blacks"??? You clearly haven't looked at decent ~$1000 FALD LCDs in 2018 - seriously, go to your nearest Best Buy and check the Vizio P55-F1 that I bought for even less money. And there are better models than that. I watch movies with the usual top/bottom black bars at night, pitch black room, and those borders are indistinguishable from the rest of the room's darkness. Even my husband, who understands nothing about technology, has repeatedly pointed out how strikingly dark blacks look on this TV).

That "infinite contrast" argument needs to die, it's such marketing garbage. Sure, divide anything by 0 and you get infinite. But you certainly don't get infinite contrast with OLED. There are limits, marked by how bright they can get. Funny you say 800nits is enough, when serious display critics are already saying true HDR benefits come only starting at around 2000nits and OLEDs absolute blacks, everything else is compromised in one way or another. If 800 were enough, why would Dolby master material at ten thousand nits?

Extra points: do not be ignorant and spread the FUD that more nits will burn your eyes. Unless the creator and viewer are absolute idiots, nobody is going to watch 2000 nits fullscreen for hours on end, just like you don't look directly at the sun for minutes on end (much higher nits) to blind yourself. The capability is what matters, to sustain bright highlights when they're needed, which is brief periods of time (hence why brightness retention is codified into the HDR specs). But for fun, let's actually make a real comparison of brightness. On a regular sunny day, you're looking at 10,000 lumens while you're outside. That's not scorching your eyes, is it? (granted, we have to calculate square feet, but we don't need to get that specific for this simple comparison). Now convert those 10K lumens into nits, and for the equivalent, non-eye-scorching luminance, that'd be 32,910 nits. As in, thirty-two thousand, nine hundred and ten. That's equivalent to daylight on a sunny day. So, stop spreading FUD about 2K nits melting your face off. You're not displaying the whole screen at 2K nits. (here's a bit more, easy to follow info on this)

And the real solution to get those 2000nits and absolute blacks? Hint: starts with Micro and ends with LED.
 
So, stop spreading FUD about 2K nits melting your face off. You're not displaying the whole screen at 2K nits. (here's a bit more, easy to follow info on this)

And the real solution to get those 2000nits and absolute blacks? Hint: starts with Micro and ends with LED.
2000 nits is for a super bright sunlit room maybe, definitely not something absolutely needed. But I know people who comment that their 800 nit OLEDs are too bright in dark room usage, which is where OLED shines the most.
 
100% correct. But the post I was responding to wasn't talking about contrast. It was talking about black levels. If black levels are good but highlights are garbage, your contrast is not that great, which shows in OLEDs' widely acknowledged poor gradients. As for LCDs, you get better highlights, but the lack of true black doesn't allow the display to get as punchy (also, "hazy blacks"??? You clearly haven't looked at decent ~$1000 FALD LCDs in 2018 - seriously, go to your nearest Best Buy and check the Vizio P55-F1 that I bought for even less money. And there are better models than that. I watch movies with the usual top/bottom black bars at night, pitch black room, and those borders are indistinguishable from the rest of the room's darkness. Even my husband, who understands nothing about technology, has repeatedly pointed out how strikingly dark blacks look on this TV).

That "infinite contrast" argument needs to die, it's such marketing garbage. Sure, divide anything by 0 and you get infinite. But you certainly don't get infinite contrast with OLED. There are limits, marked by how bright they can get. Funny you say 800nits is enough, when serious display critics are already saying true HDR benefits come only starting at around 2000nits and OLEDs absolute blacks, everything else is compromised in one way or another. If 800 were enough, why would Dolby master material at ten thousand nits?

Extra points: do not be ignorant and spread the FUD that more nits will burn your eyes. Unless the creator and viewer are absolute idiots, nobody is going to watch 2000 nits fullscreen for hours on end, just like you don't look directly at the sun for minutes on end (much higher nits) to blind yourself. The capability is what matters, to sustain bright highlights when they're needed, which is brief periods of time (hence why brightness retention is codified into the HDR specs). But for fun, let's actually make a real comparison of brightness. On a regular sunny day, you're looking at 10,000 lumens while you're outside. That's not scorching your eyes, is it? (granted, we have to calculate square feet, but we don't need to get that specific for this simple comparison). Now convert those 10K lumens into nits, and for the equivalent, non-eye-scorching luminance, that'd be 32,910 nits. As in, thirty-two thousand, nine hundred and ten. That's equivalent to daylight on a sunny day. So, stop spreading FUD about 2K nits melting your face off. You're not displaying the whole screen at 2K nits. (here's a bit more, easy to follow info on this)

And the real solution to get those 2000nits and absolute blacks? Hint: starts with Micro and ends with LED.


You sound like you watch daytime TV outside in Arizona.

Black levels are laughable on LCD even on a 300+ zone FALD. HDR highlights on an LCD LMFAO. LCD is laughably pathetic in every way compared to OLED in a dark room.
 
SDR is a limited (narrow) band; having a 1000 nit HDR display doesn't push that same rendered band 2x to 3x higher like ramping up the brightness on an SDR screen would. SDR brightness is relative; HDR uses a different system for brightness which uses absolute values. It opens up the spectrum so content's highlights, shadows, etc. can go into a much broader band (HDR premium content of at least 1000 nit peak to .05 black depth). When SDR goes "outside" of its narrow band, it crushes colors to white, and muddies dark detail to black. HDR will show the actual colors at higher brightness highlights (gleaming reflections, edges) without crushing to white. HDR shows the same content at the same brightness when that content falls within a calibrated SDR range; it does not scale up the brightness of the whole scene like turning the brightness of an SDR screen up would.


If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.
For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display. <---- Which will show full color gradations across the spectrum in bright highlights instead of crushing to white after hitting the SDR ceiling.
HDR enables far more vivid, saturated, and realistic images than SDR ever could.
High-brightness SDR displays are not at all the same thing as HDR displays.



..

One of the often overlooked potential issues with PQ based HDR for home viewing is that because the standard is 'absolute' there is no way to increase the display's light output to overcome surrounding room light levels - the peak brightness cannot be increased, and neither can the fixed gamma (EOTF) curve.

As mentioned above, with PQ based HDR the Average Picture Level (APL) will match that of regular SDR (standard dynamic range) imagery. The result is that in less than ideal viewing environments, where the surrounding room brightness level is relatively high, the bulk of the PQ HDR image will appear very dark, with shadow detail potentially becoming very difficult to see.
To be able to view PQ based 'absolute' HDR imagery environmental light levels will have to be very carefully controlled. Far more so than for SDR viewing. This really does mean using a true home cinema environment.

Or, the PQ EOTF (gamma) has to be deliberately 'broken' to allow for brighter images
 
FYI - I know what the difference is between a plasma and an LCD. LCD does and will get burn in after a sufficient amount of time.

Right from the horse's mouth:

The HD Guru spoke to Bob Scaglione, Senior Vice President of Marketing for Sharp Electronics USA. He acknowledged that pixels can get “stuck” on its LCD HDTVs, leaving a retained image.

This is not true for most LCD panels and may only affect specific types.

I have an LCD here at work next to me that has displayed the same image for 5 years almost continuously, 24/7, 365. It has no burn-in. Not a single tiny trace of it.

You literally just quoted a VP of Marketing - who more than likely knows absolutely nothing about the physics of how an LCD works.

Please propose the mechanism by which the LCD burns in. I will wait.
 
Thanks for clarifying. EDIT: So I'm guessing that the polarizer "switches" or whatever they're called just get stuck in a certain position?

DLPs can have that issue too, where the micromirror gets stuck and a "burn-in" appearance can happen. I didn't realize LCD can have that too.

Go find a burned-in LCD at your office. Try, I dare you.

The only LCDs that have demonstrated burn-in (that is not some kind of one-off defective unit) are some models with LG IPS panels.
 
You sound like you watch daytime TV outside in Arizona.

Black levels are laughable on LCD even on a 300+ zone FALD. HDR highlights on an LCD LMFAO. LCD is laughably pathetic in every way compared to OLED in a dark room.

Your eye has limited dynamic range itself. If you like to watch dark movies in dark rooms, great, buy an OLED. Otherwise, with the average brightness of content and the average brightness of viewing rooms, I find the blacks on top notch LCDs to be good these days.

The black plastic bezel on my LCD here is actually not as black as the black background of this webpage in a lit room.
 
Your eye has limited dynamic range itself. If you like to watch dark movies in dark rooms, great, buy an OLED. Otherwise, with the average brightness of content and the average brightness of viewing rooms, I find the blacks on top notch LCDs to be good these days.

The black plastic bezel on my LCD here is actually not as black as the black background of this webpage in a lit room.

Yeah if you view in brightly lit rooms LCDs aren't as terrible by comparison, but you're losing a ton of detail from the ambient light.
 
Yeah if you view in brightly lit rooms LCDs aren't as terrible by comparison, but you're losing a ton of detail from the ambient light.

You'd lose the exact same amount of detail with daylight whether you're using an LCD or an OLED. Only LCDs can get brighter, so you end up seeing a better picture with LCDs than OLEDs... during daytime. Which is what I've been saying all along.
 
One of the often overlooked potential issues with PQ based HDR for home viewing is that because the standard is 'absolute' there is no way to increase the display's light output to overcome surrounding room light levels - the peak brightness cannot be increased, and neither can the fixed gamma (EOTF) curve.

As mentioned above, with PQ based HDR the Average Picture Level (APL) will match that of regular SDR (standard dynamic range) imagery. The result is that in less than ideal viewing environments, where the surrounding room brightness level is relatively high, the bulk of the PQ HDR image will appear very dark, with shadow detail potentially becoming very difficult to see.
To be able to view PQ based 'absolute' HDR imagery environmental light levels will have to be very carefully controlled. Far more so than for SDR viewing. This really does mean using a true home cinema environment.

Or, the PQ EOTF (gamma) has to be deliberately 'broken' to allow for brighter images

--------------------------------

That means HDR's absolute values in the SDR range (and up to the limit of the OLED's brightness range) should be exactly the same on both. What happens is that when an LCD's black levels hit their low limit, they bottom out at their max black depth (which on a modern high-end FALD can be pretty dark), while OLED goes much darker. A Samsung Q9FN can do a 19,000:1 contrast ratio though, which is very good. They can go from .0001 nit black to .048 black in SDR, and .002 to .06 in HDR, depending on the window, according to some reviews - which is definitely "black", just not the infinite/"off" black of OLED. The FALD algorithms do have to cheat slightly one way or another though, either slight blooming or slight dimming, even on a 480-zone array (they chose dimming in their current firmware).
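Just to make the contrast arithmetic explicit (round example numbers of my own, not measurements):

# Contrast ratio is simply peak luminance divided by black level.
def contrast_ratio(peak_nits, black_nits):
    return float('inf') if black_nits == 0 else peak_nits / black_nits

print(contrast_ratio(950, 0.048))   # ~19,800:1 - high-end FALD LCD territory
print(contrast_ratio(800, 0.0))     # inf - the "infinite contrast" figure OLED marketing quotes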



On the opposite end, where HDR really shines, both OLED and LCD, instead of continuing to show the full color volume (up to the 10k-nit mastered HDR ceiling of movies), will crush all the color and luminance to white at their peak luminance capability. OLED hits that ceiling much sooner than a high end HDR FALD LCD.

Btw, here is a list of the peak brightness levels different UHD HDR movies have been mastered at. Blade Runner 2049 stands out as the only one mastered at 10,000 nit so far, I think, but there are plenty at 4,000 and then a lot at 1,000.


Some OLEDs are reported to get higher than 550 nit brightness in 2018, especially in HDR, but I've heard it's by cheating with a white sub-pixel, which doesn't make for true color and perhaps not true 4:4:4 anymore. I'm not sure how true that is, or if it's true of the upcoming 2019 sets. I've also heard they have an automatic dimming burn-in avoidance feature that kicks in when the brightness goes over a threshold, so I'm not sure how true those brightness stats are overall in real viewing scenarios with the safety on.

from the avsforum thread linked further below in my post:
The WRGB OLEDs, due to the introduction of the 'white' sub-pixel, distort the standard RGB color channel relationship - excessively at HDR brightness levels. (If you sum the Y of 100% patches of the R+G+B primaries you get 400 nits, while at the same time a 100% white patch gives you 800 nits... so your color gamut is limited to 400 nits. This means WRGB OLEDs can never be calibrated accurately for HDR... but they can be calibrated with a 3D LUT in SDR mode; the recommendation is up to 105-110 nits, where there will be no ABL limiting and displays are more stable over time at these nit levels.)
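Spelling out the arithmetic in that quote (using its 400/800 figures, nothing measured by me): the white sub-pixel boosts white patches, but saturated colors can only use the RGB sub-pixels, which is exactly the color volume gap.

# Toy illustration of the WRGB limitation described above (figures taken from the quote).
rgb_combined_peak = 400      # what the R+G+B sub-pixels deliver together, in nits
white_subpixel_boost = 400   # extra luminance only the W sub-pixel adds, in nits

print("100% white patch:", rgb_combined_peak + white_subpixel_boost, "nits")  # 800
print("saturated colors:", rgb_combined_peak, "nits")                         # capped at 400 - the 'gamut limited to 400 nits' point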


---------------------------------

This is the LG C8 compared with the Samsung Q9FN on a few of the key features that LCD excels at (from rtings.com comparison)

Color Volume C8 ---- Q9FN [chart]

Peak Brightness C8 ---- Q9FN [chart]


----------------------------

There's a little more to it than that... here is a bit from avsforum:


Each display does different tone mapping according to the movie metadata. Some displays don't do anything; some roll off sooner when they see 4000 nits (like 2017 LG OLEDs); others don't do anything whether you send 1000 or 4000 (like 2016 LG OLEDs); others, like the Panasonic EZ1000, also take MaxCLL into account, but not all values - it ignores any MaxCLL below 401 nits and above about 5000 nits. If you pause a frame and send different MaxCLL values with an HD Fury, the picture will change; you can also measure what it's doing by taking a grayscale sweep.

For these reasons, with HDR10 and static metadata, because there is no golden standard for what tone/gamut mapping HDR10 displays will perform with each movie's incoming metadata, it's up to the display model/brand/firmware which strategy it will follow (clip, soft roll-off, etc.).

Dolby Vision bypasses each display model/brand's tone mapping and uses Dolby's own defined tone/gamut mapping algorithm, per frame, so it's more accurate.
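As a toy illustration of the two strategies that quote mentions - hard clipping at the panel's peak versus a soft roll-off - here's a made-up curve of my own (a simple linear compression, not any brand's actual algorithm):

# Hard clip vs. soft roll-off for content mastered above the panel's peak.
CONTENT_PEAK = 4000.0        # e.g. a disc mastered at 4000 nits

def hard_clip(nits, panel_peak=800.0):
    return min(nits, panel_peak)

def soft_rolloff(nits, panel_peak=800.0, knee=0.75):
    start = knee * panel_peak                          # track the source exactly up to the knee
    if nits <= start:
        return nits
    frac = min((nits - start) / (CONTENT_PEAK - start), 1.0)
    return start + (panel_peak - start) * frac         # compress knee..4000 into knee..panel peak

for source_nits in (100, 600, 1000, 4000):
    print(source_nits, hard_clip(source_nits), round(soft_rolloff(source_nits)))

With the hard clip, everything from 800 to 4000 nits collapses to the same white; with the roll-off, 1000-nit and 4000-nit highlights still come out at different levels, so gradation survives at the cost of bending the EOTF.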

---------------------------

What it really comes down to - for PC and gaming use - when spending over $1000, $2000, up to nearly $3k on a top of the line OLED or 480-zone FALD LCD, is this quote from rtings, which echoes most professional reviews:

The LG C8 OLED TV is better than the Samsung Q9FN for most people, unless you watch a lot of static content and are concerned about burn-in. The LG C8 has an infinite contrast ratio and no need for a local dimming feature, as well as an ultra-wide viewing angle, but it can experience permanent burn-in. The C8 has a nearly instantaneous response time, although this can bother some people as 24p content can appear to stutter. The Samsung Q9FN has much better color volume and is much brighter
 
(all the good things)

Excellent post. It's the point I've been trying to drive home, thanks for adding data to back that up. I've been having a pretty busy week so I didn't feel like justifying my posts anymore.

My only complaint is this "infinite" contrast in OLED. It's only "infinite" because we're dividing by 0 on the lower end, but technically it's not true. If we had infinite contrast, gradients would be perfect, and we know they're not on OLEDs. I wish we would just get an approximate contrast value, because "infinite" it certainly isn't, despite being damn good and certainly better than any LCD that's not made with self-emissive pixels.
 
Well, it's not infinite contrast or infinite black gradients, it's "infinite" black depth - meaning you can turn the pixel off without any backlight around it bleeding into the turned-off area, so it's more like zero, or 1, or whatever the bottom of whatever scale you're measuring by is. That isn't a measure of gradients, crushing, clipping, noise, etc.

https://www.trustedreviews.com/reviews/lg-c8-oled
------------------------------------------------------------------------------
Obviously, the LG can’t hit the same 2000-nit peaks as the Samsung Q9FN. Nor can it offer the same lush levels of vividness in bright scenes. Nevertheless, it’s capable of impactful highlights and its colours don’t suffer any issues with saturation. Mostly bright scenes, meanwhile, don’t endure anywhere near the amount of clipping seen on older OLED models.

That being said, I’d still recommend checking out the Samsung Q9FN if most of your viewing is in brightly lit conditions, or if you want that real HDR ‘punch’.
-- and watch a lot of static content and don't want to blow thousands of dollars on something that risks permanent burn-in. --
------------------------------------------------------------------------------
 
Ok we get it, OLED could potentially burn in. Jesus. Can't we just be excited at the fact that OLED monitors might finally be a reality for those of us who can deal with the burn in risk?
 
Burn-in is the deal breaker for me, but it's more than burn-in. OLED colors can't hit higher HDR color volume. They hit a color luminance/brightness cap much earlier, where they clip the colors to white or roll off out of the riskier burn-in levels. They also, even if saturated within their allowed levels, can't be calibrated in HDR. This untrue color is due to their safety/fear factor against burn-in - clipping to white early or rolling back from even the OLED peak color luminances - and because they "cheat" higher brightness levels than the first gens of OLED by adding a white subpixel.

-----Quote------ https://www.forbes.com/sites/johnar...ch-tv-is-the-best-ive-ever-seen/#7bd8c7312705

contrary to what I know many people reading this might expect, even the brightest parts of the image on this claimed 10,000 nits screen didn’t make me feel uncomfortable. They didn’t hurt my eyes. They didn’t make me worry for the safety of children, or send me rushing off to smother myself in suntan lotion. Even though the area of Sony’s stand where the screen was appearing was blacked out (apart from the light being cast from both the 10,000-nit screen and five other more ‘normal’ screens sharing the space).

What the boldness of the highlights DID do, on the other hand, was make the pictures look more stunningly realistic and dynamic. More like real life. More direct. More intense.

Some of the demo content Sony is showing has extreme brightest peaks appearing within the context of generally much higher light levels than we’ve seen on an LCD TV before, so that they don’t stand out too brazenly. But actually, as noted earlier, even when the brightest images elements are ‘blaring’ out of a dark setting, they still just look compellingly realistic rather than something you should only watch with sunglasses on.

------
this underlines the importance of brightness to delivering truly realistic images. Shots of villagers embroidering on a bright sunny day show looked incredibly life-like with 10,000 nits (or so) of light output to define them.

These shots showed, too, that it’s only when you’ve got this sort of real-life brightness at your disposal that colors truly start to look representative of reality rather than a construct limited by the screen technology of the day.
------------------
 
I am wondering how RGB OLED will work. I have been less than impressed with every OLED except the LG W-OLED with color filters. They don't have the cavity effect color shift that all mobile displays have.
 
It doesn't just seem better. It IS better. But there's no point in talking about MicroLED now because we won't get monitor-sized mLED displays for many, many years, if ever.
It won't be definitively better until each sub-pixel has its own micro LED, like OLED.

Not just each pixel, which seems to be their first goal.
 
It'll be better because it will go over 2000 nit in HDR highlights and color volume, perhaps even 10,000 nit like the article I quoted.

contrary to what I know many people reading this might expect, even the brightest parts of the image on this claimed 10,000 nits screen didn’t make me feel uncomfortable. They didn’t hurt my eyes. They didn’t make me worry for the safety of children, or send me rushing off to smother myself in suntan lotion. Even though the area of Sony’s stand where the screen was appearing was blacked out
[...]
These shots showed, too, that it’s only when you’ve got this sort of real-life brightness at your disposal that colors truly start to look representative of reality rather than a construct limited by the screen technology of the day.


OLED will be rolling down from its low color-volume brightness ceiling and/or cutting off its color volume to white at a ceiling, as a safety feature / fear-of-burn-in mechanism. The brightness it's capable of before that is also making impure color by using a white subpixel (that means lots of added white subpixels on the screen) to cheat higher brightness measurements, which makes the color accuracy way off and the color volume lower, so it can't even be calibrated in HDR.

The current Samsung Q9FNs already do ~1800 nit; more advanced displays, and especially micro LED, should do 4000 to 10,000 nit. Movies are mastered at up to 10,000 nit, and UHD HDR discs are currently mastered at 10,000, 4,000, and 1,000.

-------
https://www.resetera.com/threads/hdr-games-analysed.23587/
 
It won't be definitively better until each sub-pixel has its own micro LED, like OLED.

Not just each pixel, which seems to be their first goal.
One LED per pixel wouldn't make sense - how would you make the individual colors? It'd only work as the backlight in an LCD. Maybe you're thinking of Mini LED FALD LCDs? That's going to initially only be a few thousand zones though, at most - a long, long way from the 8 million needed for per-pixel backlighting on a 4k display. I think if it ever becomes economical to do 8 million LEDs, it won't be much of a stretch to do 24 million for a properly emissive Micro LED display.
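For reference, the arithmetic behind those counts:

# 4K panel LED counts: per-pixel backlight vs. true per-sub-pixel emissive.
width, height = 3840, 2160
pixels = width * height        # 8,294,400  -> the "~8 million" per-pixel backlight figure
subpixels = pixels * 3         # 24,883,200 -> the "~24 million" RGB emissive Micro LED figure
print(pixels, subpixels)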
 