DUAL Layer LCD Display Info Thread (Mono LCD as Backlight)

One 1080p pixel for every four pixels of a 4k array is the finest backlight-to-pixel ratio I've heard of outside of microLED. The ability to tone the monochrome backlight layer at a 1080p per-pixel level is nothing to shrug at. Turning that backlight layer "off" behind another layer of light filtering yields a quoted .0003 black depth, and .0003 is very, very black. Normal LCDs are around .14 black depth, so that is roughly 467 times darker. All that while getting ~3000nit HDR color volume, since it's not shackled down low by burn-in-risk safety features like OLED. They will be using a quantum dot color filter also. Of course the tech could have its own visible idiosyncrasies, and the power draw and heat could be quite huge, even though I'm not a low power zealot.
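Quick back-of-the-envelope sketch of those numbers (my own arithmetic, just plugging in the figures above; the peak nit values are only examples):

```python
# Rough numbers: how much darker the quoted dual layer black depth is vs a
# typical LCD, and what the effective contrast works out to at a few peaks.
dual_layer_black = 0.0003   # nits, the quoted dual layer black depth
typical_lcd_black = 0.14    # nits, a typical non-FALD LCD black depth

print(f"Darker by: ~{typical_lcd_black / dual_layer_black:.0f}x")   # ~467x

for peak_nits in (350, 1000, 3000):   # SDR-ish peak, HDR1000, the claimed 3000 nit
    print(f"{peak_nits} nit peak -> ~{peak_nits / dual_layer_black:,.0f}:1 contrast")
```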

--------------------------
So 600nit OLED is something like
350nit SDR color + 250nit higher color* capability
(*white subpixel assisted and varying - reflexively toned down from higher-than-600 peaks back to 600 via ABL)

1000nit is SDR + 650nit greater color volume capability

1800nit is SDR + 1450nit greater color volume height capability

2900nit is SDR + 2550nit greater color volume height capability
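Same arithmetic as a throwaway sketch (assuming the ~350nit SDR reference level used above):

```python
# 'Color volume headroom' above SDR for a few peak brightness levels.
SDR_REFERENCE_NITS = 350   # assumed typical SDR peak, as above

for peak in (600, 1000, 1800, 2900):
    print(f"{peak:>4} nit peak = SDR + {peak - SDR_REFERENCE_NITS} nit of headroom")
```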

---------------------------

That's the obvious potential gain for this tech: very, very deep blacks from near-pixel-level backlighting (close enough to make the difference negligible), while getting 3000 nit HDR color volume with no OLED burn-in-risk color/nit governors and no OLED aging.

------------------

Side note: I have several glossy displays. Glossy looks so good. I hope the Dell Alienware 55" OLED gaming display is glossy as well... I'd assume so.

VA... I own several: a 3000:1 gaming panel, a 5000:1 4k desktop one, and a 7800:1 to 13,000:1 FALD one. A non-FALD IPS's black level is around .13 - .14 with roughly 900:1 contrast. I'll never go back to that, but I'm not interested in a ~13" tall FALD G-Sync IPS that lacks HDMI 2.1 for $2000+ either.

Considering HDR PC gaming is not ubiquitous yet, the 55" Dell gaming OLED could be a contender for right now, if the price ends up merely expensive rather than astronomical from my perspective. But in the following years, while still waiting on MicroLED to reach the same position, a dual layer LCD could fit the bill and blow the color volume ceiling up nicely while maintaining very deep black levels.

Hopefully it'll be more like a 2yr gap rather than 3 between each tech now. If that were the case it'd be something like:

.. 2019 - 2020 .... OLED hdmi 2.1 /or/ 480 to 1000+ zone FALD hdmi 2.1
.. (suffer the 600nit 'SDR + 250nit' whitewashed ABL limitation of OLED in HDR content, or the dim/bloom zone offset of lower density FALDs on all content - but with up to ~2000nit HDR color on the FALD side)

.. 2020 - 2022 .... Dual Layer LCD? - 3000 nit HDR color and .0003 black depth (suffer large power draw, perhaps some other idiosyncrasy of the tech)

.. 2023 - 2024 .... MicroLED available and merely expensive rather than astronomically priced?

.. 2030+ .... AR/VR - hopefully at some point we'll just have clear, insane resolution AR glasses that can slap viewing surfaces, virtual objects and characters anywhere mapped in real space.
 
elvn I'm pretty sure ABL is Absolute Black Level (yes?), but what's FALD?
Fast Acting Light Dissipation?
Functionally Advanced Llama Detector? heh
 
 
While I promise I read the thread... I also read it really late at night :(

Thanks guys... heh
 
Thanks.
Good article, but the quoted burn-in risk, while technically the reason Sony is moving away from OLED, probably isn't warranted, since the RTINGS tests referenced only show burn-in in extreme torture-test scenarios.


-------------------------------------------------

Reasons I originally considered the burn-in risk/avoidance being applicable to Sony:
- intense usage-time scenarios of reference monitors
- if they were actually driving them at a true 600nit without white subpixel augmentation, and/or if they moved the HDR color volume cap from a 600nit ABL reflex to an actual 1000nit (neither of which is what the RTINGS torture tests were showing, since those used consumer grade WRGB OLED TVs with ABL).

-- HOWEVER, while I was thinking about it, I decided to look again at the RTINGS tests to see if any of them actually ran daily HDR color volume ranges for all of those hours... that could be the difference, since SDR is typically around a 350nit peak. This is what I found:

RTINGS https://www.rtings.com/tv/learn/permanent-image-retention-burn-in-lcd-oled
TEST SETUP

"The TVs are placed side-by-side in one of our testing rooms as shown to the right. The TVs will stay on for 20 hours per day, 7 days per week, running our test pattern in a loop. They will be turned off for 4 hours each day using USB infrared transmitters connected to each TV and controlled by a PC to better represent normal (but still very heavy) usage. Calibration settings have been applied, with the backlight or OLED light set to produce 175 nits on our checkerboard pattern. On the B6, the 'Pixel Shift' option is enabled. A single Android TV Box is used as a source, with a HDMI splitter used to provide the same material to each display."

"A 5.5 hour video loop is used as the test pattern. It has been designed to mix static content with moving images to represent some typical content. The base material is a recording of over the air antenna TV with RTINGS overlay logos of different opacities and durations, and letterbox black bars added. These additional elements are:

  • Top and bottom: Letterbox bars present for 2 hours, then absent for 3.5 hours (movie example)
  • Top left: 100% solid logo, present for the whole clip (torture test)
  • Top right: 50% opacity logo, present for the whole clip (network logo torture test)
  • Bottom left: 100% solid logo, present for 2 hours then absent for 3.5 hours (video games example)
  • Bottom right: 50% opacity logo, present for 10 minutes then absent for 2 minutes (sports or TV shows example) "

-------------------------------------------------

So correct me if I'm wrong, but...
it sounds like RTINGS.com's torture tests were NOT HDR color volume tests (with color brightness ranging up to 600nit or perhaps even briefly more), but rather low brightness (175nit) SDR content of mixed types (solid logos, letterbox bars, partial opacity logos) for 20 hours a day. That does not sound like an ordinary-usage HDR burn-in test, let alone an HDR torture test.

So the burn-in risks - which the RTINGS.com torture tests seemed to indicate were unlikely - could be quite different even in ordinary or regular heavy usage of HDR material.

edit: The real-life OLED usage tests did set the CNN test to 200 nits for normal and 380 nits for "full brightness SDR", while still having ABL active. Still not 600 nit HDR material, but brighter than 175nit, and notably the 380 nit set showed much worse burn-in of the CNN logo than the 200 nit one.

-------------------------------------------------

The dual layer LCD tech also sounded like it has the potential to go much brighter in consumer displays than the 1000nit color volume of the reference monitors, at least judging by the Hisense shown at CES, which the representative was quoting at 3000nit for release in China in 2019. That would be a lot closer to HDR 4000 content levels, so it would get a lot more color volume out of HDR 1000, 4000, and 10,000 content than OLED, with no burn-in risk.
 
"The TVs are placed side-by-side in one of our testing rooms as shown to the right. The TVs will stay on for 20 hours per day, 7 days per week, running our test pattern in a loop. They will be turned off for 4 hours each day using USB infrared transmitters connected to each TV and controlled by a PC to better represent normal (but still very heavy) usage. Calibration settings have been applied, with the backlight or OLED light set to produce 175 nits on our checkerboard pattern. On the B6, the 'Pixel Shift' option is enabled. A single Android TV Box is used as a source, with a HDMI splitter used to provide the same material to each display."
You could be right, but
there is a chance they mean the sets were calibrated at 175 nits for SDR while the HDR tests were carried out at a higher level.
Oddly, I haven't come across any information about calibrating HDR.
175 nits sure isn't HDR.
 
From what I've read on AVSForum, OLED TVs can only be calibrated to 400nit due to the white subpixels mixed in and the ABL at higher HDR ranges... so they're unable to be calibrated solidly in HDR.

I didn't find anything in the RTINGS test info saying anything about HDR content being shown at all.
They did set the CNN regular usage test to 200 nits for the normal TV test and 380 nits for the "max" brightness (SDR) TV test. That is still considerably lower than the 400 to 600nit post-ABL brightness range of HDR material on the LG OLED TVs, and notably the 380nit set burned in much worse than the 200nit one in the CNN test. To be fair, that is a static logo, not the 400 - 600 nit of a sun light source or glints off waves or a gun barrel appearing dynamically in HDR movies and games. It probably still wouldn't be recommended to use an HDR wallpaper or work on HDR photos or HDR video frames for hours.
 
300 watts for a 31 inch screen. I don't see this being made into large TVs if it consumes that much power. Although I think either TCL or Hisense was working on something like this.
 
300 watts for a 31 inch screen. I don't see this being made into large TVs if it consumes that much power. Although I think either TCL or Hisense was working on something like this.
For consumer displays I think FALD backlighting (which the pro displays apparently lack) will pretty much be a requirement to keep power consumption and heat under control.

Personally I'm very much looking forward to these displays entering the consumer market - especially if it's priced competitively with OLED. The superior brightness and innate burn in resistance coupled with OLED-like black performance make it an easy choice IMO.
 
Compare it with the past:
- 480 lines B&W CRT in the 50's
- 480 lines Color CRT in the 60's
- 480 lines Color CRT in the 70's
- 480 lines Color CRT in the 80's

I think we cannot really complain that much :whistle:
And yet still more motion resolution than current top tech. If that isn't sad I don't know what is.
 
For consumer displays I think FALD backlighting (which the pro displays apparently lack) will pretty much be a requirement to keep power consumption and heat under control.

Personally I'm very much looking forward to these displays entering the consumer market - especially if it's priced competitively with OLED. The superior brightness and innate burn in resistance coupled with OLED-like black performance make it an easy choice IMO.

If they add FALD, then they will have to use ABL (Auto Brightness limiter) like OLED to keep power under control. Anywhere near full screen brightness will still need enormous power.

The reason for so much power is the losses driving through 2 LCD layers instead of just one.

I would bet the first consumer sets based on dual layer would go to a lower overall brightness to keep power/heat in check, rather than FALD, which adds more cost/complexity.
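A toy sketch of why that is (my own crude model, not anything from a spec sheet): assume backlight power scales roughly with lit area times brightness and that the power budget is sized for a small highlight window; full-field brightness then has to drop.

```python
# Toy ABL model: fixed power budget sized for a 10% window at peak brightness;
# power assumed roughly proportional to (lit area fraction x nits).
PEAK_NITS_SMALL_WINDOW = 1000                 # assumed peak on a 10% window
BUDGET = 0.10 * PEAK_NITS_SMALL_WINDOW        # arbitrary units

for window in (0.10, 0.25, 0.50, 1.00):       # fraction of the screen lit
    allowed = min(PEAK_NITS_SMALL_WINDOW, BUDGET / window)
    print(f"{int(window * 100):>3}% window -> ABL would cap it around {allowed:.0f} nits")
```

The real curves are obviously more complicated than that, but it shows why full screen brightness is the expensive case.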
 
They are 1000nit in the Hisense 65" consumer TV model and the Sony and Vizio reference models. No word on power and heat yet, but the Hisense is a consumer TV supposedly due for release in China in 2019.
 
Google translated page from the avs forum thread about the tvs

https://translate.google.com/translate?hl=en&sl=zh-CN&tl=en&u=https://www.znds.com/tv-1150652-1-1.html


Correction: the current TV version is 680nit as tested and up to 890nit in laboratory conditions according to the article...
"First of all, the light control layer does reduce some of the brightness. We measured the peak brightness of Hisense U9E to 680 cd/m2 (it is known to measure 890 cd/m2 under standard laboratory conditions), compared to the previous Hisense U9 series."



So their early model is less than 1000nit so far, but apparently has been measured at 890 nit in lab conditions. They claimed at CES that they will work toward 3000nit brightness.

The eizo reference monitor site quotes 1000nit on those very expensive reference model monitors.

https://www.eizo.com/products/coloredge/cg3145/
 
The eizo reference monitor site quotes 1000nit on those very expensive reference model monitors.

https://www.eizo.com/products/coloredge/cg3145/

Well yeah, that is the same as the Sony Panel in the Video I linked above, and it also has a similar massive power draw:
Typical Power Consumption 267 W
Maximum Power Consumption 472 W

That is at ~31", which is why I have a hard time seeing this tech delivering 1000 nits at TV sizes.

I'd be perfectly happy with 500 nits.
 
Well yeah, that is the same as the Sony Panel in the Video I linked above, and it also has a similar massive power draw:

Typical Power Consumption 267 W
Maximum Power Consumption 472 W

FUCK
THAT

That's a ton of extra heat to dump into a room.
 
That's like a plasma at 300 - 450w , or ~ two to three 170w FW900's .. :wideyed:
 
My rig uses a good amount of juice even when idle (idle/light browsing etc., not sleeping). When I turn on my surround receiver and wake up my subwoofer, and/or play a game with both of my 1080 Tis, it uses a ton more. I'm not on a severe energy budget, but even for me those are some large numbers on the early reference dual layer LCD model. The tech is definitely at a young stage, so hopefully they will make some advancements eventually.

This picture is from the Sony video you linked, which shows that the heat from all those watts is an issue. The reviewer mentions the fans kicking into overdrive in HDR mode, which makes noticeable fan noise. The power draw and cooling are more like having another PC case than a display at this point. I'm not sure what Hisense's team will come up with on their first consumer TV model using their proprietary version of the tech, or if it will be any better.

 
From the panasonic 55" dual layer prototype monitor video:

---------------------------------------

"Dual layer LCD is transformative you know it gives you not only almost OLED like blacks because of the per pixel control - on top of that you also get brightness as well. You get a full brightness of 1000 nits on this display. You have to see it to appreciate the impact, the HDR impact that can be generated from 1000 nits for film without any restriction in ABL - it just makes scenes look more realistic and I need to mentiont that the DCI p3 color gamut coverage is 99% and the underlying monochrome light modulating cell layer would be full HD or 1080p in resolution but the outer display cell layer is true 4k in resolution 4096 times 2160.

"The chasis is quite thick as well because obviously there are two LCDs and because the transmittance is probably going to be five percent or even lower then you have to have a more powerful backlight with more cooling , with more power consumption to try and drive these TVs to 1000nits so it's fairly thick but I think iin a studio - I mean let's face it they've been using Dolby Pulsar with water cooling and stuff so I don't think this will be a problem but this product currently is a prototype.. it is pitched toward the hollywood studios and also the post production community and I don't believe it will be cheap because if you can imagine that the 32" Sony HX 310 will cost you around 35,0000 pounds,"

"I believe that a corresponding increasein screen size and also the implementation of a wider viewing angle I believe is not an effect of being angle composition film because one of my colleagues managed to take a picture of the sub pixel structure of this TV - it is IPS by the way - they don't have the sort of blurriness that you see on say , Samsung Hyun IDR or the Samsung Q950R so I think you know, it will be expensive, it will be north of probably 50,000 - but we certainly hope that this technology can slowly trickle down into the domestic environment and the hollywood community - to them I think you know what's 50,000 pounds lets be honest here, it will not be a problem and they have been craving for a larger screen and without any automatic brightness limiter restriction without any burn in worriest - to be used as color grading monitors and I believe that Panasonic may have come up with a solution. "


----------------------------------------

All of these prototypes are air cooled with a few small fans (and possibly internal heatpipes), somewhat like a small PC case, but he mentions that the Dolby Pulsar reference monitors have been using water cooling. Water cooling your monitor sounds very [H]. :geek:
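Quick sketch of why the backlight has to work so hard, assuming the ~5% transmittance figure he quotes and ignoring everything else in the optical stack:

```python
# Required backlight luminance for a given front-of-screen target at ~5%
# transmittance through the stacked LCD cells.
TRANSMITTANCE = 0.05   # ~5% quoted for the dual layer stack

for target_nits in (500, 1000, 3000):
    print(f"{target_nits} nits out the front -> ~{target_nits / TRANSMITTANCE:,.0f} nits at the backlight")
```

Which is basically why you end up with hundreds of watts and fans on a 31" panel.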
 
Looks beautiful, I bet the 55" version would double the power bill in my current (tiny) home though haha.
 
... it is IPS by the way - and it doesn't have the sort of blurriness that you see on, say, the Samsung Q90R or the Samsung Q950R...
What blurriness does the Samsung Q950R exhibit?
 
What blurriness does the Samsung Q950R exhibit?

It's only on a closeup of the pixels where you really see blur. They have a filter to help viewing angles that blurs the pixels up close.
 
I was going to say, "yeah, when you touch your nose to the screen", but then I saw they actually put a microscope up to the screen. SEE! No, actually I wouldn't see that, heh. I get the argument, but in reality I think HDR1600 and higher with Samsung's filter would be a better screen than 600nit ABL (or being limited to non-HDR 400nit on the Dell gaming panel, apparently). The 55" Panasonic doesn't have that blurriness under a microscope/macro photo of its screen either. Thanks for the clarification about what he was talking about though, it sounded odd without further details.
 
I was going to say, "yeah, when you touch your nose to the screen", but then I saw they actually put a microscope up to the screen. SEE! No, actually I wouldn't see that, heh. I get the argument, but in reality I think HDR1600 and higher with Samsung's filter would be a better screen than 600nit ABL (or being limited to non-HDR 400nit on the Dell gaming panel, apparently). The 55" Panasonic doesn't have that blurriness under a microscope/macro photo of its screen either. Thanks for the clarification about what he was talking about though, it sounded odd without further details.

I don't think we have brightness specs for LG's 8K Nanocell (FALD LCD) TVs yet.
 
Ah, I was talking about OLED. The 8k video was referenced for the mention of the viewing angle filter more than anything, so I was comparing dual layer LCD tech to OLED, and commenting that the filter's difference in fidelity is only noticeable with a microscope against the screen, which is even more extreme than your nose against the screen.

Personally I suspect even a die shrink will have trouble running 4k at a 100fps/Hz average or better when using high to ultra settings on demanding games - considering the graphics ceilings are arbitrary and can always go up. So 8k (at lower fps and lower Hz) outside of a 70"+ tv in my living room isn't really of great interest to me, but high density FALD arrays are, and better yet, future consumer models of dual layer LCD tech someday hopefully.
 
Ah, I was talking about OLED. The 8k video was referenced for the mention of the viewing angle filter more than anything, so I was comparing dual layer LCD tech to OLED, and commenting that the filter's difference in fidelity is only noticeable with a microscope against the screen, which is even more extreme than your nose against the screen.

Personally I suspect even a die shrink will have trouble running 4k at a 100fps/Hz average or better when using high to ultra settings on demanding games - considering the graphics ceilings are arbitrary and can always go up. So 8k (at lower fps and lower Hz) outside of a 70"+ tv in my living room isn't really of great interest to me, but high density FALD arrays are, and better yet, future consumer models of dual layer LCD tech someday hopefully.

I agree that 8K is ridiculous; you have to sit uncomfortably, ridiculously close to see any benefit over 4K, and we barely have real 4K content.
 
And still nowhere close to the 10,000 nit specification of Dolby Vision. When am I going to have the option of burning my eyes out of my head?
 
I just don't know how people find super high HDR brightness to NOT be eye straining at all. I have been going back and forth playing Gears 5 on my Acer X27 for horde mode and my LG OLED for the campaign. The OLED is just so much more comfortable to play on since the peak brightness isn't as high, and on top of that it looks far better since it doesn't suffer from the nastyass blooming that the X27 has when playing in HDR. The only area where the X27 wins is when you get to the open world snow area; ABL kicks in hard on the OLED, making the image look really dull. But other than places with tons of snow, the OLED is seriously bright enough for HDR. If I can't tolerate 1000 nits for a long time then there's no way I can tolerate 10,000 nits.
 
I just don't know how people find super high HDR brightness to NOT be eye straining at all. I have been going back and forth playing Gears 5 on my Acer X27 for horde mode and my LG OLED for the campaign. The OLED is just so much more comfortable to play on since the peak brightness isn't as high, and on top of that it looks far better since it doesn't suffer from the nastyass blooming that the X27 has when playing in HDR. The only area where the X27 wins is when you get to the open world snow area; ABL kicks in hard on the OLED, making the image look really dull. But other than places with tons of snow, the OLED is seriously bright enough for HDR. If I can't tolerate 1000 nits for a long time then there's no way I can tolerate 10,000 nits.
Simple solution, reduce the brightness for that game.
 
Simple solution, reduce the brightness for that game.

I ended up doing that, but that's my point... I don't see the use for super high brightness. It didn't make the image POP any more than my OLED; all it did was cause me more eye strain.
 
I ended up doing that, but that's my point... I don't see the use for super high brightness. It didn't make the image POP any more than my OLED; all it did was cause me more eye strain.
The problem appears to be with that game being too bright in HDR.
Your OLED display cannot reach a high enough brightness (a known issue with OLED); that is why you are OK using it.
Blaming displays that can do HDR justice when the game isn't making the best use of HDR is a bit raw.
 
The problem appears to be with that game being too bright in HDR.
Your OLED display cannot reach a high enough brightness (a known issue with OLED); that is why you are OK using it.
Blaming displays that can do HDR justice when the game isn't making the best use of HDR is a bit raw.

Ok, wanna give me a game that supposedly isn't too bright with HDR then? And who said I am blaming the display? I am saying that in general I don't see the need for 10,000 nit displays. Nobody said anything about blaming, I just don't think it's necessary.
 