DUAL Layer LCD Display Info Thread (Mono LCD as Backlight)

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
I thought I'd start a thread where details about dual layer LCD tech could be posted rather than it being only mixed into other threads.


Here are a few links with some basics:


-------------------------------
https://www.researchgate.net/publication/261050283_HDR_medical_display_based_on_dual_layer_LCD
(2013)
-------------------------------
https://appleinsider.com/articles/1...echnology-promises-crisp-lifelike-hdr-images-
(2016)
Apple's dual-layer LCD patent application was first filed for in August 2015

--------------------------------
------ Current news -----
--------------------------------


https://www.displaydaily.com/article/display-daily/if-one-lcd-is-not-enough-try-two

Sony was clearly stung by this technology (which was also adopted by several other broadcast monitor brands) and, as we reported from IBC in 2018, introduced a monitor based on a similar panel from Panasonic (Sony, of course, claims it has some special technology that is slightly different). The company also told us that it was no longer developing OLED broadcast monitors, but concentrating on the LCD type (while continuing to supply OLED products to those that love them). That was big news to me. (Sony Moves from OLED for Monitoring)

-------------------------------

https://www.displaydaily.com/paid-n...lanning-cg3145-high-contrast-monitor-shipment

Effectively, the monitor adds the high 'colour volume' that Samsung has been promoting for its QLED TVs, to the black levels of OLED.

--------------------------------

https://www.engadget.com/2019/01/07/hisense-ces-2019/

ULED XD's secret sauce, of course, is the pairing of two panels that sit in front of the LED backlight, with a 4K RGB display up front. And, sandwiched between the 4K panel and the backlight is a 1080p greyscale panel that displays the image in black and white. That means that, for local dimming, you'll actually be looking at both a dimmed LED source and a B&W image, which should offer far deeper blacks.

The company hasn't offered up many concrete specifications, but says that the TV offers more than 2,900 nits of brightness and the "highest dynamic range" seen on an LCD panel.

-------------------------------

https://www.oled-info.com/hisenses-new-uled-xd-technology-uses-dual-lcd-panels-achieve-high-contrast

Basically a way to achieve a high number of 'local dimming' zones for an LCD display (over 2 million such zones, in fact). The TV itself is very bright (over 2,900 nits) and reportedly offers great image quality and almost perfect contrast. Hisense will release its first ULED XD TVs later this year in China. Apparently SkyWorth is also demonstrating a similar technology at CES.


--------------------------------------

https://www.techradar.com/news/sams...-uled-xd-offers-brightness-and-super-contrast

The biggest and brightest of these technologies is Hisense's ULED XD tech, which will be going into future screens and promises upgrades in local dimming, colors and dynamic range. It does this, according to Hisense, by using a dual-cell ULED XD panel layer that puts a 1080p module displaying a grayscale image between a full array LED backlight and a 4K module.

While Hisense hasn't yet announced any TVs that use ULED XD, its 2019 flagship TV is no slouch in the performance department: the Hisense 75U9F is a 75-inch Quantum Dot screen with Android TV, 1,000 local dimming zones and a peak brightness of 2,200 nits. That puts the U9F on par with Samsung's Q9FN QLED, which debuted in 2018 and became one of the best TVs of last year.

Unfortunately, Hisense's 2019 flagship won't come cheap: when it launches in June, the 75-inch Hisense U9F will cost $3,499.99 (around £2,740, AU$4,999).

---------------------------------------

Note: Linus misquotes the peak brightness as 300 nits in his video. The reps quote 2,900 nit HDR color and a .0003 nit black depth.


----------------------------------------
...

So for the Hisense TV at least, they are quoting ~3,000 nit HDR color volume, an extreme .0003 nit black depth, none of the dim/fade zone offset problems of FALD, and no burn-in risk nor being shackled to lower-nit colors for burn-in avoidance like OLED.

Since micro-LED is still a ways off and likely to be astronomically priced at first, this tech could potentially bridge the gap between the OLED and MicroLED phases of monitor tech.

Here's hoping some major manufacturers develop some in the next few years that can do some proper gaming. HDMI 2.1 4k VRR would be the main thing, but 120Hz-capable response times and input lag of ~10ms or less would be important too.. BFI, flicker, and interpolation options would be pluses.
 

CrazyRob

[H]ard|Gawd
Joined
Aug 26, 2004
Messages
1,273
My knee-jerk reaction to this tech was "well, it obviously still won't be as good as self-emissive tech like OLED and micro-LED." But, as I thought about it, consumer 4K content (streaming, 4K Blu-ray, etc.) is 4:2:0 chroma subsampled, which has an effective chroma resolution of 1080p anyway. So although it may not be great for a computer monitor, this could look great in TVs when handling the content they're designed for. Pretty nifty.
 

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
Yes, a 1920x1080 backlight is pretty dense at one backlight pixel for every four front pixels, each able to go down to a .0003 nit black level, even for per-pixel content. At 4k that could still give a very impressive result.

I mean, people were using edge-lit high-color IPS screens for years on PCs, and edge-lit VA TVs before that, so I don't get how it wouldn't be good for computer use??

To make sure you are on the same page here -- the front screen is still 4K per-pixel content. The 1080p screen is being used as the direct-lighting backlight for the front 4K screen. So each 1080p backlight pixel is four times larger and spans four of the 4K pixels .. perhaps aligned evenly, or they could be offset by some amount if that gives a better result, idk. They are also using a quantum dot color filter layer.


This is essentially an over-2-million-zone FALD down to near pixel level, which capitalizes on the dual-layer nature to dramatically improve light blocking, down to ultra black depths. The Asus and Acer FALD 27" HDR gaming monitors have 384 zones.. a Samsung Q9 has 480.. some TVs from past years had 64 or even 32 zones if they had FALD at all (most didn't).. and of course most monitors just have edge lighting, even high-color IPS ones. Hisense's own non-dual-layer HDR TV model due out this year has "merely" 1,000 zones.

If a 1-to-4 backlight-to-pixel ratio -- 2,073,600 backlight pixels for the 8,294,400 pixels of 4k -- combined with two layers of light blocking down to .0003 nit black depths eliminates dim and glow zones down to quad-pixel zones or less in effect.. while cranking HDR color up to ~3,000 nits with no burn-in risk (nor the severe HDR color nit limits imposed for burn-in prevention), it would be quite an accomplishment. I'd be very interested in seeing one.
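The pixel arithmetic above can be sketched out as a quick calculation. This is a toy illustration only; the exact panel alignment (and any offset) Hisense uses isn't public, so the `backlight_zone` mapping below is a hypothetical aligned 2x2 scaling:

```python
# Dual-layer LCD: a 1080p mono panel acts as a per-"zone" backlight
# for a 4K color panel; each backlight pixel covers a 2x2 block of
# front-panel pixels (assuming simple aligned scaling).

front_px = 3840 * 2160      # 4K color panel: 8,294,400 pixels
zones    = 1920 * 1080      # 1080p mono backlight: 2,073,600 "zones"

print(front_px)             # 8294400
print(zones)                # 2073600
print(front_px // zones)    # 4 front pixels per backlight pixel

def backlight_zone(x, y):
    # hypothetical mapping from a front-panel pixel to the backlight
    # pixel lighting it, assuming even alignment with no offset
    return (x // 2, y // 2)

print(backlight_zone(3839, 2159))  # (1919, 1079)

# Zone counts cited in the thread, for comparison:
for name, z in [('Asus/Acer 27" FALD', 384), ("Samsung Q9", 480),
                ("Hisense U9F", 1000), ("dual-layer LCD", zones)]:
    print(name, z)
```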
 
Last edited:

Keller1

Weaksauce
Joined
Dec 10, 2013
Messages
65
The ultimate LCD's only weakness might be the non-instant pixel response times.

Samsung has allegedly added an OLED-like viewing angle to their 2019 TVs, while these solve the contrast issues.

This means that we might actually have LCD staying relevant for a decade longer, as much as I'd prefer an outright emissive display instead. Exciting stuff.


Would be nice to see one in person. I hope these start existing outside of China soon.
 

sharknice

2[H]4U
Joined
Nov 12, 2012
Messages
2,166
The ultimate LCD's only weakness might be the non-instant pixel response times.

Samsung has allegedly added an OLED-like viewing angle to their 2019 TVs, while these solve the contrast issues.

This means that we might actually have LCD staying relevant for a decade longer, as much as I'd prefer an outright emissive display instead. Exciting stuff.


Would be nice to see one in person. I hope these start existing outside of China soon.

Backlight uniformity is a huge issue with LCDs too.
 

Keller1

Weaksauce
Joined
Dec 10, 2013
Messages
65
Backlight uniformity is a huge issue with LCDs too.
Right. Totally forgot that one. Phew. Almost got worried our emissive future would be taken away. Though to be fair it could potentially be QA'd away, though at this point I doubt [they]'d bother.
 

CrazyRob

[H]ard|Gawd
Joined
Aug 26, 2004
Messages
1,273
I mean, people were using edge-lit high-color IPS screens for years on PCs, and edge-lit VA TVs before that, so I don't get how it wouldn't be good for computer use??

Well, it depends on how the display works, and the application. For example, if you were doing work that depends on single-pixel accuracy, your image quality could be compromised, assuming the mono LCD layer couldn't be disabled. If you had highly contrasting colors, say a black line on a white background, up to 3 pixels around each pixel would have an incorrect level of brightness, causing ringing or blurring. If it can be manually disabled or controlled to be uniform, then these issues could be corrected and calibrated out.

Additionally, it was mentioned in Linus's video that this process requires a lot of video processing to work right, adding up to 25ms to the input lag of the display, on top of whatever lag is caused by the scaler and native panel response times. Realistically, that could end up at 50ms+ total. For many, that may not be an issue, but for competitive gamers it may be unacceptable.

Again, these are likely issues you wouldn't run into in the living room, but would deter a lot of power desktop users.
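The lag concern above can be framed as a simple additive budget. The 25ms processing figure is the one attributed to Linus's video; the scaler and panel numbers below are hypothetical placeholders, not measured values:

```python
# Rough input-lag budget for a dual-layer set. Only the processing
# figure comes from the thread; the rest are illustrative guesses.
processing_ms = 25   # extra video processing for the mono backlight layer
scaler_ms     = 15   # hypothetical scaler/controller overhead
panel_ms      = 10   # hypothetical native panel response

total_ms = processing_ms + scaler_ms + panel_ms
print(total_ms)      # lands in the "50ms+" ballpark the post worries about
```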
 

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
Yeah, that's because it's Hisense.. a better manufacturer could potentially reduce the input lag. The Samsung Q9FN was 25ms, but after a firmware update it is now 10ms at 120Hz (1440p, since it's not an HDMI 2.1 model yet). LG OLED TVs are still 25ms at 60Hz afaik but could potentially reduce that with HDMI 2.1 and 120Hz. This is more about gaming aesthetics with reasonably low input lag, not TN-response-time 1080p sparse graphics for PvP.

And how inaccurate is edge-lit uniformity, and at what poor black depth? This tiny pixel grouping is a huge advancement, and a 3-4 pixel tradeoff for some real HDR 3,000 nit color with .0003 black depth and no burn-in nit limit.

You realize LG OLED above 400nit is a polluted color brightness, with white subpixels added on every pixel, and then ABL rolls the already limited color volume back down to 600nit from a bit higher. This is so color inaccurate that WRGB OLED can't even be color calibrated past 400nit SDR. As long as you never master HDR video and photos down to per-pixel color, I guess it could be ok. Or watch HDR 1000/4000/10000 mastered material, or play HDR 1000/4000/10000 color capable games.

These aren't about authoring such - though from the above links you can see Sony and other reference monitors are moving to dual layer LCD... this is about media consumption and aesthetics.. movies, games.

Who knows, maybe the higher end ones even do 1:1 pixel mapping; all we know now is what the Hisense exhibition was saying at CES about their model.
 
Last edited:

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
From Necere 's links about dual layer reference monitors....

NJxFoUK.png

-------------------------------------------------------------

wsfIk8Q.png




-------------------------------------------------


NVsBTV1.png
 

Keller1

Weaksauce
Joined
Dec 10, 2013
Messages
65
Yeah, that's because it's Hisense.. a better manufacturer could potentially reduce the input lag. The Samsung Q9FN was 25ms, but after a firmware update it is now 10ms at 120Hz (1440p, since it's not an HDMI 2.1 model yet). LG OLED TVs are still 25ms at 60Hz afaik but could potentially reduce that with HDMI 2.1 and 120Hz. This is more about gaming aesthetics with reasonably low input lag, not TN-response-time 1080p sparse graphics for PvP.

And how inaccurate is edge-lit uniformity, and at what poor black depth? This tiny pixel grouping is a huge advancement, and a 3-4 pixel tradeoff for some real HDR 3,000 nit color with .0003 black depth and no burn-in nit limit.

You realize LG OLED above 400nit is a polluted color brightness, with white subpixels added on every pixel, and then ABL rolls the already limited color volume back down to 600nit from a bit higher. This is so color inaccurate that WRGB OLED can't even be color calibrated past 400nit SDR. As long as you never master HDR video and photos down to per-pixel color, I guess it could be ok. Or watch HDR 1000/4000/10000 mastered material, or play HDR 1000/4000/10000 color capable games.

These aren't about authoring such - though from the above links you can see Sony and other reference monitors are moving to dual layer LCD... this is about media consumption and aesthetics.. movies, games.

Who knows, maybe the higher end ones even do 1:1 pixel mapping; all we know now is what the Hisense exhibition was saying at CES about their model.


Polluted color brightness still gets you a very decent Rec. 2020 rating though, and we're getting QD-OLED from samsmug, which is RGB, next year, a bit after these ULEDs drop.

So it'll be worse viewing angles, response times & potentially higher DSE versus burn-in and ABL. QD-OLEDs do like 85% of BT.2020 according to Samsung, so that's actually quite good, along with 115% DCI-P3.


Again, seems to me that it'll be a game of price is right.
 

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
Color Gamut
VR6gxX2.png



Color Volume

3r1M8aR.png



OLED's organics require severe shackling of the color volume's vertical brightness/nits, rolling down to 600nit with ABL to avoid burn-in.. which isn't all that much higher than the 2D SDR color plane really, but it's something vs 350nit SDR.

Q9FNs do ~2,000nit color volume; the dual layer LCDs, if they come to market and work well, are reportedly 2,900+ nit HDR color volume (the Hisense one anyway), as well as sinking way down into the deep end to .0003 nit black depth.
So figure OLED's white-polluted color past 400nit, to the 600nit ABL mark, would be at:

-- 1/3 the height of a Q9FN,
-- and potentially 1/5th (1 / 4.8) of the height of the hisense 2900 nit color volume

(~1,800 to 2,000nit color and ~2,900nit color already being a fraction of 4,000nit HDR content, and of full 10,000nit HDR capable content, incl. a handful of games and some movies (incomplete list - spreadsheet link))

https://pcgamingwiki.com/wiki/List_of_games_that_support_high_dynamic_range_display_(HDR)

https://www.resetera.com/threads/hdr-games-analysed.23587/

So 600nit OLED is something like
350nit SDR color + 250nit higher color* capability
(*white-pixeled and varying - reflexively toned down from above 600 peak back to 600 via ABL)

1000nit is SDR +650 greater color volume capability

1800nit is SDR + 1450 greater color volume height capability

2900nit is SDR + 2550 greater color volume height capability
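The "SDR + headroom" arithmetic above works out like this, as a quick sketch using the post's own nit figures (the 350nit SDR baseline and the peak values are the thread's numbers, not measurements):

```python
# Color-volume headroom above the SDR baseline, per the post's figures.
SDR = 350  # nit baseline assumed in the thread

def headroom(peak_nits):
    # brightness headroom above the SDR plane
    return peak_nits - SDR

for peak in (600, 1000, 1800, 2900):
    print(peak, "nit = SDR +", headroom(peak))

# How OLED's ~600nit ABL-limited volume compares to the brighter sets:
print(round(2000 / 600, 1))  # ~3.3x, i.e. "about 1/3 the height of a Q9FN"
print(round(2900 / 600, 1))  # ~4.8x, i.e. "about 1/5th" of the Hisense figure
```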
 
Last edited:

XoR_

[H]ard|Gawd
Joined
Jan 18, 2016
Messages
1,046
This solution to the contrast ratio issue was so obvious that I came up with it myself shortly after I learned how LCD panels work, including using a lower resolution panel without color filters. Pretty much the same story as with variable refresh rate tech.
The only LCD improvement of this type I didn't come up with immediately was backlight strobing to improve the sharpness of moving objects, but back then I didn't really understand how persistence of vision worked.

The biggest technical issue with this idea is how to do it cheaply, not how to do it at all, which is obvious. Also obvious are the viewing angle issues and the strange motion artifacts this kind of tech will have if not done right.

Frankly, I thought it would come out sooner. Now we have a lot of OLED products and still zero 2xLCD screens. Strange...
 

bigbluefe

Gawd
Joined
Aug 23, 2014
Messages
951
Honestly this is all garbage tech until MicroLED. It's all tap dancing around the features and properties that people actually want. We want inorganic self emitting displays that don't burn in. Hurry the fuck up already.
 

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
Well, if you are probably waiting 4-5 years for micro-LED.. (perhaps 5-6 for high rather than astronomical pricing, as a guess)... in the meantime dual layer LCD could be a great tech to bridge the gap between the OLED and MicroLED display phases. It could start showing up by the end of 2019 in at least one TV, and it's in high end reference monitors now, so hopefully 2020 and on could bring some more dual layer LCD consumer options.

So for me, I'd consider a Dell Alienware OLED with DP 1.4 + HDMI 2.1 and some kind of VRR for "SDR+250" and ultra black depths (depending on whether the price is very high or insane), then in 2020 or 2021 hopefully a dual layer LCD if they turn out to be great performers, and then wait out microLED for another 2-3 yrs most likely. Just guessing.

This tech is very similar to self-emitting in a way - using a pixel-size quasi "emission" layer of white light sources (sort of like how LG OLEDs are all white, or white via a combo layer of emitters, then go through a color filter).. and utilizing the double layer of light blocking to hit ultra darks of .0003 nit black depth.
 
Last edited:

Vega

Supreme [H]ardness
Joined
Oct 12, 2004
Messages
6,531
You still have the slow pixel transitions inherent to LCD tech, especially VA. Emissive displays like CRT, OLED, and microLED don't have this problem and hence will always be superior. The only thing you could do is strobe the LCD backlight to try and hide the slow pixel transitions. Basically, this dual LCD technology is a band-aid placed over a band-aid.
 

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
Once you go past 120fps/Hz's 8.3ms frame time, you start getting beyond what a modern VA's response time can keep up with. If it's fast enough for 120Hz, considering all of its other specs, it'd be a winner. High end TV manufacturers like Samsung also have outright 480Hz flicker backlights that can optionally match a 60Hz or 120Hz flicker mode for games, optional interpolation (while still having relatively low input lag), and optional BFI activation. They are also down to 10ms input lag at 120Hz 1440p currently. So if a big manufacturer picks up this tech and releases models with some high end feature sets, it could be something.

OLED's 600nit ABL quasi color volume, due to its burn-in avoidance safety limits, would be trumped by up to 3,000nit HDR color on these from what they are touting, while still getting .0003 black depths. OLED is good for SDR+ for now, though.. in fact I'd consider one of those Dell Alienware ones depending... but if one of these dual layer ones came out in 2020-2021 and was a good performer, while microLED was still 2 or more years off past that, I'd definitely be interested in buying one until microLED is out some year and high priced rather than astronomically priced.
 
Last edited:

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
I guess I'll kick in again in defense of an infant tech's "rollout", at least from my personal perspective..
(hopefully some consumer models available 2020-2021 from some major manufacturers, rather than just the reference monitors and a single Hisense in China in 2019)

and say I run
..a 43" 4k VA TV as desktop/app/media space in my monitor array
..an LG 32" 1440p g-sync VA that notably wastes energy running hot,
..an Apple Cinema Display 27" 1440p IPS
..and my 8.5" OLED tablet, all on at once at my pc desk.

The pc has
..dual 1080ti hybrids in SLI
..and a 7900x cpu..
..On the side is a full Onkyo 7.1 surround receiver that runs hot like a hotplate through its top grille,
..with a 7.1 array of Klipsch ProMedia desktop satellite speakers
..and a ~3' tall subwoofer.

.. And a ton of peripherals and a handful of large ext hdds.

--- So power draw of "2" more panels in one wouldn't be a huge consideration for me personally. :)
 
Last edited:

Necere

2[H]4U
Joined
Jan 3, 2003
Messages
2,743
--- So power draw of "2" more panels in one wouldn't be a huge consideration for me personally. :)
I believe the majority of power draw is from the backlight rather than a panel, doubly so with HDR. Dual layer only makes it that much worse, as each layer reduces light output some amount, even when it's completely clear. This is evidenced by the power consumption numbers on the product pages I linked above: 250-270W typical in HDR, 450-470W maximum. We haven't seen numbers like that since plasma, and maybe never for a panel this size. That's a lot of added heat to get rid of, so no surprise these have multiple cooling fans on the back.
 

suiken_2mieu

2[H]4U
Joined
Apr 7, 2010
Messages
2,910
This tech looks sick, especially if it comes in cheaper than OLED. If it's passable for gaming, I could see myself putting this in my living room. I'm still running 2 Samsung DLPs because they look fine, albeit with too much overscan.
 

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
When I said panel, I was using the term more generically, meaning that running two monitors wouldn't be a big deal to me. But yes, that is a big draw, and while it wouldn't affect my desire to purchase one, it is definitely a little higher than I expected.

A Q9FN, which can do almost 2,000nit HDR, is listed at:
Power Consumption: 74 W .. Power Consumption (Max): 249 W.
So if you were running two, it would be ~150w typical / ~500w max, compared to the
dual panel tech's quoted "250-270W typical in HDR, 450-470W maximum"
... so like 110w more typical than even two entire Q9FNs.
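That comparison can be sketched with the quoted figures (using the midpoint of the 250-270W range for the typical dual-layer draw; the monitor spec numbers are from the product pages linked earlier in the thread):

```python
# Power-draw comparison using the numbers quoted in the thread.
q9fn_typ, q9fn_max = 74, 249       # Samsung Q9FN typical / max, per spec sheet
dual_typ = (250 + 270) / 2         # dual-layer reference monitor, HDR typical (midpoint)
dual_max = (450 + 470) / 2         # dual-layer reference monitor, maximum (midpoint)

two_q9fn_typ = 2 * q9fn_typ        # 148 W -> "~150w" in the post
two_q9fn_max = 2 * q9fn_max        # 498 W -> "~500w max"
print(two_q9fn_typ, two_q9fn_max)

# Typical dual-layer draw exceeds even two whole Q9FNs by roughly:
print(dual_typ - two_q9fn_typ)     # ~112 W -> "like 110w more"
```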

Some interesting info from an old forum post, in regard to your mentioning plasmas, and in relation to how much power surround sound uses too..
(2012) http://forums.redflagdeals.com/has-...easure-power-consumption-av-receiver-1251021/

"I have a Denon 809. Idle power usage is around 50w, -20db is about 135w driving 5 in-ceiling speakers. It all depends on what's playing at the moment. My Klipsch Synergy 12 idles at 40w and uses around 80-100w.

OTOH, my Pioneer 5020 uses 455w showing a blue or white screen! And around 300w on average."

----------------------------------------------

So that definitely sounds comparable to some plasma power usage.

-----------------------------------------------

Samsung F8500 Plasma review (2013)
https://www.flatpanelshd.com/review.php?subaction=showfull&id=1375254254#5
Power consumption after calibration comes in at 310 W on average. This is a bit lower than on last year’s E8000 plasma TV but still quite high for a modern TV.

Power consumption during 3D use is very heavy, a bit disturbing actually. It peaked at over 600 W and averaged around the mid-400s in the Standard as well as the Movie mode. So… plant some trees if you want to watch a lot of 3D on this mammoth TV.

This is a measure of power usage of pcs themselves according to pcgamer (dec 2018):
https://www.pcgamer.com/how-much-power-does-my-pc-use/
 
Last edited:

Necere

2[H]4U
Joined
Jan 3, 2003
Messages
2,743
When I said panel, I was using the term more generically, meaning that running two monitors wouldn't be a big deal to me. But yes, that is a big draw, and while it wouldn't affect my desire to purchase one, it is definitely a little higher than I expected.

A Q9FN, which can do almost 2,000nit HDR, is listed at:
Power Consumption: 74 W .. Power Consumption (Max): 249 W.
So if you were running two, it would be ~150w typical / ~500w max, compared to the dual panel tech's "250-270W typical in HDR, 450-470W maximum"
... so like 110w more typical than even two entire Q9FNs.

Some interesting info from an old forum post, in regard to your mentioning plasmas, and in relation to how much power surround sound uses too..

"I have a Denon 809. Idle power usage is around 50w, -20db is about 135w driving 5 in-ceiling speakers. It all depends on what's playing at the moment. My Klipsch Synergy 12 idles at 40w and uses around 80-100w.

OTOH, my Pioneer 5020 uses 455w showing a blue or white screen! And around 300w on average."

So that definitely sounds comparable to plasma power usage.

This is a measure of power usage of pcs themselves according to pcgamer (dec 2018):
https://www.pcgamer.com/how-much-power-does-my-pc-use/
Also take into account that these are only 31", vs. 65" for the Q9FN or 50" for that Pioneer you mention, so power per square inch really seems unprecedented.

OTOH, it's important to keep in mind that these are pro displays. If Hisense or some other company commercialized it as a consumer product, it seems likely they'd try to keep the power consumption to more manageable levels. Lower max brightness, ABL, and conventional FALD behind the dual layers (not sure if the pro monitors are doing this) could all help to keep it under control. Also seems like VA might have better light transmittance than IPS, which also helps and might explain why Hisense is using VA.
 

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
Well, the Hisense VA prototype at CES looked like it was around 65" and was quoted as doing 2,900nit HDR color, so the peak brightness was actually much higher than the 1,000nit 31" reference monitors'. The power draw info would be interesting. While not optimal, I'd personally be ok with plasma power draw for a few years until microLED was in full swing, in order to get ~3,000nit HDR color along with .0003 black depth.. without FALD dim or bloom offset haloing, nor OLED's 400nit color whitewashed through 800-1,000 nits and immediately ABL'd down to 600nit to avoid burn-in risks.

Maybe we'll get some info closer to when they are supposed to release in china sometime in 2019.

I'm hoping
some other major manufacturers start making feature-rich HDMI 2.1 dual layer LCD models in 2020-2021 to bridge the gap another 2-3 years after that

until hopefully microLED is in full swing and down to high priced (up to $2k-$3k) rather than astronomically priced (like some display tech that arrives at $5k-$10k, even $15k).

Although the dual layer LCD tech has high power draw, they are saying it is cheaper to make.
 
Last edited:

JackCY

n00b
Joined
Nov 14, 2018
Messages
19
Bring on the dual layer LCDs, sure - old stuff for ages, but still nowhere near products that are accessible to regular people. All we get are super expensive pro mastering monitors, and now the TV. Right now it's easier to buy OLED than a dual layer LCD.
Micro-LED, dual layer, with a glow-reducing filter, heck why not, but they don't wanna make it. They would rather sell crappy panels for another decade.
 

CajunAzn

n00b
Joined
Sep 8, 2014
Messages
25
This solution to the contrast ratio issue was so obvious that I came up with it myself shortly after I learned how LCD panels work, including using a lower resolution panel without color filters.

Care to explain exactly how light leakage is controlled by the lower LCD layer, for those of us who are not so obviously enlightened?
 

XoR_

[H]ard|Gawd
Joined
Jan 18, 2016
Messages
1,046
Care to explain exactly how light leakage is controlled by the lower LCD layer, for those of us who are not so obviously enlightened?
The LCD layer in an LCD panel by itself merely twists the polarization of light, so you put a backlight, then a polarizer, then the LCD matrix, then a color filter, then another polarizer, and then some AG coating (or not!) and get the final image. With 2xLCD you might remove the color filter from the first LCD and make it lower resolution, then put another LCD layer on top of the second polarizer and a third polarizer on top of that, and then AG. The first LCD will block most of the light, increasing the contrast ratio of the final image. Of course the two LCD layers would need to be rotated 90 degrees relative to each other. This last bit is problematic mostly if you tried to DIY it (which is totally doable!); for an LCD manufacturer it's not really such a big deal, especially if they were already making a custom LCD layer for this purpose specifically.

The only tricky part is that this setup would worsen viewing angles and might introduce strange effects due to the screen door of the first panel, so you would need some diffusion layer above the first LCD (above its polarizer), and for that you would probably need to use 4 polarizers in total. And all this, especially in a four-polarizer setup, would make the backlight lose a lot of light, so it would need to be much stronger... especially for HDR stuff. Also, both panels would need to be properly synchronized and have at least similar response times. I would imagine the first panel would not really go all the way to black, where response times are the worst, but slightly higher, and it would still give excellent inky blacks.

Back in CCFL days this whole setup was a pipe dream, but since W-LED can be much, much brighter, it is doable. Though it won't be cheap for sure. Also, it might be easier to make a backlight from a ridiculous number of LEDs tightly packed together than to go with 2xLCD, maybe even cheaper, though I am not really sure which would be more expensive...

Definitely OLED seems like a much cheaper solution than either 2xLCD or FALD with real backlight resolution. By real I mean at least 320x160, if 640x320 is wanting too much. Today's FALD doesn't even control single LEDs but groups of them, which is really stupid and detrimental to image quality.
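One way to see why the first panel "blocks most of the light": in an idealized model (ignoring diffusion, inter-layer leakage, and viewing angle), each layer scales transmission independently, so the on/off contrast ratios of the two layers multiply. A sketch using a typical single-VA contrast figure as an assumed input:

```python
# Idealized stacked-panel contrast model: each LCD layer attenuates
# light independently, so combined contrast is roughly CR1 * CR2.

def stacked_contrast(cr1, cr2):
    return cr1 * cr2

va = 3000                        # assumed contrast of one VA panel
print(stacked_contrast(va, va))  # 9,000,000:1 for two stacked VA layers

# Sanity check against the figures quoted in the thread:
peak_nits, black_nits = 2900, 0.0003
print(round(peak_nits / black_nits))  # ~9.7 million:1, the same ballpark
```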
 

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
Bring on the dual layer LCDs, sure - old stuff for ages, but still nowhere near products that are accessible to regular people. All we get are super expensive pro mastering monitors, and now the TV. Right now it's easier to buy OLED than a dual layer LCD.
Micro-LED, dual layer, with a glow-reducing filter, heck why not, but they don't wanna make it. They would rather sell crappy panels for another decade.

Agreed. I mentioned the Dell Alienware OLED as a consideration for me as a "right now" gaming monitor.


Hopefully it'll be more like a 2yr gap rather than 3 between each tech now. If that were the case it'd be something like:

.. 2019 - 2020 .... OLED HDMI 2.1 /or/ 480 to 1000+ zone FALD HDMI 2.1
.. (suffer the 600nit 'SDR+250nit' whitewashed ABL limitation of OLED in HDR content, or the dim/bloom zone offset of lower density FALDs on all content - but with ~2,000nit HDR color)

.. 2020 - 2022 = Dual Layer LCD? ~3,000nit HDR color and .0003 black depth (suffer large power draw, perhaps some other idiosyncrasy of the tech)

.. 2023 - 2024 = MicroLED available and expensive rather than astronomically priced?

... 2030+ = AR/VR - hopefully at some point we'll just have clear insane resolution AR glasses that can slap viewing surfaces, virtual objects and characters anywhere mapped in real space.

At this rate I hope I'm still young enough to appreciate some of the more far off tech. :yuck: Seems like it's been slow to advance.
 
Last edited:

XoR_

[H]ard|Gawd
Joined
Jan 18, 2016
Messages
1,046
Seems like it's been slow to advance.
Compare it with the past:
- 480 lines B&W CRT in the 50's
- 480 lines Color CRT in the 60's
- 480 lines Color CRT in the 70's
- 480 lines Color CRT in the 80's

I think we cannot really complain that much :whistle:
 

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,868
To the 80's me on a C-64 computer, originally hooked to an old black and white TV, an upgrade was a small color TV.. so I could play "Ultima IV: Quest of the Avatar" in color - so things have changed for sure.

Computer-capable monitor tech seemed somewhat stagnant for a while until 120Hz 1080p, and then g-sync and higher rez. Display tech seems to be moving along now, albeit slowly, with HDMI 2.1, OLED, and the promise of dual layer LCD and later microLED.
 
Last edited:

Formula.350

[H]ard|Gawd
Joined
Sep 30, 2011
Messages
1,102
Here's my Triple Layer approach: LED Backlighting -> Quantum Dot layer -> LCD 'contrast' layer -> Color panel layer
:pompous:

With 8Ks launching this year (foolishly), I feel like my hopes of getting a 'high-end' 46in 4K QLED TV that has Adaptive Sync (as I use my TV as a PC monitor), at a price point of $650-850... will never happen. 4K will end up being passed over for the prime features and tech, forcing people like me who can't afford a $1300-2500 set, nor want something >55in, to "settle" for something which ticks but half the boxes we were hoping for.

This may [hopefully] provide another tier of quality sets at an affordable price and still manage to tick a couple extra boxes.
I mean, I know it won't... but I'll keep my fingers crossed. :meh:
 

HiCZoK

Gawd
Joined
Sep 18, 2006
Messages
821
The XM311K is $45K USD :O
And it's just a layer blocking out the backlight lol.
All of this sounds pretty simple... I am surprised we are stuck with the crap we are... the push for stupid pointless 4K and huge 75" TVs is bad. I just want a 27"-32" monitor, no bigger, no smaller....
That dual layer stuff is still just MVA and very expensive
 

CajunAzn

n00b
Joined
Sep 8, 2014
Messages
25
The only tricky part is that this setup would worsen viewing angles and might introduce strange effects due to the screen door of the first panel, so you would need a diffusion layer above the first LCD (above its polarizer), and for that you would probably need to use 4 polarizers in total.

Wow that's more than I expected for an answer. But I still don't see how this multi-layer setup would reduce light leakage (around each pixel).

Every successive layer you add would have the effect of diffracting the light further (creating blooming around the lit pixel). What I'm interested in knowing is precisely how they avoid this haloing of light—especially to the point of being able to rival the pinpoint light control of self-emissive subpixels in OLED.

To me, that is the magic sauce of this tech. ;)
 

XoR_

[H]ard|Gawd
Joined
Jan 18, 2016
Messages
1,046
Wow that's more than I expected for an answer. But I still don't see how this multi-layer setup would reduce light leakage (around each pixel).

Every successive layer you add would have the effect of diffracting the light further (creating blooming around the lit pixel). What I'm interested in knowing is precisely how they avoid this haloing of light—especially to the point of being able to rival the pinpoint light control of self-emissive subpixels in OLED.

To me, that is the magic sauce of this tech. ;)
By the looks of it (e.g. using a monochrome 1080p panel under the 4K panel) they do not aim to completely eliminate it.
And frankly, in reality I would imagine it does not really matter. If we think of the first LCD panel as acting like a FALD backlight, it means we bumped the number of zones from something like 384 or 512 up to 2,073,600. The blooming effect would be minimal, if not outright invisible - you would not be able to tell the difference from an emissive-pixel display, definitely not without specially prepared images or things like pixel-wide lines, single pixels, etc. on a pure black background. CRTs, for example, have much more severe blooming, and I am not even sure there is zero voltage leakage in OLED panels; there can be some and no one would notice it anyway.

So while what you hint at is a real issue, in reality it is near the bottom of the list of issues for such a 2xLCD device: cost, power consumption, heat, maximum achievable brightness, viewing angles, motion artifacts from now having to drive two LCD panels instead of one, and increased thickness (for some reason displays nowadays need to be very slim... XD). Blooming here will be like pretty much any display parameter, or any tech for that matter: when it is good enough you do not even hear about it. At least I hope so.
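A quick back-of-envelope on that zone-count jump, treating the 1080p mono panel as a per-pixel FALD backlight (384/512 are the typical high-end FALD figures mentioned above):

```python
# Dimming-zone counts: conventional FALD vs. a dual-layer LCD where the
# 1080p monochrome layer makes every mono pixel its own dimming zone.

fald_zones = 512                 # typical high-end FALD backlight
dual_layer_zones = 1920 * 1080   # one zone per mono pixel

print(dual_layer_zones)                 # 2073600
print(dual_layer_zones // fald_zones)   # 4050 (roughly 4000x more zones)
```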
 

Necere

2[H]4U
Joined
Jan 3, 2003
Messages
2,743
The one area where I could see even a minimal bloom radius being an issue is text, where it could give the impression of slight blurring or smudging. This would be an argument for using a VA panel, since it blocks a lot more light than IPS and should minimize haloing as much as possible.
 

XoR_

[H]ard|Gawd
Joined
Jan 18, 2016
Messages
1,046
The one area I could see even a minimal bloom radius being an issue is on text, where it could give the impression of a slight blurring or smudging. This would be an argument for using a VA panel, since it blocks a lot more light than IPS and should minimize haloing as much as possible.
Hell no! VA has gamma shift and terrible pixel response times.
VA is terrible, and with the contrast ratio of 2xLCD it certainly does not need this crap!!!!

The only viable display tech for this is IPS
 

Necere

2[H]4U
Joined
Jan 3, 2003
Messages
2,743
Hell no! VA has gamma shift and terrible pixel response times.
VA is terrible, and with the contrast ratio of 2xLCD it certainly does not need this crap!!!!

The only viable display tech for this is IPS
Not arguing that VA doesn't have its own problems, but Hisense was reported to be using VA in their demo, so I'm just trying to think of reasons why. It could also just be what they normally use - I'm not really familiar with their TVs at all.
 

CajunAzn

n00b
Joined
Sep 8, 2014
Messages
25
I am not even sure there is zero voltage leakage in OLED panels; there can be some and no one would notice it anyway.

Blooming in this case will be like pretty much any display parameter, or any tech for that matter: when it is good enough you do not even hear about it. At least I hope so.

Hold on a second, this is way too hand-wavy of an explanation.

First of all, any cross-talk voltage on OLED driver transistor wires would be an order of magnitude smaller than the threshold voltage required to activate the surrounding pixels. So no, there is no additional light leakage on OLED panels as you're theorizing.

Secondly, you are dismissing the central problem of light control for LCD too lightly. Take a look at any LCD (IPS/VA/whatever) showing 100% black... what do you see? Grey. That's because LCD polarizers are not perfect light blockers. Add to that the diffraction effect of small light apertures through a many-layered display stack, and you have a significant optical hurdle to overcome.

On the other hand, showing pure black is simple for a self-emissive display, and the stack can be made very thin, with even less optical interference for newer top-emission OLEDs.

So controlling light leakage is a considerable engineering challenge and would require a special type of LCD stack underneath the normal color LCD - especially to create a reference-grade monitor like the BVM-HX310.

The technical details of that stack are what I'm trying to get at.
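For what it's worth, the idealized math behind the "stacked contrast" claim is just multiplication: each layer leaks roughly 1/CR of the light it receives at black, so the panels' contrast ratios multiply. A minimal sketch (panel contrast figures are illustrative assumptions, and this toy model deliberately ignores exactly the diffraction/haloing between layers discussed above):

```python
# Toy model: black level of a stacked-panel LCD, assuming each layer
# independently leaks 1/CR of the incident light when showing black.
# (Ignores inter-layer diffraction/blooming - the real engineering problem.)

def black_level(peak_nits, *contrast_ratios):
    leak = 1.0
    for cr in contrast_ratios:
        leak /= cr  # each panel attenuates the remaining leak by its CR
    return peak_nits * leak

print(black_level(1000, 1000))        # single 1000:1 panel @ 1000 nits -> 1 nit of grey
print(black_level(3000, 3000, 3000))  # two 3000:1 panels @ 3000 nits -> ~0.0003 nit
```

That second figure is where the ".0003 black depth" number quoted earlier in the thread plausibly comes from.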
 

Formula.350

[H]ard|Gawd
Joined
Sep 30, 2011
Messages
1,102
Hell no! VA have gamma shift and terrible pixel response times.
VA is terrible and with contrast ratui 2xLCD certainly does not need this crap!!!!

Only viable display tech for this is IPS
I dunno, the S-VA panel on my 2010 Samsung TV has an amazing viewing angle with, IMO, little/no color or gamma shift. However, it also has a glossy panel, which no doubt helps.
As for response times, I can't say, given it's a TV and even in "PC" mode there is still SOME post-processing shit active :(
 

VIC-20

[H]ard|Gawd
Joined
Mar 24, 2006
Messages
1,060
I'm using an Apple Thunderbolt display hooked up to an Apple Radeon from a Mac Pro installed in a Z170 UEFI PC running Windows 10. (edit: in my office)

It looks like an older OLED just because it has a thick piece of glass on top of the LCD. That's it - a simple optical trick to get great blacks and vivid color.
Plasma had glass glare too, but I don't care. It looks great, and I wish I could buy a TV, new PC monitor, or PC laptop with a glass panel.

High-gloss plastic exists, but it bubbles on laptops and looks like crap in comparison.

Don't TV engineers ever look at their Gorilla Glass phones and think, "why does this look so much better than my TV?"
 