OLED monitor news

Per-pixel FALD would be very welcome, especially at the HDR-mastered levels of color volume that HDR is designed for in movies and games, with color luminances ranging up to 1,000, 4,000, and 10,000 nits. Even mini LED would be welcomed.
4 pixels per light would be over 2 million backlights,
10 pixels per light would be over 800,000 backlights,
1 million backlights would be about 8 pixels per light, and a more modest
100,000-backlight FALD would be about 83 pixels per light.
The Samsung Q9FN already does a good job with a little dim offset at 17,280 pixels per light (3840 x 2160 divided by its 480-zone FALD), so those much higher numbers would be a huge gain.
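For reference, a quick back-of-the-envelope sketch of that math (purely illustrative, assuming a 3840x2160 panel):

```python
# Rough pixels-per-zone math for a hypothetical 3840x2160 panel
PIXELS_4K = 3840 * 2160  # 8,294,400 pixels

def pixels_per_zone(zones):
    """Average number of pixels each backlight zone has to cover."""
    return PIXELS_4K / zones

for zones in (480, 100_000, 1_000_000, PIXELS_4K // 4):
    print(f"{zones:>9,} zones -> {pixels_per_zone(zones):>9.1f} pixels per light")
# 480 zones (Q9FN-style FALD) works out to 17,280 pixels per light
```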

https://www.techhive.com/article/3239350/smart-tv/will-hdr-kill-your-oled-tv.html

"manufactured by LG) don’t actually use true blue, red, and green OLED subpixels the way OLED smartphone displays do. Instead, they use white (usually, a combination of blue and yellow) subpixels with red, green, and blue filters on top. This has many advantages when it comes to large panel manufacturing and wear leveling, but filters also block light and reduce brightness.

To compensate, a fourth, white sub-pixel is added to every pixel to increase brightness. But when you add white to any color, it gets lighter as well as brighter, which de-emphasizes the desired color. "
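A tiny illustrative sketch of that last point (not any manufacturer's actual pipeline, just generic RGB math): mixing white into a saturated color raises its luminance but lowers its saturation, i.e. it gets lighter as well as brighter:

```python
# Illustrative only: blending white into a saturated red raises luminance but drops saturation
import colorsys

def add_white(rgb, w):
    """Blend a fraction w of white into an RGB triple (0..1 range)."""
    return tuple(c + w * (1.0 - c) for c in rgb)

def rel_luminance(rgb):
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 luminance weights

red = (1.0, 0.0, 0.0)
for w in (0.0, 0.25, 0.5):
    mixed = add_white(red, w)
    _, s, _ = colorsys.rgb_to_hsv(*mixed)
    print(f"white={w:.2f}  luminance={rel_luminance(mixed):.2f}  saturation={s:.2f}")
# luminance climbs while saturation falls - the "de-emphasized color" the article describes
```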
 
Per-pixel FALD would be very welcome, especially at the HDR-mastered levels of color volume that HDR is designed for in movies and games, with color luminances ranging up to 1,000, 4,000, and 10,000 nits. Even mini LED would be welcomed.
4 pixels per light would be over 2 million backlights,
10 pixels per light would be over 800,000 backlights,
1 million backlights would be about 8 pixels per light, and a more modest
100,000-backlight FALD would be about 83 pixels per light.
The Samsung Q9FN already does a good job with a little dim offset at 17,280 pixels per light (3840 x 2160 divided by its 480-zone FALD), so those much higher numbers would be a huge gain.
Sure, but again, it seems more likely that by the time millions - or even hundreds of thousands - of LEDs are cost efficient to put into a display, Micro LED will be, too, which will largely make FALD LCDs obsolete. I think we may still see FALD zone counts hit the tens of thousands, but after that I believe that as OLED becomes more cost-competitive it will take the lion's share of the market, at which point LCD will be relegated to very low-end displays for a few more years, and finally die out.
 
It won't be definitively better until each subpixel has its own micro LED, like OLED.

Not just each pixel, which seems to be their first goal.

I think the difference will be minimal.
Sure, but again, it seems more likely that by the time millions - or even hundreds of thousands - of LEDs are cost efficient to put into a display, Micro LED will be, too, which will largely make FALD LCDs obsolete. I think we may still see FALD zone counts hit the tens of thousands, but after that I believe that as OLED becomes more cost-competitive it will take the lion's share of the market, at which point LCD will be relegated to very low-end displays for a few more years, and finally die out.

If we are talking high end TVs, sure. For everything else, I don't think OLED will get any significant market penetration except replacing FALD LCD TVs. LCD will be around for a very long time until Micro LED becomes cheap enough to replace edge-lit LCDs. OLED will not get broad acceptance in the monitor market due to burn-in risk.
 
Who cares about HDR when they can't even make a computer display these days that gets the basics right? This OLED news... finally something after so many [expletive deleted] years -- I'll take the risk.
 
RGB OLED panels suffer from degradation in color quality due to the difference in the rate at which each of the colors degrades over time/use. Each of the color chemicals has a different rate of decomposition. Sony put out an 11" "limited edition" OLED monitor a few years back and found that with less than 1000 hours on the monitors, users were reporting unacceptable color changes. LG's approach resolved this by using a "white" OLED material along with color filters to produce a vastly better-contrast system in which the gradual degradation of the material happens at more color-balanced rates. The Japan Display panel reverts back to RGB; I suspect they will probably learn this was a bad decision. The attempts to resolve the lifetime of each color at this stage have focused on sealing methods and current control, both to seal the chemicals from atmospheric degradation and to use current carefully to try to extend lifetimes. Samsung, who is perfectly capable of making RGB OLED displays, doesn't believe they have a market-worthy product. Take that as you will regarding this new display (22.1 RGB).
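To illustrate the differential-aging point with a toy model (the lifetimes below are made-up numbers for illustration, not measured data): if each emitter's output decays exponentially with a different lifetime, an initially balanced white point drifts toward the longer-lived colors as hours accumulate:

```python
# Toy model of differential OLED aging - hypothetical lifetimes, illustration only
import math

lifetimes_hours = {"red": 50_000, "green": 40_000, "blue": 15_000}  # assumed; blue ages fastest

def remaining_output(lifetime, hours):
    """Fraction of original light output left after 'hours' of use (simple exponential decay)."""
    return math.exp(-hours / lifetime)

for hours in (0, 1_000, 10_000):
    levels = {color: round(remaining_output(t, hours), 3) for color, t in lifetimes_hours.items()}
    print(f"{hours:>6} h: {levels}")  # the ratios between channels drift apart over time
```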

The reason LG has not done monitors in OLED comes down to a few different factors. Monitors have more intense periods of static images, which isn't good for the technology. And also, they are selling as many OLED panels as they can make now in TVs, a higher-margin product. Eventually, if saturation of the market comes about and they have excess capacity, they may output other sizes. But remember the margins on monitors are small, and they would have to justify factory refit costs (substantial $$) to change to different sizes versus the amount of profit to be made by changing sizes. The speed they make these changes will be influenced by market saturation of TV panels and the gradually cheaper cost of producing the panels over time.

Before anyone waxes too philosophical on microLED, please keep in mind that OLED was here almost 20 years ago and we still aren't using it in as many places as we'd like. Nor is it perfected yet. If you consider how long OLED took, you might find that microLED is farther away than the press releases let on. The reason to make the markets "think" that microLED is right around the corner is that OLED TV panel sales are knocking other technologies out of the market. Since LG holds many of the technology patents on the only workable display tech at large sizes (the "white"+filter tech), others either have to buy their panels, license the tech and make their own, or release press releases about microLED or burn-in or whatnot, hoping to keep people "waiting" long enough to catch up in some way.

Food for thought...
 
Who cares about HDR....

I definitely care about high HDR color volume, high percent Rec. 2020 color, HDMI 2.1 input, VRR, 4K 120Hz native 4:4:4, and burn-in risk in any display I buy in 2019 and on - especially if I pay ~$2,000 - $3,500 for one.
 
Well, given that 99% of the content I'm playing/viewing at this time is in SDR, I couldn't care less about the color volume or maximum brightness. Give it a few more years, then maybe that will be a relevant selling point to me.
 
Samsung, who is perfectly capable of making RGB OLED displays, doesn't believe they have a market-worthy product. Take that as you will regarding this new display (22.1 RGB).

Actually, Samsung already has plans to mass-produce OLED displays by 2020. You can read about it on AVSForum:

"According to related industries, Samsung Display has invested in a QD OLED pilot. Specific equipment orders are scheduled for February next year, and equipment imports are scheduled for October next year. Pilot mass production is expected to commence in July 2020. Full-fledged investment has been announced since 2021 to 10 generations of A5 production facilities."

They are using Blue OLED + Quantum-Dot Color Filter tech to solve the issue of differential aging on RGB OLEDs. So it's more accurate to say they believe they have the superior product, and are gearing up to take LG head on.
 
Well, given that 99% of the content I'm playing/viewing at this time is in SDR, I couldn't care less about the color volume or maximum brightness. Give it a few more years, then maybe that will be a relevant selling point to me.

If only you could get the OS to handle it well too...
 
Give it a few more years

Hopefully you go that long w/o burn-in on OLED. ;)

But seriously -- that is an understandable perspective. It makes me wonder why upgrade at all until that happens, if you have something decent right now and a high-end upgrade is $1,500 - $2,000 - $3,000, not counting GPUs, with some options risking permanent burn-in.

However, people said similar things about 3D pass-through GPUs when only a few games supported Glide and OpenGL, 16:9 support vs 4:3, 1080p "HD" support, 120Hz vs 60Hz support.. 1440p 144Hz being unnecessary/too demanding (at first), 4K only being capable of 30Hz (and then a 60Hz limit, soon 120Hz but with frame rates limited by GPU power), etc. How many games supported each right away? How many movies and streams supported 4K resolution right away? Yet people eager to experience the upgrades adopted some or all of them earlier than others.

Personally I have a decent monitor array already and good GPU power, so I'm not in the market for expensive carry-overs of 2018 tech in 2019, which I consider the HDMI 2.1-lacking 2000-series GPUs and the lower-nit, pseudo-to-weak HDR displays to be. I'll buy once we have HDMI 2.1, 4K 120Hz native 4:4:4, VRR, QFT, ~480-zone FALD, and ~2000-nit HDR in something taller than the ~13" of a 27" 16:9 monitor. The Q9FN series' continuation in 2019 would be tempting and should have all of that, plus relatively low input lag even in interpolation mode for 30fps console games or 60Hz-limited PC games, as well as 120Hz 1440p support. They can also do optional black frame insertion for blur reduction/elimination on movies or games.. They are huge at 65" and expensive at $2,700+, but I'd much rather arrange my setup for something like that than spend $1,500 - $2k or more on monitors that lack all of those features (and/or that risk burn-in).
 
I've already gone 2 years on my older 2016 B6 OLED without any burn-in, so I think I'll be fine until it's time to upgrade. The only thing stopping me from actually buying one of these future OLED monitors isn't risk of burn-in, but the size. I already downgraded from a 32-inch LG 32GK850G to the 27-inch Acer X27, and at this point not even OLED will get me to buy another tiny display. Either I get a 30" minimum or I'll just pass and get the rumored 43" BFGD.
 
For monitors I tend to minimize the brightness. I'm much more interested in black reproduction and the other qualities that used to be considered fundamental. And maybe, with the return of emissive technologies, they will be considered so again.

I figure LCD manufacturers talk about color volume and such, because it's what they can talk about. I certainly will not mourn that technology's passing as much as I lamented its ascendance.

Am indeed holding out for HDMI 2.1. Don't want a 4K 120 Hz panel whose resolution I can't access...
 
I've already gone 2 years on my older 2016 B6 OLED without any burn-in, so I think I'll be fine until it's time to upgrade. The only thing stopping me from actually buying one of these future OLED monitors isn't risk of burn-in, but the size. I already downgraded from a 32-inch LG 32GK850G to the 27-inch Acer X27, and at this point not even OLED will get me to buy another tiny display. Either I get a 30" minimum or I'll just pass and get the rumored 43" BFGD.

How do you like the Acer X27 compared to the LG? I've been using the LG for almost a year now.

The only thing that concerns me about the Acer X27 and its Asus cousin is the reports of SDR content being too dark / having the wrong gamma.
 
How do you like the Acer X27 compared to the LG? I've been using the LG for almost a year now.

The only thing that concerns me about the Acer X27 and its Asus cousin is the reports of SDR content being too dark / having the wrong gamma.

The Acer X27 is just as good as, if not better than, the LG OLED when it comes to bright HDR scenes or just bright-room HDR performance. For dark scenes in HDR, or HDR in a dark room, I found the LG delivers a better experience, because that's when you start to notice the FALD haloing on the X27. The broken SDR is when you are using 144Hz mode, but that also results in chroma subsampling, so I've been sticking to 120Hz 8-bit for SDR content, and there are zero issues there and no chroma subsampling. Overall it's the best gaming monitor, but way too expensive for most people and too small for 4K (in my opinion). I'll be looking to replace it with the rumored 43" BFGD next year.
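For context on the 144Hz vs 120Hz 8-bit thing, here's the rough bandwidth arithmetic (a simplified sketch that ignores blanking overhead, which adds a bit more on top in practice):

```python
# Approximate DisplayPort 1.4 bandwidth check (simplified; ignores blanking intervals)
DP14_EFFECTIVE_GBPS = 25.92  # HBR3, 4 lanes, after 8b/10b encoding overhead

def video_gbps(width, height, hz, bits_per_channel, channels=3):
    """Approximate uncompressed RGB/4:4:4 video data rate in Gbit/s."""
    return width * height * hz * bits_per_channel * channels / 1e9

print(video_gbps(3840, 2160, 120, 8))   # ~23.9 Gbps -> fits within DP 1.4
print(video_gbps(3840, 2160, 144, 10))  # ~35.8 Gbps -> too much, hence chroma subsampling
```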
 
27" seems great. I came here to post this about 20 days late. I've been waiting years to replace a 24" LG 246.
 
I've read all the "color volume" and other LCD features that have been posted about. And yet, when put in a room to watch several hours of good old-fashioned "content", 9 out of 10 people will pick the OLED panels as better looking every single time. Bright rooms, dark rooms, it's all been tested over and over again. In the end you'll get what you prefer for the reasons that keep you happy, but just keep in mind what the eyes have preferred. Perhaps we're just not "holding it right" :)
 
Color Gamut (comparison chart image)

Color Volume (comparison chart image)

Pick your poison (comparison chart image)

--------------------------
So 600-nit OLED is something like
350-nit SDR color + 250 nits of higher-nit color* capability
(*white-subpixel boosted and varying - reflexively toned down from peaks higher than 600 back to 600 via ABL)

1000 nit is SDR + 650 greater color volume capability

1800 nit is SDR + 1450 greater color volume height capability

2900 nit is SDR + 2550 greater color volume height capability

(Dual Layer LCD tech thread)
OLED's 600-nit ABL quasi color volume, due to burn-avoidance safety limits, would be trumped by up to 3000-nit HDR color on these from what they are touting, while still getting .0003 black depths. OLED is good for SDR+ for now, though.. in fact I'd consider one of those Dell Alienware ones depending... but if one of these dual-layer ones came out in 2020 - 2021 and was a good performer while micro LED was still 2 or more years off past that, I'd definitely be interested in buying one of these until whatever year micro LED is out and merely high priced rather than astronomically priced.
 
I've read all the "color volume" and other LCD features that have been posted about. And yet, when put in a room to watch several hours of good old-fashioned "content", 9 out of 10 people will pick the OLED panels as better looking every single time. Bright rooms, dark rooms, it's all been tested over and over again. In the end you'll get what you prefer for the reasons that keep you happy, but just keep in mind what the eyes have preferred. Perhaps we're just not "holding it right" :)

Scenes with blacks are extremely common in TV shows, movies, games, and everything. They're also where it's very easy to see the difference when you compare OLED to LCD.
Scenes exceeding OLED's color volume capabilities are very rare by comparison, and there the difference between OLED and LCD isn't easy to see.


When people try to say LCD is better because it has more color volume, it remindsds me of this:
(image)


Yes, it is an advantage of LCD, but no, it doesn't even come close to making up for the advantages OLED has.
 
With HDR not being ubiquitous right now, I'd agree and say OLED is a great picture for right now, since most content, gaming content in particular, is still SDR. I just wouldn't get one expecting even fractional HDR 1000, 4000, or 10,000 color volume. This is because OLED is roughly 350-nit SDR + 250 nits of white-subpixel-mixed color volume at its 600-nit ABL limits to avoid burn-in.

So 600-nit OLED is something like
350-nit SDR color + 250 nits of higher-nit color* capability
(*white-subpixel boosted and varying - reflexively toned down from peaks higher than 600 back to 600 via ABL)

1000 nit is SDR + 650 greater color volume capability

1800 nit is SDR + 1450 greater color volume height capability

2900 nit is SDR + 2550 greater color volume height capability
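Putting that headroom idea into a tiny script (hypothetical, using the ~350-nit SDR reference level from above):

```python
# Hypothetical color-volume headroom math, assuming ~350 nits as the SDR reference level
SDR_REFERENCE_NITS = 350

def hdr_headroom(peak_nits):
    """Nits available above the SDR reference at a given peak luminance."""
    return peak_nits - SDR_REFERENCE_NITS

for peak in (600, 1000, 1800, 2900):
    print(f"{peak:>4} nit peak -> SDR + {hdr_headroom(peak)} nits of extra color volume")
```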

OLED is shackled to severe nit limits by ABL, and it uses white subpixels that pollute the color space for brighter ranges as it is... so it's hardly "HDR", and due to those methods it can't even be calibrated in HDR.

Some TVs like the Samsung Q9FN can do 1800 ~ 2000 nit color volume, but you get the bloom/dim offset of FALD, and while they are 19,000:1 contrast they aren't like OLED, especially concerning FALD offsets.

- the 2019 TVs in both LCD and OLED also lack the DisplayPort the Dell Alienware gaming OLED will have, which is huge since Nvidia has no HDMI 2.1 GPUs yet
(though, to be fair, the ~13"-tall ASUS and Acer 2560x1440 27" 384-zone 1000-nit FALD gaming monitor models both also have DP 1.4 (but aren't 4K and are "only" HDMI 2.0b))

So if not using it for HDR (and since HDR content is not ubiquitous yet), OLED is much more solidly the best choice for now in regard to gaming, imo, as long as the color/nit-limiting safety features do their job to prevent burn-in.

- and also specifically because the Dell Alienware has HDMI 2.1 and DP 1.4, so it can do 98Hz - 120Hz 4K with some sort of variable refresh rate tech off of current GPUs, and has some HDMI 2.1 future-proofing for 4:4:4 chroma at 120Hz 4K and probably HDMI VRR whenever HDMI 2.1 GPUs come out.

That said - OLED manufacturers like LG are not confident enough in their safety-feature methods to cover burn-in under warranty at all.. not even for 1 year, where some high-nit LCDs cover burn-in for 4 years.

.....................................................
.....................................................


Dual-layer LCD, using a 2nd monochrome LCD as a "white emitter" backlight and double layers of black filtering down to .0003 black depth with ~2 million pixel-sized backlights, could potentially come to market as a good "HDR 3000" upgrade in a few years, until micro LED is in full swing much later.
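A rough sketch of why stacking two panels gets you to those black depths (the native contrast numbers below are assumptions for illustration, not published specs): the two layers' light leakage multiplies, so their contrast ratios roughly multiply too:

```python
# Hypothetical dual-layer LCD contrast math - illustrative numbers only
PEAK_NITS = 3000               # the "HDR 3000" figure being touted
COLOR_PANEL_CONTRAST = 3000    # assumed native contrast of the RGB layer
MONO_PANEL_CONTRAST = 3000     # assumed native contrast of the monochrome backlight layer

stacked_contrast = COLOR_PANEL_CONTRAST * MONO_PANEL_CONTRAST  # ~9,000,000:1
black_level = PEAK_NITS / stacked_contrast                     # ~0.0003 nits

# A 1920x1080 monochrome layer would give 1920 * 1080 = 2,073,600 pixel-sized backlight zones
print(f"stacked contrast ~{stacked_contrast:,}:1, black level ~{black_level:.4f} nits")
```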

(Dual Layer LCD tech thread)
OLED's 600-nit ABL quasi color volume, due to burn-avoidance safety limits, would be trumped by up to 3000-nit HDR color on these from what they are touting, while still getting .0003 black depths. OLED is good for SDR+ for now, though.. in fact I'd consider one of those Dell Alienware ones depending... but if one of these dual-layer ones came out in 2020 - 2021 and was a good performer while micro LED was still 2 or more years off past that, I'd definitely be interested in buying one of these until whatever year micro LED is out and merely high priced rather than astronomically priced.

 