Why I don't believe in FALD

Roen

Why I don't believe in the concept of FALD

I've never owned a panel with lots of dimming zones, so this is mostly theory, but I don't believe in FALD. I'll explain why in technical terms shortly, but the broader point is that FALD is an attempt to get around the hard contrast limit inherent to LCD panel technology. It tries to turn the panel into something it's not, and can't become.

If the manufacturer configures dimming zones to make a big difference to blacks, meaning they block a ton of light, then you're going to really notice whenever those zones turn on or off. The more light a zone blocks, the bigger the difference between a blocked zone and an open zone, and thus the more blooming you will see around it. Having more, smaller zones reduces the blooming, but only OLED has 1x1-pixel "zones", so blooming remains an issue with FALD. And the more light a FALD zone can block to actually improve contrast, the more it blooms and distracts. So given the same number and size of zones, the more effective it is, the more distracting it gets. The 50" QN90A, for instance, has 448 zones. That is not a lot when you have 8,294,400 pixels: that's roughly 18,514 pixels per zone, a.k.a. an area about 136 x 136 pixels. That's still a pretty big chunk:
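A quick back-of-the-envelope check of that zone math (assuming a 3840x2160 panel and the commonly reported 448 zones):

```python
# Pixels per dimming zone on a 4K panel with 448 zones,
# and the side length if each zone were a square.
pixels = 3840 * 2160           # 8,294,400 pixels
zones = 448
per_zone = pixels / zones      # ~18,514 pixels per zone
side = per_zone ** 0.5         # ~136 px per side
print(round(per_zone), round(side))
```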

You've probably built a house in Minecraft with smaller blocks than that. :D I know I have. Then you know how hard it is to create any sort of detail at that granularity. I'll quote from the "cons" of DisplayNinja's review of the Acer Predator X35, a multi-thousand-dollar monitor with 512-zone FALD: "Noticeable blooming" - and that's only a 35" ultrawide. A 50" QN90A TV, for instance, has about 2.5x the surface area and only 448 zones, so it should bloom more than 2.5 times as much, other aspects being equal. When speed and latency are important (game mode!) the panel doesn't have time to gradually fade zones in and out, assuming it can do that in the first place. Which brings me to my next point: FALD in game mode, and what people have written about it. These are some comments from QN90A owners about the FALD on their MiniLED TVs:
  • In game mode, Local Dimming on Medium or High completely washes out parts of the screen. Local Dimming on Low resolves that but lessens how bright the screen gets.
  • Some blooming around bright objects, especially viewed off-center. In game mode, local dimming zones are bigger and the entire screen looks more gray than black (over-brightened); blooming is more aggressive than outside game mode too, because the zones are larger and switch a little slower.
  • You will notice a slight halo/bloom on the 2021 QN90A if you use a black background on the desktop (around the taskbar and the mouse cursor), though I didn't notice it gaming. But I have my brightness maxed out.
  • Global dimming effect that you can't turn off (link to LTT Forums), confirmed by a 2nd user at the bottom of that thread.
B. the Installer's QN90A review, the section about local dimming:

He's not a fan - at all.
A MiniLED TV is supposed to be more suited to FALD than others, so what does that tell us?

My experience with the Gigabyte Aorus FV43U's dimming, which I understand is worse on paper, is that it is so subtle you get the exact same result by lowering the brightness to 45/100: same black levels, same highlight brightness, same slightly veiled, less impactful image due to the lowered brightness. You don't get brightness control while local dimming is on, and highlights are a little dim, so I prefer to disable its local dimming and simply set brightness to 45-65 depending on the game (dark scenes or mostly bright). In a moderately dark game I like 50: very slightly worse blacks than local dimming (which equals brightness 45), but it offers just that little bit of much-needed highlight pop that local dimming mode lacks.

The Samsung Odyssey G7 that I briefly used before returning it had only 8 dimming zones, not unlike the FV43U, but they were more aggressive. They actually did something, which made them very distracting, so local dimming was useless there too.

I know you're probably thinking: "Those are all bad examples of local dimming, and there's a panel I saw at a trade show in Beijing made from unobtainium that proves you wrong." All I can say is that based on the theory above I am not convinced that "good FALD" should be a major factor in a purchasing decision, especially in game mode, the mode I care about. I could be wrong, I'm just not currently convinced. FALD in general is such a crutch. Like I said, the more effective it is, the more obvious it gets, so it's the panel's actual static contrast that counts. FALD is a technology that fully solves nothing and creates new problems in the process. Ideally it should not have to exist, and maybe it should not exist, period. Either dim per-pixel or don't do it at all.
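The surface-area comparison a few paragraphs up can be sanity-checked with basic geometry. This assumes a 21:9 aspect ratio for the X35 and 16:9 for the QN90A and ignores bezels:

```python
def area(diagonal, aspect_w, aspect_h):
    """Screen area from diagonal size and aspect ratio."""
    ratio = aspect_w / aspect_h
    height = diagonal / (ratio ** 2 + 1) ** 0.5  # Pythagoras on the diagonal
    return height * ratio * height

# 50" 16:9 TV vs 35" 21:9 ultrawide monitor
ratio = area(50, 16, 9) / area(35, 21, 9)
print(round(ratio, 1))  # ~2.4x the panel area
```

So "about 2.5x the surface area" is roughly right; the exact figure is closer to 2.4x.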
 
FALD is not perfect, but it is a good tech. If OLED did not exist, it would be a great tech, but now that Samsung has released its QD-OLED products, FALD LCD tech may be at its peak. OLED's biggest con is brightness, and FALD backlighting fixes this.

One thing I will say about my Samsung Q90R vs. my LG CX is that the Q90R has a lot more color volume than LG's WOLED. Personally, I find that color volume a lot more pleasing in most situations. However, the LG CX wins in high-contrast scenes, especially with HDR content. Also, most of this tech is made (or broken) by its processing. That's the reason so many people like Sony's OLED TVs more than LG's, even though they technically use the same panel; the LG OLED TVs are more well-rounded due to feature content, but Sony's picture processing is second to none.

Where LG really pulls ahead, though, is absolute features and post-purchase support (firmware/software updates), but that's not related to the panel tech itself.

TL;DR FALD tech works great when it's used properly. It's not perfect, but no tech is.
 
now that Samsung has released its QD-OLED products, FALD LCD tech may be at its peak
For me personally that depends on whether QD-OLED burns in when used as a PC monitor. I suspect it will, at least for my use case. It'd just take a little longer.

OLED's biggest con is brightness, and FALD backlighting fixes this.
True that. Until the number of zones is increased significantly, to at least, say, 1/9th of the total number of pixels (3x3-pixel zones, 921,600 of them) or 1/16th (4x4 = 518,400 zones), without breaking the bank, I can't see myself getting too excited though. I can't see that happening, tbh. I'll take a lower-contrast image free of blooming artifacts over one with noticeable blooming. Those half a million zones would effectively give you a luminance grid with a resolution of 960x540 for a 3840x2160 panel.
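For reference, on a 3840x2160 panel those hypothetical zone sizes work out as follows (simple arithmetic, not a spec of any shipping display):

```python
# Zone grid resolution for hypothetical 3x3 and 4x4 pixel zones
# on a 3840x2160 panel.
width, height = 3840, 2160
for block in (3, 4):
    gw, gh = width // block, height // block
    print(f"{block}x{block} px zones -> {gw}x{gh} grid, {gw * gh:,} zones")
```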
 
I don't see as much need for it now that OLED has been around for a while and is relatively affordable. QD-OLED fixes a lot of shortcomings of LG's WOLED and doesn't have a crazy starting price either, so I expect it may get pretty affordable next year.

A while back, before OLED existed at reasonable prices, it would have been great to see FALD on more sets though. I think when implemented well it adds a lot over a standard LCD. Back then the TV manufacturers were all pumping out edge-lit garbage except at the absolute high end (and even then, Samsung had a few years that were all edge-lit even at the flagship level). But if you go back and look at sets like the Samsung UNB8500, reviews held those in high regard even compared to plasmas at the time.
 
Like everything in displays, you have compromises. My experience is that FALD with enough zones gives good results. To me the primary issue is that the tech is way, way overpriced. We have OLEDs from both LG and now Samsung that perform well in HDR at a much lower cost than most Mini-LEDs. As more options come to market, I hope Mini-LED LCD displays tank in price once they are no longer seen as a premium option.

Note that FALD is mainly relevant for HDR content. There isn't much point in using it for SDR, where you will instead get typical LCD performance. Compared to OLED, in SDR content the main thing you will notice is contrast ratio. Even though my Samsung CRG9's VA panel is nowhere near my LG CX 48" OLED panel in this regard, in desktop use I feel there isn't a significant difference in how "deep" or "contrasty" they look when calibrated to the same 120-nit brightness I typically use for desktop. IPS panels tend to be far worse for contrast, and that is noticeable in SDR.
 
Even though my Samsung CRG9's VA panel is nowhere near my LG CX 48" OLED panel in this regard, in desktop use I feel there isn't a significant difference in how "deep" or "contrasty" they look
For desktop, sure. For dark SDR games in my experience VA static contrast still leaves quite a lot to be desired. (I'm talking FV43U, measured 4600:1 by RTINGS. Manufacturer's spec 4000:1, very high by VA standards). Beats IPS obviously but still makes me pick between having deep blacks or bright highlights, or a compromise that has neither.
 
This is probably going to get the OLED crowd's panties in a knot, but here goes. OLED will never be great in anything other than a dark room. The problem is not really the peak brightness but the huge brightness drop as the bright area increases in size.

Although QD/Evo OLED finally broke the 1,000-nit barrier, it can only maintain a dismal ~200 nits at a 100% window (although that's already much better than the 128 nits of a C1). I don't know how people can accept a screen that goes from ~1,000 nits at a 2-10% window to ~400 nits by 50% and drops to a dismal ~200 nits at 100%.

Even the Sony X95J, which costs $1,000 less (65") than a G2 and $1,500 less than the S95B, also peaks at ~1,000 nits yet can still maintain ~600 nits at 100%. And its FALD gets pretty high marks (8.5) from RTINGS.

My Q90T can hit close to 1,500 nits and can still maintain ~800 at 50% and ~600 at 100%.

For fast-paced FPS gaming in a dark room an OLED is arguably better, with better pixel response resulting in minimal blur/smearing, but that's about it. I had a 77" C1 for about 2 months and promptly replaced it with a 75" Q90T: while it looks good in the evening, it can't even compare to my old 2016 KS9500 in the daytime.

I wasn't really concerned about the peak brightness, as you won't really notice a big difference between the ~850 of the old C1 and something that does 1,000, but I had high hopes that QD/Evo would dramatically improve the brightness at 100%. Guess the O in OLED is rearing its ugly head, as it seems anything over 200 nits sustained at 100% will cause thermal degradation of the pixels.
 
LCD remains king of the bright rooms. But honestly, when watching content on my C1 in a room without direct sunlight but windows open... not really an issue. I could see it being one if there was direct sunlight though.

Anyway FALD was trying to solve a problem and did alright. But it's very complex and the end result is still worse than OLED and Plasma before it. It has been a really cool idea, but it was doomed from the start.

That said... dual LCD imho was cooler than FALD. Although I think the only consumer TV we got with the tech was the Hisense U9DG. Over 2 million "dimming zones" thanks to the 2nd black-and-white 1080p panel used to block light instead of an array of LEDs.
 
Yeah, the Acer Predator X38S (IPS) is stuck at 700 nits no matter what, and the advantage over the constantly ABL-ing OLED is very apparent in bright HDR scenes and in overall monitor usage, including SDR content.
Dark scenes can't be compared though; OLED wins hands down, no contest. Black crush is insane on the IPS in HDR mode.
 
Yeah, the Acer Predator X38S (IPS) is stuck at 700 nits no matter what, and the advantage over the constantly ABL-ing OLED is very apparent in bright HDR scenes and in overall monitor usage, including SDR content.
Dark scenes can't be compared though; OLED wins hands down, no contest. Black crush is insane on the IPS in HDR mode.
VA black is much better. There's almost no black crush at all on my Q90T even running LG's 4K demo (some blooming is present though).
 
LCD remains king of the bright rooms. But honestly, when watching content on my C1 in a room without direct sunlight but windows open... not really an issue. I could see it being one if there was direct sunlight though.

Anyway FALD was trying to solve a problem and did alright. But it's very complex and the end result is still worse than OLED and Plasma before it. It has been a really cool idea, but it was doomed from the start.

That said... dual LCD imho was cooler than FALD. Although I think the only consumer TV we got with the tech was the Hisense U9DG. Over 2 million "dimming zones" thanks to the 2nd black-and-white 1080p panel used to block light instead of an array of LEDs.
Yeah except Hisense has discontinued it. And there were issues with the approach, such as ghosting and response times according to RTINGS.

Until we get a self-emissive display tech without burn-in risk (not because burn-in is a major issue on the most recent TV/monitor releases, but because that risk is only mitigated by limiting brightness, shifting pixels, and so on), like MicroLED or an equivalent, all of these solutions, OLED, FALD LCD, QD-OLED, etc., are basically stop-gaps. It's a matter of choosing what you like and what you can afford/live with until we get there. For me that's FALD MiniLED monitors; for others it's OLED TVs, QD-OLED, etc.
 
I know. The Dual LCD thing was kinda complicated, but it looked promising.

Yeah, MicroLED is looking like it solves the problem. Still feels ages away. Till then we are stuck with the seemingly never-ending battle of LCD + bolt-ons & tech with burn-in :D
 
Yeah, the Acer Predator X38S (IPS) is stuck at 700 nits no matter what, and the advantage over the constantly ABL-ing OLED is very apparent in bright HDR scenes and in overall monitor usage, including SDR content.
Dark scenes can't be compared though; OLED wins hands down, no contest. Black crush is insane on the IPS in HDR mode.
The Alienware OLED has no ABL while in SDR mode, only when HDR is engaged. The LG OLEDs do have it in SDR, however.
 
I believe in FALD. :)

At least in terms of watching movies.

I briefly had the original 40" Samsung FALD set. Movies looked pretty great on it. However, someone had dropped it or something and one of the corners was messed up, so I returned it. I figured I'd check out the following year's 40" model; I mean, it's just going to get better and better, right? Nope. There were some successors, all larger, but then it seemed abandoned, and instead we got edge-lit. That's the sad story of 21st-century displays, or large swaths of it anyway. I figured FALD had been left for dead until Vizio later kind of revived it and went all in. And then others...
 
Yeah except Hisense has discontinued it. And there were issues with the approach, such as ghosting and response times according to RTINGS.
That's a shame. It showed promise as an alternative to mini-LED.

As for ABL on OLEDs, I can't seem to get it to happen in SDR with brightness at 20-30 on my LG CX 48", even putting up a full-screen white webpage. I guess it triggers if you run at higher brightness than I do, or in HDR mode, where I can see it happening. For me it has been an entire non-issue running this thing for over 1.5 years now.
 
512 zones is just far too low a number for the FALD idea to work well.
Not only can it not possibly provide a bloom-free image, but larger zones also make it harder to use the dimming backlight effectively. With a backlight that could actually resemble the displayed picture, it would be much easier to make this tech work well in games.

With something like 640x360 = 230,400 zones the image would be pretty good even with simple control algorithms.
Heck, make it at least 320x180 = 57,600 zones and it would still be pretty good.
And on an IPS panel with an A-TW polarizer. VA is sheet that cannot be fixed by anything.
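Assuming a 3840x2160 panel, the zone grids suggested above translate to the following zone sizes and counts (a quick check, not a spec of any shipping display):

```python
# Zone size and count for the suggested backlight grids
# behind a 3840x2160 panel.
width, height = 3840, 2160
for gw, gh in ((640, 360), (320, 180)):
    print(f"{gw}x{gh} grid -> {gw * gh:,} zones of {width // gw}x{height // gh} px each")
```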
 
VA can get way darker than IPS and can knock on the door of plasma levels of dark without any special backlight tech. So if you combine VA with FALD you get a better result than with IPS, as the blooming is less noticeable.

Shame that G9 Neo is such buggy pos.
 
There's got to be more early OLED adopters besides me (people that have had an OLED for 6+ years). No burn-in yet.

I mean, the experts say "yes".... how many people are suffering from it (real world)?
 
Let me borrow it for a month and you'll see. :D
Regardless, it's not as if burn-in is OLED's only problem. They're not suitable for gaming (VRR gamma flicker) nor as monitors (PWM and a mirror finish). They are made for one specific use case: entertainment viewing in the evening. And to get that modicum of brightness they use twice the power per nit of light compared to LCD.

OLED is like FALD in that it's a stopgap technology that will be left behind as soon as we have a real solution.
 
I've never owned a panel with lots of dimming zones so this is mostly theory but I don't believe in FALD. [...] Either 'dim' per-pixel or don't do it at all.

FALD is a trade-off. It has plenty of problems. But just because it isn't perfect doesn't mean it isn't better than lighting up the entire screen evenly 100% of the time.

Kind of a pointless technology if OLED is cheaper though.
 
"I've never owned a panel with lots of dimming zones so this is mostly theory but I don't believe in FALD."

Roen, though workarounds can be aesthetically distasteful, they can work. FALD is certainly a workaround for LCD's limitations, but it can produce a very impressive picture.

I wouldn't lump OLED in with FALD. OLED has its issues, but it's a true emissive technology that stands on its own.

 
though workarounds can be aesthetically distasteful, they can work.
Perhaps we ought to make a little tier list then.
EDIT: not my best idea ever, suggestion for tier list format removed from this post.
 
Perhaps we ought to make a little tier list then. If it grows enough to become annoying in this thread we can always move it to a dedicated thread.


FALD / Local Dimming display model tier list
  • Tier 1: Improves contrast by orders of magnitude, never distracting
  • Tier 2: Improves contrast by orders of magnitude, occasionally distracting
  • Tier 3: Improves contrast by orders of magnitude, regularly distracting
  • Tier 4: Noticeably improves contrast, never distracting
  • Tier 5: Noticeably improves contrast, occasionally distracting
  • Tier 6: Noticeably improves contrast, regularly distracting
    • Samsung Odyssey G7 32"
  • Tier 7: Little to no improvement to contrast, never distracting
    • Gigabyte Aorus FV43U 43"
  • Tier 8: Little to no improvement to contrast, occasionally distracting
  • Tier 9: Little to no improvement to contrast, regularly distracting
Format: BRAND MODEL SIZE, one line per model. Append "(EOL)" to models no longer in production. Don't quote, manually copy & paste the whole list including this line before adding to it.
That's pointless work. The tier list with current technology and displays available basically reads:

1. OLED
2. Mini-LED
3. Regular FALD
4. Edge lit
5. No local dimming

Within that you will have pros and cons, like Mini-LED being capable of higher full-window brightness and brighter highlights but not doing as well as OLED in scenes that combine small bright and dark areas. Meanwhile OLED has per-pixel dimming and superior black levels but is capable of less maximum and sustained brightness. Like everything with displays, it's just picking different compromises.
 
In broad terms I agree with your point, so I've removed the tier list format I suggested from my previous post. It's probably more useful to compare the specific displays one is interested in directly within the same category of the five you just mentioned. I'm aware of the categories and their differences, I just wasn't thinking this through.
 
Another bad thing about FALD is that it adds input lag, and that cannot be fixed. It will always be part of the technology.
 
There's got to be more early OLED adopters besides me (people that have had an OLED for 6+ years). No burn-in yet.

I mean, the experts say "yes".... how many people are suffering from it (real world)?
*waves*

B6P, heavy use, mainly as a PC display. Most of the damage was done early (first year); it was the Windows taskbar that burned in. Fewer problems since I lowered the OLED backlight, but over a year of WFH has definitely done some wear, and in non-gaming situations I can see the areas that can't be driven as well, because that's where I kept my windows during WFH.

I would say as long as you don't blast the OLED backlight you should be fine, but at least on early panels you absolutely could suffer burn in on relatively short timespans.
 
I know. The Dual LCD thing was kinda complicated, but it looked promising.

Yeah, MicroLED is looking like it solves the problem. Still feels ages away. Till then we are stuck with the seemingly never-ending battle of LCD + bolt-ons & tech with burn-in :D
MicroLED seems to be the next thing for sure, as you get per-pixel emission combined with the lack of burn-in risk. For now it's super-high-end (77" and up) only, but I'm hopeful that within half a decade they'll start making their way into more mainstream sets (55" and lower). Not expecting them in actual PC displays for a good decade though...
 
MicroLED seems to be the next thing for sure, as you get per-pixel emission combined with the lack of burn-in risk. For now it's super-high-end (77" and up) only, but I'm hopeful that within half a decade they'll start making their way into more mainstream sets (55" and lower). Not expecting them in actual PC displays for a good decade though...
I think at this point even half a decade for expensive consumer-priced TVs seems quite optimistic. It might end up being another option that remains the domain of very expensive, very large displays for a long time, as the manufacturing is more involved than for other technologies currently available. If even Samsung is investing in QD-OLED as an interim technology, that tells me Micro-LED is still quite a few years away.
 
*waves*

B6P, heavy use, mainly as a PC display. Most of the damage was done early (first year); it was the Windows taskbar that burned in. Fewer problems since I lowered the OLED backlight, but over a year of WFH has definitely done some wear, and in non-gaming situations I can see the areas that can't be driven as well, because that's where I kept my windows during WFH.

I would say as long as you don't blast the OLED backlight you should be fine, but at least on early panels you absolutely could suffer burn in on relatively short timespans.
I mean, I get it (as one of those "early" people), I'm just not seeing it. But I don't leave it on 24x7 with the "same thing" displaying.
 
This is probably going to get the OLED crowd's panties in a knot, but here goes. OLED will never be great in anything other than a dark room. The problem is not really the peak brightness but the huge brightness drop as the bright area increases in size.

Although QD/Evo OLED finally broke the 1,000-nit barrier, it can only maintain a dismal ~200 nits at a 100% window (although that's already much better than the 128 nits of a C1). I don't know how people can accept a screen that goes from ~1,000 nits at a 2-10% window to ~400 nits by 50% and drops to a dismal ~200 nits at 100%.

Even the Sony X95J, which costs $1,000 less (65") than a G2 and $1,500 less than the S95B, also peaks at ~1,000 nits yet can still maintain ~600 nits at 100%. And its FALD gets pretty high marks (8.5) from RTINGS.

My Q90T can hit close to 1,500 nits and can still maintain ~800 at 50% and ~600 at 100%.

For fast-paced FPS gaming in a dark room an OLED is arguably better, with better pixel response resulting in minimal blur/smearing, but that's about it. I had a 77" C1 for about 2 months and promptly replaced it with a 75" Q90T: while it looks good in the evening, it can't even compare to my old 2016 KS9500 in the daytime.

I wasn't really concerned about the peak brightness, as you won't really notice a big difference between the ~850 of the old C1 and something that does 1,000, but I had high hopes that QD/Evo would dramatically improve the brightness at 100%. Guess the O in OLED is rearing its ugly head, as it seems anything over 200 nits sustained at 100% will cause thermal degradation of the pixels.
You have been scammed by manufacturers like Samsung, which blow up midtones, making them brighter than they are supposed to be. They have done this trick again and again, even with their new QD-OLED TV.

When doing a blind test with real people and calibrated screens, OLED will always win because of its superior dynamic range. There is no competition; LCD is inferior.

But sure, enjoy your bright LCD in a bright room all you want.
 
If the manufacturer configures dimming zones to make a big difference to blacks, meaning they block a ton of light, then you're going to really notice whenever those zones are turned on or off. The more light the zone blocks the bigger the difference between a blocked zone and an open zone
You aren't describing FALD FYI.
 
You have been scammed by manufacturers like Samsung, which blow up midtones, making them brighter than they are supposed to be. They have done this trick again and again, even with their new QD-OLED TV.

When doing a blind test with real people and calibrated screens, OLED will always win because of its superior dynamic range. There is no competition; LCD is inferior.

But sure, enjoy your bright LCD in a bright room all you want.
Don't bother arguing with him.
 
I don't know about FALD in monitors, but Sony LCD TVs have the best algorithms, which work the same even in game mode.
My 55XH9505 has only 48 zones, yet going from local dimming off to medium is mind-blowing for an LCD. And this XH95 has only average native contrast for a VA TV, about 3000:1, because of its anti-reflection/wide-angle coating.
 