Why OLED for PC use?

This gets to the heart of things, I think. From what I've gathered, Kram stretches/filters HDR curves into his own custom curves, lifting all the top lows and the mids like a torch mode. That's his contrast increase: moving the normal color-curve goal posts to his own distorted ones. He's said the default HDR curves are inadequate, made for the HDR1000-and-under screens most people have rather than his 1400-nit one - but he'd likely throw that same kind of curveball at the top lows and mids of an HDR 4,000 or HDR 10,000 capable game's curve vs. the default screen mapping if he had screens like that too. That's fine if it's what he likes, and it can look pleasing, sort of like an "HDR reshade" bump on a very bright FALD - but it's not accurate, it's just stretched/inflated/bulged. That lift gets applied to objects and unknown object properties, to large and small areas under unknown conditions, and to scenes whose levels wouldn't necessarily look lifted like that IRL or by creator's intent. It's a shaped filter, so it operates blind to what the content is and what its properties are IRL or what they are virtually representing. It's his own custom look, like someone's favorite reshade settings. Additionally, 7,000 to 15,000 pixels per FALD zone, plus offsetting and toning multiple surrounding FALD zones of 7k-15k pixels each all of the time, is never going to be accurate to begin with, as has been evidenced many times in the thread already.

That's not what you debated originally. Your original claim was that the lowlight is lifted, resulting in less contrast, even though the waveform shown as proof has higher contrast. And you don't know how to read it.

"
Lift is a lift. You turned the whole orange, yellow, and red heatmapped Erdtree into ~1000 nits and lifted the whole green area on the right from a more meaningful depth. Those areas are no longer as dark in absolute value, nor in comparison to whatever peak brightness areas are displayed. Simple as that.
"
The lowlight is even lower. You just don't know how to read a waveform.
727274_52343528939_3430f00a5f_o_d.png




Since I make HDR with whatever preference I like, you should consider whether your OLED monitor is capable of displaying the content with my intent. Once ABL kicks in, all of an OLED's pixels shrink in both contrast and color, which results in even worse accuracy.
 
Last edited:
Of course FALD LCD cannot do as well as dual-layer LCD, but it does much better than whatever OLED you have. That's why the professionals use it. Yet you call them idiots. You are so exposed.
Ah yes, all those Chinese professional youtubers with professional Nikon Z9s making super serious content for Youtube.
That's not what you debated originally.
LOL. Now you suddenly care about what was debated originally. Holy double standards, Batman.
 
That's not what you debated originally. Your original claim was that the lowlight is lifted, resulting in less contrast, even though the waveform shown as proof has higher contrast. And you don't know how to read it.

Since I make HDR with whatever preference I like, you should consider whether your OLED monitor is capable of displaying the content with my intent. Once ABL kicks in, all of an OLED's pixels shrink in both contrast and color, which results in even worse accuracy.

Yes, as contrast measurements are often understood as depth-to-darkness capability vs. peaks, as in a VA or FALD screen's black depth capability vs. an edge-lit IPS's. Taking a 200-nit log in a swamp and lifting it to 500 nits while scaling all of the mids even higher (exaggerating a bit there, but maybe not) is a farther distance, sure, but it's not what people usually intuit from a raw/flat X:1 contrast measurement in specs. It's also not necessarily accurate to IRL or artistic intent... but I won't go into the whole debate again here. That will suffice. It's just another can of worms, and this is all just you resorting to ad hominem attacks on someone personally again in order to distract from the points of this thread in the first place.


The problems with OLEDs are just Brightness, Burn-in, Text Clarity, and Color Fringing. Everything else is perfect.

Re: Text / Fringing

There are two ways imo to get clear text.

- The optimal way is to sit far enough away to get 60 PPD, or better yet higher (especially with the WRGB subpixel structure on OLEDs). Sitting at near distances with the screen on your desk will result in a 1500p-like pixel density to your eyes, which will make text and graphics look fringed even with text-ss and game AA applied aggressively. It will also push the sides of the screen outside of your 50-60 deg human viewpoint and exacerbate off-axis and color uniformity issues at the sides of the screen.

- If you have to sit that close, where you get a ~1500p-like pixel density, you can sacrifice desktop real estate down from 4k 1:1 pixels by using Windows scaling to scale everything up some. That will mean more pixels per character of text, so it will help, but you will lose space compared to 1:1 px 4k.

- Desktop scaling doesn't do anything vs. aliasing of game graphics, though. Once you drop below 60 PPD, even aggressive AA won't be able to compensate enough vs. pixelization. Still usable, but not the full picture quality it would otherwise get (more like a 1500p screen's density at near desk distances rather than the fine pixels you'd expect from a 4k screen).

- Also worth noting that 2D desktop graphics and imagery typically remain completely uncompensated vs. pixelization (no AA, no sub-sampling other than the edges of fonts).

It would be nice if there were some 32" models eventually for people who need shorter view distances.

42" 4k screen at 24" view distance is 52 PPD.

27" screen, 2688 x 1512 res, at 24" view distance = 52 PPD

The 24"-view crowd are in a way using a 42" 4k like a 27" 1500p screen's pixels, exacerbating off-axis viewing angle issues instead of getting 4k fine-pixel PQ. 😝


768839_8ss1o9P.png

. . . . . . .



On the other hand:

. . 32" 4k at a 24" view would be ~64 PPD (and 60 deg), which can be compensated for with aggressive AA and text sub-sampling (though the 2D desktop's graphics and imagery lack AA outside of text sub-sampling)

. . 32" 4k at 27" view would be ~ 70 PPD and 55 deg viewing angle.

So really a much better fit for the near-desk-view crowd. Right now it's a square-peg-in-a-round-hole thing going on with larger screens for a lot of people, from what I've read in threads and seen in images.
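The PPD figures quoted in this thread can be sanity-checked with a bit of trigonometry. A quick sketch (assuming flat 16:9 panels viewed head-on, with PPD measured across the full horizontal field of view; results land within a PPD of the figures above, depending on rounding):

```python
import math

def ppd(diagonal_in, horiz_px, view_dist_in, aspect=16/9):
    """Return (pixels per degree, horizontal FOV in degrees) for a flat screen."""
    # Horizontal width from the diagonal: for 16:9, width = diagonal * 0.8716.
    width = diagonal_in * aspect / math.hypot(aspect, 1)
    fov_deg = math.degrees(2 * math.atan((width / 2) / view_dist_in))
    return horiz_px / fov_deg, fov_deg

for label, diag, px, dist in [
    ('42" 4k @ 24"', 42, 3840, 24),
    ('27" 2688x1512 @ 24"', 27, 2688, 24),
    ('32" 4k @ 24"', 32, 3840, 24),
    ('32" 4k @ 27"', 32, 3840, 27),
]:
    p, fov = ppd(diag, px, dist)
    print(f"{label}: ~{p:.0f} PPD over a {fov:.0f} deg viewing angle")
```

It also confirms the size-independent case: any 4k screen filling exactly 60 degrees of your view gives 3840 / 60 = 64 PPD, and at 50 degrees, 3840 / 50 ≈ 77 PPD.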

---------------------------------

Color non-uniformity (optimal viewing angles and optimal PPD layouts)

At the human 50-60 degree viewing angle, a 4k screen of any size comes out to around 64 PPD at 60 deg and 77 PPD at 50 deg.

So if you are set up properly, especially if getting 70 PPD+, I think you'll be fine on most of these screen types. Sitting at the optimal viewing angle will also prevent the sides of the screen from being pushed outside of your viewpoint and will keep the off-axis color uniformity issue from growing larger.


42" 4k and 48" 4k optimal view distances (the 24" view distance in the example is too close as a comparison, with low PPD and poor viewing angle):
tJWvzHy.png


You end up doing this kind of thing when you sit too close (as well as driving your PPD down):

optimal view distance (The outside viewer is the same amount of degrees off axis from the sides of the screen as the central viewer is).
XvKRu9t.png


Sitting too close. (The outside viewer is the same amount of degrees off axis from the sides of the screen as the central viewer is).
RUdpoK8.png


Human viewing angle:
3kU3adt.png


It's when people try to stuff a 42" - 48" - 55" screen on a desk (instead of mounting it separately at a more optimal distance) that they end up getting 1500p-like PPD or worse, so the pixel grid appears too large. Then they try to squeeze as much masking/compensation out of text-ss and AA as possible to hide the ugliness. The 2D desktop's graphics and imagery typically remain entirely uncompensated too, since they get no sub-sampling or game AA - so that will look even worse.
Also worth mentioning that AG/matte screen surface treatments can affect text clarity and how clearly pixels/subpixels appear in general. Matte-type AG surface treatments also raise blacks and affect how wet and saturated a screen would otherwise look.


798439_d3047e25cf1e8c6b2679bd7aeaf8a0b7612859e4_480x480.png




798444_magnified_text.jpg


----------------------------------
Re: Burn in

https://hardforum.com/threads/what-new-oled-gaming-monitors-in-2023.2024551/post-1045547989

Probably not for a long time, especially with some precautions, since LG OLEDs reserve a top ~25% brightness buffer for a wear-evening routine. You are always burning your emitters down slowly, but faster the more you abuse the screen. The wear-evening routine evens out all of the emitters every time it runs, losing some of the buffer. You won't know how much buffer you have left vs. burn-in until you burn through it all and bottom out, since there is no way to look that up in the OSD or even the service menu. Considering that, using one as a static desktop/app monitor is not the best or safest usage scenario, though there are precautions you can take and different modes you can use while using one for that.
 
Last edited:
Ah yes, all those Chinese professional youtubers with professional Nikon Z9s making super serious content for Youtube.

LOL. Now you suddenly care about what was debated originally. Holy double standards, Batman.

Now who is shifting focus?

You cannot deny the fact their HDR videos look good regardless of what kind of cameras they have.

You cannot deny the fact both of them use FALD LCD for HDR production.

You cannot deny the fact they don't use a 200-nit OLED for HDR production.

You are just downplaying the OLED accuracy by saying FALD cannot do very serious HDR work. And you don't even dare to accept the fact OLED has much worse accuracy than FALD on the actual HDR.
 
Now who is shifting focus?

You cannot deny the fact their HDR videos look good regardless of what kind of cameras they have.

You cannot deny the fact both of them use FALD LCD for HDR production.

You cannot deny the fact they don't use a 200-nit OLED for HDR production.

You are just downplaying the OLED accuracy by saying FALD cannot do very serious HDR work. And you don't even dare to accept the fact OLED has much worse accuracy than FALD on the actual HDR.
FALD can't be accurate except on test slides, and no professionals use them for grading. You cannot deny this.
 
Got any kind of review, data, or proof to back what you're saying?
The review data in your own post is good enough.

1676566971452.png

The QN95 has a raised EOTF. It also has much higher peak brightness across the board.

When there is footage that needs a 10% window at 950-1800 nits or a 25% window at 387-1200 nits, the QN95 will be more accurate. OLED is only accurate when the footage is below these ranges because its brightness is not enough.
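For context on what "raised EOTF" means here: HDR10 tracking is judged against the SMPTE ST 2084 "PQ" EOTF, which maps the encoded signal to absolute luminance in nits, and a raised EOTF means the display outputs more nits than this reference in the mid range. A minimal sketch of the reference curve (constants taken from the ST 2084 spec):

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: nonlinear signal in [0, 1] -> luminance in nits."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

# A display that tracks the reference outputs exactly these values;
# a "raised EOTF" sits above them, and a tone-mapped display sits below
# wherever the curve exceeds its peak brightness.
for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f} -> {pq_eotf(s):.1f} nits")
```

Note the curve is absolute: 75% signal already calls for roughly 1000 nits, which is why panels with limited peak brightness have to deviate from it somewhere.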
 
FALD can't be accurate except on test slides, and no professionals use them for grading. You cannot deny this.
I can deny this easily. I just showed you how professionals do grading work with FALD LCD. It is not as accurate as dual-layer LCD but a lot more accurate than your 200-nit OLED. Your own link, shooting your own foot, even encourages you to do grading on a monitor you can buy.

Again, the truth is both of them use FALD LCD to grade HDR. They don't use a 200-nit OLED for HDR production. And you are just downplaying the OLED accuracy by saying FALD cannot do very serious HDR work. And you don't even dare to accept the fact OLED has much worse accuracy than FALD on the actual HDR.
 
The review data in your own post is good enough.

View attachment 549650
The QN95 has a raised EOTF. It also has much higher peak brightness across the board.

When there is footage that needs a 10% window at 950-1800 nits or a 25% window at 387-1200 nits, the QN95 will be more accurate. OLED is only accurate when the footage is below these ranges because its brightness is not enough.
That doesn't address the lack of accuracy.
 
Yes, as contrast measurements are often understood as depth-to-darkness capability vs. peaks, as in a VA or FALD screen's black depth capability vs. an edge-lit IPS's. Taking a 200-nit log in a swamp and lifting it to 500 nits while scaling all of the mids even higher (exaggerating a bit there, but maybe not) is a farther distance, sure, but it's not what people usually intuit from a raw/flat X:1 contrast measurement in specs. It's also not necessarily accurate to IRL or artistic intent... but I won't go into the whole debate again here. That will suffice. It's just another can of worms, and this is all just you resorting to ad hominem attacks on someone personally again in order to distract from the points of this thread in the first place.
You still don't understand contrast. You still don't know how to read a waveform where the low light goes down while the high light goes up.

You are still claiming the contrast is lower even at the exact same location, where a 2000-nit sun vs. 0.01-nit shadow has 20 times more contrast than a 1000-nit sun vs. 0.1-nit shadow.

Again, better understand how basic contrast works. Contrast is the luminance ratio between different parts of a picture - for example, the luminance ratio between point A and point B.

The higher the ratio, the more contrast these two parts have. One aspect of HDR (High Dynamic Range) is contrast; another is a similar ratio for color volume.

For example,
In a low-contrast image-1, the ratio between A and B is B/A = 1.274
In a high-contrast image-2, the ratio between the same locations A+ and B+ is B+/A+ = 1.581

1.581 > 1.274, so (B+/A+)/(B/A) > 1. It means parts A+ and B+ have a larger difference in luminance, i.e. higher contrast, compared to the luminance difference between A and B in image-1.

And this applies to every random pair of points A and B in the high-contrast image vs. the low-contrast image. The ratio is always higher in image-2 if the contrast is increased properly.

If a monitor is more capable of displaying a higher dynamic range, image-2 will look significantly better because the contrast and color volume are closer to impactful real-life images.
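That ratio definition is easy to check numerically. A toy sketch using the numbers quoted in this thread:

```python
def contrast(bright_nits, dark_nits):
    """Contrast between two points of a picture = ratio of their luminances."""
    return bright_nits / dark_nits

# Sun-vs-shadow example from earlier in the thread: deeper blacks AND higher
# peaks multiply together into a much larger ratio (200,000:1 vs 10,000:1).
print(contrast(2000, 0.01) / contrast(1000, 0.1))  # ~20x

# Same-location comparison between the two graded images described above:
low_contrast = 1.274    # B / A   in image-1
high_contrast = 1.581   # B+ / A+ in image-2
print(high_contrast / low_contrast > 1)  # image-2 has more contrast at that spot
```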

728000_52345832775_b022f93b3c_o_d.png

728001_52345406261_b5f6547f1d_o_d.png

728002_52344449432_f797202ec1_o_d.png
 
Last edited:
I can deny this easily. I just showed you how professionals do grading work with FALD LCD. It is not as accurate as dual-layer LCD but a lot more accurate than your 200-nit OLED. Your own link, shooting your own foot, even encourages you to do grading on a monitor you can buy.

Again, the truth is both of them use FALD LCD to grade HDR. They don't use a 200-nit OLED for HDR production. And you are just downplaying the OLED accuracy by saying FALD cannot do very serious HDR work. And you don't even dare to accept the fact OLED has much worse accuracy than FALD on the actual HDR.
LOL. The truth is you have proved nothing. You showed a video of a guy promoting the display (as in, he is literally getting paid to say nice things about it, and at no point does he actually say he uses it for grading, while there is proof that he uses very expensive cameras and has previously purchased a $30,000 pro monitor, so money is obviously not an issue for him) and some random Chinese youtube channel, because that was all you could find while searching youtube for proof LOL.

Here's an actual professional colorist talking about what actual professionals use (obviously sponsored by Panasonic, so unfortunately we don't get his comments on the Sony BVM-HX310):

You are still denying the contrast is lower even at the exact same location, where a 2000-nit sun vs. 0.01-nit shadow has 20 times more contrast than a 1000-nit sun vs. 0.1-nit shadow
Where exactly is he doing that? Seems to be only in your head...
 
Last edited:
Just follow any major thread dialogue Kram is in and you'll see similar.

Kramnelis... he knows better than Dolby, and better than TFTCentral... /s... "a few words from A review"... probably one of the top monitor reviewing sites if not the top one, known for very detailed technical reviews dissecting screens and for being very honest about it.

When faced with the actual factual import and impact of the tradeoffs of FALD, or any technical topic of debate really, he'll often fall back to more enraged-sounding personal attacks on individuals themselves. He's done so multiple times to multiple people in this thread and others. It's his m.o. You know someone is hard-pressed when they swing wildly like that at the person rather than the facts. That's one of the logical fallacies, ad hominem, and he actually employs several of them. While no one is immune to falling into those traps from time to time, I try not to focus on the person for that reason. But I will say that facts and opinions on screen-type performance aside (he's brought up some interesting data at times), I find his character questionable. At least I personally find his online persona kind of toxic.
Because the facts you provided are only part of the reviews. You use them to downplay the fact that OLED has even less accuracy in HDR1000.

These reviews haven't talked about OLED having less accuracy than FALD LCD when the range is out of its reach. They don't have the tools to scan and compare as proof which monitor loses more accuracy.

But the eyes can easily tell which monitor looks better. It's a very solid fact that FALD is commercialized for grading HDR1000 while OLED isn't.
 
LOL. The truth is you have proved nothing. You showed a video of a guy promoting the display (as in, he is literally getting paid to say nice things about it, and at no point does he actually say he uses it for grading, while there is proof that he uses very expensive cameras and has previously purchased a $30,000 pro monitor, so money is obviously not an issue for him) and some random Chinese youtube channel, because that was all you could find while searching youtube for proof LOL.

I provided proof that they use FALD for HDR production. It's you who have provided nothing showing that FALD cannot be used for HDR production.

You can keep your imagination. One thing for sure is nobody uses a 200-nit OLED to grade HDR1000. Meanwhile, I can use the PA32UCG to grade whatever HDR1000 I want.
 
I provided proof that they use FALD for HDR production.
"for HDR production" yeah, lmao. You think you're so clever...
It's you who have provided nothing showing that FALD cannot be used for HDR production.
Next thing you want me to prove that God doesn't exist...
You can keep your imagination.
Thank you, that's very kind of you.
Meanwhile, I can use the PA32UCG to grade whatever HDR1000 I want.
Yeah, lol. Bet you have tons of material lined up. You are a professional colorist after all.
 
Now you just trash talk into oblivion. I grade whatever HDR I want, with images you can never see on OLED.




Wooow, I'm so sad I'm missing out on your expertly crafted fake regraded anime HDR.
You're a gold mine. Please do keep going. But you're going to have to come up with some new material that I can work with...
 
Wooow, I'm so sad I'm missing out on your expertly crafted fake regraded anime HDR.
You're a gold mine. Please do keep going. But you're going to have to come up with some new material that I can work with...
Don't forget you don't even grade HDR.

These distributed anime UHDs have exactly the same look as sRGB even in HDR mode. You can grade better images since it is already 10-bit. You miss a lot of fun. You miss even better images.
 
LOL. And you don't even make anime.
I can see better images. That's what matters. I can grade any footage I record while you cannot even see what I see. You can't even grade HDR on your 200-nit OLED. This is a big difference.

All you do is downplay the OLED accuracy by saying FALD cannot do very serious HDR work. And you don't even dare to accept the fact OLED has much worse accuracy than FALD on the actual HDR. Better buy a dual-layer LCD to see better.
 
I can see better images. That's what matters. I can grade any footage I record while you cannot even see what I see. You can't even grade HDR on your 200-nit OLED. This is a big difference.

All you do is downplay the OLED accuracy by saying FALD cannot do very serious HDR work. And you don't even dare to accept the fact OLED has much worse accuracy than FALD on the actual HDR. Better buy a dual-layer LCD to see better.
You are not a colorist. You are not a movie director. You are not a display manufacturer. You are not a calibrator. You don't even make anime.
Nothing you say or do holds any value. All you do is shill FALD displays. You should buy an actually accurate monitor like a dual cell LCD if you actually care. But do you care? No you do not. You just shill. All day every day. On this forum. On other forums. The same nonsense shilling all the time. Almost like you were getting paid for it, but the ones paying for it, albeit indirectly, are probably your government and not the FALD display manufacturers.
 
You are not a colorist. You are not a movie director. You are not a display manufacturer. You are not a calibrator. You don't even make anime.
Nothing you say or do holds any value. All you do is shill FALD displays. You should buy an actually accurate monitor like a dual cell LCD if you actually care. But do you care? No you do not. You just shill. All day every day. On this forum. On other forums. The same nonsense shilling all the time. Almost like you were getting paid for it, but the ones paying for it, albeit indirectly, are probably your government and not the FALD display manufacturers.

I mean FALD displays can get pretty up there if this is anything to go by: https://www.newsshooter.com/2021/07/14/flanders-scientific-xm312u-5000nit-hdr-mastering-monitor/

A 5000-nit mastering monitor is probably something he should buy if he wants to "see better images" lol, because surely 5000 nits beats out a measly 1600 on the Asus, amirite?
 
You are not a colorist. You are not a movie director. You are not a display manufacturer. You are not a calibrator. You don't even make anime.
Nothing you say or do holds any value. All you do is shill FALD displays. You should buy an actually accurate monitor like a dual cell LCD if you actually care. But do you care? No you do not. You just shill. All day every day. On this forum. On other forums. The same nonsense shilling all the time. Almost like you were getting paid for it, but the ones paying for it, albeit indirectly, are probably your government and not the FALD display manufacturers.
It doesn't matter who I am. It's never about who I am. Everybody is a nobody on the Internet.

Your attacks about me did nothing about the fact that OLED has much worse accuracy than FALD on the actual HDR.

All you do is downplay the OLED accuracy by saying FALD cannot do very serious HDR work.

It's more about you getting a 200-nit OLED locked into a very limited range.
 
I mean FALD displays can get pretty up there if this is anything to go by: https://www.newsshooter.com/2021/07/14/flanders-scientific-xm312u-5000nit-hdr-mastering-monitor/

A 5000-nit mastering monitor is probably something he should buy if he wants to "see better images" lol, because surely 5000 nits beats out a measly 1600 on the Asus, amirite?
Funny thing is they went from inaccurate FALD, then replaced that with a nice and accurate dual cell LCD, and now they're back to inaccurate FALD. I mean they could have kept the dual cell LCD in the line-up, but I guess they hope that most people just think MORE BRIGHT MORE BETTER!
It doesn't matter who I am. It's never about who I am. Everybody is a nobody on the Internet.
Of course. It only matters who other people are because that you can use for your incessant FALD shilling.
All you do is downplay the OLED accuracy by saying FALD cannot do very serious HDR work.
Uhuh. Think you're getting yourself confused now.
 
Funny thing is they went from inaccurate FALD, then replaced that with a nice and accurate dual cell LCD, and now they're back to inaccurate FALD. I mean they could have kept the dual cell LCD in the line-up, but I guess they hope that most people just think MORE BRIGHT MORE BETTER!

Of course. It only matters who other people are because that you can use for your incessant FALD shilling.

Uhuh. Think you're getting yourself confused now.
You know what I mean.

It doesn't change anything about how the AW3423DW's low-range "HDR" gets wrecked by a 4-year-old 512-zone FALD's high-range SDR. It doesn't change the fact OLED is underwhelming. It doesn't change the fact OLED has much worse accuracy than FALD on the actual HDR.

I never shill FALD. FALD has problems. I cannot wait to replace FALD. But OLED is even worse. The brightness can hardly get higher, and once it does, it flickers even harder.
 
It doesn't change anything about how the AW3423DW's low-range "HDR" gets wrecked by a 4-year-old 512-zone FALD's high-range SDR. It doesn't change the fact OLED is underwhelming. It doesn't change the fact OLED has much worse accuracy than FALD on the actual HDR.

I never shill FALD. FALD has problems. I cannot wait to replace FALD. But OLED is even worse. The brightness can hardly get higher, and once it does, it flickers even harder.
None of what you say changes the fact that no FALD display can do anything accurately except large uniform test patterns. You can type a million posts with the same shilling, but you can't change the facts.
You think MORE BRIGHT MORE BETTER is the only thing that matters, but to most people it is far from the only thing that matters. Very few people exclusively view youtube videos by people who only make demo material for display manufacturers to showcase bright HDR. Some people like accuracy, but FALD can never do true accuracy, and you know this. You know I speak the truth but you can't stop yourself from shilling FALD. You always bring OLED into it because it's the only defense you have - that something else is worse, even though it's not even relevant. And maybe get your head checked, because you're reading a lot of things that no one has ever written.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------
If you feel a need to reply to this post I have saved you some trouble. You can simply copy a few of these sentences that I have prepared for you*:

All you do is downplay the OLED accuracy by saying FALD cannot do serious HDR work.
Even Adobe SDR 400 nits on FALD reks 200 nits OLED.
OLED HDR is even less accurate than high range FALD HDR.
You don't even dare to accept the fact OLED has much worse accuracy than FALD on the actual HDR.
You don't even grade HDR.
Professionals use FALD monitors for HDR workflow.

*remember to not quote anything below the dashed line
 
None of what you say changes the fact that no FALD display can do anything accurately except large uniform test patterns. You can type a million posts with the same shilling, but you can't change the facts.
You think MORE BRIGHT MORE BETTER is the only thing that matters, but to most people it is far from the only thing that matters. Very few people exclusively view youtube videos by people who only make demo material for display manufacturers to showcase bright HDR. Some people like accuracy, but FALD can never do true accuracy, and you know this. You know I speak the truth but you can't stop yourself from shilling FALD. You always bring OLED into it because it's the only defense you have - that something else is worse, even though it's not even relevant. And maybe get your head checked, because you're reading a lot of things that no one has ever written.
Of course a higher range is better.

You've never spoken any truth. All you do is downplay the OLED accuracy by saying FALD cannot do serious HDR work.

Funny, FALD is already commercialized as a professional HDR grading monitor. It has graded some of the best HDR videos out there, not just test patterns lol. It's your OLED that cannot do any grading on HDR1000.

You don't even dare to accept the fact OLED has much worse accuracy than FALD on the actual HDR.
 
You've never spoken any truth.
Damn. Literally everything I have ever said about anything is a lie. Holy moly. I think I might need some time to truly process this revelatory information that you have bestowed upon me and retrain my brain.
I'm really disappointed you didn't use any of your classic lines that I had carefully selected for you, but instead chose some other of your other classic lines. Kinda breaks my heart to be honest.
Btw, I've thought about training a chatbot on your posts. It would be incredibly simple. I would make a new user for it to post from called "kramneIis". No one would notice the difference.
 
Damn. Literally everything I have ever said about anything is a lie. Holy moly. I think I might need some time to truly process this revelatory information that you have bestowed upon me and retrain my brain.
I'm really disappointed you didn't use any of your classic lines that I had carefully selected for you, but instead chose some other of your other classic lines. Kinda breaks my heart to be honest.
Btw, I've thought about training a chatbot on your posts. It would be incredibly simple. I would make a new user for it to post from called "kramneIis". No one would notice the difference.
Funny, you are the one spreading misinformation while secretly shilling OLED. I'm just doing the counterpart job.

I know how this works: you pull out dual-layer LCD as an excuse that FALD LCD is not accurate enough for HDR grading, but you don't say anything about OLED being even worse in this regard. FALD is already made into professional HDR1000 grading monitors while OLED isn't.

You claim OLED has better accuracy in a low range that doesn't cover HDR. The truth is OLED has worse accuracy than FALD in the high range where HDR matters. So FALD is more accurate than OLED over the HDR range.

You claim nobody uses FALD to grade HDR. I just showed you there are professionals using FALD to grade HDR1000, while nobody uses a 200-nit OLED for HDR1000.

Then you claim anyone who uses FALD to grade HDR is an idiot, while the truth is some of the best HDR videos out there are made on FALD.

You've claimed nothing. Your claims don't stand a chance.
 
Rarely before has a troll been fed this much and persisted as long. Nor has a single point been uselessly rehashed so many times.
 
Rarely before has a troll been fed this much and persisted as long. Nor has a single point been uselessly rehashed so many times.
Funny, I'm just saying the truth. It can be repeated multiple times. FALD has much better accuracy than OLED at high range.

The inaccuracy of FALD is in the pixels under limited dimming zones. The inaccuracy of OLED is in all the pixels hit by ABL. Once ABL kicks in, all the pixels lose brightness, causing a less accurate HDR presentation than FALD. You don't get the brightness, and you don't get the accurate color either. This is why OLED is a bigger tradeoff. This is why FALD is used to grade HDR1000 while a 200-nit OLED isn't.

Prove me wrong if you can use a 200-nit OLED to grade HDR1000.
 
I use a mini led every day and I have never, literally never seen blooming in any type of mixed content. What I do see is an incredible jaw dropping picture every time no matter what game or movie or show I'm watching from bright to dark.
 
I use a mini led every day and I have never, literally never seen blooming in any type of mixed content. What I do see is an incredible jaw dropping picture every time no matter what game or movie or show I'm watching from bright to dark.
which one?
 
I use a mini led every day and I have never, literally never seen blooming in any type of mixed content. What I do see is an incredible jaw dropping picture every time no matter what game or movie or show I'm watching from bright to dark.
Most FALD backlights have optimizations to reduce blooming, such as lowering brightness at smaller window sizes or lifting the whole screen a bit. Some manufacturers, such as AUO, label it Adaptive mini LED (AmLED).

The contrast of highlights vs. lowlights is less accurate, but the blooming is less distracting as well. Yet the contrast of FALD is still a lot higher than OLED in the HDR range, as OLED brightness can fall off the chart, leaving even less contrast and fewer colors. Basically, a 1000-nit sun vs. a 0.1-nit shadow on FALD becomes 1000 nits vs. 0.11 nits, or 800 nits vs. 0.1 nits. But on OLED it can be just 300 nits vs. 0.1 nits.
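The contrast arithmetic above is easy to check; a quick sketch using the post's own hypothetical luminance values (these are illustrative numbers from the argument, not measurements of any display):

```python
# Simultaneous contrast ratios for the example figures above:
# a 1000-nit highlight against a 0.1-nit shadow, with a FALD zone
# lifting the shadow to 0.11 nits or dimming the highlight to 800 nits,
# vs. an ABL-limited 300-nit highlight on OLED.

def contrast(highlight_nits: float, shadow_nits: float) -> float:
    """Contrast ratio between two simultaneously displayed luminances."""
    return highlight_nits / shadow_nits

print(f"Reference:         {contrast(1000, 0.10):,.0f}:1")  # 10,000:1
print(f"FALD lifted black: {contrast(1000, 0.11):,.0f}:1")  # ~9,091:1
print(f"FALD dimmed peak:  {contrast(800, 0.10):,.0f}:1")   # 8,000:1
print(f"OLED under ABL:    {contrast(300, 0.10):,.0f}:1")   # 3,000:1
```

On these numbers, either FALD compromise costs roughly 10-20% of the ratio, while the ABL case cuts it by 70% - which is the comparison the post is making.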

A professional FALD monitor is accurate on brightness at any window size even though the blooming is more visible. This is enough to say how important brightness is on HDR. And a better HDR monitor is the one that has more accurate brightness despite more blooming on limited zones. Dimming zones will increase over time.
 
I'd chime in but I was told I didn't know how to use a monitor. I think I'm getting closer though. I plugged it into the toaster and I am seeing super high nits sparks and flames. Unfortunately, the smoke is making me kind of lightheaded, but I think I've achieved peak HDR, so there are worse ways to go out.

On a more serious note:
I use a mini led every day and I have never, literally never seen blooming in any type of mixed content. What I do see is an incredible jaw dropping picture every time no matter what game or movie or show I'm watching from bright to dark.

Can I ask what display and what settings the backlight uses? I don't notice blooming overly much on my Sony FALD TV, but it's still noticeable in certain HDR content (barely noticeable in most SDR, though). On the ProArt monitor I tried, it was a major problem, particularly in desktop use, and I noticed it almost constantly doing anything, especially anything in night mode or with a darker background and lighter elements. I considered turning local dimming on and off each time I switched between desktop use and gaming, but that was a pain and not really a great solution; given the price of the monitor, I assumed I could use local dimming all the time. I have read the ROG version has a feature that preserves separate local dimming settings for HDR vs. SDR, which would have been a bit more appealing, as I could have had it on for HDR and off for SDR; still not ideal, but better. Local dimming did work well in games - I will say that. Keeping that panel was a no-go anyway because of the sleep issues I had with it and poor quality control (debris in the screen, at least 3 dead pixels, and 1 stuck pixel). Even if I had wanted to, I felt like the panel lottery was not in my favor given other people's reports of going through many exchanges to get a good one.

All that said, I think switching to the LG OLED was the best thing that could have happened, and I feel the same way about my OLED as far as a jaw-dropping picture goes. I haven't been disappointed yet, especially with SDR, but even when viewing the HDR videos supposedly showing what my monitor can't do well. I'm sure it doesn't get as bright and uses some tone mapping to get there, but it still looks darn good, and I'm happy. As long as we all find the option that looks great to us and lets us view the content we enjoy in a way we enjoy it, whether FALD or OLED, that's what matters.
 
You guys want to know what displays I have to prove that they bloom in certain rare scenarios. That's fine. My wife and I have never noticed anything bad or distracting, even watching dark content like Stranger Things, which has a lot of very dark scenes. I know what blooming is and I have never noticed it, ever. We have a QN90A in the living room and a QN90B in my computer room. We love to open up the blinds during the day because we both enjoy natural sunlight and a view of our garden and the trees in the back yard. The QN90A in the living room is fantastic; I couldn't ask for anything more, and we are very satisfied with its picture quality. The QN90B in my PC room is arguably even better. It's so immersive, and I love it at 144 Hz with extremely powerful HDR along with G-Sync (or whatever auto sync it uses with the 4090). The PC room has a window that I can shut or open for light or dark moods, whatever I feel like. Both displays are 50". I also tried an LG C2 and we honestly disliked it. I couldn't accept the ABL and the dim picture, and my wife was, believe it or not, also disappointed in the dim brightness. I'm not even kidding. We returned it within a couple of days. After buying the QN mini LED, it knocked our socks off. This is our experience.
 
You guys want to know what displays I have to prove that they bloom in certain rare scenarios. That's fine. My wife and I have never noticed anything bad or distracting, even watching dark content like Stranger Things, which has a lot of very dark scenes. I know what blooming is and I have never noticed it, ever. We have a QN90A in the living room and a QN90B in my computer room. We love to open up the blinds during the day because we both enjoy natural sunlight and a view of our garden and the trees in the back yard. The QN90A in the living room is fantastic; I couldn't ask for anything more, and we are very satisfied with its picture quality. The QN90B in my PC room is arguably even better. It's so immersive, and I love it at 144 Hz with extremely powerful HDR along with G-Sync (or whatever auto sync it uses with the 4090). The PC room has a window that I can shut or open for light or dark moods, whatever I feel like. Both displays are 50". I also tried an LG C2 and we honestly disliked it. I couldn't accept the ABL and the dim picture, and my wife was, believe it or not, also disappointed in the dim brightness. I'm not even kidding. We returned it within a couple of days. After buying the QN mini LED, it knocked our socks off. This is our experience.
Not a gotcha at all; I was genuinely curious!

I completely agree that for situations where the room can be bright, windows are open, etc., FALD displays like those are likely the best choice. If my room weren't so well light-controlled, I would probably have gone FALD also, but OLED is bright enough for my needs since I can get my room pretty dark even in the daytime, so its advantages for my uses won out.

Samsung is known for pretty aggressive local dimming (some older models suffered black crush, but the models you mention are pretty good in that regard from what I've seen), so that's probably why you don't notice much blooming. They seem like really great displays for bright rooms. Your post made me really curious, because I honestly feel like if I could use a large TV as my computer monitor, I would have had better luck with FALD (but I only have room for a desk-mounted monitor, and 27-32" is about the most I can comfortably fit).

Just out of curiosity, I cast my Calendar tab to my Sony FALD TV. The Calendar is one of the places I noticed a ton of blooming on the ProArt monitor (around every date, and the mouse always had it as well). Granted, casting a stream isn't the same as a direct connection, but it gave me a good enough picture from my computer to the TV to test my theory, and sure enough, no blooming at all around the calendar dates or mouse. So I think for whatever reason, some TVs might handle blooming better. (Sony's local dimming algorithm tends not to be as aggressive as Samsung's but is still pretty good at suppressing blooming; preventing crush is where Sony tends to be most aggressive, so in a situation where it's crush or bloom, it'll show modest bloom.)

At any rate, experience matters. It makes perfect sense OLED was not a good fit for you, and it's awesome you're happy with the Samsungs. I've never tried a Samsung display personally (no particular reason - just haven't happened to), but I've heard plenty of good things.
 
You guys want to know what displays I have to prove that they bloom in certain rare scenarios. That's fine. My wife and I have never noticed anything bad or distracting, even watching dark content like Stranger Things, which has a lot of very dark scenes. I know what blooming is and I have never noticed it, ever. We have a QN90A in the living room and a QN90B in my computer room. We love to open up the blinds during the day because we both enjoy natural sunlight and a view of our garden and the trees in the back yard. The QN90A in the living room is fantastic; I couldn't ask for anything more, and we are very satisfied with its picture quality. The QN90B in my PC room is arguably even better. It's so immersive, and I love it at 144 Hz with extremely powerful HDR along with G-Sync (or whatever auto sync it uses with the 4090). The PC room has a window that I can shut or open for light or dark moods, whatever I feel like. Both displays are 50". I also tried an LG C2 and we honestly disliked it. I couldn't accept the ABL and the dim picture, and my wife was, believe it or not, also disappointed in the dim brightness. I'm not even kidding. We returned it within a couple of days. After buying the QN mini LED, it knocked our socks off. This is our experience.
I also was curious as I returned the C2.
 
It's a pick your poison choice. I've seen plenty of reviews and forum comments going the other way after trying to go back from an OLED.

Here are a few points though:

- HDR content is made for dim to dark viewing environments so it will look optimal in those whether FALD or OLED.

- Most FALD screens have a matte AG coating, which raises blacks (as it would on OLED or FALD alike) and can also impact fine detail and text.

- FALD's lighting resolution is around 45x25 cells. That's very poor. There is no way it can accurately light blocks of 7,000 to 15,000 pixels, plus the surrounding edge cells it tones brighter or darker in a sort of jumbo anti-aliasing of lights vs. darks. Whether that is a worthwhile tradeoff is up to you; there are tradeoffs either way. Some people don't "see" over 60 Hz, don't "see" smearing on VAs, etc., but the blooming and dimming "halos" or blobs of bright and dark cotton balls are there on FALD screens. Even when a scene allows the cells to blend more smoothly, they are toning multiple cells of 7k to 15k pixels each up or down from what they should be were the screen some type of per-pixel emissive tech.

- Longer response times on LCDs, usually compensated with overdrive.

- Models larger than the 43" and 50" sizes have a rainbow effect when light hits the surface treatment.
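The zone arithmetic in the list above can be checked directly. A quick sketch assuming a 4K panel; the zone counts are illustrative (the ~45x25 grid mentioned above gives ~1,125 zones, and a sparser ~600-zone layout is included for comparison):

```python
# Pixels-per-dimming-zone arithmetic behind the "7,000 to 15,000 pixels
# per FALD zone" figure, assuming a 3840x2160 (4K UHD) panel.

WIDTH, HEIGHT = 3840, 2160
total_pixels = WIDTH * HEIGHT  # 8,294,400 pixels

for zones in (1125, 600):  # ~45x25 grid, and a sparser layout
    print(f"{zones:>4} zones -> ~{total_pixels // zones:,} pixels per zone")
# 1125 zones -> ~7,372 pixels per zone
#  600 zones -> ~13,824 pixels per zone
```

So each zone is a single brightness decision applied to thousands of pixels at once, which is the point the bullet is making.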



=============================================

From RTINgs:

"The LG C2 OLED and the Samsung QN90B QLED are both impressive TVs, and the best one depends on your viewing conditions. The LG is a better choice for a dim or dark room, as it has much better contrast and no blooming around bright objects in dark scenes. The Samsung TV, on the other hand, is a better choice for a bright room, as it gets significantly brighter." <-- HDR material is designed for dim to dark viewing environments no matter what screen type so that is the optimal viewing condition for it. The LG is the better choice for a dim or dark room according to RTings.
. . .

"The Samsung QN90B has great contrast, with deep blacks in a dark room. " <--- QN90B has deep blacks, . .. in a dark room. Again focusing on darker viewing environments for HDR.

"The Samsung QN90B has fantastic peak brightness in SDR. It's bright enough to overcome glare in any room, even if you have a lot of windows or lights. Unfortunately, large bright scenes are dimmed considerably by the TV's Automatic Brightness limiter (ABL)." <---- As I argued in this thread before , ABL is not necessarily limited to OLEDs. It also affects samsung's 8k screen since both of these screens can go 2000nit or so. This may be an issue as we get to higher peak brightness screens above 2000nit in the future - even on FALD LCD screens.

. . .

"Unfortunately, like most Samsung TVs, the local dimming feature performs worse overall in 'Game' Mode. The overall performance is pretty similar, but the TV seems to be spreading highlights out over a greater number of zones, so there's a bit more noticeable blooming. The processing is also slightly slower, so zone transitions are more noticeable. However, it's mainly due to the increased blooming. On the other hand, shadow details are a bit better, and there's less black crush overall." <--- again not a perfect screen. They all have tradeoffs. These downgrades in gaming mode and also indicates that outside of gaming mode the screen has black crush.

. .

HDR movies:

"The Samsung QN90B is exceptionally bright in HDR. Small specular highlights are incredibly bright, so fine details stand out in any scene. Large, bright scenes are significantly dimmer, but they're still bright enough for a good HDR experience. The brightness of the display doesn't fade at all over time, which is great. Small and medium-sized highlights are brighter than the Hisense U8H, but it also has more aggressive ABL, so full field bright scenes are brighter on the U8H.

These measurements are in the 'Movie HDR' Picture Mode with Brightness and Contrast at max, Local Dimming set to 'High', and Color Tone set to 'Warm2'.

If you want to make HDR even brighter, as seen in this EOTF, then set Contrast Enhancer to 'High' and ST.2084 to 'Max'. These settings result in considerably brighter scenes, but the overall peak brightness of the TV is the same. The 'Dynamic' Picture Mode is even brighter, reaching a momentary peak brightness of 3,126 cd/m² with a 10% window, but it can't maintain those brightness levels, and the brightness decreases to 440 cd/m² after a few seconds" <----- The bright FALD LCD is not immune from ABL. While the floor it drops to is still higher than OLED's, it's still a big drop relative to the screen you were viewing, down to 1/4 or almost 1/5th of the display's top end in some cases. As I said with OLEDs, most scenes in media and games are pretty dynamic, so while you might notice it occasionally, it's not happening all of the time. (Unless, on an OLED, you are distorting the HDR curves yourself so that normal scene percentages of the screen hit the ABL limit far more often - but who would do that.)

. .

HDR Games

"Highlights in dark scenes are about almost as bright in 'Game' Mode as in 'Movie' Mode, but unfortunately, most real content isn't as bright. It also doesn't track the PQ EOTF as well, meaning most scenes are displayed significantly brighter than they should be, and there's a steeper roll-off near the TV's peak brightness." <---- torch mode issue kind of, overly bright vs content

. .

Tracking, Clipping:

"Unfortunately, the Samsung QN90B QLED doesn't track the PQ EOTF properly, and most scenes appear significantly brighter than the content creator intended. There's also a very sharp cutoff at the TV's peak brightness, which causes bright white highlights to clip, so fine details are lost. It also behaves differently with different content, as content mastered at 4,000 cd/m² starts to roll off at lower peak brightness, as the TV's tone mapping kicks in earlier than with 1,000 and 600 cd/m² content." <---- torch mode brightness, clipping and detail loss.

. .

"Sadly, like other TVs with the Ultra Viewing Angle layer, there's a rainbow-like effect that scatters across the screen, but the 43 and 50-inch models don't have this layer and don't have this rainbow-like issue. " <------ screen surface treatments are bad. Most FALD screens have some kind of sufrace treatment on the screen. While not a FALD issue technically, in real world use it's a big tradeoff imo. Lighting hitting a screen washes out the contrast and saturation in areas of the screen so it is a bad/not optimal viewing environment no matter what type of screen surface. Allowing the lighting conditions to vary throughout the day will also change how your eyes and brain perceive contrast and saturation so unless you set up a bunch of settings for different times of day, your values and look of the screen will swing wildly.
 