Why OLED for PC use?

If I or anyone else says or shows anything (which has already happened countless times on this thread), whether it's objective evidence or subjective opinion, you'll deny it, say your way is "better", scream that no one "understands" and start the whole cyclic strawman arguing from the beginning. I've never seen someone repeat the same pointless unimportant nonsense on a forum in my life.
Funny, I never denied OLED is worse for PC use. Higher-range images are better, which is why HDR exists. It's that you don't understand it and pretend accuracy matters so much in a very limited range lol.
 
sRGB displayed in the wrong color space is not HDR. It's simply displaying the content with incorrect colors. If you like people's skin to look like it suffers from a sunburn, then maybe you would prefer it. I prefer not to see artificial sunburns, and I don't understand why you're arguing against this obvious point that sRGB should be displayed in its correct gamut. If distorted colors look better to you for some peculiar reason, then enjoy, but arguing that this is the way to look at this content is ridiculous.
If the skin tone changes, then it only means the monitor brightness isn't enough. High gamut is matched with high brightness. If you raise the gamut as well as the brightness, the extended yellow at 400 nits will still have the same tone as at 100 nits. It will not look orange.

If you just raise brightness, the same color will look washed out, and you have to add more color to prevent it. This is how SDR is graded to HDR.

If somebody says the skin tone changes, then it only means the monitor doesn't have enough range. It's a good thing the monitor already gives you wide gamut as an option to see a higher range, compared to no option at all.
 
He reminds me of someone on this very forum who was adamant that CRT was the best display tech and everything else is trash, whether it's LCD/OLED/etc., and that if you used anything other than CRT you aren't "seeing better images" because everything would just be a blurry mess and not CRT-clear. Boy would I love to see those two go at it lol.
Funny, I never shill for miniLED. I use whatever has better images. MiniLED has issues of its own. But OLED is far worse.
 
Eh, I prefer darker images with perfect self-emission than the blobby blooming of FALD for desktop use.

For movies and games, it's a toss up, self emission is so beautiful, but brighter images can also increase enjoyment.
 
Two different things are being talked about here and you've made it less than clear which you're doing.

Either, you're:
- Displaying sRGB content (the web, photos, etc.) shoehorned into a wide gamut range such as P3, where the colors will indeed be incorrect, and no amount of brightness will fix it.

OR

- You're talking about Auto HDR, which is quite a bit different and essentially does correct for the skin tone/color shift. (Though it can still do weird things for contrast and I still don't personally like it for games because of the unpredictability, but I can understand why some people do).
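
To make the two cases concrete, here's a minimal numpy sketch of the difference between converting sRGB into a P3-class gamut properly and just reinterpreting the code values. The matrices are the commonly published D65 approximations and the skin-tone triplet is only an illustration, so treat it as a rough demo rather than a calibration reference:

```python
import numpy as np

# Approximate, commonly published 3x3 matrices (D65 white point).
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
XYZ_TO_DISPLAY_P3 = np.array([[ 2.4935, -0.9314, -0.4027],
                              [-0.8295,  1.7627,  0.0236],
                              [ 0.0358, -0.0762,  0.9569]])

def srgb_decode(v):
    # sRGB transfer function -> linear light
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_encode(v):
    # Display P3 reuses the sRGB transfer curve, so encoding is the same
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

skin = np.array([0.91, 0.66, 0.54])   # a warm sRGB skin tone, purely illustrative

# Correct: gamut-map through XYZ so the color is reproduced as intended on P3.
correct = srgb_encode(XYZ_TO_DISPLAY_P3 @ (SRGB_TO_XYZ @ srgb_decode(skin)))

# "Shoehorned": feed the sRGB code values to the P3 panel unchanged.
shoehorned = skin

print(correct)     # channels move closer together -> same perceived color
print(shoehorned)  # same numbers on wider primaries -> oversaturated "sunburn"
```

The properly converted triplet has its channels pulled toward each other, which is exactly the compensation that keeps skin from drifting toward that sunburn look when the panel's primaries are more saturated. No amount of extra brightness changes that relationship.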
 
Speaking of HDR - is there any point to having it until Rec 2020? Isn't that ultimately the colorspace that the industry is shooting for? I'm sure it looks great with DCI-P3 but Rec 2020 displays should be capable of displaying sRGB, DCI-P3, and of course Rec 2020 content. Don't know, just thinking out loud. I'm good with having a kickass sRGB display now and waiting for Rec 2020 to be more widespread.
 
Speaking of HDR - is there any point to having it until Rec 2020? Isn't that ultimately the colorspace that the industry is shooting for? I'm sure it looks great with DCI-P3 but Rec 2020 displays should be capable of displaying sRGB, DCI-P3, and of course Rec 2020 content. Don't know, just thinking out loud. I'm good with having a kickass sRGB display now and waiting for Rec 2020 to be more widespread.

Yes. This is kinda like asking if there's any point to HDR until monitors can do 10,000 nits, as that is what they are shooting for. Even without 100% Rec2020 and 10,000 nits you'll still get much better image quality than sRGB (assuming the HDR content was graded/mastered well and your monitor is up to snuff with displaying it, or at least tone mapping it well). It's still got a ways to go, but it's still enjoyable now as it makes progress, sorta like how leaping from 360p to 1080p is still an improvement in image quality even though the ultimate goal is something like 4K or 8K or 16K, whatever.
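
For a sense of scale on the 10,000-nit point, here's a small sketch of the PQ (SMPTE ST 2084) EOTF that HDR10 signals use. The constants are the published spec values; the code values are just illustrative picks. A 1000-nit display already sits roughly three quarters of the way up the 10-bit signal range, which is part of why partial-capability HDR still looks so much better than 100-nit SDR:

```python
import math

# SMPTE ST 2084 (PQ) EOTF constants, as published in the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_code_to_nits(code, bits=10):
    """Convert a PQ-encoded code value to absolute luminance in nits."""
    e = code / (2 ** bits - 1)          # normalized signal, 0..1
    p = e ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

for code in (520, 769, 1023):
    print(code, round(pq_code_to_nits(code)))
# 520  -> ~100 nits   (SDR-ish diffuse white)
# 769  -> ~1000 nits  (a common mastering/monitor peak)
# 1023 -> 10000 nits  (the format's ceiling)
```

Anything above a given panel's peak has to be tone mapped down, which is where the "graded/mastered well and tone mapped well" caveat comes in.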
 
Speaking of HDR - is there any point to having it until Rec 2020? Isn't that ultimately the colorspace that the industry is shooting for? I'm sure it looks great with DCI-P3 but Rec 2020 displays should be capable of displaying sRGB, DCI-P3, and of course Rec 2020 content. Don't know, just thinking out loud. I'm good with having a kickass sRGB display now and waiting for Rec 2020 to be more widespread.
The difference in richness of colors is vast. My panel has 89% Rec 2020 coverage and it's much more vibrant than the DCI-P3 color space.

It's going to be years, if ever, before Rec 2020 becomes mainstream in monitors; it's reserved for reference monitors at this stage.
 
To see is to believe. But you need a competent HDR1000 monitor with the Adobe color space to emulate HDR400 in SDR. Not just any monitor can do it.

Below is an HDR picture containing a 1000+ nit highlight. You can open it with Windows Photos in HDR mode if the monitor can show 1000 nits.


Then there is an SDR version of it.

It looks similar to the version below.

You will easily see that, except for the 1000+ nit sunlight, the SDR image looks much closer to the HDR version when it is shown at 400 nits with the extended Adobe color.
[image: 52724618779_bc8fab9418_o_d.jpg]


These monitors have the Adobe color space to give you a wide range and a more vivid image close to HDR400, rather than just limited 80-nit sRGB lol
 
Looks gorgeous in HDR on my OLED, though obviously I can't compare to a FALD directly (I don't think my TV supports this file type, but I'm not 100% sure).
 
The highlight around the sun is 1840 nits. The 400-nit Adobe SDR version looks about 80% similar to the rest of the image, while OLED has less than a 600-nit highlight.
 
Funny, I never denied OLED is worse for PC use. Higher-range images are better, which is why HDR exists. It's that you don't understand it and pretend accuracy matters so much in a very limited range lol.

When you get blacks almost as good as OLED from a top mini-LED panel, along with the capability of 3x as much peak brightness without any of the attendant risks in a PC environment, well, the choice was easy in my case especially.

I work with so many grading apps, Adobe Premiere and CC, where you cannot avoid static toolbars for hours on end, so why would you get an OLED?

Lounge/bedroom is a different story, but hey, kudos to the people who can make an OLED TV work in a PC environment.
 
If the skin tone changes, then it only means the monitor brightness isn't enough. High gamut is matched with high brightness. If you raise the gamut as well as the brightness, the extended yellow at 400 nits will still have the same tone as at 100 nits. It will not look orange.

If you just raise brightness, the same color will look washed out, and you have to add more color to prevent it. This is how SDR is graded to HDR.

If somebody says the skin tone changes, then it only means the monitor doesn't have enough range. It's a good thing the monitor already gives you wide gamut as an option to see a higher range, compared to no option at all.

I think you're spot on.

Vincent did a video on this case a few years back.

 
Understandable that for your uses you prefer FALD - if you're grading content, the high-nit capability would be important, and it makes sense with your heavy usage to be wary of burn-in. I can see how that would outweigh any blooming issues, etc.

As for skin tones, for HDR, that'd be correct (and I liked Vincent's video on this).

A lot of us weren't sure if he meant "Auto HDR" (which works as described above) OR if he meant displaying sRGB content (the web, photos, etc.) shoehorned into a wide gamut range such as P3, where colors will be incorrect no matter what. They are very different things.
 
sRGB displayed in the wrong color space is not HDR. It's simply displaying the content with incorrect colors. If you like people's skin to look like it suffers from a sunburn, then maybe you would prefer it. I prefer not to see artificial sunburns, and I don't understand why you're arguing against this obvious point that sRGB should be displayed in its correct gamut. If distorted colors look better to you for some peculiar reason, then enjoy, but arguing that this is the way to look at this content is ridiculous.
When confronted with facts he can't deny, the comeback from him in this thread is always "yOu JuSt DoN'T uNdErStAnD!"

Just watch.
 
When you get blacks almost as good as OLED from a top mini-LED panel, along with the capability of 3x as much peak brightness without any of the attendant risks in a PC environment, well, the choice was easy in my case especially.

I work with so many grading apps, Adobe Premiere and CC, where you cannot avoid static toolbars for hours on end, so why would you get an OLED?

Lounge/bedroom is a different story, but hey, kudos to the people who can make an OLED TV work in a PC environment.
I used an LG CX 48" OLED as a software developer for two years and it's still going strong without burn in. I use virtual desktops a lot so that means the image is never static for a significant amount of time and I'm sure that helped. For your uses I can totally see why you'd pick mini-LED instead.

I picked OLED in 2020 because at the time it seemed like the best compromise:
  • Excellent motion performance thanks to real sub-1ms pixel response times.
  • Excellent viewing angles.
  • Good enough HDR, better overall performance than you could get in other products. The PG32UQX was released a year later. There were 27" models with blooming issues.
  • 4K 120 Hz when we didn't have much choice in smaller 4K high refresh rate screens, 32" 4K 144 Hz models started coming out in 2021.
  • Caveats being the large size, pixel structure and potential for burn-in. Pixel structure for me was a non-issue as 1m viewing distance and 125% scaling largely mitigates it. Burn in hasn't happened so that leaves the large size, which was certainly an inconvenience.
Now the situation is different so I'm looking to pick a mini-LED superultrawide next. The smaller 4K OLEDs are basically unchanged since the CX, with only the smaller 42" size which performs slightly worse for HDR than my CX 48" but is otherwise nearly identical. Mini-LED TV options are still not great at under 50-55" sizes.

What annoys me about this thread is that kramnelis has adamantly refused to accept that people pick different compromises from his. He says our preferences don't matter when of course they do. We pick which compromises are going to work for our use cases.

For me the PG32UQX is totally out because I don't want to pay 3499 euros for a display that excels in HDR but is not that great in motion performance nor does it have enough desktop space as a single monitor. It would be a very expensive device where I'd use its main advantage for only about 20% of my usage. For 80% of it, it would perform really no better than my 399 euro 28" Samsung G70A which is a pile of crap for HDR but does really well for SDR gaming and desktop use.

With the Samsung 57" superultrawide mini-LED looking to be potentially between 2500-3200 euros, I'm more willing to pay that kind of money for something with a ton of desktop space at 4K sharpness because from using the CRG9 I know that form factor works for my uses and even if I have to take a hit in HDR performance, the overall package appeals to me more.

Ideally I'd like a 40" 5120x2160 screen, but that form factor is again something manufacturers refuse to offer as a high refresh rate, good-enough-HDR option. Just like 32" 4K 144+ Hz models were only a few years ago.

I'm sure someone else would pick a different option from mine, but I'm not here arguing that their choice is wrong and nothing but a specific subset of specs matters.
 
I think you're spot on.

Vincent did a video on this case a few years back.



Ironic that the thumbnail has a glow aura around Vincent's whole silhouette as well as that of Arwen/Liv Tyler, glow/lifted blacks around all of the text, and bleed in the letterbox areas. I know it's not representing any particular FALD or anything for real, but I find it funny considering the argument and the tradeoffs of the low-lighting-resolution FALD we have currently.

[image: wMamzLM.png – video thumbnail]


His other video shows the tradeoffs of the UCX/UCG just as well, so back to tradeoffs... circular argument. We can all keep replying with the same videos and links over and over.

[gif: endless-toy-train-robot-half-loop.gif]


The screenshot compression is greatly exaggerating the effect in these images, but what he is saying in the closed captions is valid. You can look at the original video for a clearer idea of what is going on. He still gives it high marks because there are big tradeoffs with each tech. I don't think I'd personally be happy going back to these kinds of tradeoffs from per-pixel emissive. Go with what you value more. It's likely that the highest rated HDR gaming display in 2023 will be a flagship QD-OLED again though, so it's not like OLED isn't a valid set of tradeoffs.

[screenshots of the video's closed captions; auto-CC misreads: "inhalation" should read "elevation", "hollowing" should read "haloing"]




https://www.youtube.com/embed/v26lTJAHaFU?start=308&autoplay=1
 
kasakka - The 57" Samsung is likely to have serious issues like all recent high-end Samsung monitors. That's the problem...
 
I used an LG CX 48" OLED as a software developer for two years and it's still going strong without burn in. I use virtual desktops a lot so that means the image is never static for a significant amount of time and I'm sure that helped. For your uses I can totally see why you'd pick mini-LED instead.

I picked OLED in 2020 because at the time it seemed like the best compromise:
  • Excellent motion performance thanks to real sub-1ms pixel response times.
  • Excellent viewing angles.
  • Good enough HDR, better overall performance than you could get in other products. The PG32UQX was released a year later. There were 27" models with blooming issues.
  • 4K 120 Hz when we didn't have much choice in smaller 4K high refresh rate screens, 32" 4K 144 Hz models started coming out in 2021.
  • Caveats being the large size, pixel structure and potential for burn-in. Pixel structure for me was a non-issue as 1m viewing distance and 125% scaling largely mitigates it. Burn in hasn't happened so that leaves the large size, which was certainly an inconvenience.
Now the situation is different so I'm looking to pick a mini-LED superultrawide next. The smaller 4K OLEDs are basically unchanged since the CX, with only the smaller 42" size which performs slightly worse for HDR than my CX 48" but is otherwise nearly identical. Mini-LED TV options are still not great at under 50-55" sizes.

What annoys me about this thread is that kramnelis has adamantly refused to accept that people pick different compromises from his. He says our preferences don't matter when of course they do. We pick which compromises are going to work for our use cases.

For me the PG32UQX is totally out because I don't want to pay 3499 euros for a display that excels in HDR but is not that great in motion performance nor does it have enough desktop space as a single monitor. It would be a very expensive device where I'd use its main advantage for only about 20% of my usage. For 80% of it, it would perform really no better than my 399 euro 28" Samsung G70A which is a pile of crap for HDR but does really well for SDR gaming and desktop use.

With the Samsung 57" superultrawide mini-LED looking to be potentially between 2500-3200 euros, I'm more willing to pay that kind of money for something with a ton of desktop space at 4K sharpness because from using the CRG9 I know that form factor works for my uses and even if I have to take a hit in HDR performance, the overall package appeals to me more.

Ideally I'd like a 40" 5120x2160 screen, but that form factor is again something manufacturers refuse to offer as a high refresh rate, good-enough-HDR option. Just like 32" 4K 144+ Hz models were only a few years ago.

I'm sure someone else would pick a different option from mine, but I'm not here arguing that their choice is wrong and nothing but a specific subset of specs matters.

I'll spare him the effort. Funny, OLED sucks, Samsung VA sucks, you could be seeing better images!
 
When you get blacks almost as good as OLED from a top mini-LED panel, along with the capability of 3x as much peak brightness without any of the attendant risks in a PC environment, well, the choice was easy in my case especially.

I work with so many grading apps, Adobe Premiere and CC, where you cannot avoid static toolbars for hours on end, so why would you get an OLED?

Lounge/bedroom is a different story, but hey, kudos to the people who can make an OLED TV work in a PC environment.
Grading involves putting a raw image into different color spaces, such as Rec.2100 or Rec.709, and adjusting it accordingly based on HDR/SDR distribution. The HDR image will have higher contrast after raising the highlights and lowering the black level. After adding color, the image will have a higher dynamic range with increased color and contrast.

Monitor manufacturers already make this happen automatically. SDR can be stretched close to HDR when an sRGB image is put into a wider color space with higher brightness. These wide gamut color spaces, such as Adobe, are already calculated to automatically add color to the highlights and lower the black level. With enough brightness, the image will just pop like HDR, similar to how SDR is manually graded to HDR or how Windows Auto HDR puts an sRGB image into Rec.2020.

The dynamic range ranking on current monitors is already FALD Rec.2020 HDR1000 > FALD wide gamut SDR > OLED Rec.2020 HDR > SDR sRGB. This trend will only continue in the future, and higher-range images can always have better quality.

OLED brightness is too low, which ends up hurting HDR accuracy. OLED cannot hold brightness either. Therefore, OLED will end up being a low-range SDR monitor compared to FALD. Additionally, OLED flickers all the time, which can easily cause eye strain. These are the reasons OLED is not good for PC use.
 
TFTCentral measured OLED flicker:
Additionally, OLED flickers all the time, which can easily cause eye strain. These are the reasons OLED is not good for PC use.

<MYTHBUST MODE ALERT>

Mythbusting image: here are TFTCentral's measurements of OLED flicker, which is much less bad than most kinds of LCD flicker:

[image: brightness_fluctuation.png]

https://tftcentral.co.uk/reviews/lg-27gr95qe-oled

These are 5% undulations in brightness happening briefly (for less than 1 ms out of every 240 Hz refresh cycle). A double-digit percentage of LCDs have worse flickergraphs than this! Including certain FALDs that use PWM-per-backlight-LED dimming for each backlight LED. So, YMMV.

For all practical purposes, this is flicker-free to the vast majority of eyes. The depth is 95% less flicker depth + 95% shorter duration than the type of yesteryear LCD PWM dimming that creates eyestrain. Apples vs bananas comparison.
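
For anyone who wants to put a number on "depth", here's a tiny sketch using the standard percent-flicker (modulation depth) formula. The two waveforms are synthetic stand-ins for the shapes being compared here, not measured data:

```python
import numpy as np

# One 240 Hz refresh period (~4.17 ms), sampled finely.
t = np.linspace(0.0, 1.0 / 240.0, 10000)

oled_like = np.where(t < 0.8e-3, 0.95, 1.00)   # brief ~5% dip, under 1 ms
old_pwm   = np.where(t < 2.0e-3, 1.00, 0.00)   # deep on/off PWM dimming

def percent_flicker(y):
    """Modulation depth: (max - min) / (max + min) * 100."""
    return (y.max() - y.min()) / (y.max() + y.min()) * 100.0

print(round(percent_flicker(oled_like), 1))  # ~2.6  (a 5% dip is shallow by this metric)
print(round(percent_flicker(old_pwm), 1))    # 100.0 (full-depth flicker)
```

A brief ~5% dip works out to a couple of percent modulation versus 100% for full on/off PWM, which is the apples-vs-bananas gap being described above.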

There's, however, a different kind of problematic flicker (gamma flickering) from variable refresh rate, where the gamma curve appears to change proportionally to frametime. This can be much more problematic. But there is no visible OLED flickering in the fixed-Hz modes that creates eyestrain as bad as certain kinds of LCD flicker (e.g. inversion-artifact-related flicker, certain VRR flicker) or other aspects.

Also, OLED whites at 10% windows do get brighter than many LCDs -- e.g. 600 nits. So they work excellently with HDR effect in dark cave games with bright lights and neons; which can look pretty fantastic. Some people are extremely sensitive to FALD blooming too. I love MicroLED-backlit LCDs too, but they also have their pros/cons.

Everybody has a preference of priorities. OLEDs are better and worse at some things. But you've proven far beyond a shadow of doubt that you're overdoing it with major embellishments to the negative side.
 
TFTCentral measured OLED flicker:


<MYTHBUST MODE ALERT>

Mythbusting image: here are TFTCentral's measurements of OLED flicker, which is much less bad than most kinds of LCD flicker:

View attachment 553827
https://tftcentral.co.uk/reviews/lg-27gr95qe-oled

These are 5% undulations in brightness happening briefly (for less than 1 ms out of every 240 Hz refresh cycle). A double-digit percentage of LCDs have worse flickergraphs than this! Including certain FALDs that use PWM-per-backlight-LED dimming for each backlight LED. So, YMMV.

For all practical purposes, this is flicker-free to the vast majority of eyes. The depth is 95% less flicker depth + 95% shorter duration than the type of yesteryear LCD PWM dimming that creates eyestrain. Apples vs bananas comparison.

There's, however, a different kind of problematic flicker (gamma flickering) from variable refresh rate, where the gamma curve appears to change proportionally to frametime. This can be much more problematic. But there is no visible OLED flickering in the fixed-Hz modes that creates eyestrain as bad as certain kinds of LCD flicker (e.g. inversion-artifact-related flicker, certain VRR flicker) or other aspects.

Also, OLED whites at 10% windows do get brighter than many LCDs -- e.g. 600 nits. So they work excellently with HDR effect in dark cave games with bright lights and neons; which can look pretty fantastic. Some people are extremely sensitive to FALD blooming too. I love MicroLED-backlit LCDs too, but they also have their pros/cons.

Everybody has a preference of priorities. OLEDs are better and worse at some things. But you've proven far beyond a shadow of doubt that you're overdoing it with major embellishments to the negative side.
[image: flicker_white_AW3423DW.png]

They said the same thing about the AW3423DW, that it had only "a minor fluctuation of brightness". Besides, the 27GR95QE only has a maximum of 200 nits in SDR. They didn't test the fluctuation at HDR highlights either.

When the brightness gets higher, the fluctuations also increase. Frequent ABL makes it even worse, resulting in even more eye strain. These OLED flicker frequencies can be as low as sub-1000 Hz, compared to a 40 kHz or DC-dimmed backlight.

Have them sit in a dark room looking at an AW3423DW at 250 nits or playing HDR games; anyone from TFTCentral would experience eye strain after just 30 minutes due to the flicker.
 
All I know is that in real-world usage, my LG (can't comment on the AW as I've never tried it) is easier on my eyes than any other monitor I've tried (and I have chronically dry eyes). I've been playing games at least a few hours a day the last couple of days (SDR currently, though I did play some in HDR), not to mention hours of general PC usage. Some people may be more sensitive to certain kinds of displays than others, but to definitively say OLEDs cause eyestrain is just not true, at least not for everyone. (If you're arguing theoretically at a higher brightness, maybe so, but that's not testable at the moment, and until it becomes anything beyond theoretical, I'm not personally worried about it.)
 
All I know is that in real-world usage, my LG (can't comment on the AW as I've never tried it) is easier on my eyes than any other monitor I've tried (and I have chronically dry eyes). I've been playing games at least a few hours a day the last couple of days (SDR currently, though I did play some in HDR), not to mention hours of general PC usage. Some people may be more sensitive to certain kinds of displays than others, but to definitively say OLEDs cause eyestrain is just not true, at least not for everyone. (If you're arguing theoretically at a higher brightness, maybe so, but that's not testable at the moment, and until it becomes anything beyond theoretical, I'm not personally worried about it.)
Yup.

I've noticed that my OLED is (overall) easier on my eyes too. Hundreds of hours of Visual Studio. No eye strain whatsoever.

Displays are woefully imperfect windows to real life -- flicker is not the only eyestrain attribute, and the OLED flicker duty cycle is invisible.

The variables of the flicker matter. It's vastly MUCH easier to see 2000 Hz flicker (via stroboscopics) at a 10%:90% duty cycle of full-depth ON:OFF flicker than this 240 Hz flickergraph showing brief, shallow flicker that's even shallower than an average incandescent light bulb's (incandescents actually flicker at roughly 5%-10% flicker depth in sync with the AC sine wave, from the tungsten cooling down during the zero-volt crossings). Follow the science rather than the nonsense.

Now, that being said, fast GtG (on any display, like OLED) also amplifies the visibility of stutter, of which stutter-edge flicker is by far the biggest problem. But this is a problem with low frame rates on any fast-GtG display. Look at the 15-30fps UFOs at www.testufo.com for example and their edge-flicker effects -- they visibly stutter (edge flicker) more on fast-GtG displays than slow-GtG displays. Low-framerate stutter is a vastly bigger cause of eyestrain on fast-GtG displays, but this is mitigatable (either by higher frame rates, or by intentionally adding GPU motion blur effects). If you stick to high-Hz OLEDs (120Hz+) and keep framerates high, edge-flicker from stutter on fast-GtG displays ceases to be an eyestrain issue. More OLED eyestrain cases have been traced to this than to the nonsense posted by a specific forum member here.

And oh yes, there's the pros/cons of OLED and LCD.

Yes -- there are problems with OLED PWM on some OLEDs (e.g. low-Hz PWM on some mobile OLEDs that have very deep duty cycles -- 0%-100%-0%-100% instead of 95%-100%-95%-100% -- with shorter pulse widths and longer dark time). Famously mentioned elsewhere, obviously.

But scapegoating flicker on these specific 240 Hz OLEDs, when they're now among the best in class for lowest OLED flicker during everyday use?
 
What kills OLED for me is potential burn in and reported less than stellar font rendering. At current prices they would need to last 10 years at least and I don't see any of them making it even half as long without burn in. I use my monitor at least 12 hours a day most of it being on static images. Unless they come down in price to be cheaper than a non FALD IPS screen or they manage to actually fix these issues it's a DOA technology for me for monitor use even though I was looking at it as the holy grail 10 years ago. It promised a lot and failed massively in the delivery department. FALD is much newer tech and seems to have solved most problems already so looks like a much more feasible approach that only needs to come down in price. Time will tell, until then I'll keep living with non FALD LCD shortcomings which for me personally aren't deal breakers, just annoying.
 
- I'm concerned about burn-in, but I've read from multiple users that OLED has come a long way in this regard. This is a concern, but it's been so long since I've tried an OLED since that first TV burn-in experience that I wanted to give it another shot. I am taking some burn-in prevention measures, but none I find too inconvenient. It's no big deal to have the screen go off after 5 minutes or hide the taskbar/go full screen at times. I changed the desktop background anyway. Still getting used to not showing all desktop icons, but it was a small adjustment.
This might not be such a big issue on newer panels, but it can still affect users. For example, users who choose to deploy burn-in mitigation actions are affected by the mere fact that they worry and have to make compromises.

Personally I would be more confident I can stop worrying if OLED burn-in tests showed no burn-in after two years.

- I'm not extremely well-versed in bit depth precision at lower brightness levels; that said, I haven't noticed anything specifically on dimmer images.
When using very low brightness levels, gradation is not perfect. It's not an issue at medium to high brightness, even in dark images.
It's also not a very big issue, and it could easily be resolved with dithering - though at the cost of some dithering noise.

- I haven't really noticed any near black chrominance overshoot issues on my LG, and Vincent's recent review mentioned it being kept to a minimum, so I'm not sure how much of a problem this is on newer panels.
Still, WOLED panels exhibit this issue, and when brightness is low it can be spotted in certain situations.
When brightness is high it's much less noticeable.

- A lot of people have issues with text readability on panels like mine. I can notice some fringing if I look closely, but for whatever reason, it doesn't tend to bother me. If I was sensitive to it, though, I could see how it'd be a dealbreaker. It's sort of funny how different people are sensitive to different things and other things don't bother them at all. Many people don't mind the blooming in sRGB on FALD displays, but it was a dealbreaker for me.
I'd definitely like to see these resolved and a more standard RGB structure in the future. That, additional brightness, and even better burn-in resistance would make OLED even more appealing, but I know some of that is contradictory with current tech.
I am sure that in the next few years we will get much better monitors, more suited to PC usage.

We are after all in the very early days of this tech.
I am sure most of the issues will be resolved, and for the PC gaming market OLED will be the most popular choice.

---------------------------
BTW, to not be too negative here, one OLED advantage for PC use, even now with non-RGB subpixels, is the panel structure and the lack of LCD-like depth effects, including those that cause viewing angle issues.
What I am saying is that an LCD panel has a three-dimensional structure made of translucent layers, and it is pretty visible and can negatively impact eye comfort for people who struggle to focus on such a "surface". OLED, on the other hand, is the flattest panel tech after maybe electronic ink panels.
When switching between OLED and LCD, or when using them at the same time, it's quite obvious there is something off about LCD, and these panels do not look like a flat surface.
OLED on the other hand, especially matte monitors, looks like a photo printed on high quality photo paper, perfectly flat. That has to be more comfortable.
No viewing angle issues also means both eyes see the same color, which further improves eye/brain comfort.

All in all, OLED might quickly become the panel of choice for people who have eye comfort issues. Such people already prefer lower brightness levels, and in that case burn-in issues can be easily ignored.
In fact, a matte OLED at something like 80 nits is the most comfortable display tech I have ever used.
 
Funny, only actual fools use displays. All displays are objectively garbage. I just look outside and I see real life. You can't even see real life so you cope by looking at your dull and lifeless FALD and OLED monitors.
This takes the crown for the most stupid post in this topic. That is quite an achievement given how much stupidity is posted here every day 😵
 
This takes the crown for the most stupid post in this topic. That is quite an achievement given how much stupidity is posted here every day 😵
That was almost certainly a sarcastic dig at a particular repetitive poster in this thread.
 
What kills OLED for me is potential burn in and reported less than stellar font rendering. At current prices they would need to last 10 years at least and I don't see any of them making it even half as long without burn in. I use my monitor at least 12 hours a day most of it being on static images. Unless they come down in price to be cheaper than a non FALD IPS screen or they manage to actually fix these issues it's a DOA technology for me for monitor use even though I was looking at it as the holy grail 10 years ago. It promised a lot and failed massively in the delivery department. FALD is much newer tech and seems to have solved most problems already so looks like a much more feasible approach that only needs to come down in price. Time will tell, until then I'll keep living with non FALD LCD shortcomings which for me personally aren't deal breakers, just annoying.
Exactly my situation. I remember getting a Galaxy S2 which actually had an RGB OLED panel, thinking that I wish I could have this sort of thing on the desktop. Fast forward to now and I'm waiting for the stepping stone tech known as OLED to make way for a self-emissive panel with decent brightness and without the burn in concerns. That should do away with the need for weird subpixel layouts too.
 
I am sure that in the next few years we will get much better monitors, more suited to PC usage.

We are after all in the very early days of this tech.
I am sure most of the issues will be resolved, and for the PC gaming market OLED will be the most popular choice.
I unfortunately can't share your optimism. They've had decades with this tech already and it seems to be going nowhere (https://en.wikipedia.org/wiki/OLED). I would love to be able to move on from LCD limitations for desktop use, but I just don't see it happening anytime soon. Just purchased another IPS LCD for the next 5+ years of waiting.
 
Yeah, I can understand not wanting to risk the burn-in if you want to keep a display for more than 5 years and have heavy usage. It's a valid concern.
 
Meanwhile, people in the LG OLED threads have been using their OLEDs for mixed usage, including desktop use, with ASBL turned off via the service menu, for 4+ years so far.

There are burn-in mitigations (really burn-down of the ~25% reserved wear-evening buffer) like dark modes, no desktop icons, hiding the taskbar (a taskbar hider app can toggle it via hotkey and lock it away when not in use), using a different screen mode for desktop/2D SDR use, etc. Unlike phones and tablets, modern OLED gaming TVs reserve the top end of the brightness/energize capability for a wear-evening routine that runs over hundreds and thousands of hours of use, into years. Considering that reserved brightness/energize buffer, burn-in doesn't happen until the buffer is exhausted, so it really doesn't happen for probably ~5 years unless you are very foolishly abusive of the screen, perhaps even longer depending on your usage scenario (e.g. dynamic media and games rather than desktop use). So there could be your ~5 years instead of an IPS.

The text issue is way overblown because OLED gaming screens have largely (pun intended) been adopted from what has been available in relatively larger (vs. desktop) OLED gaming TVs. To be fair, it does look a lot worse the way most people are using them though. These TVs, with their non-standard subpixel layouts but really their perceived pixel size overall, look optimal at over 60 PPD, really nearing 70 PPD or higher. Using them at ~24" or so on a desk, at smaller desktop screen distances, makes their pixels look more like a ~1500p screen's, which will show fringing on highly contrasted edges on anything, let alone sub-standard subpixel layouts. Any 4K screen, no matter the size, viewed at the optimal 60 to 50 degree human viewing angle gets 64 PPD to 77 PPD, respectively. 1440p screens are not a good idea for the same reason imo.

42" 4k screen at 24" view distance is 52 PPD.

27" screen 2688 x 1512 rez at 24" view distance = 52 PPD

The sitting-24"-from-a-large-4K crowd are in a way using a 42" 4K like a 27" 1500p screen's pixels, exacerbating off-axis viewing angle issues instead of getting fine 4K pixel PQ. 😝
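
If anyone wants to check those figures, here's a quick sketch of the usual average-PPD calculation (horizontal pixel count divided by the horizontal field of view in degrees); it lands within about a pixel per degree of the numbers above:

```python
import math

def ppd(diag_in, horiz_px, view_dist_in, aspect=(16, 9)):
    """Average pixels per degree across the horizontal field of view."""
    aw, ah = aspect
    width_in = diag_in * aw / math.hypot(aw, ah)            # physical width
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / view_dist_in))
    return horiz_px / fov_deg

print(round(ppd(42, 3840, 24)))   # ~51  (42" 4K at a 24" view distance)
print(round(ppd(42, 3840, 36)))   # ~71  (back it off to ~3 ft and it clears 70 PPD)
print(round(ppd(27, 2688, 24)))   # ~51  (the 27" comparison above)
```

Backing a 42" 4K off to roughly three feet is what pushes it into the ~70 PPD range mentioned earlier.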

The off-axis viewing angle at the sides of the screen is also made larger by sitting too close. People shoehorning oversized screens onto desks, instead of decoupling from the desk with a simple floor TV stand, wall mount, etc., exacerbate all of those issues, which really wouldn't be a big deal otherwise.

OLED desktop monitor/laptop screens typically have lower brightness than the TV OLEDs since they are made for part-time desktop use rather than full media duty. A smart OLED gaming TV user will similarly use a named picture preset with a more appropriate set of settings while using their OLED for desktop/app use, and switch when viewing games/media. Without a good 32" to 35" 4K OLED option for desktop use in the consumer space, I can see why fewer have been adopted for desktop use so far though.

Personally I'd just use two different screens, one as a media and gaming stage and one as a static desktop/app screen. I've had to do that since at least 2005 due to the various tradeoffs in screen technologies. Things change but they stay the same in some ways.
 
TFTCentral measured OLED flicker:


<MYTHBUST MODE ALERT>

Mythbusting image: here are TFTCentral's measurements of OLED flicker, which is much less bad than most kinds of LCD flicker:

View attachment 553827
https://tftcentral.co.uk/reviews/lg-27gr95qe-oled

These are 5% undulations in brightness happening briefly (for less than 1 ms out of every 240 Hz refresh cycle). A double-digit percentage of LCDs have worse flickergraphs than this! Including certain FALDs that use PWM-per-backlight-LED dimming for each backlight LED. So, YMMV.

For all practical purposes, this is flicker-free to the vast majority of eyes. The depth is 95% less flicker depth + 95% shorter duration than the type of yesteryear LCD PWM dimming that creates eyestrain. Apples vs bananas comparison.

There's, however, a different kind of problematic flicker (gamma flickering) from variable refresh rate, where the gamma curve appears to change proportionally to frametime. This can be much more problematic. But there is no visible OLED flickering in the fixed-Hz modes that creates eyestrain as bad as certain kinds of LCD flicker (e.g. inversion-artifact-related flicker, certain VRR flicker) or other aspects.

Also, OLED whites at 10% windows do get brighter than many LCDs -- e.g. 600 nits. So they work excellently with HDR effect in dark cave games with bright lights and neons; which can look pretty fantastic. Some people are extremely sensitive to FALD blooming too. I love MicroLED-backlit LCDs too, but they also have their pros/cons.

Everybody has a preference of priorities. OLEDs are better and worse at some things. But you've proven far beyond a shadow of doubt that you're overdoing it with major embellishments to the negative side.

Hmm could this be why my OLED gives me less eye strain than all the miniLEDs I tried (when fald was on)?
 
The active links buried in three places in a reply I posted several posts back similarly showed RTings' flicker charts comparing the Samsung QD-LED LCDs vs. the C2 OLED vs. the Samsung QD-OLED.

They'll potentially/ultimately go brighter than current FALDs, but ABL will have to be addressed.

It's all tradeoffs. Per-pixel emissive is the future just as much as HDR is. Per-pixel emissive is here -- with tradeoffs -- just like brighter HDR is here with FALD's stop-gap-solution tradeoffs. Each will fail some of the brightness, blooming, uniformity, and variance tests. Ad nauseam: pick your poison.

. . .

I did look into these samsung FALD tvs previously as I was curious how the FALD tech option was doing on that front.

Rtings reports in the quotes, other comments are mine.

===========================================

QN90B

- Bright SDR (for bright room use since our eyes work in a relative way 😝 )
- Aggressive ABL
"The Samsung QN90B has fantastic peak brightness in SDR. It's bright enough to overcome glare in any room, even if you have a lot of windows or lights. Unfortunately, large bright scenes are dimmed considerably by the TV's Automatic Brightness limiter (ABL)."

-Local Dimming even worse in Game Mode. More blooming, slow transitions visible.
"Unfortunately, like most Samsung TVs, the local dimming feature performs worse overall in 'Game' Mode. The overall performance is pretty similar, but the TV seems to be spreading highlights out over a greater number of zones, so there's a bit more noticeable blooming. The processing is also slightly slower, so zone transitions are more noticeable. However, it's mainly due to the increased blooming. On the other hand, shadow details are a bit better, and there's less black crush overall."
- conversely worse shadow details in movies, more black crush

- - Blooming and varying elevated blacks in adjacent areas (and sometimes even non-adjacent) in dynamic content are unavoidable on FALD densities we have now. Tradeoffs = Not very noticeable, not too bad, can live with it, could be worse . . etc. They are all tradeoffs :rolleyes:
Movies rather than the worse game mode: "There's a bit of blooming around bright objects in dark scenes" - - - > "but it's not very noticeable." - but but but. . how very is very ? noticeable = noticeable tradeoff

-shows things brighter than intended, clips, loses detail.
"most scenes appear significantly brighter than the content creator intended. There's also a very sharp cutoff at the TV's peak brightness, which causes bright white highlights to clip, so fine details are lost. It also behaves differently with different content, as content mastered at 4,000 cd/m² starts to roll off at lower peak brightness, as the TV's tone mapping kicks in earlier than with 1,000 and 600 cd/m² content."

-PWM an issue in some modes , may make some modes and settings unusable.
"The Samsung QN90B uses pulse width modulation (PWM) to dim its backlight, and the flicker frequency varies between picture modes and with certain settings. In 'Movie' mode, with the backlight set between '46' and the max of '50', the backlight flickers at 120Hz. However, it increases to 960Hz with a backlight setting below '46'. The flicker frequency drops to 120Hz in the 'Dynamic', 'Natural', 'Standard', and 'Filmmaker' Picture Modes, or if you enable the Game Mode or Picture Clarity settings. This low flicker frequency can cause headaches if you're sensitive to flicker, and it also causes image duplications with 60Hz content."

"The LG C2 isn't quite flicker-free, as there's a small decrease in brightness that corresponds with the refresh cycle of the display. It's very different from pulse width modulation flicker (PWM) on TVs with LED backlights. "

Samsung QN90B PWM . . LG C2 VRR backlight . . S95B OLED backlight

-- Directions,... err.. screen surface, unclear. Rainbow matte AG layer ( doesn't have the rainbow version on the 43" and 50" models though at least)
"Rainbow sheen from AG coating if light hits it" - also lifted blacks and compromised detail, lack of as saturated of a look, etc. from any activated by ambient lighting matte type AG surface abrasion

- BGR (OLED is pentile or WRGB though)

- OLED has better contrast (into the depths of oblivion and side-by side pixel vs colors rather than brickwork of large zones) and no blooming around bright objects in bright scenes, no varying black background lifting adjacent to bright areas, highlights stand out since they are down to razor's edge per pixel next to darks.
"The LG C2 OLED and the Samsung QN90B QLED are both impressive TVs, and the best one depends on your viewing conditions. The LG is a better choice for a dim or dark room, as it has much better contrast and no blooming around bright objects in dark scenes. The Samsung TV, on the other hand, is a better choice for a bright room, as it gets significantly brighter.

. .

QN 95B QD LED LCD

Per RTings (official RTings in the quotes):

- Bright SDR (for bright room use since our eyes work in a relative way 😝 )
- Aggressive ABL
"The Samsung QN95B has superb peak brightness in SDR. It's bright enough to overcome glare in any room, even if you have a lot of windows or lights. Unfortunately, large bright scenes are dimmed considerably by the TV's Automatic Brightness limiter (ABL),"


-Local Dimming even worse in Game Mode
"Unfortunately, like most Samsung TVs, the local dimming feature performs worse overall in 'Game' Mode. The overall performance is pretty similar, but the TV seems to be spreading highlights out over a greater number of zones, so there's a bit more noticeable blooming. The processing is also slightly slower, so zone transitions are more noticeable. However, it's mainly due to the increased blooming. On the other hand, shadow details are a bit better, and there's less black crush overall."

= Zoned out. Confining highlights to fewer zones sounds like more black crush and more lost detail; spreading them across more zones means more blooming and slower, more overt transitions.

-Slightly worse color than the QN90B
"The Samsung QN95B has decent HDR color volume, but it's slightly worse than the Samsung QN90B QLED. It's mainly limited by its incomplete color gamut, but colors aren't quite as bright as they should be"

- - - Blooming and varying elevated blacks in adjacent areas (and sometimes even non-adjacent) in dynamic content are unavoidable on FALD densities we have now.
"Some blooming around bright objects"
-Local Dimming even worse in Game Mode. More blooming, slow transitions visible.
"Unfortunately, like most Samsung TVs, the local dimming feature performs worse overall in 'Game' Mode. The overall performance is pretty similar, but the TV seems to be spreading highlights out over a greater number of zones, so there's a bit more noticeable blooming. The processing is also slightly slower, so zone transitions are more noticeable. However, it's mainly due to the increased blooming. On the other hand, shadow details are a bit better, and there's less black crush overall."
- - conversely worse shadow details in movies, more black crush

-- Directions,... err.. screen surface, unclear. Rainbow matte AG layer on all models I think
"Rainbow sheen from AG coating if light hits it" - also lifted blacks and compromised detail, lack of as saturated of a look, etc. from any activated by ambient lighting matte type AG surface abrasion

- BGR (OLED is pentile or WRGB though)

-PWM an issue in some modes , may make some modes and settings unusable.
  • In 'Movie' mode, with the backlight set between '38' and the max of '50', the backlight flickers at 120Hz. However, it increases to 960Hz with a backlight setting below '38'.
  • The flicker frequency drops to 120Hz in the 'Dynamic', 'Natural', 'Standard', and 'Filmmaker' Picture Modes. This low flicker frequency can cause headaches if you're sensitive to flicker, and it also causes image duplications with 60Hz content.
  • In 'Game' mode, it flickers at 120Hz with a backlight setting of '36' and up, and it flickers at 960Hz below '36'. If you enable the variable refresh rate feature, it always flickers at 960Hz.
  • In 'PC' mode, it always flickers at 120Hz in the 'Graphics' Picture Mode. In the 'Entertain' mode, it flickers at 120Hz with a backlight setting below 49, but it's flicker-free at 49 or 50.

"The LG C2 isn't quite flicker-free, as there's a small decrease in brightness that corresponds with the refresh cycle of the display. It's very different from pulse width modulation flicker (PWM) on TVs with LED backlights. "

Samsung QN95B PWM . . LG C2 VRR backlight . . S95B OLED backlight

- OLED has better contrast (into the depths of oblivion and side-by side pixel vs colors rather than brickwork of large zones) and no blooming around bright objects in bright scenes, no varying black background lifting adjacent to bright areas, highlights stand out since they are down to razor's edge per pixel next to darks.
"The LG C2 OLED delivers a better dark room viewing experience than the Samsung QN95B QLED, but the Samsung looks better than the LG in a bright room. The LG's near-infinite contrast ratio delivers incredibly deep, uniform blacks, and lets bright highlights stand out with no blooming. The Samsung, on the other hand, gets significantly brighter, so it's a better choice for a bright room with lots of natural light."

. . .

QN900B is their 8k tv which also has some tradeoffs.

Worth mentioning, since they keep being brought up, that the aggressive ABL on them is listed in reviews as a con. I think all of the 2022 2000-nit QD-LED LCDs have it, incl. their 8K one. Not sure what the 2023 ones are going to do, but I read that their 2023 8K 900C flagship has lower brightness than their 2022 models, so maybe that one doesn't; we'll have to see reviews on it later. Idk if 2000 nit+, progressing toward 4000 nit and 10,000 nit, will be able to avoid ABL even on FALD LEDs unless they do some serious cooling and/or refine the brightness-vs-heat side of the tech somehow. The FALD zones also behave much worse in game mode on Samsung LED FALD TVs.

-Weird gray uniformity issue with a large dark band across middle of screen + can actually see the backlight brickwork grid in solid bright fields of color (wtf) :
"The Samsung QN900B has just decent gray uniformity. There's a large dark band across the entire width of the screen, which is distracting when watching sports. The sides of the screen are also a bit darker than the center. Unfortunately, the LED backlight grid is noticeable with certain content, especially if the TV is displaying a white screen or any uniform color."

-PWM an issue in some modes , may make some modes and settings unusable:
  • In 'Dynamic', 'Standard', and 'FILMMAKER' modes, the backlight always flickers at 120Hz. This low flicker frequency can cause headaches if you're sensitive to flicker, and it also causes image duplications with 60Hz content.
  • In 'Movie' and 'Game' mode, the backlight flickers at '960Hz' if the 'Brightness' setting is between 0 and 30. The flicker frequency drops to 120Hz with 'Brightness' set to 31 or higher.
  • With the input label set to 'PC', it flickers at 120Hz with a 'Brightness' setting of 46 or lower in both 'Entertain' and 'Graphic' modes, but it's flicker-free between 47 and 50.
"The LG C2 isn't quite flicker-free, as there's a small decrease in brightness that corresponds with the refresh cycle of the display. It's very different from pulse width modulation flicker (PWM) on TVs with LED backlights. "

Samsung QN900B PWM . . LG C2 VRR backlight . . S95B OLED backlight

- Loss of fine details in some content:
"tone mapping is a bit off with highly saturated colors, causing a loss of fine details. You won't notice this with most content, but the Rec. 2020 color space is gaining in popularity, especially in animated films and some nature documentaries."


-Local Dimming even worse in Game Mode. More blooming, slow transitions visible.
"Unfortunately, like most Samsung TVs, the local dimming feature performs worse overall in 'Game' Mode. The overall performance is pretty similar, but the TV seems to be spreading highlights out over a greater number of zones, so there's a bit more noticeable blooming. The processing is also slightly slower, so zone transitions are more noticeable. However, it's mainly due to the increased blooming. On the other hand, shadow details are a bit better, and there's less black crush overall."



- Directions,... err.. screen surface, unclear:
"Unfortunately, the 'Ultra Viewing Angle' layer causes bright lights to create a rainbow smear across the screen, which can be distracting even if the lights aren't directly opposite the TV, including overhead lights." - also lifted blacks and compromised detail, lack of as saturated of a look, etc. from any activated by ambient lighting matte type AG surface abrasion

- BGR (OLED is pentile or WRGB though)


==============================================

Between all of those tradeoffs and those I posted on the ucx/ucg type screens from the hdtvtest screencaps and video I replied with . . .

They all have tradeoffs, OLED and FALD, and even between different makes and models of OLEDs and FALDs. After per-pixel emissive I couldn't go back to large-zone lighting Tetris brickwork personally, but I can see why people take those tradeoffs for the other gains. If there were no OLEDs I'd have more or less "no other choice", but I do have the choice. VR will go all per-pixel emissive soon with bright micro-OLED, so it will be per-pixel from then on through the microLED era . . but I'll have to wait years to get a consumer-priced 42" - 55" microLED room-sized screen. Then again, time flies. 3 years isn't that big a deal of a wait to me now with a good monitor at hand, but another 5+ years from now is quite a wait, if it even takes that long. Per-pixel emissive is worth the tradeoffs I get now, and worth the wait, and I'm looking forward to microLED later. I'll still look at FALDs if their lighting resolution increases by a lot someday before that though, depending on what advances OLED comes up with in that timeframe, or some other tech leap across the board.
 
What kills OLED for me is potential burn in and reported less than stellar font rendering. At current prices they would need to last 10 years at least and I don't see any of them making it even half as long without burn in. I use my monitor at least 12 hours a day most of it being on static images. Unless they come down in price to be cheaper than a non FALD IPS screen or they manage to actually fix these issues it's a DOA technology for me for monitor use even though I was looking at it as the holy grail 10 years ago. It promised a lot and failed massively in the delivery department. FALD is much newer tech and seems to have solved most problems already so looks like a much more feasible approach that only needs to come down in price. Time will tell, until then I'll keep living with non FALD LCD shortcomings which for me personally aren't deal breakers, just annoying.
You need to consider what is the typical timeframe for how long you keep a display. I wouldn't buy OLED either if your intention is to keep it 10 years. My upgrade cycle has been about 6 years at its longest with about 3 years being more normal.

In my experience burn-in is not a major problem, at least on LG OLEDs, and text rendering is fine if you are using one of the 4K 42"+ models at an appropriate viewing distance for the size. Those are certainly not the things that keep me from buying one for desktop use. If LG could release a 42" 4K 165-240 Hz model with curvature and better HDR performance, I'd probably buy it. Unfortunately that does not seem likely, with the C3 being another year of no real improvement over the CX I have.
 
Personally I'd just use two different screens, one as a media and gaming stage and one as a static desktop/app screen. I've had to do that since at least 2005 due to the various tradeoffs in screen technologies. Things change but they stay the same in some ways.
VA ultrawides and/or 4K monitors + OLED is the rich man's "have your cake and eat it too". Most VA ultrawides have good contrast (3000:1 and up) and look terrific, and you don't have to worry about burn-in. Want to go play games? OLED. :) My VA monitor was exceptionally good for coding on. I miss it so much that I sometimes fantasize about buying another one just for work. But... I only have one desk, so my XG-4321 pulls double duty.
 