Why OLED for PC use?

That says nothing about the tradeoffs of a 4K FALD HDR media and gaming experience versus a modern 4K OLED HDR media and gaming experience anyway. Disingenuous misdirection: arguing the pros of FALD while referencing a non-FALD, edge-lit, non-HDR, 1080p TN with horrible blacks, low PPI and PPD, and a matte surface aggravating it all. Grasping at straws.
I will never get why some people go so hard on the "competitive gaming bro!" thing. If you are a competitive gamer, cool, and if you compete at such a level that the difference between something like a 240 Hz and a 360 Hz monitor matters, then that's great and you should get that. But acting like most people would care, or see any benefit, is silly. There are a LOT of people who are not competitive gamers, and even among those who play competitive games, a LOT don't care about trying to be the very best.
 
So OLED doesn't go as hard then?

Most people won't care either, since they're satisfied with OLED SDR while thinking it's HDR.
 
 
So I've been living a lie for the last 6 months!?
It's maybe an upgrade for you if you jumped from a traditional LCD to OLED in the last 6 months. But it's a downgrade if you had an HDR1000 FALD monitor 4 years ago.
 
Still completely discounting people like me who had an IPS before, tried a FALD, and decided they'd like an OLED better?

Both technologies have their pros and cons, but your continued insistence that anyone who prioritizes other things over sheer brightness is wrong misses the point.

I'm quite sure at this point I like this OLED better than I would have liked any current FALD. After a few weeks with it, it's a great monitor and the best I've owned. I'll still have to see if things like burn-in become an issue down the road, but for now, I wouldn't trade it for a FALD. FALD just had too many issues that affect me, and I'm reminded how much I like this thing every time I use it.

As far as comparing sales between the new 27" LG and the XL2566K, that's a really odd argument. The LG has only been out a few weeks and is sold out from LG. I know the Alienware sold well too (and was sold out initially). And they have different price points and markets.

TN, OLED, IPS (both FALD and edge-lit) all have their uses and markets. And all sell pretty well for their feature set/price point. For me, OLED just happens to have the fewest compromises with the best feature set for what I do: best-in-class SDR with better contrast and incredible viewing angles, and still pleasant, decently impactful HDR. Whether it fits your specific, narrow, custom-made, non-standard definition of HDR is beside the point. The videos you tried to showcase still looking pretty great on the monitor attest to that.

It's understandable you prefer what you prefer. I'd agree that, based on your criteria, FALD is the best choice for YOU. But trying to invalidate everyone else's opinions and preferences because they don't want to watch SDR at 400 nits (I'd get killer headaches) or convert everything to a nonstandard form of HDR isn't doing your argument any favors.
 
This thread is more about OLED monitors for PC.
It doesn't work when OLED "HDR" falls off the chart.
It doesn't work when OLED is just a fast little SDR monitor.
It doesn't work when OLED has the most aggressive ABL just to display 200-nit APL.
It doesn't work when OLED has unsolvable flicker.
You are putting your very specific qualifiers and twisting the discussion as if those are the only things that matter.
  • Most content people work with is not using HDR in any capacity.
  • See point one.
  • I'll give you this, but it only becomes an issue if you need to use it in a bright environment. Sunlight is flooding in through the windows as I write this and I have no problem seeing my LCD calibrated to 120 nits. If I ran it at high brightness I would find it uncomfortably bright for desktop use.
  • LCDs also have flickering on a model by model basis. It may or may not be an issue for you. I have never had any eyestrain issues using OLEDs of any kind, but that doesn't mean you won't. So choose the tech or display model that helps you avoid that but don't try to pose it as some universal problem that affects everyone. I have never seen any flicker issues that would be a problem on the OLEDs I've owned, whether it's a phone, tablet or TV.
This thread no longer has anything to do with using OLEDs as PC monitors and is just multiple people going in circles with you. You keep repeating the same things over and over again (I present this thread as exhibit A) yet refuse to acknowledge anyone else's point of view.

I don't see you using a reference-grade monitor, so you are clearly also making your own compromises, just like the rest of us. Yet you can't seem to accept that people choose compromises that are not yours. Others may put more value on some other aspect (e.g. an eSports player favoring motion clarity over HDR, or someone looking for a mix of good performance in any scenario), the price-to-performance aspect, etc.

Compare that to SoCali who gave a far more nuanced point of view where he found himself preferring your favorite display without turning it into "you are all OLED shills, OLED can't display HDR" bullshit. I found his posts far more useful than anything you have written on this forum. In fact it has given me more to consider for my next display and TV where I will have to also look at mini-LED models more carefully rather than dismiss them outright because I have been happy with how my OLED TVs perform.
 
But I did try HDR1000 FALD monitors? I didn't like the ones we currently have. For the IPS ones the blooming and response times/input lag (due to seemingly dated FALD tech?) were very noticeable and undesirable to me. For the VA ones the ridiculous curve, gamma shift, scan lines, screwy EOTF curve (fixed now?), and flickering were very undesirable to me.

I'm not discounting miniLED or bashing it. I'm a huge proponent of it. I want companies to invest more into it, improve it, and stop completely sidestepping it for OLED. The current offerings have too many flaws in my eyes: either blooming and major input lag when FALD is on, or Sammy VA panels, which have too many problems to list. I'm just waiting for very good or even great ones to finally come, but I'm worried they won't, since companies might just say "Fuck it, OLED is easier".

The best one is currently the Chinese InnoCN 32M2V (which is also 1/3rd the price of the PG32UQX). I'm hoping the LG models in 2024 will be winners, and if so I'll probably swap to one right before my C2 warranty runs out. But like I said before, my 42C2 is serving me greatly and leaves me extremely satisfied, which is a first for a monitor.
 
But I did try HDR1000 FALD monitors. I didn't like them. For the IPS ones the blooming and response times/input lag were undesirable to me. For the VA ones the curve and gamma shift were very undesirable to me.

I'm not discounting miniLED. I'm just waiting for a really good one to finally come. I'm hoping the LG models in 2024 will be great, and I'll probably swap to one right before my C2 warranty runs out. The best one is currently the Chinese InnoCN 32M2V, but it has no warranty and still has some blooming, lag, and firmware issues (albeit an improvement).
What kind of FALD HDR1000 monitor do you have? Samsung G7/G8? That is not even an HDR1000 monitor. It's closer to a VESA HDR600 VA with far less color. If you pay for a cheap one like the G8, of course it won't look good, but it's still brighter than OLED.

There was already a good one, the PG32UQX, out there 2 years ago. You just didn't buy it. If you care about blooming then you won't see HDR1000, because nothing else can do HDR1000 at a better level, unless of course you compare it to a reference monitor like typical OLED users do without realizing OLED is much worse.

If the best thing for you is just the cheap thing, then you will wait a few more years to see a cheap monitor get close to the level of the PG32UQX.
 
What kind of FALD HDR1000 monitor do you have? Samsung G7/G8?

I tried the PG32UQX. The only one I have not tried is the 32M2V, but I'd rather wait for LG's offering at this point than buy a monitor without a warranty from a Chinese company I had never heard of before, or at least wait for more in-depth reviews.

If you care about blooming then you won't see HDR1000, because nothing else can do HDR1000 at a better level, unless of course you compare it to a reference monitor like typical OLED users do without realizing OLED is much worse.

I'm not really sure what you mean by this sentence. I noticed severe blooming and sluggishness with FALD on. I didn't like it. I returned it. Some people don't care and kept it and love it. That's fine too. This is not complicated.

Four years later the tech seemingly hasn't improved much. I'm not happy about that, I wanted major improvements in miniLED monitors. I'm hoping LG does this in 2024.

If the best thing for you is just the cheap thing, then you will wait a few more years to see a cheap monitor get close to the level of the PG32UQX.

That's already happened with the 32M2V, though, didn't it? Also, if miniLED is still at the level of the PG32UQX's bloom and lag in a few years, we're in deeper trouble lol. I'm currently using a $700 "monitor" that I find to be far superior to the PG32UQX, much less "close to the level". I hate bloom and lag more than I hate dim HDR. You clearly care mostly about HDR highlights/nits, in which case that's great and fine. Not everyone does.

I'm not bashing your opinion, as you have different preferences. I find OLED to be superior, and I hope miniLED gets better, and quickly. It's been stagnating badly. LG save us!
 
You see sRGB 80nits all the time. You give it a little more, you see sRGB 160nits.

With all that accuracy around sRGB that's as good as an office monitor, you don't care about HDR accuracy at all. You don't see better images at all.

You bought a professional FALD PA32UCG while not knowing how to use it. I doubt you even bought it, or you bought it just for the sake of OLED ads. In the end, you can only see a tone-mapped SDR version on OLED of what the PA32UCG displays.
 
If OLED wants to be the future it needs to do more than just 100 nits plus a few dots of highlight.

The future is HDR with better images. OLED has flicker and low brightness. These two factors prevent HDR, not to mention they contradict each other: once OLED gets brighter, it flickers more.

It's never been about talking down OLED to persuade you to buy a miniLED. I don't care what you buy. It's all about OLED being worse.
 
It's all about OLED being worse.

Worse in some areas and better in others. I haven't seen any major tech site or review site say OLED was worse overall, especially when discussing current products.

It boils down to preference as of now. I highly prefer OLED over the current miniLED offerings, but I'm hoping that changes next year with LGs miniLEDs.
 

Even with blooming, FALD like the PG32UQX still has far more accuracy in both contrast and color in HDR. OLED is not accurate at all, or it would be used for HDR1000. OLED doesn't even reach HDR600.

This year or even next year you won't get a better miniLED monitor than the PG32UQX. All you can get is a cheap backlight with open-cell panels, without G-Sync, that won't look better.

You think OLED is better because you, like them, see sRGB SDR most of the time. If you want to see better, true HDR, OLED is not an option.
 
This year or even next year you won't get a better miniLED monitor than the PG32UQX.

That would really suck; miniLED tech stagnation has been annoying. But are you sure about this? The rumored LG monitor is 27" 4K and has 1,500 zones. Surely it must beat the PG32UQX, which I find inadequate with its dated FALD tech.

You think OLED is better

I don't think OLED is better for me, I know it is. I had both.
 
It's never about preference. OLED cannot do HDR with so little brightness plus flickering. It can be a fast SDR monitor, and that's all. And future content is going to have a much higher range.

These reviews don't have the tools to scan and compare which has worse HDR. But it is very obvious that a low-brightness OLED cannot produce HDR while a FALD can. That says enough about accuracy.
 

Even if LG has a 1,500-zone panel somewhere, it will still have blooming. It doesn't matter whether LG makes it accurate in both color and contrast. I don't think LG will put G-Sync on it. But the current FALD still has far better accuracy than OLED.

I have both OLED and FALD too. OLED looks like trash to me.
 
Even if LG has a 1,500-zone panel somewhere, it will still have blooming.

Yeah, but it'll be less. Also, maybe LG will update the FALD tech so it doesn't introduce so much lag?

Anyway I'll stick to OLED until it advances more.
 
I have both OLED and FALD too. OLED looks like trash to me.
Your opinion about OLED looks like trash to me 🙃

No one really cares about HDR.
Maybe if I got a super expensive FALD display and had no other use case where it makes sense to use it instead of OLED, other than burning my eyes, then I would use HDR on it and think a super bright display is the most important thing evar... even if 400 nits always seemed way too bright to be comfortable...
 
It's always funny how people think blooming becomes much less going from 1,152 zones to 1,500 zones. It won't make that much difference, or the monitor won't look accurate. But you can at least see HDR.

You stick to OLED because you see SDR most of the time.
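As a back-of-envelope check on the zone-count point above: dimming-zone size scales with the square root of zone count, so going from 1,152 to 1,500 zones shrinks each zone's side length by only about 12%. A rough sketch (assumes square zones and a 16:9 panel; `zone_pitch_mm` is just an illustrative helper, not any vendor's spec):

```python
import math

def zone_pitch_mm(diag_inch, zones, aspect=16 / 9):
    """Approximate side length of one (assumed square) dimming zone in mm."""
    height = diag_inch * 25.4 / math.sqrt(1 + aspect ** 2)  # panel height, mm
    width = height * aspect                                  # panel width, mm
    return math.sqrt(width * height / zones)                 # mm per zone side

# On a 27" 4K panel, 1,152 zones give roughly 13.2 mm zones and 1,500 zones
# roughly 11.6 mm zones: the pitch shrinks only by sqrt(1152/1500), ~12%.
```

That square-root scaling is why modest zone-count bumps barely change how visible blooming is, regardless of which side of the argument you're on.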
 
So you are also one of the ones who care about sRGB that looks as good as on an office monitor. You never care about seeing better at a higher range.

I use FALD for almost everything except competitive gaming. I have a TN for ranked. I don't need OLED because its brightness is too low. Once the OLED brightness goes a bit higher, it gives me eyestrain instead. What a paradox.

Saying 400 nits is too bright only exposes that your monitor is flickering. Eyes cannot withstand a flickering 200-nit OLED in a dim environment.
 
It's always funny how people think blooming becomes much less going from 1,152 zones to 1,500 zones. It won't make that much difference.

That sucks. I was hoping for improvements, as I really want miniLED to become great. Guess I'll just use OLED until then, as I mind bloom and lag (in all content) more than brighter HDR.
 
Then you just see SDR on OLED with worse images. You cannot see better with OLED because it is an SDR monitor after all.
 

Uhh okay. I kept the one that the content I typically consumed looked and played better on. It's that simple.
 
What you typically consumed is worse SDR on OLED. It's that simple: with FALD you can easily see beyond that limited range, with better images. It's never about preference.
 

So you wanted me to keep the display where stuff looked and played worse on for me?

This is getting silly.
 
It's not about the display you choose. It's your preference after all. If you see sRGB 80 nits most of the time, of course you choose OLED.

I'm saying that on OLED you can only see SDR, which won't look any better than on FALD. Even with blooming, FALD still has far more accuracy than OLED. And you haven't counted the higher color volume of FALD IPS over OLED at proper brightness.
 
Blooming is not just about raw zone count. It is also about how bright the highlights are pushed. The InnoCN 32M2V has almost no blooming at all even with the same 1,152-zone count, because the highlights are dimmed significantly; they peak at around 600 nits instead of over 1,000, so it eliminates blooming almost entirely. You give up some HDR impact in exchange for almost no bloom, and I'm OK with that. In bigger window sizes/higher-APL scenes my 32M2V still outshines my LG CX.
 
It's your preference after all.

Yeah, which is why I went with OLED for now.

Here's hoping LG hits it out of the park with their miniLED. Really, all they need to do is make something slightly better than the 32M2V. A product like this, but from a brand people have actually heard of, plus actual marketing so people know it exists, will go a long way.

Uhhh...anyway this is all wildly off-topic.
 
With OLED you are seeing worse images. And you are wrong to hope that LG can make good miniLED monitors.

LG doesn't use a DC dimming backlight on its own monitors. It's always some cheap low-frequency hybrid PWM that is going to be worse than the X32FP's, which is still better than OLED.

LG doesn't even use G-Sync. So there is even less hope that whatever FALD comes from LG has good chips to handle a few more zones.

Your only bet is AUO, which won't have a high-refresh-rate 2,000-zone panel until next year. Yet the price range can be above $2000, which you won't pay anyway.
 

Oh damn, that's a shame. Thanks for the heads up, I'll stick with OLED then.

Guess my next upgrade will be a 42" QDOLED (if it's ever made) or the rumored next gen 34" QDOLED panels in 2024.

Yet the price range can be above $2000, which you won't pay anyway.

I would be willing to, if it's good enough.
 

Then you are always seeing worse images on OLED, with an extremely limited range.
 
Dunno, I liked my OLED more than the miniLEDs I tried. Going to keep the display I liked more.
Just because you are seeing sRGB SDR all the time doesn't mean it is a better image. There are better images out there. They won't be on OLED.
 

Accuracy "that's as good as an office monitor." And "You don't see better images."
This is just flat-out incorrect. You keep dismissing SDR, but quite frankly, it's still THE STANDARD, far more so than HDR, and right now OLED tends to represent it with fewer compromises than FALD displays. It's 100% true I do far more in SDR than I do in HDR. I still enjoy HDR, but it's a mixed bag (FALD or otherwise). Some implementations are great... some implementations are terrible... things like AutoHDR are uneven and subjective (and I think they often look worse, regardless of monitor type; I've given a concrete example of that earlier in the thread). All your arguments pretend HDR is this new fully baked standard that can't be displayed at all on OLED, and that's just simply, objectively wrong. HDR is great, and it's getting better, but there are still too many different types (HDR10, HDR10+, Dolby Vision, HGiG, HDR400/600/1000, etc.), too much unevenness in implementation, and a whole host of other potential pitfalls.

I just started a brand new game that doesn't support HDR. And even though it's in SDR, it looks beautiful. If they ever add HDR, it might give me a really nice excuse to replay. But I can just about guarantee you I'm having a better experience with it on this monitor than I would on a FALD.

OLED can display most HDR content just fine... not as bright, but tone-mapped and still looking quite nice, especially since the majority of HDR games are meant to support a wide variety of monitors. Everything I've tried, including the video "gotchas" shared here in an unsuccessful attempt to show OLEDs can't do HDR, has looked great. I'm sure it's not quite as impactful as on a FALD display, but still plenty punchy and pleasant. Of course, I'd like HDR1000 (heck, HDR4000... HDR10000) accuracy if I could get it all in the same monitor, but I can't at this time since no monitor does everything well, so I'll focus on accuracy and a good experience for the vast majority of what I watch, while enjoying that it can still do the higher ends of HDR well enough to have an enjoyable experience.
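Since "tone-mapped" keeps coming up in this thread: tone mapping just means scene brightness above the display's peak gets rolled off smoothly into the range the panel can reproduce, instead of hard-clipping. A minimal, generic soft-knee sketch (illustrative only, not any vendor's actual curve; the 650-nit peak and 75% knee point are assumed example values):

```python
def tone_map(scene_nits, peak=650.0, knee=0.75):
    """Generic soft-knee roll-off (illustrative, not a real display's curve).

    Below knee*peak, brightness passes through 1:1; above it, values are
    compressed asymptotically toward the display peak instead of clipping.
    """
    knee_nits = knee * peak
    if scene_nits <= knee_nits:
        return scene_nits
    excess = scene_nits - knee_nits      # how far the scene exceeds the knee
    headroom = peak - knee_nits          # remaining display range above the knee
    return knee_nits + headroom * excess / (excess + headroom)
```

So a 300-nit midtone is untouched, while a 4,000-nit specular highlight lands just under the 650-nit peak rather than blowing out, which is roughly why content mastered for 1,000+ nits can still look quite nice on a dimmer panel.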

You keep trying to argue that custom-made, custom-graded HDR is anything but preference, and that users should grade their own stuff. That's something only a small minority of people are interested in, and it's great that some people are, but the majority of us just want to enjoy our content in the best way possible. I have less-than-zero interest in grading my own HDR. Even if I did, I couldn't ensure accuracy, and neither can anyone else without the original source - at that point, it's just making it look pretty for your own eyes, which is great if that's what you want to do, but as soon as custom grading comes into the equation, image accuracy is no longer a consideration.

I love that we're turning this into "I doubt you even bought it". I did; it really doesn't matter to me whether you believe that or not. I'm not sure why anyone would make up trying multiple monitors over more than a month before choosing one that finally offers the expected performance with minimal compromises for their uses. Honestly, I was very opposed to going OLED before this experience because of burn-in fears, but as another poster mentioned, my FALD experience was just simply underwhelming, with way too many compromises for me to be happy with it at this time. I too am super excited for miniLED (and eventually microLED) technology in the future, but I thought it was further along than it is; that's all. Why would I root for any technology to fail?

I know how to use a monitor just fine, thanks. I bought the PA32UCG because it was supposed to be the best FALD currently available. It was expensive, but would have been worth it to me had the performance been up to snuff. As I'd mentioned, maybe I should have tried the QX instead, but from what I've read on the topic, including other user experiences here, I'm reasonably sure I would have many of the same problems with it and wouldn't have been totally happy. Even if I would have kept it and settled, after using this, I know for a fact OLED is a better choice for me right now based on my uses. Just the better viewing angles, better motion, and perfect contrast alone are a game-changer for me.

Like I and others have said, your preferences are your own. FALD is a great technology, and it's great you've found a monitor you love. FALD isn't perfect, and neither is OLED. But with the current state of things, FALD is just not for me, because it compromises too much on what I use the most. And like I said, you're not doing your argument any favors by telling people they don't know how to use a monitor or calling them liars about their own purchase experiences.

I use FALD for almost everything except competitive gaming; I have a TN for ranked play. I don't need OLED because its brightness is too low. And once OLED brightness gets a bit higher, it gives me eyestrain instead. What a paradox.

Saying 400 nits is too bright only exposes that your monitor is flickering. Eyes cannot withstand a flickering 200-nit OLED in a dim environment.

I'm sorry if OLED gives you eyestrain, but that can be subjective and vary from person to person. I can tell you OLED feels easier on my eyes than most other technologies I've tried. If it's a problem for you, that's unfortunate, and I can understand why FALD is preferable in that case.

That said, SDR is not meant to be displayed at 400 nits unless you're in an extremely bright room you have to compensate for (SDR reference is around 100 nits). When I first got my (FALD) TV, I had it set way too bright (I didn't measure, but 400 nits wouldn't have been an unreasonable guess; this was the default brightness after turning off the light-sensing auto-adjust feature), and at that level any white flashes or full-screen bright colors caused extreme eyestrain. Only after calibrating to something more reasonable did I realize just how bright it had been set. (And yes, of course decreasing the brightness initially made it look "duller", but only until I got used to what I was actually supposed to be seeing.)

My OLED can hit 200 nits (and a real-world ~650 nits for HDR highlights in most real scenes; close to 1000 is possible depending on mode, though the most accurate mode is closer to that 650). I notice zero flickering and no noticeable eyestrain. I have it set to ~160 nits at the moment, as that's generally my target since I like things a touch brighter than reference and my room isn't completely dark, but even when I first got it and was closer to 200, there was no flicker and no eyestrain.
 
Accuracy "that's as good as an office monitor." And "You don't see better images."
This is just flat-out incorrect. You keep dismissing SDR, but quite frankly, it is still THE STANDARD, far more so than HDR, and right now OLED tends to represent it with fewer compromises than FALD displays do. It's 100% true that I do far more in SDR than in HDR. I still enjoy HDR, but it's a mixed bag (FALD or otherwise): some implementations are great, some are terrible, and things like AutoHDR are uneven and subjective (I think it often looks worse than the SDR source, regardless of monitor type, and I gave a concrete example of that earlier in the thread). All your arguments pretend HDR is a new, fully baked standard that can't be displayed on OLED at all, and that's simply, objectively wrong. HDR is great, and it's getting better, but there are still too many different types (HDR10, HDR10+, Dolby Vision, HGiG, HDR400/600/1000, etc.), too much unevenness in implementation, and a whole host of other potential pitfalls.

Just because sRGB is the standard for 99% of office monitors doesn't mean it has better images. It's like the poverty line of monitors: it looks pathetic in its limited range. If you think that's good, you should see better, because eyes can see a lot more. I won't go back to just sRGB. With FALD I can make whatever HDR I want at a much higher range. Future content, and even current content, already has a much higher range in both contrast and color. You cannot have high range with an OLED, so how is OLED the future? It doesn't even do HDR properly. OLED is always the worse one due to its limited range.

OLED can display most HDR content just fine: not as bright, but tone-mapped and still looking quite nice, especially since the majority of HDR games are meant to support a wide variety of displays. Everything I've tried, including the "gotcha" videos shared here in an unsuccessful attempt to show OLEDs can't do HDR, has looked great. I'm sure it's not quite as impactful as on a FALD display, but it's still plenty punchy and pleasant. Of course, I'd like HDR1000 (heck, HDR4000...HDR10000) accuracy if I could get it all in the same monitor, but I can't at this time since no monitor does everything well. So I'll focus on accuracy and a good experience for the vast majority of what I watch, while enjoying that it can still do the higher ends of HDR well enough to be enjoyable.
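To illustrate what tone mapping actually does here, this is a rough sketch of one common approach, the extended Reinhard curve, compressing 1000-nit-mastered content onto a ~650-nit panel. It's a generic illustration, not whatever any particular display's firmware actually runs:

```python
def tonemap_nits(l_in, content_peak=1000.0, display_peak=650.0):
    """Extended Reinhard tone mapping on luminance, in cd/m^2 (nits).

    Compresses content_peak down to display_peak while leaving
    shadows and midtones almost untouched.
    """
    # Normalize luminance relative to the display's peak.
    l = l_in / display_peak
    l_max = content_peak / display_peak
    # Extended Reinhard: maps l_max exactly to 1.0 (the display peak).
    mapped = l * (1.0 + l / (l_max * l_max)) / (1.0 + l)
    return mapped * display_peak

print(round(tonemap_nits(100.0), 1))   # -> 92.3  (midtones barely change)
print(round(tonemap_nits(1000.0), 1))  # -> 650.0 (content peak lands on panel peak)
```

The point is that highlights above the panel's capability roll off smoothly instead of clipping, which is why HDR can still look plenty punchy on a display with a lower peak.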

You keep trying to argue that custom-made, custom-graded HDR is anything but preference, and that users should grade their own stuff. That's something only a small minority of people are interested in, and it's great that some people are, but the majority of us just want to enjoy our content in the best way possible. I have less-than-zero interest in grading my own HDR. Even if I did, I couldn't ensure accuracy, and neither can anyone else without the original source - at that point, it's just making it look pretty for your own eyes, which is great if that's what you want to do, but as soon as custom grading comes into the equation, image accuracy is no longer a consideration.

I know what OLED looks like compared to FALD. It looks pathetically underwhelming. It's fine for you because you see sRGB most of the time. It's not fine for me at all. There are much better images at a higher range with better accuracy on FALD.
OLED doesn't have better images. It's as simple as that. It was never about preference, or OLED would be used as an HDR monitor. The truth is OLED isn't an HDR monitor. It's an SDR monitor that's going to be stuck in SDR, while FALD LCD will take the market very soon because FALD has better images.


My OLED can hit 200 nits (and a real-world ~650 nits for HDR highlights in most real scenes; close to 1000 is possible depending on mode, though the most accurate mode is closer to that 650). I notice zero flickering and no noticeable eyestrain. I have it set to ~160 nits at the moment, as that's generally my target since I like things a touch brighter than reference and my room isn't completely dark, but even when I first got it and was closer to 200, there was no flicker and no eyestrain.

As I already said, the flickers are invisible; you won't see them. But once the brightness gets a bit higher, you will easily get eyestrain.
 
It won't be on OLED.

I'm not sure what you're getting at; I kept the display I found to be overall the best for my uses. I'm sorry if that fact offends you.

You can keep saying my opinion is wrong (how childish, I'm not even doing that to YOU), but several top display reviewing sites agree with my own opinion and gave an OLED display the win for last year. This list was already posted 4 pages ago (seemingly because you kept annoying others about their personal preference), and you can add HUB to it as well.

This isn't to say miniLED is bad, but you can't just declare our preference the wrong one, just as we're respectfully not doing that to you. Especially when we have so many trusted review sources agreeing with us.
 
As I already said, the flickers are invisible; you won't see them. But once the brightness gets a bit higher, you will easily get eyestrain.

I use my C2 at max brightness and have less eyestrain than on my last IPS monitor. I don't notice any flicker or anything. I believe I even mentioned this on this forum when I first bought it months ago.

Not sure why the things you are typing are so often the opposite for me. Again, preference perhaps?
 
Just because sRGB is the standard for 99% of office monitors doesn't mean it has better images. It's like the poverty line of monitors: it looks pathetic in its limited range. If you think that's good, you should see better, because eyes can see a lot more. I won't go back to just sRGB. With FALD I can make whatever HDR I want at a much higher range. Future content, and even current content, already has a much higher range in both contrast and color. You cannot have high range with an OLED, so how is OLED the future? It doesn't even do HDR properly. OLED is always the worse one due to its limited range.

sRGB is the standard for *accuracy*. You keep saying you can see "better images" as if accuracy doesn't matter at all. To many of us, accuracy does matter, particularly in sRGB. Feel free to ask directors, colorists, etc. if they feel like an accurate picture is "pathetic".

Sure, people can convert things to HDR on a whim like you're apparently a fan of, or for SDR I could just turn things to Vivid or increase the brightness/color saturation. But that would all just be subjective "I guess that looks good to me", which is not something I care to do. I know people who casually watch things and prefer modes like Vivid because they give the picture extra pop. And you know what, that's fine - that's their preference and who am I to tell them any different? Same goes for you. But me personally? I'll take a calibrated image closer to creator intent, especially in SDR. In HDR, I'll get as close as I can with tone mapping. The SDR advantages of OLED matter to me more.

OLED is steadily reaching higher ranges. As high as FALD? Obviously not, but it's come quite a long way in the last few years. The HDR on these modern OLEDs, while not perfect and not matching the brightness of FALD, is quite impactful, and a lot of reviews tend to agree.

I know what OLED looks like compared to FALD. It looks pathetically underwhelming. It's fine for you because you see sRGB most of the time. It's not fine for me at all. There are much better images at a higher range with better accuracy on FALD.
OLED doesn't have better images. It's as simple as that. It was never about preference, or OLED would be used as an HDR monitor. The truth is OLED isn't an HDR monitor. It's an SDR monitor that's going to be stuck in SDR, while FALD LCD will take the market very soon because FALD has better images.




As I already said, the flickers are invisible; you won't see them. But once the brightness gets a bit higher, you will easily get eyestrain.

In your opinion; it looks great to me, and I find it generally very pleasant. You cannot say there are "better images at higher range with better accuracy" on FALD when in the same breath you're talking about custom-made content without taking accuracy into account at all. You may find the images *subjectively* better, and yes, they're at a higher range, but if they're converted from SDR, they're inherently less accurate than the sRGB version. That's just how it works.

It's almost always about preference. There's a set of pros and cons with each and every technology, and each of us has to choose what matters to us.

You're arguing that OLED displays are "pathetic" and that if they were brighter (which they're not, but they'd better never be, or our eyes will be in trouble from invisible flicker) they would cause eyestrain, all while touting viewing SDR at 400 nits. That's essentially unprovable, since by your own words OLEDs aren't bright enough. I'd get eyestrain on any display at 400 nits.

If you get eyestrain on OLEDs, it's understandable you aren't fond of them, but that's not the experience of everyone, and I'd say probably not the experience of most people. Uhhh...sure. *shrugs* All I know is, with real world use of the display, it's very comfortable on my eyes.
 
I'm not sure what you're getting at; I kept the display I found to be overall the best for my uses. I'm sorry if that fact offends you.

You can keep saying my opinion is wrong (how childish, I'm not even doing that to YOU), but several top display reviewing sites agree with my own opinion and gave an OLED display the win for last year. This list was already posted 4 pages ago (seemingly because you kept annoying others about their personal preference), and you can add HUB to it as well.

This isn't to say miniLED is bad, but you can't just say our preference is the wrong one. Especially when we have so many trusted review sources agreeing with us.

You using OLED doesn't mean OLED has better images. I know how bad OLED looks compared to FALD.

It's more like you're using these "trusted" reviewers to back up your imagination. These reviewers never say the Apple Pro Display XDR, with its 576-zone local dimming, has the best HDR.

If they said that, they would never be popular. They're also not that professional: they don't find as many problems, and they don't even make HDR content. If they tried to make HDR, they would still use something like the PA32UCG, not some 200-nit OLED.

I like how you keep steering the dialogue toward "OLED is better" just because you use it.
 