Why OLED for PC use?

Funny that you're the one spreading misinformation while secretly shilling for OLED. I'm just doing the counterpart's job.
Where did I spread misinformation? And oh no, my covert operation has been blown. My between the lines super secret shilling that no one else has noticed has been exposed!
I know how this works: you pull out dual-layer LCD as an excuse to say FALD LCD is not accurate enough for HDR grading, but you don't say anything about OLED being even worse in this regard. FALD is already used for professional HDR 1000 grading while OLED isn't.
I don't say anything about OLED because it's not a competition between OLED and FALD. It's you who is constantly crying about how OLED is worse. Apple makes similar claims for their Pro Display XDR, btw. Doesn't make it true. They will say whatever it takes to sell their product, and since their monitors do well on test slides they can get away with it.
https://support.apple.com/guide/fin...r-video-with-pro-display-xdr-ver27c9f61bc/mac
You claim OLED has better accuracy in a low range that doesn't cover HDR. The truth is OLED has worse accuracy than FALD in the high range where HDR matters. So FALD is more accurate than OLED over the HDR range.
Yawn. OLED is accurate as long as the material does not trigger ABL or hit peak brightness. You can't dispute that. Yes, you can act like a baby and say dumb stuff like "B-B-But that's t-totally not REAL HDR". FALD is not accurate in any range.
You claim nobody uses FALD to grade HDR. I just showed you there are professionals who use FALD to grade HDR1000, while nobody uses a 200-nit OLED for HDR1000.
You didn't, but you sure lie a lot. You showed one who uses it "somewhere in the production phase but not for critical finishing" and one random Chinese youtuber.
Then you claim anyone who uses FALD to grade HDR is an idiot, while the truth is some of the best HDR videos out there were made on FALD.
Yawn.
You've claimed nothing. Your claims don't stand a chance.
These statements are incredibly contradictory...

Rarely before has a troll been fed this much and persisted as long. Nor has a single point been uselessly rehashed so many times.
I really wouldn't call him a troll. If he is, it is surely the most extensive and bizarre case I have ever seen. I think he is just a little... different.
The inaccuracy of FALD is the pixels under limited dimming zones. The inaccuracy of OLED is all the pixels hit by ABL. Once ABL kicks in, all the pixels lose brightness, causing a less accurate HDR presentation than FALD. If you don't get the brightness, you don't get accurate color either. This is why OLED is the bigger tradeoff. This is why FALD is used to grade HDR1000 while a 200-nit OLED isn't.
All pixels are in dimming zones. All pixels are affected. Unless we're talking large uniform test slides.
No one serious would ever use FALD to grade anything.
Prove me wrong, if you can, by using a 200-nit OLED to grade HDR1000.
Yawn.
I use a mini LED every day and I have never, literally never, seen blooming in any type of mixed content. What I do see is an incredible, jaw-dropping picture every time, no matter what game or movie or show I'm watching, from bright to dark.
Be happy with what you've got. All technologies have pretty significant flaws. If you don't see any now, then stop reading and enjoy your display.
Most FALD backlights have optimizations to reduce blooming, such as lowering brightness at smaller window sizes or lifting the whole screen a bit. Some manufacturers such as AUO label it Adaptive mini LED, or AmLED.
And that's one of the things that causes the FALD displays to be inherently inaccurate.
Yet the contrast of FALD is still a lot higher than OLED's in the HDR range, as OLED brightness can fall off the chart to give even less contrast and fewer colors. Basically, a 1000-nit sun vs a 0.1-nit shadow on FALD becomes 1000 nits vs 0.11 nits, or 800 nits vs 0.1 nits. But on OLED it can be just 300 nits vs 0.1 nits.
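If you want to sanity-check that arithmetic, here's a minimal Python sketch using the illustrative numbers above (in-scene luminance ratio only; it says nothing about zone counts or per-pixel control):

```python
# Minimal sketch: in-scene contrast ratio from the illustrative numbers above.
def contrast(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

print(f"{contrast(1000, 0.10):>8,.0f}:1  ideal (1000-nit sun vs 0.1-nit shadow)")
print(f"{contrast(1000, 0.11):>8,.0f}:1  FALD with slight bloom in the shadow")
print(f"{contrast(800, 0.10):>8,.0f}:1  FALD with the highlight zone dimmed")
print(f"{contrast(300, 0.10):>8,.0f}:1  OLED after ABL pulls the highlight down")
```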
Glad you suddenly agree that FALD is inherently inaccurate now. Really funny you completely change the way you write when you're talking to someone who has already declared their love to a FALD display.
A professional FALD monitor is accurate on brightness at any window size, even though the blooming is more visible. This is enough to show how important brightness is to HDR. The better HDR monitor is the one with more accurate brightness, despite more blooming from limited zones. Dimming zone counts will increase over time.
LOL. A "professional" FALD display may be accurate on test slides, but it is never accurate with actual content. Dimming zones and algorithms make it impossible. evln has made detailed explanations that are backed up with actual external sources, and you even just said it yourself.
 
- Most FALD screens use matte-type AG coating, which will raise blacks (and would whether OLED or FALD, etc.). It can also impact fine details and text.
I disagree.
Matte AG coating does not raise the black level by any significant amount, and while matte coating blurs what is behind it, that level of blurring cannot impact fine details.

Light which would raise the black level on a matte screen would be really distracting on a glossy screen, so it's a choose-your-poison situation.
Matte coating can also pick up and reflect a slight amount of light which on glossy would reflect in a direction that doesn't hit your eyes, but this is a minuscule amount and much less than the black level improvement that having ambient light provides anyway.
With this 'issue' I would say: non-issue.

As for details being lost with a matte screen: you can clearly see every subpixel through matte coating:
[image: cleartype-on-large.jpg]


The only details a glossy coating reveals are fine details of how the subpixels look, i.e. completely inconsequential to image detail level:
[image: cleartype-on-large.jpg]

Even high-PPI screens (like my 4K 27") with matte coating show all pixels, and you can even spot subpixels from up close. Conclusion: no details were harmed on matte screens 🙂

That said, glossy coating does make the image (colors and contrast) pop more than matte, and I can understand why people like it. Apparently humans, like other animals, like glossy surfaces because they remind them of water...
I myself like glossy displays when it is possible to avoid irritating reflections. When getting a big OLED for movies and games I went with matte, because modern matte is the perfect balance between looking 'wet' and having no reflections. I do not want to see myself in the screen when using it in a brightly lit room, or see my face when sitting very close playing a dark game that has bright elements.

- FALD's lighting resolution is around 45x25 cells. That's very poor. There is no way it can light blocks of 7,000 to 15,000 pixels accurately, plus it tones the surrounding edge cells brighter or darker in a sort of jumbo anti-aliasing of lights vs darks. Whether that is a worthwhile tradeoff is up to you. There are tradeoffs either way. Some people don't "see" over 60 Hz, don't "see" smearing on VAs, etc., but the blooming and dimming "halos" or blobs of bright and dark cotton balls are there on FALD screens. Even when a scene allows the cells to blend more smoothly, they are toning multiple cells of 7k to 15k pixels up or down from what they should be were the screen some type of per-pixel emissive tech.
Agree with this.
The resolution of FALD is way too low to provide a proper alternative to per-pixel emissive displays.
If we had at least 320x180 (or 384x216, to make it an even rounder and more obvious number) then I would say blooming/haloing from FALD is just nitpicking, and while 320x180/384x216 is not 3840x2160, it would be good enough for me to consider FALD instead of OLED for the benefits LCD has over OLED: no burn-in, and brightness.
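For scale, a quick back-of-envelope sketch of what those grids mean on a 4K panel (zone counts as discussed above):

```python
# Back-of-envelope: how many pixels each dimming zone has to share on a 4K panel.
PANEL_W, PANEL_H = 3840, 2160

def pixels_per_zone(zones_w: int, zones_h: int) -> float:
    return (PANEL_W * PANEL_H) / (zones_w * zones_h)

for grid in [(45, 25), (320, 180), (384, 216)]:
    zones = grid[0] * grid[1]
    print(f"{grid[0]}x{grid[1]}: {zones:>6,} zones, ~{pixels_per_zone(*grid):,.0f} px per zone")
# 45x25:    1,125 zones, ~7,373 px per zone
# 320x180: 57,600 zones,   ~144 px per zone
# 384x216: 82,944 zones,   ~100 px per zone
```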

- Longer response times on LCDs, usually resorting to overdrive
That is a good point. OLED has the advantage in motion handling.

Fortunately for LCD it hardly matters, as long as response times are fast enough, because most motion blur is sample-and-hold blur.
In actual use the 0.1 ms of OLED doesn't really look that much sharper than a 4 ms IPS. This is also because of the other thing you mentioned: overdrive.
The overdrive LCDs use causes visual artifacts like inverse ghosting, BUT at the same time overdrive really is a kind of temporal sharpening filter that makes changes happening on screen seem sharper while they happen. Like I said, this causes some artifacts, but when just playing games it does reduce apparent motion blur.

Directly comparing the LG 27GP950 4 ms IPS to the much faster LG 48GQ900 0.1 ms OLED, it doesn't look like the OLED is significantly less blurry. Blur is there on both, very visible. The LCD has its response times + sharpening, while on OLED it's pure sample-and-hold. Both look OK... and both look bad vs what a strobed display can do when fps == Hz...
Overall it's not as big a difference as the numbers might suggest.
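A rough way to put numbers on that: perceived blur on a sample-and-hold display is roughly eye-tracking speed times frame persistence, with pixel response adding a bit on top. A simplified MPRT-style sketch (the refresh rates, GtG figures, and panning speed are illustrative, and adding GtG linearly is a crude approximation):

```python
# Crude MPRT-style estimate: blur width ≈ tracking speed × (persistence + GtG).
def blur_px(speed_px_s: float, refresh_hz: float, gtg_ms: float = 0.0) -> float:
    persistence_ms = 1000.0 / refresh_hz  # full persistence on sample-and-hold
    return speed_px_s * (persistence_ms + gtg_ms) / 1000.0

SPEED = 960  # px/s, a typical pursuit-test panning speed
print(f"144 Hz IPS, 4 ms GtG:  ~{blur_px(SPEED, 144, 4.0):.1f} px of blur")   # ~10.5 px
print(f"120 Hz OLED, 0.1 ms:   ~{blur_px(SPEED, 120, 0.1):.1f} px of blur")   # ~8.1 px
print(f"120 Hz strobed, ~1 ms: ~{SPEED * 1.0 / 1000.0:.1f} px of blur")       # ~1.0 px
```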
 
Where did I spread misinformation? And oh no, my covert operation has been blown. My between the lines super secret shilling that no one else has noticed has been exposed!

I don't say anything about OLED because it's not a competition between OLED and FALD. It's you who is constantly crying about how OLED is worse.

OLED is accurate as long as the material does not trigger ABL or hit peak brightness. You can't dispute that. Yes, you can act like a baby and say dumb stuff like "B-B-But that's t-totally not REAL HDR". FALD is not accurate in any range.

You didn't, but you sure lie a lot. You showed one who uses it "somewhere in the production phase but not for critical finishing" and one random Chinese youtuber.


All pixels are in dimming zones. All pixels are affected. Unless we're talking large uniform test slides.

No one serious would ever use FALD to grade anything.

LOL. A "professional" FALD display may be accurate on test slides, but it is never accurate with actual content. Dimming zones and algorithms make it impossible. evln has made detailed explanations that are backed up with actual external sources, and you even just said it yourself.

Funny, my point has always been that FALD looks much better than OLED in the higher range where HDR matters the most. My point has always been that FALD has better accuracy than OLED in the higher range.

In the higher range OLED loses tons of brightness on every single pixel, which results in a much less accurate image compared to FALD.

FALD has brightness, color, and moderate contrast, while OLED only has contrast in the low range, not enough brightness, not enough color. This is why FALD is accurate enough to be used in a professional monitor for grading HDR1000 while a 200-nit OLED isn't.

I know how you roll. You simply pull out a dual-layer LCD to undermine FALD by saying stupid crap like FALD cannot be used by professionals because it is less accurate than a $50,000 dual-layer LCD.

It's always funny that when I say FALD LCD > OLED in HDR, you say dual-layer LCD > FALD LCD. It's always funny you say OLED is accurate as long as it stays in the 200-nit SDR range, as if HDR matters more in that range. I can play the same game. I tell you OLED doesn't even cover more color than FALD. Even in the low range, its accuracy often doesn't match.

Then you double down by saying people who use FALD for professional grading are idiots. LMAO. Some of the best HDR videos were made on FALD a long time ago. None of them were made on a 200-nit OLED.

There are professionals who use it. I just showed you. With the very link you provided - shooting yourself in the foot - they even encourage you to grade HDR with equipment you can buy.

Since they are idiots to you, you'd better make better HDR videos than they did. But in reality, you with your 200-nit OLED cannot even see better, never mind grade HDR.
 
I disagree.
Matte AG coating does not raise black level in any significant amount and while matte coating blurs what is behind it that level of blurring cannot impact fine details.
.. . .

Disagree 100%, with details from tftcentral and example pictures provided below. It's another major tradeoff.

. . . . . .


This has been argued in hardforum threads many times. Here is how I see it.

====================

Think of it like a light haze on clear "dry" ice vs. ultra clear wet ice.

Direct light sources hitting a screen are going to pollute the screen surface regardless. Some (many?) people are using their screens in poor setups. It's just like audio or photography - you should set up your environment to suit your hardware/screen not the other way around imo.


Like I said, improper lighting conditions and room layouts that allow direct light sources to hit a screen surface are going to pollute the screen regardless, as shown in the images below.

[image: S9gqCVy.png]

[image: Lj3vf4O.png]

(credit: vega)

light pollution compromises any screen surface:

[image]

(credit: vega)

Since desks have traditionally been laid out up against the wall - like a bookshelf, or an upright piano with sheet music against a wall - most setups act like a catcher's mitt for direct light source pollution from behind and overhead. Professional/reference monitors often come with a hood that covers the top and some of the sides, like some production cameras have. Light pollution (as well as allowing lighting conditions to change throughout the day) will alter/pollute how even a calibrated screen's values are seen and perceived.

The direct light source vectors hitting matte or AG screens will blow out contrast and pale the saturation, washing out the areas of the screen they hit and are diffused onto. Allowing lighting conditions to change will also alter the way our eyes/brain perceive the screen's contrast and saturation, so even its "calibrated" values will be lost to your eyes and brain. E.g. the screen will look more pale, weakly contrasted and undersaturated the brighter the room gets, and vice versa. Some people keep several sets of settings so that they can switch between them for different times of day or different room lighting conditions. So you are going to get compromised results if you don't design your viewing environment more optimally, no matter what screen coating you have.

. . . . . . . . . .

From TFTcentral review of the PG42UQ:

The PG42UQ features a more traditional monitor-like matte anti-glare coating, as opposed to a glossy panel coating like you’d find on TV’s including the LG C2. This does a very good job of reducing reflections and handling external light sources like windows and lamps and we noticed much better reflection handling (no surprise) than the LG C2. However this does mean that in some conditions the blacks do not look as deep or inky visually to the user. With this being an OLED panel, famous for its true blacks and amazing contrast ratio this could be considered a problem – are you “wasting” that by having an AG coating that reduces your perceived contrast?

In certain conditions blacks look a little more dark grey as the anti-reflective coating reflects some of the surrounding light back at you and it “dulls” the contrast a bit. The anti-glare coating means the image is not as clear and clean as a fully glossy coating. You don’t get this same effect if the coating is fully glossy as there’s no AG layer, but what you do get instead is more reflections.

Don’t forget this same thing applies to all AG coated desktop monitors, you have the same impact on perceived black depth and contrast on IPS, TN Film and VA panels depending on your lighting conditions if there’s an AG coating used. You’d still get better relative blacks and contrast on the OLED (not to mention other benefits) compared with LCD technologies. They are all impacted in the same way by their coatings.

While they concentrate on how it affects the blacks, which is bad enough, it can also degrade the color saturation as it creates a haze. Basically, unless you are viewing the screen in dim-to-dark conditions, where you wouldn't need a matte-type AG coating in the first place, the room lighting (let alone direct light sources) hitting the screen will frost-sheen the surface treatment, raising the blacks and making the image less clear while making it look less saturated to our eyes. Unfortunately it's not a one-way street either: the diffusion surface treatment is also diffusing the light coming out of the screen itself, so it will always be somewhat compromised.

====================================================================================

https://arstechnica.com/gadgets/202...-more-reflections/?comments=1&comments-page=1

Of course you are seeing the picture below on whatever screen surface you are using at the moment, so it's more of a simulation. ;)

[image: matte-glossy.jpg]




. . . . . .

https://euro.dough.tech/blogs/news/matte-vs-glossy-gaming-monitors-technology-explained

[images: matte vs glossy comparison photos from the article above]


subpixel photo the euro.dough.tech site referenced from TFTcentral:

[image: TFTcentral subpixel photo]


Light pollution compromises any screen surface:
[image]
 
It's a pick your poison choice. I've seen plenty of reviews and forum comments going the other way after trying to go back from an OLED.

You call them poisons. If both are poisons, then one is the worse poison.

You've said nothing about OLED's issues of ABL and flickering, which result in a larger drawback in HDR. The accuracy is worse. The flickering gives eye strain.




Your posts are pieces from different reviews that don't cover everything. You don't even have material of your own. It's always the same old trick.

- HDR content is made for dim to dark viewing environments so it will look optimal in those whether FALD or OLED.

A bright environment destroys contrast. Anybody knows that. It doesn't say anything about how more brightness at the higher range still has a bigger impact in HDR. It doesn't say anything about the monitor being better off flicker-free for viewing high HDR in a black room. It doesn't say anything about people using FALD to view 2000-nit highlight HDR in a pitch-black room.

- Most FALD screens use matte-type AG coating, which will raise blacks (and would whether OLED or FALD, etc.). It can also impact fine details and text.

It doesn't say how much raised black is enough to cause a significant difference. The black can go as low as 0.0019 nits, which is indistinguishable to the eyes.

[images: low-APL infinite-contrast measurements]


It doesn't say how many people have a dim environment. It doesn't say an AG coating will hold higher contrast than glossy as the environment gets brighter.
[image]

[image]
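The usual way to put numbers on the ambient-light question is the effective-contrast formula, where reflected room light adds to both black and white. A minimal sketch, assuming made-up illustrative reflectance and lux values:

```python
# Effective contrast once ambient light reflects off the panel:
#   C_eff = (L_white + L_reflected) / (L_black + L_reflected)
# For a diffuse (matte) surface, L_reflected ≈ illuminance × reflectance / π.
# The reflectance and lux values below are illustrative assumptions only.
import math

def effective_contrast(white_nits, black_nits, ambient_lux, reflectance):
    reflected = ambient_lux * reflectance / math.pi
    return (white_nits + reflected) / (black_nits + reflected)

for lux in (0, 50, 200):
    c = effective_contrast(1000, 0.0019, lux, 0.02)
    print(f"{lux:>3} lux: ~{c:,.0f}:1")
# 0 lux gives ~526,316:1; with any real room light, the reflected light, not the
# panel's native black, dominates the effective black level.
```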



- FALD's lighting resolution is around 45x25 cells. That's very poor. There is no way it can light blocks of 7,000 to 15,000 pixels accurately, plus it tones the surrounding edge cells brighter or darker in a sort of jumbo anti-aliasing of lights vs darks. Whether that is a worthwhile tradeoff is up to you. There are tradeoffs either way. Some people don't "see" over 60 Hz, don't "see" smearing on VAs, etc., but the blooming and dimming "halos" or blobs of bright and dark cotton balls are there on FALD screens. Even when a scene allows the cells to blend more smoothly, they are toning multiple cells of 7k to 15k pixels up or down from what they should be were the screen some type of per-pixel emissive tech.

It doesn't say anything about how, when ABL knocks OLED brightness off the chart, the accuracy becomes worse than FALD.

It doesn't say why FALD 400-nit SDR can even look better than OLED 200-nit "HDR".

It doesn't say FALD is a lot better at HDR1000 and above.

PG35VQ HDR vs AW3423DW HDR
[image]


PG35VQ SDR vs AW3423DW HDR
[image]



It doesn't say that with FALD you can grade HDR1000 while you cannot do that on OLED.

It doesn't say why a 200-nit OLED cannot be made for grading HDR1000 even with over 8 million per-pixel dimming zones.

It doesn't say future content and games are getting more range.

It doesn't say you can adjust multiple zones in games to get much more impactful HDR from FALD.

It doesn't say that with FALD you can see 10x more range in movies and games.

[images: luminance heatmaps]


- Longer response times on LCDs, usually resorting to overdrive

It doesn't say anything about LCDs usually having higher refresh rates for lower input latency while delivering a similar motion image regardless.

It doesn't say some monitors can produce even clearer motion with backlight strobing at much higher brightness.

[image]



Again, your same old trick doesn't work.
 
Strobing is another can of worms with major tradeoffs. ABL? Strobing cuts perceived brightness massively. I don't know how you can bemoan ABL drops and champion strobing; versus whatever peaks your screen does, you are losing up to 50 percent of your HDR. And while technically achievable, in real-world released screens it's typically incompatible with VRR too.

Again you refuted TFTcentral's results, this time on matte AG, whose effects they describe - on all screen types - very clearly for anyone to see (pun intended).

I'll stick with them. I wasn't even talking to you, but it apparently gave you an opening to regurgitate your whole ABL-triggering custom-curve madness stream again.

It is pick your poison among all of the factors. One poison may be worse for your tastes, but you can't speak for everyone. There are major tradeoffs on all current screen technologies. Aside from the ones intrinsic to each tech (though the 2000-nit+ Samsung 4K and 8K FALD LCD screens also suffer aggressive ABL), matte AG affects any of them. There are matte AG OLED screens; that is what the TFTcentral article started with... but it affects all screens. It's just that there aren't many glossy options among FALD screens to begin with, so selection-wise it currently taints most if not all of them if you want to avoid that.
 
Unless you have a strong backlight. The drop doesn't happen on the Zowie monitor with its much stronger backlight, even though it is made as an SDR monitor. The backlight can output double the brightness of non-strobe mode; strobe mode on the Zowie is even brighter than non-strobe. And it is unlike the uncontrollable flicker from OLED that causes eye strain. It's controllable.

Even with strobing, the PA32UCG manages an easy 800 nits full-field. The fastest of the fast, the brightest of the bright. They are all LCD.

These reviews don't cover everything. You just pick pieces of them to use against FALD. It's always the same old trick.
 
Very good movie. Shame it's only out on regular old blu-ray and thus unwatchable garbage that doesn't even look real. Maybe our friend kram can make a professional remaster/regrade for us mere mortals who don't even make anime or grade HDR professionally. I propose it be called "Kram's Professional & Fantastic Fantastic Mr. Fox Real True HDR Remaster". It opens with a disclaimer stating that displaying the movie on an OLED display is not only a federal crime but an extreme health risk due to intense subliminal flickering in bright scenes.
[literally anything ever posted by this user]
 

ALL I need are just 2 facts. They will be common sense once people see what a good FALD can do.

I always talk about how FALD looks much better than OLED in the higher range where HDR matters the most. My point has always been that FALD has better accuracy than OLED in the higher range.

1. In the higher range OLED loses tons of brightness on every single pixel, which results in a much less accurate image compared to FALD.

[image: brightness chart]


2. FALD has brightness, color, and moderate contrast, while OLED only has contrast in the low range, not enough brightness, not enough color. This is why FALD is accurate enough to be used in a professional monitor for grading HDR1000 while a 200-nit OLED isn't.



In the meanwhile:

1. You simply pull out a dual-layer LCD to undermine FALD by saying stupid crap like FALD cannot be used by professionals because it is less accurate than a $50,000 dual-layer LCD. It's always funny that when I say FALD LCD > OLED in HDR, you say dual-layer LCD > FALD LCD. It's always funny you say OLED is accurate as long as it stays in the 200-nit SDR range, as if HDR matters more in that range. OLED doesn't even cover more color than FALD. Even in the low range, its accuracy often doesn't match.

2. It's you who says stupid things, such as that people are idiots for using FALD to grade HDR, while claiming proof of professionals using FALD. Then you double down and shoot yourself in the foot by providing links where they encourage you to grade HDR with things you can buy. Yet you still deny it even though some of the best HDR videos made on FALD are right in front of your face. No surprise, since you cannot see better anyway.

3. Since they are idiots to you, you'd better prove you are not an idiot. You'd better make HDR videos better than they did. But in reality, you with your 200-nit OLED cannot even see better, never mind grade HDR.
 
I really wouldn't call him a troll. If he is, it is surely the most extensive and bizarre case I have ever seen. I think he is just a little... different.
TBH you're being far more trollish than he is. He has all of these displays in front of him, from the PG32UQX to the AW3423DW (and many others), and is making a subjective evaluation I can buy, whereas you just come off as desperately defending your purchase.

What he's saying is fact. A FALD LCD capable of 1400nits that has only decent blacks relative to OLED provides far greater dynamic range. I mean even the video of Vincent from HDTVTest I linked spelled this out. Whether your eyes or mine find that preferable is a different story and that is the crux of this absurd circular argument.
 
Is the Zowie backlight operating as a 1000-zone+ FALD array in strobing mode? I'm not up on that screen, but if it's not operating at peak performance in FALD lighting and HDR while strobing, then that's a moot point. Are any of them even 4K? Do they even have HDR? If not, I don't see how they are relevant to the discussion.

edit: I looked up at least the Zowie XL2566K 360 Hz review (a January 2023 screen/review), and it is:
. . 1080p
. . TN
. . no HDR
. . edge-lit - no FALD, not even local dimming
. . the contrast ratio is very low, so blacks look gray when viewed in a dark room:
  • "Mediocre contrast ratio results in raised blacks in dark rooms."
"The BenQ ZOWIE XL2566K has just decent text clarity. The matte coating gives text a slightly hazy look, and due to the low pixel density, text isn't very sharp, even after optimizing the Windows ClearType settings for the display (top photo).

Is there some other Zowie that actually compares to the high-end FALD LCDs and modern OLEDs whose tradeoffs we are discussing?

. .

The bright room comments were in reply to someone else in this thread who uses their screens in that environment. Believe it or not, the thread is not all about Kramnelis.

. . .

Outside of that Zowie screen, regarding your "PA32UCG has easy 800 nits" in strobing mode: why would you cut your HDR brightness in half no matter what your screen's peak is? That's like using an ABL-triggered mode all of the time instead of incidentally when necessary. All of a sudden cutting your HDR to 800 nits (full field or not) and less is just fine? wtf. Unless you are saying there can be tradeoffs that make using a lower yet still impactful HDR range worthwhile... ????

BFI/strobing is also, in terms of real-world display releases, pretty much incompatible with VRR without major artifacts. Most screens don't even allow you to enable both. BFI/strobing also works best at very high frame rate minimums, considering that you don't want your frame rate fluctuating without VRR capability and suffering those side effects, and that you want the least sample-and-hold state the monitor is capable of as a starting point (max fps + Hz) to apply BFI to, by running your game at fps minimums exceeding the max Hz of the screen. So BFI/strobing is very pigeonholed as to what types of games and demand levels it's of value for in the first place.

. . BFI = dims the screen to roughly half of what it would otherwise be capable of, due to the nature of the way our eyes work (rough math in the sketch after this list). The strobes themselves are still that bright, so they could still trigger ABL on ABL-equipped screens if they were able to run that bright; it just doesn't look that bright to our eyes/brain due to the PWM-like effect.
. . Best use case requires frame rate minimums exceeding the max Hz of the screen (for the aforementioned reasons), so usually low-demand/complexity games, older games, and/or lower settings, and/or lower resolutions.
. . For all practical purposes incompatible with VRR.
. . Not certain how it behaves with AI upscaling/DLSS and frame amplification tech (like frame insertion).
. . It's niche to say the least imo, especially in regard to maximizing HDR content if you prioritize that - at least in its current implementations.
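A back-of-envelope sketch of that brightness/persistence trade (the pulse widths and peak figure are illustrative; real strobing backlights can partly compensate by overdriving the pulse):

```python
# Strobing back-of-envelope: perceived brightness scales with duty cycle,
# while motion persistence (hence blur) scales with the pulse width.
def strobe(pulse_ms: float, refresh_hz: float, panel_nits: float):
    frame_ms = 1000.0 / refresh_hz
    duty = pulse_ms / frame_ms
    return panel_nits * duty, pulse_ms  # (perceived nits, persistence in ms)

for pulse in (4.0, 2.0, 1.0):
    nits, persist = strobe(pulse, 120, 1000)
    print(f"{pulse} ms pulse @ 120 Hz: ~{nits:.0f} nits perceived, {persist} ms persistence")
# 4.0 ms -> ~480 nits, 2.0 ms -> ~240 nits, 1.0 ms -> ~120 nits
```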

. . . .

Strobing/BFI is a very pigeonholed/niche use case, and matte-type AG coatings can affect any screen type. And that Zowie comment... ugh. It seems like you are just being hyper-defensive rather than objective.

. . .

FALD has major tradeoffs, yes, but matte AG affects all screens with that surface treatment. It just happens that there aren't many (any?) glossy FALD screens, so it ends up being yet another major tradeoff to people... though the OLED monitors, as opposed to the gaming TV options, are typically AG too, which is really unfortunate imo. So again it's not a FALD vs OLED thing, though thankfully there are some OLED gaming TVs that are still made glossy. It's more about the tradeoffs on what is available.

I never said FALD screens are bad screens. I never say "destroyYYYYDD!!!" or "TRASH!!!". I said I don't like their tradeoffs (outlining them) compared to per-pixel emissive's tradeoffs (knowing and accepting its limitations) and what I find appealing about viewing on them. I do find the major tradeoffs on current FALD screens, especially with matte coatings, inadequate or not preferable as a screen tech for my taste compared to my OLED options. That doesn't mean a FALD is a horrible screen, or unusable, etc. And like I said, AG is just bad across the board... but that goes for any screen with a matte AG surface treatment. TFTcentral, RTINGS, and other sites and forum replies outline the compromises matte AG surface treatments cause very clearly, on any screen type.

. .

When a near-enough-to-per-pixel-emissive glossy LCD (if that's even possible; possibly not) or, of course, a micro-LED per-pixel emissive gaming display comes out in an enthusiast price range, I'd be all over it. A 45x25 lighting resolution with a matte AG surface treatment is not it - for me and others who hold similar tastes/value things similarly.

While ABL doesn't trigger that often, or at least for that long, in dynamic content - on HDR curves suited to a given screen - it's still one of the tradeoffs. Unfortunately I have concerns that as consumer screens of any type get to 2000 nits and higher, they might all use aggressive ABL, on LCD and maybe even micro-LED. The 2000-nit+ Samsung 4K and 8K LCD screens already have aggressive ABL. Hopefully that isn't true and they can develop heatsink and cooling tech and ways to boost the brightness at less heat, or all of the above. I know some of the dual-layer LCD reference monitors were pretty boxy and had heatsinks and fans in them. I'd definitely take function over form, but that doesn't seem to be what is marketed so much in the enthusiast consumer space.
 
What he's saying is fact. A FALD LCD capable of 1400nits that has only decent blacks relative to OLED provides far greater dynamic range.
Not in the same image, though. It has several times the brightness capability, but orders of magnitude worse dark capability and contrast.

Whether your eyes or mine find that preferable is a different story and that is the crux of this absurd circular argument.
Exactly, but he refuses to acknowledge that. The high ABL and (over)saturated colors are the be all end all and nothing else matters.
 
.. . .

Disagree 100%, with details from tftcentral and example pictures provided below. It's another major tradeoff.

. . . . . .


This has been argued in hardforum threads many times. Here is how I see it.

====================

Think of it like a light haze on clear "dry" ice vs. ultra clear wet ice.

Direct light sources hitting a screen are going to pollute the screen surface regardless. Some (many?) people are using their screens in poor setups. It's just like audio or photography - you should set up your environment to suit your hardware/screen not the other way around imo.
Mostly true today, but it is not always the case. Removing the AG filter destroys blacks on the FW900 when there is stronger ambient light.
 
Point taken, but CRT is a dinosaur, lol. Or a corpse raised from a crypt and reanimated.

I meant modern matte-type AG surface treatments on modern IPS/FALD, VA, TN, OLED. Thought that was implied or obvious, but your comment gave me a chuckle. I went through two FW900 CRTs back in the day.

edit: I think the contrast ratio is ~460:1 on the FW900, and the brightness is very low and depends on the % of the screen lit. "With a brand new one it's about 125 cd/m², but older ones with more use won't be as bright. Also depends on the picture that is displayed. Will be brighter for a smaller white image and less bright when the whole screen is white."

So they are very low contrast even compared to a 6000:1 non-FALD VA LCD. The motion clarity was awesome on them, though.
 
Didn't ask, didn't read, LOL.
Again

1. You simply pull out a dual-layer LCD to undermine FALD by saying stupid crap like FALD cannot be used by professionals because it is less accurate than a $50,000 dual-layer LCD. It's always funny that when I say FALD LCD > OLED in HDR, you say dual-layer LCD > FALD LCD. It's always funny you say OLED is accurate as long as it stays in the 200-nit SDR range, as if HDR matters more in that range. OLED doesn't even cover more color than FALD. Even in the low range, its accuracy often doesn't match.

2. It's you who says stupid things, such as that people are idiots for using FALD to grade HDR, while claiming proof of professionals using FALD. Then you double down and shoot yourself in the foot by providing links where they encourage you to grade HDR with things you can buy. Yet you still deny it even though some of the best HDR videos made on FALD are right in front of your face. No surprise, since you cannot see better anyway.

3. Since they are idiots to you, you'd better prove you are not an idiot. You'd better make HDR videos better than they did. But in reality, you with your 200-nit OLED cannot even see better, never mind grade HDR.

4. You can even try to prove you can make HDR like I did. I can even give you some footage to grade.
 
TBH you're being far more trollish than he is.
Yeah, now I am, obviously, cause he's getting boring as hell, literally just repeating the same nonsense without actually addressing anything I, or anyone else, say.
He has all of these displays in front of him, from the PG32UQX to the AW3423DW (and many others), and is making a subjective evaluation I can buy
I would care what he has if the discussion was about subjective evaluations. It's not.
whereas you just come off as desperately defending your purchase.
What the hell? What purchase am I defending? I'm not the guy vehemently defending the technology used in his $3000 display that will be completely obsolete soon enough LOL.
What he's saying is fact.
What? Most of it is unsubstantiated bullshit.
A FALD LCD capable of 1400nits that has only decent blacks relative to OLED provides far greater dynamic range.
Eh, that would depend heavily on the criteria of the test you used to determine it. No doubt that if your goal is very bright "impactful" HDR, the FALD display will obviously win, and I assume that's what you're alluding to. The image just can't be described as accurate, which has been my primary point for the past few pages, which is what gets kram's panties in a twist so he goes "hurr durr muh professionals chinese youtubers are using it to grade the best HDR 1000 on the planet so it must be accurate and btw OLED is shit".
Whether your eyes or mine find that preferable is a different story and that is the crux of this absurd circular argument.
I really don't think you've been keeping up with the thread tbh, but ok..

LOL, didn't ask, didn't read.
 
Is the Zowie backlight operating as a 1000-zone+ FALD array in strobing mode? I'm not up on that screen, but if it's not operating at peak performance in FALD lighting and HDR while strobing, then that's a moot point. Are any of them even 4K? Do they even have HDR? If not, I don't see how they are relevant to the discussion.

Outside of that Zowie screen, regarding your "PA32UCG has easy 800 nits" in strobing mode: why would you cut your HDR brightness in half no matter what your screen's peak is? That's like using an ABL-triggered mode all of the time instead of incidentally when necessary. All of a sudden cutting your HDR to 800 nits (full field or not) and less is just fine? wtf. Unless you are saying there can be tradeoffs that make using a lower yet still impactful HDR range worthwhile... ????

I pulled out the Zowie and the PA32UCG as counterpoints: the material you choose doesn't cover everything, and you just use it for your own purpose of undermining FALD.


I never said FALD screens are bad screens. I never say "destroyYYYYDD!!!" or "TRASH!!!".
Twisting some words or manipulating review materials for a different purpose doesn't mean you aren't saying the same thing.

Remember a while back when you said the PG32UQX is useless? Then you said the PA32UCG is made for static desktop use. You should try playing Doom on it. I have no problem playing Doom at the highest difficulty.

Then you talked about OLED "infinite black" and how content should lower the midtones, without saying realistic images can have much higher mids.

Then you busted out all the numerous quotes and reviews that are not relevant just to deny a higher-contrast image right in front of you.
 
edit: I think the contrast ratio is ~460:1 on the FW900, and the brightness is very low and depends on the % of the screen lit. "With a brand new one it's about 125 cd/m², but older ones with more use won't be as bright. Also depends on the picture that is displayed. Will be brighter for a smaller white image and less bright when the whole screen is white."
That must be ANSI (checkerboard) contrast for the FW900. On/off should be very high.
So they are very low contrast even compared to a 6000:1 non-FALD VA LCD. The motion clarity was awesome on them, though.
6000:1 native would be the very best of VA panels as well, no? (But please let's not get into VA panels here... or even CRTs...)
LOL, didn't ask, didn't read.
 
Yeah, now I am, obviously, cause he's getting boring as hell, literally just repeating the same nonsense without actually addressing anything I, or anyone else, say.
You are the one who is always obviously exposed from the very beginning pulling out nonsense, while I provide facts and materials.

I know how you roll. You always like to say "is there proof for whatever you said?". It's the same trick whenever there are no official technical details available for whatever you are up against. I can do the same thing. I can make you prove facts of facts. I can even make you prove why 1+1=2. I tell you that you cannot prove why 1+1=2.


LOL, didn't ask, didn't read.

You asked for this. You asked for facts.

1. In the higher range OLED loses tons of brightness on every single pixel, which results in a much less accurate image compared to FALD.

2. FALD has brightness, color, and moderate contrast, while OLED only has contrast in the low range, not enough brightness, not enough color. This is why FALD is accurate enough to be used in a professional monitor for grading HDR1000 while a 200-nit OLED isn't.

3. You simply pull out a dual-layer LCD to undermine FALD by saying stupid crap like FALD cannot be used by professionals because it is less accurate than a $50,000 dual-layer LCD. It's always funny that when I say FALD LCD > OLED in HDR, you say dual-layer LCD > FALD LCD. It's always funny you say OLED is accurate as long as it stays in the 200-nit SDR range, as if HDR matters more in that range. OLED doesn't even cover more color than FALD. Even in the low range, its accuracy often doesn't match.

4. It's you who says stupid things, such as that people are idiots for using FALD to grade HDR, while claiming proof of professionals using FALD. Then you double down and shoot yourself in the foot by providing links where they encourage you to grade HDR with things you can buy. Yet you still deny it even though some of the best HDR videos made on FALD are right in front of your face. No surprise, since you cannot see better anyway.

5. Since they are idiots to you, you need to prove to yourself that you are not an idiot. You need to make HDR videos better than they did.
 
Not in the same image, though. It has several times the brightness capability, but orders of magnitude worse dark capability and contrast.

Exactly, but he refuses to acknowledge that. The high ABL and (over)saturated colors are the be all end all and nothing else matters.
No. OLED has a lot worse accuracy overall than FALD in the higher range where HDR matters.

You need tools to scan, calculate, and compare the difference in color and brightness between the reference monitor vs FALD vs OLED for each pixel. Then you can get a result for which is worse on accuracy.

But it's already very obvious that FALD is more accurate, since it can maintain brightness even though the drawback is fewer dimming zones. It looks significantly better than OLED. This is why FALD is made into professional monitors for grading HDR1000 and above: the accuracy is closer to the reference.

OLED cannot be more accurate if the brightness and color are not enough, especially the brightness.
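For illustration only, here is the shape of that comparison as a sketch. The numbers are made up, and a real comparison would use a colorimeter and a perceptual metric like ΔE, which weights errors very differently than raw nits:

```python
# Illustrative only: per-pixel luminance error vs a reference grade, in nits.
reference = [1000.0, 500.0, 100.0, 0.1]   # nits the grade calls for
fald      = [1000.0, 500.0, 100.0, 0.3]   # bloom raises the dark pixel
oled_abl  = [ 300.0, 300.0, 100.0, 0.1]   # ABL caps the bright pixels

def mean_abs_error(panel, ref):
    return sum(abs(p - r) for p, r in zip(panel, ref)) / len(ref)

print("FALD:", mean_abs_error(fald, reference))          # 0.05 nits average error
print("OLED+ABL:", mean_abs_error(oled_abl, reference))  # 225.0 nits average error
# Note: raw nits weight bright-pixel errors heavily; a perceptual metric would not.
```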

Exactly, but he refuses to acknowledge that. The high ABL and (over)saturated colors are the be all end all and nothing else matters.

I'm saying FALD looks much better than OLED in the higher range where HDR matters the most. It doesn't matter what you prefer.

When you say "oversaturated colors", it just exposes that your monitor is too dim to display the proper brightness.

Color will look more saturated at low brightness. Color is lifted by brightness. Color becomes shallower as brightness/contrast goes up. This is why a monitor needs more color volume at high brightness, to make things look less shallow and more realistic.

If you had checked the HDR video I posted, you would have seen the same color become shallower as brightness/contrast goes up in the second stage of the video.



You can see the same effect even in screenshots.

sRGB volume, low contrast
[image]


sRGB volume , high contrast
[image]


Rec2020, high contrast.
[image]


You can circlejerk all you want without understanding any of this. In the end, the HDR1000 FALD produces better images than a 200-nit OLED.
 
I dunno why he still goes on about a 200-nit OLED. I guess all the screens he looks at are 100% full-screen white.
 
My main issue is that subjective priorities and preferences are being dismissed out of hand, as if one criterion that matters to one person must automatically be the ONLY thing that matters to everyone else, when that's clearly not the case. If this thread proves anything, it's that priorities for a display vary widely among people, and that's why both technologies (and even others like VA) are popular in certain circles.

You can't just dismiss anything that's sRGB or SDR because it's not your preference, especially when accuracy is important to a lot of people; that's the range I and many others spend most of our time in. It's also the range that is a LOT more of a standard than HDR is at this point. HDR is still an evolving technology with many different standards (HDR10, HDR10+, Dolby Vision, HLG, then all the nit levels... HDR400, 600, 1000, 1400, etc.), and the kinks are still being worked out as well. If someone watches mostly high-nit HDR, they'd be right to go FALD. For the rest of us though, it makes perfect sense to prioritize accurate sRGB while still having some nice HDR capabilities.

Talking about 200-nit OLED minimizes what OLED can actually do in real-world use (where it can hit peaks, smaller windows, and real-world scenes significantly higher than that).

At the same time, talking about 400-nit SDR is just weird. That's too bright to be comfortably watchable for most people and certainly not any sort of widely used standard. The same goes for the other value cited often, 80 nits, which has the opposite problem and is much dimmer than most people watch at. When asked about these values, they're just thrown out there again with no real explanation as to why.

It also minimizes both OLED and FALD making top lists on professional review sites for gaming monitors, HDR TVs, etc. Reviewers generally consider room lighting and user priorities for their recommendations, which only makes sense. If you're in a bright room, an OLED isn't a good choice. Conversely, if you're in a really dark room, you might notice more blooming on a FALD display, where OLED's perfect blacks would benefit you more.

As far as self-graded HDR, that's only really of value to the person grading it, as they can customize it to whatever they *want* it to look like. Customizing content for yourself is a great ability and it's nice people can do it, but it doesn't mean it's accurate. It may very well look better subjectively, but aside from one's own personal enjoyment it's pretty pointless, and someone who values high accuracy won't be very interested in it. Most people don't grade their own HDR, and most HDR that's commercially available is made to work reasonably well on a variety of technologies and monitor types (and I'm sure tested as such). Even the videos provided to dunk on OLED... looked pretty damn good on OLED, and I have a FALD TV so I could compare. I'm sure people who like to self-grade can and do make content to view on their OLED displays (and it looks great on FALD displays too). They probably just don't focus on as high a brightness.

The criticism about flickering causing eyestrain is also strange to me. I'm sure there's a point somewhere in there about how the technology works, but my *experience* is I've noticed absolutely zero flickering, and this has actually ended up the easiest monitor on my eyes I've owned yet, including my previous IPS and both IPS panels I tested/returned. I don't know how to explain it except to say it feels more like reading off a piece of paper than most monitors.

As far as brightness, nobody here has said HDR can get as impactful on an OLED as a FALD as far as brightness (and those do affect contrast and color to a point, agreed). But with tone-mapping for higher-nit content, or native content made to look good on multiple types of displays (games having settings, for example, or Windows HDR calibration tool if you use Auto HDR), it can still be a nice step above SDR and provide a very good experience for consuming HDR content.

At the end of the day, it's not particularly fun or informative to try to debate someone who won't concede any points, even when valid ones are raised about all current technologies having flaws with certain content, something that is indisputable. No matter what you go with, you're going to be making some sort of sacrifices *somewhere*. As put by others here, you "pick your poison" and choose what makes you happiest and the compromises you can live with. For me, that's excellent SDR/sRGB with perfect blacks and "pretty good" HDR, especially for gaming.

I tried the much-cited PA32UCG (had it just under two weeks) because, on paper, it is the king of FALD monitors. But in practice, my experience just wasn't very good. Aside from quality control issues which would have made keeping it unacceptable even if I liked everything else about it (and a big chance in the panel lottery of getting multiple defective ones had I tried to replace it), for the price point, I found there to be way too much blooming for my liking, particularly in SDR/sRGB content, and its hesitancy/slowness to return to sleep was very frustrating and annoying. Beyond that, there were smaller disappointments like the poor viewing angles (where even viewing the corners shows a little bit of color shift, etc.).
On the other hand, upon getting this OLED, it just felt good pretty immediately. It's a bit smaller, but that's okay since I just couldn't find a 32" I loved. The perfect blacks are awesome (it does have a matte finish, and I can sort of kind of see what people are talking about with that, but it still looks great to me), the calibrated mode looks superb in sRGB content (and even the out of the box sRGB was pretty good), refresh rates are amazing, viewing angles are perfect, no quality control issues I've noticed, great uniformity, etc. It's also a lot lighter and the LED effects look cool. =oP It has its downsides... sure. Not being able to do super high brightness HDR is one of the bigger ones, but literally all the HDR I've tried looks good enough (in most cases, I'd even say great) that it's worth it to me. I do worry about eventual burn-in, but it's time I see how much of a problem that really is nowadays anyways. I don't watch movies on this; I mainly only game, and games tend to take into account different techs anyways with adjustments available for HDR. A bit more customizability in certain settings would be nice (ABL is less aggressive in game modes, but you can't turn that on in other modes, for example), but any issues I have are nitpicks. The experience has just been better, at 1/3 of the price.

I've said it before and I'll say it again - user experience matters. All the highest brightness specs don't mean much if your everyday experience with something is not great. Meanwhile, a few compromises are easier to accept if something generally does everything pretty well at a reasonable price. That's where I'm at with the OLED. It's just a better fit *for me*. I still like FALD monitors for their pros (and have a FALD TV I adore). I still may try one again someday in the future - I'm curious about the new Asus coming later this year to replace the QX. But pretending only one factor, like brightness, matters is missing the whole breadth of factors going into one's overall experience with a new display and what that particular person prioritizes.
 
I dunno why he still goes on about a 200-nit OLED. I guess all the screens he looks at are 100% full-screen white.
It's simple math. An image that is 10% at 1200 nits, 10% at 500 nits, and 80% at 100 nits already has a 250-nit APL. That's just a medium APL. Full-field brightness is the max capability of total brightness output; the monitor can only output an APL lower than its full-field brightness. An OLED with a 200-nit full field already outputs at least 20% less brightness. The overall brightness is at least 20% less accurate, or worse.

It doesn't sustain brightness. It has a 300-400 nit sun instead of a 1000-nit sun. That's not accurate at all.
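The same math as a minimal sketch (the hard clip at the full-field limit is a simplification; real ABL behavior rolls off rather than clipping):

```python
# APL as an area-weighted average of region luminance, then capped by what the
# panel can sustain across the full screen (a simplified stand-in for ABL).
def apl(regions):  # regions: list of (fraction_of_screen, nits)
    return sum(frac * nits for frac, nits in regions)

scene = [(0.10, 1200), (0.10, 500), (0.80, 100)]
demanded = apl(scene)                             # 250.0 nits demanded by the scene
FULL_FIELD_NITS = 200                             # illustrative full-field sustain limit
print(demanded, min(demanded, FULL_FIELD_NITS))   # 250.0 200
```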
 
I tried the much-cited PA32UCG (had it just under two weeks) because, on paper, it is the king of FALD monitors. But in practice, my experience just wasn't very good. Aside from quality control issues which would have made keeping it unacceptable even if I liked everything else about it (and a big chance in the panel lottery of getting multiple defective ones had I tried to replace it), for the price point, I found there to be way too much blooming for my liking, particularly in SDR/sRGB content, and its hesitancy/slowness to return to sleep was very frustrating and annoying. Beyond that, there were smaller disappointments like the poor viewing angles (where even viewing the corners shows a little bit of color shift, etc.).
On the other hand, upon getting this OLED, it just felt good pretty immediately. It's a bit smaller, but that's okay since I just couldn't find a 32" I loved. The perfect blacks are awesome (it does have a matte finish, and I can sort of kind of see what people are talking about with that, but it still looks great to me), the calibrated mode looks superb in sRGB content (and even the out of the box sRGB was pretty good), refresh rates are amazing, viewing angles are perfect, no quality control issues I've noticed, great uniformity, etc. It's also a lot lighter and the LED effects look cool. =oP It has its downsides... sure. Not being able to do super high brightness HDR is one of the bigger ones, but literally all the HDR I've tried looks good enough (in most cases, I'd even say great) that it's worth it to me. I do worry about eventual burn-in, but it's time I see how much of a problem that really is nowadays anyways. I don't watch movies on this; I mainly only game, and games tend to take into account different techs anyways with adjustments available for HDR. A bit more customizability in certain settings would be nice (ABL is less aggressive in game modes, but you can't turn that on in other modes, for example), but any issues I have are nitpicks. The experience has just been better, at 1/3 of the price.

I've said it before and I'll say it again - user experience matters. All the highest brightness specs don't mean much if your every day experience with something is not great.

The problem is you bought a professional FALD monitor to look at sRGB most of the time. That's a rather limited experience.

Your experience is not real-world usage of the PA32UCG. It is mainly used for grading HDR1000 or HDR1400, at a price tag of $3,500. A few years ago you had to pay multiple times $3,500 to make HDR1000. HDR1000 is still a very high standard, and only a few monitors can reach it and deliver good images.

You bought a monitor without knowing what it is specifically used for or what its strongest point is, and without using it to its full potential. The PA32UCG even has an HDR preview mode so you can see an SDR picture as HDR. It doesn't cheat on brightness, so the blooming is more visible. Of course your experience will be like this. But people will conclude from your experience that OLED has better HDR than FALD. That's not true. The fact is FALD can be used to make HDR1000. Even with its drawbacks, FALD still has better HDR than OLED at the higher range. And OLED has problems of its own.
 
I doubt it would really have mattered had I gone with the QX (I also considered the G8 briefly). It might have been marginally better for my uses, but I still would probably have had to keep local dimming off for SDR content, which is what I mostly view. Also, several reviews state that the PA32UCG, while overkill and certainly useful for professionals, is great for gaming and consuming content as well. Given it had some advantages over the QX, I thought if I was going to go for FALD, it was possibly my best bet. If I had to do it over again, I might have tried the QX instead, but I think I would have run into at least some of the same issues, and I'm still of the opinion that for my uses, the OLED is the better fit/experience. Part of the reason I didn't think blooming would be as much of a problem as it ended up being is that I haven't had as much noticeable blooming on my TV (especially with SDR content). I would have expected a high-end monitor to have better options to address it for SDR; it did have an sRGB mode, after all.

OLED has its limitations, sure, but it really shines for SDR with its perfect blacks and lack of blooming, is very good with moderate brightness HDR, and is "good enough" for high-brightness HDR where it tone-maps. Based on my usage profile, that makes it the best compromise for what I do the most.
 
My main issue is that subjective priorities and preferences are being dismissed out of hand, as if one criteria that matters to one person must automatically be the ONLY thing that matters to everyone else, when that's clearly not the case. If this thread proves anything, it's that priorities for a display vary widely among people, and that's why both technologies (and even others like VA) are popular in certain circles.

You can't just dismiss anything that's sRGB or SDR because it's not your preference, especially when accuracy is important to a lot of people; that's the range I and many others spend most of our time in. It's also far more of a standard than HDR is at this point. HDR is still an evolving technology with many different standards (HDR10, HDR10+, Dolby Vision, HLG, then all the nit levels: HDR400, 600, 1000, 1400, etc.), and the kinks are still being worked out. If someone watches mostly high-nit HDR, they'd be right to go FALD. For the rest of us, though, it makes perfect sense to prioritize accurate sRGB while still having some nice HDR capabilities.

Talking about 200-nit OLED minimizes what OLED can actually do in real-world use, where peaks, smaller windows, and real-world scenes go significantly higher than that.

At the same time, talking about 400-nit SDR is just weird. That's too bright to be comfortably watchable for most people and certainly not any sort of widely used standard. The same goes for the other value cited often, 80 nits, which has the opposite problem and is much dimmer than most people watch at. When asked about these values, they're just thrown out there again with no real explanation as to why.

It also minimizes both OLED and FALD making top lists on professional review sites for gaming monitors, HDR TVs, etc. Reviewers generally consider room lighting and user priorities for their recommendations, which only makes sense. If you're in a bright room, an OLED isn't a good choice. Conversely, if you're in a really dark room, you might notice more blooming on a FALD display, where OLED's perfect blacks would benefit you more.

As far as self-graded HDR, that's only really of value to the person grading it, as they can customize it to whatever they *want* it to look like. Customizing content for yourself is a great ability and it's nice people can do it, but it doesn't mean it's accurate. It may very well look better subjectively, but aside from one's own personal enjoyment it's pretty pointless, and someone who values high accuracy won't be very interested in it. Most people don't grade their own HDR, and most HDR that's commercially available is made to work reasonably well on a variety of technologies and monitor types (and I'm sure tested as such). Even the videos provided to dunk on OLED... looked pretty damn good on OLED, and I have a FALD TV so I could compare. I'm sure people who like to self-grade can and do make content to view on their OLED displays (and it looks great on FALD displays too). They probably just don't focus on as high a brightness.

The criticism about flickering causing eyestrain is also strange to me. I'm sure there's a point somewhere in there about how the technology works, but my *experience* is I've noticed absolutely zero flickering, and this has actually ended up the easiest monitor on my eyes I've owned yet, including my previous IPS and both IPS panels I tested/returned. I don't know how to explain it except to say it feels more like reading off a piece of paper than most monitors.

As far as brightness, nobody here has said HDR on an OLED can be as impactful as on a FALD display (and brightness does affect contrast and color to a point, agreed). But with tone-mapping for higher-nit content, or native content made to look good on multiple types of displays (games having settings, for example, or the Windows HDR calibration tool if you use Auto HDR), it can still be a nice step above SDR and provide a very good experience for consuming HDR content.
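
Since tone-mapping keeps coming up: at its core it's just a luminance remap with a knee. Here's a minimal sketch of that idea in Python - the knee point, the quadratic rolloff, and the 460/1000-nit figures are illustrative assumptions, not any display vendor's actual curve.

```python
# Minimal sketch of a knee/soft-clip tone map (illustrative only; real
# displays and the Windows HDR pipeline use their own proprietary curves).

def tonemap_nits(luminance, knee=300.0, display_peak=460.0, content_peak=1000.0):
    """Map scene luminance (nits) onto a display with a lower peak.

    Below the knee the signal passes through 1:1 (accuracy preserved);
    above it, highlights are compressed smoothly toward display_peak
    instead of being hard-clipped.
    """
    if luminance <= knee:
        return luminance
    t = min((luminance - knee) / (content_peak - knee), 1.0)
    return knee + (display_peak - knee) * (1.0 - (1.0 - t) ** 2)

for nits in (100, 300, 600, 1000, 1400):
    print(f"{nits:5d}-nit source -> {tonemap_nits(nits):6.1f} nits displayed")
```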

At the end of the day, it's not particularly fun or informative to try to debate someone who won't concede any points, even when valid ones are raised about all current technologies having flaws in certain content, something that is indisputable. No matter what you go with, you're going to be making some sort of sacrifices *somewhere*. As put by others here, you "pick your poison" and choose what makes you happiest and the compromises you can live with. For me, that's excellent SDR/sRGB with perfect blacks and "pretty good" HDR, especially for gaming.

I tried the much-cited PA32UCG (had it just under two weeks) because, on paper, it is the king of FALD monitors. But in practice, my experience just wasn't very good. Aside from quality control issues, which would have made keeping it unacceptable even if I liked everything else about it (with a big chance in the panel lottery of getting more defective ones had I tried to replace it), for the price point I found there to be way too much blooming for my liking, particularly in SDR/sRGB content, and its hesitancy/slowness to return to sleep was very frustrating and annoying. Beyond that, there were smaller disappointments like the poor viewing angles (where even viewing the corners shows a little bit of color shift, etc.).
On the other hand, upon getting this OLED, it just felt good pretty immediately. It's a bit smaller, but that's okay since I just couldn't find a 32" I loved. The perfect blacks are awesome (it does have a matte finish, and I can sort of kind of see what people are talking about with that, but it still looks great to me), the calibrated mode looks superb in sRGB content (and even the out-of-the-box sRGB was pretty good), refresh rates are amazing, viewing angles are perfect, no quality control issues I've noticed, great uniformity, etc. It's also a lot lighter and the LED effects look cool. =oP It has its downsides, sure. Not being able to do super high brightness HDR is one of the bigger ones, but literally all the HDR I've tried looks good enough (in most cases, I'd even say great) that it's worth it to me. I do worry about eventual burn-in, but it's time I see for myself how much of a problem that really is nowadays. I don't watch movies on this; I mainly game, and games tend to take different display techs into account anyway, with adjustments available for HDR. A bit more customizability in certain settings would be nice (ABL is less aggressive in game modes, for example, but you can't turn that on in other modes), but any issues I have are nitpicks. The experience has just been better, at 1/3 of the price.

I've said it before and I'll say it again - user experience matters. All the highest brightness specs don't mean much if your everyday experience with something is not great. Meanwhile, a few compromises are easier to accept if something generally does everything pretty well at a reasonable price. That's where I'm at with the OLED. It's just a better fit *for me*. I still like FALD monitors for their pros (and have a FALD TV I adore). I may still try one again someday - I'm curious about the new Asus coming later this year to replace the QX. But pretending that only one factor, like brightness, matters misses the whole breadth of factors that go into one's overall experience with a new display and what that particular person prioritizes.
Amen.

Complaining about eye strain on OLED as some indisputable fact is just wrong, because eye strain is so highly individual. I have never had any issue with the OLEDs I've owned, but have had eye strain from CRTs as well as some crappy LCDs I've had to use at client premises. You can find someone complaining about almost any panel tech or display model causing eye strain. It's similar to getting motion sickness, for example. I could not play through Gears 5 without putting a permanent aim point on screen because it would make me nauseous without a focal point. Other people would not experience anything like this, and there is no clear reason why this game caused me the problem when many other 3rd-person games do not.

A few years ago I bought the LG CX 48" OLED TV to use as my main desktop display. At the time its main compromises were large size and potential burn-in. It worked alright, but I bought it with the wrong focus - gaming first. Instead I used it 70% for work and 30% for gaming, so it was ultimately not the right product for my uses. I still enjoy it almost every day as my living room TV and I'm happy with its performance, yes even for HDR even if it's not the brightest HDR display out there. Maybe when we get higher brightness and refresh rates I will replace it with something else.

If I were to buy the PG32UQX that Kramnelis champions so fiercely, it would perform no better than my 399 euro Samsung G70A 4K 144 Hz IPS for that 70% work usage. For the 30% gaming use it would do great at HDR, but perform worse for motion than the Samsung. At nearly 9x higher price, that's just not a great tradeoff.

So instead my next desktop display looks to be the 57" Samsung superultrawide 8K x 2K mini-LED, because it has the desktop space and resolution I want for work while also having a decent HDR experience for gaming - hopefully better than the Neo G7/G8. Would I prefer that in QD-OLED? Hell yeah, but we are not there yet. It's going to be another compromise, but I am OK with VA not having OLED-like motion performance or viewing angles; in return it might do a bit brighter HDR, has no burn-in issues, and hopefully can be run at narrower ultrawide resolutions for gaming. I'm probably in the minority here in liking this superultrawide form factor, and even I can say it has a lot of issues of its own.

Which brings us back to "pick your poison".
 
I doubt it would really have mattered had I gone with the QX (I also considered the G8 briefly). It might have been marginally better for my uses, but I still would probably have had to keep local dimming off for SDR content, which is what I mostly view. Also, several reviews state that the PA32UCG, while overkill and certainly useful for professionals, is great for gaming and consuming content as well. Given it had some advantages over the QX, I thought if I was going to go for FALD, it was possibly my best bet. If I had to do it over again, I might have tried the QX instead, but I think I would have run into at least some of the same issues, and I'm still of the opinion that, for my uses, the OLED is a better fit/experience. Part of the reason I didn't think blooming would be as much of a problem as it ended up being is I haven't had as much noticeable blooming on my TV (especially with SDR content). I would have expected a high-end monitor to have better options to address it for SDR; it did have an sRGB mode, after all.

OLED has its limitations, sure, but it really shines for SDR with its perfect blacks and lack of blooming, is very good with moderate brightness HDR, and is "good enough" for high-brightness HDR where it tone-maps. Based on my usage profile, that makes it the best compromise for what I do the most.
That's just your experience with limited usage of the monitors.

Mine is totally different, as I can see a lot more range with FALD than with OLED. I can see a wider range of colors with FALD compared to OLED.

To me the AW3423DW has miserable brightness. It doesn't even look good in SDR with only DCI-P3 250-nit brightness instead of Adobe YCbCr 400 nits. The HDR performance is not enough, with a sun peaking at only 460 nits and frequent ABL. The HDR on the AW3423DW can even look worse than the 400-nit Adobe SDR on the PG35VQ. Then it gives eye strain at only 200 nits, while DC-dimming FALD never gives that problem even at 400 nits. And my ambient environment is dim enough to sleep in.

The PG32UQX has less blooming than the PA32UCG, as it has G-Sync to control the backlight. Your 100-zone Z95 TV works like that 96-zone Sony InZone M9, with slightly lifted blacks to reduce blooming. If you care about blooming that much and view sRGB at 100 nits most of the time, the OLED will fit. But I don't want to go back to seeing just sRGB at such a limited range, or a 300-nit sun as bright as a ping-pong ball. I need to see as much range as possible, as a more realistic image has a much higher range.
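
The blooming vs. lifted-blacks tradeoff described here falls out of a very simple model. Below is a toy 1-D sketch in Python; the zone count, the single 1000-nit highlight, and the 3000:1 panel contrast are all assumed purely for illustration, not measurements of any of these monitors.

```python
# Toy 1-D FALD model: each backlight zone must be driven as bright as its
# brightest pixel, so "black" pixels sharing a zone with a highlight get
# lit too (blooming). Purely illustrative; real controllers are smarter.

PIXELS = 64
ZONES = 8                          # 8 pixels per zone in this toy model
PIXELS_PER_ZONE = PIXELS // ZONES

target = [0.0] * PIXELS            # a black scene...
target[20] = 1000.0                # ...with one small 1000-nit highlight

# Drive each zone for its brightest pixel.
backlight = [max(target[z * PIXELS_PER_ZONE:(z + 1) * PIXELS_PER_ZONE])
             for z in range(ZONES)]

LCD_CONTRAST = 3000.0              # assumed native contrast of the LC panel
for z, bl in enumerate(backlight):
    if bl > 0:
        # Darkest a "black" pixel in this lit zone can get.
        print(f"zone {z}: backlight {bl:.0f} nits, black leaks to {bl / LCD_CONTRAST:.2f} nits")

# More/smaller zones shrink the lit area; per-pixel emission eliminates it.
# The "lifted blacks" trick instead runs neighboring zones slightly on to
# hide the hard zone edge, trading black level for less visible blooming.
```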
 
A few years ago I bought the LG CX 48" OLED TV to use as my main desktop display. At the time its main compromises were large size and potential burn-in. It worked alright, but I bought it with the wrong focus - gaming first. Instead I used it 70% for work and 30% for gaming, so it was ultimately not the right product for my uses. I still enjoy it almost every day as my living room TV and I'm happy with its performance, yes even for HDR even if it's not the brightest HDR display out there. Maybe when we get higher brightness and refresh rates I will replace it with something else.
That's actually why I ended up sticking with my AW3821 on my desktop. I wanted the OLED Alienware and had actually ordered one, however when I look at my usage, it is almost all desktop. I browse the web and do e-mail, watch some video, play with Nuendo, and do work. I rarely game on my monitor anymore, I game on my TV (also hooked to the computer). For that kind of thing, I just like the 3821 better because it is bigger, and because I needn't worry about burn in when I have a big Nuendo session sitting mostly static for long periods. It just doesn't make sense, for my use case, to change.

Same kind of deal with something like a FALD display. The PG32 would be an option but again, major tradeoffs. I find 32" non-UWs too large, and I've really become a fan of UW for Nuendo. Plus the thing is like $2800. That's more than I spent upgrading my system to a 13900K. It isn't out of reach, but damn that's a lot of money. FALD also isn't perfect. I've had a FALD TV in the past and looked at more modern ones, and with movies I'd say they do pretty damn well. You can still notice the zones sometimes, but not a ton. With games, though, they don't do as well. There are more situations where there's a bright area surrounded by things not as bright that they don't handle well. It's not unplayable or anything, but it is a tradeoff you notice.

So for the desktop, I stick with my older LCD.

With TVs I went the opposite route and had a similar debate: I was looking at either an OLED (I ended up getting the S95B) or a high-end FALD, probably a QN90B. Both would get me what I really wanted (4k120 VRR), and both would be an improvement over my old FALD TV, which didn't have that many zones and was lacking in the color gamut department. I did like the brightness on the FALD TVs; the high-end ones can push near 2000 nits and can easily sustain over 600 nits full screen. That means it could be eye-searingly bright even with the windows open and sun flowing in. But it just didn't look as good as the OLED in a number of ways, most importantly HDR in games, but also things like viewing angles (which matter more for our TV usage). So the OLED it was. Everything is a tradeoff, and that was the tradeoff that made the most sense for how we use it there (mostly games, always at night).
 
Complaining about eye strain on OLED as some indisputable fact is just wrong, because eye strain is so highly individual. I have never had any issue with the OLEDs I've owned, but have had eye strain from CRTs as well as some crappy LCDs I've had to use at client premises. You can find someone complaining about almost any panel tech or display model causing eye strain. It's similar to getting motion sickness, for example. I could not play through Gears 5 without putting a permanent aim point on screen because it would make me nauseous without a focal point. Other people would not experience anything like this, and there is no clear reason why this game caused me the problem when many other 3rd-person games do not.

A few years ago I bought the LG CX 48" OLED TV to use as my main desktop display. At the time its main compromises were large size and potential burn-in. It worked alright, but I bought it with the wrong focus - gaming first. Instead I used it 70% for work and 30% for gaming, so it was ultimately not the right product for my uses. I still enjoy it almost every day as my living room TV and I'm happy with its performance, yes even for HDR even if it's not the brightest HDR display out there. Maybe when we get higher brightness and refresh rates I will replace it with something else.

Eye strain is not that highly individual once the brightness goes a little bit higher. The LG CX can only output images with an APL lower than 150 nits. That is a low APL. Once it goes up to 250 nits like the AW3423DW, it gives eye strain, as all OLEDs flicker regardless. Other things also make it worse, such as low ambient light and frequent ABL.

OLED can hardly get any brighter, but once you see slightly higher-APL images in a dark room, OLED gives eye strain. This doesn't happen with DC-dimming FALD, even when displaying much higher-APL images.

You can try viewing a bit higher APL with OLED in a dark room, then see if it gives eye strain. I know the eye strain caused by the AW3423DW. I can even get eye strain in under 30 minutes playing Resident Evil, which has a rather low APL. I get eye strain at max brightness playing R6 in SDR, which is just 250 nits.
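
For anyone following along, APL (average picture level) is just the frame's mean luminance relative to peak white; it's the quantity ABL reacts to. A minimal sketch of the calculation, with the example frame contents assumed:

```python
# APL = mean luminance of the frame as a fraction of peak white.

def apl_percent(frame_nits, peak_nits):
    """Average picture level of a frame (list of per-pixel nit values)."""
    return 100.0 * sum(frame_nits) / (len(frame_nits) * peak_nits)

# Example: 10% of the frame at a 1000-nit peak, the rest near black.
frame = [1000.0] * 100 + [1.0] * 900
print(f"APL: {apl_percent(frame, 1000.0):.1f}%")   # ~10.1% - a low-APL frame
```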
 
That's just your experience with limited usage of the monitors.

Mine is totally different, as I can see a lot more range with FALD than with OLED. I can see a wider range of colors with FALD compared to OLED.

To me the AW3423DW has miserable brightness. It doesn't even look good in SDR with only DCI-P3 250-nit brightness instead of Adobe YCbCr 400 nits. The HDR performance is not enough, with a sun peaking at only 460 nits and frequent ABL. The HDR on the AW3423DW can even look worse than the 400-nit Adobe SDR on the PG35VQ. Then it gives eye strain at only 200 nits, while DC-dimming FALD never gives that problem even at 400 nits. And my ambient environment is dim enough to sleep in.

The PG32UQX has less blooming than the PA32UCG, as it has G-Sync to control the backlight. Your 100-zone Z95 TV works like that 96-zone Sony InZone M9, with slightly lifted blacks to reduce blooming. If you care about blooming that much and view sRGB at 100 nits most of the time, the OLED will fit. But I don't want to go back to seeing just sRGB at such a limited range, or a 300-nit sun as bright as a ping-pong ball. I need to see as much range as possible, as a more realistic image has a much higher range.

What you call limited use is the experience I'd call pretty standard for a lot of people, but I take your point. Agreed, though, that the UCG was probably overkill and the UQX might have worked a bit better (though I've found G-Sync can cause problems in some games regardless of monitor, so I often have it disabled anyway, and I'm not sure how much better the UQX would have been; it's fair that it might have been less problematic as far as blooming goes - that wasn't clear from the reviews I read).

I don't have any experience with the Alienware monitor, so I can't comment. I didn't want an ultrawide, so that ruled it out for me. I'd agree the brightness on the LG wouldn't be enough for you based on what you do, since you like porting everything into that higher range and high-nit HDR, so it makes sense FALD is your best option. For someone like me who doesn't find that appealing or think it creates a better picture (since I value accuracy first), OLED is more than enough.

I did actually consider the InZone as I tend to like Sony stuff but the reviews just weren't great for it and I didn't love the stand design.

I will say a sun on the OLED is brighter than a ping pong ball, but I take your point. You like really high-nit HDR, and I can understand the appeal even though other things take priority for me.

As far as the Alienware, I have no idea why the eyestrain. As pointed out by other comments in this thread, different people are sensitive to different things re different monitor technologies. I will say trying to view SDR at 200 nits is pretty bright in a dark room - I know when I first got my TV, I had brightness set way too high and experienced eyestrain until I dialed it into a more reasonable level. All I can say re OLED is I don't appear to have any from mine (though I generally view content at 160 nits, so it's a bit lower), and I haven't noticed any flickering.

You can try viewing a bit higher APL with OLED in a dark room, then see if it gives eye strain. I know the eye strain caused by the AW3423DW. I can even get eye strain in under 30 minutes playing Resident Evil, which has a rather low APL. I get eye strain at max brightness playing R6 in SDR, which is just 250 nits.

SDR that bright (in a dark room) would give me eyestrain too I think; it wouldn't matter the type of monitor.
 
I will say trying to view SDR at 200 nits is pretty bright in a dark room - I know when I first got my TV, I had brightness set way too high and experienced eyestrain until I dialed it into a more reasonable level. All I can say re OLED is I don't appear to have any from mine (though I generally view content at 160 nits, so it's a bit lower), and I haven't noticed any flickering.

SDR that bright (in a dark room) would give me eyestrain too I think; it wouldn't matter the type of monitor.

A TV is bigger than a monitor. A 10% window on a 55" is almost a 30% window on a 32". But SDR at 250 nits is not that bright on a smaller monitor. I don't have a problem with higher 300-nit or 400-nit DC dimming. OLED's flicker is invisible but can still cause eye strain.
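
The window-size conversion is easy to sanity-check: for screens of the same aspect ratio, area scales with the square of the diagonal, so the same absolute patch of light covers a bigger fraction of a smaller screen. A quick check in Python:

```python
# Same-size bright patch expressed as a fraction of two 16:9 screens.

def window_fraction(frac_on_big, big_diag, small_diag):
    """Fraction of the smaller screen covered by a window that is
    frac_on_big of the bigger screen (same aspect ratio assumed)."""
    return frac_on_big * (big_diag / small_diag) ** 2

print(f"{window_fraction(0.10, 55, 32):.1%}")   # ~29.5%: "almost 30%" checks out
```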
 
Fair enough. I'm sure different people may be more sensitive to flicker than others too.
 
RTings rates 2 of their top 3 gaming TVs as OLED, with the Samsung QD-OLED in the top spot. Their top gaming *monitors* also have an OLED in the top spot. I'm not saying everyone would rank them in the same order as they did, or value them the same - as we've said, the tradeoffs of each screen and what you prefer is a personal choice. Personally I wouldn't go back to the 1440p they put in the top gaming monitor spot, and if a true glossy option with good specs were available I'd choose that where possible, as all of these screen surface issues are a major turn-off to me. Still, it's telling how well reviewed OLEDs are by discriminating sites like RTings and TFTCentral, for example. There are QD-LED LCDs in the top 3 or 4 on most sites too, though. That said, OLEDs imo are best as media and gaming displays. They aren't great as monitors in brighter rooms, and at lower PPD (sitting too close to a larger 4k TV, or using a 1440p gaming monitor) the subpixel and screen surface text issues stand out.
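
For reference, PPD (pixels per degree) depends on resolution, screen size, and viewing distance together. A rough way to compute it, with the example sizes and the 24-inch distance picked purely for illustration:

```python
import math

def ppd(h_pixels, diag_in, dist_in, aspect=16 / 9):
    """Approximate pixels per degree at the center of a flat 16:9 screen."""
    width = diag_in * aspect / math.sqrt(1 + aspect ** 2)   # 16:9 width from diagonal
    h_fov = 2 * math.degrees(math.atan(width / (2 * dist_in)))
    return h_pixels / h_fov

print(f'42" 4k TV at 24":         {ppd(3840, 42, 24):.0f} ppd')  # sitting too close
print(f'27" 1440p monitor at 24": {ppd(2560, 27, 24):.0f} ppd')
print(f'27" 4k monitor at 24":    {ppd(3840, 27, 24):.0f} ppd')  # for comparison
```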

====================================

RTings' #1 gaming monitor is the Dell AW3423DW

" The best gaming monitor we've tested is the Dell Alienware AW3423DW. It's an excellent ultrawide gaming monitor that doesn't have the highest resolution or refresh rate compared to other monitors. However, it's known for its incredible motion handling and remarkable picture quality. It has a QD-OLED display, so it combines the perfect black levels of OLEDs with the wide range of colors of quantum dot displays. Content looks amazing in dark rooms, and colors look vivid in HDR, but there are issues when using it in a bright room as the black levels raise, so it's better to game with it in the dark. " <--------- lacks a polarizing layer. The screen causes pink tint when light hits it and raises blacks. Still takes the top spot with a "dark room only" use scenario warning. Might be using a weird surface treatment like the S95B QD-OLED TV. True glossy is better like the C1, C2.

. . .

RTings #1 gaming TV is the Samsung S95B QD-OLED

"The Samsung S95B is a fantastic TV overall. Its self-emissive panel technology is superb for watching movies or gaming in a dark room. HDR content looks superb thanks to its high peak brightness and exceptional color gamut. It also has an exceptional viewing angle, so you can enjoy an accurate image from any angle, making it amazing for watching sports or TV shows. Sadly, it uses an extremely uncommon pixel layout that results in noticeable color fringing and blurry text, so it's not well-suited for productivity use as a PC monitor. It's also best suited for completely dark rooms, as it has raised blacks in a room with any ambient lighting, and the screen has a pink tint to it."

"The Samsung S95B handles direct reflections incredibly well, but there are some flaws. Due to the lack of a polarizer, if you're in a room with any ambient lighting, the TV has a pink tint to it even when it's off. Bright lights are still distracting in a bright room, but it cuts the mirror effect slightly better than the LG G2 OLED. On the other hand, blacks look much better on the G2 when you're in a room with any ambient light." <---- samsung's weird surfaces screwing things up when light hits the surface. True glossy is better like LG C1, C2, G2.


. . .

gamesradar's #1 gaming TV = LG C1 OLED

Forbes #1 Gaming TV Jan 2023 = LG C2

TechRadar's #1 gaming TV = LG C2

IGN 's #1 gaming TV = LG C2

eurogamer's #1 gaming TV = LG C2, C1

HowToGeek's #1 gaming TV = LG C2

. . . . . . . . .

A TV is bigger than a monitor. A 10% window on a 55" is almost a 30% window on a 32". But SDR at 250 nits is not that bright on a smaller monitor. I don't have a problem with higher 300-nit or 400-nit DC dimming. OLED's flicker is invisible but can still cause eye strain.

If you are viewing any 4k screen at the human viewing angle of 50 to 60 degrees, the effective size is the same to your perspective, so the size of the screen shouldn't matter if you are using the screen at proper/optimal viewing angles. Sitting closer than that on a larger 4k screen in a PC scenario is going to cause some major tradeoffs to begin with.
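
The geometry behind that is straightforward: to keep a given horizontal field of view, viewing distance scales linearly with screen size, so the screen subtends the same angle. A quick sketch, with the 55-degree FOV and 16:9 geometry as the assumptions:

```python
import math

def distance_for_fov(diag_in, fov_deg, aspect=16 / 9):
    """Viewing distance (inches) at which a flat 16:9 screen of the given
    diagonal spans the given horizontal field of view."""
    width = diag_in * aspect / math.sqrt(1 + aspect ** 2)
    return (width / 2) / math.tan(math.radians(fov_deg) / 2)

for diag in (32, 42, 55):
    print(f'{diag}" screen: sit ~{distance_for_fov(diag, 55):.0f}" away for a 55-degree FOV')
```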

I do understand the eye strain thing as I get that from bfi/strobing displays vs sample and hold. I haven't experienced it with an oled though (not using bfi). I have a CX and a C1 and view a lot of HDR material.
 
If you are viewing any 4k screen at the human viewing angle of 50 to 60 degrees, the effective size is the same to your perspective, so the size of the screen shouldn't matter if you are using the screen at proper/optimal viewing angles. Sitting closer than that on a larger 4k screen in a PC scenario is going to cause some major tradeoffs to begin with.

I do understand the eye strain thing as I get that from bfi/strobing displays vs sample and hold. I haven't experienced it with an oled though (not using bfi). I have a CX and a C1 and view a lot of HDR material.

One of the interesting things I noticed when trying bigger IPS panels, which I didn't expect (going from 27" to 32", about the biggest I could support on my desk), was that I was a bit unhappy with the viewing angles. I could see a color shift in the corners of the screen even looking straight on. Ironically, viewing angles on an OLED wouldn't be an issue (especially with this MLA panel, it's like paper - I see no degradation even at extreme angles), but OLED doesn't have decent 32" options so far. Still, after experiencing that on both samples I tried, it's honestly a lot more comfortable without that shift, and I was happy to go back to 27" for now.
 
LCD = 56K modem
OLED = broadband

Once you switch from a 56K modem to broadband, you just don't go back. They will never make FALD with enough zones to prevent the blooming effect. It is either per-pixel light emission or it's not as good as OLED. The only technology that comes close to OLED is plasma, which actually has superior motion compared to OLED, but plasma has many other issues that make it unviable as a PC monitor.

OLED's greatest issue is the possibility of burn-in, but that is being mitigated successfully via pixel shift, pixel refresh, and healthy/careful OLED display use habits. You can also enable temporal dithering on NVidia (and probably AMD) cards to make sure there is beneficial non-stop pixel movement that adds to quality.

P.S. You don't have to auto-hide your taskbar, which doesn't auto-hide your mouse cursor. You can just move your taskbar around (top, bottom, left, right) every 4-5 hours.
 
They will never make FALD with enough zones to prevent the blooming effect. It is either per-pixel light emission or it's not as good as OLED. The only technology that comes close to OLED is plasma, which actually has superior motion compared to OLED, but plasma has many other issues that make it unviable as a PC monitor.

OLED's greatest issue is the possibility of burn-in, but that is being mitigated successfully via pixel shift, pixel refresh, and healthy/careful OLED display use habits. You can also enable temporal dithering on NVidia (and probably AMD) cards to make sure there is beneficial non-stop pixel movement that adds to quality.

P.S. You don't have to auto-hide your taskbar, which doesn't auto-hide your mouse cursor. You can just move your taskbar around (top, bottom, left, right) every 4-5 hours.

There is a TaskBar Hider app that lets you toggle the taskbar shown/hidden via a hotkey you can customize, instead of the sloppy mouse-over method. So you can lock it away entirely, regardless of mouse-over, until you hit that hotkey to show it. Then just hit the hotkey again to hide it and lock it away. I'm also using a translucent-taskbar app so that it's clear.

http://www.itsamples.com/software.html
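
I don't know how that app is implemented internally, but the same show/hide toggle can be sketched in a few lines of Python against the Win32 API. This is purely illustrative, not the app's actual code; the hotkey choice is arbitrary and the third-party `keyboard` package is an assumed dependency (pip install keyboard):

```python
# Sketch of a hotkey-toggled taskbar, using the classic trick of hiding the
# Shell_TrayWnd window. Windows-only and illustrative; not how TaskBar Hider
# necessarily works internally.
import ctypes
import keyboard  # assumed third-party dependency for the global hotkey

user32 = ctypes.windll.user32
SW_HIDE, SW_SHOW = 0, 5
state = {"visible": True}

def toggle_taskbar():
    hwnd = user32.FindWindowW("Shell_TrayWnd", None)  # primary taskbar window
    if hwnd:
        state["visible"] = not state["visible"]
        user32.ShowWindow(hwnd, SW_SHOW if state["visible"] else SW_HIDE)

keyboard.add_hotkey("ctrl+alt+t", toggle_taskbar)  # any free combo works
keyboard.wait()  # keep the script alive listening for the hotkey
```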

Since I use multiple monitors I do drag the taskbar to the top of the one to the right of my oled though.

Win+TAB is pretty useful to swap between running apps without using the taskbar to begin with. Win+S pops up the search field, and typing 2 or 3 letters brings up almost anything pretty intelligently, so you just hit a few keys and slap the Enter key to launch it. However, I use a Stream Deck with its window management plugins, plus I micromanage things more with DisplayFusion Pro's script library, with hotkeyed scripts tied to Stream Deck buttons. I use it to launch things or multi-launch multiple apps, and also to place app windows either individually or via global saved window position profiles, etc. Also to make an app window active, min/restore it to its home position, or use generic window sizing and placement buttons once the app is the active one, etc. You can also tie its buttons into the LG remote control software to change an LG OLED's settings or activate features otherwise accessible in the TV menus.

I almost never have to manually move or resize any windows with a mouse anymore. The only reason I have to show the taskbar is for the system tray once in a while. I'm adding that Stream Deck usage information because it shows how I don't need a taskbar to navigate my system anymore.

That, plus an ultra-black wallpaper, no desktop icons, using different named settings for different usages, etc. But other than keeping the screen blank when not showing media or gaming, the "turn off the screen emitters" option is probably the most useful for saving the burn-in buffer.

Some useful info in previous replies from kasakka and me here:

https://hardforum.com/threads/what-new-oled-gaming-monitors-in-2023.2024551/post-1045547989

Personally I use mine as a media and gaming display. I'd still avoid using one as a static desktop/app monitor but some people go that route.
 
RTings rates 2 of their top 3 gaming TVs as OLED, with the Samsung QD-OLED in the top spot. Their top gaming *monitors* also have an OLED in the top spot. I'm not saying everyone would rank them in the same order as they did, or value them the same - as we've said, the tradeoffs of each screen and what you prefer is a personal choice. Personally I wouldn't go back to the 1440p they put in the top gaming monitor spot, and if a true glossy option with good specs were available I'd choose that where possible, as all of these screen surface issues are a major turn-off to me. Still, it's telling how well reviewed OLEDs are by discriminating sites like RTings and TFTCentral, for example. There are QD-LED LCDs in the top 3 or 4 on most sites too, though. That said, OLEDs imo are best as media and gaming displays. They aren't great as monitors in brighter rooms, and at lower PPD (sitting too close to a larger 4k TV, or using a 1440p gaming monitor) the subpixel and screen surface text issues stand out.
Not to defend Mr. "200 nits" (I'm guessing you are arguing with him, I have him blocked) but I will say you have to be careful with RTings because they have their own biases, that being for contrast ratio. If you read a lot of their reviews, as I have, you notice that something that is real, real important to them is a high contrast ratio. Now I don't disagree that a higher contrast ratio is nicer, but I also don't feel it is as big a deal as they do. I'm an IPS fan, for TVs as well as monitors, and I'm ok with the lower contrast ratio as a tradeoff.

Not disagreeing that OLED is great for games, I *LOVE* my S95B for gaming, particularly good HDR games (Resident Evil Village is an example of fantastic HDR) just that much like some people on this forum, they have a particular thing they over-focus on a bit, in their case it is contrast ratio.
 
There is a TaskBar Hider app that lets you toggle the taskbar shown/hidden via a hotkey you can customize, instead of the sloppy mouse-over method. So you can lock it away entirely, regardless of mouse-over, until you hit that hotkey to show it. Then just hit the hotkey again to hide it and lock it away. I'm also using a translucent-taskbar app so that it's clear.

http://www.itsamples.com/software.html

Since I use multiple monitors I do drag the taskbar to the top of the one to the right of my oled though.

Win+TAB is pretty useful to swap between running apps without using the taskbar to begin with. Win+S pops up the search field, and typing 2 or 3 letters brings up almost anything pretty intelligently, so you just hit a few keys and slap the Enter key to launch it. However, I use a Stream Deck with its window management plugins, plus I micromanage things more with DisplayFusion Pro's script library, with hotkeyed scripts tied to Stream Deck buttons. I use it to launch things or multi-launch multiple apps, and also to place app windows either individually or via global saved window position profiles, etc. Also to make an app window active, min/restore it to its home position, or use generic window sizing and placement buttons once the app is the active one, etc. You can also tie its buttons into the LG remote control software to change an LG OLED's settings or activate features otherwise accessible in the TV menus.

I almost never have to manually move or resize any windows with a mouse anymore. The only reason I have to show the taskbar is for the system tray once in a while. I'm adding that Stream Deck usage information because it shows how I don't need a taskbar to navigate my system anymore.

That, plus an ultra-black wallpaper, no desktop icons, using different named settings for different usages, etc. But other than keeping the screen blank when not showing media or gaming, the "turn off the screen emitters" option is probably the most useful for saving the burn-in buffer.

Some useful info in previous replies from kasakka and me here:

https://hardforum.com/threads/what-new-oled-gaming-monitors-in-2023.2024551/post-1045547989

Personally I use mine as a media and gaming display. I'd still avoid using one as a static desktop/app monitor but some people go that route.

I am using that very same TaskBar Hider, but it doesn't hide the mouse cursor...
 