Why OLED for PC use?

But why? People here who have, enjoy and prefer OLED displays aren't going to start disliking them and go out and buy a FALD based on your incessant rambling. This entire thread makes no sense.

He's suffering from buyers remorse and is in a state of denial, desperately trying to convince himself and others that he made a good purchase.

It's pretty common on tech boards, this is an extreme case though.

He won't convince anyone who's actually read through the thread, but maybe the random person who just stumbles upon the thread and sees only his posts will be convinced in some way. Which sucks for them.
It's ok to have your own opinion and disagree but the outright dishonesty and spam is excessive and a problem for anyone uninformed looking for real information. If I was a mod I would probably ban him or something since he's making the forum worse for everyone else, but I'm not so I just ignore him.
 
He's suffering from buyers remorse and is in a state of denial, desperately trying to convince himself and others that he made a good purchase.

It's pretty common on tech boards, this is an extreme case though.

He won't convince anyone who's actually read through the thread, but maybe the random person who just stumbles upon the thread and sees only his posts will be convinced in some way. Which sucks for them.
It's ok to have your own opinion and disagree but the outright dishonesty and spam is excessive and a problem for anyone uninformed looking for real information. If I was a mod I would probably ban him or something since he's making the forum worse for everyone else, but I'm not so I just ignore him.

It's more like you are the one hallucinating, without having seen what FALD can do.

Funny, it's you who hasn't seen better. You don't know how disappointing an OLED like the AW3423DW is when compared to a 4-year-old true HDR FALD monitor such as the PG35VQ.

OLED flicker easily gives eyestrain even when the brightness is dim. OLED easily loses 2-3x the dynamic range. You are seeing lesser HDR400 or SDR all the time, while I can already see HDR that you cannot even display on OLED. Every time I use a FALD I see better images than on OLED.
 
He's suffering from buyers remorse and is in a state of denial, desperately trying to convince himself and others that he made a good purchase.

It's pretty common on tech boards, this is an extreme case though.

He won't convince anyone who's actually read through the thread, but maybe the random person who just stumbles upon the thread and sees only his posts will be convinced in some way. Which sucks for them.
It's ok to have your own opinion and disagree but the outright dishonesty and spam is excessive and a problem for anyone uninformed looking for real information. If I was a mod I would probably ban him or something since he's making the forum worse for everyone else, but I'm not so I just ignore him.
He definitely doesn't have buyers remorse because he owns all the displays in question and has shown pictures of them all together.

I don't like the way he's going about it as if preference has no role but this forum is heavily OLED biased so he's fighting an uphill battle.

Until all of you have both a PG32UQX and C2/AW3423DW side by side to compare, you'll obviously defend your own purchase. I think there are 2 of us in this thread who have compared both but I'm not sure if the other person did side by side or got his OLED well after returning the LCD.
 
He definitely doesn't have buyers remorse because he owns all the displays in question and has shown pictures of them all together.

I don't like the way he's going about it as if preference has no role but this forum is heavily OLED biased so he's fighting an uphill battle.

Until all of you have both a PG32UQX and C2/AW3423DW side by side to compare, you'll obviously defend your own purchase. I think there are 2 of us in this thread who have compared both but I'm not sure if the other person did side by side or got his OLED well after returning the LCD.

This thread initially started out as concerns about burn in by using OLED as a monitor then spiraled off into the typical FALD vs OLED battle.
 
This thread initially started out as concerns about burn in by using OLED as a monitor then spiraled off into the typical FALD vs OLED battle.
It's bound to be this way.

This thread's title is "Why OLED for PC use?" The burn-in explanation is simple: OLED can't hold much brightness or it will burn in. So you have to live with an average of about 100 nits of limited brightness, in a limited range, with worse accuracy.

I didn't even need to start it myself. Somebody else already did the favor of busting out a hot take like "An OLED monitor with burn-in on the panel would *still* be a better looking monitor than any IPS LCD". Then OLED gets wrecked by its limited range plus unsolvable flickering compared to FALD DC-dimming HDR, which is much better for current and future content.
 
Yes, you are advertising OLED while undermining FALD. Whether it is intentional or not, you are doing exactly the same thing.

I'm not undermining FALD. I like FALD. But for my uses, after trying two IPS panels (one well-reviewed FALD), OLED is better. They both have their pros/cons. As far as "advertising" OLED, I've barely mentioned the brand/model in this thread. I'm not sure what I'd gain by it even if it was. I'm just giving my honest opinion.


Don't forget you are using a dim 27GR95QE, which is dimmer than the already dim AW3423DW, to see sRGB at 80 nits in SDR or a 360-nit sun at most in HDR mode. These images are rather pathetic. You are seeing tone-mapped SDR on OLED while FALD can produce HDR you won't even be able to see.

Dimmer in some tests, brighter in others. The images are quite nice and bright enough for me.

At least that one is HDR400, unlike the 27GR95QE without certification. That monitor is brighter than the 27GR95QE at larger window sizes. It's at least accurate for HDR400, unlike yours which isn't even qualified. But HDR400, True Black or not, is not true HDR1000 anyway. You cannot use an HDR400 monitor to see true HDR1000 accurately. Your 27GR95QE is a dim SDR monitor where the major complaints are all about low brightness. It's not an HDR monitor when FALD SDR can beat it easily. It's not an HDR monitor without even the lowest HDR certification. It's that simple.

I don't care much about certification. It'd be nice, but it's not a dealbreaker. The LG can go brighter for some things, dimmer than others. I don't think you can give any claims on accuracy, and you're making assumptions about the lack of certification I don't think either of us honestly know.

Also, it's an HDR monitor. This is not debatable. Any objective data backs that up; specific certifications are irrelevant.


OLED is worse for PC use due to its limited range. It's not fit for future content or even the current content. It flickers. You cannot see much at 160 nits APL. Your monitor is just that dim, without HDR certification. These are not opinions.

It's not worse for PC use; it depends on the user's criteria. It may be worse for your uses based on your opinion. It's better than FALD for mine based on my opinion.
 
It doesn't make sense, but it's still funny to read every now and then lol. He's not going to change anyone's mind, much like nobody is going to change his. The "truths" or whatever he keeps going on about don't mean jack shit; people are going to use whatever they like, period. There are still people who even today would rather use a CRT and nothing else, despite CRTs being even dimmer than OLEDs and having zero HDR capability, and there is nothing wrong with that if that is what someone prefers to use.

For vintage gaming, CRTs probably really are king. I wish I still had one tbh. Nothing like hooking a NES or Super NES up to one of those =oP
 
He definitely doesn't have buyers remorse because he owns all the displays in question and has shown pictures of them all together.

I don't like the way he's going about it as if preference has no role but this forum is heavily OLED biased so he's fighting an uphill battle.

Until all of you have both a PG32UQX and C2/AW3423DW side by side to compare, you'll obviously defend your own purchase. I think there are 2 of us in this thread who have compared both but I'm not sure if the other person did side by side or got his OLED well after returning the LCD.

To be fair, I didn't compare FALD and OLED side by side at the same time (in my case the ProArt UCG and LG's new 27"), but brightness alone isn't one of my bigger criteria as long as it gets bright enough (which the OLED does for my light-controlled room, especially for SDR, but with enough pop in HDR to make me happy). Of course if you put HDR1000 or greater on FALD next to an OLED, the FALD display is going to look quite a bit brighter. That's completely fair and valid and I haven't really seen anybody argue otherwise. I do have a FALD TV in an adjacent room and can compare that way, though it's not a perfect comparison. Both technologies are great at what they do best, and both have compromises at what they do worst. I really like FALD for my TV, and I really thought I would for the monitor, but I do too much where the blooming with Local Dimming on is very distracting to me. (I also had major quality control issues with that specific ProArt). Plus I like the viewing angles and blacks of the OLED. But it's purely down to preference and what I do. If instead of 85% of my time in SDR I did 85% in high-nit HDR, I'm sure FALD would be my preferred choice. And if things like microLED become mainstream, it might be my choice in the future for monitor too - we'll see.

I think the value of comparing monitors side by side is also somewhat limited. There are variables like how they are set to consider (and if filmed, how the camera is picking it up). It's like going to an electronics store and deciding on a TV purely on how they look in the showroom on one specific image/scene. How are they set? Could one be set to Vivid and one to Cinema? What's the content? Is it SDR or HDR? A dark scene or a really bright scene? And it's very possible different scenes look better on each respectively.

My main point of contention isn't even FALD vs. OLED - it's the inability for the person in question to accept that some people really do prefer OLED for their specific use case after trying both, to the point of accusing multiple people of lying about ever trying FALD if they happened to have a better experience with OLED. It just puzzles me.
 
I'm not undermining FALD. I like FALD. But for my uses, after trying two IPS panels (one well-reviewed FALD), OLED is better. They both have their pros/cons. As far as "advertising" OLED, I've barely mentioned the brand/model in this thread. I'm not sure what I'd gain by it even if it was. I'm just giving my honest opinion.
You are doing exactly the same job. I've seen many cases like you. It doesn't matter whether you are advertising OLED, are unable to understand HDR, or are busting out philosophy to talk about preference. I don't care what you prefer. I'm stating the fact that OLED is far less accurate in HDR and produces worse images.

Dimmer in some tests, brighter in others. The images are quite nice and bright enough for me.
It's as dim as a 300nits sun.

I don't care much about certification. It'd be nice, but it's not a dealbreaker. The LG can go brighter for some things, dimmer than others. I don't think you can give any claims on accuracy, and you're making assumptions about the lack of certification I don't think either of us honestly know.

Also, it's an HDR monitor. This is not debatable. Any objective data backs that up; specific certifications are irrelevant.
The 27GR95QE is not an HDR monitor without a VESA certification. In fact, it doesn't have any HDR certification. This is objective. Your dynamic range falls 3 times short, with a 300-nit sun compared to the required 1,000 nits. The monitor is just that pathetically dim. Even the AW3423DW with HDR400 is still borderline HDR that's not capable of displaying true HDR1000.

It's not worse for PC use; it depends on the user's criteria. It may be worse for your uses based on your opinion. It's better than FALD for mine based on my opinion.
It's worse for PC use. It has a 160-nit max APL. The monitor doesn't even sell.
 
The 27GR95QE is not an HDR monitor without a VESA certification. In fact, it doesn't have any HDR certification. This is objective. Your dynamic range falls 3 times short, with a 300-nit sun compared to the required 1,000 nits. The monitor is just that pathetically dim. Even the AW3423DW with HDR400 is still borderline HDR that's not capable of displaying true HDR1000.


It's worse for PC use. It has a 160-nit max APL. The monitor doesn't even sell.

VESA certification doesn't define whether something is or isn't HDR alone, though it is nice to contextualize it. This was you trying to find a "gotcha" to move the goalpost. The monitor has HDR capabilities, multiple HDR-specific modes, and every reviewer reviews HDR performance. It's an HDR monitor, certification or not. This is akin to saying fruit isn't fruit since it's not certified organic.

It's not worse for PC uses; it all depends what you do. Especially for SDR, it's better in several areas (perfect blacks, viewing angles, and motion to name a few). FALD is better for high-nit HDR if that's most of what you do. No argument there, but trying to pretend OLED isn't better in some areas is just incorrect (just as pretending FALD isn't better in some areas would be).

What are you talking about "doesn't sell"? After launch, it was sold out until the other day and just came back into stock (with a "Best Seller" tag on LG's site), though I haven't seen any specific sales figures. This is a bold claim unless you have sources. But also, how it sells doesn't really affect me; I like the monitor either way. Though sales will mean continued panels in this size from LG and others like ASUS, and I'm looking forward to seeing what innovations come down the line. (Same with looking forward to seeing what future FALD panels from ASUS and others look like). I'm fairly technologically agnostic; I just want what works best for my uses.
 
VESA certification doesn't define whether something is or isn't HDR alone, though it is nice to contextualize it. This was you trying to find a "gotcha" to move the goalpost. The monitor has HDR capabilities, multiple HDR-specific modes, and every reviewer reviews HDR performance. It's an HDR monitor, certification or not. This is akin to saying fruit isn't fruit since it's not certified organic.
You are dismissing the VESA certification and calling whatever image HDR as long as it has an HDR mode. VESA doesn't give HDR400 to the 27GR95QE. You cannot deny that. It simply fails to deliver the required 400-nit dynamic range when a 1,000-nit sun gets knocked down to 300 nits. The "HDR" on the 27GR95QE can easily be beaten by SDR on FALD. You call that HDR because of your preference. It has even worse range compared to SDR.

It's not worse for PC uses; it all depends what you do. Especially for SDR, it's better in several areas (perfect blacks, viewing angles, and motion to name a few). FALD is better for high-nit HDR if that's most of what you do. No argument there, but trying to pretend OLED isn't better in some areas is just incorrect (just as pretending FALD isn't better in some areas would be).
You see sRGB at 80 nits most of the time. You don't see high-range images. You don't like blooming. So you use OLED. It's your preference. It pretty much proves OLED is a low-range SDR monitor, while FALD has much more range in both contrast and color to deliver better images overall.

What are you talking about "doesn't sell"? After launch, it was sold out until the other day and just came back into stock, though I haven't seen any specific sales figures. This is a bold claim unless you have sources. But also, how it sells doesn't really affect me; I like the monitor either way. Though sales will mean continued panels in this size from LG and others like ASUS, and I'm looking forward to seeing what innovations come down the line. (Same with looking forward to seeing what future FALD panels from ASUS and others look like). I'm fairly technologically agnostic; I just want what works best for my uses.
You should've checked the stores in the East Asian market, where it has the most sales volume.

The numbers are pathetic for OLED. A few clicks of searching and you can get a good idea of how many OLEDs sell. You can like OLED all you want. But it's not the future. The display market is a grinding business. OLED doesn't have enough dynamic range to show better images.

InnoCN 27M2U
[screenshot]

LG 27GR95QE
[screenshot]
 
You are dismissing the VESA certification and calling whatever image HDR as long as it has an HDR mode. VESA doesn't give HDR400 to the 27GR95QE. You cannot deny that. It simply fails to deliver the required 400-nit dynamic range when a 1,000-nit sun gets knocked down to 300 nits. The "HDR" on the 27GR95QE can easily be beaten by SDR on FALD. You call that HDR because of your preference. It has even worse range compared to SDR.

You changed the debate. I never said it has VESA certification. Not once. As for the reason, you're assuming. Nonetheless, it's an HDR monitor. Certification alone doesn't determine that.


You see sRGB at 80 nits most of the time. You don't see high-range images. You don't like blooming. So you use OLED. It's your preference. It pretty much proves OLED is a low-range SDR monitor, while FALD has much more range in both contrast and color to deliver better images overall.

I see sRGB most of the time, though not at 80 nits as you (for some reason that eludes me) like to claim. I do sometimes see higher range images, just not all that often - mostly just in HDR supported games. No, I don't like blooming, especially when using Windows. Yes, OLED excels at SDR (the only exception being text fringing, which doesn't happen to bother me) while still offering decent HDR. FALD excels at HDR while still offering decent SDR, but with compromises like blooming and poor viewing angles, which are more dealbreakers for me.

You should've checked the stores in the East Asian market, where it has the most sales volume.

The numbers are pathetic for OLED. A few clicks of searching and you can get a good idea of how many OLEDs sell. You can like OLED all you want. But it's not the future. The display market is a grinding business. OLED doesn't have enough dynamic range to show better images.

We don't know how many of each have been produced, how much the out of stock status affected sales, and they're also at different price points, with the InnoCN cheaper as well as being out several months longer. Also, I know of some people who are waiting to see what ASUS or other companies do with this same LG panel before purchasing. Not exactly the most equivalent comparison. It'll likely take a while longer to determine how good sales actually are.
 
You changed the debate. I never said it has VESA certification. Not once. As for the reason, you're assuming. Nonetheless, it's an HDR monitor. Certification alone doesn't determine that.
I never changed the debate. It's not an HDR monitor. It never had any HDR certification. You think it can get a VDE HDR800 certification like Samsung's?
I see sRGB most of the time, though not at 80 nits as you (for some reason that eludes me) like to claim. I do sometimes see higher range images, just not all that often - mostly just in HDR supported games. No, I don't like blooming, especially when using Windows. Yes, OLED excels at SDR (the only exception being text fringing, which doesn't happen to bother me) while still offering decent HDR. FALD excels at HDR while still offering decent SDR, but with compromises like blooming and poor viewing angles, which are more dealbreakers for me.
Give it a little more and you see sRGB at 140 nits of brightness. There isn't much difference. People are complaining they cannot see with brightness this dim. And you see tone-mapped images that are well inside SDR range with 8-bit color.




We don't know how many of each have been produced, how much the out of stock status affected sales, and they're also at different price points, with the InnoCN cheaper as well as being out several months longer. Also, I know of some people who are waiting to see what ASUS or other companies do with this same LG panel before purchasing. Not exactly the most equivalent comparison. It'll likely take a while longer to determine how good sales actually are.
The sheer sales numbers just show the difference. A monitor without stock doesn't sell either. They are in stock anyway.
 
Good luck ranking up in R6 or any typical FPS. You will get destroyed with brightness this low.
With nyctalopia for sure!

BTW, isn't it enough to say OLEDs are less bright and that for you it's too big of a flaw?
Trying to convince people OLEDs are too dark for them makes you look like an idiot.
 
With nyctalopia for sure!

BTW, isn't it enough to say OLEDs are less bright and that for you it's too big of a flaw?
Trying to convince people OLEDs are too dark for them makes you look like an idiot.
I'm saying you will get destroyed in R6 with brightness that low. I don't care what you prefer. It's that you cannot see with 140 nits.
 
I never changed the debate. It's not an HDR monitor. It never had any HDR certification. You think it can get a VDE HDR800 certification like Samsung's?

Give it a little more and you see sRGB at 140 nits of brightness. There isn't much difference. People are complaining they cannot see with brightness this dim. And you see tone-mapped images that are well inside SDR range with 8-bit color.




The sheer sales numbers just show the difference. A monitor without stock doesn't sell either. They are in stock anyway.


lol. You absolutely changed the debate. You said it wasn't an HDR monitor initially, and when challenged with...you know...all the reviews reviewing the HDR, specs listing HDR, all of a sudden demanded it be certified to count. That's not how it works. It's an HDR monitor. Fruit is still fruit if it's not certified organic. I never argued it met any certifications/standards. I simply said it's an HDR monitor. Just because that sun doesn't meet certain nits doesn't mean smaller highlights don't go higher. It's established it can hit far higher than that.

sRGB is made for ~100 nits in a pitch black room (some, like me prefer brighter at about 160 to meet room conditions and personal preference) - that you can't see at that brightness is nonsense. And it's more accurate closer to reference brightnesses.

I'm not sure if it extends to the PS5 (I think so, but not 100%), but Vincent over at HDTVTest said XBOX calibration for HDR on this panel is not recommended since HGiG is not supported (unfortunately - I guess that's common with most monitors vs TVs, so I don't know how you'd do it well with a console). I'm not sure if in that video any calibration issues are coming into play since it is a console. All I can tell you is my experience in HDR games on PC is that it's not dim and the colors look great. *shrugs* That said, I'm also in a very dark room and testing a game with good HDR customization settings for my display. I'm not saying this person shouldn't be unhappy with the monitor - sounds like he might be in a brighter room, and if he thinks it's too dim for his uses, that's valid and he should probably look for something else he likes better. It might not be a great match for consoles either. But for me on PC, the brightness is fine.

They are in stock again, but weren't just a few days ago. It's a brand new monitor. It'll take time to see actual sales figures. Either way, there's a lot of excitement about a 27" OLED option and more variants to come from companies like ASUS. That wouldn't happen if they didn't think it'd sell. My only real concern with this monitor is burn in, and it'll take time to see if that's an issue.
 
Yeah I'm sick of his spiel too.

Now I find myself skipping his posts and reading the retorts lol.
Well done for dealing with what appears to be a hardcore shill, perhaps with shares in companies that make LCDs? ... that's about the same quality of backchat he's been giving you guys, only with more chance it's true :D
 
SDR was typically 80 - 120 nit; this "new SDR" definition some use is a little silly imo, and on some screens those HDR-like SDR levels even trigger ABL because they are way above 180 - 200 nit SDR, sometimes 400 nit. Really, those SDR levels, when reviewed, are said to be useful in very bright room viewing environments, in order to compensate, since our eyes view things relatively. That is, in a bright room they "need" the SDR to go 2x to 4x as bright because our eyes see everything relatively, and the room is at least 2x as bright in that case, likely more, so you are then back to effectively the same SDR levels as far as your eyes and brain are concerned.
Using those same blasted bright-room compensatory SDR levels as a foundation for dim to dark viewing conditions is illogical and malforms professional DV curve breakdowns, but knock yourself out if you like torch mode. HDR is best viewed in the dim to dark viewing environments it was designed for in the first place, so the SDR and the lower HDR range should be at their non-compensatory levels (not boosted for a bright room), like the Dolby Vision professional mastering guideline of roughly 50% of the typical screen composition at zero to 100 nit. Dim to dark viewing environments also prevent the matte-type AG surface treatment from "activating", so its tradeoffs become less of an issue. Modern OLED can show content at 600 to 1,000 nit in another ~35% of the screen, which often comes up during dynamic scenes' camera/FoV movement. It's only certain scenes that trigger OLED's limitations more, at 50% and 100% windows or ABL-wise on more static scenes (though some 2,000+ nit FALD LCDs also have ABL). The OLED tone maps the top 25%/25% of the range (100 to 1,000 nit mids, 1,000 to 10,000 nit highlights and light sources) down to fit whichever screen's limitations, e.g. it might show up to about 400 to 500 nit natively and then compress the remaining range, depending on the display. Are those limitations? Absolutely. They are tradeoffs for sure, but FALD has its own.
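To make that tone-mapping step concrete, here's a rough sketch of the idea (not any particular panel's actual curve, just an illustration: pass the lower range through roughly 1:1, then roll everything above a knee off toward the panel's peak; the 400-nit knee and 800-nit peak are made-up example values):

```python
def rolloff_tone_map(nits_in, knee=400.0, peak=800.0):
    """Illustrative highlight roll-off: linear below the knee, then smooth
    compression of everything above it so it stays under the panel peak."""
    if nits_in <= knee:
        return nits_in                      # lows/mids pass through unchanged
    headroom = peak - knee                  # output range left for highlights
    excess = nits_in - knee                 # how far the source exceeds the knee
    # simple asymptotic compressor: approaches 'peak' but never reaches it
    return knee + headroom * (excess / (excess + headroom))

for source in (100, 400, 1000, 4000, 10000):
    print(source, "->", round(rolloff_tone_map(source), 1))
```

So a 1,000-nit mid and a 10,000-nit highlight both survive, they just get squeezed into whatever headroom the panel has left above the knee.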

Overall it's an appreciable HDR experience: 400, 600, 1,000 nit areas in dynamic material side by side with 50% of the screen at 0 to 100 nit, down to the razor's edge pixel by pixel, in tons of mixed scenes - shaded areas in brighter scenes, darker areas even within textures, providing detail to the pixel. The dark end is part of the HDR range.

FALD's zones are a spotted leopard: the zones pale or brighten around contrasted areas as the dynamic scene puddle-jumps across the 45x25 zones constantly. It tries to spread the edges across a scar of multiple zones so the transition isn't as abrupt, like jumbo-cell tone anti-aliasing, but each zone is about 7,000 pixels (15 thousand at 8K), which is large, so areas even larger than the cell the contrasted edge lands on (or crosses several of) can be affected in order to smooth them against each other. So there can be less outright bloom, but contrast in those areas drops to 3800:1 to 4500:1, with the accompanying lifted black depths and muting of the brighter zones. That contrast, where zones are being crossed constantly in dynamic content, is terrible compared to OLED's pixel-by-pixel control - and blooming still happens on smaller or sharper edges and small sources. Or the opposite: the darker area dims the bright one in a reverse bloom, losing color detail/level. It's a clever stop-gap system until everything ends up per-pixel emissive someday, but its lighting resolution is way too low.
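For a sense of scale on those zone numbers, the per-zone pixel count follows straight from the grid; quick back-of-the-envelope math, assuming the 45x25 grid mentioned above (an approximation, not an exact spec):

```python
# Rough pixels-per-zone estimate for a FALD backlight, assuming a 45 x 25 grid.
width, height = 3840, 2160          # 4K panel
zones_x, zones_y = 45, 25

pixels_per_zone = (width * height) / (zones_x * zones_y)
print(f"{zones_x * zones_y} zones, ~{pixels_per_zone:.0f} pixels per zone")
# -> 1125 zones, ~7373 pixels per zone: every lighting "cell" spans thousands
#    of image pixels, so a bright/dark edge inside one cell has to share a
#    single backlight level.
```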
 
He definitely doesn't have buyers remorse because he owns all the displays in question and has shown pictures of them all together.

I don't like the way he's going about it as if preference has no role but this forum is heavily OLED biased so he's fighting an uphill battle.

Until all of you have both a PG32UQX and C2/AW3423DW side by side to compare, you'll obviously defend your own purchase. I think there are 2 of us in this thread who have compared both but I'm not sure if the other person did side by side or got his OLED well after returning the LCD.

Obviously the FALD is much brighter, but it loses contrast across its jumbo-sized 7,000-pixel zones down to 3800:1 to 4500:1, which is very poor, and it can still get blooming or reverse-bloom dimming of bright edges in media and of small bright objects/text/cursor etc. Its zones are big. Per-pixel emissive is a better way to display things.

If you have been reading my replies to you, I've been telling you that our eyes view everything relatively. So you won't get an accurate measure of the screens in a side by side, since you are biasing your eyes by letting them adjust to the brighter one, and you won't get an accurate photo of them side by side in the same photo frame either, since cameras also capture things relatively.

Similarly, the push for 400 nit SDR is in order to combat bright room viewing conditions. When you boost it that high you are just back to how 100 nit looks to your eyes in a dim to dark viewing environment. It's how our eyes work.
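A crude way to put numbers on the "relative" point (the ambient figures here are invented for illustration, not measurements): what adaptation cares about is roughly the ratio of screen white to the light level around it.

```python
# Illustrative only: hypothetical ambient light levels, not measured data.
setups = {
    "dim room,    100-nit SDR white": (100, 5),    # (screen nits, ambient nits)
    "bright room, 400-nit SDR white": (400, 20),
}
for name, (screen, ambient) in setups.items():
    print(f"{name}: screen/ambient ratio = {screen / ambient:.0f}x")
# Both land at ~20x, which is the point: a ~4x brighter room needs ~4x brighter
# SDR white just to look the same to adapted eyes.
```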
 
FALD active: 3800:1 on a high-brightness checkerboard to 4800:1 on a low-brightness checkerboard... up to 5000:1 in a real scene. (The native contrast is 1300:1 with FALD off, which is recommended for accuracy in SDR/apps usage as FALD is not accurate.) Still, even 5000:1 is very poor compared to OLED per pixel or any per-pixel emissive tech in the future. My 4K edge-lit TV does 6100:1 full-field contrast (not checkerboard, but still poor compared to OLED's "infinite":1 contrast black depth side by side, razor's edge, pixel by pixel). FALD's contrast and blacks can only be achieved in areas distant from each other, not at contrasted ones. Yet during a dynamic scene in media, and in FoV movement in games, you are constantly puddle-jumping across zones.

Hardware Unboxed, ASUS ProArt UCX: "Brightness, Contrast, Uniformity"





When a FALD screen gets a challenging mix of HDR brights and HDR darks in a scene, which happens throughout a lot of scenes, a comparatively "high" (for now) 1400 nit+ screen like the UCX only gets around 5000:1 contrast in those "difficult" mixtures/mappings of areas. Its native, non-FALD contrast is under 1300:1. In a low-brightness checkerboard test with FALD active it gets only a 4500:1 contrast ratio on those patterns, and on a high-brightness checkerboard it gets merely 3800:1. That's extremely poor, especially compared to an OLED's "infinite", or at least "ultra", black depth per pixel next to any color level it provides, as it applies to mixed parts of scenes. The FALD UCX also blooms when bright edges defined against dark backgrounds (e.g. bright Steam interface thumbnails, Plex/Emby covers on a dark background, crosshairs, even the mouse cursor can bloom slightly) straddle anywhere across a 7,000-pixel (4K) zone (or what would be a 15,000-pixel zone on 8K screens). Let alone those kinds of edges across HDR's 400 - 1400 nit ranges, light to dark.

While you aren't always watching an actual checkerboard, the same thing really happens any time an area of bright + dark lands in the middle of cells. 7k to 15k pixels is a big field, and HDR is a big range to attempt to smooth the larger brightness vs. darkness range across. Tons of scenes have doorways, stairwells, darker areas cast by hair, hats, tables, trees/foliage, geology and architecture, clothing/robes/dresses, curtains casting darker areas in otherwise bright areas. Also modest-brightness or dim room areas with bright window(s), bright artificial lights in dark areas, bright (e.g. space) ships and lasers, dragon fire and spells/magic, lightsabers, starfields in space and dark areas, the light reflected off of people's eyeballs and metals in darker scenes, isolated flame, etc. Even light cast across detailed textured objects, with the dark pixels creating detail in an otherwise bright object.

And these aren't slides; they are dynamic scenes in media and games. While scenes can pause at times, those zones are otherwise being panned across all of the time as camera cinematography changes the scene's camera angles/shots or pans, or a game's virtual cinematography does the same, or the player mouse-looks, uses movement keys, or pans with a controller. So you are landing on the fence zone-wise randomly in the heat map of more static scenes, and you are jumping the fences constantly in dynamic scenes' flow and in gameplay.
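To put those ratios in perspective, the contrast ratio sets a floor on how dark the "black" parts of a mixed scene can be next to a given white level. The 1,000-nit white level below is an assumed example, not a measurement from that review:

```python
# Illustrative: what a given checkerboard contrast ratio implies for black level.
white_nits = 1000.0   # assumed example white level, not a measured figure
for label, ratio in [("FALD high-brightness checkerboard", 3800),
                     ("FALD low-brightness checkerboard", 4500),
                     ("native LCD, local dimming off", 1300)]:
    black_nits = white_nits / ratio
    print(f"{label}: black floor ~{black_nits:.2f} nits next to {white_nits:.0f}-nit white")
# A per-pixel emissive panel can hold those same adjacent pixels at ~0 nits,
# which is why the "infinite:1" comparison keeps coming up.
```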

https://tftcentral.co.uk/reviews/asus_rog_swift_pg32uqx
Back to 1300:1 contrast in SDR? Yuck.
"Note that you don’t really want to have the variable backlight operating for general desktop use, that’s better saved for HDR games and videos. For SDR general and office use just turn variable backlight off in the OSD menu we would recommend. If you use the FALD variable backlight for colour critical work then it will lead to additional inaccuracies than cannot be avoided with LCD local dimming backlights, which we explained in more detail in our LG 32EP950 OLED review here. You don’t need the local dimming for SDR content anyway."

The cells aren't small enough. The firmware chooses to lighten or darken the entire cell of 7,000 pixels by the same amount, toning its lighting. It can't "split hairs", not even close. Actually, the opposite is true: it can end up toning more cells, the surrounding cells of a contrasted area, in a type of jumbo lighting sub-sampling/anti-aliasing effect, so even more cells are affected in an attempt to make the transitions less abrupt. That's a quiltwork of sideways ice-cube trays, 7k to 15k pixels each, so the lighting is not uniform and you get lighter or darker patches, or very, very low-res lighting-array brightness "gradients" spanning contrasted edges, with the lighting levels smoothed across a width of zones along borders - almost like how text is subsample-smoothed, except each cell being smoothed is a jumbo-sized 7k pixels, analogous to a "subpixel".

It can bloom, or sort of reverse-bloom and darken cells that shouldn't be, but otherwise that smoothing and toning method, if not overtly exhibiting bloom/dim "halos", can spread a lighting-smoothing effect out across multiple zones when it's able to, depending on the scene elements, so that in those areas it's not as abrupt. It's a clever system, but it's toning any contrasted area's cells against each other all of the time, like AA or text sub-sampling in a jumbo grid of 7k pixels per cell, whether that's outright bloom/dim haloing or "anti-aliasing" the huge lighting zones against each other. So you can say "it's not blooming much" in this content, but it's lifting the black depths on some zones and lowering the brightness/color volume of others dynamically, in a wider backlight-array scar wherever high-contrast areas meet. It has to. It's only a 45x25 lighting resolution at best.
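A toy version of that zone-toning idea, just to show the mechanism I'm describing (a generic textbook-style approach, not the actual firmware of any monitor): drive each zone from the brightest content inside it, then blend the zone grid so adjacent zones step toward each other instead of changing abruptly.

```python
# Toy local-dimming sketch (not any real monitor's algorithm): one backlight
# value per zone, then a neighbor-averaging pass to soften zone transitions.
def zone_backlight(frame, zones_x=45, zones_y=25, smooth=0.5):
    """frame: 2D list of pixel luminance values in nits.
    Returns a zones_y x zones_x grid of backlight drive levels (0..1)."""
    h, w = len(frame), len(frame[0])
    peak = max(max(row) for row in frame) or 1.0
    # 1) each zone is driven by the brightest pixel it contains
    grid = [[0.0] * zones_x for _ in range(zones_y)]
    for y in range(h):
        for x in range(w):
            zy, zx = y * zones_y // h, x * zones_x // w
            grid[zy][zx] = max(grid[zy][zx], frame[y][x] / peak)
    # 2) blend each zone toward its neighbors so edges span several zones
    smoothed = [row[:] for row in grid]
    for zy in range(zones_y):
        for zx in range(zones_x):
            neighbors = [grid[ny][nx]
                         for ny in (zy - 1, zy, zy + 1)
                         for nx in (zx - 1, zx, zx + 1)
                         if 0 <= ny < zones_y and 0 <= nx < zones_x]
            smoothed[zy][zx] = ((1 - smooth) * grid[zy][zx]
                                + smooth * sum(neighbors) / len(neighbors))
    return smoothed
```

The tradeoff I'm describing falls straight out of step 2: a dark zone next to a bright one gets pulled up (lifted blacks) and the bright one gets pulled down (dimmed highlight), over a band several zones wide.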

OLED has its own cons of course, and they are just as considerable (among others, ABL, though the 2,000+ nit Samsung QD LED LCDs also have aggressive ABL). I don't think any level-headed person is disputing that both technologies have major tradeoffs. Both have huge tradeoffs, but I prefer per-pixel emissive with the current tech on non-modified HDR curves, rather than self-distorted curves that would artificially trigger ABL more often. Currently you can only get per-pixel emissive at enthusiast prices in an OLED. Per-pixel emissive is a better way to display things, period. MicroLED will be per-pixel emissive ultimately. FALD is a hack we have for now because we can't do better in LCDs. It's clever and makes the most of what it can do, but it will never compare to a per-pixel emissive display method as a technical way to do things, OLED or not.

If you like the overall pros vs. cons of the large zones rather than the pros and cons of per-pixel emissive, that's fine, but that's not how everyone values things. OLED is inky black and color lighting down to the razor's-edge pixel level, splitting hairs side by side by side on over 8 million pixels throughout the entire screen. No matter what, there are 3840x2160 pixels, each with its own level independent of all of the other pixels. ABL kicks them down momentarily in some dynamic scenes, which is occasional on non-deformed HDR curves, but 2,000+ nit FALD LCDs aren't immune to that either so far. The higher-density FALDs are a lighting resolution of 45x25 cells at best, probably less grid-wise if you don't count the edge lights. The fact that there are really no glossy options on FALD LCDs just makes it worse, as matte-type AG hit by ambient lighting raises black levels on any screen type, so the end result is even higher than what the FALD array does alone, and it can compromise small details a bit with its sheen/frost haze when "activated" by light hitting it and reflecting back. A glossy OLED is a gorgeous picture.

Idk if I could ever switch to a haze-coated FALD's patchwork of very low lighting resolution, with its halos and low-density lighting compensations as zone gradients/AA, personally. I know I'd regret it. I will probably be on OLED for media and gaming until OLED is replaced by a per-pixel emissive tech like microLED, when that reaches enthusiast consumer prices. I'm also keeping an eye on microOLED goggle-like form-factor VR generations with pancake lenses, varifocal lenses, HDR, and slightly better PPD, though VR's PPD is going to be very poor for my tastes overall for many years yet, I think. VR is going to microOLED in the next gens and later on it will probably go to microLED some year. So for VR it's looking like it is all per-pixel emissive going forward. No one will ever go back to FALD if they can help it. It's a temporary compensation/masking method like text sub-sampling for low pixel density, but much worse because it's so large. Once the lighting resolution, or the display resolution ~ PPD, is high enough you won't need any of those hacks anymore. 8K is getting close on the text-ss/AA front. Per-pixel lighting resolution is the goal for display tech overall.
 
Just to be absolutely clear: Do you understand ANSI contrast measurement has nothing to do with measuring contrast between individual pixels?
 
It's jumping those zones constantly in dynamic material, so it is non-uniform. Those tests show the discrepancy. It's bad enough that it's recommended you turn FALD off for 2D SDR desktop work.

The first graph I linked was his test of an actual HDR contrasted scene, 10% window, and the second one was a combination of two HDR checkerboard tests. The bright checkerboard test at ~4490:1 tracked similarly to the real HDR scene frame 10% window one (4990:1).

The lighting resolution of FALD is large: 7,000-pixel blocks each, 15,000 per block at 8K, a 45 x 25 lighting resolution. Worse, it offsets the surrounding blocks in an attempt to prevent blooming.
 
"The easiest way to explain why ANSI contrast is the better way to measure how good the contrast is for a projector, ANSI contrast refers to the difference between black and white when they coexist in the image. In other words very different from setting up a black image, measure, setting up a white image, measure and divide the result on white with the result on black. With ANSI, you measure black and white in the same image together."

"ANSI contrast measures contrast in a different way, it uses a checkerboard pattern of eight white and eight black squares and measures the ratio between the average brightness of the white squares and the average brightness of the black squares. The underlying principle behind ANSI contrast is that audiences almost never view images that are solid black or solid white (as with the images used to measure FOFO contrast), instead they view images that have countless gradations of brightness across the frame. In such circumstances the light from brighter sections of the image undoubtedly mixes and blends with the dark sections of the image. ANSI contrast therefore tries to express this reality with its simultaneous use of white and black boxes".

HDR scenes are full of contrasted areas, as I said in my larger (repeated) reply. Yet per-pixel emissive doesn't care whether it's an HDR scene in a brighter area with shadowed curtains/dresses/robes, people's hair/hat/mask, a stairwell, under a table, detail/relief in rockwork, machinery, textures, whatever - or a checkerboard - because it can emit each pixel individually.

 
And this is why I'd advise everyone else in this thread to give up with this dumb cyclic argument you keep pushing and block you.
Funny. You can block me all you want. In the end it's you who is seeing much worse images on OLED, without realizing what HDR really is.
 
lol. You absolutely changed the debate. You said it wasn't an HDR monitor initially, and when challenged with...you know...all the reviews reviewing the HDR, specs listing HDR, all of a sudden demanded it be certified to count. That's not how it works. It's an HDR monitor. Fruit is still fruit if it's not certified organic. I never argued it met any certifications/standards. I simply said it's an HDR monitor. Just because that sun doesn't meet certain nits doesn't mean smaller highlights don't go higher. It's established it can hit far higher than that.
I've always said it's not an HDR monitor. You get a 360-nit sun that drops below the lowest requirement of HDR400. You can call it HDR300 or HDR100 that looks even worse than SDR.

sRGB is made for ~100 nits in a pitch black room (some, like me prefer brighter at about 160 to meet room conditions and personal preference) - that you can't see at that brightness is nonsense. And it's more accurate closer to reference brightnesses.
sRGB is made for 80 nits and looks the same on office monitors. You must like buying a monitor to see such a limited range all the time.

I'm not sure if it extends to the PS5 (I think so, but not 100%), but Vincent over at HDTVTest said XBOX calibration for HDR on this panel is not recommended since HGiG is not supported (unfortunately - I guess that's common with most monitors vs TVs, so I don't know how you'd do it well with a console). I'm not sure if in that video any calibration issues are coming into play since it is a console. All I can tell you is my experience in HDR games on PC is that it's not dim and the colors look great. *shrugs* That said, I'm also in a very dark room and testing a game with good HDR customization settings for my display. I'm not saying this person shouldn't be unhappy with the monitor - sounds like he might be in a brighter room, and if he thinks it's too dim for his uses, that's valid and he should probably look for something else he likes better. It might not be a great match for consoles either. But for me on PC, the brightness is fine.
Speaking like this only means you don't understand how calibration works on an actual HDR monitor. You cannot calibrate HDR on this monitor. You can only shrink the dynamic range to prevent clipping, such as by using the Windows calibration tool. That's not even calibration.
You should realize how ridiculous it is that on HDR content the 1,000-nit brightness is tone-mapped down to the point of being dimmer than SDR. That's why people keep saying SDR is brighter than HDR even on this monitor.

They are in stock again, but weren't just a few days ago. It's a brand new monitor. It'll take time to see actual sales figures. Either way, there's a lot of excitement about a 27" OLED option and more variants to come from companies like ASUS. That wouldn't happen if they didn't think it'd sell. My only real concern with this monitor is burn in, and it'll take time to see if that's an issue.
All of them are in stock in the East Asian market all the time. OLED simply doesn't sell. ASUS is using the same panel. It won't be an HDR monitor either.
 
... The fact that there are really no glossy options on FALD LCDs just makes it worse, as matte-type AG hit by ambient lighting raises black levels on any screen type, so the end result is even higher than what the FALD array does alone ... A glossy OLED is a gorgeous picture.

Glossy is even more of a problem than anti-reflective coating if there's light in the room. My room isn't dark most of the time when I use my monitors, so anti-reflective coating works much better than glossy.
 
The native contrast is 1300:1 with FALD off, which is recommended for accuracy in SDR/apps usage as FALD is not accurate.
You should've realized FALD is made to boost contrast for better accuracy even in SDR.

There is not just SDR sRGB at 80 nits. There is Adobe SDR at a higher range that OLED cannot even reach. In SDR, where it needs 400 nits against 0.1 nits, OLED will lose more contrast. When the dynamic range gets higher, the overall accuracy of FALD will easily surpass OLED. Remember, OLED cannot even be used to make HDR. The dynamic range shrinks 3x-5x compared to FALD.
 
I don't know, using an OLED for a week now, it looks pretty epic to me in every way. I had some teething problems for the first few days but I've gotten used to it; it's been over 9 days now.
I used to get pretty bad eyestrain using larger 27"+ LCD monitors. I owned two IPS monitors; I have no idea why IPS became more popular than VA, it just happened that way.
On IPS monitors I had crushed blacks and couldn't see a damn thing in some games, so I opted for a VA panel for like 2-3 years; basically perfect except for the size, which I was kinda forced into because of the harsh LED lights on 27" screens. If the materials are out there to mass produce OLED screens, they are here to stay; thank god I don't have to stare at LED lights while I game now. I still use a 21.5" LCD as my primary monitor just because it's small. The problem I had with IPS was that in order to get rid of all the blacks on the screen I had to turn the monitor up super bright, which is why I thought VA panels were better.
 
I don't know, using an OLED for a week now, it looks pretty epic to me in every way. I had some teething problems for the first few days but I've gotten used to it; it's been over 9 days now.
I used to get pretty bad eyestrain using larger 27"+ LCD monitors. I owned two IPS monitors; I have no idea why IPS became more popular than VA, it just happened that way.
On IPS monitors I had crushed blacks and couldn't see a damn thing in some games, so I opted for a VA panel for like 2-3 years; basically perfect except for the size, which I was kinda forced into because of the harsh LED lights on 27" screens. If the materials are out there to mass produce OLED screens, they are here to stay; thank god I don't have to stare at LED lights while I game now. I still use a 21.5" LCD as my primary monitor just because it's small. The problem I had with IPS was that in order to get rid of all the blacks on the screen I had to turn the monitor up super bright, which is why I thought VA panels were better.
It is only "epic" in SDR sRGB where the brightness is low. If you've bought a little brighter AW3423DW you will get eyestrain even more as OLED is always flickering. So you have to turn down the brightness even lower to see limited range. Only good DC dimming monitors don't have flickers so you can see higher range without eyestrain.

You talk about edge-lit vs. OLED. An edge-lit backlight won't give deep blacks anyway; it just puts out more light.
 
Glossy is even more of a problem than anti-reflective coating if there's light in the room. My room isn't dark most of the time when I use my monitors, so anti-reflective coating works much better than glossy.

You could benefit from 400-nit SDR as a compensatory measure because your room is bright; it would then look to your eyes as if it were 80 - 100 nits again, due to the relative way our eyes adapt.

However, AG is going to be polluted by lighting regardless, just in blobs and ghosted forms instead. It will also raise the black levels and can compromise small details in its frost/haze when it's "activated" by ambient light splashing off of it. A bright room isn't the proper environment for viewing HDR media and gaming either, so those levels will be compromised too, just like in the SDR example.



--------------------------------

This has been argued in hardforum threads many times. Here is how I see it.

====================

Think of it like a light haze on clear "dry" ice vs. ultra clear wet ice.

Direct light sources hitting a screen are going to pollute the screen surface regardless. Some (many?) people are using their screens in poor setups. It's just like audio or photography - you should set up your environment to suit your hardware/screen, not the other way around, imo.

Like I said, improper lighting conditions and room layouts that allow direct light sources to hit a screen surface are going to pollute the screen regardless, as shown in the images below.
[Images: examples of direct light sources washing out matte/AG screen surfaces vs. glossy (image credit: vega), plus a matte-vs-glossy comparison shot]

Since desks have traditionally been laid out up against the wall, like a bookshelf or an upright piano with sheet music, most setups act like a catcher's mitt for direct light-source pollution from behind and overhead. Professional/reference monitors often come with a hood that covers the top and some of the sides, like some production cameras have. Light pollution (as well as allowing lighting conditions to change throughout the day) will alter/pollute how even a calibrated screen's values are seen and perceived.

The direct light source vectors hitting matte or AG screens will blow out contrast and pale the saturation, washing out the areas of the screen they hit and are diffused onto. Allowing lighting conditions to change will also alter the way our eyes/brain perceive the screen's contrast and saturation, so even its "calibrated" values will be lost to your eyes and brain. E.g. the screen will look paler, more weakly contrasted, and undersaturated the brighter the room gets, and vice versa. Some people keep several sets of settings so they can switch between them for different times of day or different room lighting conditions. So you are going to get compromised results if you don't design your viewing environment more optimally, no matter what screen coating you have.

. . . . . . . . . .

From TFTcentral review of the PG42UQ:

The PG42UQ features a more traditional monitor-like matte anti-glare coating, as opposed to a glossy panel coating like you’d find on TV’s including the LG C2. This does a very good job of reducing reflections and handling external light sources like windows and lamps and we noticed much better reflection handling (no surprise) than the LG C2. However this does mean that in some conditions the blacks do not look as deep or inky visually to the user. With this being an OLED panel, famous for its true blacks and amazing contrast ratio this could be considered a problem – are you “wasting” that by having an AG coating that reduces your perceived contrast?
.
In certain conditions blacks look a little more dark grey as the anti-reflective coating reflects some of the surrounding light back at you and it “dulls” the contrast a bit. The anti-glare coating means the image is not as clear and clean as a fully glossy coating. You don’t get this same effect if the coating is fully glossy as there’s no AG layer, but what you do get instead is more reflections. Don’t forget this same thing applies to all AG coated desktop monitors, you have the same impact on perceived black depth and contrast on IPS, TN Film and VA panels depending on your lighting conditions if there’s an AG coating used. You’d still get better relative blacks and contrast on the OLED (not to mention other benefits) compared with LCD technologies. They are all impacted in the same way by their coatings.

While they are concentrating on how it affects the blacks, which is bad enough, it can also degrade the color saturation as it appears to your eyes/brain, since it creates a haze.

===============================================

https://arstechnica.com/gadgets/202...-more-reflections/?comments=1&comments-page=1

Of course you are seeing the picture below on whatever screen surface you are using at the moment, so it's more of a simulation. ;)

[Image: matte vs. glossy comparison photo]
. . . . . .

https://euro.dough.tech/blogs/news/matte-vs-glossy-gaming-monitors-technology-explained


[Images from the euro.dough.tech article: matte vs. glossy comparison shots, plus the subpixel macro photo they referenced from TFTcentral]
 
It is only "epic" in SDR sRGB where the brightness is low. If you've bought a little brighter AW3423DW you will get eyestrain even more as OLED is always flickering. So you have to turn down the brightness even lower to see limited range. Only good DC dimming monitors don't have flickers so you can see higher range without eyestrain.

You talk about edge-lit vs. OLED. An edge-lit backlight won't give deep blacks anyway; it just puts out more light.

The Alienware was interesting, but I knew I wouldn't like it due to the size since I wear glasses. I know the LG uses MLA (Micro Lens Array), which is brand spanking new OLED technology.
 
You could benefit from 400-nit SDR as a compensatory measure because your room is bright; it would then look to your eyes as if it were 80 - 100 nits again, due to the relative way our eyes adapt.
Completely false. View 400-nit Adobe SDR in a dim room with FALD local dimming and it's very close to HDR400 with boosted contrast. It looks nothing like 80 nits. It's the FALD monitors that can boost contrast at a higher range, not the OLED.
 



Is this what you guys are talking about? I noticed this in Hogwarts Legacy but only in the loading screen.
 
The Alienware was interesting, but I knew I wouldn't like it due to the size since I wear glasses. I know the LG uses MLA (Micro Lens Array), which is brand spanking new OLED technology.
MLA is just a layer of per-subpixel micro-lenses that focus more of the light outward. It doesn't boost brightness that much on the 27GR95QE because the initial output brightness is too low.

That's why HDR on this monitor can be dimmer than SDR. It won't look much better than SDR anyway.



Is this what you guys are talking about? I noticed this in Hogwarts Legacy but only in the loading screen.

OLED flicker is different from VRR flicker. It's invisible, and it happens all the time, with or without VRR.

There is a brief near-black interval at the start of each frame. It doesn't even matter if the OLED uses DC dimming, because it's not DC enough to eliminate that dark interval; it always flickers.
Once the brightness is higher you will get eyestrain easily. It has never been fixed.
[Images: OLED per-refresh flicker waveform capture and animation]
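As a rough illustration of why a per-refresh dark interval matters for time-averaged brightness, here's a minimal sketch; the refresh rate, dip length, and luminance used are hypothetical examples, not measurements of any panel:

[CODE=python]
# Minimal sketch of the per-refresh dark-interval idea described above.
# The dip length and luminance are purely illustrative (hypothetical values).
refresh_hz = 120
frame_ms = 1000 / refresh_hz     # ~8.33 ms per refresh cycle
dark_ms = 0.5                    # hypothetical near-black dip per cycle

duty = 1 - dark_ms / frame_ms    # fraction of each cycle spent lit
peak_nits = 200                  # example steady-state luminance
avg_nits = peak_nits * duty      # time-averaged brightness the eye integrates

print(f"Frame time        : {frame_ms:.2f} ms")
print(f"Lit fraction      : {duty:.1%}")
print(f"Averaged luminance: {avg_nits:.0f} nits (vs {peak_nits} nits peak)")
[/CODE]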


So your solution is just to turn down the brightness and accept a limited range on OLED. You can get 100 nits vs 0.01 nits, or 100 nits vs 0.001 nits, to see better blacks under SDR. But you cannot see the higher range of 400 nits vs 0.01 nits, or 1000 nits vs 0.01 nits.
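Working those luminance pairs out as simple contrast ratios (a sketch only, to show how raising peak brightness over the same black floor multiplies the range):

[CODE=python]
# Contrast ratios for the white/black pairs quoted above (peak nits / black nits).
pairs = [(100, 0.01), (100, 0.001), (400, 0.01), (1000, 0.01)]

for white, black in pairs:
    print(f"{white:>4} nits vs {black} nits -> {white / black:>9,.0f}:1")
# 100 vs 0.01  ->  10,000:1
# 100 vs 0.001 -> 100,000:1
# 400 vs 0.01  ->  40,000:1
# 1000 vs 0.01 -> 100,000:1
[/CODE]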
 