Why OLED for PC use?

Direct light hitting a matte or AG screen will blow out contrast and wash out saturation in the areas it hits and is diffused across. Letting the lighting conditions change will also alter the way your eyes/brain perceive the screen's contrast and saturation, so even "calibrated" values will be lost on you: the screen will look paler, weaker in contrast and undersaturated the brighter the room gets, and vice versa. Some people keep several sets of settings so they can switch between them for different times of day or different room lighting. So you are going to get compromised results if you don't design your viewing environment optimally, no matter what screen coating you have.

Having direct light hit your monitor is obviously bad. I don't have that - there's no light source in front of my monitors. Still, with a glossy screen, the reflections from the room have a much worse effect on the image than with an AG coated screen. The glossy screen acts like a mirror. AG coating is used on monitors for very good reasons.
 
Is there a video for that animated gif comparing the Asus vs the LG OLED?



Funny, the same video link has been posted multiple times. Guys really need to keep track of the information already posted. You need a high-speed camera to capture it, or you can just use your phone with a fast shutter speed to record the flicker.
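For anyone trying that, here's a rough back-of-the-envelope for the camera settings - just a sketch, assuming the flicker is tied to the panel refresh (the ~120 Hz dip below is a hypothetical example; the real frequency depends on the monitor):

```python
# Rough rule of thumb for catching display flicker with a camera.
# The flicker frequency here is an assumed example; measure or look up your panel's.
def capture_settings(flicker_hz: float, cycles_per_frame_limit: float = 0.25):
    # To see the brightness dip at all, each exposure should cover only a
    # fraction of one flicker period, otherwise the dip averages out.
    max_shutter_s = cycles_per_frame_limit / flicker_hz
    # To reconstruct the waveform over time, sample well above the Nyquist
    # rate (2x), ideally several samples per cycle.
    min_fps = 4 * flicker_hz
    return max_shutter_s, min_fps

shutter, fps = capture_settings(120.0)   # assumed 120 Hz refresh-linked dip
print(f"shutter <= 1/{round(1 / shutter)} s, capture >= {fps:.0f} fps")
```

In other words, you want something on the order of a few hundred fps, or at least a shutter time well under one flicker period, which is why a normal auto-exposure video won't show it.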



The same goes for the AW3423DW. It wasn't even bright. It's pathetically dim, yet it gives eye strain at only 250 nits.
 
Having direct light hit your monitor is obviously bad. I don't have that - there's no light source in front of my monitors. Still, with a glossy screen, the reflections from the room have a much worse effect on the image than with an AG coated screen. The glossy screen acts like a mirror. AG coating is used on monitors for very good reasons.


Everything is tradeoffs, so if that's what you like best, it's what you like best. However, matte-type AG in ambient lighting rather than dim-to-dark viewing conditions will in effect activate the AG layer, lifting the blacks even without direct light sources hitting it. HDR is also best viewed in dim-to-dark conditions, again because our eyes view everything relatively, so room light changes the way we perceive brightness, contrast and saturation. Letting your room lighting change throughout the day will also throw off how your eyes perceive everything. Direct sources wash out areas of the screen more, in blobs/bars etc., so at least you are avoiding that.

A matte-type AG screen in ambient lighting will never look as wet and clear as a glossy screen, or as dark. No matter the panel type, the matte AG surface treatment has this effect. The worse the contrast of the zones or the black depth of the native display type (VA, IPS, TN), the more lifted the end result will be, since it's starting from a worse point - but even OLED's blacks are lifted toward grey on matte screens unless they are viewed in dim-to-dark ambient environments. Unfortunately most if not all of the OLED gaming monitors (as opposed to gaming TVs) are matte-type AG afaik, so in effect they have worse black depth to your eyes and brain than the glossy ones whenever there is ambient lighting rather than dim-to-dark, movie-type viewing conditions. All of the FALDs also have matte-type AG screen surface treatment, which is a nice way of saying their outer layer has been scratched/abraded. That's why I consider it yet another tradeoff given what we have available currently, though it really shouldn't be if there were more options. That is why I brought it up in this thread, but it's an age-old tradeoff and there are many threads about it. It still affects black depth and clarity on OLEDs that have matte-type AG, so it's still a problem/tradeoff.

Most reference monitors come with a hood you can install, because the manufacturers know light pollutes the screen space as well as altering how your eyes perceive things.

Even the UCX and the UCG come with one in the box.






From the TFTcentral review of the PG42UQ, but it applies to all screens with AG:

The PG42UQ features a more traditional monitor-like matte anti-glare coating, as opposed to a glossy panel coating like you’d find on TV’s including the LG C2. This does a very good job of reducing reflections and handling external light sources like windows and lamps and we noticed much better reflection handling (no surprise) than the LG C2. However this does mean that in some conditions the blacks do not look as deep or inky visually to the user. With this being an OLED panel, famous for its true blacks and amazing contrast ratio this could be considered a problem – are you “wasting” that by having an AG coating that reduces your perceived contrast?
In certain conditions blacks look a little more dark grey as the anti-reflective coating reflects some of the surrounding light back at you and it “dulls” the contrast a bit. The anti-glare coating means the image is not as clear and clean as a fully glossy coating. You don’t get this same effect if the coating is fully glossy as there’s no AG layer, but what you do get instead is more reflections. Don’t forget this same thing applies to all AG coated desktop monitors, you have the same impact on perceived black depth and contrast on IPS, TN Film and VA panels depending on your lighting conditions if there’s an AG coating used. You’d still get better relative blacks and contrast on the OLED (not to mention other benefits) compared with LCD technologies. They are all impacted in the same way by their coatings.

While they are concentrating on how it affects the blacks, which is bad enough, it can also degrade the color saturation as it appears to your eyes/brain, since it creates a haze.



I just came back from a Greek sit-down/take-out restaurant that had a salad-bar-like long counter where you choose your toppings as you go down the row. They were all covered on top and on the sides with clear glass. Where the recessed lights were overhead there was a bad reflection, but other than that the various colors of very colorful foodstuffs behind the glass looked great, and I wasn't noticing reflections unless I made a point to look for them and focus on them consciously. If they had an AG coating on there, it would not have looked as clear. Same with car windshields, or my tablet, or my phone, or any of my glass jars full of spices. I also used to be into saltwater reef aquariums, which is an even better analogy since they have bright lighting inside and are full of colorful things on display. The best kind of aquarium you could get was made with "starphire" glass, because it's low-iron and so looks clearer/higher clarity. I wouldn't want an AG scratch/abrasion treatment on my colorful reef aquarium any more than I would want it on my colorful screen - but at least matte-type AG screens won't be activated as obnoxiously in dim-to-dark viewing environments, and HDR is made for that environment in the first place.
 
Glossy is even more of a problem than anti-reflective coating if there's light in the room. My room isn't dark most of the time when I use my monitors, so anti-reflective coating works much better than glossy.

Don't let that elvn guy fool you with his useless quotes combined with tricks and hallucinations.

The manufacturers choose matte because not everybody has a dim room like us. When the ambient light rises even a little, like 25 lux, the glossy panel already loses way more contrast. At 125 lux the glossy has only 660:1 contrast while the matte is still at 9000:1. If they used glossy, the 27GR95QE would look even worse with just a little ambient light.
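For what it's worth, the mechanism behind numbers like that can be sketched with a toy model: reflected room light adds roughly the same luminance to white and black alike, which barely moves the white level but swamps the black level. The panel and reflectance figures below are illustrative guesses, not measurements of any specific coating, and real matte vs glossy behaviour also depends on specular vs diffuse geometry:

```python
import math

def effective_contrast(white_nits, black_nits, ambient_lux, reflectance):
    """Toy model: treat the screen as a diffuse reflector, so the reflected
    luminance is roughly illuminance * reflectance / pi (in cd/m^2)."""
    reflected = ambient_lux * reflectance / math.pi
    return (white_nits + reflected) / (black_nits + reflected)

# Hypothetical panel: 450 nit white, 0.45 nit black (1000:1 native contrast)
for lux, refl in [(0, 0.05), (125, 0.01), (125, 0.05)]:
    print(f"{lux:>3} lux, reflectance {refl:.2f}: "
          f"{effective_contrast(450, 0.45, lux, refl):,.0f}:1")
```

The more of the room's light the surface sends back toward your eyes, the faster the effective contrast collapses - which is also the mechanism behind the "raised blacks" complaint about coatings in ambient light.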



When you sit in a room dim enough to sleep in, there isn't much difference between matte and glossy due to zero reflection. But there is tons of difference between 400 nits and 80 nits. Elvn has always recommended you sit in a dim room to imagine 80 nits as 400 nits, while we've always used HDR monitors in a dim room to see 400 nits as the actual 400 nits, or 1000 nits as 1000 nits, instead of looking at 80 nits and imagining higher brightness lol.
 
Our eyes+brain work relative to the ambient lighting environment, so 300-400 nits in a bright viewing environment goes back to looking more like 100 nit SDR would in a dim-to-dark viewing environment. That's why reviews give high marks to SDR that does 300-400 nits: it can compensate for the effect on your eyes of viewing TVs in a bright room, not because you should torch the SDR/low ranges in a dim-to-dark viewing environment.


It's the same reason a phone or tablet blasted at 100% brightness at night in a dark bedroom is harshly bright, but in bright daylight it can look pale by comparison. Or doing the same with a flashlight, etc. Our eyes work in a relative fashion and adapt to lighting conditions. It's another reason why HDR, as you know, is made to be viewed in dim-to-dark conditions in the first place - the same environment that also helps with the AG coating tradeoff.

Even if you calibrate your screen or have one with good factory calibration out of the box, those values will all swing, as far as your eyes and brain are concerned, when you let the room lighting conditions change.

The professional Dolby Vision mastering people on Dolby's site map average HDR 10,000 scenes as 50% at ~100 nits, 25% at 100 to 1,000 nits, and 25% at 1,000 to 10,000 nits - toned down/compressed into whatever screen you are using at the screen's top end. They aren't remapping the 0-100 nit (or even 200 nit) portion of the scene up to 400 nits just because a TV can do 400 nit SDR to combat bright viewing conditions, and then using that torch in dim-to-dark conditions instead. I don't care what you are doing with your own blind reshade filters - knock yourself out, but don't claim it's gospel for dim-room viewing conditions.
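To make the "compress the top end, leave the bottom alone" idea concrete, here's a toy roll-off - not any particular standard's curve (real EETFs like the one in BT.2390 are smoother around the knee), and the knee fraction and peak values are just assumed examples:

```python
def tone_map(nits, display_peak, content_peak=10000.0, knee_frac=0.75):
    """Toy HDR roll-off: pass everything below the knee through 1:1 and
    squeeze the rest of the content range into the remaining headroom."""
    knee = knee_frac * display_peak
    if nits <= knee:
        return nits                           # low/mid range untouched
    excess = nits - knee
    headroom = display_peak - knee
    # crude compression of [knee, content_peak] into [knee, display_peak];
    # real curves blend into the knee more gently
    return knee + headroom * (excess / (content_peak - knee)) ** 0.5

for scene_nits in (100, 600, 4000):
    print(scene_nits, round(tone_map(scene_nits, 800)), round(tone_map(scene_nits, 1400)))
```

The 100 and 600 nit values pass through unchanged on both the assumed 800 nit and 1,400 nit displays; only the 4,000 nit highlight lands differently, which is the point above - brighter screens compress less of the top end rather than lifting the bottom.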

Showing that FALD goes brighter for HDR's high end - no one is disputing that; it's a big tradeoff vs per-pixel emissive though. Repeat ad nauseam.

Glossy will always be clearer than scratching and abrading your outer screen surface; AG just won't get the sheen from reflected room light if you view it in dim to dark, where HDR should be viewed in the first place anyway. Ironically, people typically get AG to use in poor lighting conditions because they are unwilling or unable to use the screen in an optimal environment - so they get the tradeoffs. The raised-blacks thing wouldn't matter as much to me since I view HDR in dim-to-dark conditions. My laptop is AG, which was a tradeoff for the other specs, but I wish it were glossy like the tablet I use next to it. Even in dim conditions the layer looks frosty by comparison. Glossy is better than scratching your screen surface, even with tiny scratches.
 
I don't think he's able to understand; pages of explanation made no difference.
Funny how several quotes cut from somewhere else can trick you into seeing 80 nits as bright as 400 nits in the same viewing environment.
 

Your imagination is off the cliff. AG coating is made to sustain much more contrast than glossy when the ambient light is a little higher.

And 1000 nits is not harsh on the eyes when a monitor is flicker-free, even in a pitch-black room. Or is it because your eyes and brain are too old to stand the contrast?

People have always been viewing 1000+ nit flicker-free HDR highlights in a pitch-black room without any problem. How funny that you are trying to say 80 nits can be like 400 nits. It's more like OLED flickering keeps you from seeing any higher, "harsh" 1000 nits lol.
 
Yeah, I gotta be done with the repetitive back and forth; sorry for any contribution to derailing this thread, as I really wanted to argue my point, but I feel like I've done that well, and all I'm getting back is recycled, disproven, improper-use-case, or misinformed stuff. If I have anything new to add, I may, but at this point it's the same false arguments, misrepresentations, and unwillingness to see that all current tech has compromises and it's up to the user to choose what suits them best, according to preference, given the very real tradeoffs all techs have. I'm just going to enjoy having a great 27" HDR OLED that's working great for my situation. Maybe I'll give FALD monitors another try some day as the technology improves, but it's not there yet for my uses, so this is a better fit at the moment. I'll also be interested to see how it fares as far as burn-in, given my original OLED experience years ago. I wonder if either technology will be the future, or if there will be some sort of hybrid or something new altogether once we finally get something that does everything well.

By the way, I've found all the technological discussions (the ones with substance) really interesting in this thread. =)
 

You must be an OLED representative, the way you keep advertising it while secretly undermining FALD. Just because you talk as diplomatically as ChatGPT doesn't mean you are not doing exactly the same thing. Good luck seeing just a tone-mapped 300 nit sun.
 
In the end it's you seeing much worse images on OLED without realizing what HDR really is.

I don't have an OLED. I have a 50" mini LED TV as a monitor that can hit 2000 nits in HDR. But that's completely irrelevant when I'm working and I still envy the image quality of OLED for everyday desktop stuff.

This comment shows how you're just trolling now. You're not paying attention to what anyone else is saying, you're just repeating the same ridiculous argument over and over.

You're either so full of hate for OLED for no reason or you're just having a laugh.
 
I don't think he's trolling. I can see his points. He's not alone in thinking oleds are too dim with low nits and ABL etc. The low brightness is unacceptable for me also. In fact I completely agree with Kramnelis lol. I'd rather take a top end mini led over any OLED. If OLED was up to the levels we needed it to be I would have one also but it's just not there yet. Plus the burn in as a PC monitor makes it automatically disqualified. Lol.
 
FALD is definitely much brighter and much longer sustained in HDR, and OLED's levels are especially a trade-off for people using their screen in brighter room conditions that are sub-optimal for HDR material on any screen (conditions that also "activate" the AG surface tradeoffs). No one argued that. There are major tradeoffs on both sides, so pick the one you prefer. A 45x25 lighting resolution isn't there yet for those of us in the per-pixel emissive camp. That could change down the road if they go magnitudes smaller and get a much higher backlighting resolution, but that is a long way off. VR will be all pixel-emissive in the headset/pancake-lens goggle-like designs in the next iterations and all the way forward from there, using microOLED (which reportedly can go very bright, plus it's right against your eyeballs). Later, probably everything will go microLED. Per-pixel emissive is a better way to do things; FALD is a stopgap solution of puddle-jumping light/dark zones in the meantime. It's clever, but it has limitations due to the very low lighting resolution. Enjoy whichever one suits your tastes better.
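As a quick sanity check on what a lighting resolution like 45x25 means against a 4K pixel grid (the grid size is just the figure quoted above; actual zone layouts vary by model):

```python
# Backlight "resolution" vs pixel resolution on a 4K FALD monitor.
panel_w, panel_h = 3840, 2160
zones_x, zones_y = 45, 25                 # grid quoted above; varies by model
zone_w_px = panel_w / zones_x             # ~85 px wide
zone_h_px = panel_h / zones_y             # ~86 px tall
pixels_per_zone = zone_w_px * zone_h_px   # ~7,400 pixels per backlight zone
print(f"{zones_x * zones_y} zones, each ~{zone_w_px:.0f} x {zone_h_px:.0f} px "
      f"(~{pixels_per_zone:,.0f} pixels share one brightness level), "
      f"vs {panel_w * panel_h:,} individually lit pixels on an OLED.")
```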

However - you know from my and others' replies that gaming OLEDs reserve the top ~25% of their brightness/energizing capability for a wear-evening routine. You aren't burning anything in immediately in normal use; it will take years to burn down through that buffer. Many people in the OLED threads are at 4+ years. Your burn-in tradeoff only applies if you don't want to use avoidance measures to extend that lifespan further - simple things like no desktop icons, dark modes, hiding the taskbar, the logo dimming feature, pixel shift, and I even use the turn-off-the-emitters trick when I go afk (though not everyone does even that). If you don't go out of your way to abuse an OLED it will last 4+ years thanks to the wear-evening buffer. It's not like a phone or tablet getting burn-in immediately. It's a tradeoff to take some simple measures like that, but *getting* burn-in is not a real tradeoff on modern OLEDs within, I'd guess, a 4-5 year lifespan of heavy use unless you are purposely being a fool with one. So that is largely FUD, especially for HDR media and gaming consumption, though people have been using OLEDs as desktop screens for over 4 years now without burn-in, even with ASBL disabled. Still, I've personally used multiple screens for years, so I keep my OLED primarily as a media+gaming "stage" screen.
 
I don't think he's trolling. I can see his points. He's not alone in thinking oleds are too dim with low nits and ABL etc. The low brightness is unacceptable for me also. In fact I completely agree with Kramnelis lol. I'd rather take a top end mini led over any OLED. If OLED was up to the levels we needed it to be I would have one also but it's just not there yet. Plus the burn in as a PC monitor makes it automatically disqualified. Lol.

The point is he's not reading what other people are saying. He's said several times that the preferences of others are irrelevant, he's going to scream at people for liking OLED anyway. That's not how you promote the benefits (as you see them) of your preferred technology.
 
I don't think he's trolling. <... > . In fact I completely agree with Kramnelis lol.

I don't have a problem with you agreeing with valuing FALD's trade-offs (low lighting rez, high brightness/color volume, matte abraded surface) over per-pixel emissive's tradeoffs. That comes down to what you value more. However...

A major problem in the thread is, for example, that Dolby mastering engineers on the Dolby site describe typical HDR 10,000 scene makeup as 50% of the screen at ~100 nits, 25% at 100 to 1,000 nits, and 25% at 1,000 to 10,000 nits, for dim-to-dark room HDR viewing. People's screens map the bottom portion of the content directly and then tone map the top portion, compressing the top end down into the top of their screen's capability. When they get brighter screens, less of that top portion has to be compressed down. However, it doesn't lift the bottom end (which was painstakingly, professionally mapped scene by scene to what it represents) just because the screen has a brighter top-end capability. Kram is instead, in dim-to-dark viewing conditions, perverting the professional curves with a blind filter and torching 100-200 nits up to 400-600 nits. That's not how it works; it's his personal agenda. It's fine if he likes it that way, but that's perverting/distorting it and it doesn't really apply.
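One way to sanity-check why so much of the signal sits down low: the PQ curve (SMPTE ST 2084) that HDR10 and Dolby Vision content is encoded with spends roughly half of its code range below ~100 nits. A small sketch using the published PQ constants (the scene percentages above are the mastering description; this is just the transfer function):

```python
# Fraction of the PQ (ST 2084) signal range used up to a given luminance.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2   # 0.0 .. 1.0 code value

for nits in (100, 1000, 10000):
    print(f"{nits:>5} nits -> {pq_encode(nits):.3f} of the PQ range")
```

So the ~100 nit bulk of a typical scene already occupies about the bottom half of the signal, and the 1,000-10,000 nit highlights only the top quarter or so - that top slice is what gets compressed to fit a given display's peak.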

Similarly, bright ~400 nit SDR capability on TVs is given high marks on review sites because it can combat bright room conditions, where otherwise regular SDR levels would look much dimmer than they do in dim-to-dark viewing conditions. This is because our eyes perceive things relatively. They aren't marketing it as viewing 400 nit SDR in dark room conditions; they are marketing it as using 400 nits to make SDR in a bright room look as bright, relative to that room, as regular SDR levels look in dim-to-dark viewing conditions. Yet again, this gets used by Kram as an excuse to torch SDR levels in his personal "reshade filter" agenda.

So while there are very valid, large tradeoffs and limitations of both technologies currently, and many people including Kram brought up data on them, and you can choose which tradeoffs are more meaningful to you from all of that - the above is not really applicable, so I don't think completely agreeing with his distorted viewing choices is a valid position to take. That's part of his "trolling": that, and taking things the wrong way (there may be something of a language barrier, but it doesn't account for all of it by a long shot), not understanding or glancing away as if something else was said rather than what was meant, saying OLED isn't HDR capable at all, completely refuting reputable review sites' findings and positions on the trade-offs of per-pixel emissive vs current FALD and even matte AG (and they are just that, trade-offs, so pick your poison), etc. That, and also generally being a toxic online persona, resorting to ad hominem/frequent personal attacks and insults, lashing out.

So saying you completely agree with his standpoint rather than just his more salient information, and that you don't think he's trolling at all, is a bad position to take imo.
 
I don't have an OLED. I have a 50" mini LED TV as a monitor that can hit 2000 nits in HDR. But that's completely irrelevant when I'm working and I still envy the image quality of OLED for everyday desktop stuff.

This comment shows how you're just trolling now. You're not paying attention to what anyone else is saying, you're just repeating the same ridiculous argument over and over.

You're either so full of hate for OLED for no reason or you're just having a laugh.
Funny this is never about what you use or what you like.

My point is always that OLED is worse due to low brightness and flickering. It doesn't have better images.

It's fine if you want to see sRGB or low range images but these are worse images compared to HDR.

All I see is guys like Elvn busting out quotes they don't understand to argue that a limited range is better. It can only be worse, because eyes can see a lot more. It's very similar to defending 60Hz as better than 144Hz.
 
FALD is definitely much brighter and much longer sustained in HDR, and OLED's levels are especially a trade-off for people using their screen in brighter room conditions that are sub-optimal for HDR material on any screen (conditions that also "activate" the AG surface tradeoffs). No one argued that. There are major tradeoffs on both sides, so pick the one you prefer. A 45x25 lighting resolution isn't there yet for those of us in the per-pixel emissive camp. That could change down the road if they go magnitudes smaller and get a much higher backlighting resolution, but that is a long way off. VR will be all pixel-emissive in the headset/pancake-lens goggle-like designs in the next iterations and all the way forward from there, using microOLED (which reportedly can go very bright, plus it's right against your eyeballs). Later, probably everything will go microLED. Per-pixel emissive is a better way to do things; FALD is a stopgap solution of puddle-jumping light/dark zones in the meantime. It's clever, but it has limitations due to the very low lighting resolution. Enjoy whichever one suits your tastes better.

On paper, yes, it would seem like we are a long way off from having satisfactory FALD numbers. 1152 dimming zones does seem pretty paltry compared to the 10,000 - 1M dimming zones that Innolux was supposedly working on since 2019. But you really just need some real-world usage, not just a comparison of paper specs, to see which one provides the better viewing experience. I've been quite happy with my InnoCN 32M2V over the past few days; they tuned the FALD algorithm on it to minimize blooming as much as possible, and this thing seriously has almost non-existent bloom, although that does come at the cost of highlights being quite a bit dimmer compared to other FALD displays - around 600 nits vs 1000 nits on my X27. However, 600 nit highlights is still pretty close to OLED anyway, and the fact that it comes with no blooming means I'm not really losing anything vs OLED at the tiny window sizes, and I actually prefer this tradeoff of dimmer highlights with zero bloom over brighter highlights that come with a ton of bloom on the X27. The biggest thing I gain, though, is much more impact at the larger window sizes, where the 32M2V can do 1000+ nits up to a 50% window and 700 nits full-field. I don't do side-by-side comparisons because, as you said, our eyes will adjust to the brighter screen and make the dimmer one look more washed out than it actually is, but I have gone back and forth doing individual A/B comparisons with my CX, and I'm telling you the per-pixel dimming of the CX isn't really as huge an advantage as it seems on paper anymore. The InnoCN is still hitting the same highlight brightness as the CX, doing so with no bloom except in the absolute most challenging HDR scenes like starfields, and it completely thrashes the CX in higher-APL scenes. I do think an LG G3 or Samsung S95C will fare much better, as those can do 1500 nits on a 10% window and have greater brightness throughout the rest of the window sizes, but those start at 55 inches and I'm not willing to go back to that for desktop use. When compared to a 42-48 inch CX/C2, though, I would say a good MiniLED will provide a better HDR experience despite the FALD not sounding impressive enough on paper compared to "per pixel dimming".
 
The point is he's not reading what other people are saying. He's said several times that the preferences of others are irrelevant, he's going to scream at people for liking OLED anyway. That's not how you promote the benefits (as you see them) of your preferred technology.
Funny it's more like you don't understand what other people are saying.

I understand exactly what their intentions are when they try to say something like:

1. 400 nits equals 80 nits.
2. Content is meant to be made in the low range.
3. A 300 nit tone-mapped sun is good HDR lol.

I can easily counter these points.
1. Unless you take your monitor out to bathe in the sunlight, 400 nits won't be anywhere close to 80 nits.
2. Content has a much higher range. It's a lot higher than whatever elvn quotes.
3. A 300 nit tone-mapped sun looks much worse compared to SDR. The 27GR95QE has dimmer HDR than SDR, no matter how ridiculous that is. This is why people would rather just see SDR on that monitor lol.

I understand they let their imaginations fly sky-high. They just try to defend limited OLED with much worse images. You don't have a chance when you say things like these to defend worse images.
 
A major problem in the thread is, for example, that Dolby mastering engineers on the Dolby site describe typical HDR 10,000 scene makeup as 50% of the screen at ~100 nits, 25% at 100 to 1,000 nits, and 25% at 1,000 to 10,000 nits, for dim-to-dark room HDR viewing. People's screens map the bottom portion of the content directly and then tone map the top portion, compressing the top end down into the top of their screen's capability. When they get brighter screens, less of that top portion has to be compressed down. However, it doesn't lift the bottom end (which was painstakingly, professionally mapped scene by scene to what it represents) just because the screen has a brighter top-end capability. Kram is instead, in dim-to-dark viewing conditions, perverting the professional curves with a blind filter and torching 100-200 nits up to 400-600 nits. That's not how it works; it's his personal agenda. It's fine if he likes it that way, but that's perverting/distorting it and it doesn't really apply.
It doesn't even apply to current content. Your numbers don't even stand a chance. 50% of the screen at only 100 nits lol. It's more like 50% from 100 nits to 1,000 nits. Go enjoy your dim ABL images.


Similarly, bright ~400 nit SDR capability on TVs is given high marks on review sites because it can combat bright room conditions, where otherwise regular SDR levels would look much dimmer than they do in dim-to-dark viewing conditions. This is because our eyes perceive things relatively. They aren't marketing it as viewing 400 nit SDR in dark room conditions; they are marketing it as using 400 nits to make SDR in a bright room look as bright, relative to that room, as regular SDR levels look in dim-to-dark viewing conditions. Yet again, this gets used by Kram as an excuse to torch SDR levels in his personal "reshade filter" agenda.
Don't forget it's OLED HDR that gets so dim, to the point that 400 nit SDR can look better.

So while there are very valid, large tradeoffs and limitations of both technologies currently, and many people including Kram brought up data on them, and you can choose which tradeoffs are more meaningful to you from all of that - the above is not really applicable, so I don't think completely agreeing with his distorted viewing choices is a valid position to take. That's part of his "trolling": that, and taking things the wrong way (there may be something of a language barrier, but it doesn't account for all of it by a long shot), not understanding or glancing away as if something else was said rather than what was meant, saying OLED isn't HDR capable at all, completely refuting reputable review sites' findings and positions on the trade-offs of per-pixel emissive vs current FALD and even matte AG (and they are just that, trade-offs, so pick your poison), etc. That, and also generally being a toxic online persona, resorting to ad hominem/frequent personal attacks and insults, lashing out.
Funny, it's you banging away on a CX that's only as good as 100 nits APL, without a chance to see anything better, but with all that imagination off the cliff. This is why you pull out all these useless quotes you don't even understand.

I don't think he's trolling. I can see his points. He's not alone in thinking oleds are too dim with low nits and ABL etc. The low brightness is unacceptable for me also. In fact I completely agree with Kramnelis lol. I'd rather take a top end mini led over any OLED. If OLED was up to the levels we needed it to be I would have one also but it's just not there yet. Plus the burn in as a PC monitor makes it automatically disqualified. Lol.
They talk like they've never seen better.
 
I understand they let their imaginations fly sky-high. They just try to defend limited OLED with much worse images. You don't have a chance when you say things like these to defend worse images.
The general conclusion of people whose opinions actually matter is that OLED is the better, more promising technology.
It probably has to do with the fact that most people do not actually want super-bright displays anyway, and do not equate high brightness with quality just to pretend to win an argument on the internet, when everyone can see they are wasting their time arguing with an internet troll.
 
Of course, many people have had good opinions before - such as how good subprime mortgages were.

What is the better, more promising technology? You call OLED better when it is only able to display 100 nits APL plus a few dots? That looks like crap compared to actual HDR1000 with 10-bit color.

The points of failure of OLED are low brightness and flickering. They cannot be fixed. OLED won't be the future.
 
holy fuck man drop it. all monitors have compromises, use what you like but stop with the evangelism.
 
All monitors have compromises. But there is a worse compromise, and that's OLED.

Funny, it's never about what you like. It's all about OLED being a lot worse for PC use. You can never see better on OLED with such a limited range.

OLED is much less accurate in HDR when the brightness drops off the chart. When it gets only a little bit brighter, you get eyestrain due to unsolvable flicker.
 
Right, there is no single monitor currently that's great for everything. It's best, although expensive, to have different setups for different things. For example, a setup with the 40WP95C for work/browsing, a setup with a FALD monitor (e.g. PG32UQX) for some games and videos, and an OLED (e.g. AW3423DW or 45GR95QE) for other games and videos. I think with 3 setups you can have it covered if you're ok with the space, hassle, and expense.
 
All monitors have compromises. But there is a worse compromise, and that's OLED.

Funny, it's never about what you like. It's all about OLED being a lot worse for PC use. You can never see better on OLED with such a limited range.

OLED is much less accurate in HDR when the brightness drops off the chart. When it gets only a little bit brighter, you get eyestrain due to unsolvable flicker.
see, you can't even stop yourself from rattling off bullshit that has nothing to do with what i said. just stop.
 
Right, there is no single monitor currently that's great for everything. It's best, although expensive, to have different setups for different things. For example, a setup with the 40WP95C for work/browsing, a setup with a FALD monitor (e.g. PG32UQX) for some games and videos, and an OLED (e.g. AW3423DW or 45GR95QE) for other games and videos. I think with 3 setups you can have it covered if you're ok with the space, hassle, and expense.
I've always said there is no perfect monitor. People have been using multiple monitors for different purposes, unlike some guys who only use one display to do it all.

If they have enough monitors, they will find that 95% of the time they will use FALD to see better images. Only 5% of the time is it necessary to use a faster monitor such as TN or fast IPS for ranking. OLED can only join in on that 5%.
 
see, you can't even stop yourself from rattling off bullshit that has nothing to do with what i said. just stop.
You think you can stop me with what? With the bullshit that OLED looks better for PC use? I know how you bang on your OLED to see that pathetic limited range as good as SDR.
 
I've always said there is no perfect monitor. People have been using multiple monitors for different purposes, unlike some guys who only use one display to do it all.

If they have enough monitors, they will find that 95% of the time they will use FALD to see better images. Only 5% of the time is it necessary to use a faster monitor such as TN or fast IPS for ranking. OLED can only join in on that 5%.
It could be a lot more than 5% depending on the type of gaming you do. Also, OLED does look better in some ways than FALD LCD, at least for some content. If I had to pick just one, I would pick FALD LCD, but it's certainly nice to have both, and use them for different content.
 
You've missed the point that OLED flickers. I've tried to play esports games on it, but either it's too dim to give an overall advantage, or at a slightly higher 250 nits it can give eyestrain, so I had to stop using it.

I have both OLED and FALD to compare. OLED is lifeless in SDR sRGB compared to wide-gamut FALD SDR, not even mentioning that triple-A games look tons better on FALD HDR1000.

They can only debate how accuracy matters in worse sRGB while not caring about the really important accuracy in HDR. It's like saying Elden Ring is locked at an intended 60fps so 144Hz doesn't matter. You can even find an article saying Elden Ring looks dull. It only exposes that they use OLED. Elden Ring doesn't look dull at all if you have 2000 nit highlights with wide-gamut HDR.
 

Why would you need wide gamut FALD to show SDR sRGB? I don't understand your point there. OLED shows SDR sRGB very well. I agree that the best FALD displays show HDR better than OLED does overall because of the brightness capabilities, but it looks pretty good on OLED as well (in some ways better, worse in others). I can't comment on the flickering and eyestrain - I haven't experienced it, but admittedly I mostly use my LCDs.
 
You think you can stop me with what? With the bullshit that OLED looks better for PC use? I know how you bang on your OLED to see that pathetic limited range as good as SDR.
clearly you can't read either, 'cause you're still spouting bullshit that has nothing to do with what i said.



 
Clearly you have never seen better. It's your business to get stuck in the limited range to see worse images anyway.
 
Why would you need wide gamut FALD to show SDR sRGB? I don't understand your point there. OLED shows SDR sRGB very well. I agree that the best FALD displays show HDR better than OLED does overall because of the brightness capabilities, but it looks pretty good on OLED as well (in some ways better, worse in others). I can't comment on the flickering and eyestrain - I haven't experienced it, but admittedly I mostly use my LCDs.
The problem is OLED can only show SDR sRGB or DCI-P3.

SDR isn't just about sRGB. There are wider color spaces, such as Adobe RGB, which OLED cannot cover. Once you see Adobe color at a higher range in SDR you just don't want to go back to lifeless sRGB. Adobe color also looks better than DCI-P3.

OLED can only do DCI-P3 at most. There are already people who have found that the 27GR95QE's "HDR Effect" mode with DCI-P3 SDR looks better than whatever dim, inaccurate HDR it has.

On the PA32UCG, there is an sRGB mode fixed at 80 nits. It's very accurate for sRGB document editing. But I won't go back to that mode unless I'm doing sRGB work. You won't be on that mode either. It's a waste to daily-drive sRGB at 80 nits, because the Adobe RGB mode has more color at much higher brightness and delivers better images. And most of the time you are either in HDR or in Standard mode with higher brightness. It's not as accurate as the limited sRGB 80 nits that looks the same as on an office monitor, but it looks better this way.
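For a rough sense of scale on the gamut comparison above, here's a quick sketch of each gamut triangle's area in CIE 1931 xy space, using the published primary chromaticities (Display P3 shares DCI-P3's primaries with a D65 white). Area in xy is a crude proxy - it ignores luminance and perceptual uniformity - so treat the ratios loosely:

```python
# Compare sRGB / Adobe RGB / Display P3 gamut triangle areas in CIE 1931 xy space.
def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

gamuts = {
    "sRGB":       [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "Adobe RGB":  [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)],
    "Display P3": [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
}
base = triangle_area(gamuts["sRGB"])
for name, primaries in gamuts.items():
    print(f"{name:>10}: {triangle_area(primaries) / base:.2f}x the area of sRGB")
```

By this crude measure Adobe RGB and P3 add a similar amount of ground over sRGB, just in different directions - Adobe RGB pushes further into greens/cyans, P3 further toward red - so which one "looks bigger" depends on the content.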
 

This doesn't make sense to me. You don't want to use the wrong gamut for the content. If you're looking at sRGB content, the monitor should be in the sRGB color space, not some wide gamut color space. You can get incorrect colors, which is noticeably bad, if the monitor is set to the wrong color space. Much of the web content, for example, is sRGB, and won't necessarily show correctly if your monitor is in wide gamut. It's very obvious on red colors, for example. The wide gamut color spaces aren't really all that relevant in SDR unless you're doing some "professional"-type work with wide gamut content.
 
No. It will only show more color at a higher range, not less color, so it can look close to HDR400.

sRGB is most accurate at 80 nits. But you won't use the monitor at 80 nits just to match the accuracy of the lowest standard. sRGB doesn't have a better image; it has office images for documents.

The goal is always to see more than a limited range. That's why HDR exists. Games always look better in the Adobe color space at higher brightness than in sRGB.
 
This doesn't make sense to me. You don't want to use the wrong gamut for the content. If you're looking at sRGB content, the monitor should be in the sRGB color space, not some wide gamut color space. You can get incorrect colors, which is noticeably bad, if the monitor is set to the wrong color space. Much of the web content, for example, is sRGB, and won't necessarily show correctly if your monitor is in wide gamut. It's very obvious on red colors, for example. The wide gamut color spaces aren't really all that relevant in SDR unless you're doing some "professional"-type work with wide gamut content.

Why would you need wide gamut FALD to show SDR sRGB? I don't understand your point there. OLED shows SDR sRGB very well. I agree that the best FALD displays show HDR better than OLED does overall because of the brightness capabilities, but it looks pretty good on OLED as well (in some ways better, worse in others). I can't comment on the flickering and eyestrain - I haven't experienced it, but admittedly I mostly use my LCDs.

This x1000. 80%+ of what I do is in sRGB, so sRGB reproduction is important to me. I do a lot of web browsing, some office work, and I also enjoy various artwork, photos, etc., and most of that is still in the sRGB standard. I also do plenty of work in interfaces on the web (Discord for example). So good sRGB image reproduction is my single biggest criteria for a monitor. It's why OLED has such an advantage for me. The perfect blacks and viewing angles make a big difference in how good sRGB images look, and I found the blooming on FALD quite distracting for sRGB uses (or I could turn local dimming off, which was a hassle and resulted in poorer contrast, of course, but that wasn't ideal). Things like Discord chats would often be a mess with local dimming on. I can say without a second thought this OLED is the best monitor I've ever owned as far as showing sRGB content. My previous (non-FALD) IPS couldn't hold a candle to the contrast this can do. Newer IPS panels get close, but with local dimming off, still can't really compete. I can't speak on others getting eyestrain with OLED - I can just say I haven't experienced any, personally, despite quite a bit of use, and I actually find this monitor among the easiest on my eyes I've used. But, some people may be sensitive to different things than I am.

It makes zero sense for me to either expand the range to a wider space or convert sRGB content to HDR. While it's a nice ability, as you point out, just expanding the range is wildly inaccurate. It looks nice initially 'til you realize nothing appears as it actually should with proper colors. I could use Auto HDR to retain most of the accuracy, but my experiments with that (on this monitor as well as the FALD ProArt I'd tried before) made me feel that sRGB still looks best as SDR. It's close in a lot of content, but I don't feel like any perceived advantages of the algorithm are worth the downsides. (For example, I tried it with an SDR game and felt like it slightly harmed the contrast, if anything, and the menus were too bright - it just looked more natural in SDR.)

No contest that HDR, particularly high-nit HDR, looks best on FALD. But I also find it looks "pretty good" on OLED, which is all I need for the games I play in HDR. Most offer good in-game settings to compensate for the display because they know the game will be played on a variety of monitor technologies. And if there's a bad HDR implementation, I can always play the SDR version. PC HDR is still a bit inconsistent, but it's getting better. Quite a few new games still launch with SDR only, though - the one I'm playing now is SDR. And I don't really watch movies on my PC, so for shows/movies in HDR/Dolby Vision, those are generally watched on my FALD TV anyways, so I get the brightness advantages with that content.

As far as having multiple monitors for the best of both worlds, that'd be nice, and I think some of the setups you folks have for this are amazing, but it really isn't practical for my space (plus, logistics-wise, I like having a monitor centered in front of me), so I decided on one, focusing on what does best at what I do most while still having respectable capabilities in other areas. For me, after trying FALD but not being totally happy, that's ended up being OLED, but I can certainly understand why someone else with different math and priorities would choose FALD.
 
sRGB is too old. Funny, you are not even seeing the most accurate sRGB - it needs to be at 80 nits to be most accurate. You just use sRGB as an excuse to cover for the inability of OLED.

You care so much about accuracy for sRGB, to look at images as good as office documents, while ignoring accuracy in HDR.

The 27GR95QE's OLED HDR looks like dim crap, to the point that it needs rescuing by its own SDR with the "HDR Effect" mode. Your OLED HDR is dimmer than SDR. That's also the ironic reason why wide-gamut SDR is useful.
 
Not gonna do the back and forth, so anything I post as an argument from now on will be posted once, not get at all personal, and if not responded to with an actual, factual argument (which isn't really happening from certain parties) won't need to be commented on again. That said, a couple things I've only partially commented on or haven't yet, so I thought I'd chime in:

- sRGB is still *the* standard on PC (as well as phones, etc.) for most everyday applications, including things like photos, where it's still the default. While many modern PCs/monitors support wide gamuts on some level, the Internet as a whole is mostly sRGB based. That includes everything from commercial sites to photo and artistic websites/collections to many games; it's a LOT more than just office work, as some have claimed. Anything with a wider gamut is still the exception, not the rule.

- HDR is the future, but it's still fairly inconsistent in its implementation and can be very hit or miss, not to mention the various formats and standards. In short, HDR looks great when done properly, but there are a lot of caveats and poor implementations of it as well. In addition, there are adoption problems, particularly among older people who don't understand it. I sometimes help older people with AV setup and most of them have no idea HDR even exists, let alone how to make sure their TV is set up properly or shows it in the right preset, etc. Younger people tend to have fewer problems, but range switching still has a ways to go before it's truly simple. The system in Windows 11 is a lot better than 10's but still somewhat cumbersome. And the fact that there are all these competing technologies means that HDR setup of some kind to tone map is necessary on most displays; displays that can do high-nit HDR without tone mapping are also the exception at the moment.

- "HDR Effect" modes are included as pseudo-HDR modes on many different TV's and monitors; it has little to do with native device HDR support, which many OLEDS including mine do have; it's for people who like the look and want to do the equivalent of Windows Auto HDR on the display level instead of in software. Personally, I'd never use it as I want to see sRGB in the range as close to the source as possible and find it visually preferable, just as I don't use Auto HDR on Windows 11, but it's nice for the people who do like it.

- SDR is plenty bright on my OLED. (After calibration, I'm at 84 brightness as I didn't need the full 100 to reach my target of 160 nits in a 10% window). HDR is bright enough for me in real-world use and provides a nice level of pop and 10-bit color for bright details. In games, once set up, it looks quite good and I have no real complaints. It'll be nice to eventually have more brightness headroom in a future monitor as HDR evolves, but it's enough for me right now to be able to enjoy it.

[EDIT 02/23/23]: I've seen the criticism that sRGB in SDR should be calibrated to 80 nits and that my choice of calibrating to 160 is inaccurate, so I thought I'd provide some clarification here instead of respond in a cyclic loop since that will just remain unproductive. I'm not sure where the oft-cited 80 nits comes from, but 100 nits is often considered the standard most places I've seen, and some professional calibrators even use 120, depending on the room. I can't remember where I settled on 160 nits when I calibrated my TV a few years ago, but it was after some research - I believe it was a source describing target brightnesses for different lighting conditions, and I chose the highest level that still provided me good accuracy; I wish I remembered the source. It's also worth noting that Adobe RGB uses 160 nits as its white point luminance, which is mostly academic since I use sRGB, but still interesting. My room has fairly good, but not perfect, light control most of the time, as well as a backlight on my monitor, and lighting varies somewhat throughout the day. On top of that, I just *like* things a touch brighter (which I mentioned from the outset of participation in this thread - it's not a secret). I also run calibration to the specific nit level I want ie. 160, which should hone in accuracy a bit further since it bases calibration on that nit level. I've said I value accuracy, but I also value personal preference. The accuracy variance by going slightly brighter would be extremely small and is acceptable to me for more enjoyment (just as to some people Auto HDR is an acceptable difference to get HDR elements in their SDR, while it's a bridge too far for me). Personal preference is still important. I'm fine with a slightly brighter than reference picture with a fairly high degree of accuracy, especially with ABL full-screen. (It's still going to be a lot more accurate in sRGB than FALD with local dimming on and the blooming that I found distracting).
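Since "HDR Effect" / Auto HDR style expansion has come up a few times, here's a toy sketch of the general idea - keep most of the SDR range where it is and stretch the top of it toward a brighter peak. This is not the actual algorithm any particular monitor or Windows uses (those are proprietary and more sophisticated); the white level, peak, and split point below are just assumed example values:

```python
def expand_sdr_to_hdr(sdr_nits, sdr_white=100.0, target_peak=400.0, split=0.6):
    """Toy inverse tone mapping: leave the lower part of the SDR range alone
    and stretch the top of it toward a brighter peak for highlights."""
    pivot = split * sdr_white                      # everything under ~60 nits untouched
    if sdr_nits <= pivot:
        return sdr_nits
    t = (sdr_nits - pivot) / (sdr_white - pivot)   # 0..1 across the top of SDR
    return pivot + (target_peak - pivot) * t       # linear stretch; real curves ease in

for nits in (20, 60, 80, 100):
    print(nits, "->", round(expand_sdr_to_hdr(nits)))
```

The objection in the "HDR Effect" bullet above is exactly that this kind of remapping moves the original SDR values around, which is why some of us would rather leave sRGB content in SDR.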
 