Comixbooks
Is there a video for that animated gif comparing the Asus vs the LG OLED?
Direct light hitting a matte/AG screen will blow out contrast and pale the saturation, washing out the areas it hits and is diffused across. Letting lighting conditions change will also alter the way our eyes/brain perceive the screen's contrast and saturation, so even "calibrated" values will be lost to your eyes and brain. E.g. the screen will look paler, more weakly contrasted and undersaturated the brighter the room gets, and vice versa. Some people keep several sets of settings so they can switch between them for different times of day or different room lighting conditions. So you are going to get compromised results if you don't design your viewing environment more optimally, no matter what screen coating you have.
Having direct light hit your monitor is obviously bad. I don't have that - there's no light source in front of my monitors. Still, with a glossy screen, the reflections from the room have a much worse effect on the image than with an AG coated screen. The glossy screen acts like a mirror. AG coating is used on monitors for very good reasons.
The PG42UQ features a more traditional monitor-like matte anti-glare coating, as opposed to a glossy panel coating like you'd find on TVs including the LG C2. This does a very good job of reducing reflections and handling external light sources like windows and lamps, and we noticed much better reflection handling (no surprise) than on the LG C2. However, this does mean that in some conditions the blacks do not look as deep or inky to the user. With this being an OLED panel, famous for its true blacks and amazing contrast ratio, this could be considered a problem - are you "wasting" that by having an AG coating that reduces your perceived contrast?
In certain conditions blacks look a little more dark grey, as the anti-reflective coating reflects some of the surrounding light back at you and "dulls" the contrast a bit. The anti-glare coating also means the image is not as clear and clean as with a fully glossy coating. You don't get this effect if the coating is fully glossy, as there's no AG layer, but what you get instead is more reflections. Don't forget the same thing applies to all AG-coated desktop monitors: you get the same impact on perceived black depth and contrast on IPS, TN Film and VA panels, depending on your lighting conditions, if an AG coating is used. You'd still get better relative blacks and contrast on the OLED (not to mention other benefits) compared with LCD technologies. They are all impacted in the same way by their coatings.
Glossy is even more of a problem than anti-reflective coating if there's light in the room. My room isn't dark most of the time when I use my monitors, so anti-reflective coating works much better than glossy.
Funny how several quotes cut from somewhere else can trick you into seeing 80 nits as bright as 400 nits in the same viewing environment.

I don't think he's able to understand; pages of explanation made no difference.
Our eyes and brain work relative to the ambient lighting environment, so 300-400 nits in a bright viewing environment goes back to looking more like 100-nit SDR would in a dim-to-dark viewing environment. That's why reviews give high marks to SDR that does 300-400 nits: it can compensate for the effect of a bright viewing environment on your eyes, not because you should torch the SDR/low ranges in a dim-to-dark viewing environment.

It's the same reason a phone or tablet blasted at 100% brightness at night in a dark bedroom is harshly bright, but in bright daylight conditions can look pale by comparison. The same goes for a flashlight, etc. Our eyes work in a relative fashion and adapt to lighting conditions. This is another reason why HDR, as you know, is made to be viewed in dim-to-dark conditions in the first place - an environment that also helps with the AG-coating tradeoff.

Even if you calibrate your screen, or have a screen with good factory calibration out of the box, those values will all swing to your eyes and brain if you let the room/environment lighting conditions change.
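The "perception is relative" claim above can be sketched with the CIE 1976 L* lightness formula, which expresses how light a luminance appears relative to the reference white the eye is adapted to. Treating a bright room as adapting the eye to ~400 nits is my simplifying assumption for illustration, not a full vision-science adaptation model:

```python
def cie_lightness(Y, Y_white):
    """CIE 1976 L* lightness of luminance Y (nits) relative to the
    reference white Y_white (nits) the viewer is adapted to."""
    t = Y / Y_white
    # linear segment for very dark values, cube-root law elsewhere
    return 903.3 * t if t <= 0.008856 else 116.0 * t ** (1.0 / 3.0) - 16.0

# 100 nits in a dim room where ~100 nits is the brightest thing around:
print(cie_lightness(100, 100))   # 100.0 -- appears fully bright

# the same 100 nits when the room pushes adaptation toward ~400 nits:
print(cie_lightness(100, 400))   # ~57   -- appears washed out

# a 400-nit screen restores the original appearance in that brighter room:
print(cie_lightness(400, 400))   # 100.0
```

Same screen output, very different perceived lightness once the adapting white changes, which is the point being argued.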
The professional Dolby Vision mastering guidance on Dolby's site maps a typical HDR10,000 scene as 50% of the screen at ~100 nits, 25% at 100 to 1,000 nits, and 25% at 1,000 to 10,000 nits, with the top end toned down/compressed into whatever your screen's top end can do. They aren't remapping their 0-100 nit scene content up to 400 nits just because a TV can do 400-nit SDR to combat bright viewing conditions, then using that torch in dim-to-dark conditions instead. I don't care what you are doing with your own blind ReShade filters - knock yourself out - but don't claim it's gospel for dim-room viewing conditions.
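The mapping being described - pass the low end through untouched and compress only the content above a knee into the display's remaining headroom - can be sketched as a simple knee-and-rolloff curve. The 100-nit knee and the Reinhard-style rolloff below are illustrative choices of mine, not the actual curve any particular display or standard uses:

```python
def tone_map(nits, knee=100.0, peak=800.0, source_max=10000.0):
    """Map mastered HDR luminance (nits) onto a display with a given peak.
    Content at or below the knee passes through unchanged; only the range
    above the knee is compressed into the display's remaining headroom."""
    if nits <= knee:
        return nits  # shadows and midtones are NOT lifted or remapped
    # normalized position above the knee, 0..1 over the mastered range
    x = (nits - knee) / (source_max - knee)
    # Reinhard-style rolloff that reaches exactly `peak` at source_max
    k = 0.5
    return knee + (peak - knee) * (x * (1 + k)) / (x + k)

print(tone_map(80))     # 80.0  -- untouched low end
print(tone_map(1000))   # ~261  -- gently compressed
print(tone_map(10000))  # 800.0 -- mastered max lands on display peak
```

A brighter display (say `peak=1600`) compresses the top end less, but the sub-knee content comes out identical - which is the argument being made about brighter screens not lifting the bottom end.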
Showing that FALD goes brighter for HDR's high end - no one is disputing that; it's a big tradeoff vs per-pixel emissive though. Repeat ad nauseam.
Glossy will always be clearer than scratching and abrading your outer screen surface; AG just won't get the sheen from light reflected off the outside of the screen if you view it in dim-to-dark conditions, where HDR should be viewed in the first place anyway. Ironically, people typically get AG to use in poor lighting conditions because they are unwilling or unable to set up an optimal environment in relation to the screen - so they get the tradeoffs. The raised-blacks thing wouldn't matter as much to me since I view HDR in dim-to-dark conditions. My laptop is AG, which was a tradeoff for the other specs, but I wish it were glossy like the tablet I use next to it; even in dim conditions the layer looks frosty by comparison.
Yeah, I've got to be done with the repetitive back and forth; sorry for any contribution to derailing this thread. I really wanted to argue my point, and I feel I've done that well, but all I'm getting back is recycled, disproven, improper-use-case, or misinformed stuff. If I have anything new to add, I may, but at this point it's the same false arguments, misrepresentations, and unwillingness to see that all current tech has compromises, and it's up to the user to choose what suits them best according to preference, given the very real tradeoffs all techs have.

I'm just going to enjoy having a great 27" HDR OLED that's working great for my situation. Maybe I'll give FALD monitors another try some day as the technology improves, but it's not there yet for my uses, so this is a better fit at the moment. I'll also be interested to see how it fares as far as burn-in, given my original OLED experience years ago. I wonder if either technology will be the future, or if there will be some sort of hybrid or something new altogether once we finally get something that does everything well.
By the way, I've found all the technological discussions (the ones with substance) really interesting in this thread. =)
In the end it's you seeing much worse images on OLED without realizing what HDR really is.
I don't think he's trolling. I can see his points. He's not alone in thinking OLEDs are too dim, with low nits and ABL etc. The low brightness is unacceptable for me also. In fact I completely agree with Kramnelis lol. I'd rather take a top-end mini-LED over any OLED. If OLED was up to the levels we needed it to be I would have one also, but it's just not there yet. Plus the burn-in as a PC monitor makes it automatically disqualified. Lol.
I don't think he's trolling. <...> In fact I completely agree with Kramnelis lol.
Funny this is never about what you use or what you like.

I don't have an OLED. I have a 50" mini-LED TV as a monitor that can hit 2000 nits in HDR. But that's completely irrelevant when I'm working, and I still envy the image quality of OLED for everyday desktop stuff.
This comment shows how you're just trolling now. You're not paying attention to what anyone else is saying, you're just repeating the same ridiculous argument over and over.
You're either so full of hate for OLED for no reason or you're just having a laugh.
FALD is definitely much brighter and sustains it much longer in HDR, and OLED's levels are especially a tradeoff for people using their screen in brighter room conditions that are sub-optimal for HDR material on any screen (though those conditions also "activate" the AG-surface tradeoffs). No one argued otherwise. There are major tradeoffs on both sides, so pick the one you prefer. A 45x25 lighting resolution isn't there yet for those of us in the per-pixel-emissive camp. That could change down the road if they go magnitudes smaller and get a much higher backlighting resolution, but that is a long way off. VR will be all per-pixel emissive in the headset/pancake-lens goggle-like designs in the next iterations and all the way forward from there, using microOLED (which reportedly can go very bright, plus it's right against your eyeballs). Later, probably everything will go microLED. Per-pixel emissive is a better way to do things; FALD is a stopgap solution of puddle-jumping light/dark zones in the meantime. It's clever, but it has limitations due to the very low lighting resolution. Enjoy whichever one suits your tastes better.
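The "lighting resolution" gap is easy to quantify. Assuming a 4K panel and the 45x25 zone grid mentioned above:

```python
# Backlight "lighting resolution" of a FALD panel vs. per-pixel emissive,
# assuming a 4K panel and the 45x25 zone grid mentioned above.
h_pixels, v_pixels = 3840, 2160
h_zones, v_zones = 45, 25

pixels = h_pixels * v_pixels       # 8,294,400 individually lit points on OLED
zones = h_zones * v_zones          # 1,125 independently lit zones on FALD
pixels_per_zone = pixels // zones  # ~7,372 pixels sharing one backlight level

print(pixels, zones, pixels_per_zone)
```

Roughly seven thousand pixels sharing each backlight level is the source of the blooming/haloing tradeoff described elsewhere in the thread.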
Funny it's more like you don't understand what other people are saying.

The point is he's not reading what other people are saying. He's said several times that the preferences of others are irrelevant, he's going to scream at people for liking OLED anyway. That's not how you promote the benefits (as you see them) of your preferred technology.
It doesn't even apply to the current content. Your numbers don't even stand a chance. 50% windows at only 100 nits lol. It's more like 50% from 100 nits to 1,000 nits. Go enjoy your dim ABL images.

A major problem in the thread is that, for example, Dolby mastering engineers on the Dolby site master typical HDR10,000 scene makeup as 50% of the screen at up to 100 nits, 25% at 100 to 1,000 nits, and 25% at 1,000 to 10,000 nits for dim-to-dark-room HDR viewing. People's screens will map the bottom portion of the screen's capability directly and then tone-map the top portion, compressing the top end down into the top portion of their screen. When they get brighter screens, less of that top portion is compressed down. However, it's not lifting the bottom end (which was painstakingly, professionally mapped, logically, to what it is representing scene by scene) just because the screen has a brighter top-end capability. Kram is instead, in dim-to-dark viewing conditions, perverting the professional curves with a blind filter and torching 100-200 nits up to 400-600 nits. That's not how it works; it's his personal agenda. It's fine if he likes it that way, but that's perverting/distorting it, and it doesn't really apply.
Don't forget it's the OLED HDR that gets so dim to the point that 400-nit SDR can look better.

Similarly, a bright ~400-nit SDR capability on TVs is given high marks on review sites because it can combat bright room conditions, where regular SDR levels would otherwise look much dimmer than they do in dim-to-dark viewing conditions. This is because our eyes perceive things relatively. Reviewers aren't marketing 400-nit SDR for dark-room viewing; they are marketing 400 nits as a way to make SDR in a bright room look as bright, relative to the room, as regular SDR levels look in dim-to-dark viewing conditions. Yet again, this can be used by Kram as an excuse to torch SDR levels in his personal "ReShade filter" agenda.
Funny it's you banging on a CX as good as 100 nits APL without a chance to see anything better, but with all that imagination off the cliff. This is why you pull out all these useless quotes you don't even understand.

So while there are very valid, large tradeoffs and limitations in both technologies currently - and many people brought up data on them, including Kram - and you can choose which tradeoffs are more meaningful to you from all of that, the above is not really applicable, so I don't think completely agreeing with his distorted viewing choices is a valid position to take. That's part of his "trolling": that, plus taking things said wrongly (there may be a language barrier there somewhat, but it does not account for all of it by a long shot), not understanding or glancing away as if something else was said rather than what was meant, saying OLED isn't HDR capable at all, completely refuting reputable review sites' findings and positions on the tradeoffs of per-pixel emissive vs current FALD and even matte AG (and they are just that, tradeoffs, so pick your poison), etc. That, and also generally being a toxic online persona, resorting to ad hominem attacks and frequent insults, lashing out.
They talk like they've never seen better.

I don't think he's trolling. I can see his points. He's not alone in thinking oleds are too dim with low nits and ABL etc. The low brightness is unacceptable for me also. In fact I completely agree with Kramnelis lol. I'd rather take a top end mini led over any OLED. If OLED was up to the levels we needed it to be I would have one also but it's just not there yet. Plus the burn in as a PC monitor makes it automatically disqualified. Lol.
General conclusion of people whose opinion actually matter is that OLEDs are better more promising technology.

I understand they let their imaginations fly sky high. They just try to defend limited OLED with much worse images. You don't have a chance when you say something like these to defend worse images.
Of course many people have good opinions such as how good subprime mortgages were.

General conclusion of people whose opinion actually matter is that OLEDs are better more promising technology.
It probably has to do with the fact that most people do not actually want super-bright displays anyway and do not equate high brightness with quality just to win an argument on the internet, when everyone can see they are wasting their time arguing with an internet troll.
All monitors have compromises. But there is a worse compromise and this's OLED.

holy fuck man drop it. all monitors have compromises, use what you like but stop with the evangelism.
see, you cant even stop yourself from rattling off bullshit that has nothing to do with what i said. just stop.

All monitors have compromises. But there is a worse compromise and this's OLED.
Funny it's never about what you like. It's all about OLED being a lot worse in PC use. You can never see better on OLED with so limited range.
OLED is much less accurate in HDR when the brightness drops off the chart. When it gets only a little bit brighter, you get eyestrain due to unsolvable flicker.
I've always said there is no perfect monitor. People have been using multiple monitors for different purposes unlike some guys only use one display to do it all.

Right, there is no single monitor currently that's great for everything. It's best, although expensive, to have different setups for different things. For example, a setup with the 40WP95C for work/browsing, a setup with a FALD monitor (e.g. PG32UQX) for some games and videos, and an OLED (e.g. AW3423DW or 45GR95QE) for other games and videos. I think with 3 setups you can have it covered if you're ok with the space, hassle, and expense.
You think you can stop me with what? With the bullshit that OLED looks better for PC use? I know how you bang on your OLED to see that pathetic limited range as good as SDR.

see, you cant even stop yourself from rattling off bullshit that has nothing to do with what i said. just stop.
It could be a lot more than 5% depending on the type of gaming you do. Also, OLED does look better in some ways than FALD LCD, at least for some content. If I had to pick just one, I would pick FALD LCD, but it's certainly nice to have both, and use them for different content.

I've always said there is no perfect monitor. People have been using multiple monitors for different purposes unlike some guys only use one display to do it all.
If they have enough monitors they will find that 95% of the time they use FALD to see better images. Only 5% of the time is it necessary to use a faster monitor such as TN or fast IPS for ranked play. OLED can only join into that 5%.
You've missed the spot that OLED is flickering. I've tried to play eSport games on it but either it's too dim to give an overall advantage or a little higher 250nits can give eyestrain so I have to stop using it.

It could be a lot more than 5% depending on the type of gaming you do. Also, OLED does look better in some ways than FALD LCD, at least for some content. If I had to pick just one, I would pick FALD LCD, but it's certainly nice to have both, and use them for different content.
I have both OLED and FALD to compare. OLED is lifeless in SDR sRGB compared to wide-gamut FALD SDR, not even mentioning that triple-A games look tons better on FALD HDR1000.
They can only debate how accuracy matters in worse sRGB while not caring about the really important accuracy in HDR. It's like Elden Ring being locked at its intended 60fps so that 144Hz doesn't matter. You can even find an article saying Elden Ring looks dull. It only exposes that they use OLED. Elden Ring doesn't look dull at all if you have 2000-nit highlights with wide-gamut HDR.
clearly you cant read either, 'cause youre still spouting bullshit that has nothing to do with what i said.

You think you can stop me with what? With the bullshit that OLED looks better for PC use? I know how you bang on your OLED to see that pathetic limited range as good as SDR.
Clearly you have never seen better. It's your business to get stuck in the limited range to see worse images anyway.

clearly you cant read either, 'cause youre still spouting bullshit that has nothing to do with what i said.
The problem is OLED can only show SDR sRGB or DCI-P3.

Why would you need wide gamut FALD to show SDR sRGB? I don't understand your point there. OLED shows SDR sRGB very well. I agree that the best FALD displays show HDR better than OLED does overall because of the brightness capabilities, but it looks pretty good on OLED as well (in some ways better, worse in others). I can't comment on the flickering and eyestrain - I haven't experienced it, but admittedly I mostly use my LCDs.
SDR isn't just about sRGB. There are wider color spaces such as Adobe RGB which OLED cannot cover. Once you see Adobe color at a higher range in SDR, you just don't want to go back to lifeless sRGB. Adobe color also looks better than DCI-P3.
OLED can only have DCI-P3 at most. There are already people finding that the 27GR95QE "HDR Effect" with DCI-P3 SDR looks better than whatever dimly inaccurate HDR it has.
On the PA32UCG there is an sRGB mode fixed at 80 nits. It's very accurate for sRGB document editing, but I won't go back to that mode unless I'm doing sRGB work. You won't stay in that mode either. It's a waste to daily-drive sRGB at 80 nits, because the Adobe RGB mode has more color at much higher brightness and delivers better images. Most of the time you are either in HDR or in SDR at higher brightness, which looks better, yet is not as accurate as the limited 80-nit sRGB that looks the same as on an office monitor.
No. It will only show more color at a higher range but not less color so that it can look close to HDR400.

This doesn't make sense to me. You don't want to use the wrong gamut for the content. If you're looking at sRGB content, the monitor should be in the sRGB color space, not some wide gamut color space. You can get incorrect colors, which is noticeably bad, if the monitor is set to the wrong color space. Much of the web content, for example, is sRGB, and won't necessarily show correctly if your monitor is in wide gamut. It's very obvious on red colors, for example. The wide gamut color spaces aren't really all that relevant in SDR unless you're doing some "professional"-type work with wide gamut content.
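The gamut-mismatch point is easy to show numerically: feeding sRGB pixel values to a display operating in a wider gamut shifts the chromaticities. Below, the same "pure red" triple is decoded through the standard (rounded) sRGB and Display P3 RGB-to-XYZ matrices, both D65; the resulting CIE xy chromaticities differ, which is exactly the oversaturated-reds effect described above:

```python
# Decode the same linear RGB triple through two different gamuts and
# compare the resulting CIE 1931 xy chromaticity. Matrices are the
# standard (rounded) RGB->XYZ conversions for sRGB and Display P3.
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
P3_TO_XYZ = [
    [0.4866, 0.2657, 0.1982],
    [0.2290, 0.6917, 0.0793],
    [0.0000, 0.0451, 1.0439],
]

def chromaticity(rgb, matrix):
    """CIE xy chromaticity of a linear RGB triple under a given gamut."""
    X, Y, Z = (sum(row[i] * rgb[i] for i in range(3)) for row in matrix)
    s = X + Y + Z
    return round(X / s, 3), round(Y / s, 3)

red = (1.0, 0.0, 0.0)
print(chromaticity(red, SRGB_TO_XYZ))  # (0.64, 0.33) -- the red the author intended
print(chromaticity(red, P3_TO_XYZ))    # (0.68, 0.32) -- same pixel shown in P3
```

Correct handling requires converting the sRGB values into the display's gamut (gamut mapping) rather than reinterpreting them, which is what a proper sRGB mode or color management does.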
sRGB is too old. Funny you are not even seeing the most accurate sRGB. It needs 80nits to be most accurate. You just use sRGB as an excuse to cover the inability of OLED.

This x1000. 80%+ of what I do is in sRGB, so sRGB reproduction is important to me. I do a lot of web browsing, some office work, and I also enjoy various artwork, photos, etc., and most of that is still in the sRGB standard. I also do plenty of work in web interfaces (Discord, for example). So good sRGB image reproduction is my single biggest criterion for a monitor.

It's why OLED has such an advantage for me. The perfect blacks and viewing angles make a big difference in how good sRGB images look, and I found the blooming on FALD quite distracting for sRGB uses (or I could turn local dimming off, which was a hassle and resulted in poorer contrast, of course, but that wasn't ideal). Things like Discord chats would often be a mess with local dimming on. I can say without a second thought this OLED is the best monitor I've ever owned as far as showing sRGB content. My previous (non-FALD) IPS couldn't hold a candle to the contrast this can do. Newer IPS panels get close, but with local dimming off, still can't really compete.

I can't speak to others getting eyestrain with OLED - I can just say I haven't experienced any personally, despite quite a bit of use, and I actually find this monitor among the easiest on my eyes I've used. But some people may be sensitive to different things than I am.
It makes zero sense for me to either expand the range to a wider space or convert sRGB content to HDR. While it's a nice ability, as you point out, just expanding the range is wildly inaccurate. It looks nice initially 'til you realize nothing appears as it actually should with proper colors. I could use Auto HDR to retain most of the accuracy, but my experiments with that (on this monitor as well as the FALD ProArt I'd tried before) made me feel that sRGB still looks best as SDR. It's close in a lot of content, but I don't feel like any perceived advantages of the algorithm are worth the downsides. (For example, I tried it with an SDR game and felt it slightly harmed the contrast, if anything, and the menus were too bright - it just looked more natural in SDR.)
No contest that HDR, particularly high-nit HDR, looks best on FALD. But I also find it looks "pretty good" on OLED, which is all I need for the games I play in HDR. Most offer good in-game settings to compensate for the display because they know the game will be played on a variety of monitor technologies. And if there's a bad HDR implementation, I can always play the SDR version. PC HDR is still a bit inconsistent, but it's getting better. Quite a few new games still launch with SDR only, though - the one I'm playing now is SDR. And I don't really watch movies on my PC, so for shows/movies in HDR/Dolby Vision, those are generally watched on my FALD TV anyways, so I get the brightness advantages with that content.
As far as having multiple monitors for the best of both worlds, that'd be nice and I think some of the setups you folks have for this are amazing, but it really isn't practical for my space (plus logistics wise I like having a monitor centered in front of me), so I decided on one with the focus on choosing what does best in what I do the most while still having respectable capabilities in other areas. For me, after trying FALD but not being totally happy, that's ended up being OLED, but I can certainly understand why someone else with different math and priorities would choose FALD.