Why OLED for PC use?

Do you even read the posts you reply to?
Don't forget it's you who envies OLED while using a 55'' miniLED TV that's never as accurate as a PC monitor.

I said you've never seen better. It takes years for you to accept that you want to see worse.
 
Does that make it right?
Bro we're talking about displays, it's not that serious lol 😆
 
Don't forget it's you who envies OLED while using a 55'' miniLED TV that's never as accurate as a PC monitor.

I said you've never seen better. It takes years for you to accept that you want to see worse.

Accuracy isn't important for my work. I do structural engineering CAD drawings, and most of the time colour isn't even involved. Other uses include watching TV (duh) and playing PS5. However, I'm not saying that stretching sRGB to a wide gamut is a good thing, or that attempting to fake HDR from SDR is the right way to go.

I envy the OLED image purely because it's self emissive, nothing more.

Now please tell me why you're so angry at me and my use case, even though I don't have what is apparently the spawn of the devil, an OLED.

Takes years to get sense from you, it seems.
 
Accuracy isn't important for my work. I do structural engineering CAD drawings, and most of the time colour isn't even involved. Other uses include watching TV (duh) and playing PS5. However, I'm not saying that stretching sRGB to a wide gamut is a good thing, or that attempting to fake HDR from SDR is the right way to go.

I envy the OLED image purely because it's self emissive, nothing more.

Now please tell me why you're so angry at me and my use case, even though I don't have what is apparently the spawn of the devil, an OLED.

Takes years to get sense from you, it seems.
So you don't care about accurate images. And you don't see better images on CAD either lol.

Stretching sRGB with a wide gamut is a good thing because it gives better images at a higher range. I already said people found the "HDR Effect" SDR mode better than whatever the dimmer HDR is on the 27GR95QE. That's how ironically useful wide-gamut SDR is. That dim OLED proves the point through its own users.

Funny, I'm never angry at anything. Why do you think I'm angry, when I have been seeing better images for several years already? All I need to do is speak some truth to your face that you don't understand.
 
Don't forget it's you who envies OLED while using a 55'' miniLED TV that's never as accurate as a PC monitor.

I said you've never seen better. It takes years for you to accept that you want to see worse.
Hey Kramnelis I have a few questions for you since you know about mini led displays.

Which one are you using now? And are you satisfied with it? Or do you want something else?

What is the best mini led monitor right now?

What is the best mini led TV right now?

How many displays do you own in your home? What size are they? What rooms are they in and what content are they for?

I have a 50" QN90B because I love the big size. What are the pros and cons of my display, in your opinion?
 
Hey Kramnelis I have a few questions for you since you know about mini led displays.

Which one are you using now? And are you satisfied with it? Or do you want something else?

What is the best mini led monitor right now?

What is the best mini led TV right now?

How many displays do you own in your home? What size are they? What rooms are they in and what content are they for?

I have a 50" QN90B because I love the big size. What are the pros and cons of my display, in your opinion?
At the office, I use a BenQ SW2700PT and a PA32UCG. At home, I use the PG32UQX, PG35VQ, and PG27UQ 95% of the time for games and videos. On my right side there are a Zowie XL2546s and a PG279QM for the 5% of R6 ranked play, if needed. Then there is an AW3423DW as a constant reminder of what a flickering, dim OLED looks like.

The problem with FALD LCD is that better HDR and more color come with a slower image, so I cannot have both at once. I have the top dogs on each side instead. My monitor room is always dim enough to sleep in.

I want these miniLEDs to have more zones, more brightness, faster G-Sync, and a faster backlight. The best professional HDR monitor is still the 60Hz 576-zone Apple Pro Display XDR. The backlight on that monitor is faster than on both the PA32UCG and PA32UCR. This is why people won't switch to the 1152-zone PA32UCG when they've used the Apple display for 3 years. You have to give Apple credit, because Apple can design custom T-CON chips that analyze images and control the backlight 10 times faster than the refresh rate. The panel manufacturers aren't chip makers; they can only wait for chips. Maybe one day Samsung can do it with its MagicColor chip.

Mini-LED monitors are grinding down on price. The price is cut, and corners are cut as well. They don't have G-Sync and will have problems here and there, but they at least look like true HDR1000.

32" 1152-zone miniLED 4K 144Hz open cells with a 3rd-party hybrid PWM backlight, such as the INNOCN 32M2V, are already available. It has the best value right now. The KTC M32P10 is available in other regions. The Cooler Master GP32U is made by KTC and will be available soon.

The 32" 576-zone miniLED 4K 144Hz X32FP and PG32UQXR are rather messed up. One hasn't been released; the other has had firmware problems since its release a year ago. But they use DC dimming, which is a better backlight than PWM.

27" 1152/576/384-zone miniLED 4K 144Hz models are also available from many brands, such as the ViewSonic VX2781-4K-PRO, KTC M27P20, and Cooler Master GP27U. You just need to wait and see the reviews for more monitors coming up.

I don't know much about LG. LG probably has a 1560-zone PWM-backlight 4K IPS panel with a faster response time but less color. The Samsung G7/G8 is VA, so it has even less color.

The problem with TVs is that they are not designed to be as accurate as monitors. If a current FALD display tracked the EOTF completely without doing anything else, there would be tons of bloom on small highlights. So it either reduces the brightness of small highlights, lifts the screen a little, or just uses a VA panel. The QN90B's EOTF is not tracked as accurately as the Sony X95K's; Samsung usually likes to make highlights pop more. But when content calls for 2000 nits, the QN90B can be more accurate, since the X95K needs some time to go from 1400 nits to 2000 nits.
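For anyone wondering what "tracking the EOTF" means in numbers: HDR10 content is graded against the PQ (SMPTE ST 2084) curve, and a display tracks it when every signal level measures at the absolute luminance that curve asks for, up to wherever the panel has to roll off. Here is a minimal Python sketch of the curve using the published ST 2084 constants; it's just the reference math, not any particular TV's processing.

Code:
# Minimal sketch of the PQ (SMPTE ST 2084) EOTF that HDR10 content is graded to.
# A display "tracks EOTF" when its measured luminance at each signal level matches
# this curve up to the panel's roll-off point. Constants are the published values.

m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ signal (0..1) to absolute luminance in nits (cd/m^2)."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for sig in (0.5, 0.6, 0.7, 0.75, 0.9):
    print(f"PQ signal {sig:.2f} -> {pq_eotf(sig):7.1f} nits")
# A signal of ~0.75 already asks for ~1000 nits, which is why panels that stop at
# a few hundred nits have to tone-map the top of the curve instead of tracking it.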
 
The thing is, most people in this thread aren't shoving things down anyone's throat. Speaking only for myself, most, if not all, of my posts state my opinions (as opinions), the objective reasons why I prefer things the way I do, but completely agree that people with different priorities might prefer a different display technology, and for good reason!

Meanwhile, if someone like myself (and another person earlier in the thread) tried a FALD display, didn't end up keeping it because it didn't end up fitting our needs all that well or had too many compromises/issues for our use cases, and returned it, but then found an OLED and liked it, we're accused of being liars who [paraphrasing] 'probably never even owned a FALD monitor.' I've also been told I don't know how to use a monitor, when I very much do. It's easy to say we're just talking s*** about monitors and not to take things too seriously, but that perspective seems to be coming one way, at least in this thread. Either you want to debate the actual issues/science, or you don't, and spouting nonsense repeatedly and making accusations is decidedly in the "don't" camp.

Then there's the phrase "better images" to describe things like sRGB shoehorned into a wider color gamut, which is at best subjective, but many people, including most professionals, would call that objectively less accurate to the source material. It may look better to some people based on OPINION, and that's 100% fine if you prefer to view something like that, but it is not the game-over, objectively and clearly superior image that nobody can ever challenge, which is how it's being presented. To a lot of us, even with the best monitor in the world, doing so still wouldn't be *desirable*. Preference comes into play here, and telling people they definitively don't see "better images" because they prefer to see them in their native colorspace is just weird. If you like wide color gamuts or faux HDR, you do you, but if I want to view sRGB as sRGB, I'm certainly not wrong in doing that, and there's objective science behind it as well.

Other comments like people preferring "HDR Effect" on the LG OLED seem to stem from *one* person saying they liked that mode better than native HDR, which could have a ton of reasons that were never gotten into. Most monitors have a ton of different modes some people like and some people don't. An "Auto HDR" mode on monitors and TVs isn't that uncommon, and many people like it. It's not for me, but it's a nice option for people to have who prefer that look.

I can only speak for myself, but I think it's great some people prefer FALD and some people prefer OLED depending on their uses. Heck, I was hell-bent on going FALD until I tried one and realized I just wasn't having a good experience for the majority of my uses. The OLED was the third monitor I tried, and instantly I liked it better than the other two and that's only grown. But I've also known people who had the opposite experience - they tried the OLED, had issues with brightness for their situation or what-have-you, and returned it for a miniLED option, and that makes sense for them. I love the OLED, but if I didn't, it would have gone back too. There's no right or wrong answer here for everyone. Each technology does a lot right/potentially best, and each has their flaws and drawbacks, some of which could be dealbreakers.

My annoyance only comes when people are dogmatic about things like viewing things "worse" if not converted to a higher range, which is especially silly when it goes against most industry standards. I certainly have my opinions on the best way to view certain content too, and often backed up by some objective arguments (and I do enjoy debating these things in general), but I'd never tell someone if they wanted to view something a certain way, or liked a certain technology because of its strengths in certain areas, they need to change their mind because they're wrong and viewing things "worse" because they don't have "better eyes" like mine. That's silly, counterproductive, and it dismisses subjective preference.

Oh, as to FALD mini-LED TVs, I don't think you can go wrong with Sony's personally. I have a Sony Z9F from a few years back and have been very pleased for both gaming and movies/TV. A friend who also has a Sony loves his too. Their local dimming algorithm is among the best out there. There's still some occasional blooming or dimming of small bright objects like stars, but I don't think you can completely get away from that on any FALD - I don't find it distracting in most uses, though. I've never owned a Samsung TV, so I can't really compare, but if you're checking out new TVs, I really like HDTVTest and their reviews, which can be found on YouTube.
 
The thing is, most people in this thread aren't shoving things down anyone's throat. Speaking only for myself, most, if not all, of my posts state my opinions (as opinions), the objective reasons why I prefer things the way I do, but completely agree that people with different priorities might prefer a different display technology, and for good reason!

Meanwhile, if someone like myself (and another person earlier in the thread) tried a FALD display, didn't end up keeping it because it didn't end up fitting our needs all that well or had too many compromises/issues for our use cases, and returned it, but then found an OLED and liked it, we're accused of being liars who [paraphrasing] 'probably never even owned a FALD monitor.' I've also been told I don't know how to use a monitor, when I very much do. It's easy to say we're just talking s*** about monitors and not to take things too seriously, but that perspective seems to be coming one way, at least in this thread. Either you want to debate the actual issues/science, or you don't, and spouting nonsense repeatedly and making accusations is decidedly in the "don't" camp.

Then there's the phrase "better images" to describe things like sRGB shoehorned into a wider color gamut, which is at best subjective, but many people, including most professionals, would call that objectively less accurate to the source material. It may look better to some people based on OPINION, and that's 100% fine if you prefer to view something like that, but it is not the game-over, objectively and clearly superior image that nobody can ever challenge, which is how it's being presented. To a lot of us, even with the best monitor in the world, doing so still wouldn't be *desirable*. Preference comes into play here, and telling people they definitively don't see "better images" because they prefer to see them in their native colorspace is just weird. If you like wide color gamuts or faux HDR, you do you, but if I want to view sRGB as sRGB, I'm certainly not wrong in doing that, and there's objective science behind it as well.

Other comments like people preferring "HDR Effect" on the LG OLED seem to stem from *one* person saying they liked that mode better than native HDR, which could have a ton of reasons that were never gotten into. Most monitors have a ton of different modes some people like and some people don't. An "Auto HDR" mode on monitors and TVs isn't that uncommon, and many people like it. It's not for me, but it's a nice option for people to have who prefer that look.

I can only speak for myself, but I think it's great some people prefer FALD and some people prefer OLED depending on their uses. Heck, I was hell-bent on going FALD until I tried one and realized I just wasn't having a good experience for the majority of my uses. The OLED was the third monitor I tried, and instantly I liked it better than the other two and that's only grown. But I've also known people who had the opposite experience - they tried the OLED, had issues with brightness for their situation or what-have-you, and returned it for a miniLED option, and that makes sense for them. I love the OLED, but if I didn't, it would have gone back too. There's no right or wrong answer here for everyone. Each technology does a lot right/potentially best, and each has their flaws and drawbacks, some of which could be dealbreakers.

My annoyance only comes when people are dogmatic about things like viewing things "worse" if not converted to a higher range, which is especially silly when it goes against most industry standards. I certainly have my opinions on the best way to view certain content too, and often backed up by some objective arguments (and I do enjoy debating these things in general), but I'd never tell someone if they wanted to view something a certain way, or liked a certain technology because of its strengths in certain areas, they need to change their mind because they're wrong and viewing things "worse" because they don't have "better eyes" like mine. That's silly, counterproductive, and it dismisses subjective preference.

Oh, as to FALD mini-LED TVs, I don't think you can go wrong with Sony's personally. I have a Sony Z9F from a few years back and have been very pleased for both gaming and movies/TV. A friend who also has a Sony loves his too. Their local dimming algorithm is among the best out there. There's still some occasional blooming or dimming of small bright objects like stars, but I don't think you can completely get away from that on any FALD - I don't find it distracting in most uses, though. I've never owned a Samsung TV, so I can't really compare, but if you're checking out new TVs, I really like HDTVTest and their reviews, which can be found on YouTube.
Are you even seeing sRGB accurately on your OLED?

You have 160 nits instead of 80 nits lol. sRGB specifies 80 nits as the reference luminance.

It's more like you are spouting nonsense you don't realize, with whatever excuses backed up by display industry standards you don't understand, just to cover for the inability of OLED, where HDR is dimmer than SDR.

If you knew the sRGB industry standard just a little better, you would've used the monitor at 80 nits for the most accuracy.

But obviously your eyes refuse to see such an accurate but limited range.
 
Are you even seeing sRGB accurately on your OLED?

You have 160 nits instead of 80 nits lol. sRGB specifies 80 nits as the reference luminance.

It's more like you are spouting nonsense you don't realize, with whatever excuses backed up by display industry standards you don't understand, just to cover for the inability of OLED, where HDR is dimmer than SDR.

If you knew the sRGB industry standard just a little better, you would've used the monitor at 80 nits for the most accuracy.

But obviously your eyes refuse to see such an accurate but limited range.

Since some of this hasn't been covered fully before, I'll respond.

Your narrow focus on 80 nits is a straw-man argument. You're trying to say because I value accuracy, then ANY deviation from reference is unacceptable, and that's never been what I said, even from the start. One of my first posts here talks about how I prefer slightly brighter than reference and calibrate to that. So I calibrated the display to 160 nits, just as I did on my TV. Reference for calibrators is generally 100 nits, and many calibrators will calibrate to a bit more depending on the room setup. 160 is for ambient lighting variances and personal preference (it also tends to be a common standard for Adobe RGB, which you talk about a lot, not that it matters for sRGB). If someone is working as a colorist in a fully light-controlled setup, 80 (or possibly 100) nits might be preferable, but for real-world use in a non-pitch-black room, 160 maintains the vast, vast majority of the accuracy while still giving me a pleasant brightness level for all possible room conditions. While I have relatively good light control in my room, it does vary some in the daytime, and I also use a backlight even in a dark room. Especially when calibrated to that threshold (I did sRGB, 160 nits, 2.2 gamma), it turned out quite accurate in the report (both my TV and now the monitor) and things like skin tones look natural and accurate, etc.

Pretending a slightly brighter calibration for sRGB is in any way equivalent to shoehorning sRGB into a wider gamut is pure misinformation and misrepresentation of the facts. Both positions are personal preference, but the former doesn't change accuracy in a significant way, whereas the latter does, more or less depending on how it's done. Either you're shifting colors completely by simply displaying the sRGB in the incorrect color space, which will fully make things look inaccurate and oversaturated, or you're using "Auto HDR" to convert sRGB into HDR with an algorithm for a sort of pseudo-HDR, which can have a nice effect and look quite good, but can also have compromises depending on the image and how good the algorithm is. In my case, I thought contrast in a game wasn't as good and I didn't like the overly bright menus. Like I said, I have no problem with Auto HDR for those who prefer it - it's there for a reason. But I've seen negative effects I don't like personally, so I prefer straight sRGB as I think that looks best.
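For what it's worth, the "wrong color space" case is easy to put rough numbers on. The sketch below takes an sRGB-encoded color, lets a wide-gamut panel display those same values with its native primaries (Display P3 here, purely as an example), and then expresses what the viewer actually sees back in sRGB terms. The matrices are the standard D65 ones, and the sample color is an arbitrary skin-tone-ish value, not anyone's measurement.

Code:
import numpy as np

# sRGB content displayed without a gamut clamp on a wide-gamut panel: the encoded
# values are decoded to linear light and emitted with the panel's (Display P3)
# primaries instead of sRGB's. Converting that light back into sRGB terms shows
# the saturation shift. Standard D65 matrices; the sample color is illustrative.
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
P3_TO_XYZ = np.array([[0.4866, 0.2657, 0.1982],
                      [0.2290, 0.6917, 0.0793],
                      [0.0000, 0.0451, 1.0439]])

def srgb_decode(c):
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def srgb_encode(l):
    l = np.clip(l, 0.0, 1.0)
    return np.where(l <= 0.0031308, 12.92 * l, 1.055 * l ** (1 / 2.4) - 0.055)

intended = np.array([224, 172, 138]) / 255.0        # a skin-tone-ish sRGB color
linear = srgb_decode(intended)
seen_xyz = P3_TO_XYZ @ linear                       # panel emits P3 primaries
seen_srgb = srgb_encode(np.linalg.inv(SRGB_TO_XYZ) @ seen_xyz)

print("intended sRGB:", np.round(intended * 255).astype(int))   # [224 172 138]
print("seen (approx):", np.round(seen_srgb * 255).astype(int))  # redder / more saturated

That shift is exactly the oversaturated skin tones people notice when sRGB content is stretched across a wide gamut; an sRGB clamp or a color-managed app is what prevents it.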

My whole point is that sRGB, where I spend the majority of my time, has several advantages on an OLED, whereas FALD has significant advantages in high-nit HDR. I'm still pleased with the tone-mapping of HDR into the capabilities of my OLED display, and I personally view it as less of a compromise than the blooming and viewing angle issues I had with bigger FALD displays. It's understandable that people who have use cases more in high-nit HDR prefer FALD, though. Both can be excellent displays in their own right depending on use. That's why I just don't get your dogmatic [paraphrasing] 'FALD has better images' position. In HDR, you're right, FALD has the advantage. In sRGB/SDR, OLED has the advantage. Where you spend the most of your time very well may be the deciding factor for most people. The only area we really disagree on is tone mapping, and I think HDR still looks good (albeit not as good as FALD) on the OLED; you don't, and that's fine; I never argued you have to agree; I only argued that the monitor is capable of HDR, which it is. There are really no facts here up for debate - only opinions, which we simply disagree on.
 
Since some of this hasn't been covered fully before, I'll respond.

Your narrow focus on 80 nits is a straw-man argument. You're trying to say because I value accuracy, then ANY deviation from reference is unacceptable, and that's never been what I said, even from the start. One of my first posts here talks about how I prefer slightly brighter than reference and calibrate to that. So I calibrated the display to 160 nits, just as I did on my TV. Reference for calibrators is generally 100 nits, and many calibrators will calibrate to a bit more depending on the room setup. 160 is for ambient lighting variances and personal preference (it also tends to be a common standard for Adobe RGB, which you talk about a lot, not that it matters for sRGB). If someone is working as a colorist in a fully light-controlled setup, 80 (or possibly 100) nits might be preferable, but for real-world use in a non-pitch-black room, 160 maintains the vast, vast majority of the accuracy while still giving me a pleasant brightness level for all possible room conditions. While I have relatively good light control in my room, it does vary some in the daytime, and I also use a backlight even in a dark room. Especially when calibrated to that threshold (I did sRGB, 160 nits, 2.2 gamma), it turned out quite accurate in the report (both my TV and now the monitor) and things like skin tones look natural and accurate, etc.

Pretending a slightly brighter calibration for sRGB is in any way equivalent to shoehorning sRGB into a wider gamut is pure misinformation and misrepresentation of the facts. Both positions are personal preference, but the former doesn't change accuracy in a significant way, whereas the latter does, more or less depending on how it's done. Either you're shifting colors completely by simply displaying the sRGB in the incorrect color space, which will fully make things look inaccurate and oversaturated, or you're using "Auto HDR" to convert sRGB into HDR with an algorithm for a sort of pseudo-HDR, which can have a nice effect and look quite good, but can also have compromises depending on the image and how good the algorithm is. In my case, I thought contrast in a game wasn't as good and I didn't like the overly bright menus. Like I said, I have no problem with Auto HDR for those who prefer it - it's there for a reason. But I've seen negative effects I don't like personally, so I prefer straight sRGB as I think that looks best.

My whole point is that sRGB, where I spend the majority of my time, has several advantages on an OLED, whereas FALD has the advantage in high-nit HDR. I'm still pleased with the tone-mapping of HDR into the capabilities of my OLED display, and I personally view it as less of a compromise than the blooming and viewing angle issues I had with bigger FALD displays. It's understandable that people who have use cases more in high-nit HDR prefer FALD, though. That's why I just don't get your dogmatic [paraphrasing] 'FALD has better images' position. In HDR, you're right, FALD has the advantage. In sRGB/SDR, OLED has the advantage. Where you spend the most of your time very well may be the deciding factor for most people. The only area we really disagree on is tone mapping, and I think HDR still looks good (albeit not as good as FALD) on the OLED; you don't, and that's fine; I never argued you have to agree; I only argued that the monitor is capable of HDR, which it is. There are really no facts here up for debate - only opinions, which we simply disagree on.
You are the one who values accuracy so much in sRGB while ignoring accuracy in HDR. Why don't you value accuracy even a little bit more? You are not even seeing the most accurate sRGB at 80 nits. It's obviously your preference; your eyes want to see a higher range.

I already told you there are better SDR images than just sRGB. Eyes like high-range images because eyes can see a lot more. This is why HDR exists.

It's a similar reason why eyes like high refresh rates. These are the simple truths you denied in the first place. I have seen many times that people will eventually shoot themselves in the foot when denying the truth. All I see is some guy who doesn't want to admit how dim his OLED is, with all these excuses and preferences.
 
You are the one who values accuracy so much in sRGB while ignoring accuracy in HDR. Why don't you value accuracy even a little bit more? You are not even seeing the most accurate sRGB at 80 nits. It's obviously your preference; your eyes want to see a higher range.

I already told you there are better SDR images than just sRGB. Eyes like high-range images because eyes can see a lot more. This is why HDR exists.

It's a similar reason why eyes like high refresh rates. These are the simple truths you denied in the first place. I have seen many times that people will eventually shoot themselves in the foot when denying the truth. All I see is some guy who doesn't want to admit how dim his OLED is, with all these excuses and preferences.

80 nits sRGB is for a very dark/pitch black room. It doesn't take into account real-world lighting conditions. So I go for accuracy as much as possible at my chosen brightness level, which calibration to 160 nits accomplishes, and that still gets me very low dE values. I never said there isn't an "accurate enough". It's also why professional calibrators calibrate to different brightness levels depending on the situation. If I wanted total accuracy, I could also build a totally dark theater room and go for a reference-level monitor, but there are diminishing returns. sRGB at 160 nits is accurate enough to me that colors and skin tones still look completely natural, and the slightly brighter than reference picture is pleasing to me. Again, it's a strawman to say just because someone values accuracy that they HAVE TO BE 100% ACCURATE IN ALL THINGS OR ELSE THEY ARE LYING!
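To put a rough number on "accurate to my chosen brightness": with an SDR target, the white level is just a scale factor on the whole tone curve, so the relative tracking that grayscale/dE checks measure is the same whether you calibrate to 80 or 160 nits. A tiny sketch, assuming a plain gamma 2.2 target (actual calibration software obviously does far more than this):

Code:
# White level as a pure scale factor under a gamma 2.2 SDR target: every video
# level doubles in luminance going from an 80-nit to a 160-nit calibration, so
# the relative tone curve (what grayscale/dE tracking measures) is unchanged.
GAMMA = 2.2

def target_nits(level: int, white_nits: float) -> float:
    """Target luminance for an 8-bit level under a pure power-law target."""
    return white_nits * (level / 255) ** GAMMA

for level in (32, 64, 128, 192, 255):
    at80, at160 = target_nits(level, 80.0), target_nits(level, 160.0)
    print(f"level {level:3d}: {at80:6.2f} nits @ 80-nit cal, "
          f"{at160:6.2f} nits @ 160-nit cal, ratio {at160 / at80:.1f}")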

Almost all desktop-oriented content is in sRGB (programs, photos, SDR games). It can't be made "better" via magic, and trying to show sRGB content in a wide gamut distorts colors. Nobody's arguing native HDR isn't better. It is, clearly. But it's still not the majority of content, nor is it particularly standardized at the moment. I'm playing a brand new game right now that doesn't support HDR. A bunch of games in the past have come out with broken HDR until later on. And I already have a high-nit TV for HDR/Dolby Vision television and movie content. (Not that HDR YouTube videos don't still look great on this display, even if they get tone-mapped.) All the things you claim people are denying, they're not. I'm well aware OLED has its limitations, but they're ones more compatible with my use cases. To put it another way, why in the world would I prioritize high-nit content, which is less than 20% of what I do, and compromise sRGB content, which is more than 80% of what I do? It would make no sense. That's why OLED is the better choice for me.
 
FALD is expensive tech which gets much more expensive as the number of zones increases.
OLEDs are comparatively cheap to make, and I suspect their price will decrease and their popularity increase.
FALD, I suspect, will only get more expensive as the number of zones increases...

Anyway, for desktop usage OLEDs are not the greatest for a multitude of reasons:
- burn-in and the need to use burn-in prevention measures
- insufficient bit-depth precision at lower brightness levels (see the sketch at the end of this post)
- WOLEDs have near-black chrominance overshoot issues, especially visible when used at lower brightness levels
- the subpixel structure is often not RGB

For me they won't replace LCD until these issues are resolved.
Also, for me brightness is not an issue. HDR could be better, but the same can be said for the HDR implementation in Windows and in games. For now, HDR seems like more bother than it's really worth to me. I also see HDR as more usable on bigger displays than typical monitor sizes.
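On the bit-depth point above, here's a rough sketch of why precision gets tight near black: under a gamma-type response, the luminance step between adjacent code values is a much bigger relative jump at the bottom of the range than at the top. This assumes a pure gamma 2.2 response, 10-bit drive, and a 150-nit white purely for illustration; real panels add dithering and their own processing on top.

Code:
# Relative luminance step between adjacent 10-bit codes, assuming a pure gamma 2.2
# response and a 150-nit SDR white. Illustrative only; real panels dither and
# process, but the shape of the problem (big relative jumps near black) is the same.
PEAK_NITS, GAMMA, MAX_CODE = 150.0, 2.2, 1023

def nits(code: int) -> float:
    return PEAK_NITS * (code / MAX_CODE) ** GAMMA

for code in (4, 16, 64, 256, 1020):
    step = nits(code + 1) - nits(code)
    print(f"code {code:4d}: {nits(code):9.5f} nits, next step +{step:8.5f} nits "
          f"({100 * step / nits(code):5.1f}% jump)")
# Near black each step is a double-digit-percent jump in luminance, which is where
# banding and near-black artifacts tend to show up first.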
 
FALD is expensive tech which gets much more expensive as the number of zones increases.
OLEDs are comparatively cheap to make, and I suspect their price will decrease and their popularity increase.
FALD, I suspect, will only get more expensive as the number of zones increases...

It's definitely going to be interesting to see how both technologies evolve.


Anyway, for desktop usage OLEDs are not the greatest for a multitude of reasons:
- burn-in and the need to use burn-in prevention measures
- insufficient bit-depth precision at lower brightness levels
- WOLEDs have near-black chrominance overshoot issues, especially visible when used at lower brightness levels
- the subpixel structure is often not RGB

These are some really good points. None of them were dealbreakers for me personally but I can certainly see how they could be:

- I'm concerned about burn-in, but I've read from multiple users that OLED has come a long way in this regard. It's a concern, but it's been so long since I last tried an OLED, after that first TV burn-in experience, that I wanted to give it another shot. I am taking some burn-in prevention measures, but none I find too inconvenient. It's no big deal to have the screen go off after 5 minutes or hide the taskbar/go full screen at times. I changed the desktop background anyway. Still getting used to not showing all desktop icons, but it was a small adjustment.

- I'm not extremely well-versed in bit depth precision at lower brightness levels; that said, I haven't noticed anything specifically on dimmer images.

- I haven't really noticed any near black chrominance overshoot issues on my LG, and Vincent's recent review mentioned it being kept to a minimum, so I'm not sure how much of a problem this is on newer panels.

- A lot of people have issues with text readability on panels like mine. I can notice some fringing if I look closely, but for whatever reason, it doesn't tend to bother me. If I was sensitive to it, though, I could see how it'd be a dealbreaker. It's sort of funny how different people are sensitive to different things and other things don't bother them at all. Many people don't mind the blooming in sRGB on FALD displays, but it was a dealbreaker for me.


For me they won't replace LCD until these issues are resolved.
Also, for me brightness is not an issue. HDR could be better, but the same can be said for the HDR implementation in Windows and in games. For now, HDR seems like more bother than it's really worth to me. I also see HDR as more usable on bigger displays than typical monitor sizes.

I'd definitely like to see these resolved and a more standard RGB structure in the future. That, additional brightness, and even better burn-in resistance would make OLED even more appealing, but I know some of that is contradictory with current tech.
 
So you don't care about accurate images. And you don't see better images on CAD either lol.

Accurate images aren't important for the work I do. I never said I don't care about it elsewhere. Tell me how colour accuracy is important for mostly monochrome technical drawings. Drawings that are annotated on white sheet backgrounds where excessive brightness would do my eyes in. Why on earth would I want to work in high brightness HDR for this, all day every day? That's insanity, and is in no way a "better image".

Stretching sRGB with a wide gamut is a good thing because it gives better images at a higher range. I already said people found the "HDR Effect" SDR mode better than whatever the dimmer HDR is on the 27GR95QE. That's how ironically useful wide-gamut SDR is. That dim OLED proves the point through its own users.

So you couldn't care less about accuracy either then. "Better" in this case is entirely subjective but it's an undeniably fake image. I'm under no illusion that this TV displays "better" images than a properly calibrated monitor.

Funny, I'm never angry at anything. Why do you think I'm angry, when I have been seeing better images for several years already? All I need to do is speak some truth to your face that you don't understand.

There's that "Funny..." response again. And that final sentence definitely has some anger to it. Or impenetrable ego. Or a bit of both.
 
- I'm concerned about burn-in, but I've read from multiple users that OLED has come a long way in this regard. It's a concern, but it's been so long since I last tried an OLED, after that first TV burn-in experience, that I wanted to give it another shot. I am taking some burn-in prevention measures, but none I find too inconvenient. It's no big deal to have the screen go off after 5 minutes or hide the taskbar/go full screen at times. I changed the desktop background anyway. Still getting used to not showing all desktop icons, but it was a small adjustment.
Rtings has started releasing data from their burn-in test, now two months in, which consists of not only OLEDs from LG/Samsung but also mini-LED TVs. Interestingly, the LG OLEDs seem to do better here, whereas the Samsung S95B QD-OLED already seems to be exhibiting slight signs of burn-in. You can see the test setup here, and clicking on the TV model names takes you to the review section with the current result pics and graphs. That doesn't mean you will see these problems in typical use.

From personal experience, I never saw burn-in on the LG C9 65" I had, and on the LG CX 48" that has seen 2 years of desktop use there is still no sign of burn-in after over 2.5 years.

- A lot of people have issues with text readability on panels like mine. I can notice some fringing if I look closely, but for whatever reason, it doesn't tend to bother me. If I was sensitive to it, though, I could see how it'd be a dealbreaker. It's sort of funny how different people are sensitive to different things and other things don't bother them at all. Many people don't mind the blooming in sRGB on FALD displays, but it was a dealbreaker for me.
With the large LG OLEDs, the required longer viewing distance and 125% DPI scaling definitely help a lot. That's why I don't want any of the 1440p OLEDs.
I'd definitely like to see these resolved and a more standard RGB structure in the future. That, additional brightness, and even better burn-in resistance would make OLED even more appealing, but I know some of that is contradictory with current tech.
Yeah, I'm also hoping Windows will start supporting arbitrary pixel structures for ClearType so the issue can at least be solved for most applications. I also hope that e.g. JOLED panels, which do have an RGB structure, start to come out as 4K 120+ Hz models and support higher brightness.

It feels like we are at a bit of a standstill at this point, with none of the display technologies getting significantly better, and micro-LED still seems at least 5 years away from being a thing in consumer-grade TVs, let alone monitors.

I'm interested to see how the Samsung 57" Neo G9 8K x 2K super-ultrawide mini-LED I'm planning to buy fares in HDR. The previous 5120x1440 Neo G9 has significantly higher peak/sustained brightness than the 4K Neo G8 when talking about larger window sizes. Either the 4K res limits the light output enough, or Samsung tried much harder to avoid blooming on the Neo G8. The new Neo G9 will probably cost reasonably close to the PG32UQX, but I'm far more willing to pay that price when it also comes with a ton of desktop space, ultrawide gaming, and a higher refresh rate, and I'm totally fine with taking a hit on HDR performance since it's about 20% of my usage at best.
 
80 nits sRGB is for a very dark/pitch black room. It doesn't take into account real-world lighting conditions. So I go for accuracy as much as possible at my chosen brightness level, which calibration to 160 nits accomplishes, and that still gets me very low dE values. I never said there isn't an "accurate enough". It's also why professional calibrators calibrate to different brightness levels depending on the situation. If I wanted total accuracy, I could also build a totally dark theater room and go for a reference-level monitor, but there are diminishing returns. sRGB at 160 nits is accurate enough to me that colors and skin tones still look completely natural, and the slightly brighter than reference picture is pleasing to me. Again, it's a strawman to say just because someone values accuracy that they HAVE TO BE 100% ACCURATE IN ALL THINGS OR ELSE THEY ARE LYING!

Almost all desktop-oriented content is in sRGB (programs, photos, SDR games). It can't be made "better" via magic, and trying to show sRGB content in a wide gamut distorts colors. Nobody's arguing native HDR isn't better. It is, clearly. But it's still not the majority of content, nor is it particularly standardized at the moment. I'm playing a brand new game right now that doesn't support HDR. A bunch of games in the past have come out with broken HDR until later on. And I already have a high-nit TV for HDR/Dolby Vision television and movie content. (Not that HDR YouTube videos don't still look great on this display, even if they get tone-mapped.) All the things you claim people are denying, they're not. I'm well aware OLED has its limitations, but they're ones more compatible with my use cases. To put it another way, why in the world would I prioritize high-nit content, which is less than 20% of what I do, and compromise sRGB content, which is more than 80% of what I do? It would make no sense. That's why OLED is the better choice for me.
Just turn off the lights at night lol. The solution is that cheap. You can totally do sRGB at 80 nits. Why don't you do it, if you care so much about sRGB accuracy?

Obviously sRGB at 80 nits looks even more lifeless at the accurate brightness. You just proved that even you want to see a lifted sRGB at 160 nits.

Monitor capability has gone far beyond just sRGB. That's the whole reason there are multiple SDR modes, especially on gaming monitors, such as an FPS mode or whatever mode, to boost the color and contrast beyond the sRGB gamut and provide a more suitable usage scenario. Mere accuracy on lifeless sRGB doesn't matter that much. The sRGB range is pathetically low.

But there is a difference between being able to choose to do it and having no choice at all. It's on the OLED that you have no choice: no brightness, no wider gamut to do SDR at 400 nits or HDR1000. Again, all I see is just some guy who doesn't accept how dim his OLED is.
 
Just turn off the lights at night lol. The solution is that cheap. You can totally do sRGB at 80 nits. Why don't you do it, if you care so much about sRGB accuracy?

Obviously sRGB at 80 nits looks even more lifeless at the accurate brightness. You just proved that even you want to see a lifted sRGB at 160 nits.

Backlights are better for preventing eyestrain, and I can't control daylight completely during the day. As usual, you're making a false argument; the accuracy lost by running sRGB at a slightly higher brightness, especially when you run the calibration TO that brightness, is minimal. The fact that professional calibrators even calibrate to different brightnesses according to the room/person's preferences attests to that. My calibration report results were great. That's what I care about. It's still highly accurate for sRGB, and I'm happy with that.

Monitor capability has gone far beyond just sRGB. That's the whole reason there are multiple SDR modes, especially on gaming monitors, such as an FPS mode or whatever mode, to boost the color and contrast beyond the sRGB gamut and provide a more suitable usage scenario. Mere accuracy on lifeless sRGB doesn't matter that much. The sRGB range is pathetically low.

But there is a difference between being able to choose to do it and having no choice at all. It's on the OLED that you have no choice: no brightness, no wider gamut to do SDR at 400 nits or HDR1000. Again, all I see is just some guy who doesn't accept how dim his OLED is.

It's not about monitor capability; it's about the majority of content (web, photos, etc.) being made with sRGB in mind. That's how I want to view it, period. I don't have any wish to boost the gamut, even if this was a reference-grade monitor. That's what you don't seem to understand. I know I COULD. I know it would look more colorful. It'll also throw all the colors off and not look correct/natural. Skin tones are a great example. Again, I want to see, within reason, the intention behind the image. If you want to expand the color gamut, more power to you. To me, that's the same thing as somebody buying a new TV and preferring Vivid (well, not exactly - even Vivid uses the native color space). It's not something I want to do personally, but if you do, go for it.

This OLED isn't particularly dim. I'm more than happy with the brightness. Hell, in SDR, I can go a good bit brighter than I'm currently set at. Also, it fully supports the wider gamut, so I'm not sure what you're on about. Again, the high-nit HDR applications are the things I do LEAST, so it doesn't bother me that much it doesn't get that bright. It's one of the brightest OLEDs for HDR yet and does a serviceable job with tone-mapping until something better comes along in all areas, and that's enough for me.
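Since tone mapping keeps coming up: all it really means is that luminance above what the panel can produce gets compressed instead of clipped. A toy sketch of the usual shape, tracking the source 1:1 up to a knee and rolling off toward the panel's peak; the 650-nit peak is an assumed number and this is illustrative only, not what any specific monitor's firmware does.

Code:
# Toy luminance tone-mapping curve: track the source 1:1 up to a knee, then roll
# off smoothly toward the panel peak so detail above the peak is compressed rather
# than clipped. Illustrative shape only, not any particular display's algorithm.
PANEL_PEAK = 650.0            # assumed peak nits for an OLED-ish panel
KNEE = 0.65 * PANEL_PEAK      # below this, track the source exactly

def tone_map(source_nits: float) -> float:
    if source_nits <= KNEE:
        return source_nits                       # 1:1 region: accurate midtones
    # Compress everything above the knee into the remaining headroom.
    headroom = PANEL_PEAK - KNEE
    excess = source_nits - KNEE
    return KNEE + headroom * excess / (excess + headroom)

for src in (100, 400, 650, 1000, 2000, 4000):
    print(f"{src:5d} nits in source -> {tone_map(src):6.1f} nits on panel")
# Highlights keep some separation instead of all clipping at the panel peak,
# which is why tone-mapped HDR can still look convincing on a dimmer display.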
 
Accurate images aren't important for the work I do. I never said I don't care about it elsewhere. Tell me how colour accuracy is important for mostly monochrome technical drawings. Drawings that are annotated on white sheet backgrounds where excessive brightness would do my eyes in. Why on earth would I want to work in high brightness HDR for this, all day every day? That's insanity, and is in no way a "better image".
Your work is CAD. You think a floor plan can look as good as real life? This is why you think it doesn't matter whatever colorspace you have. You don't even understand HDR. HDR can have various ranges. You can have any brightness in HDR.

So you couldn't care less about accuracy either then. "Better" in this case is entirely subjective but it's an undeniably fake image. I'm under no illusion that this TV displays "better" images than a properly calibrated monitor.
sRGB gives the same lifeless images on every office monitor at the lowest standard. The standard is too old and too limited. That limited range won't give a better image. I don't play games in sRGB mode. I might watch some movies in it when they don't have HDR, but I will most likely grade the movie myself to see a better range.

There's that "Funny..." response again. And that final sentence definitely has some anger to it. Or impenetrable ego. Or a bit of both.
Funny all I see is someone who cannot see better.
 
Your work is CAD. You think a floor plan can look as good as real life?

Proof you don't read what people write.

I work with monochrome structural drawings. I don't make pretty pictures attempting to model the real look of internal/external design. I'm not an architect.

You don't even understand HDR.

The way you presume people's knowledge or intention is utterly foul.

sRGB gives the same lifeless images on every office monitor at the lowest standard. The standard is too old and too limited. That limited range won't give a better image.

Why does it matter if someone is using a display as nothing more than an office screen?

Funny all I see is someone who cannot see better.

Foul.
 
Backlights are better for preventing eyestrain, and I can't control daylight completely during the day. As usual, you're making a false argument; the accuracy lost by running sRGB at a slightly higher brightness, especially when you run the calibration TO that brightness, is minimal. The fact that professional calibrators even calibrate to different brightnesses according to the room/person's preferences attests to that. My calibration report results were great. That's what I care about. It's still highly accurate for sRGB, and I'm happy with that.
160-nit sRGB is not as accurate as 80-nit sRGB. And you can totally do it. Where is your industry standard again? You talk with all that seriousness but end up at 160 nits instead.

You just don't care that much about accuracy. You don't even care much about accuracy on the really important HDR lol.

Again, all the excuses you use are only to deny how dim your OLED is.

It's not about monitor capability; it's about the majority of content (web, photos, etc.) being made with sRGB in mind. That's how I want to view it, period. I don't have any wish to boost the gamut, even if this was a reference-grade monitor. That's what you don't seem to understand. I know I COULD. I know it would look more colorful. It'll also throw all the colors off and not look correct/natural. Skin tones are a great example. Again, I want to see, within reason, the intention behind the image. If you want to expand the color gamut, more power to you. To me, that's the same thing as somebody buying a new TV and preferring Vivid. It's not something I want to do personally, but if you do, go for it.

This OLED isn't particularly dim. I'm more than happy with the brightness. Hell, in SDR, I can go a good bit brighter than I'm currently set at. Also, it fully supports the wider gamut, so I'm not sure what you're on about. Again, the high-nit HDR applications are the things I do LEAST, so it doesn't bother me that much it doesn't get that bright. It's one of the brightest OLEDs for HDR yet and does a serviceable job with tone-mapping until something better comes along in all areas, and that's enough for me.

It's all about the monitor's capability to see better. It's about OLED being a lot worse. You cannot use OLED to see a higher range because it is that dim. You don't have many options with OLED, so you can only see sRGB.
 
I work with monochrome structural drawings. I don't make pretty pictures attempting to model the real look of internal/external design. I'm not an architect.
These drawings must look as good as real life, right? You can bust out an old TV to draw them anyway.
 
I'm interested to see how the Samsung 57" Neo G9 8K x 2K super-ultrawide mini-LED I'm planning to buy fares in HDR. The previous 5120x1440 Neo G9 has significantly higher peak/sustained brightness than the 4K Neo G8 when talking about larger window sizes. Either the 4K res limits the light output enough, or Samsung tried much harder to avoid blooming on the Neo G8. The new Neo G9 will probably cost reasonably close to the PG32UQX, but I'm far more willing to pay that price when it also comes with a ton of desktop space, ultrawide gaming, and a higher refresh rate, and I'm totally fine with taking a hit on HDR performance since it's about 20% of my usage at best.
I'm interested in that monitor as well. It may be a while until it becomes available, and even then it's likely to suffer from some significant issues judging by recent Samsung monitors, so not sure if and when I will actually buy it.
 
Then prove to me that CAD looks as good as real life.

What is it about this sort of work that needs anything but the most basic display?

[attached screenshot: Screenshot_20230303_141327.png]


The things you're stressing over are completely irrelevant. "Real life" doesn't come into it.

You've strawman'd so far away from OLED for PC use at this point I don't understand what your goal is.
 
What is it about this sort of work that needs anything but the most basic display?

View attachment 553436

The things you're stressing over are completely irrelevant. "Real life" doesn't come into it.

You've strawman'd so far away from OLED for PC use at this point I don't understand what your goal is.
Does CAD look like real life? Does CAD look vivid and colorful? Does CAD need HDR?

A 50-year-old TV can display it no problem.
 
160-nit sRGB is not as accurate as 80-nit sRGB. And you can totally do it. Where is your industry standard again? You talk with all that seriousness but end up at 160 nits instead.

You just don't care that much about accuracy. You don't even care much about accuracy on the really important HDR lol.

Again, all the excuses you use are only to deny how dim your OLED is.

I have calibrated to 160 nits for literally years (both my TV and my previous monitor). And never in this conversation have I cited 80 nits as desirable for me. This is a bizarre, silly attempt to invalidate liking accuracy if it doesn't follow the reference standard without ANY deviation. That reference standard (and even most calibrators start at 100 instead of 80) doesn't necessarily work well in real-world use/lighting conditions, which is why calibration software offers calibration to different cd/m2 levels in the first place. The whole point is to make it accurate to the brightness level you'd like.


It's all about the monitor's capability to see better. It's about OLED being a lot worse. You cannot use OLED to see a higher range because it is that dim. You don't have many options with OLED, so you can only see sRGB.

OLED isn't "a lot worse"; it's just different with different pros and cons, and is arguably better for sRGB. I spend most of my time viewing sRGB content, so of course I want the display technology that works better for it. Higher range is nice, but it's not the be-all and end-all, and the sacrifice in brightness for high-nit HDR is a worthwhile tradeoff for me, especially since tone mapping works well enough I still find HDR on these displays quite enjoyable.

We're at the point in the conversation where you're repeating without actually making any fact-based arguments and conveniently ignoring anything said or the points brought up you simply don't want to address, so I think I'll go back to not replying to your trolling unless some actual interesting points of debate come up.
 
I'm interested in that monitor as well. It may be a while until it becomes available, and even then it's likely to suffer from some significant issues judging by recent Samsung monitors, so not sure if and when I will actually buy it.
Yeah, it remains to be seen. Personally, I have a decent track record with Samsungs; I had the CRG9 and currently use the 28" G70A. Both had their fair share of quirks, but no real dealbreaker issues for me. So I am hoping that Samsung manages to improve upon the Neo G7/G8 for the new G9, especially since it's going to be a bit of a flagship model in their monitor range for sure.
 
Does CAD look like real life? Does CAD look vivid and colorful? Does CAD need HDR?

A 50-year-old TV can display it no problem.

Alright now I really don't understand what you're getting at. You're basically agreeing with what I'm trying to say.
 
I have calibrated to 160 nits for literally years (both my TV and my previous monitor). And never in this conversation have I cited 80 nits as desirable for me. This is a bizarre, silly attempt to invalidate liking accuracy if it doesn't follow the reference standard without ANY deviation. That reference standard (and even most calibrators start at 100 instead of 80) doesn't necessarily work well in real-world use/lighting conditions, which is why calibration software offers calibration to different cd/m2 levels in the first place. The whole point is to make it accurate to the brightness level you'd like.
Don't forget, it's you who has all the industry standards to back you up. Why don't you use the most accurate sRGB at 80 nits? Eyestrain from OLED at only 80 nits in a pitch-black room?

Even I use sRGB at 80 nits when I do sRGB work. But I won't deny how lifeless it looks compared to wide-gamut SDR with tons more color and brightness. Accuracy in sRGB doesn't mean it can look better in such a limited range lol. It's similar to old movies that were shot in black and white, with color added years later. If you want to see the original with accuracy, then it doesn't have color. That doesn't mean it looks better.

OLED isn't "a lot worse"; it's just different with different pros and cons, and is arguably better for sRGB. I spend most of my time viewing sRGB content, so of course I want the display technology that works better for it. Higher range is nice, but it's not the be-all and end-all, and the sacrifice in brightness for high-nit HDR is a worthwhile tradeoff for me, especially since tone mapping works well enough I still find HDR on these displays quite enjoyable.

We're at the point in the conversation where you're repeating without actually making any fact-based arguments and conveniently ignoring anything said or the points brought up you simply don't want to address, so I think I'll go back to not replying to your trolling unless some actual interesting points of debate come up.
OLED is a lot worse. I know what dim OLED looks like. It's you who denies how dim it is.

It can only show sRGB, which hardly anybody except a few of us is using. Most users are seeing a much higher range than sRGB on their displays to get more vivid images, because it looks better to the eyes. It looks closer to realism. These are better images, provided by displays with more capability at a higher range, not limited by the lowest sRGB.
 
Alright now I really don't understand what you're getting at. You're basically agreeing with what I'm trying to say.
If you have a special need to see low brightness, then you use OLED. Not everybody is doing CAD. OLED can burn in faster than you think on CAD, with its large areas of white.
 
If you have a special need to see low brightness, then you use OLED. Not everybody is doing CAD. OLED can burn in faster than you think on CAD, with its large areas of white.
That's... basically precisely why I don't and won't use OLED. It doesn't match my use case.

Still doesn't make any of what you've been screaming about relevant, while accusing so many people of "not seeing" or having a lack of knowledge because they've chosen a display tech you don't like.
 
That's... basically precisely why I don't and won't use OLED. It doesn't match my use case.

Still doesn't make any of what you've been screaming about relevant, while accusing so many people of "not seeing" or having a lack of knowledge because they've chosen a display tech you don't like.
How many people exactly? Like 1 or 2 with special needs? Just check how the market goes. OLED will be lucky to reach a 5% user base. It doesn't have better images.
 
How many people exactly? Like 1 or 2 with special needs? Just check how the market goes. OLED will be lucky to reach a 5% user base. It doesn't have better images.
Why do you keep repeating the same point? OLED "not having better images" is your opinion. We already know what your opinion is; it feels like we've had it repeated more times than the earth has ever rotated.

OLED works for some and not others. FALD works for some and not others. Arrogantly pushing your opinion over and over isn't doing you any favours.
 
Don't forget, it's you who has all the industry standards to back you up. Why don't you use the most accurate sRGB at 80 nits? Eyestrain from OLED at only 80 nits in a pitch-black room?

A few new points, so why not...
This doesn't describe the room I'm using it in, which I've said multiple times. It has decent light control and I mostly use it at night, but not exclusively. I also generally have some back lighting on. So it needs some extra brightness - the amount is fairly arbitrary, and I eventually settled on 160 after some research. I never claimed to follow reference 100% - just that I like an accurate sRGB picture (which I've achieved by calibrating to 160 nits, and the test results prove that). This whole taking people's points of view and twisting them into an extreme, absolutist version so you can try to trap them really doesn't do your arguments any favors. (And no, no eye strain problems, but that's not my room).

Even I use sRGB at 80 nits when I do sRGB work. But I won't deny how lifeless it looks compared to wide-gamut SDR with tons more color and brightness. Accuracy in sRGB doesn't mean it can look better in such a limited range lol. It's similar to old movies that were shot in black and white, with color added years later. If you want to see the original with accuracy, then it doesn't have color. That doesn't mean it looks better.

Even in that example, that's not expanding the black and white into the color space artificially; it's a manual process, graded by people. But I also know people who would still prefer to watch the original black and white as it's an aesthetic they like and they'd rather see the original. A few years ago they were airing new episodes of The Walking Dead in color, and then re-airing them in black and white to give them a vintage feel. No idea how many people actually watched it (I didn't), but obviously some people still enjoy black and white. I also know someone who bought a black and white old tiny CRT just to watch content on because they enjoy old electronics and that aesthetic so much.

OLED is a lot worse. I know what dim OLED looks like. It's you who denies how dim it is.

It can only show sRGB, which hardly anybody except a few of us is using. Most users are seeing a much higher range than sRGB on their displays to get more vivid images, because it looks better to the eyes. It looks closer to realism. These are better images, provided by displays with more capability at a higher range, not limited by the lowest sRGB.

You can claim OLED is "worse" as much as you like, but it doesn't make it true. Both FALD and OLED have their strengths and weaknesses.

And sRGB is used in loads more content than HDR - just about the entire web alone, most photographs, etc... And as usual, you haven't addressed what happens to colors and skin tones when expanded to the wrong color space.
 
A few new points, so why not...
This doesn't describe the room I'm using it in, which I've said multiple times. It has decent light control and I mostly use it at night, but not exclusively. I also generally have some back lighting on. So it needs some extra brightness - the amount is fairly arbitrary, and I eventually settled on 160 after some research. I never claimed to follow reference 100% - just that I like an accurate sRGB picture (which I've achieved by calibrating to 160 nits, and the test results prove that). This whole taking people's points of view and twisting them into an extreme, absolutist version so you can try to trap them really doesn't do your arguments any favors.
How is this extreme? This is the industry standard you pulled out yourself. You can keep the ambient light to only 0.2 lux of reflection on the screen, or just turn off the lights, and see sRGB at 80 nits easily.

Even in that example, that's not expanding the black and white into the color space artificially; it's a manual process, graded by people. But I also know people who would still prefer to watch the original black and white as it's an aesthetic they like and they'd rather see the original. A few years ago they were airing new episodes of The Walking Dead in color, and then re-airing them in black and white to give them a vintage feel. No idea how many people actually watched it (I didn't), but obviously some people still enjoy black and white. I also know someone who bought a black and white old tiny CRT just to watch content on because they enjoy old electronics and that aesthetic so much.
With all the various SDR modes available on monitors, nobody is really using limited sRGB anymore. sRGB is dull and lifeless. I know how it looks. You know how it looks. It's you who doesn't want to admit how dull it looks.

You can claim OLED is "worse" as much as you like, but it doesn't make it true. Both FALD and OLED have their strengths and weaknesses.

And sRGB is used in loads more content than HDR - just about the entire web alone, most photographs, etc... And as usual, you haven't addressed what happens to colors and skin tones when expanded to the wrong color space.
OLED is worse. sRGB is worse than HDR. You might use sRGB to see some UIs or photographs, while I just view HDR-format photography instead.

sRGB is lifeless. The color is limited no matter how accurate it is, because it's a compromised intent nowhere close to reality. With an expanded colorspace, even if it's not the original intent, it still has more color, not less, to deliver better images.
 