Why OLED for PC use?

Nah, it's fine. If you want anything faster, go 240 or 360 Hz. If you like OLED, go OLED. There is no motion blur or blooming that an average gamer would notice. Only pixel peepers, maybe, but in that regard a pixel peeper will find something wrong with any display tech, right? lol. There are other options out there. For me it's beautiful 😍
Yeah, that's my point, for you it's beautiful. I happen to agree it's a fantastic panel but it's not perfect, and saying people won't notice things is underestimating people.

Others may not be able to look past <insert flaw here> when deciding what to buy and use.
 
Yeah, that's my point, for you it's beautiful. I happen to agree it's a fantastic panel but it's not perfect, and saying people won't notice things is underestimating people.

Others may not be able to look past <insert flaw here> when deciding what to buy and use.
Well, my point is that most (more like the majority of) consumers will love it. The nitpickers are mostly on this forum to have pissing contests and compare stat sheets. None of my friends are as nerdy as I am, and even I love it, so the average PC gamer is generally not going to pick it apart. Besides, the majority of reviews are overwhelmingly positive, and that's not even counting all the casuals who don't bother to review it online. Sure, you can say some might find something they dislike, but that's few and far between; most average consumers are not obsessive like forum members, who are a small fraction of the general population. They take it home, install it, and love it. That's it lol.
 
I posted that video mostly because of this false claim right here:

View attachment 554623

Nobody uses a 200-nit OLED to grade HDR1000, eh?
Everybody talks like they have a PA32DC now? These consumer OLEDs cannot even reach HDR400. What kind of HDR1000 content can you grade if you cannot even see HDR1000? Maybe you can grade HDR200 or HDR300, or just sRGB, on a C2.
 
What a completely random and pointless statement.

I work from home. I won't be buying an OLED for my PC, ever.

Yet I still think all your arguments against OLED fall flat.
Work from home in 2023?

I said I look forward to your company replacing its office LCDs with OLEDs.

Your usage is within that 1% scenario.
 
Depends on what level you work at in the industry.

At the top end, OLED is mostly used to show clients.
The top of the line monitors used for grading are all from Flanders Scientific.
https://flandersscientific.com/XM312U/
This is what you'd find at places such as Dolby for mastering. Most can't afford to spend $20k on a monitor though.

If you have a smaller budget, then yeah, people have been using OLED TVs to grade on for a while. There are quite a few guides on how to get into the service menu and turn off all the "safety features". That does mean users have to be much more careful with static imagery, but when "used right", it's simply the output monitor with the graded image on it, and all the tools are on a separate monitor, so static images are minimized. The C2, as an example, more or less destroys all other desktop monitors for accuracy when things like ABL are turned off, especially once you factor in price.

I'm sure the G3 will be the go to as a client display... well until there is something better.
The C2 loses way more brightness and color with only 70% Rec 2020 coverage. Consumer OLED doesn't have better accuracy in HDR. It may have accuracy in sRGB, or on content below HDR400.
 
Everybody talks like they have a PA32DC now? These consumer OLEDs cannot even reach HDR400. What kind of HDR1000 content can you grade if you cannot even see HDR1000? Maybe you can grade HDR200 or HDR300, or just sRGB, on a C2.

Why don't you ask that of the people over in Hollywood who are using them lol.
 
I cannot wait to see what kind of HDR1000 can be graded from a C2.

Again, nobody uses a 200-nit OLED to grade HDR1000.

I thought they were using the G3 and not the C2. But again, Hollywood is using those 200-nit OLEDs for mastering. Unless you meant Hollywood are nobodies, then sure lol.
 
I thought they were using the G3 and not the C2. But again, Hollywood is using those 200-nit OLEDs for mastering. Unless you meant Hollywood are nobodies, then sure lol.
Are you hallucinating that Hollywood uses a 200-nit OLED to grade HDR1000?
 
Nope, I already provided a video. I'm not even the one saying so; it's the DigitalTrends guy.
Again, does Hollywood use a 200-nit C2 or C3 or whatever consumer OLED for grading HDR1000? Do you want to drag the standard this low?
 
I find Gamer mode varies a lot based on what you're playing on.

On my Sony Z9F (FALD) TV, Gamer mode is a must because input lag is much better.

On my LG OLED monitor, Gamer mode matters less in SDR. I am using Calibrated 1 for SDR/sRGB games; if I hadn't calibrated, I'd use the dedicated sRGB mode instead (which was pretty well calibrated out of the box). With VRR, I don't sense input lag to be much different than one of the Gamer modes, though technically Gamer modes are supposed to have a few extra framerate/input lag features (DAS, which is off in all modes except the Gamer ones). Practically, I can't tell a difference unless I turn off VRR, so I'll stick to the calibrated mode with VRR on.

I do use Gamer 1 for HDR games because there are fewer options (and no calibrated options) for HDR, but it looks quite good and pretty accurate in my experience (and of course has all the input lag benefits).
 
On my LG OLED monitor, Gamer mode matters less in SDR. I am using Calibrated 1 for SDR/sRGB games; if I hadn't calibrated, I'd use the dedicated sRGB mode instead (which was pretty well calibrated out of the box). With VRR, I don't sense input lag to be much different than one of the Gamer modes, though technically Gamer modes are supposed to have a few extra framerate/input lag features (DAS, which is off in all modes except the Gamer ones). Practically, I can't tell a difference unless I turn off VRR, so I'll stick to the calibrated mode with VRR on.

I do use Gamer 1 for HDR games because there are fewer options (and no calibrated options) for HDR, but it looks quite good and pretty accurate in my experience (and of course has all the input lag benefits).
This on LG 27GR95QE?
I noticed a similar DAS indication on the LG 48GQ900, but it didn't feel any different on the desktop whether it's ON or OFF

It is, however, an illusion; there is one frame of lag, and it can be easily tested!
Open the UFO test https://www.testufo.com/ghosting and switch between modes that have DAS and those that don't. There will be a visible jump: the UFOs will always jump one frame forward when enabling a mode with DAS and one frame backward when enabling a mode without DAS.
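For a rough sense of scale, here's a quick back-of-the-envelope sketch in Python of what one frame of added lag works out to at TestUFO-style scroll speeds. The 960 px/s figure is just an assumed example value, not a measurement of any particular monitor or test preset:

```python
# Back-of-the-envelope numbers for the "one frame jump" on testufo.com.
# Assumed example scroll speed: 960 pixels/second (not a measured value).

def one_frame_of_lag(refresh_hz, speed_px_per_sec=960.0):
    """Return (lag in ms, horizontal UFO offset in px) for one added frame of lag."""
    frame_time_ms = 1000.0 / refresh_hz        # duration of one refresh cycle
    offset_px = speed_px_per_sec / refresh_hz  # distance the UFO moves per frame
    return frame_time_ms, offset_px

for hz in (60, 120, 144, 240):
    ms, px = one_frame_of_lag(hz)
    print(f"{hz:>3} Hz: one frame = {ms:.2f} ms of lag, UFOs shift by {px:.1f} px")
```

So at 240 Hz a single frame of delay is only about 4 ms and a 4 px shift, which is why the jump is easy to see in the test but hard to feel on the desktop.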

BTW, the LG 27GP950 doesn't have DAS off in any mode and there is no frame jumping when switching modes. I am thus not sure why the OLED monitor cannot have DAS enabled in all modes, because there is hardly any technical reason one should be different from the other... but oh well, it is what it is.

Personally I do not really care about perfect color accuracy in games.
I only wanted to have the VIVID mode with a proper white point, and I found it can be changed in the service menu, so I calibrated it to look like the calibrated 6500K modes.
It might also be possible to use the six-axis color controls to calibrate the gamut to be much closer to sRGB and still have DAS. That seems like a lot of effort for a feature I wouldn't really use, though, since I prefer wide-gamut gaming. I always used wide gamut on the 27GP950 in games, and even in some videos that had a strange 'creative intent' with dull default colors. In some cases the color temperature needs to be adjusted too, because some videos are too warm or too cold...

A more pressing issue on these monitors is the lack of an option to adjust RGB in HDR. I get that they try to squeeze out more luminance by using colder colors, but this should still be adjustable, and the user should be able to sacrifice luminance for correctness/taste. Thankfully it's not so cold that I cannot get used to it after a while... still, I should not be forced to get used to colder colors. Shame on LG!!!!

HDR is in itself a whole other bag of issues, though, and I found that 'calibrating' it as instructed on e.g. the PS5 leads to a broken, black-crushed picture. On the dark pattern, increasing brightness to the maximum value yields the best results. The white patterns calibrated as per the instructions do give correct, not overblown, colors though.
 
This on LG 27GR95QE?
I noticed a similar DAS indication on the LG 48GQ900, but it didn't feel any different on the desktop whether it's ON or OFF

Yup - that's the monitor.

It is, however, an illusion; there is one frame of lag, and it can be easily tested!
Open the UFO test https://www.testufo.com/ghosting and switch between modes that have DAS and those that don't. There will be a visible jump: the UFOs will always jump one frame forward when enabling a mode with DAS and one frame backward when enabling a mode without DAS.

Interesting!

BTW, the LG 27GP950 doesn't have DAS off in any mode and there is no frame jumping when switching modes. I am thus not sure why the OLED monitor cannot have DAS enabled in all modes, because there is hardly any technical reason one should be different from the other... but oh well, it is what it is.

Yeah I'm not sure either. It does seem somewhat arbitrary.


Personally I do not really care about perfect color accuracy in games.
I only wanted to have the VIVID mode with a proper white point, and I found it can be changed in the service menu, so I calibrated it to look like the calibrated 6500K modes.
It might also be possible to use the six-axis color controls to calibrate the gamut to be much closer to sRGB and still have DAS. That seems like a lot of effort for a feature I wouldn't really use, though, since I prefer wide-gamut gaming. I always used wide gamut on the 27GP950 in games, and even in some videos that had a strange 'creative intent' with dull default colors. In some cases the color temperature needs to be adjusted too, because some videos are too warm or too cold...

Fair enough - if that's what you prefer, that's what you prefer. I agree it can look really nice and really pop - no disagreement there; the inaccuracy just bothers me too much, but to each their own!

A more pressing issue on these monitors is the lack of an option to adjust RGB in HDR. I get that they try to squeeze out more luminance by using colder colors, but this should still be adjustable, and the user should be able to sacrifice luminance for correctness/taste. Thankfully it's not so cold that I cannot get used to it after a while... still, I should not be forced to get used to colder colors. Shame on LG!!!!

HDR is in itself a whole other bag of issues, though, and I found that 'calibrating' it as instructed on e.g. the PS5 leads to a broken, black-crushed picture. On the dark pattern, increasing brightness to the maximum value yields the best results. The white patterns calibrated as per the instructions do give correct, not overblown, colors though.

It is strange they allow calibration, but not the HDR modes. I like the Sony approach for my TV where when you calibrate SDR, it ports those settings over to the HDR presets, but I don't think that's the case here. I find Gamer 1 in HDR decent, but I would prefer calibrated if I could.

Re the PS5 issue, in his review of the monitor on HDTVTest, Vincent mentioned this and said not to use the console HDR calibration, as it will give incorrect results (though he was talking about Xbox, IIRC). Apparently this has something to do with HGiG, which most TVs support but not all monitors do. I'm a little lost as to why it's this way, though (either the monitor not supporting it or the consoles not being able to calibrate to a monitor; it seems kind of silly).
 
Work from home in 2023?

I said I look forward to your company replacing its office LCDs with OLEDs.

Your usage is within that 1% scenario.
Self-employed, no company involved. I've worked from home for the last 18 years.

Keep up the pointless assumptions and attacks, just makes you look worse.
 
"What about PC users; should they avoid using QD-OLED displays? These results don't look good for computer users. A computer's user interface often has large white areas, even if you're using your computer's Dark Mode feature, and those areas are likely to cause burn-in."

Not just the Samsung. The Sony had miserable burn-in also. The LG had a row of pixels fail. This was only 3 months in. Give it more time and they all will exhibit burn-in and picture quality issues. OLED technology is very delicate and has no reliability; you just can't trust it. Whatever happened to the good old days of hardware being built like a tank? All of this is as delicate as a flower. The results speak for themselves.
 
"What about PC users; should they avoid using QD-OLED displays? These results don't look good for computer users. A computer's user interface often has large white areas, even if you're using your computer's Dark Mode feature, and those areas are likely to cause burn-in."

Not just the Samsung. The Sony had miserable burn-in also. The LG had a row of pixels fail. This was only 3 months in. Give it more time and they all will exhibit burn-in and picture quality issues. OLED technology is very delicate and has no reliability; you just can't trust it. Whatever happened to the good old days of hardware being built like a tank? All of this is as delicate as a flower. The results speak for themselves.
Now there are some extremely valid reasons to avoid these displays for PC use. I know the RTINGS tests are harsh (and that's the point) but my desktop use wouldn't be much less harsh and I don't know whether I'd trust it for console gaming (static HUDs and the like) either.

I'll continue giving OLED a miss.
 
What I've noticed with my OLED, having owned it for over two weeks: the eyestrain isn't as bad, but it's so bright compared to what I'm used to in Win 11. Games are fine because the 3D animations mask the overall brightness by filtering out the white. In Windows, the white areas surrounding the icons are too bright. I could turn it down some more for easier desktop use, but then games would be too dark; luckily I'm using a remote and can change that. It's a gaming monitor, not really a desktop monitor. Still using a 21.5 inch Asus from 2012 as my main.

I suppose I could try Windows Night light in desktop mode. I never had luck with it before; it just causes more eyestrain after I'm done using it. While I'm using it it's OK, but I never had luck with it on AMOLED phones either.
 
What Windows 11 needs is a grey theme. Sure, they have the dark theme, but I don't care for it. I think I might look to see if Stardock has anything good.



These don't look too bad lol. You should see Hardforum with one of these enabled.
 
"What about PC users; should they avoid using QD-OLED displays? These results don't look good for computer users. A computer's user interface often has large white areas, even if you're using your computer's Dark Mode feature, and those areas are likely to cause burn-in."

Not just the Samsung. The Sony had miserable burn-in also. The LG had a row of pixels fail. This was only 3 months in. Give it more time and they all will exhibit burn-in and picture quality issues. OLED technology is very delicate and has no reliability; you just can't trust it. Whatever happened to the good old days of hardware being built like a tank? All of this is as delicate as a flower. The results speak for themselves.
You're taking from this only what you want to take from it. The failed pixels on the LG have nothing to do with burn-in; it's a random defect for whatever reason. While it seems like QD-OLED might burn in a little too fast based on this test (definitely something to consider), the other OLED units are not exhibiting any burn-in at all.

My LG CX has 12,000h of displaying primarily games with very static HUDs (MMOs), plus the desktop, and it still looks as clean as on day one. I've had many LCDs exhibit issues in a shorter span of time.
 
Saw this in one of the YouTube comments (yes... I know) on a video regarding the S95B burn-in. Again, it's a YouTube comment, so it is probably pulled right from someone's ass, but maybe it has some truth to it:
"Regarding the burn in on S95B and A95K, 1st gen QD-OLED uses Hydrogen (Protium) while LG’s EVO WOLEDs use Deuterium which is much more stable, the new 2nd Gen QD-OLED now also has Deuterium which along with the other improvements contributes to the 2x longer life vs the 1st gen."
 
Found something interesting on Discord regarding future QD OLED for PPI and text clarity.
 

Attachment: Screenshot_20230310_131806_Discord.jpg
Self-employed, no company involved. I've worked from home for the last 18 years.

Keep up the pointless assumptions and attacks, just makes you look worse.
So that's even less than 1% of typical usage scenarios.

It's never about me. It's about you buying an OLED for CAD.
 
If you're a hardcore gamer you should stay away from OLED. If you're very casual, perhaps you can pull it off by jumping through several hoops and walking a tightrope every time you use it. Or get a mini-LED and plow through all content like a champ. I play a variety of games and I have never noticed blooming. That argument is silly, at least on the QN90B. The only time you can notice it is if you have an all-black background and a white mouse pointer; only then will you see a halo around the mouse. Other than that, you'll never notice it. You'll notice the OLED's dim peak brightness and dim highlights more than you're ever likely to notice blooming on mini-LED. There are several videos from techwithkg and fomo that demonstrate side by side how the quantum mini-LEDs are excellent at controlling blooming and rival the best OLEDs while having a more impactful overall image. I had both, and blooming was a non-issue in content, whereas dim image quality is a deal-breaking problem on OLEDs. Along with ABL, topped off with guaranteed burn-in for hardcore gamers with HUDs, that makes it a hard pass, to be completely frank.
There is a difference between hardcore players and competitive players. People call themselves hardcore as long as they still use something unique or something old.

The competitive players will use a 360Hz fast-IPS PG27AQN, a 360Hz strobed TN XL2566K, or an even faster 500Hz AW2524H. These players don't care about graphics or HDR. They've already seen through the game to its structure and geometry. They won't use a dim OLED unless it provides enough of a competitive edge to win a game.
 
Re RTINGS burn-in tests, definitely interesting and potentially troubling. Still glad I'm giving the OLED a try as it's the perfect monitor for me at the moment and I love the performance so far. Whether burn-in will become an issue and when remains to be seen. If I can make it at least ~3 years before becoming noticeable, I'm happy. If it becomes noticeable before then, it may prevent me from choosing OLED for my next display.

I know this LG in particular is using a newer MLA panel, which theoretically should make it more resistant to burn-in, so I'm hopeful that, with reasonable care as well as things like the 25% buffer, most of the issues will be mitigated, but the proof will be in the pudding.
If it fails early for whatever reason, hopefully the new FALD ASUS entry will be out, and maybe some other options to consider, but I'd definitely miss the perfect blacks and perfect viewing angles on this. Burn-in is the one thing that would end up being a deal-breaker if it occurred too early. But I've seen enough reports of people using OLED monitors way more than I do without problems that I can't say for a fact it'll be an issue. It seems to be one of those things that can be highly variable.
 
So that's even less than 1% of typical usage scenarios.

It's never about me. It's about you buying an OLED for CAD.
You still don't listen.

You're obsessed with this idea of me buying an OLED for a specific reason that I've told you countless times is the very reason I won't touch OLED.

Seriously, shut the hell up.
 
Disappointed with this year's W-OLED. Even with the MLA + META enhancements, it still can't boost gamut coverage to around the same level as FALD mini-LED QD-IPS/VA or QD-OLED, nor remove the washed-out look that comes with the white subpixel. I was expecting those things to be solved with these enhancements.
 
Personally I wouldn't use an OLED for static desktop/apps. I keep mine as its own media/gaming "stage" with side screens for static desktop/apps. Kind of like how everyone has their own workstation screens in Star Trek, but they also have the big main screen to view events. I use the turn-off-the-screen-emitters trick when not viewing content on the OLED. That turns the emitters off without dropping the screen out of the monitor array or affecting anything running on the screen, incl. audio. I only raise the curtain, so to speak, when I'm actually playing a game or viewing media, and activate the TOTS (emitters) feature when going AFK, etc.

Plenty of games have HUDs with a lot of hours on them. The LGs use logo dimming, which helps a bit... but I probably wouldn't play a static, app-like game like the Magic: The Gathering card game 24/7 at high brightness, as that is more like a static desktop app than a more dynamic game.

Still, there are people in the OLED threads who have used their OLEDs for mixed static desktop/app usage with a lot of gaming as an all-around display - even disabling ASBL in the service menu, etc. - for 4 years and more currently without burn-in. OLEDs reserve the top ~25% of brightness/energizing capability for the wear-evening routine, so you shouldn't get burn-in until that buffer runs out completely. That should last a long time if you aren't foolishly abusing the screen. If you are worried about it that much, you can get the more expensive G-model LG OLEDs (55" minimum, though, I think), which come with a 4-year burn-in warranty, or pony up for the pricey Best Buy warranty on any brand/model of OLED, which also covers burn-in for several years. E.g., the 5-year BB warranty on the 42" C2 is ~$210 USD. It's typically more or less around 1/5th of the price of the TV.

Found something interesting on Discord regarding future QD OLED for PPI and text clarity.

So maybe more affordable 8K OLEDs in the future, I hope. Though 8K is 4x the pixels, it still might mean more affordable 55" 8K OLEDs eventually, since a 55" is close in size to four 27" 4K screens. Maybe even a 48" at the figures you linked from Discord. One can hope.
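For a rough sense of why those sizes matter for text clarity, here's a quick back-of-the-envelope PPI comparison (a Python sketch; nominal 16:9 diagonals assumed, not exact panel dimensions):

```python
# Quick PPI comparison for the sizes mentioned above (nominal 16:9 diagonals).
import math

def ppi(diagonal_inches, width_px, height_px):
    """Pixels per inch for a panel of the given diagonal and resolution."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'27" 4K : {ppi(27, 3840, 2160):.0f} PPI')   # ~163 PPI
print(f'42" 4K : {ppi(42, 3840, 2160):.0f} PPI')   # ~105 PPI (current 42" class)
print(f'55" 8K : {ppi(55, 7680, 4320):.0f} PPI')   # ~160 PPI, close to a 27" 4K
print(f'48" 8K : {ppi(48, 7680, 4320):.0f} PPI')   # ~184 PPI
```

So a 55" 8K lands right around 27" 4K density, and a 48" 8K would actually exceed it.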
 
You still don't listen.

You're obsessed with this idea of me buying an OLED for a specific reason that I've told you countless times is the very reason I won't touch OLED.

Seriously, shut the hell up.
No. It's you envying OLED, complaining about its low brightness while also talking about burn-in. Low brightness and burn-in are exactly the same thing.
 
He's not wrong there, really. The lower peaks, lower sustained brightness, and ABL are there to avoid burn-in.

High brightness = outright blooming in places and overall luminance fluctuation, +/- lifting/dimming of large light-bucket areas of the tetris brickwork... so those are two sides of the same thing too, though.

Trade-offs.
 
He's not wrong there, really. The lower peaks, lower sustained brightness, and ABL are there to avoid burn-in.

High brightness = outright blooming in places and overall luminance fluctuation, +/- lifting/dimming of large light-bucket areas of the tetris brickwork... so those are two sides of the same thing too, though.

Trade-offs.
OLED is the biggest trade-off for PC use. You can only buy OLED for SDR, yet you're still afraid of burn-in regardless.
 
Being graphics-oriented, in the PC games I've played for decades I keep the HUD off when casually exploring. There may be a few here who do the same. Then I turn it on for the occasional mission when necessary to get to a new map (with new graphics). Currently on a C2 42. So static images are not a problem in games for my gameplay style, and for desktop/web browsing, static images are not much of an issue with the tools available in the LG options.

And logo dimming is definitely off. I noticed brighter, more vivid images in Witcher 3 right after I turned it off. I keep it off in desktop use to avoid having to switch it on and off. But with ASBL on and the other safeguards, it won't be a problem until maybe 3 or 4 years of TV "wear." By that time we may have the new 2023 micro-lens-array super-bright LG OLED (55"+) available in a 42-43" size, which I would promptly buy to replace an aging C2 42. Or god knows what other TV tech will be out by then.

And speaking of switching logo dimming on and off on a C2, I wouldn't be surprised if there's a 3rd-party program that automatically turns it on when entering the desktop and off when in a game. I haven't bothered to check whether that's even possible, btw. Maybe in the ColorControl app, not sure.
 
It'd be nice if all screens had a number of customizable-sized masking shapes via the OSD that you could overlay and set the opacity of, even on LCDs. There is some stuff, and some fields, I'd black out altogether in some media and streams because it's just distracting garbage or point light sources I don't need to be staring at.

I use the Turn Off the Lights add-on and color-changer add-ons in my Firefox browser on any screen, but masking objects would be more useful overall. If they remembered the content they were customized for, or at least could be saved as named sets, they would be even more useful.
 
Saw this over at AVSForum:



Looks like a really good test; can't wait to see it applied by review sites in the future. Window size alone definitely doesn't tell the whole story.
 
It'd be nice if all screens had a number of customizable-sized masking shapes via the OSD that you could overlay and set the opacity of, even on LCDs. There is some stuff, and some fields, I'd black out altogether in some media and streams because it's just distracting garbage or point light sources I don't need to be staring at.

I use the Turn Off the Lights add-on and color-changer add-ons in my Firefox browser on any screen, but masking objects would be more useful overall. If they remembered the content they were customized for, or at least could be saved as named sets, they would be even more useful.
Like this?

https://www.softpedia.com/get/Desktop-Enhancements/Other-Desktop-Enhancements/Zorro.shtml
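For anyone curious what the masking idea above boils down to, here's a minimal sketch, assuming Python with the standard-library tkinter. It's just an illustrative toy, not a replacement for a proper tool like Zorro (no click-through, no saved layouts); the positions, sizes, and opacities are made-up example values:

```python
# Toy version of the "masking shapes" idea: black, always-on-top rectangles
# with adjustable opacity that dim or hide parts of the screen behind them.
import tkinter as tk

def make_mask(x, y, w, h, alpha=0.85):
    """Create a borderless, always-on-top window covering the given rectangle."""
    win = tk.Toplevel(root)
    win.overrideredirect(True)                # no title bar or borders
    win.geometry(f"{w}x{h}+{x}+{y}")
    win.attributes("-topmost", True)          # keep above other windows
    win.attributes("-alpha", alpha)           # 0.0 = invisible, 1.0 = solid black
    win.configure(bg="black")
    win.bind("<Button-1>", lambda e: win.destroy())   # click a mask to dismiss it
    return win

root = tk.Tk()
root.withdraw()                               # hide the root window; only masks show
make_mask(100, 100, 640, 160)                 # e.g. dim a distracting banner
make_mask(1400, 800, 400, 300, alpha=1.0)     # fully black out a bright logo area
root.mainloop()
```

Saved, named sets per piece of content, like suggested above, would just be a matter of storing those rectangle definitions and reloading them.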
 