Why OLED for PC use?

Just wondering why you spend so much time and effort on your anti-OLED proselytism, that's all.

And if you like them as low brightness SDR monitors, which have their place, what's the problem? Why attack everyone in this thread and claim their individual preferences and opinions don't count?

I'm trying to figure out where you're coming from, if not simply from a place of sheer hate and trolling behaviour. I don't understand the goal.
you just don't understand ;)
 
I can see the benefit of HDR in video and gaming content, but I usually keep it turned off on the desktop, because at least on my shitty current monitor, the ASUS XG438Q, turning HDR on overexposes the desktop to the point where it's practically unusably ugly and burns my eyes.

In Cyberpunk it looked pretty nice, but I kept the settings at about 600 max because any more than that started to become uncomfortable to my eyes.

That said, I will fully admit I am not very well read on HDR, its various technologies, and optimal settings. It hasn't been a big deal to me yet. In most cases it winds up being more of an annoyance I have to deal with than something I actually want.
 
Just wondering why you spend so much time and effort on your anti-OLED proselytism, that's all.

And if you like them as low brightness SDR monitors, which have their place, what's the problem? Why attack everyone in this thread and claim their individual preferences and opinions don't count?

I'm trying to figure out where you're coming from, if not simply from a place of sheer hate and trolling behaviour. I don't understand the goal.
Because your preference doesn't matter when OLED loses more accuracy due to low brightness.
 
Because your preference doesn't matter when OLED loses more accuracy due to low brightness.
So... hatred then. Not just for OLED, but for everyone else in this thread.

I'm not sure you quite get how nasty "your preference doesn't matter" sounds.
 
So... hatred then. Not just for OLED, but for everyone else in this thread.

I'm not sure you quite get how nasty "your preference doesn't matter" sounds.
Again, your preference doesn't matter. It won't make OLED magically hold any brightness to make better images.

It's never about me. It only exposes what you are.
 
Again, your preference doesn't matter.
Pure nastiness.

It won't make OLED magically hold any brightness to make better images.
We just came to the conclusion that "dim SDR" has its rightful place. This stupid "better images" brightness obsession doesn't have a place here, so stop bringing it up.

It's never about me. It only exposes what you are.
I don't even know what this means. Just sounds like nastiness for nastiness' sake.
 
Pure nastiness.


We just came to the conclusion that "dim SDR" has its rightful place. This stupid "better images" brightness obsession doesn't have a place here, so stop bringing it up.


I don't even know what this means. Just sounds like nastiness for nastiness' sake.
Anything has its place, such as setting an example of what a failure looks like.
 
So your preference can make OLED bright enough to do even HDR400?
Again, we're talking about SDR. "Dim" SDR, to put it your way. For office use.

You yourself just admitted you like OLED for this purpose and that's what we're talking about.

At this point in time I don't care whether it can get bright enough for HDR. For office use that's completely irrelevant.

So tell me again why you are being nasty about this when all I was trying to do is ask if there's anything you do like about these displays.
 
Again, we're talking about SDR. "Dim" SDR, to put it your way. For office use.

You yourself just admitted you like OLED for this purpose and that's what we're talking about.

At this point in time I don't care whether it can get bright enough for HDR. For office use that's completely irrelevant.

So tell me again why you are being nasty about this when all I was trying to do is ask if there's anything you do like about these displays.
I said I like OLED being a dim SDR monitor for the long term, which sets the example of what a failure looks like.

Of course you don't care as you value your preference. Everything you do just makes me laugh sooner.
 
I said I like OLED being a dim SDR monitor for the long term, which sets the example of what a failure looks like.
Why is that a failure if that's your intended goal for the display?

The only thing I'm taking from this is that your statement about liking OLED was completely sarcastic and just another excuse to hate on it and anyone that likes it.

Of course you don't care as you value your preference.
As do you. Your preference for FALD and the brightness they can achieve is completely valid, but still just a preference.

Everything you do just makes me laugh sooner.
Glad you're so entertained. Weird as that is.
 
Why is that a failure if that's your intended goal for the display?

The only thing I'm taking from this is that your statement about liking OLED was completely sarcastic and just another excuse to hate on it and anyone that likes it.


As do you. Your preference for FALD and the brightness they can achieve is completely valid, but still just a preference.


Glad you're so entertained. Weird as that is.
Because OLED won't make an impact as it has worse images in the end.
 
Color is lit by brightness. Without brightness the color won't be accurate.

True, but more isn't better. What is really important is the balance in brightness between the base colors, not the absolute brightness.

Absolute brightness has some limited uses for effect, like when you want really bright light shafts to blind the viewer in an otherwise dark scene, but for overall image quality and color accuracy it is the balance that is WAY more important.
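A minimal sketch of that point (assuming linear-light sRGB values and the standard sRGB-to-XYZ matrix; the sample triplets are arbitrary): scaling all three channels by the same factor changes luminance but leaves the chromaticity alone, while unbalancing a single channel shifts the color.

```python
# Chromaticity (CIE xy) of a linear-light sRGB triplet, using the
# standard sRGB-to-XYZ matrix. Scaling R, G, B together changes
# brightness but not xy; changing their balance changes xy.
def srgb_linear_to_xy(r: float, g: float, b: float) -> tuple[float, float]:
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    s = X + Y + Z
    return (round(X / s, 4), round(Y / s, 4))

print(srgb_linear_to_xy(0.8, 0.5, 0.2))    # some warm color
print(srgb_linear_to_xy(0.4, 0.25, 0.1))   # half the brightness: same xy
print(srgb_linear_to_xy(0.8, 0.55, 0.2))   # green channel off-balance: xy shifts
```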
 
Office use. "Dim". Long term. Your argument remains completely irrelevant for this use case.


In your opinion.
Even office needs 250-300 nits. It's not your work from home usage. Without brightness OLED doesn't have better images. It's as simple as that. It's not an opinion.
 
True, but more isn't better. What is really important is the balance in brightness between the base colors, not the absolute brightness.

Absolute brightness has some limited uses for effect, like when you want really bright light shafts to blind the viewer in an otherwise dark scene, but for overall image quality and color accuracy it is the balance that is WAY more important.
Are you talking about grading? The balance is an artistic approach. The job of a display is to make sure it's accurate enough so that you can see the desired balance.
 
Even office needs 250-300 nits.
Which OLED is capable of. Other compromises aside.

It's not your work from home usage.
My work from home usage is on a 50" FALD with plenty of brightness. That isn't going to change any time soon.

Without brightness OLED doesn't have better images.


It's simple as that. It's not an opinion.
You keep telling yourself that.
 
Which OLED is capable of. Other compromises aside.


My work from home usage is on a 50" FALD with plenty of brightness. That isn't going to change any time soon.




You keep telling yourself that.
It's more like you keep telling yourself that.

OLED cannot even hold 300 nits at larger window sizes.

OLED can only have more brightness at 0.01%, 0.1%, and 1% window sizes, and that is about as good as it gets.
 
How much is the membership fee at The Church of Brightness?

I don't get the brightness stuff.

Now granted, I generally work (and game, and watch movies) in a dark room, but my Asus XG438Q apparently tops out at 474 nits in SDR mode, and I have it set to 17% brightness. Much more than that I find uncomfortable to use, especially with white windows. They feel like they are going to burn my retinas. Assuming this is just a linear percentage, that means I am at about 80 nits for comfort in SDR on the desktop?
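A quick sanity check of that estimate, under the same assumption that the slider is linear (real monitor brightness controls often aren't):

```python
# Rough estimate of desktop luminance, assuming the OSD brightness
# slider maps linearly to output. 474 nits is the measured SDR peak
# of the XG438Q mentioned above; the linearity is an assumption.
max_sdr_nits = 474
slider_percent = 17

print(f"~{max_sdr_nits * slider_percent / 100:.0f} nits")  # ~81 nits
```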

With HDR enabled I can get about 600 nits. That has been MORE than enough for me.

OLED cannot even hold 300 nits at larger window sizes.

I'm guessing window size in this context means something different than I am used to?

Because I use HDR only on full screen content. If something were windowed, I'd probably prefer to just keep it at SDR.

Doing some reading on Rtings, window size seems to refer to the proportion of the screen that is at max brightness, not necessarily an operating-system window. A bright moon against a dark sky would be the window, I assume.


Rtings says the following about the G2 OLED:

Hallway Lights (~1950 cd/m²): 1,018 cd/m²
Yellow Skyscraper (~700 cd/m²): 606 cd/m²
Landscape Pool (~300 cd/m²): 251 cd/m²
Peak 2% Window: 976 cd/m²
Peak 10% Window: 950 cd/m²
Peak 25% Window: 387 cd/m²
Peak 50% Window: 269 cd/m²
Peak 100% Window: 177 cd/m²
Sustained 2% Window: 838 cd/m²
Sustained 10% Window: 867 cd/m²
Sustained 25% Window: 357 cd/m²
Sustained 50% Window: 256 cd/m²
Sustained 100% Window: 169 cd/m²
Automatic Brightness Limiting (ABL): 0.108

The LG G2 OLED has impressive peak brightness in HDR, and it's much brighter than the previous generation of OLED TVs, including the LG G1 OLED and the LG C1 OLED. Unfortunately, large bright scenes are still significantly dimmer than smaller highlights due to the TV's aggressive Automatic Brightness Limiter (ABL). It isn't very noticeable when watching regular content, but it's distracting when using the TV as a monitor. Most scenes are displayed at the correct brightness level, and it tracks the PQ EOTF well for the most part. Very dark scenes are crushed a bit. There's a smooth roll-off near the TV's peak brightness, so bright highlights in really bright scenes are preserved.

These measurements are in the 'Cinema' HDR Picture Mode with OLED Pixel Brightness, Contrast, and Peak Brightness all at their max settings, with Color Temperature at 'Warm 50' and all other image processing disabled. By setting Auto Dynamic Contrast to 'High' and enabling Dynamic Tone Mapping, you can get a slightly brighter image, as seen in this EOTF. Note that these settings increase the brightness of darker scenes, but the peak brightness of the display is the same.
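For reference, the PQ EOTF that Rtings tracks against is the SMPTE ST 2084 curve. A minimal sketch of it, using the published ST 2084 constants:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value in [0, 1]
# to absolute luminance in cd/m² (nits). Constants are the published
# ST 2084 values.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    p = signal ** (1 / M2)
    return 10_000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_eotf(1.0)))    # 10000 nits: the ceiling of the PQ container
print(round(pq_eotf(0.508)))  # ~100 nits: roughly SDR reference white
```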

Another thing I think is being neglected is that it is not absolute brightness that is important, but rather the dynamic range: the range between the darkest darks and the brightest brights. OLED has some of the darkest darks of all technologies, which offsets some of the need for the brightest brights to be as bright as they would be on other technologies.
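To put that in rough numbers (the peak and black-level figures below are illustrative assumptions, not measurements of any particular display):

```python
# Dynamic range as peak luminance over black level. A modest peak
# over a near-zero black can out-range a much brighter peak over an
# elevated black. Both rows use made-up but plausible figures.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

print(f"OLED-ish:  {contrast_ratio(800, 0.0005):>12,.0f}:1")  # near-zero black
print(f"FALD-ish:  {contrast_ratio(1500, 0.05):>12,.0f}:1")   # raised black/blooming
```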


First I had to google this because I didn't know, but nits and cd/m² are the same unit.

It would seem to me the "large window size" brightness figures are just not a practical issue. The benefit you get from HDR is when there are differences between important dark and light parts of the scene. Full-screen, full-brightness output is not necessarily particularly important to image quality. In my case, where I seem to prefer ~80 nits for full-screen brightness, the ~169 the LG G2 achieves at a 100% window is more than twice what I need.
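Reading the falloff straight off the sustained figures quoted above:

```python
# How the G2's sustained brightness falls off with window size,
# using the Rtings numbers quoted earlier in this post.
sustained = {2: 838, 10: 867, 25: 357, 50: 256, 100: 169}  # % window -> cd/m²

for window, nits in sustained.items():
    print(f"{window:>3}% window: {nits} cd/m² ({nits / sustained[2]:.0%} of the 2% figure)")
```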

The parts where high light output is really important (a reflection of light, or the hallway lights that blind the user) are small areas of the screen, and there it does amazingly well at over a thousand nits, which should be enough to blind anyone.
 
I don't get the brightness stuff.
I do but only in certain circumstances and use cases.

An example being a session of GT7 not too long ago on my QN94A, which Samsung states as "Quantum HDR 2000" capable. I'm not sure if I've actually seen 2000 (or near) nit highlights, but when the sun caught one of the cars in front of me and reflected the light straight at me, the impact was intense. And during night races, if a car shines its headlights into your rear view, kinda dazzling.

So it's a cool thing to have when the time is right.

Saying that we need it everywhere because of "better images" makes zero sense, and yet this is the argument continuously pushed. Where I just don't get the obsession is with exactly what you mentioned - why on earth would I want this dazzling level of brightness while working with mostly white windows? Comfort, while maintaining legibility, is the go-to for working.

Unless you're a game dev or video editor or such - then I get it.
 
And even then, dual panel IPS displays are superior because they don’t rely on zone dimming. Natively they’re as close as it gets to OLED’s infinite contrast without resorting to modulating the backlight.
 
And even then, dual panel IPS displays are superior because they don’t rely on zone dimming. Natively they’re as close as it gets to OLED’s infinite contrast without resorting to modulating the backlight.
Unfortunately dual panel displays have much lower efficiency due to light passing through two lossy LCD panels (plus the extra power needed for the second panel), resulting in a higher power requirement for the same light output.
This makes the monitor produce a lot more heat, which isn't good for the light source or the panels. Worse, dual panels trap more heat and are harder to cool, with more problems when there is a gap between them.
This means brightness must be limited to keep power use down, to reduce the additional rapid ageing to some degree, and to prevent heat-related early burn-in on the LCD panels.

It's not a progressive technology that can keep up, and in the current energy climate it will be frowned upon.
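A back-of-the-envelope sketch of why (the per-panel transmission figures below are assumptions for illustration, not measured specs):

```python
# Backlight luminance needed to reach a target output through
# stacked lossy LCD layers: each layer divides the light that gets
# through, so the requirements multiply. Transmission values assumed.
def required_backlight(target_nits: float, transmissions: list[float]) -> float:
    out = target_nits
    for t in transmissions:
        out /= t  # every lossy layer multiplies the backlight requirement
    return out

single = required_backlight(300, [0.05])      # one color panel, ~5% transmission
dual = required_backlight(300, [0.05, 0.40])  # plus a monochrome modulation layer

print(f"single panel: {single:,.0f}-nit backlight")
print(f"dual panel:   {dual:,.0f}-nit backlight ({dual / single:.1f}x the light, and heat)")
```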
 
DIY Perks on YouTube just built his own dual-LCD monitor, and the LEDs to light it drew 250 W alone, but it did look really nice.
 
Unfortunately dual panel displays have much lower efficiency due to light passing through two lossy LCD panels (plus the extra power needed for the second panel), resulting in a higher power requirement for the same light output.
This makes the monitor produce a lot more heat, which isn't good for the light source or the panels. Worse, dual panels trap more heat and are harder to cool, with more problems when there is a gap between them.
This means brightness must be limited to keep power use down, to reduce the additional rapid ageing to some degree, and to prevent heat-related early burn-in on the LCD panels.

It's not a progressive technology that can keep up, and in the current energy climate it will be frowned upon.
Hisense made the U9DG TV and it didn't really perform much better than LG OLED TVs while having all the drawbacks of LCD. So dual layer LCD seems like a pretty dead tech at this point, especially with the tightening of EU power regulations that have made manufacturers put "on by default" eco modes in their displays that gimp their capabilities to save power.

OLED and mini-LED seem to be the technologies of choice until micro-LED can be made cheaper, which means breakthroughs in manufacturing tech need to happen first.
 
Hisense made the U9DG TV and it didn't really perform much better than LG OLED TVs while having all the drawbacks of LCD. So dual layer LCD seems like a pretty dead tech at this point, especially with the tightening of EU power regulations that have made manufacturers put "on by default" eco modes in their displays that gimp their capabilities to save power.

OLED and mini-LED seem to be the technologies of choice until micro-LED can be made cheaper, which means breakthroughs in manufacturing tech need to happen first.

WOLED looks dead in the water as well; MLA can only do so much, and it doesn't even increase the color brightness to match QD-OLED. Lol, people on Discord were looking back at WOLED over the last 7-8 years since the C6 and found that overall picture quality has barely improved at all: basically just tiny incremental improvements every year, and you would be hard pressed to notice a major jump going from a 2016 C6 to a 2023 C3. LG really is Intel-stagnant when it comes to their OLED tech.
 
WOLED looks dead in the water as well; MLA can only do so much, and it doesn't even increase the color brightness to match QD-OLED. Lol, people on Discord were looking back at WOLED over the last 7-8 years since the C6 and found that overall picture quality has barely improved at all: basically just tiny incremental improvements every year, and you would be hard pressed to notice a major jump going from a 2016 C6 to a 2023 C3. LG really is Intel-stagnant when it comes to their OLED tech.
LG seems to have prioritized burn-in mitigation over all else as that's probably where you see the most improvement. C9 and up seem to have been mostly the same, with only the 48 and 42" sizes being new things. Honestly LG has gotten a lot of things right. I don't mind using WebOS for streaming services, the settings menus are reasonably easy to work with and so on. Also no bullshit like only two 4K 120 Hz ports like on the new Mediatek chip.
 
LG seems to have prioritized burn-in mitigation over all else as that's probably where you see the most improvement. C9 and up seem to have been mostly the same, with only the 48 and 42" sizes being new things. Honestly LG has gotten a lot of things right. I don't mind using WebOS for streaming services, the settings menus are reasonably easy to work with and so on. Also no bullshit like only two 4K 120 Hz ports like on the new Mediatek chip.

Oh yeah, for sure they've done pretty well up to this point. But QD-OLED and mini-LED are starting to bring real competition, so they've got to start moving the needle on picture quality at some point. I guess we can say MLA is a start of that.
 
It would seem to me the "large window size" brightness figures are just not a practical issue. The benefit you get from HDR is when there are differences between important dark and light parts of the scene. Full-screen, full-brightness output is not necessarily particularly important to image quality. In my case, where I seem to prefer ~80 nits for full-screen brightness, the ~169 the LG G2 achieves at a 100% window is more than twice what I need.

The parts where high light output is really important (a reflection of light, or the hallway lights that blind the user) are small areas of the screen, and there it does amazingly well at over a thousand nits, which should be enough to blind anyone.
The HDR content can easily surpass the 10% window peak brightness. OLED can only be brighter at window sizes below 1%, which is just a few dots in total.

At 1%-10% windows OLED already looks dimmer compared to miniLED.

Beyond a 10% window miniLED will crush OLED.
 
The HDR content can easily surpass the 10% window peak brightness. OLED can only be brighter at window sizes below 1%, which is just a few dots in total.

At 1%-10% windows OLED already looks dimmer compared to miniLED.

Beyond a 10% window miniLED will crush OLED.
And still you act as if brightness is the only thing a display needs to be good.
 
Office use typically requires 250-300 nits of brightness, unless you're in a house with low ambient light.

My environment has low to moderate brightness and 160 nits is generally plenty for me.


Games in SDR require a much higher brightness than 150 nits, essentially requiring what "HDR" OLED offers. However, the HDR on OLED screens can be dimmer than SDR. Yet the flickering kicks in at a slightly higher APL. You cannot stare at an APL of 250 nits on an OLED screen in a dimly lit room for long. You will say it is too "bright".

This is false. 160 nits is what I play games at. I'm playing Atomic Heart (which doesn't have native HDR) and it looks beautiful at 160 nits in the sRGB colorspace. I play SDR games on my TV at ~160 nits also. I know some people prefer brighter, and that's fine and depends on preference and room conditions, but saying "games in SDR require a much higher brightness than 150 nits" is not accurate.

As far as flickering, supposedly the new ASUS OLED (same panel as the LG) can do higher-nit SDR without a problem, but obviously I haven't tried it; it'll be interesting to see if there are any flickering/eyestrain reports.


Games in HDR require a much higher brightness than what OLED can offer. Just like Cyberpunk requires a range of colors and brightness levels that often exceeds what OLED can offer. They don't shine on OLED at all.


Even in a low-APL night scene the range of highlights already reaches 1500+ nits. It's not the same image on OLED.

Cyberpunk looks gorgeous on my display in HDR. It also has many settings to match HDR to your display. I'm sure it can get brighter on a FALD display, but it's certainly no slouch on OLED, especially if you're not comparing side-by-side.
 