LG 32GS95UE – OLED 31.5″ with 4K @ 240Hz and 1080p @ 480Hz Support - The death knell of LCD panels

It's certainly not doing 1000 nits, but it might be able to get a lot closer to it, like the PG42UQ and PG27AQDM do.

View attachment 651041

View attachment 651042

I would be more than happy to have that kind of HDR performance vs what QD OLED offers.

View attachment 651043

View attachment 651044

Again color volume can be thrown out the window when this is all the panel is capable of.
I'm still wondering what the heck people think they need 1000 nits full screen for?

I have a 27 inch IPS which does 430 nits SDR and 520 HDR, and I've never had it anywhere close to max. I always game with a (not very bright) overhead light on. And I never turn the brightness past "48"; it's usually more like 43.
For HDR, I appreciate a good twinkle. But I don't want anything so bright it causes me to squint and look away.

1000 nits for a 2% window would maybe be alright for me. But I wouldn't want it that bright for anything larger.
 
I'm still wondering what the heck people think they need 1000 nits full screen for?

I have a 27 inch IPS which does 430 nits SDR and 520 HDR, and I've never had it anywhere close to max. I always game with a (not very bright) overhead light on. And I never turn the brightness past "48"; it's usually more like 43.
For HDR, I appreciate a good twinkle. But I don't want anything so bright it causes me to squint and look away.

1000 nits for a 2% window would maybe be alright for me. But I wouldn't want it that bright for anything larger.

I didn't say I wanted 1000 nits fullscreen dude...
 
I'm still wondering what the heck people think they need 1000 nits full screen for?

I have a 27 inch IPS which does 430 nits SDR and 520 HDR, and I've never had it anywhere close to max. I always game with a (not very bright) overhead light on. And I never turn the brightness past "48"; it's usually more like 43.
For HDR, I appreciate a good twinkle. But I don't want anything so bright it causes me to squint and look away.

1000 nits for a 2% window would maybe be alright for me. But I wouldn't want it that bright for anything larger.

The lower your peak nits, the lower your color volume.

As I understand it, any time you use dynamic tone mapping (or HLG tone mapping that diverges from the static scale, which you can do with HLG since it is relative brightness)
rather than PQ "absolute value" HDR (plus static tone mapping by your screen into its range limitations)
- you are stealing brightness, and thus color volume and detail-in-color, from other ranges. So you'll probably end up clipping detail (turning it into a white blob) at the top end of the scale, commonly losing the detail of the sun and clouds in a bright sky to a featureless white patch, but it applies to any bright color detail pushed out of the display's range at the top end.

Any time you shift things upward dynamically you are usually pushing stuff at the top over the cliff (clipping to white). You might also raise the black floor, depending.

In PQ HDR with the display's dynamic tone mapping turned off (and/or HGiG mode activated), the static range is delivered to the display, and the display usually keeps the bottom "SDR" portion, plus maybe 50-60% of the screen's brightness range, mapped 1:1, then compresses the rest into the top as necessary. Exactly how it's done varies by manufacturer, model, and firmware version. For example, you might be sent HDR4000 metadata but have to compress everything past 500-600 nits into the last ~400 nits on a 1000-nit screen.
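Rough sketch in Python of the kind of knee-and-roll-off I'm describing. The knee point, curve shape, and numbers are all made up for illustration; real curves vary by manufacturer and firmware:

```python
def tone_map(nits_in, display_peak=1000.0, content_peak=4000.0, knee=0.6):
    """Toy static tone-mapping curve (illustrative only, not any vendor's actual algorithm).

    Everything below the knee passes through 1:1; everything between the knee
    and the content's mastering peak gets compressed into the display's
    remaining headroom.
    """
    knee_nits = knee * display_peak                  # e.g. 600 nits on a 1000-nit panel
    if nits_in <= knee_nits:
        return nits_in                               # 1:1 region
    # How far we are between the knee and the mastering peak (0..1)
    t = (nits_in - knee_nits) / (content_peak - knee_nits)
    # Soft roll-off squeezing everything above the knee into the last ~40% of the panel
    return knee_nits + (display_peak - knee_nits) * (t ** 0.5)

print(tone_map(600))    # 600.0  -> still 1:1
print(tone_map(1000))   # ~737   -> already being compressed
print(tone_map(4000))   # 1000.0 -> pinned at the panel's peak
```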


. . .

In regard to your question: with HDR 4,000 or HDR 10,000 material (and even HDR 1,000 material on screens that can't show it fully), a screen incapable of showing that range adequately will map the HDR data to roughly the bottom half of its range 1:1, then compress the rest into whatever capability remains. The lower the screen's peak nits, the more you are compressing color detail. The numbers aren't literal, but to simplify: instead of 4 or 10 tones of a color gradient, you are down to 1. Screens are already compressing and losing detail in the top half of their range, where the number of colors is already squashed down; cutting the peak lower loses even more color and detail-in-color.

That doesn't even take into consideration artificially lowering your screen's brightness values in the OSD for whatever reason, using dynamic value options in the OSD, or things like ABL (which both OLED and some bright FALD screens suffer from), and strict percent-of-screen brightness limits and duration thresholds kicking in.

I watched a Dolby Vision mastering documentary video a few years ago where the Dolby tech said (at least at the time) that a typical HDR 10,000 scene was mastered as something like 50% at "zero" to 100 nits, 25% at 100 to 1,000 nits, and 25% at 1,000 to 10,000 nits.
Your TV's firmware would tone map that down within its own ratios to fit the capacity of the screen, though. When fed an HDR 10,000 curve, the LG CX's curve was quoted to me as mapping accurately up to 400 nits, then squeezing 400-10,000 nits into the remaining 400-800 nits.
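To put made-up numbers on that CX behaviour using the toy curve above (again, just an illustration of the shape, not LG's actual firmware):

```python
# Hypothetical CX-ish settings: an ~800-nit panel, accurate to ~400 nits,
# fed material flagged as mastered to 10,000 nits. Reuses tone_map() from the sketch above.
for src in (400, 1000, 4000, 10000):
    print(src, "->", round(tone_map(src, display_peak=800, content_peak=10000, knee=0.5)))
# 400 -> 400, 1000 -> 500, 4000 -> 645, 10000 -> 800
```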
 
Ok, replace 'full screen' with 'real scene'?

The two are completely different, my guy. Max full-screen brightness is almost as useless as 1% window brightness. Both the PG42UQ and PG27AQDM have the poorest full-screen brightness, barely over 100 nits, yet they deliver almost the best HDR experience among OLED monitors. If you're fine with highlights being no more than 450 nits, that's OK if it's what you prefer, but if the source material is calling for 1000 nits then I would like to get as close to that as possible. I wouldn't even consider myself a brightness snob, as I would be totally content with 1000 nits; others here want as much as possible and won't settle for 1000.
 
RTINGS' real scene test measures a 10% highlight. This discussion is going into bizarro world. Someone is justifying lower brightness capabilities because higher brightness hurts their eyes. Alright.
 
I'm still wondering what the heck people think they need 1000 nits full screen for?
It can be pretty impressive for special effects. In Jedi Survivor when you get flash-banged it'll do as bright a full screen white as it can and it is really impactful. That said, it isn't that big a deal.
1000 nits for a 2% window would maybe be alright for me. But I wouldn't want it that bright for anything larger.
No it really wouldn't because you have to remember test patches are that bright with nothing else on the screen. If there's other content, what the monitor is capable of lowers. That's why there's such a focus on 10-25% brightness, not because scenes normally have that many highlights, many don't, but because the screen needs enough headroom to display all the other content as well. If you have a couple of highlights that hit 1000 nits that only take up 2% of the screen, well you still have to display other stuff. So the display needs to be capable of more than just 2% at full brightness and the rest at 0 or it'll have to lower the brightness.
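To put toy numbers on that (assuming, purely for illustration, that the panel enforces an area-weighted average brightness limit, which is roughly how ABL behaves):

```python
# Made-up figure: suppose the panel can only sustain ~270 nits averaged over the whole screen.
average_budget = 270.0       # nits, full-screen-equivalent limit (invented for illustration)
highlight_area = 0.02        # 2% of the screen at the peak
highlight_nits = 1000.0

# Whatever budget is left over caps the other 98% of the image.
rest_ceiling = (average_budget - highlight_area * highlight_nits) / (1 - highlight_area)
print(round(rest_ceiling))   # ~255 nits for everything that isn't the highlight
```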

That's why this is not nearly such a big deal for TVs. They often have a full screen brightness lower than the monitors, but they can deal with highlights much better without lowering the brightness. They'll be at or over 1000 nits at a 10% window, which translates into better results in real scenes.

Now, if you just flat out don't like bright, not even for small highlights, that's 100% fine, but most people do. This isn't just me speculating, Dolby tested it when they developed the PQ curve and the 10,000 nit standard and the reason it goes so high is people like those bright highlights.
 
It can be pretty impressive for special effects. In Jedi Survivor when you get flash-banged it'll do as bright a full screen white as it can and it is really impactful. That said, it isn't that big a deal.

No it really wouldn't because you have to remember test patches are that bright with nothing else on the screen. If there's other content, what the monitor is capable of lowers. That's why there's such a focus on 10-25% brightness, not because scenes normally have that many highlights, many don't, but because the screen needs enough headroom to display all the other content as well. If you have a couple of highlights that hit 1000 nits that only take up 2% of the screen, well you still have to display other stuff. So the display needs to be capable of more than just 2% at full brightness and the rest at 0 or it'll have to lower the brightness.

That's why this is not nearly such a big deal for TVs. They often have a full screen brightness lower than the monitors, but they can deal with highlights much better without lowering the brightness. They'll be at or over 1000 nits at a 10% window, which translates into better results in real scenes.

Now, if you just flat out don't like bright, not even for small highlights, that's 100% fine, but most people do. This isn't just me speculating, Dolby tested it when they developed the PQ curve and the 10,000 nit standard and the reason it goes so high is people like those bright highlights.

Exactly. This is why the test patch brightness almost never lines up with the real scene brightness. Here is RTINGS' real scene video; they take measurements on the upper left light source:


View: https://www.youtube.com/watch?v=dc6zafyvE1M

There is absolutely no way in hell these QD-OLED monitors will hit 1000 nits on the upper left light source in a scene like this, and even the ~480 nits that they score on test patches in TB400 mode is a stretch, because the actual real scene brightness ends up being more like 400.
 
As I understand it anyway (I'm open to other interpretations), it's like this:

Whenever you are viewing HDR material mastered higher than your screen's capability, all of the colors in the top ~half of your screen's range are probably being compressed into a smaller number of values. That's what color volume is about.

Say you had some advanced screen that could show 1000 nits accurately at 25% and 50% of the screen's area, and with some decent duration. When you send it HDR 4,000 or HDR 10,000 material, all of the detail above, say, 500 nits or so is probably being compressed into the last ~500 nits the screen has left, starting with the colors at 501 nits, 502 nits, and onward. Compressed meaning a larger number of colors is squeezed down to fit into a smaller number of colors.

Likewise, if you send 1000-nit material to a screen that isn't capable of showing 1000 nits fully, you are compressing the colors in the top half of its range even more.

You aren't just cutting out the peak color values when you have a lower-nit / lower-color-volume screen; the result can be that you squash more of the colors into fewer color-value "slots" across the entire second half of your screen's range. It's not just that you turned that spotlight or reflection down from 1000 nits to 600 nits. As I understand it, you'd have in effect set your scale to top out at 600 (similar to having a 600-nit screen), squashing all of the colors into fewer color values all the way down the scale within the top 50% of your screen's range. It's like a slinky or an accordion on the top half: when you reduce the top height, you squash that whole portion into a smaller number of colors and lose detail-in-color.

I think the only other way to reduce the peaks without squashing the entire top ~"half" of the screen would be a hard clip cutoff, where everything above a certain value ends up "clipped", with all of those values turned into a white blob, which is what can happen when using dynamic tone mapping or the relative-brightness HLG HDR format with the brightness turned up. That's not desirable to most people either.
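Here's a toy way to see the "slinky" squashing in numbers (a made-up linear compression, just to show how many distinct steps survive):

```python
# A 4,000-nit master on a 1,000-nit panel: pass everything up to 500 nits through 1:1
# and squeeze 500-4,000 linearly into the panel's remaining 500-1,000.
def squash(n):
    return n if n <= 500 else 500 + (n - 500) * (1000 - 500) / (4000 - 500)

source_levels = range(500, 4001)                      # 3,501 distinct source values
output_levels = {round(squash(n)) for n in source_levels}
print(len(source_levels), "->", len(output_levels))   # 3501 -> 501: ~7 source values per surviving step
```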
 
Yes I understand tone mapping and how that comes into play with HDR, brightness, colors, etc.

I'm just saying, I don't think I could stand to look at 700+ nits of 'real scene' brightness, while sitting 3 feet away.

Those really high nits make more sense to me, in a large living room situation.

However, regardless, I like to have some ambient light or light behind the display, to balance things out for me.
 
Like others here, I thought I cancelled mine, but it arrived today as well. I'll be interested to hear what others think. Text clarity is still quite bad despite the claimed improvements, which likely means it will be going back, because I'm doing more work than gaming these days. Full-screen brightness is also not all that satisfactory in the daytime. It's also really annoying that you can't remap the button that swaps between 4K and 1080p to toggle between inputs instead, as I am constantly switching between my work and personal computer during the day.

It looks pretty good in games, and the motion is excellent, but again the lack of brightness really stands out. Media also looks excellent on it.
 
Yes I understand tone mapping and how that comes into play with HDR, brightness, colors, etc.

I'm just saying, I don't think I could stand to look at 700+ nits of 'real scene' brightness, while sitting 3 feet away.

Those really high nits make more sense to me, in a large living room situation.

However, regardless, I like to have some ambient light or light behind the display, to balance things out for me.
Do you squirm in pain anytime you're driving when the sun's out?
 
Do you squirm in pain anytime you're driving when the sun's out?

Our eyes view everything relatively, so you have to take that into account. A smartphone at max brightness will look much brighter in dim to dark conditions than when viewed out in daylight or bright sunlight. PQ HDR is mastered for a dim-to-dark reference viewing environment, though, as a solid foundation to roll out nits from. Similarly, calibration of screens is usually done right up against the screen in dark conditions, without polluting the scanned area with ambient light.

That said, sunlight is way brighter than anything a screen can do, and a lot of people wear sunglasses driving in bright sunlight.


. . . .

https://www.theverge.com/2014/1/6/5276934/dolby-vision-the-future-of-tv-is-really-really-bright

"The problem is that the human eye is used to seeing a much wider range in real life. The sun at noon is about 1.6 billion nits, for example, while starlight comes in at a mere .0001 nits; the highlights of sun reflecting off a car can be hundreds of times brighter than the vehicle’s hood. The human eye can see it all, but when using contemporary technology that same range of brightness can’t be accurately reproduced. You can have rich details in the blacks or the highlights, but not both.

So Dolby basically threw current reference standards away. "Our scientists and engineers said, ‘Okay, what if we don’t have the shackles of technology that’s not going to be here [in the future],’" Griffis says, "and we could design for the real target — which is the human eye?" To start the process, the company took a theatrical digital cinema projector and focused the entire image down onto a 21-inch LCD panel, turning it into a jury-rigged display made up of 20,000 nit pixels. Subjects were shown a series of pictures with highlights like the sun, and then given the option to toggle between varying levels of brightness. Dolby found that users wanted those highlights to be many hundreds of times brighter than what normal TVs can offer: the more like real life, the better."
------------------------------------
 
So compared side-by-side with the AW32 QD-OLED, they both have some give and take, as expected. The colors are slightly better on the QD, but the whites are way better on the WRGB, also as expected. With some tweaking in Vivid mode for use during gaming, I am able to get small HDR highlights and colors closer to the QD-OLED, but with higher average brightness throughout the whole image. Overall, the bright whites of the LG make the QD-OLED whites look a bit warm and dingy, no matter what settings I tweak on it.

There is a tiny bit of sparkle from the AR film, but it's not a deal breaker, and I think I prefer it over the semi-gloss of the AW32. It's definitely one of the better AR films out there. The LG also has smaller bezels and, I think, a more professional-looking stand versus the white Alienware. Going back to a flat monitor, I thought I would not like it, but after some use I think I may prefer flat at the 32" size.

I can only hear the fan if my ear is up to the top of the monitor. These new panels on both sides of the fence have much better sub-pixel layout for text, and I would rate them about the same. It's good enough now that I don't think it's a huge issue unless you need the sharpest super fine text, which really should be reserved for 4K/27" anyway. Or even 5K/27" or 6K/32".

The motion clarity on the 480 Hz mode is epic, but for me, even in FPS's, I'm not sure the clarity loss is worth it. Maybe if you play very basic games like arena shooters. In my three main games: Halo Infinite, COD:MW3 and PUBG, I prefer 4K/240 Hz. 240 FPS on OLED is still pretty clear.

IMO this is the best overall single gaming display out there, especially if you are a competitive FPS player, with that insane 480 Hz OLED motion clarity. The only other option I see on the horizon is a 27" 480 Hz 1440p, but for desktop use and other games I'd have to get a second monitor, and I prefer only one monitor on the desk these days.

PS: Has anyone found out information about the differences in "Screen Move" settings 1 through 3?

It's so funny that they even show the setting in their marketing material, yet nowhere is it stated what the different modes do, not even in the manual:

1714568176627.png



1714568601254.png



LOL at the above. Hey, set your Screen Move properly for your environment, but we won't tell you what the settings do!
 
The AR film on this display is the best I've ever seen.
 

Attachments: 20240501_104212.jpg, 20240501_104046.jpg
I was able to check out my friend's, which arrived today, and I gotta say I much prefer the HDR performance of this over the 32" QD-OLED I had. It's probably because it's not getting quite as bright and clips at only 600 nits in the HDR Calibration app, but still, it didn't look so dim in certain games and the ABL fluctuations were far less severe.

EDIT: I went into the 480Hz mode thinking I'd barely be able to see a difference in smoothness, but there is indeed a noticeable difference to my eyes. Against the Storm is one of the few games where 480 FPS is possible on his setup, and the gain in smoothness and motion clarity is definitely there. Also, the monitor switches between HDR/SDR super fast.
 
The AR film on this display is the best I've ever seen.
Can you try making a custom resolution for 2560x1440@360hz, please? The monitor should have enough bandwidth to handle this.
Also, how does the monitor scale 1920x1080 and 1280x720 in 4k/240hz mode? Is it sharp like the 1080p/480hz mode?
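Back-of-envelope math on why I think the bandwidth should be there, at least with DSC (my own rough assumptions on blanking overhead and bit depth):

```python
# 2560x1440 @ 360 Hz, assuming ~20% blanking overhead and 10-bit RGB (30 bpp).
pixels_per_second = 2560 * 1440 * 360 * 1.20       # ~1.59e9 px/s
uncompressed_gbps = pixels_per_second * 30 / 1e9    # ~47.8 Gbps

dp14_payload_gbps = 25.92     # DP 1.4 HBR3 after 8b/10b encoding
hdmi21_payload_gbps = 42.67   # HDMI 2.1 FRL 48G after 16b/18b encoding

print(uncompressed_gbps > dp14_payload_gbps)      # True -> won't fit DP 1.4 uncompressed
print(uncompressed_gbps > hdmi21_payload_gbps)    # True -> won't even fit HDMI 2.1 uncompressed
print(uncompressed_gbps / 3 < dp14_payload_gbps)  # True -> fits easily with ~3:1 DSC
```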
 
I was able to check out my friend's, which arrived today, and I gotta say I much prefer the HDR performance of this over the 32" QD-OLED I had. It's probably because it's not getting quite as bright and clips at only 600 nits in the HDR Calibration app, but still, it didn't look so dim in certain games and the ABL fluctuations were far less severe.

EDIT: I went into the 480Hz mode thinking I'd barely be able to see a difference in smoothness, but there is indeed a noticeable difference to my eyes. Against the Storm is one of the few games where 480 FPS is possible on his setup, and the gain in smoothness and motion clarity is definitely there. Also, the monitor switches between HDR/SDR super fast.
I saw a video on YouTube yesterday where they played tons of HDR content on the Gigabyte QD-OLEDs and the LG next to each other, and the LG looked better IMO in almost everything but small, bright colored highlights.


View: https://youtu.be/z7n0VXU7i6g?feature=shared

Unclear what settings or modes they are using on the monitors.
 
The 1080p/480hz mode looks really bad to me. I think you have to be very desperate for the extra 240hz to justify losing the visual clarity of 4K.

Even in Doom Eternal, where you can achieve 350-450 FPS, the fact that I can count pixels completely offsets any motion smoothness/clarity benefits.

I would only really use it in very specific games or if you can put the monitor 1 Shaquille O'Neal length away.

Also I think I prefer the text clarity of QD-OLED vs this WOLED. It looks soft to me vs a 4K IPS.
 
I was able to check out my friend's, which arrived today, and I gotta say I much prefer the HDR performance of this over the 32" QD-OLED I had. It's probably because it's not getting quite as bright and clips at only 600 nits in the HDR Calibration app, but still, it didn't look so dim in certain games and the ABL fluctuations were far less severe.
Unless that's a bug with the calibration app, these are supposed to get brighter than QD-OLED for 1% windows with a test pattern: ~1300 nits.

I think there are a few different modes on the monitor, which affect HDR behavior? Maybe check out a couple of reviews to make sense of them, and get the most brightness from the panel.
 
Unless that's a bug with the calibration app, these are supposed to get brighter than QD-OLED for 2% windows with a test pattern: ~1300 nits.

I think there are a few different modes on the monitor, which affect HDR behavior? Maybe check out a couple of reviews to make sense of them, and get the most brightness from the panel.
Peak brightness is set to high in the OSD. VESA DisplayHDR app also reports max CLL as 603 and max peak luminance the same.

Either these are being shipped with incorrect HDR output or it's intentional.

EDIT: We checked for a firmware update and it shows the latest.
 
Considering this was originally planned for a second-half-of-2024 release and got rushed out the door to compete with the QD hype, it's likely LG will go back and release a firmware update to fix some of the HDR issues.

Doubt it will be anything crazy to get excited about, but the max 603 issue, the EOTF curve, and a slight brightness bump in the 5-10% range will likely see some improvements at some point... at least you would hope.
 
Well, that didn't last long. He's already putting it back in the box to return it, lol.

I think it's a decent monitor, but some of WOLED's shortcomings are as present as ever, like the super grainy greys and soft text quality. As a panel technology there is no doubt QD-OLED is superior right now; it's just too bad that their HDR implementation of the 400/1000 modes is dumb.

Hopefully this thing will see an update to address the HDR clipping at 600 nits. I think that might be what's responsible for me seeing so much less ABL compared to QD-OLED, since it's behaving like the QD-OLED HDR400 mode.

EDIT: Just want to reiterate how horrible the 1080p/480Hz mode looks. It's seriously like how I remember 640x480 back in the day.
 
Glad I held off on these monitors until I read the real skinny here at [H]. I'll reiterate how awesome the QD-OLED TVs look in gaming at 144Hz, if you can make a big screen work.

Hopefully we will get better, brighter monitor options in 2025. Or 240Hz on the televisions. That would be effing epic. Right now I'm interested in hearing from QN900D owners; that one can do 4K/240. But it's mini-LED, so meh.
 
Glad I held off on these monitors until I read the real skinny here at [H]. I'll reiterate how awesome the QD-OLED TVs look in gaming at 144Hz, if you can make a big screen work.

Hopefully we will get better, brighter monitor options in 2025. Or 240Hz on the televisions. That would be effing epic. Right now I'm interested in hearing from QN900D owners; that one can do 4K/240. But it's mini-LED, so meh.

On the monitor side I'm thinking we won't see much brighter options until PHOLED comes around, but the latest news is that there are issues with stability that will delay adoption: https://www.oled-info.com/elec-udcs...-still-unstable-may-delay-market-introduction

Given this delay I don't think we'll see significantly brighter OLED monitors in 2025 or even 2026, but probably closer to 2030. PHOLED would be used in smartphones and TVs first before trickling into monitors, so we are going to be in for the long haul with the current crop of monitors.
 
On the monitor side I'm thinking we won't see much brighter options until PHOLED comes around, but the latest news is that there are issues with stability that will delay adoption: https://www.oled-info.com/elec-udcs...-still-unstable-may-delay-market-introduction

Given this delay I don't think we'll see significantly brighter OLED monitors in 2025 or even 2026, but probably closer to 2030. PHOLED would be used in smartphones and TVs first before trickling into monitors, so we are going to be in for the long haul with the current crop of monitors.
I am confident we will see brighter OLED next year... because I am confident they are artificially holding back current panels so that they can easily sell us a brighter version next year. Both Samsung and LG released brand new panels with new tech for added brightness and better algorithms for longevity, and they... really aren't much brighter, overall. Look at Hardware Unboxed's HDR numbers for Alienware's 1st gen 34 inch QD-OLED panel compared to their 3rd gen 32 inch panel. It's practically the same brightness. So a panel two generations newer has no meaningful increase in brightness? I don't buy that.

LG's 32, with MLA 2.0 and a new pixel structure, even curiously follows the brightness curve of the QD-OLEDs nearly exactly.

They are selling us on the newness of the 32 inch size for both, the 27 size for QD-OLED, and a bump to refresh rates.
LG even refreshed their 27 without adding a new gen panel, yet it's significantly brighter than their first one in SDR and HDR, while potentially using the same panel. The refresh could be MLA 1.0, whereas the first one may not be MLA at all. I'm not sure if that info is known.
 
I am confident we will see brighter OLED next year... because I am confident they are artificially holding back current panels so that they can easily sell us a brighter version next year. Both Samsung and LG released brand new panels with new tech for added brightness and better algorithms for longevity, and they... really aren't much brighter, overall. Look at Hardware Unboxed's HDR numbers for Alienware's 1st gen 34 inch QD-OLED panel compared to their 3rd gen 32 inch panel. It's practically the same brightness. So a panel two generations newer has no meaningful increase in brightness? I don't buy that.

LG's 32 even curiously follows the brightness curve of the QD-OLEDs nearly exactly.

They are selling us on the newness of the 32 inch size for both, the 27 size for QD-OLED, and a bump to refresh rates.
LG even refreshed their 27 without adding a new gen panel, yet it's significantly brighter than their first one, which may have the same panel. The refresh could be MLA 1.0, whereas the first one may not be MLA at all. I'm not sure if that info is known.

Yes and no. It has less aggressive ABL at the 50% and 100% window sizes, but in real content they are practically identical. Real scene brightness of 424 nits vs 439 nits, hardly a difference.

1714604010375.png


The first gen does have MLA. HDTVTest has a video proving it with a macro shot.

As for QD-OLEDs, they will not push the brightness higher because the 1st gen was pretty bad when it came to burn-in, so the 3rd gen addressed that. If they cranked the brightness up, it would go back to being bad at longevity like the 1st gen panels.
 
Yes and no. It has less aggressive ABL at the 50% and 100% window sizes, but in real content they are practically identical. Real scene brightness of 424 nits vs 439 nits, hardly a difference.

View attachment 651286

The first gen does have MLA. HDTVTest has a video proving it with a macro shot.
Here are RTINGS' numbers.

100% and 50% window HDR brightness is ~100 nits better. Overall scene brightness is similar, and some of that is because they updated the firmware for the 27GR to improve HDR brightness.
SDR is where the big improvement is.

The 27GS also has a mode which disables ABL and keeps a consistent ~260 nits when the brightness setting is maxed out.

27GS (new/refreshed 27)
1714604243798.png


27GR (LG's 27 from last year)
1714604372823.png
 
Liking this model so far, but noticing some dimming of the corners during gaming. I think it's some kind of UI dimming algorithm. Anyone know how to enter the Service Menu?
 
Here are RTINGS' numbers.

100% and 50% window HDR brightness is ~100 nits better. Overall scene brightness is similar, and some of that is because they updated the firmware for the 27GR to improve HDR brightness.
SDR is where the big improvement is.

27GS (new/refreshed 27)
View attachment 651287

27GR (LG's 27 from last year)
View attachment 651289

Honestly, I couldn't care less about SDR brightness. I never use SDR. HDR brightness is what matters to me, so if they don't improve on that next year then I won't bother.
 
I was able to check out my friend's, which arrived today, and I gotta say I much prefer the HDR performance of this over the 32" QD-OLED I had. It's probably because it's not getting quite as bright and clips at only 600 nits in the HDR Calibration app, but still, it didn't look so dim in certain games and the ABL fluctuations were far less severe.

EDIT: I went into the 480Hz mode thinking I'd barely be able to see a difference in smoothness, but there is indeed a noticeable difference to my eyes. Against the Storm is one of the few games where 480 FPS is possible on his setup, and the gain in smoothness and motion clarity is definitely there. Also, the monitor switches between HDR/SDR super fast.

Having both the QD-OLED and the LG side by side, the full scene/real scene HDR is definitely brighter on the LG WRGB. In Vivid mode, I'm talking like 50% brighter, it's crazy.

For some reason my Win 11 shows:

1714604310153.png


9,300 nits haha. The HDR calibrator app doesn't clip until 2,700 nits on the slider is reached, which is obviously an error. All I know is games/videos look fantastic in HDR on this thing.
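Speculation on my part, but one way a number like that can show up is a wrong byte in the display's HDR static metadata: CTA-861 codes the max luminance rather than storing plain nits, so a single off value decodes to something huge. For example (hypothetical coded values, just showing the decode formula):

```python
# CTA-861.3 decodes the max-luminance code value CV as 50 * 2**(CV/32) cd/m^2.
def decode_max_luminance(cv):
    return 50 * 2 ** (cv / 32)

print(round(decode_max_luminance(137)))   # ~972  -> roughly what a ~1000-nit panel might report
print(round(decode_max_luminance(241)))   # ~9249 -> in the ballpark of the bogus 9,300 Windows shows
```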


Can you try making a custom resolution for 2560x1440@360hz, please? The monitor should have enough bandwidth to handle this.
Also, how does the monitor scale 1920x1080 and 1280x720 in 4k/240hz mode? Is it sharp like the 1080p/480hz mode?

The only way to create a custom resolution is to turn DSC off, and then it limits you to 1440p@144 Hz. The NVIDIA Control Panel rejects any higher custom resolutions.

The 1080p/480hz mode looks really bad to me. I think you have to be very desperate for the extra 240hz to justify losing the visual clarity of 4K.

Even in Doom Eternal, where you can achieve 350-450 FPS, the fact that I can count pixels completely offsets any motion smoothness/clarity benefits.

I would only really use it in very specific games or if you can put the monitor 1 Shaquille O'Neal length away.

Also I think I prefer the text clarity of QD-OLED vs this WOLED. It looks soft to me vs a 4K IPS.

It's 1080p at 32", so yeah, you have to push the monitor back. But that motion clarity is insane. I too prefer regular 4K/240 Hz for most things, though. The text is a little soft, but not unbearable for me.

Peak brightness is set to high in the OSD. VESA DisplayHDR app also reports max CLL as 603 and max peak luminance the same.

Either these are being shipped with incorrect HDR output or it's intentional.

EDIT: We checked for a firmware update and it shows the latest.

At what nits did his clip in the HDR calibrator app?
 
Honestly, I couldn't care less about SDR brightness. I never use SDR. HDR brightness is what matters to me, so if they don't improve on that next year then I won't bother.
And ASUS already proved the older WOLED panel can do higher peak brightness.

I'm sure LG's new 27 inch 1440p 480Hz monitor will have similar peak brightness to the 27 inch QD-OLEDs.
 