Dell Alienware AW3423DW 34″ QD-OLED 175Hz (3440 x 1440)

Your obsession with this is weird, and has nothing to do with the flickering. I don't run my current monitor at 250 nits, I didn't run my flickery BenQ at 250 nits, and I wouldn't be running this OLED at 250 nits if I got it. Not sure why you are acting like this is something people do; it has no relevance to whether flicker is bothersome to a person or not.
People just talk without even understanding what they are saying. Your imagination is running wild. You haven't seen enough.
I am pointing out that the monitor flickers. It's a fact that the monitor flickers; it's not even an opinion. And you are mistaking facts for opinions.
There is one thing you cannot do no matter what your opinion is: stare at its 250 nits for over an hour in a dark room without eye strain. The flickering will cause eye strain. Other monitors won't have this issue.
And if you have a studio monitor like I do, instead of a Genelec audio solution, you will instantly realize how much worse this monitor looks.
 
People just talk without even understanding what they are saying. Your imagination is running wild. You haven't seen enough.
I am pointing out that the monitor flickers. It's a fact that the monitor flickers; it's not even an opinion. And you are mistaking facts for opinions.
There is one thing you cannot do no matter what your opinion is: stare at its 250 nits for over an hour in a dark room without eye strain. The flickering will cause eye strain. Other monitors won't have this issue.
And if you have a studio monitor like I do, instead of a Genelec audio solution, you will instantly realize how much worse this monitor looks.
Man, stop spamming this in every thread about OLED. There is no real flickering of note aside from the near-black gamma thing that's been known about for years. Stop being so angry and obsessive about this topic. Go back to Discord, if you weren't banned for your spam ;).
 
Man, stop spamming this in every thread about OLED. There is no real flickering of note aside from the near-black gamma thing that's been known about for years. Stop being so angry and obsessive about this topic. Go back to Discord, if you weren't banned for your spam ;).
Funny how OLED shills threaten bans. Too bad you can't abuse your power, even if you had any at all.
 
There is one thing you cannot do no matter what your opinion is: stare at its 250 nits for over an hour in a dark room without eye strain
Literally anyone can do this. It's the easiest thing in the world. If you can't, I'm sorry, you have an eyesight problem; maybe get that checked out. Also, not sure why you keep saying 250 nits. That's more than double the normal calibration standard of 80-120 nits, and using a screen at 250 nits in a dark room just means you have no idea what you're doing.
 
Literally anyone can do this. It's the easiest thing in the world. If you can't, I'm sorry, you have an eyesight problem; maybe get that checked out. Also, not sure why you keep saying 250 nits. That's more than double the normal calibration standard of 80-120 nits, and using a screen at 250 nits in a dark room just means you have no idea what you're doing.
Funny, who's trolling now? I'm telling you to stare at its flickering 250 nits in a dark room, if your eyes can stand it.

There are brightness dips of up to 50 nits waiting for you even in SDR, captured by test equipment.
[Image: luminance trace capture showing the brightness dips]


At the start of every frame there is a black period, because OLED's emulated DC dimming is not true DC dimming at all. It's always flickering. And this is the worst kind of flickering: unregulated, and introducing more eye strain.
[Image: luminance trace capture showing the black period at the start of each frame]
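
For reference, the standard way to boil a trace like the captures above down to one number is "percent flicker" (modulation depth). A minimal sketch, with made-up sample values rather than the measured data:

```python
# Percent flicker (modulation depth) from a sampled luminance trace.
# The trace below is a hypothetical stand-in, not the captured data above.

def percent_flicker(trace):
    """IES percent flicker: (Lmax - Lmin) / (Lmax + Lmin) * 100."""
    hi, lo = max(trace), min(trace)
    return (hi - lo) / (hi + lo) * 100.0

trace_nits = [250, 249, 248, 200, 247, 250, 249, 201]  # hypothetical samples
print(f"{percent_flicker(trace_nits):.1f}% flicker")   # ~11% for a 50-nit dip
```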


And it is you who has no idea what 400-nit wide-color SDR looks like. It looks better than this monitor in HDR. Too bad this monitor cannot do that; instead, it flickers. You should have realized how dull this monitor looks in SDR. It cannot even do HDR properly, with its ABL and flickering.
 
Literally anyone can do this. It's the easiest thing in the world. If you can't, I'm sorry, you have an eyesight problem; maybe get that checked out. Also, not sure why you keep saying 250 nits. That's more than double the normal calibration standard of 80-120 nits, and using a screen at 250 nits in a dark room just means you have no idea what you're doing.
I'm starting to wonder if it is a language problem, and he simply doesn't understand what people are saying and/or is trying to communicate something but is not able to state it in English. I really don't understand what his obsession with the "250 nits" thing is. Likewise, there's his statement that would imply Genelecs are not studio monitors (they are). Personally, I'm going to stop engaging, as I don't feel this is a difference-of-opinion situation but literally a failure to communicate: I am not sure he understands what I'm saying, and I sure don't seem to be able to understand him.
 
I'm starting to wonder if it is a language problem, and he simply doesn't understand what people are saying and/or is trying to communicate something but is not able to state it in English. I really don't understand what his obsession with the "250 nits" thing is. Likewise, there's his statement that would imply Genelecs are not studio monitors (they are). Personally, I'm going to stop engaging, as I don't feel this is a difference-of-opinion situation but literally a failure to communicate: I am not sure he understands what I'm saying, and I sure don't seem to be able to understand him.
Funny that this guy who claims to own a studio monitor knows nothing about monitors. You mistake facts for opinions.
250 nits is the max SDR brightness of this flickering QD-OLED. If you get eye strain at merely 250 nits, anything above 250 nits will give you even more eye strain, especially in HDR.
I doubt you have ever seen what HDR1000 looks like. This QD-OLED's HDR performance is not even as good as 400-nit SDR.
 
I agree that OLED flickers and it is annoying, causing eye strain, especially with VRR. But SDR wide color gamut? I don't think that exists. These are mutually exclusive characteristics. It is either SDR or wide gamut. Hot or cold...
 
I agree that OLED flickers and it is annoying, causing eye strain, especially with VRR. But SDR wide color gamut? I don't think that exists. These are mutually exclusive characteristics. It is either SDR or wide gamut. Hot or cold...
It exists. There are the Adobe RGB and DCI-P3 color spaces and beyond. If you use the Rec. 2020 color space in SDR at 1,000 nits, SDR looks the same as HDR. There isn't any gap between SDR and HDR if the color gamut, bit depth, and brightness reach the same level.

The limiting factor is always displays that are not capable of showing it. So the image has to be shrunk down from RAW to compromised sRGB SDR for the majority of monitors.

Whatever monitor you have now should be displaying much better images than sRGB from 25 years ago. OLED should have bridged the artificial gap between SDR and HDR, but flicker and ABL prevent it from doing so.

And people are satisfied with SDR without realizing the image can look like HDR.
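
As a rough sense of scale for those gamuts, here is a small sketch comparing the CIE 1931 xy areas of each color space's primary triangle; xy area is a crude comparison metric, but the chromaticity coordinates are the published ones:

```python
# Crude gamut-size comparison: area of each color space's primary triangle
# in CIE 1931 xy coordinates (rough, but it shows the ordering).

GAMUTS = {  # published (x, y) chromaticities of the R, G, B primaries
    "sRGB/Rec.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "Adobe RGB":    [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)],
    "DCI-P3":       [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020":     [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

srgb_area = triangle_area(GAMUTS["sRGB/Rec.709"])
for name, pts in GAMUTS.items():
    print(f"{name:13s} {triangle_area(pts) / srgb_area:.2f}x sRGB")
# Rec.2020's triangle is roughly 1.9x the xy area of sRGB's.
```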
 
It exists. There are the Adobe RGB and DCI-P3 color spaces and beyond. If you use the Rec. 2020 color space in SDR at 1,000 nits, SDR looks the same as HDR. There isn't any gap between SDR and HDR if the color gamut, bit depth, and brightness reach the same level.

The limiting factor is always displays that are not capable of showing it. So the image has to be shrunk down from RAW to compromised sRGB SDR for the majority of monitors.

Whatever monitor you have now should be displaying much better images than sRGB from 25 years ago. OLED should have bridged the artificial gap between SDR and HDR, but flicker and ABL prevent it from doing so.

And people are satisfied with SDR without realizing the image can look like HDR.
Yet most games, website content, etc. are still made with the sRGB color space in mind if they do not support HDR. Adobe RGB and the like are largely only used for photography; otherwise, wide gamut from the monitor tends to just result in oversaturation, which, to be fair, a lot of people like visually.

I don't see why anyone would find it appealing to run their monitor at very high brightness levels for SDR content like you suggest, unless the environment is bright enough to require it. In HDR it works because those very high brightness levels are largely reserved for small elements, just like in real life: you aren't regularly looking at the sun, and you tend to avert your eyes when going from dimly lit indoors to the bright outdoors, while being fine with the sun reflecting brightly off a car or seeing car headlights in the dark.

Ideally all our content would work with HDR, but that's not the reality today.
 
Right ^ who would want to look at a full screen of 1,000 nits? The great thing about HDR is the contrast, not the fact that it is bright. HDR is great because you can have a very dark area and then a very bright highlight, which really makes it pop.

This discussion is dumb and has nothing to do with this monitor, just some guy complaining about OLED tech… can we talk about the monitor again?
 
Yet most games, website content, etc. are still made with the sRGB color space in mind if they do not support HDR. Adobe RGB and the like are largely only used for photography; otherwise, wide gamut from the monitor tends to just result in oversaturation, which, to be fair, a lot of people like visually.

I don't see why anyone would find it appealing to run their monitor at very high brightness levels for SDR content like you suggest, unless the environment is bright enough to require it. In HDR it works because those very high brightness levels are largely reserved for small elements, just like in real life: you aren't regularly looking at the sun, and you tend to avert your eyes when going from dimly lit indoors to the bright outdoors, while being fine with the sun reflecting brightly off a car or seeing car headlights in the dark.

Ideally all our content would work with HDR, but that's not the reality today.
I have talked about this. You either don't read or don't understand what it means.

People satisfied with sRGB at 80 nits don't realize the image can look much better if the monitor is more capable. The limiting factor is always the display. So the content has to be downgraded to sRGB for you, for the majority of office monitors out there.

The HDR video below shows the HDR grading process of adding contrast and expanding color.


The first scene is the sRGB look. The third scene is the graded HDR.

If your monitor is capable of displaying a wide color space and higher nits, it has an auto-HDR function. It can do automatic grading.

While the AW3423DW and similar monitors stuck at sRGB 80 nits can only look like the first scene, which is the most they can display, other monitors like the PA32UCG can make SDR look exactly like the third scene with Rec. 2020 in HDR preview mode.

It has little to do with the content. It is always about the capability of a monitor to display better images. So imagine the sRGB 80-nit content you watch every day on your office-like monitor looking completely different, on another level, on other monitors.
 
I have talked about this. You either don't read or don't understand what it means.

People satisfied with sRGB at 80 nits don't realize the image can look much better if the monitor is more capable. The limiting factor is always the display. So the content has to be downgraded to sRGB for you, for the majority of office monitors out there.

The HDR video below shows the HDR grading process of adding contrast and expanding color.


The first scene is the sRGB look. The third scene is the graded HDR.

If your monitor is capable of displaying a wide color space and higher nits, it has an auto-HDR function. It can do automatic grading.

While the AW3423DW and similar monitors stuck at sRGB 80 nits can only look like the first scene, which is the most they can display, other monitors like the PA32UCG can make SDR look exactly like the third scene with Rec. 2020 in HDR preview mode.

It has little to do with the content. It is always about the capability of a monitor to display better images. So imagine the sRGB 80-nit content you watch every day on your office-like monitor looking completely different, on another level, on other monitors.

We get it, man. You hate OLEDs. Move on.
 
I don't know; he seems pretty knowledgeable about monitors. But yeah, it's probably time to just leave it there and maybe start another thread about all this. It's pretty interesting.
 
You can sit there for hours because it is in a bright room.
Even though the max SDR brightness of this QD-OLED is only 250 nits, I doubt anyone can stare at it for over an hour in a dark room. And people complain that SDR at 250 nits is too bright. It's because of the flickering.
There is good flickering and bad flickering. My eyes don't get strained by Zowie's DyAC strobing, and those monitors are 2-5 times brighter than this QD-OLED; they don't even cause eye strain.
When an actual DC-dimming monitor is put next to the QD-OLED, I don't want to look at the QD-OLED because of how uncomfortable the light coming from it is, even when that light is dim.

Not sure what you mean by this. My room is never bright. It's dim at most, and completely dark most of the time.

I mainly use this monitor at 120 nits for SDR for several hours a day and have never had any eye strain in the months I have been using it. I wouldn't want more brightness than this from any monitor in a dark room, as it's just too bright, and that itself absolutely can cause eye strain; it does for me. An LCD monitor at 200 nits in a dark room hurts my eyes after a bit.

The image looks the best I have ever seen thanks to the infinite contrast ratio with perfect blacks, and motion looks the clearest I have seen in gaming thanks to the 0.1 ms response time.


I understand that some people get eye strain from the PWM behavior of these OLED displays, but from what I have seen it is only a small percentage of people. So yeah, if it affects you, then go for something else. But for most people this is such an excellent monitor.
 
Not sure what you mean by this. My room is never bright. It's dim at most, and completely dark most of the time.

I mainly use this monitor at 120 nits for SDR for several hours a day and have never had any eye strain in the months I have been using it. I wouldn't want more brightness than this from any monitor in a dark room, as it's just too bright, and that itself absolutely can cause eye strain; it does for me. An LCD monitor at 200 nits in a dark room hurts my eyes after a bit.

The image looks the best I have ever seen thanks to the infinite contrast ratio with perfect blacks, and motion looks the clearest I have seen in gaming thanks to the 0.1 ms response time.


I understand that some people get eye strain from the PWM behavior of these OLED displays, but from what I have seen it is only a small percentage of people. So yeah, if it affects you, then go for something else. But for most people this is such an excellent monitor.
If your room is dim, the screen will look the same as the bezel when it displays a black image. It's considered a dark room when no light source hits the front of the monitor.

My monitors display up to 1,800 nits. If 200 nits in a dark room hurt the eyes, there would be no HDR monitors. You say your LCD hurts your eyes at 200 nits; maybe its backlight is flickering too.

And the image looks like crap at 120-nit sRGB. People should be seeing better images with this monitor instead of saying 200 nits of brightness hurts their eyes.
 
If your room is dim, the screen will look the same as the bezel when it displays a black image. It's considered a dark room when no light source hits the front of the monitor.

My monitors display up to 1,800 nits. If 200 nits in a dark room hurt the eyes, there would be no HDR monitors. You say your LCD hurts your eyes at 200 nits; maybe its backlight is flickering too.

And the image looks like crap at 120-nit sRGB. People should be seeing better images with this monitor instead of saying 200 nits of brightness hurts their eyes.
He's talking about full-screen brightness, not HDR highlights.
 
If your room is dim, the screen will look the same as the bezel when it displays a black image. It's considered a dark room when no light source hits the front of the monitor.

My monitors display up to 1,800 nits. If 200 nits in a dark room hurt the eyes, there would be no HDR monitors. You say your LCD hurts your eyes at 200 nits; maybe its backlight is flickering too.

The LCD is not flickering. I can confirm that with a 960 fps slow-motion camera. A 200-nit full white field is just not a comfortable brightness.


SDR and HDR are completely different formats. SDR is designed for around 100 nits peak white. The brightness of the various elements on the screen is designed for this balance and a power-law gamma of around 2.2 or 2.4.

HDR, on the other hand, does not use power-law gamma and instead uses the PQ EOTF, which is an absolute scale. That means if you play 1000-nit mastered content on a 200-nit HDR display, a 500-nit HDR display, and a 1000-nit HDR display, any pixels in the content that are mastered at, say, 100 nits will always be displayed at 100 nits, no matter how bright the display's peak white can reach.

The vast majority of the pixels in HDR movies are still in that 0-100 nit range. Only smaller portions of the screen that are considered highlights, like lights, reflections, explosions, and other special effects, use higher nit values of 100-1000+ nits. These elements take up a smaller portion of the screen and are not necessarily displayed for long periods of time.

A practical example would be an outdoor scene with a large bright sky taking up about half the image. In SDR, the sky might be mastered at, say, a 90-95% signal level, which on a ~100-nit calibrated display works out to about 80-90 nits. That is not too bright in a dark room, even for such a large portion of the screen taking up much of your FOV. Now, if you calibrate that SDR monitor to 200 nits, that sky is going to be 160-180 nits. For one thing this is wrong, not how the content creator intended it to be seen, and it's also too bright for some people in a dark room when it's taking up a large portion of their FOV. It's just not comfortable and makes me squint. Maybe 160-180 nits is not too much for you, but if you calibrate your display to 400 nits SDR, the sky is now 320-360 nits, and it's definitely getting too bright.

This same scene mastered in HDR will have the sky mastered at, say, 80-90 nits, and no matter how bright your HDR monitor can go, 1000 or even 2000+ nits, the sky will always display at 80-90 nits, as intended. This is what the absolute PQ EOTF ensures.
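
To make the scaling concrete, here is a minimal sketch (illustrative signal level, not any specific grade) of how an SDR pixel's light output rises with the calibrated peak, while the PQ EOTF pins a signal value to the same absolute luminance on any compliant display:

```python
# SDR: relative encoding, so displayed nits scale with the calibrated peak.
# HDR (PQ / SMPTE ST 2084): absolute encoding, independent of the peak.

def sdr_nits(signal, peak_nits, gamma=2.2):
    """Displayed luminance of an SDR signal in [0, 1] under power-law gamma."""
    return peak_nits * signal ** gamma

def pq_eotf_nits(signal):
    """PQ EOTF (ST 2084): absolute luminance in nits for a signal in [0, 1]."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

sky = 0.93  # a bright-sky SDR signal level, for illustration
for peak in (100, 200, 400):
    print(f"{peak}-nit calibration: sky at {sdr_nits(sky, peak):.0f} nits")
# ~85 -> ~170 -> ~341 nits: everything scales up with the calibration.

print(f"PQ 0.58 -> {pq_eotf_nits(0.58):.0f} nits")  # ~201 nits on any display
```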

And the image looks like crap at 120-nit sRGB. People should be seeing better images with this monitor instead of saying 200 nits of brightness hurts their eyes.

I disagree; it looks great in a dark environment. Just making everything brighter doesn't automatically make it look better. Not to me, at least. It just causes my pupils to constrict more, which ends up effectively dimming the whole image and just puts a greater strain on my eyes.
 
The LCD is not flickering. I can confirm that with a 960 fps slow-motion camera. A 200-nit full white field is just not a comfortable brightness.


SDR and HDR are completely different formats. SDR is designed for around 100 nits peak white. The brightness of the various elements on the screen is designed for this balance and a power-law gamma of around 2.2 or 2.4.

HDR, on the other hand, does not use power-law gamma and instead uses the PQ EOTF, which is an absolute scale. That means if you play 1000-nit mastered content on a 200-nit HDR display, a 500-nit HDR display, and a 1000-nit HDR display, any pixels in the content that are mastered at, say, 100 nits will always be displayed at 100 nits, no matter how bright the display's peak white can reach.

The vast majority of the pixels in HDR movies are still in that 0-100 nit range. Only smaller portions of the screen that are considered highlights, like lights, reflections, explosions, and other special effects, use higher nit values of 100-1000+ nits. These elements take up a smaller portion of the screen and are not necessarily displayed for long periods of time.

A practical example would be an outdoor scene with a large bright sky taking up about half the image. In SDR, the sky might be mastered at, say, a 90-95% signal level, which on a ~100-nit calibrated display works out to about 80-90 nits. That is not too bright in a dark room, even for such a large portion of the screen taking up much of your FOV. Now, if you calibrate that SDR monitor to 200 nits, that sky is going to be 160-180 nits. For one thing this is wrong, not how the content creator intended it to be seen, and it's also too bright for some people in a dark room when it's taking up a large portion of their FOV. It's just not comfortable and makes me squint. Maybe 160-180 nits is not too much for you, but if you calibrate your display to 400 nits SDR, the sky is now 320-360 nits, and it's definitely getting too bright.

This same scene mastered in HDR will have the sky mastered at, say, 80-90 nits, and no matter how bright your HDR monitor can go, 1000 or even 2000+ nits, the sky will always display at 80-90 nits, as intended. This is what the absolute PQ EOTF ensures.



I disagree; it looks great in a dark environment. Just making everything brighter doesn't automatically make it look better. Not to me, at least. It just causes my pupils to constrict more, which ends up effectively dimming the whole image and just puts a greater strain on my eyes.
Funny that you cannot set the brightness above 120 nits while saying flickering is not an issue.

Even 200 nits is dim; nobody sets 200 nits, or any other level, just to watch a full-field white. Set max brightness for the highlights.

When you use HDR, the APL, which corresponds to the full-field luminance output, easily goes beyond 200 nits, especially in recent movies and games. With the aggressive ABL and flickering of this monitor, no wonder you are stuck with HDR content at 100 nits APL. The flickering issue only starts to show once OLED gets a little brighter with a QD layer. If your monitor is flickering and your eyes hurt at 200 nits in SDR, they will hurt even more in HDR.

And you haven't graded HDR even once. Even though different color spaces read the same data differently, you cannot tell HDR and SDR apart if their color and luminance levels are graded the same. It's just that the monitor doesn't let you unlock 1,000 nits and a wider color space in SDR yet.

What the content creators intended is aimed at the majority of low-end monitors that are only capable of sRGB at 80 nits. It's a heavily compromised image. sRGB at 80 nits won't look good compared to Adobe RGB at 400 nits or Rec. 2020 at 1,000 nits, if those could be used for the main distribution. If a monitor is not capable of a wider color space, it won't do a good job in HDR either, so sRGB looks only as good as it does on an office monitor. And you are satisfied with however dull the image looks. Let me tell you, content creators are never satisfied with dull sRGB; a better monitor with a wider color space and more luminance is doing you a favor by showing what they actually intended.
 
Set max brightness for the highlights.

The whole issue is that with SDR you can't... Power-law gamma doesn't allow it. Highlights can be made bright, sure, but regular things will then also be way too bright, and you can't prevent that, because they are encoded at the same relative triplet values.

You need an absolute scale like the PQ EOTF, which is the whole point of HDR. With HDR you can set the highlight brightness, and I do calibrate mine to 1000 nits there. My monitor is calibrated and tracks the EOTF from 0 to 1000 nits perfectly for HDR, so I am viewing it exactly as bright as it's mastered and intended (at least up to the DCI-P3 gamut volume, not the full BT.2020, of course).


Content creators have to master within the limitations of their format. You say they don't want to limit themselves to Rec. 709 and ~100 nits? Well, too bad; they had to for the HDTV and SDR Blu-ray releases. And if you go against the standards it was mastered to, you are not at all re-creating what they saw on their mastering monitor, and you are making things way too bright to be comfortable in a dark room. I know this because I have tried it so many times and it's uncomfortably bright. It has nothing to do with flicker, again, because I have spent years watching content on a variety of LCD screens that do not have PWM or flicker, as proven by my 960 fps Sony RX camera.

Nothing you can say will change that fact.

If SDR on a 120-nit display looks dull, then I'd say that's your own problem, not mine. It looks good to me for SDR-mastered content, and more importantly, it's comfortable to view for long periods in the dark, which is how I normally view it.
 
If I have an image with two different nit points, I can't see both correctly in SDR.

For example: a sky at 500 nits and a shadow at 50 nits. If I set the monitor to 500 nits in SDR to match the sky, I'll miss the 50-nit area. With HDR I can match both areas at the correct nits (this is why HDR has locked brightness).
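
That is exactly what the PQ encoding provides: each luminance level gets its own absolute code value. A small sketch of the inverse PQ curve using the standard ST 2084 constants (the 50/500-nit figures are just the example above):

```python
# Inverse PQ (SMPTE ST 2084): absolute nits -> signal value in [0, 1].
# The 50-nit shadow and the 500-nit sky get distinct code values, so an
# HDR display can reproduce each at its mastered luminance simultaneously.

def pq_inverse(nits):
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (50, 500):
    print(f"{nits:3d} nits -> PQ signal {pq_inverse(nits):.3f}")
# 50 nits -> ~0.440, 500 nits -> ~0.676. In SDR both would sit on one
# relative scale and move together with the brightness setting.
```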
 
The whole issue is that with SDR you can't... Power-law gamma doesn't allow it. Highlights can be made bright, sure, but regular things will then also be way too bright, and you can't prevent that, because they are encoded at the same relative triplet values.

You need an absolute scale like the PQ EOTF, which is the whole point of HDR. With HDR you can set the highlight brightness, and I do calibrate mine to 1000 nits there. My monitor is calibrated and tracks the EOTF from 0 to 1000 nits perfectly for HDR, so I am viewing it exactly as bright as it's mastered and intended (at least up to the DCI-P3 gamut volume, not the full BT.2020, of course).


Content creators have to master within the limitations of their format. You say they don't want to limit themselves to Rec. 709 and ~100 nits? Well, too bad; they had to for the HDTV and SDR Blu-ray releases. And if you go against the standards it was mastered to, you are not at all re-creating what they saw on their mastering monitor, and you are making things way too bright to be comfortable in a dark room. I know this because I have tried it so many times and it's uncomfortably bright. It has nothing to do with flicker, again, because I have spent years watching content on a variety of LCD screens that do not have PWM or flicker, as proven by my 960 fps Sony RX camera.

Nothing you can say will change that fact.

If SDR on a 120-nit display looks dull, then I'd say that's your own problem, not mine. It looks good to me for SDR-mastered content, and more importantly, it's comfortable to view for long periods in the dark, which is how I normally view it.
You can do this with SDR. It's the monitor that limits you from doing so. You've spent years watching content without realizing it. And I don't think you know how to calculate PQ.

I have said the image needs to be expanded into a wider color space and a higher luminance range. So, with a new transfer function based on the wider color space, the highlights can go higher while the lowlights go lower.

With an expanded color space, for example Adobe RGB further expanded by using YCbCr, a 100-nit sky becomes 500 nits and a 10-nit shadow becomes 1 nit or even lower.

With a monitor capable of delivering higher luminance and contrast, you can even get images that look like HDR1000 with Rec. 2020 or Rec. 2100.
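
A toy sketch of the kind of expansion being described (my illustration of the general inverse-tone-mapping idea, not the PA32UCG's actual algorithm): linearize the SDR signal, stretch it with an expansion exponent so highlights rise and shadows sink, then map it onto a higher peak.

```python
# Toy SDR-to-HDR expansion (inverse tone mapping) -- illustration only.

def expand_sdr_pixel(signal, hdr_peak=1000.0, gamma=2.2, expansion=1.5):
    """Map an SDR signal in [0, 1] to absolute nits with stretched contrast."""
    linear = signal ** gamma          # undo the SDR power-law gamma
    stretched = linear ** expansion   # exponent > 1: highlights up, shadows down
    return hdr_peak * stretched

for s in (0.1, 0.5, 0.9, 1.0):
    sdr = 100.0 * s ** 2.2  # what a 100-nit SDR display would show
    print(f"signal {s:.1f}: {sdr:5.1f} nits SDR -> "
          f"{expand_sdr_pixel(s):6.1f} nits expanded")
# signal 0.1:  ~0.6 nits -> ~0.5 nits  (shadows get slightly darker)
# signal 0.9: ~79.3 nits -> ~706 nits  (highlights get much brighter)
```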

For example, this is a standard Rec. 709 100-nit image.

[Image: the standard Rec. 709 100-nit version]


If you put it into the Rec. 2100 color space, it can look the same as SDR with only 100 nits of output.

[Image: the same picture mapped into Rec. 2100 at 100 nits]



But if you increase the brightness within Rec. 2100, see what it looks like.
[Image: the same picture in Rec. 2100 with increased brightness]


Since your monitor can only display sRGB or Rec. 709 at 100 nits in SDR, you need to see what these images look like on my end in the Windows Photos app with HDR mode.
Sky low/original look

Sky High

It's obvious which one looks better.

There is a reason monitors with HDR capability like the PA32UCG have an HDR preview mode in SDR to turn SDR images into HDR. The PA32UCG can look exactly like the second image in SDR.
The PG32UQX, PG27UQ, and PG35VQ can look similar to HDR400 with the Adobe color space, while the new QD-OLED is just like the first 100-nit image, the same as an office monitor.

sRGB at 100 nits is not enough; that's the standard for office monitors from 20 years ago. Be better. This AW3423DW could be better, but it flickers and has aggressive ABL. Eyes cannot even stand its 250 nits of flickering.
 
I have talked about this. You either don't read or don't understand what it means.
Thanks for your hard work on this matter kramnelis. I'm packing up my monitor right now and will be returning it. I will also be returning my oled tv and oled phone. I'll also be contacting my dealership to see if they can remove the oled display from my car. I'm so ashamed I used to be part of the oled master race. Thank you so much. Please keep posting over and over and over and over again to help others see the light.
 
Thanks for your hard work on this matter kramnelis. I'm packing up my monitor right now and will be returning it. I will also be returning my oled tv and oled phone. I'll also be contacting my dealership to see if they can remove the oled display from my car. I'm so ashamed I used to be part of the oled master race. Thank you so much. Please keep posting over and over and over and over again to help others see the light.
Yep, boxed up mine and ready to ship back. Don't know why I went with OLED. ;)
 
You can do this with SDR. It's the monitor that limits you from doing so. You've spent years watching content without realizing it. And I don't think you know how to calculate PQ.

I have said the image needs to be expanded into a wider color space and a higher luminance range. So, with a new transfer function based on the wider color space, the highlights can go higher while the lowlights go lower.

With an expanded color space, for example Adobe RGB further expanded by using YCbCr, a 100-nit sky becomes 500 nits and a 10-nit shadow becomes 1 nit or even lower.

With a monitor capable of delivering higher luminance and contrast, you can even get images that look like HDR1000 with Rec. 2020 or Rec. 2100.

For example, this is a standard Rec. 709 100-nit image.

[Image: the standard Rec. 709 100-nit version]

If you put it into the Rec. 2100 color space, it can look the same as SDR with only 100 nits of output.

[Image: the same picture mapped into Rec. 2100 at 100 nits]


But if you increase the brightness within Rec. 2100, see what it looks like.
[Image: the same picture in Rec. 2100 with increased brightness]

Since your monitor can only display sRGB or Rec. 709 at 100 nits in SDR, you need to see what these images look like on my end in the Windows Photos app with HDR mode.
Sky low/original look

Sky High

It's obvious which one looks better.

There is a reason monitors with HDR capability like the PA32UCG have an HDR preview mode in SDR to turn SDR images into HDR. The PA32UCG can look exactly like the second image in SDR.
The PG32UQX, PG27UQ, and PG35VQ can look similar to HDR400 with the Adobe color space, while the new QD-OLED is just like the first 100-nit image, the same as an office monitor.

sRGB at 100 nits is not enough; that's the standard for office monitors from 20 years ago. Be better. This AW3423DW could be better, but it flickers and has aggressive ABL. Eyes cannot even stand its 250 nits of flickering.
You may want to check with the author of this article here: https://hardocp.com/blog/gaming-amp-working-at-4k-going-big. GL 😊
 
Interesting thread.

I am in a small minority: I can get ocular migraines due to flickering lights.
I noticed that these became much more frequent with OLED, but never made the connection.
Could it be the emulated PWM flicker that is being talked about? I'm not sure, but it would make sense for me.
 
Not sure what you are trying to prove here. The low/original looks significantly better.
As I said, open these images in HDR with the Windows Photos app. Funny: either you cannot use these images correctly, or your "studio" monitor is not capable of it. But I'm afraid you are trying to lie to yourself after years of watching content.
 
Thanks for your hard work on this matter kramnelis. I'm packing up my monitor right now and will be returning it. I will also be returning my oled tv and oled phone. I'll also be contacting my dealership to see if they can remove the oled display from my car. I'm so ashamed I used to be part of the oled master race. Thank you so much. Please keep posting over and over and over and over again to help others see the light.
Yep, boxed up mine and ready to ship back. Don't know why I went with OLED. ;)
You may want to check with the author of this article here: https://hardocp.com/blog/gaming-amp-working-at-4k-going-big. GL 😊
I like to see people downplay things they don't even see or understand. There should be a ceremony so they can reinforce each other.
 
Oh, there is. You're actively participating in it!

Why do you spam here so much with OLED hate? Did you get banned from your precious monitor Discords?
Funny how people like you can devolve this place into a degenerate Discord slum. You look exactly like one of those people who circlejerk a lot.

It's a problem with OLED. I'm just pointing it out before the issue becomes bigger. And what's your defense?
 
As I said, open these images in HDR with the Windows Photos app. Funny: either you cannot use these images correctly, or your "studio" monitor is not capable of it. But I'm afraid you are trying to lie to yourself after years of watching content.

I don't have any reference monitor, but I tried it on my 42" LG C2, which I am using as a PC monitor. I manually activated HDR (I run SDR only when on the desktop), then opened the pictures with the Windows Photos app.
Sky_high easily looks much better: it has a much higher range (difference between the dark and bright parts of the picture) and better color reproduction. That being said, I think sky_low could still look a lot better even in pure SDR compared to how it looks in that picture; it's really muted for some reason.

When I opened both while having Windows in SDR, sky_low didn't look all that different (still kind of bad), but sky_high was blown out to kingdom come, totally ridiculous.

So, what's the conclusion from this? That the C2 is a decent HDR screen, since sky_high looks much better than sky_low?
 
I don't have any reference monitor, but I tried it on my 42" LG C2, which I am using as a PC monitor. I manually activated HDR (I run SDR only when on the desktop), then opened the pictures with the Windows Photos app.
Sky_high easily looks much better: it has a much higher range (difference between the dark and bright parts of the picture) and better color reproduction. That being said, I think sky_low could still look a lot better even in pure SDR compared to how it looks in that picture; it's really muted for some reason.

When I opened both while having Windows in SDR, sky_low didn't look all that different (still kind of bad), but sky_high was blown out to kingdom come, totally ridiculous.

So, what's the conclusion from this? That the C2 is a decent HDR screen, since sky_high looks much better than sky_low?
That a certain user is a certain troll and needs to be put on ignore has been the conclusion for some time...
 
Who cares? Most of us game or watch movies on this monitor. Content such as this is either SDR sRGB or HDR. Creating and converting an image like you describe has zero value for what I use my computer for.
 
Interesting thread.

I am in a small minority: I can get ocular migraines due to flickering lights.
I noticed that these became much more frequent with OLED, but never made the connection.
Could it be the emulated PWM flicker that is being talked about? I'm not sure, but it would make sense for me.
It could be possible. Have you tried switching back to your old monitor for a while to see if things get better?
 
Yeah, SDR content at 120 nits looks great, and as intended, in the Creator sRGB mode at 2.4 gamma.

And HDR content looks great as intended in the standard HDR mode.


I don't understand what's so controversial lol.

I have seen many monitors, and this one easily looks by far the best and has caused no eye strain in the months I have had it so far, using it for hours a day.

I'm never going back to LCD black levels, local-dimming blooming, and multi-millisecond response times. You couldn't pay me to do so.
 