Dell Alienware AW3423DW 34″ QD-OLED 175Hz (3440 x 1440)

I decided to finally bite the bullet on this and replace my X27. Worst case scenario, if I don't like this, I'll just give it to the wife and move back to the X27.

Pricing isn't bad right now. With a 10% coupon, Dell offering $100 off to start, and a 5% 'financing' discount, my total cost after taxes is basically $1100.
 
I decided to finally bite the bullet on this and replace my X27. Worst case scenario, if I don't like this, I'll just give it to the wife and move back to the X27.

Pricing isn't bad right now. With a 10% coupon, Dell offering $100 off to start, and a 5% 'financing' discount, my total cost after taxes is basically $1100.
I replaced my X27 with this; it's really a sidegrade overall. Dark scenes look a lot better, bright scenes look a lot worse.
 
I decided to finally bite the bullet on this and replace my X27. Worst case scenario, if I don't like this, I'll just give it to the wife and move back to the X27.

Pricing isn't bad right now. With a 10% coupon, Dell offering $100 off to start, and a 5% 'financing' discount, my total cost after taxes is basically $1100.
I thought Dell had this for $900 for some pre-Black Friday sale?
 
I replaced my X27 with this; it's really a sidegrade overall. Dark scenes look a lot better, bright scenes look a lot worse.
I’m willing to give up the total brightness given the awful haloing honestly. I’m just so tired of it.
 
I'm thinking about buying this monitor. Is there any drawback to playing games in 16:9 (2560 x 1440) on this monitor even though the native res is 3440 x 1440? I was wondering if there are any scaling issues (softness), input lag, or anything else when not playing at native res. Since it is an OLED, I was thinking the edges will be turned off anyway, so it won't bother me like it would on an IPS with glowing edges. What do you guys think? Thanks guys
 
I'm thinking about buying this monitor. Is there any drawback to playing games in 16:9 (2560 x 1440) on this monitor even though the native res is 3440 x 1440? I was wondering if there are any scaling issues (softness), input lag, or anything else when not playing at native res. Since it is an OLED, I was thinking the edges will be turned off anyway, so it won't bother me like it would on an IPS with glowing edges. What do you guys think? Thanks guys
I used to play Overwatch in 16:9 on mine and it was fine. The nice thing about OLED is the blacks on the sides are truly black, so it's not distracting. My Alienware 3420 IPS had IPS glow in the corners, which was sometimes distracting.
 
I thought Dell had this for $900 for some pre-Black Friday sale?
This hasn't even been out for a year, so maybe you're thinking of the regular 34'' panel that was out before this OLED one. The only way you can get this panel for under $1k is if you have a business account and order it without taxes applied, plus the PPP discounts that require a business billing account.
 
Received it today, and I've been using it all day. Very pleased with it.

Pros compared to X27 -
Fan is basically silent
Dark scenes are no longer blown out by the FALD
Refresh rate increase is nice
I've never used an OLED of this size; it's just generally impressive over the IPS for gaming, although I wouldn't use it for professional work.

Cons -
As mentioned, it can't get as bright. However, this thing can still get bright enough for HDR content; it just doesn't get burn-your-eyes-out bright.
The pixel density downgrade does suck, but at least falling back to 1440 gives a serious FPS boost.

I will be selling my X27, pay attention to the for-sale section.
 
Even with a QD layer, OLED still has massive ABL, to the point where the starfield looks far dimmer and shrunken in both color volume and perceptual contrast.

OLED cannot display HDR 1000 when it is only capable of 1,000 nits in a 2% window. It ends up looking like less than HDR 400.


I wanted to comment on this post. In direct comparison to something like the X27, this Alienware monitor looks way better. Does it get as bright? No. However, there is some darkness to that scene, and on a FALD panel it's just entirely blown out. On the Alienware it looks absolutely perfect, even if the bright areas don't get quite as bright.
 
I wanted to comment on this post. In direct comparison to something like the X27, this Alienware monitor looks way better. Does it get as bright? No. However, there is some darkness to that scene, and on a FALD panel it's just entirely blown out. On the Alienware it looks absolutely perfect, even if the bright areas don't get quite as bright.
Ignore him. He's been on an anti-OLED binge for months... :). I don't think he has said he owns either an OLED or a FALD display.
 
I wanted to comment on this post. In direct comparison to something like the X27, this Alienware monitor looks way better. Does it get as bright? No. However, there is some darkness to that scene, and on a FALD panel it's just entirely blown out. On the Alienware it looks absolutely perfect, even if the bright areas don't get quite as bright.

For that scene, maybe. I had my old X27 sitting next to the AW3423DW for months before selling it off, and in daylight scenes I had to double check whether the Alienware actually had HDR on or not, it was that dim
 
For that scene, maybe. I had my old X27 sitting next to the AW3423DW for months before selling it off, and in daylight scenes I had to double check whether the Alienware actually had HDR on or not, it was that dim
For me the brightness is irrelevant when the contrast ratio is turned to shit in any section of the panel where the FALD blows out a dark area. Also, your eyes are going to adjust to the retina-burning X27, and the Alienware right next to it will of course be dim in direct comparison. It would be like looking at the sun and then immediately looking at anything else and wondering why it isn't as bright. Doesn't mean it isn't still really bright compared to fake HDR panels, etc.
 
For that scene, maybe. I had my old X27 sitting next to the AW3423DW for months before selling it off, and in daylight scenes I had to double check whether the Alienware actually had HDR on or not, it was that dim
HDR isn't about eye-searing brightness though, it's about contrast. Obviously if you side-by-side a blown-out FALD next to an OLED, the FALD is going to look more impactful. Your eyes are going to be adjusted for the brighter display. But the actual contrast is worse. There's really no point to having eye-searing brightness if you can't even make black look black. Which the X27 cannot do at all.
 
HDR isn't about eye-searing brightness though, it's about contrast. Obviously if you side-by-side a blown-out FALD next to an OLED, the FALD is going to look more impactful. Your eyes are going to be adjusted for the brighter display. But the actual contrast is worse. There's really no point to having eye-searing brightness if you can't even make black look black. Which the X27 cannot do at all.
This would be the case if you viewed near-black content exclusively, but that isn't the case for me. The actual contrast in anything with a moderate APL is leagues better on the X27. Pretty irrelevant now given the Cooler Master GP27U has taken its spot in the market. Currently sending back my AW3423DW over uniformity and burn-in issues and will not be getting another.
 
I wanted to comment on this post. In direct comparison to something like the X27, this Alienware monitor looks way better. Does it get as bright? No. However, there is some darkness to that scene, and on a FALD panel it's just entirely blown out. On the Alienware it looks absolutely perfect, even if the bright areas don't get quite as bright.
Complete BS.

The picture must look great on your 200-nit flickering OLED, where the eyes cannot stand any more than that flickering brightness. And I haven't mentioned how little color OLED has.

Don't turn this place into a slum like a Discord where people get paid like trash, cannot see anything better, cannot afford anything better.

Enjoy your flickering 200-nit OLED while I've already jumped to a 500+ nit APL IPS with tons of contrast you've never seen in your whole life, where the black is blacker than your OLED with a little tweak of the EOTF lol
 
Complete BS.

The picture must look great on your 200-nit flickering OLED, where the eyes cannot stand any more than that flickering brightness. And I haven't mentioned how little color OLED has.

Don't turn this place into a slum like a Discord where people get paid like trash, cannot see anything better, cannot afford anything better.

Enjoy your flickering 200-nit OLED while I've already jumped to a 500+ nit APL IPS with tons of contrast you've never seen in your whole life, where the black is blacker than your OLED with a little tweak of the EOTF lol
You realize I own both, and can literally do a direct side-by-side comparison, correct?
 
Ignore him. He's been on an anti-OLED binge for months... :). I don't think he has said he owns either an OLED or a FALD display.
Funny how people who probably don't own anything and don't know how to use things properly just crank up their imagination.

PG35VQ, PG27UQ, AW3423DW
[Image: 52144784027_f5a72238c0_o_d.png]


BTW I had to turn on the light so you can identify the monitor stand. People must be ignoring that red/grey blackness on OLED.
 
HDR isn't about eye-searing brightness though, it's about contrast. Obviously if you side-by-side a blown-out FALD next to an OLED, the FALD is going to look more impactful. Your eyes are going to be adjusted for the brighter display. But the actual contrast is worse. There's really no point to having eye-searing brightness if you can't even make black look black. Which the X27 cannot do at all.
HDR is about contrast and color. People don't know that a high-APL scene has even more contrast as well as more color. Funny that a picture of a 100-nit background plus a few dots of 1,000 nits can be called HDR.
What is even funnier is that eyes can view a 500+ nit APL DC-dimming monitor without problems, while the actual "eye-searing" brightness is as bright as only 200 nits but flickering all the time. This is what OLED, or any other cheap flickering backlight, can give as eye-searing brightness.
 
I like how you put a light over the panel, and don't even show the same content on the one we are talking about in the middle.

Whatever, have fun. You are 100% right.
 
I like how you put a light over the panel, and don't even show the same content on the one we are talking about in the middle.

Whatever, have fun. You are 100% right.
You want the 27" vs 35"? https://flic.kr/s/aHBqjzMEs5

And you still think a 200-nit flickering OLED can beat the 27" real HDR 1000 FALD IPS panel from 4 years ago that is still unmatched and further implemented in 2022 monitors such as the ViewSonic VX2781-4K-PRO?
 
You want the 27" vs 35"? https://flic.kr/s/aHBqjzMEs5

And you still think a 200-nit flickering OLED can beat the 27" real HDR 1000 FALD IPS panel from 4 years ago that is still unmatched and further implemented in 2022 monitors such as the ViewSonic VX2781-4K-PRO?
Yes, because I do more than look at bright daylight scenes. I play games, I watch videos, etc. The FALD panel does not look good when you're playing any game that has dark scenes with bright highlights.

At the end of the day, I'd rather have the panel that isn't eye-searingly bright but at least still looks good in every situation, versus the panel that basically only looks good in bright scenes.
 
Yes, because I do more than look at bright daylight scenes. I play games, I watch videos, etc. The FALD panel does not look good when you're playing any game that has dark scenes with bright highlights.

At the end of the day, I'd rather have the panel that isn't eye-searingly bright but at least still looks good in every situation, versus the panel that basically only looks good in bright scenes.
Funny you didn't show any side-by-side comparison of yours. I was expecting the classic starfield or whatever.

A 200-nit OLED with ABL is indeed "eye-searing," because you cannot even look at an OLED at only 200 nits for 30 minutes without eyestrain.
 
Yes, because I do more than look at bright daylight scenes. I play games, I watch videos, etc. The FALD panel does not look good when you're playing any game that has dark scenes with bright highlights.

At the end of the day, I'd rather have the panel that isn't eye-searingly bright but at least still looks good in every situation, versus the panel that basically only looks good in bright scenes.
I wouldn't bother dude. This guy spends all his days telling people that they are not allowed to like OLED displays, and then repeats "200nits" over and over as though it is something we all give a shit about. You aren't going to change his mind. According to him, you are wrong for having your opinion and he'll just go around in circles.
 
For that scene, maybe. I had my old X27 sitting next to the AW3423DW for months before selling it off, and in daylight scenes I had to double check whether the Alienware actually had HDR on or not, it was that dim
The AW OLED cannot display that starfield HDR scene properly. People should know how bright a star in the starfield is.
The highlights will be hit massively by ABL, down close to 200 nits, while the PG27UQ has close to 1,000-nit highlights and the PG32UQX has 1,800 nits. That's easily 5 times more impact from a true HDR monitor. It's what the HDR scene is meant for.
The dark area is even darker than OLED thanks to tweaks of the EOTF. That's why I said it's complete BS that OLED looks in any way better in this scene. And why he's so shy about showing his comparison. He knows.

I wouldn't bother dude. This guy spends all his days telling people that they are not allowed to like OLED displays, and then repeats "200nits" over and over as though it is something we all give a shit about. You aren't going to change his mind. According to him, you are wrong for having your opinion and he'll just go around in circles.
Funny who's going around in circles denying the truth. You can enjoy the OLED all you want, but the truth is that OLED is just so bad at HDR. With all that ABL and flickering crap, you cannot even use it as an HDR monitor.


[Image: 727491_52344048974_8f95e0c286_o_d.png]
 
Hey again guys, I'm thinking about buying this monitor. There is one concern for me though, and that is resolution scaling. For games that don't support ultra wide I want to play in 16:9 (2560 x 1440).

From my understanding the monitor itself can only do the resolution scaling over HDMI. Since I'm going to want to use DisplayPort, I will have to let my GPU (RTX 2080) do the scaling. Would this introduce any input lag? If so, how much?

Do you even need resolution scaling at 2560 x 1440 anyway, since all you're really doing is removing the sides? Cheers guys.
 
Yes, because I do more than look at bright daylight scenes. I play games, I watch videos, etc. The FALD panel does not look good when you're playing any game that has dark scenes with bright highlights.

At the end of the day, I'd rather have the panel that isn't eye-searingly bright but at least still looks good in every situation, versus the panel that basically only looks good in bright scenes.

Agreed 100%. After moving to my OLED (albeit a C2, not an AW), even my Neo G7 seemed like trash. Everything just looks better overall. If it's too dim, that certainly isn't something I've noticed or cared about much versus the advantages in overall image quality. I have tried pretty much every monitor for weeks or months at a time. OLED is another level, and it will be hard going back until we get something like a 3K+ zone IPS.
 
Agreed 100%. After moving to my OLED (albeit a C2, not an AW), even my Neo G7 seemed like trash. Everything just looks better overall. If it's too dim, that certainly isn't something I've noticed or cared about much versus the advantages in overall image quality. I have tried pretty much every monitor for weeks or months at a time. OLED is another level, and it will be hard going back until we get something like a 3K+ zone IPS.
There is no reason to move from OLED to another display tech; they only need to keep improving its burn-in resistance and sustained brightness.
 
How do people even manage to enjoy playing games or watching movies in HDR without eye strain? I always calibrate monitor brightness to about 100 cd/m^2 for dark rooms and 120 cd/m^2 for brighter rooms. It's a personal preference and eye strain comfort.

There is still no standard for HDR calibration, is there? Supposedly, the tone mapping used on reference displays for film mastering is nothing like the tone mapping on any consumer display. I don't know, I've been out of display calibration ever since HDR hit the market.
 
There is a standard for HDR display calibration: the PQ curve defined in SMPTE standard ST 2084. So you calibrate to that. You will see articles on sites like RTINGS talk about how well a display tracks the EOTF (electro-optical transfer function), and that is what they are talking about: how accurately it tracks that curve.

The big difference with SDR is that HDR content is mastered for absolute levels, rather than relative levels. So, you don't choose the brightness you want to calibrate for, you calibrate the display to track the brightness curve accurately. If properly calibrated, and if everything is doing its job, you get the same result they got in the studio.

Of course there are issues with that. The big one is brightness level. HDR can support up to 10,000 nits, and you can master content for something that high. But no consumer, or even professional, display can do that at this point. When something is mastered to go beyond what the display is capable of, that's when the display has to decide what to do. It can just clip the output, and that is what is specified in some calibrated modes, or it can dynamically remap the content in a way to be within the display's limits.
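Since the PQ curve mentioned above is a closed-form function, it's easy to sketch. Here's a minimal Python version of the ST 2084 EOTF (the constants are from the standard), with a hard-clip step standing in for the "just clip the output" behavior described above; real displays often use a rolloff instead.

```python
# SMPTE ST 2084 (PQ) EOTF: maps a 0..1 encoded signal to an absolute
# luminance in nits, up to the 10,000-nit ceiling of the standard.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32     # 18.8515625
c3 = 2392 / 4096 * 32     # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a PQ-encoded value (0..1) to absolute luminance in nits."""
    e = signal ** (1 / m2)
    num = max(e - c1, 0.0)
    den = c2 - c3 * e
    return 10000.0 * (num / den) ** (1 / m1)

def clip_to_display(nits: float, peak_nits: float = 1000.0) -> float:
    """Simplest strategy when content exceeds the panel: hard clip."""
    return min(nits, peak_nits)

# A signal of 1.0 decodes to the full 10,000-nit ceiling...
print(round(pq_eotf(1.0)))            # 10000
# ...which a 1,000-nit display in a clip-calibrated mode just flattens.
print(clip_to_display(pq_eotf(1.0)))  # 1000.0
```

This is why "absolute levels" matters: a given code value always means the same luminance, regardless of which display decodes it.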

Now when it comes to enjoying things without eye strain, well depends on the media. If it is well mastered, it shouldn't be a problem. Normally HDR media should mostly fall within SDR levels. You keep the brightness at a normal viewing level, and just use the increased color space for most scenes. It should only be occasionally that you get things that are super bright, or dark. When something does this, it looks good and doesn't strain the eyes. Much like how movies can have sound output up to 105dB on a reference grade system, but shouldn't blast your ears because they should rarely, if ever, go to that peak. Normally their dialogue is going to be more in the 65-75dB range.

But some creators overdo it. Then it'll either be eyestrain, or you'll have to have your device dial it back, if it can (most TVs can).

For games you usually have some control over it. Generally, they have a calibration screen where you tell the game what the maximum brightness of your TV is. They then map their content to fit within that. If you want less peak brightness, you can always just dial it back. Some games, like Cyberpunk, even let you dial in the SDR levels, so you can tell it how bright normal content should be vs HDR.
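A hedged sketch of what that calibration screen effectively does: midtones pass through untouched, and highlights roll off toward the peak you entered instead of hard-clipping. The knee point and Reinhard-style shoulder here are illustrative choices, not any particular game's implementation.

```python
def tonemap(nits: float, display_peak: float = 800.0, knee: float = 0.75) -> float:
    """Pass values below the knee through; compress the rest toward peak."""
    threshold = display_peak * knee
    if nits <= threshold:
        return nits
    # Reinhard-style shoulder: asymptotically approaches display_peak
    # instead of clipping, so highlight detail is compressed, not lost.
    overshoot = nits - threshold
    headroom = display_peak - threshold
    return threshold + headroom * overshoot / (overshoot + headroom)

# Midtones are untouched; a 4,000-nit highlight lands just under peak.
print(tonemap(200.0))            # 200.0
print(round(tonemap(4000.0), 1)) # 788.9
```

The design point is that compression only kicks in above the knee, so a badly mastered 4,000-nit highlight still reads as "very bright" without being clipped flat at the panel's limit.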

As it is a new standard, things are all over the map right now in terms of how well they do. As examples of good ones I've tried, there are Resident Evil Village and the Harry Potter 4K Blu-rays. RE makes very effective use of HDR to make things dark and spooky, but with bright lights and torches and such. It also makes use of the increased color space to give some very realistic-looking gold and mahogany colors. It really enhances the atmosphere. For Harry Potter, it basically just makes the movie look like the original film it was shot on. Good film stock exceeds SDR both in dynamic range and in color space, and HDR can capture both. So the movie just ends up looking more like a theater showing than video. They do punch up some things for HDR, but it tends to be subtle.

On the bad end there are Red Dead Redemption 2 and The Mandalorian. In both cases it looks like they just shoved SDR content into an HDR container and didn't set the metadata right, so both end up looking darker and more washed out than they do on the same device in SDR mode.

All in all though, it is neat technology and I feel really can give a visual upgrade when done right. Much bigger deal than 4K.
 
How do people even manage to enjoy playing games or watching movies in HDR without eye strain? I always calibrate monitor brightness to about 100 cd/m^2 for dark rooms and 120 cd/m^2 for brighter rooms. It's a personal preference and eye strain comfort.

There is still no standard for HDR calibration, is there? Supposedly, the tone mapping used on reference displays for film mastering is nothing like the tone mapping on any consumer display. I don't know, I've been out of display calibration ever since HDR hit the market.

I don't know if there's any science to support it, but about 15 years ago I heard a bias light would help prevent eye strain. I have been using them ever since, primarily because they improve perceived contrast, and I've never had any eye strain issues, including in the HDR era where both my TV and PC monitor are HDR OLEDs.
 
I find some minimal bias lighting near 6500k helps a bit.

At least we have the choice between HDR400 TB and HDR1000 which is more than most displays offer.

With Sjögren’s syndrome resulting in eyes that are perpetually devoid of any moisture even with punctal plugs HDR really sucks to watch and results in all sorts of blinding glare and pain. Dramatically reduces the time I can spend looking at a screen anymore. Sometimes I just want to set everything to SDR only and not bother with HDR at all because of it.
 
How do people even manage to enjoy playing games or watching movies in HDR without eye strain? I always calibrate monitor brightness to about 100 cd/m^2 for dark rooms and 120 cd/m^2 for brighter rooms. It's a personal preference and eye strain comfort.

There is still no standard for HDR calibration, is there? Supposedly, the tone mapping used on reference displays for film mastering is nothing like the tone mapping on any consumer display. I don't know, I've been out of display calibration ever since HDR hit the market.
It's more like your monitor is flickering. Flickering monitors cause eye strain even at 200 nits.
People have been enjoying HDR 1000 in a dark room for years without eye strain.
 
There is some near-black VRR flickering which is annoying. But AW3423DW doesn't use PWM. What is all this talk about flickering and eye strain? RTINGS gives it a 10 out of 10 for image flicker. As a person who struggles with chronic migraine and tension headaches, I have to be picky when it comes to displays to not cause an increase in my amount of migraine headaches and I've been using the AW3423DW for months without any issues. I do calibrate my displays to achieve 100cd/m2 brightness. Overly bright displays are just terrible for eye strain and are certain migraine triggers.
 
There is some near-black VRR flickering which is annoying. But AW3423DW doesn't use PWM. What is all this talk about flickering and eye strain? RTINGS gives it a 10 out of 10 for image flicker. As a person who struggles with chronic migraine and tension headaches, I have to be picky when it comes to displays to not cause an increase in my amount of migraine headaches and I've been using the AW3423DW for months without any issues. I do calibrate my displays to achieve 100cd/m2 brightness. Overly bright displays are just terrible for eye strain and are certain migraine triggers.
Even if it doesn't use PWM, it still flickers each frame. This is why it's only fine when you view under 100 nits; at any brightness higher than that, your eyes cannot stand the flicker.
When a panel uses actual DC dimming with no flicker, you can view as high a brightness as you want.
 
Even if it doesn't use PWM, it still flickers each frame. This is why it's only fine when you view under 100 nits; at any brightness higher than that, your eyes cannot stand the flicker.
When a panel uses actual DC dimming with no flicker, you can view as high a brightness as you want.
So... the panel doesn't use PWM, and doesn't use DC dimming, so what kind of dimming does it use that creates this "flicker" you see? I have had this monitor for about 5 months now, and I can't see any flicker. I have no idea what it's calibrated to, as I haven't performed any calibration on it; it's whatever it runs out of the box, and I have it set to HDR400 and run that all the time.

You seem to have a hatred, or at the very least a strong dislike, for this monitor.
 
So... the panel doesn't use PWM, and doesn't use DC dimming, so what kind of dimming does it use that creates this "flicker" you see? I have had this monitor for about 5 months now, and I can't see any flicker. I have no idea what it's calibrated to, as I haven't performed any calibration on it; it's whatever it runs out of the box, and I have it set to HDR400 and run that all the time.

You seem to have a hatred, or at the very least a strong dislike, for this monitor.
It's a property of OLED: the material is too organic to emit steady light at the start of each frame, so there is oscillation happening, an involuntary PWM-like behavior over a period short enough to cause eye strain.

Eyes cannot stare at a flickering monitor at just an average 200-nit brightness for long.

I have said it somewhere before.
https://hardforum.com/threads/the-3...s-started-wait-for-it.2002618/post-1045498268
 