Dell Alienware AW3423DW 34″ QD-OLED 175Hz (3440 x 1440)

Interesting. Looks like the new AW3423DWF may indeed support firmware update as shown here. If true, I'm pleasantly surprised!

On the other hand, looks like it will still have at least 1 fan in there.

Fixes & Enhancements

1. Fixes loud fan noise after prolonged use

Well, that's disappointing that it's still got a fan. Guess I'm holding out for another display.
 
Interesting. Looks like the new AW3423DWF may indeed support firmware update as shown here. If true, I'm pleasantly surprised!

On the other hand, looks like it will still have at least 1 fan in there.
Just based on that, I would advise folks to wait for it unless they absolutely need G-Sync Ultimate, which should be almost no one.
 
PSA for folks looking for this monitor. My Microcenter (MN) has 25+ in stock. They must have finally got over the hump on the production shortages.
 
How does Skyrim look? I want to get this monitor... I want to know how dark the dungeons look. On my IPS they are greyish.
 
How does Skyrim look? I want to get this monitor... I want to know how dark the dungeons look. On my IPS they are greyish.

I think an open-world RPG like Skyrim would look much more immersive on an LG C2 42 4k display, to really see the vast world in front of you.

To me, Ultrawides don't do RPGs justice: too narrow a view, and limited vertical height.

I think the AW3423DW is perfect for shooters and other first-person games.
 
I bit the bullet and picked up the AW3423DW at Microcenter with their 2-year warranty, which they say covers anything and everything: bring it in with the slightest blemish and they'll give you the latest current model at no charge.

So let's see how this goes. I did try the LG C2 42" last month and it was just a little too big for me, as my desk is shallow and the display is right in front of my face. And I didn't like how it felt more like a big TV connected to my computer than a dedicated PC monitor.

Plus 3440 x 1440 res is easier to run.
 
Plus 3440 x 1440 res is easier to run.
This is some old-school thinking, guys.

We have DLSS, FSR2, TAAU, and just plain-ass bilinear upscaling these days. This all looks fine at 4k. You don't have to run every game at the full 2160p; you can play some games at 70% scale, others at 90%, it doesn't matter.
 
This is some old-school thinking, guys.

We have DLSS, FSR2, TAAU, and just plain-ass bilinear upscaling these days. This all looks fine at 4k. You don't have to run every game at the full 2160p; you can play some games at 70% scale, others at 90%, it doesn't matter.
Depends on the game and the frame rate. I have one of the 38" UWs, so that's 3840x1600, which is about 75% of the pixels of a 4k display. It runs at 144Hz, and I'd like to have games above 100fps if I can; the closer to 144fps the better. In some games I can do that, but in many I can't. Control is almost always under 100fps, even with DLSS. Ray tracing just hits hard. That's with a 3090.

It's ok if you like pixel count and that's what you want to spend power on, but others, like me, find it less important. I'd rather trade some pixel count for fps.
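
For reference, the rough pixel math behind that 75% figure (just a quick Python sketch, using the standard panel resolutions):

# Rough pixel-count comparison between common gaming resolutions.
resolutions = {
    '3440x1440 (34in UW)': 3440 * 1440,
    '3840x1600 (38in UW)': 3840 * 1600,
    '3840x2160 (4k)': 3840 * 2160,
}
four_k = resolutions['3840x2160 (4k)']
for name, pixels in resolutions.items():
    print(f'{name}: {pixels:,} pixels ({pixels / four_k:.0%} of 4k)')
# 3440x1440 is ~4.95M pixels (~60% of 4k), 3840x1600 is ~6.14M (~74%), 4k is ~8.29M.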
 
This is some old-school thinking, guys.

We have DLSS, FSR2, TAAU, and just plain-ass bilinear upscaling these days. This all looks fine at 4k. You don't have to run every game at the full 2160p; you can play some games at 70% scale, others at 90%, it doesn't matter.

Not a big fan of DLSS; I much prefer to run games at the raw native res of the display, with no tricks to make it play smoother.
 
Why not? It looks like TAA but sharper


That's what you're doing with DLSS. DLSS uses fewer pixels internally than native; it's just using temporal data to fill out the pixels at a minimal performance cost.
I know that. My point is that even with that, it still isn't enough to drive high rez displays to the framerates I like. Now don't get me wrong, I'd take this same monitor in a higher pixel count if I could have it for the same price and run it lower... I'm just saying that there are reasons why higher pixel count alone isn't appealing to some people.
 
I'm more open to it now. Is DLSS an in-game setting, or an Nvidia Control Panel setting?

Where have you been the past two years? hahaha

DLSS is like one of the most important things that has happened in PC games recently.

Anyway, Digital Foundry does a really good job explaining it. They even use some extreme examples like using a 540p base resolution:

DLSS is an option in select games. In other games, it's fine to just use the built-in resolution scaler, which in some games also uses temporal data from previous frames to fill out the pixels.

I know that. My point is that even with that, it still isn't enough to drive high rez displays to the framerates I like. Now don't get me wrong, I'd take this same monitor in a higher pixel count if I could have it for the same price and run it lower... I'm just saying that there are reasons why higher pixel count alone isn't appealing to some people.

All you do is run a lower DLSS setting. Like if you're getting 80fps at "Quality", you can drop to "Balanced" to get closer to 120fps or something.
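
If it helps to picture what those presets actually render internally, here's a rough sketch using the commonly cited DLSS 2 scale factors at a 2160p output (Python; exact numbers can vary per game and DLSS version, so treat these as ballpark):

# Approximate internal render resolutions for DLSS presets at a 3840x2160 output.
# Scale factors are the commonly cited DLSS 2 defaults; individual games may differ.
OUT_W, OUT_H = 3840, 2160
presets = {'Quality': 0.667, 'Balanced': 0.58, 'Performance': 0.50, 'Ultra Performance': 0.333}
for name, scale in presets.items():
    w, h = round(OUT_W * scale), round(OUT_H * scale)
    print(f'{name}: renders ~{w}x{h}, upscaled to {OUT_W}x{OUT_H}')
# Quality lands around 2560x1440, Balanced around 2227x1253, Performance at 1920x1080.

So dropping from Quality to Balanced is roughly a 25% cut in rendered pixels, which is where the extra frames come from.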
 
All you do is run a lower DLSS setting. Like if you're getting 80fps at "Quality", you can drop to "Balanced" to get closer to 120fps or something.
Understood, but that hurts visual quality, which is then kind of opposed to the "more pixels more better" idea. There are also differences between using DLSS, at any resolution, and native rendering. One place I really noticed it was Elder Scrolls Online, which offers DLAA, which is DLSS at native res. That game is basically always CPU-bound (because their engine sucks), so I can throw more GPU power at it. DLAA does reduce aliasing and sometimes makes textures sharper... but at the expense of blur on textures in motion. I found it quite noticeable that things would be blurry when moving towards them, but then when you stopped they'd resolve to a crisp image. Now does that matter? Depends on the person, but it isn't like DLSS is all gain, no pain. I can see why people don't like running it.

That's also not to mention all the games that don't support it.

Don't misunderstand, I'm not hating on high rez displays, but it also isn't the thing that everyone cares about the most, nor are all the solutions to get frames back perfect.
 
I'm excited to try this AW3423DW out. Wanna see the OLED gameplay.

But my mind is already thinking the 34" Ultrawide size is still a bit small and narrow; it's the same size as my current AW3420DW I've had for two years, and I do like it, but I always think something bigger is better.

But then I did try the LG C2 42 and found it just too large for my short desk, with the display just an arm's length or a bit more away from my face.

LOL I'm never satisfied. But I keep saying a 38" to 40" Ultrawide OLED would be pretty much perfect.

34" a bit too small, and 42" is a kinda large for my situation. Need a middle ground size between the two with QD-OLED tech.
 
Understood, but that hurts visual quality, which is then kind of opposed to the "more pixels more better" idea.
Yeah, it ends up looking low-res, which you were doing anyway if you had chosen a lower-res monitor

There are also differences between using DLSS, at any resolution, and native rendering.
Yeah, DLSS from, say 1440p internal res upscaled to 2160p is supposed to look better than 1440p on 1440p monitors

I found it quite noticeable that things would be blurry when moving towards them, but then when you stopped they'd resolve to a crisp image. Now does that matter? Depends on the person, but it isn't like DLSS is all gain, no pain. I can see why people don't like running it.
This is something Nvidia's been improving over time. You can actually drop new DLSS DLL files into the game folder and it will essentially "update" the version of DLSS in that game

And it also raises the question of whether the blurred 2160p image still ends up relatively close to the 1440p native image you'd be comparing it to.

That's also not to mention all the games that don't support it.
Thing is, more and more modern games are using some form of temporal upscaling that gives you better picture quality than the resolution you're scaling up from.

But even so, 1440p upscaled to 2160p via the basic bilinear GPU upscaling still looks pretty good. Not as good as native, but close enough.

Besides, it's not like every game you play runs best at 1440p. Some slightly older games would run at full 2160p just fine, depending on your GPU and how old the game is. Or you can run a resolution in between, like 1800p.

So all that to say: there's no reason not to go 4k. Whether you have a kickass GPU or mediocre GPU, scaling will give you more flexibility for good image quality across a multitude of games on a 2160p display.
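
To put some rough numbers on that flexibility (a quick sketch; I'm assuming the resolution-scale slider works per axis, which is how most in-game scalers behave):

# What common resolution-scale settings work out to at a 3840x2160 output.
# Assumes the slider scales each axis linearly (typical for in-game scalers).
OUT_W, OUT_H = 3840, 2160
for scale in (0.70, 0.90, 1800 / 2160):  # 70%, 90%, and '1800p'
    w, h = round(OUT_W * scale), round(OUT_H * scale)
    pixel_fraction = (w * h) / (OUT_W * OUT_H)
    print(f'{scale:.0%} scale -> {w}x{h}, about {pixel_fraction:.0%} of the native pixel count')
# 70% -> 2688x1512 (~49% of the pixels), 90% -> 3456x1944 (~81%), 1800p -> 3200x1800 (~69%)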
 
LOL, my kids were looking at my new monitor. My teenage daughter, who's also a gamer and has an older Acer Predator 34" Ultrawide, looked at my new AW3423DW and was like, "I don't see any difference, like it looks exactly like your old monitor." I said it has pitch-black, deep inky blacks now, and more vibrant colors, and she was like, "Uh, ok..."

Then I played World of Warcraft and she said, "Oh crap, WOW that's very colorful, and super clear, looks amazing, but also that's way too bright, blinding bright Dad."

Then my 7th-grade son, who's a gamer too and uses my old 40" 4k Philips 60Hz monitor, sees the monitor and says, "Dad, is that your new monitor? Doesn't look any different than the old one." I said look at this OLED black wallpaper I have, and my son says, "Ok, I guess it's deeper and darker." But he hadn't seen me load up any games yet.
 
This is something Nvidia's been improving over time. You can actually drop new DLSS DLL files into the game folder and it will essentially "update" the version of DLSS in that game
For some games. Not all are ok with that (ESO is an example of one that is not).

And it also raises the question of whether the blurred 2160p image still ends up relatively close to the 1440p native image you'd be comparing it to.
I kinda doubt it. I can't do a 1:1 comparison, but my monitor is 3840x1600, so while not 2160 it is not that far off. Also, the same thing happens on my TV, which is 4k.

So all that to say: there's no reason not to go 4k. Whether you have a kickass GPU or mediocre GPU, scaling will give you more flexibility for good image quality across a multitude of games on a 2160p display.
Ya there is: not all monitors are 4k, and some people want something different. Like when I got mine, I wanted an ultrawide. Well, there is only one ultrawide I'm aware of with 2160 vertical pixels, and it is limited to 60Hz. I also wanted a high refresh rate display. So the Alienware 38" ultrawide was what I ended up choosing. I, and others, are willing to trade resolution for other things. The monitor this thread is about is another example: there is no 4k equivalent. The C2 is not only much larger, too large for some people, myself included, but it isn't as fast, nor does it handle full-screen brightness as well. To some people, that matters. If it doesn't to you, that's completely cool, but there are reasons why people might want something different.

Another reason is that there are productivity apps that just don't do HiDPI well. DaVinci Resolve is an example: if you have a 200ppi display, you are going to have a miserable time with it. Nuendo is another example. It supports HiDPI quite well itself, but not all plugins do, so you can have issues depending on what VSTs you load if you use a HiDPI display.
 
Looks cool;
 

[Photo attachments]
Looks cool;
Hope it works out well for you. I still may get one, I keep going back and forth. I don't have much time to game for a while, so I'm going to hold off on it, and probably build a new system core first since Nuendo could use some more CPU (and I just want a new computer), but I'm thinking about it. I love my 38" size-wise though. I would really like to try HDR gaming on one though.
 
Honest impressions so far: I like the AW3423DW, but I'm not blown away by it either.

Yes, the perfect deep blacks are amazing. It gets awfully bright too, meaning specific color highlights are piercingly bright. The colors are good, but my AW3420DW had good, vibrant colors too.

Played Doom Eternal, CP2077, Skyrim, and World of Warcraft on it briefly so far, and it does look really good, but not jaw-droppingly amazing compared to my old monitor.

I think it being the exact same size display as my old screen doesn't help my opinion of it being night-and-day better; it's the exact same 34" Ultrawide size and design, with both being Alienware brand. I think I want something larger now, not just the same with better display tech.

I need some more time with this monitor to form my final opinions.
 
Hope it works out well for you. I still may get one, I keep going back and forth. I don't have much time to game for a while, so I'm going to hold off on it, and probably build a new system core first since Nuendo could use some more CPU (and I just want a new computer), but I'm thinking about it. I love my 38" size-wise though. I would really like to try HDR gaming on one though.

You have the 38" Ultrawide, right?

I'd keep that and wait a year or so to see if an OLED 38" comes out. 34" is too small, and even though the OLED is nice, it's not night-and-day different from a high-end IPS display; yes, blacks are of course BLACK not grey, but everything else isn't crazy better.
 
Tested out World of Warcraft again this morning and had a strange issue, something I've never seen playing WoW in my 18 years, a strange flickering in the load screens. It was on the outer fringes of the screen like this pulsating sort of weird anomaly.
 
I think it being the exact same size display as my old screen doesn't help my opinion of it being night-and-day better; it's the exact same 34" Ultrawide size and design, with both being Alienware brand.
On the other hand, that allowed you to judge this panel tech more objectively. Same thing happened to me. It was quite a bit better and impressive, yes, but not out-of-this-world better.
 
On the other hand, that allowed you to judge this panel tech more objectively. Same thing happened to me. It was quite a bit better and impressive, yes, but not out-of-this-world better.

Yep. I will return this AW3423DW this weekend. What will I get next? Maybe go back to the LG C2 42, or try the Asus ROG 42?
 
Tested out World of Warcraft again this morning and had a strange issue, something I've never seen playing WoW in my 18 years, a strange flickering in the load screens. It was on the outer fringes of the screen like this pulsating sort of weird anomaly.
Try disabling g-sync. It might be VRR flicker. It often rears its head in loading screens and menus where the framerate is bouncing all over the place.
 
Why not? It looks like TAA but sharper

Neither looks that great to me. That's why I turn them off and usually just use externally injected SMAA instead.


Honestly I just have not been impressed with any of the up-scaling techs yet including DLSS.

All I see are motion artifacts here and there, and often an overly sharpened look which I don't like.
 
You have the 38" Ultrawide, right?

I'd keep that and wait a year or so to see if an OLED 38" comes out. 34" is too small, and even though the OLED is nice, it's not night-and-day different from a high-end IPS display; yes, blacks are of course BLACK not grey, but everything else isn't crazy better.
Ya, I have the 38. The main interest point for me is HDR gaming. The 38 is actually usable for HDR games, which surprised me, but it isn't great. It can do most of the color space, and can do bright and dark areas, but can't do good highlights since it doesn't have lots of dimming zones or per-pixel control. I've seen HDR games on OLEDs and it is really good. The darker blacks would be nice, but not a huge thing; a bias light behind the monitor works well for me. The faster pixel response would be nice too; there isn't a lot of blur on the 38, but there's a little.

For the moment, I'm doing nothing.
 
For those interested, here's the manual for the AW3423DWF. Seems like there are a couple more settings in the menus compared to the original, from what I remember.
 
So all that to say: there's no reason not to go 4k. Whether you have a kickass GPU or mediocre GPU, scaling will give you more flexibility for good image quality across a multitude of games on a 2160p display.

Hard disagree: the sacrifice to make a modern title playable on mediocre hardware at 4k is just too high, IMO. Even on high-end hardware it can be, and screen tearing is real.

DLSS is very hit and miss, and outside of a few titles where frames really matter more than quality (like CoD), or where the game forces TAA with no other options, it drives me nuts. I can always see when DLSS is turned on; it makes every game smudgy in motion, especially already low-res assets in a scene. I have even less interest in DLSS 3, the revenge of interpolation. That, coupled with the general lack of market longevity for proprietary solutions, makes it a near-moot point to me. *I don't believe DLSS will be around by 2030 unless it has industrial applications outside the consumer market.

At 3440x1440 with a 3090 and a new Intel CPU I'm punching out 80-130fps at ultra settings depending on the title; it's the sweet spot for me and why I've picked up this monitor. I am not impressed at all with the 4k performance of my system on my buddy's LG OLED C1; he has the same system as me, so it was an easy test.

Use-based purchasing is critical to being a happy consumer.
 
How do games look that don't support ultrawide? I'm using a 32-inch Samsung G7 and love it. I'm not able to go back to anything smaller; I've had a hard time playing on a 27-inch since.
 
How do games look that don't support ultrawide? I'm using a 32-inch Samsung G7 and love it. I'm not able to go back to anything smaller; I've had a hard time playing on a 27-inch since.

Black bars on the sides. Some indie titles stretch the image, which is awful.
 
*I don't believe DLSS will be around by 2030 unless it has industrial applications outside the consumer market.


That doesn't make a lot of sense. Upscaling has been getting more and more popular thanks to DLSS, FSR2, TAAU. We're going to see more of it in the future, not less.

Because running a lower resolution and using temporal data to increase perceived resolution allows you to run higher fidelity settings like ray tracing at lower cost. It's a no brainer, and it looks great. And any motion artifacts like you're talking about are getting worked out as time goes on.
 
That doesn't make a lot of sense. Upscaling has been getting more and more popular thanks to DLSS, FSR2, TAAU. We're going to see more of it in the future, not less.

Because running a lower resolution and using temporal data to increase perceived resolution allows you to run higher fidelity settings like ray tracing at lower cost. It's a no brainer, and it looks great. And any motion artifacts like you're talking about are getting worked out as time goes on.
Thanks for denying my experience; it's not a no-brainer.

Upscaling has come and gone before; now it's proprietary upscaling.
 
Nice setup, I see that you have a Logitech keyboard. It looks like you have an iCUE NEXUS Companion Touch Screen, am I correct? What mousepad is that? I cannot make out that red glowing object on the left; what is that?
Also, why did you return the monitor?
Ya, that is a Logitech G815 and the iCue Nexus.
Mousepad is an LTT Northern Lights deskpad. And the red glowing thing is my old Logitech G13 gamepad.
And I didn't return my monitor if you are asking me.

[Photo attachment]
 