Dell Alienware AW3423DW 34″ QD-OLED 175Hz (3440 x 1440)

Matches my initial impression of the monitor as well. Works fine for my "competitive" needs, though I assume most would be using a 16:9 for those scenarios. Curious if Dell can fine-tune the processing delay, and how the G8 will handle this. That being said, I've basically swapped gaming/FPS duties over to the AW, and the added 1ms hasn't been an issue personally.

So far I haven't seen any real numbers taken at the native display resolution and max refresh rate. Waiting on Rtings, HUB, TFT, Prad, etc.

The panel is fast and free of the weird ghosting/smearing issues.
I placed a G7 240Hz 1440p and the AW side by side; the G7 is a smidge faster in response, while the AW has better clarity and looks smoother. This was using "Duplicate Display", as unfortunately I can't send a native/active 2560x1440 signal to the AW.

 
Matches my initial impression of the monitor as well. Works fine for my "competitive" needs, though I assume most would be using a 16:9 for those scenarios. Curious if Dell can fine-tune the processing delay, and how the G8 will handle this. That being said, I've basically swapped gaming/FPS duties over to the AW, and the added 1ms hasn't been an issue personally.



Another reviewer that didn't test properly! Where are the results with no gsync or adaptive sync? Where are the input lag tests with Gsync (Hardware) instead of Adaptive sync? These are big oversights IMHO. A LOT of us never use that nonsense because it increases input lag. Back to waiting for TFTcentral and Rtings.
 
Another reviewer that didn't test properly! Where are the results with no gsync or adaptive sync? A LOT of us never use that nonsense because it increases input lag. Back to waiting for TFTcentral and Rtings.
The test is valid and "proper". There are a lot of us who use Gsync/Adaptive Sync.

That being said, hopefully you get your non-gsync results as well since that is also valid.
 
It doesn't offer that high a bitrate. It caps out at 80 Gbps raw (77.4 Gbps after overhead), so a little less than twice HDMI 2.1 (48 Gbps raw / 42 Gbps effective). It should also be noted that's the max rate; there are actually three new bitrates for 2.0: 40 Gbps, 54 Gbps, and 80 Gbps, so just because something supports 2.0 doesn't mean it'll support the highest data rate.

Anything that would require a data rate beyond 77 Gbps of effective throughput is going to use either DSC, chroma subsampling, or both.
I'm aware. The 16K spec is with DSC, but we don't know exactly how efficient DSC is, so I calculated how much bandwidth that spec would require without it and then matched it with my hypothetical 4K spec, assuming an equal level of compression.

Then just Google another site with specs about DP 2.0; it's not that difficult, mate. 😉
It's almost like I'm asking here because I already did that and didn't find anything. The resolutions you find in every article or thread about DP 2.0, like what you linked, are just quoted from the official display modes, nothing interesting like what I asked.

Another reviewer that didn't test properly! Where are the results with no gsync or adaptive sync? Where are the input lag tests with Gsync (Hardware) instead of Adaptive sync? These are big oversights IMHO. A LOT of us never use that nonsense because it increases input lag. Back to waiting for TFTcentral and Rtings.
G-Sync doesn't add appreciable input lag unless you're playing a game where you can get several hundred FPS more than your G-Sync limit, and even then it's like 4-5 ms (142 fps G-Sync cap vs 1000+ fps with no sync).
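For context on where a number like that sits, the frame-time difference between the two scenarios bounds the extra latency. A quick sanity check using the 142 fps and 1000 fps figures from the post above:

```python
# Frame times for a 142 fps G-Sync cap vs an uncapped 1000 fps game.
# The latency penalty of the cap can't exceed the difference in
# frame delivery time.
def frame_time_ms(fps):
    return 1000.0 / fps

capped = frame_time_ms(142)     # ~7.0 ms per frame
uncapped = frame_time_ms(1000)  # 1.0 ms per frame
print(f"frame-time difference: {capped - uncapped:.1f} ms")  # ~6.0 ms
```

So the measured 4-5 ms is within the ~6 ms ceiling that the frame-time gap allows, and only shows up at all in games that can run that far past the cap.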
 
The test is valid and "proper". There are a lot of us who use Gsync/Adaptive Sync.

That being said, hopefully you get your non-gsync results as well since that is also valid.
But he never tested G-Sync, only adaptive sync, and we know for a fact that input lag consistently measures lower on G-Sync displays. Personally, I won't be using G-Sync, FreeSync, or HDR, because all I play is Warzone.
 
G-Sync doesn't add appreciable input lag unless you're playing a game where you can get several hundred FPS more than your G-Sync limit, and even then it's like 4-5 ms (142 fps G-Sync cap vs 1000+ fps with no sync).

That varies from display to display, but if your argument is that this monitor with a hardware G-Sync module and G-Sync enabled has input lag comparable to this monitor without any form of adaptive sync enabled, we have ZERO data. He didn't test G-Sync because Tim used an AMD graphics card. He didn't test with no adaptive sync because Techspot and Tim think all gamers give a shit about HDR. Half of us don't care AT ALL.
 
That varies from display to display, but if your argument is that this monitor with a hardware G-Sync module and G-Sync enabled has input lag comparable to this monitor without any form of adaptive sync enabled, we have ZERO data. He didn't test G-Sync because Tim used an AMD graphics card. He didn't test with no adaptive sync because Techspot and Tim think all gamers give a shit about HDR. Half of us don't care AT ALL.

I bet more gamers would care about HDR than about single-digit-millisecond differences in input lag, especially after experiencing in person what good HDR can add to the experience of playing a game. Double especially on a console port like Warzone, where the TTK is like a minute and a half.

The vast majority of PC gamers don't play competitively.
 
I bet more gamers would care about HDR than about single-digit-millisecond differences in input lag, especially after experiencing in person what good HDR can add to the experience of playing a game. Double especially on a console port like Warzone, where the TTK is like a minute and a half.

The vast majority of PC gamers don't play competitively.
The TTK in Warzone is 700ms; the TTK in Apex is a minute and a half. Battle royale games still account for over 700 million gamers worldwide. Just because you don't enjoy them doesn't mean millions of people don't play them.
 
The TTK in Warzone is 700ms; the TTK in Apex is a minute and a half. Battle royale games still account for over 700 million gamers worldwide. Just because you don't enjoy them doesn't mean millions of people don't play them.
I dunno if you realize this, but the majority of people here are in favor of HDR and consider it a transformative experience. The idea of catering a display to a single demographic of users that you happen to be in, rather than making it great for all purposes, is baffling to me.

It's great that you're an esports whore and can take advantage of the motion clarity benefits this monitor provides, but we can also have great contrast, HDR, etc. Your posts make it sound like everyone on the planet spends 16 hours a day shitting/pissing in their computer chair on Warzone.
 
The TTK in Warzone is 700ms; the TTK in Apex is a minute and a half. Battle royale games still account for over 700 million gamers worldwide. Just because you don't enjoy them doesn't mean millions of people don't play them.

What percent of those 700 million people are playing on a cell phone (which probably has better HDR support than most PCs)? 70?

I said most PC gamers don't play competitively. BRs being popular kind of supports that claim, actually. A lot of the sweatier gamers avoid BRs because of the RNG factor which allows more casual people to beat better players due to luck. They are not purely skill based like other more competitive games.
 
I dunno if you realize this, but the majority of people here are in favor of HDR and consider it a transformative experience. The idea of catering a display to a single demographic of users that you happen to be in, rather than making it great for all purposes, is baffling to me.

It's great that you're an esports whore and can take advantage of the motion clarity benefits this monitor provides, but we can also have great contrast, HDR, etc. Your posts make it sound like everyone on the planet spends 16 hours a day shitting/pissing in their computer chair on Warzone.

Except the HDR on this monitor seems to have a few issues.

I'm sure it's still miles better than any LCD, but it seems like it could use some work.
 
Game-bugs and server issues are more likely to fail you, than 4-5ms of display input lag. Especially in Apex and Warzone :(
 
Except the HDR on this monitor seems to have a few issues.

I'm sure it's still miles better than any LCD, but it seems like it could use some work.
Yeah, I noted way back that HDR looks more impressive on a C1. The fact that he recommends HDR1000 mode confirms his eyes don't work, or he didn't test enough, because the ABL makes it unusable. Didn't know about the EOTF issue, but that also explains why HDR1000 mode looks poor.

In real-world use this is a 450-nit peak brightness monitor.
 
I dunno if you realize this, but the majority of people here are in favor of HDR and consider it a transformative experience. The idea of catering a display to a single demographic of users that you happen to be in, rather than making it great for all purposes, is baffling to me.

It's great that you're an esports whore and can take advantage of the motion clarity benefits this monitor provides, but we can also have great contrast, HDR, etc. Your posts make it sound like everyone on the planet spends 16 hours a day shitting/pissing in their computer chair on Warzone.
I'm not asking for this display to cater to me. All I'm saying is that testing input lag with adaptive sync instead of G-Sync, or neither, makes zero sense on a $1400 monitor with a built-in hardware G-Sync module.
 
Yeah, I noted way back that HDR looks more impressive on a C1. The fact that he recommends HDR1000 mode confirms his eyes don't work, or he didn't test enough, because the ABL makes it unusable. Didn't know about the EOTF issue, but that also explains why HDR1000 mode looks poor.

In real-world use this is a 450-nit peak brightness monitor.
Which isn't a big deal. It's certified as HDR400 True Black, so it actually being able to punch above that at all is kind of a nice trend.
 
Matches my initial impression of the monitor as well. Works fine for my "competitive" needs, though I assume most would be using a 16:9 for those scenarios. Curious if Dell can fine-tune the processing delay, and how the G8 will handle this. That being said, I've basically swapped gaming/FPS duties over to the AW, and the added 1ms hasn't been an issue personally.



Yuck.
 
I wonder if the processing lag is a result of the pixel shifting.

Curious to see how Samsung's numbers come out when they release their monitor with this panel.
 
Anyone got any experience with the LG 38WN95C-W?

If I start to get too annoyed by the fan noise on this Alienware then I may consider switching to the LG. It appears to have an internal LUT, so the direct-to-monitor colour calibration I'm used to from my old LG would be retained, as I don't really like calibrating via an ICC profile and GFX card LUT combo. Plus, 38" 144Hz, so it would be rather good on that front.

Just throwing around an idea should the fan noise start to get on my nerves, since my PC is almost as silent in use as it is when off, so I can hear every whisper from the monitor's two fans.
 
Anyone got any experience with the LG 38WN95C-W?

If I start to get too annoyed by the fan noise on this Alienware then I may consider switching to the LG. It appears to have an internal LUT, so the direct-to-monitor colour calibration I'm used to from my old LG would be retained, as I don't really like calibrating via an ICC profile and GFX card LUT combo. Plus, 38" 144Hz, so it would be rather good on that front.

Just throwing around an idea should the fan noise start to get on my nerves, since my PC is almost as silent in use as it is when off, so I can hear every whisper from the monitor's two fans.
Edge-lit IPS and HDR will never work well, period.
 
Hmm, figured that might be the case. Ultimately I'll likely just have to accept the fan noise and hope a future 38" model comes with FreeSync Premium instead of G-Sync so there's no fan. There's no denying that OLED is unbeatable, really.
 
Problem is hitting those kinds of frame rates, though. I have a 3080 Ti and a 6900 XT, and neither GPU is really capable of pushing 3440x1440 at 175fps in the latest games, at least not at ultra details or even high. But if next-gen GPUs are as powerful as they're rumored to be, then that will all change.
Hey! New here, but I agree that it's really hard to hit 175 FPS at 1440p ultrawide with high settings. I ordered this monitor about 3 weeks ago and won't receive it until June 9th. I'm holding off on playing games I already own, like CP2077 and Red Dead, until I replace my (apparently now lowly) 1080 Ti with a new GPU. Prices are falling back to Earth, but I'll wait for the RTX 4000 series since they're less than six months away. Maybe the RTX 4080 (or whatever they label it) will get me to 175Hz with eye candy. We'll see...
 


I'm about to cave and order one just because I'm so curious about the tech. Been looking around for deals or coupons - anybody here know of one?
 
Yeah, pretty lame. I'll be waiting on this because of that. Typical corporate/government discounts of 10% aren't working on this.
 
Yeah, pretty lame. I'll be waiting on this because of that. Typical corporate/government discounts of 10% aren't working on this.
I've honestly been waiting for them to just flat-out cancel my 5-6 week old order one of these days, because I used a bunch of the working coupons on day 1.

I also don't have the "advanced exchange warranty" on the invoice, since that was bugged for the majority of the first day of orders, which is a little worrisome.
 
Anyone know if there are any discounts for this? I asked Dell as I am part of the Membership Discount Program but they are telling me this particular monitor will not be eligible for the program (even though all other monitors pretty much are). I am thinking it will someday be part of the program but it might be a long time.
 
Someone on another forum was asking about ambient lighting making the screen grey, so I took these two photos earlier.
I have my really bright LED shop light on behind me, and the screen still looks fine.
 