AW3821DW (Nov 2020) with G-Sync Ultimate

Set it in the Nvidia control panel. I don't know that G-Sync modules have good scalers (the G-Sync module replaces the hardware where the scaler would normally be), so have the GPU do it instead.

Thanks, I set it to no scaling. I also had to set the resolution there to 2560x1440, though. If I left it at 3840x1600 and opened, say, Fall Guys at 2560x1440 full-screen, it still scaled. I guess the monitor was somehow still displaying 3840x1600 due to either Windows and/or G-Sync.
 
Thanks, I set it to no scaling. I also had to set the resolution there to 2560x1440, though. If I left it at 3840x1600 and opened, say, Fall Guys at 2560x1440 full-screen, it still scaled. I guess the monitor was somehow still displaying 3840x1600 due to either Windows and/or G-Sync.
I have encountered that once in a while, as if Windows didn't get the message to scale. Tabbing out and back in seemed to wake it up. Also make sure scaling is set to be performed on the GPU, not the display.
 
Okay, ran the demo benchmark for Tomb Raider.

SDR Game - Default Graphic (high)/Display Settings (Windows HDR on)
  1. Scene 1 - Looked good. SDR probably helped this night scene. Could see details in nearly all the areas
  2. Scenes 2 & 3 - Looked good
HDR Game - Default Graphic (high)/Display Settings (Windows HDR on, Monitor set to Mode 0)
  1. Scene 1 - Immediately very dark, even in the menu. Hard to tell if it was crushed or just very very dim. So much so that the display backlighting was becoming noticeable and distracting in this scene.
  2. Scene 2 - Looked good; didn't suffer as much as the night scene did upon switching to HDR
  3. Scene 3 - Once this scene moves to ground level with the people, the rocky walls seem crushed in several areas, especially around the steps just after the camera rotated around the butcher.
HDR Game - Change -> Game Brightness Slider Maxed, otherwise default Graphic (high)/Display Settings (Windows HDR on, Monitor set to Mode 0)
  1. Scene 1 - Suddenly this looked almost the same as the SDR test, just a bit more luminance in the lights and signs I think.
  2. Scene 2 - Looked good; despite the game brightness at 100, it still had plenty of contrast
  3. Scene 3 - Again, once this scene moves to ground level with the people, you could make out more detail in the rock walls this time, but there's still a hint of detail lost between the rocks
Maybe the HDR mid-point is really messed up in Tomb Raider. With this setting it's totally playable, though still with a bit of information lost here and there. Like the opening scene in the demo where Lara frees her leg and the game hands over control for the first time, camera looking down at her: you lose detail in the rocky wall below her at the bottom of the screen. But again, not enough that you would notice unless you compared it to maybe SDR?

Is there a way to compensate for this in Gears 5?

Also, I've been playing Cyberpunk in HDR and haven't noticed any issues there. Though, that game has a mid-point slider to help balance the dark areas against the max luminance you set. Only a few times have I walked myself into a really, really dark area where I can't see anything, at which point I should have tested whether there was a problem.
Thanks for taking the time to check!

SotTR - The thing is, shadow detail and HDR across the range look excellent to my eyes on an OLED TV. No need to mess around with any settings on the TV; all I did was set the HDR brightness so that there was still detail left in the associated menu test image.
Gears 5 - This one has three controls for HDR. The test images look fine (well, decent), but it still loses shadow detail on the AW in that first scene of the tutorial. Haven't tested it on the OLED.
Horizon Zero Dawn - Haven't mucked about with the settings much, but on the AW I still find considerable loss of shadow detail versus the TV.
Cyberpunk - Decided to leave it until a few more patches come out; the 3080 does an admirable job in terms of performance, but my 6850k CPU struggles with crowds and while driving. HDR would be a perfect fit for the game, too bad it is (or was at launch) totally broken with elevated blacks - not an AW issue, it's also messed up with the OLED.
Setting the dark stabilizer to 3 improves shadows a bit in all of these, but it's a pain to change that setting before and after launching a game (and it messes up the gamma for Windows and programs if left on all the time).

I guess what I'm more interested in is whether the LG has the same behaviour as the AW when activating HDR in games...

Side note/Rant :)
I don't actually play games on the TV.
Partly because of uneven OLED wear due to HUD elements (not burn-in, at least not in the sense we had burn-in with plasma TVs; but uneven wear is a fact of life with OLED, inherent to the technology), but mostly because I am used to PC gaming with m+k and sitting close to the monitor :)
Still, the OLED blows any IPS/VA out of the water in terms of image quality. I wonder how long we'll have to wait for 38-40" ultrawide microLED displays... :rolleyes:
 
Still, the OLED blows any IPS/VA out of the water in terms of image quality. I wonder how long we'll have to wait for 38-40" ultrawide microLED displays... :rolleyes:
Probably quite a while. They are still having trouble making inorganic micro LEDs small enough for big-screen TVs. Samsung has one, but it is really big on account of the LED size. So a good bit more work is needed before they are the itty-bitty things we'd need for a monitor. More likely we'll first see something that uses mini/micro LEDs to provide a ton of dimming zones, which can do quite a good HDR job. Another possibility is a completely different tech that would in a way be like CRT or SED renewed: quantum dots don't have to be excited with light; they can be excited with electrons too. So you could make a QD display that is driven by some sort of electron-emitting array. That could work really well for monitors. At this point the tech is theoretical (there aren't even prototypes I'm aware of), but it could happen.

Basically, for desktop usage, LCDs are the thing for the immediate future. Other tech just isn't quite there yet for monitors, particularly because of the stress of desktop use. I figure in five years, maybe less, there will probably be something to replace it that works well, but for now I'll happily use my LCD :). I just think about how this thing compares to the tiny, fuzzy, washed-out 13" CRT that I used to game on as a kid.

Also, with regards to the LG, while I can't test it, I imagine it is worse because it sounds like it has fewer local dimming zones. Strange, as I would think that would be a panel feature, but maybe Dell specified more? Either way, these monitors aren't going to be amazing for HDR, being edge-lit.
 
I received mine today. Initial impression is a mixed bag, really.
I am using both an LG CX65 OLED and an LG 38GN950 ultrawide for my PC. My intention is to compare the AW3821DW with the 38GN950 (same size, same panel). I've been using the 38GN950 for about two months now.

Although the AW3821DW is supposed to be using the same LCD panel as the 38GN950, it's immediately noticeable that the AW3821DW looks washed out compared to the vibrant 38GN950.
Both the 38GN950 and the AW3821DW have no stuck/dead pixels. In terms of backlight bleed, well, every edge-lit LCD DOES have backlight bleed, as they cannot completely turn the backlight off. It's just a matter of consistency and uniformity. In this sense, the BLB is all right; it is uniform across the screen. Edge-lit IPS cannot get better than this currently.

In terms of build quality, I think the LG 38GN950 is better. The AW3821DW feels like cheap, thin plastic compared to the thicker feel of LG's 38GN950.
The AW3821DW has what they call a "variable backlight mode". I have no idea why they include this feature, as it is useless on this edge-lit monitor.

Now, the only thing going for the AW3821DW over the 38GN950 is the hardware G-Sync module (G-Sync vs. G-Sync Compatible).
I will play around with them a bit more and share further findings. I might end up returning the AW3821DW and keeping the 38GN950. We'll see.
 
I received mine today. Initial impression is a mixed bag, really.
I am using both an LG CX65 OLED and an LG 38GN950 ultrawide for my PC. My intention is to compare the AW3821DW with the 38GN950 (same size, same panel). I've been using the 38GN950 for about two months now.

Although the AW3821DW is supposed to be using the same LCD panel as the 38GN950, it's immediately noticeable that the AW3821DW looks washed out compared to the vibrant 38GN950.
Both the 38GN950 and the AW3821DW have no stuck/dead pixels. In terms of backlight bleed, well, every edge-lit LCD DOES have backlight bleed, as they cannot completely turn the backlight off. It's just a matter of consistency and uniformity. In this sense, the BLB is all right; it is uniform across the screen. Edge-lit IPS cannot get better than this currently.

In terms of build quality, I think the LG 38GN950 is better. The AW3821DW feels like cheap, thin plastic compared to the thicker feel of LG's 38GN950.
The AW3821DW has what they call a "variable backlight mode". I have no idea why they include this feature, as it is useless on this edge-lit monitor.

Now, the only thing going for the AW3821DW over the 38GN950 is the hardware G-Sync module (G-Sync vs. G-Sync Compatible).
I will play around with them a bit more and share further findings. I might end up returning the AW3821DW and keeping the 38GN950. We'll see.
I'm shocked LG's build quality and coloring are better than Alienware's. Please do update your impressions.
 
I'm shocked LG's build quality and coloring are better than Alienware's. Please do update your impressions.
Dunno about the GN but the WN's build quality left much to be desired compared to the AW.
As for the colours... Different picture modes, maybe?
Once calibrated they should be more or less equal since the panels are the same.
 
I'm a bit surprised he feels the GN's build quality is better too. I also have both, and the AW exterior quality is excellent and elegant. My GN's is perfectly acceptable, not as bad as the WN previously posted here, but not as elegant as the AW. On a side note, I might finally be getting my replacement AW soon. More than a month wait. I wish the GN had an integrated power source instead of the external brick.
 
I'm a bit surprised he feels the GN's build quality is better too. I also have both, and the AW exterior quality is excellent and elegant. My GN's is perfectly acceptable, not as bad as the WN previously posted here, but not as elegant as the AW. On a side note, I might finally be getting my replacement AW soon. More than a month wait. I wish the GN had an integrated power source instead of the external brick.
I will never understand why LG likes the external bricks. There is plenty of room in even small monitors for internal power.
 
I'm a bit surprised he feels the GN's build quality is better too. I also have both, and the AW exterior quality is excellent and elegant. My GN's is perfectly acceptable, not as bad as the WN previously posted here, but not as elegant as the AW. On a side note, I might finally be getting my replacement AW soon. More than a month wait. I wish the GN had an integrated power source instead of the external brick.
To each his own, I guess. They're both plastic and neither shows excellent build quality, IMO. However, LG's 38GN950 seems to have thicker plastic compared to Dell's. Neither comes close to the build quality of the Pro Display XDR I use for work, though.
 
Ok that makes sense now lol. You’re spoiled with Apple. Can’t compare much to the Pro Display. Jealous you’re using it daily.
 
You have to understand these monitors target different tasks. Production monitors are $5-25k each.
Right... but I think the point is it doesn't make sense to compare these two monitors. Saying a $1500 144 Hz G-Sync monitor isn't the same build quality as a $5000 60 Hz production monitor is a little silly.
 
Okay, ran the demo benchmark for Tomb Raider.

SDR Game - Default Graphic (high)/Display Settings (Windows HDR on)
  1. Scene 1 - Looked good. SDR probably helped this night scene. Could see details in nearly all the areas
  2. Scenes 2 & 3 - Looked good
HDR Game - Default Graphic (high)/Display Settings (Windows HDR on, Monitor set to Mode 0)
  1. Scene 1 - Immediately very dark, even in the menu. Hard to tell if it was crushed or just very very dim. So much so that the display backlighting was becoming noticeable and distracting in this scene.
  2. Scene 2 - Looked good; didn't suffer as much as the night scene did upon switching to HDR
  3. Scene 3 - Once this scene moves to ground level with the people, the rocky walls seem crushed in several areas, especially around the steps just after the camera rotated around the butcher.
HDR Game - Change -> Game Brightness Slider Maxed, otherwise default Graphic (high)/Display Settings (Windows HDR on, Monitor set to Mode 0)
  1. Scene 1 - Suddenly this looked almost the same as the SDR test, just a bit more luminance in the lights and signs I think.
  2. Scene 2 - Looked good; despite the game brightness at 100, it still had plenty of contrast
  3. Scene 3 - Again, once this scene moves to ground level with the people, you could make out more detail in the rock walls this time, but there's still a hint of detail lost between the rocks
Maybe the HDR mid-point is really messed up in Tomb Raider. With this setting it's totally playable, though still with a bit of information lost here and there. Like the opening scene in the demo where Lara frees her leg and the game hands over control for the first time, camera looking down at her: you lose detail in the rocky wall below her at the bottom of the screen. But again, not enough that you would notice unless you compared it to maybe SDR?

Is there a way to compensate for this in Gears 5?

Also, I've been playing Cyberpunk in HDR and haven't noticed any issues there. Though, that game has a mid-point slider to help balance the dark areas against the max luminance you set. Only a few times have I walked myself into a really, really dark area where I can't see anything, at which point I should have tested whether there was a problem.
Cyberpunk has a lot of really pitch-black spots if you keep playing extensively. It's periodic. I went into an empty apartment complex in Valentino territory once where I literally could not see an inch in front of myself. And there were exploding mines hidden all over, so I was just dying left and right. Lol. The Variable Backlight on both the AW3821DW and the AW2721D is complete trash. They need to figure out some kind of fix for it. Or maybe fix the default HDR profile that kicks in whenever you go into HDR mode, to compensate for the awful VBL. The Variable Backlight will go off when you're staring into a pitch-black corner with total black crush and then make the black crush even worse, because now you have this massive column of white bloom shining on your screen, overlapping the black crush and intensifying it, making visibility absolutely impossible. And there are zero light sources in those game scenes when it happens, so it's like, WTF is the Variable Backlight even going off for? Makes no sense. Whoever designed these things is an imbecile. Definitely wasn't worth over 3 grand to buy two of these panels. I should've waited for mini LED.

Stumbled onto a post on the Cyberpunk forum that recommended switching the NVCP color settings from my default "Accurate" to checking the "Reference" box and letting Nvidia control it. After a brief test in Division 2, tucking my character into the darkest spots of the White House, I noticed improved visibility, and the VBL is less obnoxious. I still notice the VBL kicking in, but at least it's not glaringly bright. This was tested on my AW2721D, though. I'm going to do more extensive testing in The Summit tonight on my AW3821DW. This might've even improved the shitty piss-colored whites I was noticing when HDR mode is enabled in Windows. I immediately noticed better whites when I switched to Reference. And since HDR mode doesn't use my sick ICC profile for SDR mode, this might be as good as it gets.
 
Here are two issues you've already been talking about. I uploaded two videos to my YouTube channel to show them to you.

Monitor settings:
Preset Modes: Comfort View - Smart HDR: On - Response Time: Fast - Dark stabilizer: 2 - Variable Backlight: Mode 1


1. Flickering, or I don't know what you'd call it, in the response time test of the Eizo monitor test.




2. Variable backlight, I suppose caused by the local dimming, clearly visible in the end credits of TV series/movies. Not good; a really bad effect.




Do you think these issues can be solved by a firmware update? :cry:
 
If you don't like the local dimming being so intense, change the mode. Mode 0 is the most intense, Mode 2 the least (well, off is the least, but then it isn't variable).
 
Forget HDR with this monitor. The edge-lit IPS panel is just horrible and can't handle HDR at all. Use it strictly for SDR (ultrawide) content. For HDR, use an OLED.
 
I think it definitely depends on what you're doing with HDR mode. I think games like Control, Far Cry 5, etc. look great with HDR. Is it going to look like OLED HDR? No, it will not, but for me personally, without those expectations, it looks just fine. Is there a bit of black crush? Yeah, but if you play with the settings, it's manageable. The dimming zones don't bother me at all once I understood how they function, but that might be a personal thing.
 
I'll be honest, I own the LG CX 48 OLED, and I've yet to find an enjoyable gaming experience in HDR. Maybe it's a personal thing, but I'm just underwhelmed by it.

I have the AW38 on order and am pumped for the UW experience again. That's where I feel the true value is. The AW34 is amazing and I'm using it as a second monitor... hoping this lives up to the same standard.
 
My Dell 2410 finally bit the dust. I was going for the LG, but since they are backordered till the end of days, I got this one. It ticks all the right boxes, and for $1,200 it's a steal, even without USB-C. Now to get my damn Ergotron out of the closet.
 
I'll be honest, I own the LG CX 48 OLED, and I've yet to find an enjoyable gaming experience in HDR. Maybe it's a personal thing, but I'm just underwhelmed by it.

I have the AW38 on order and am pumped for the UW experience again. That's where I feel the true value is. The AW34 is amazing and I'm using it as a second monitor... hoping this lives up to the same standard.

There are plenty of games that do HDR poorly. Heck, plenty of TV shows and movies too. So you will probably find some things underwhelming because they aren't well done. An example of a video game and a TV show I know of are RDR2 and The Mandalorian. Both are objectively better in SDR mode because they were designed for SDR, and their HDR is basically just the SDR signal in an HDR container, which just looks dark and/or washed out.

Sadly HDR is going to be one of those technologies that'll take time before everyone uses it properly.
 
My Dell 2410 finally bit the dust. I was going for the LG, but since they are backordered till the end of days, I got this one. It ticks all the right boxes, and for $1,200 it's a steal, even without USB-C. Now to get my damn Ergotron out of the closet.
How did you get one for 1200? I've been waiting to get one in that range; it's been at 1529 since I started watching it, though.
 
How did you get one for 1200? I've been waiting to get one in that range; it's been at 1529 since I started watching it, though.
Dell Japan has it for $1560 (160,000 yen), and a 15%-off coupon dropped it to $1250.
 
Free shipping to the US? :) Does the 15% coupon work in the US? Where do I find that?
I just checked; they won't ship to the US. And the coupon is Japan-only, it seems. But it makes me wonder if there is a US one floating out there somewhere.

MONTORSP15PQ4W12
 
Just ordered one for $1316 delivered. I had a Distill notification on it for a price drop and it just went off... and I had a 10% off coupon.
 
Perfect timing. Monitor is on sale for $1379 before discount. I just reordered.
Hmm, this new price is $95.29 less than what I paid for my current AW3821DW.

Is it crazy to order a 2nd one to see which has the better panel, and/or possibly save $95.29 when returning the first one? (Edit: unless there's a chance Dell will apply a restocking fee?)

Only thing bothering me on the current one is a darker spot in the center top that is sometimes noticeable on white webpages, but the backlight on black looks great. If I tried to just straight up change it, I assume I would have to give up the current one, even if the replacement is worse.
 
After a long internal struggle, I've decided to scrap 144 Hz completely unless I really feel I need those 24 FPS.

I found the colors to be slightly better and blacks to be less crushed using 120 Hz + 10-bit + HDR rather than 144 Hz + 8-bit + HDR. I'm sure this is a 'duh' moment, but eh.
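For anyone wondering why 144 Hz tends to force 8-bit on this monitor: it lines up with DisplayPort 1.4 bandwidth (as far as I know the G-Sync module doesn't do DSC). Here's a rough back-of-the-envelope sketch; the blanking figures are my assumptions, not the monitor's exact timings:

```python
# Rough check of uncompressed DisplayPort 1.4 bandwidth for 3840x1600.
# HBR3 raw rate is 32.4 Gbit/s; 8b/10b encoding leaves ~25.92 Gbit/s for video.
DP14_PAYLOAD_GBPS = 32.4 * 8 / 10

def required_gbps(h_active, v_active, refresh_hz, bits_per_channel,
                  h_blank=80, v_blank=70):
    """Approximate bandwidth needed, using guessed reduced-blanking overhead."""
    pixels_per_frame = (h_active + h_blank) * (v_active + v_blank)
    return pixels_per_frame * refresh_hz * 3 * bits_per_channel / 1e9  # RGB

for hz, bpc in [(144, 10), (144, 8), (120, 10)]:
    need = required_gbps(3840, 1600, hz, bpc)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "does NOT fit"
    print(f"{hz} Hz @ {bpc}-bit: ~{need:.1f} Gbit/s -> {verdict}")
```

Roughly: 144 Hz at 10-bit needs ~28 Gbit/s, which doesn't fit in ~25.9 Gbit/s, while 144 Hz at 8-bit (~23 Gbit/s) and 120 Hz at 10-bit (~24 Gbit/s) both do, so you end up trading refresh rate for bit depth.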
 
Hmm, this new price is $95.29 less than what I paid for my current AW3821DW.

Is it crazy to order a 2nd one to see which has the better panel, and/or possibly save $95.29 when returning the first one? (Edit: unless there's a chance Dell will apply a restocking fee?)

Only thing bothering me on the current one is a darker spot in the center top that is sometimes noticeable on white webpages, but the backlight on black looks great. If I tried to just straight up change it, I assume I would have to give up the current one, even if the replacement is worse.

Dell is really easy with returns. They also price match. Order another and return the current one, or call for a price match.
 
After a long internal struggle, I've decided to scrap 144 Hz completely unless I really feel I need those 24 FPS.

I found the colors to be slightly better and blacks to be less crushed using 120 Hz + 10-bit + HDR rather than 144 Hz + 8-bit + HDR. I'm sure this is a 'duh' moment, but eh.
What game(s) have you tested with? I haven't been able to notice a difference with Hitman 2... but it seems to override the Windows settings and go 8-bit 144Hz no matter what ><.
 
Dell is really easy with returns. They also price match. Order another and return the current one, or call for a price match.
Well, just my luck, the price went up $40 in the middle of my checkout process. Still a bit cheaper. Thinking it over.
 
What game(s) have you tested with? I haven't been able to notice a difference with Hitman 2... but it seems to override the Windows settings and go 8-bit 144Hz no matter what ><.
The main game I tested and saw a difference in was Star Wars Battlefront 2 with HDR. Single player had some really dark places, so I tried it for kicks; some of the blacks turned more gray, and I was convinced. I'm not saying it's 100% certain it will impact other games, but because I can't see the extra 24 Hz, I decided to go with my gut for a while.

You can force any game to do your bidding. Just find the raw settings file; don't mess with the in-game options. I've never played Hitman, but I might be able to help out if you have specifics.
 
The main game I tested and saw a difference in was Star Wars Battlefront 2 with HDR. Single player had some really dark places, so I tried it for kicks; some of the blacks turned more gray, and I was convinced. I'm not saying it's 100% certain it will impact other games, but because I can't see the extra 24 Hz, I decided to go with my gut for a while.

You can force any game to do your bidding. Just find the raw settings file; don't mess with the in-game options. I've never played Hitman, but I might be able to help out if you have specifics.
Hitman 2 keeps its settings in the registry. You can limit the max refresh rate, which I did, but it still seems to change the monitor mode. There isn't any way I can figure out to tell it to use 10-bit color. Its config is just not that well designed. It still looks good; I played back and forth with HDR/SDR, and while there are issues in HDR mode because of the edge-lit dimming, overall the game looks better with HDR on.
 
Ordered a 2nd AW monitor to test. Should be here Friday if Dell doesn't delay it and ship 3-day instead of Next Day again.

Going to be interesting to have 2 of the same screens side by side to really see how they can differ due to panel lottery.

I still have the LG WN set up next to the current AW. Trying to recall additional tests to run on it for comparison, such as:
  • Record whether it has flickering from local dimming, like the AW
  • How local dimming behaves, including Variable Backlight modes (which already don't seem to change much)
  • etc
 
Hitman 2 keeps its settings in the registry. You can limit the max refresh rate, which I did, but it still seems to change the monitor mode. There isn't any way I can figure out to tell it to use 10-bit color. Its config is just not that well designed. It still looks good; I played back and forth with HDR/SDR, and while there are issues in HDR mode because of the edge-lit dimming, overall the game looks better with HDR on.
It seems, according to someone, that if you go to the registry key where the configuration is, there's a "Refreshrate" value (maybe a DWORD?) you can create and set, presumably to 120. Might be worth a shot.
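If you'd rather script that than poke around in regedit, here's a minimal sketch. The HKCU key path and value name are guesses based on the posts above, so verify where Hitman 2 actually stores its config on your machine before running it:

```python
# Hypothetical sketch: write a "Refreshrate" DWORD for Hitman 2.
# The key path below is an assumption -- check it in regedit first.
import winreg

KEY_PATH = r"Software\IO Interactive\HITMAN2"  # assumed location of the game's settings
VALUE_NAME = "Refreshrate"                     # the value name mentioned above

# Open (or create) the key with write access and set the refresh rate cap.
with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, 120)

print("Set", VALUE_NAME, "= 120 under HKCU\\" + KEY_PATH)
```

Back up the key (File > Export in regedit) before changing anything, in case the game doesn't like the new value.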
 