Sweet Baby Jeebus Alienware 55" 4k120 OLED Displayport

Joined
Apr 14, 2011
Messages
647
Checking back in: first time I've owned a monitor and not even thought about replacing it. Still the only OLED 4K@120Hz VRR game in town. End game.

Yeah, and TBH I am going to bet the LG CX and the 3080 Ti will both be delayed big time. This Alienware will be the only game in town for a LONG time.
 

MistaSparkul

[H]ard|Gawd
Joined
Jul 5, 2012
Messages
1,521
But then you'd have to wait for an HDMI 2.1 GPU to be released without any delays.

Well, 4K120Hz is supposed to work on the CX... but at 4:2:0 chroma. Still better than 60Hz regardless, and something I'd gladly live with for a few months.
 

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,871
I wonder if you could run an ultrawide resolution or another letterboxed resolution on it at 4:2:2 120Hz, within the bandwidth limit of HDMI 2.0. Going to post the question in the CX thread, since at least one person there already has a 65" LG CX I think.
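For what it's worth, you can sanity-check which modes fit: raw pixel rate times bits per pixel (plus blanking overhead) against HDMI 2.0's roughly 14.4 Gbit/s of usable video bandwidth (18 Gbit/s raw, minus 8b/10b coding). A rough sketch, assuming 8 bpc and a flat 10% blanking overhead (real CVT-RB timings differ a little):

```python
# Rough bandwidth check: does a given mode fit in HDMI 2.0's
# ~14.4 Gbit/s of usable video bandwidth (18 Gbit/s raw, 8b/10b coding)?
# Blanking overhead is approximated as a flat 10%; real timings differ.

HDMI20_GBPS = 14.4  # usable payload after 8b/10b

BPP = {"4:4:4": 24, "4:2:2": 16, "4:2:0": 12}  # bits/pixel at 8 bpc

def mode_gbps(w, h, hz, chroma, blanking=1.10):
    """Approximate required bandwidth in Gbit/s (10% blanking assumed)."""
    return w * h * hz * BPP[chroma] * blanking / 1e9

for w, h, chroma in [(3840, 2160, "4:2:0"),
                     (3840, 1600, "4:2:2"),
                     (3840, 2160, "4:4:4")]:
    need = mode_gbps(w, h, 120, chroma)
    verdict = "fits" if need <= HDMI20_GBPS else "too much"
    print(f"{w}x{h}@120 {chroma}: {need:.1f} Gbit/s -> {verdict}")
```

Under these assumptions, 4K120 at 4:2:0 squeaks in (which matches the CX claim above), a 3840x1600 letterbox at 4:2:2 also fits, and full 4K120 4:4:4 does not.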
 

skypine27

Gawd
Joined
Apr 18, 2008
Messages
698
So I had more money than sense and dropped $3K on this thing....

Days of gaming opinions:

+Easy to set up, once you realize the PC native resolutions are listed BELOW the TV ones!! (You need to select the PC resolution to get 120Hz and G-Sync compatibility.) I set Windows scaling to 150%; it defaults to 300%, which makes the icons huuuuge.
+Looks awesome color-wise, especially the blacks. They aren't kidding about OLED and black; Elite Dangerous is just wow.
+Web browsing / text feels comfy, was worried about this.

-A single RTX 2080 Ti and an overclocked HEDT CPU isn't enough to drive it at 120Hz in any games. Hoping the 3080 Ti will change this....
-Huge. You need a desk the size of a conference-room table (I have a desk that big, so it isn't a minus for me, but it would be for others).
-Doesn't feel as snappy as the AW3418DW I came from, but that was 3440x1440 vs. 4K, so I was getting a consistent 100 FPS on it.
-Here's one that I didn't expect.... coming from a curved monitor for a LONG time, this flat panel actually feels like it's curved THE WRONG way!!! Looking at it makes me feel like the sides are actually curved AWAY from me and the front is bulging. I know it's mental; hope it goes away.
 

Attachments

  • IMG_9967.jpg (511.3 KB)
  • IMG_9966.jpg (504.3 KB)
Joined
Apr 22, 2011
Messages
521
So I had more money than sense and dropped $3K on this thing....

Days of gaming opinions:

+Easy to set up, once you realize the PC native resolutions are listed BELOW the TV ones!! (You need to select the PC resolution to get 120Hz and G-Sync compatibility.) I set Windows scaling to 150%; it defaults to 300%, which makes the icons huuuuge.
+Looks awesome color-wise, especially the blacks. They aren't kidding about OLED and black; Elite Dangerous is just wow.
+Web browsing / text feels comfy, was worried about this.

-A single RTX 2080 Ti and an overclocked HEDT CPU isn't enough to drive it at 120Hz in any games. Hoping the 3080 Ti will change this....
-Huge. You need a desk the size of a conference-room table (I have a desk that big, so it isn't a minus for me, but it would be for others).
-Doesn't feel as snappy as the AW3418DW I came from, but that was 3440x1440 vs. 4K, so I was getting a consistent 100 FPS on it.
-Here's one that I didn't expect.... coming from a curved monitor for a LONG time, this flat panel actually feels like it's curved THE WRONG way!!! Looking at it makes me feel like the sides are actually curved AWAY from me and the front is bulging. I know it's mental; hope it goes away.

Congrats, I have one myself running with a volt-modded Titan V. First time I don't long for a monitor with better tech to come along; it ticks all the boxes: OLED picture quality with infinite contrast that makes it pop, 4K, 120Hz, VRR, 0.5ms response time. I also run custom resolutions like 3840x1440 and 3840x1600 to play in ultrawide.

Some folks say it doesn't have HDR, blah blah... but it's a gimmick. A lot of negativity from those that don't own one and have never used it. Just ignore the peanut gallery and enjoy the best monitor experience there is!
 

skypine27

Gawd
Joined
Apr 18, 2008
Messages
698
Stryker:

What is the setting under Display, Smart HDR? I've set it to "Desktop" and "Game HDR" and am not sure I notice a difference?

Thx!
 
Joined
Apr 22, 2011
Messages
521
Stryker:

What is the setting under Display, SmartHDR? Ive set it "desktop" and "game HDR" and am not sure if I notice a difference?

Thx!

I will tinker with it when I get home today to see if it works in conjunction with setting HDR in windows as well as games. I don't run HDR because it's not for me, but I'm willing to experiment with it just for the hell of it.

The manual has info on them, but it's about as clear as mud.

https://downloads.dell.com/manuals/all-products/esuprt_electronics_accessories/esuprt_electronics_accessories_monitors/dell-aw5520qf-monitor_user's-guide_en-us.pdf

Smart HDR: The Smart HDR (High Dynamic Range) menu enhances the display output by optimally adjusting contrast and the ranges of color and luminosity to resemble true-to-life visuals. The default setting is Off. You may set the Smart HDR mode to:
• Desktop: Suitable for using the monitor with a desktop computer.
• Game HDR: Suitable for playing HDR-compatible games. It displays more realistic scenes and makes the gaming experience immersive and entertaining.
• Movie HDR: Suitable for the playback of HDR video content.
• Reference: Displays OLED panel's native color gamut.
NOTE: When the monitor is processing HDR content, Preset Modes and Brightness will be disabled.
 

skypine27

Gawd
Joined
Apr 18, 2008
Messages
698
Stryker:

So here's what I found out so far. When you set the monitor to Game HDR, I guess it "simulates" that the monitor supports HDR, even though we know it technically does not. So now, in games that support HDR, the HDR box in the game menu is no longer greyed out; you can actually toggle it on! I'm testing this in Assassin's Creed Odyssey right now, and it definitely makes the game look different. Not sure if "better" or not, but it is certainly noticeably different!

Update: I've tested it in a couple of games now. It works in Far Cry 5 and also makes the game look different. However, in War Thunder, when you launch it with HDR enabled in the game menu, it gives an error message like "output device is not in HDR mode". So I guess the simulated HDR doesn't work in all games.

EDIT: The text in install menus, like when installing a Windows app, seems kind of fuzzy. Any solution to this?? I read somewhere about changing ClearType or something, but I have no idea what that means.
 
Last edited:

Vega

Supreme [H]ardness
Joined
Oct 12, 2004
Messages
6,531
-Here's one that I didn't expect.... coming from a curved monitor for a LONG time, this flat panel actually feels like it's curved THE WRONG way!!! Looking at it makes me feel like the sides are actually curved AWAY from me and the front is bulging. I know it's mental; hope it goes away.

Ya, that is a problem when sitting close to a large flat screen; it's not a natural perception. I was so pissed when LG decided to get rid of the curve after the C6. I LOVED the curve of the C6 for PC gaming.
 
Joined
Apr 22, 2011
Messages
521
Stryker:

So here's what I found out so far. When you set the monitor to Game HDR, I guess it "simulates" that the monitor supports HDR, even though we know it technically does not. So now, in games that support HDR, the HDR box in the game menu is no longer greyed out; you can actually toggle it on! I'm testing this in Assassin's Creed Odyssey right now, and it definitely makes the game look different. Not sure if "better" or not, but it is certainly noticeably different!

Update: I've tested it in a couple of games now. It works in Far Cry 5 and also makes the game look different. However, in War Thunder, when you launch it with HDR enabled in the game menu, it gives an error message like "output device is not in HDR mode". So I guess the simulated HDR doesn't work in all games.

EDIT: The text in install menus, like when installing a Windows app, seems kind of fuzzy. Any solution to this?? I read somewhere about changing ClearType or something, but I have no idea what that means.

Haven't tinkered, but the manual states that it does support HDR content; the simulated presets just won't be available when it has a real HDR signal to process.

From the manual:

NOTE: When the monitor is processing HDR content, Preset Modes and Brightness will be disabled
 

Baasha

Weaksauce
Joined
Feb 23, 2014
Messages
107
Haven't posted on this forum in a while but had to post in here.

Got this Alienware AW5520QF back in Oct. 2019. Worked beautifully until mid-Feb. 2020. Went into BIOS to tweak some OC settings for my CPU, hit "save and exit" and then poof, the monitor went black and never turned on again. Well, it turned "on" with the colors in the back but there was no image.

Fast forward a few days and I had to get a replacement monitor from Dell. That one worked until mid-March 2020 and then bang - again, image went black when I was just online (not gaming) and now the monitor didn't even turn on. Called Dell again and got yet another replacement.

So, this is the 3rd monitor and guess what? Today, again while just online and not gaming, I heard a LOUD "pop" sound and the image went black and the thing won't even turn on!

I had the monitor plugged into a surge protector by itself - meaning - NO OTHER DEVICES were plugged into that surge protector which went to its own socket at the wall!

Needless to say, I am HUGELY FRUSTRATED with this display - there is definitely something wrong with its power layout/design as this is the THIRD time this has happened.

1.) First time monitor was plugged into a wall socket directly

2.) Second time monitor was plugged into a surge protector with other devices (peripherals etc.)

3.) Third time monitor was plugged into a surge protector by itself with no other devices daisy-chained to it

It is hands down the best monitor I've used - I just reconnected my old Dell UP3017Q 4K OLED 30" 60Hz monitor and it feels like molasses. The image quality, 120Hz, and overall experience is second to none.

However, the monitor just crapping out on me is ridiculous and this is the THIRD time it's happened.

Has anyone else experienced similar issues?

Obviously, I'm going to get another replacement but can you guys suggest how to avoid this in the future?

One point - there was a huge thunder/lightning/rain storm last night. The whole point of the surge protector is to do just that - PROTECT - the device(s) from dying. The loud pop I heard, I bet, was a blown capacitor or some such thing.

Would really appreciate some advice/help on this. I love this monitor and want to use it always. I bought it back when it was still $4K as well.

Playing FPS games at > 120fps in 4K maxed out is a truly unbelievable experience - with the 55" OLED image it's so beautiful.
 

Attachments

  • IMG_3813.jpg (372 KB)

Lateralus

More [H]uman than Human
Joined
Aug 7, 2004
Messages
16,009
Haven't posted on this forum in a while but had to post in here.

Got this Alienware AW5520QF back in Oct. 2019. Worked beautifully until mid-Feb. 2020. Went into BIOS to tweak some OC settings for my CPU, hit "save and exit" and then poof, the monitor went black and never turned on again. Well, it turned "on" with the colors in the back but there was no image.

Fast forward a few days and I had to get a replacement monitor from Dell. That one worked until mid-March 2020 and then bang - again, image went black when I was just online (not gaming) and now the monitor didn't even turn on. Called Dell again and got yet another replacement.

So, this is the 3rd monitor and guess what? Today, again while just online and not gaming, I heard a LOUD "pop" sound and the image went black and the thing won't even turn on!

I had the monitor plugged into a surge protector by itself - meaning - NO OTHER DEVICES were plugged into that surge protector which went to its own socket at the wall!

Needless to say, I am HUGELY FRUSTRATED with this display - there is definitely something wrong with its power layout/design as this is the THIRD time this has happened.

1.) First time monitor was plugged into a wall socket directly

2.) Second time monitor was plugged into a surge protector with other devices (peripherals etc.)

3.) Third time monitor was plugged into a surge protector by itself with no other devices daisy-chained to it

It is hands down the best monitor I've used - I just reconnected my old Dell UP3017Q 4K OLED 30" 60Hz monitor and it feels like molasses. The image quality, 120Hz, and overall experience is second to none.

However, the monitor just crapping out on me is ridiculous and this is the THIRD time it's happened.

Has anyone else experienced similar issues?

Obviously, I'm going to get another replacement but can you guys suggest how to avoid this in the future?

One point - there was a huge thunder/lightning/rain storm last night. The whole point of the surge protector is to do just that - PROTECT - the device(s) from dying. The loud pop I heard, I bet, was a blown capacitor or some such thing.

Would really appreciate some advice/help on this. I love this monitor and want to use it always. I bought it back when it was still $4K as well.

Playing FPS games at > 120fps in 4K maxed out is a truly unbelievable experience - with the 55" OLED image it's so beautiful.

Gross. That's inexcusable for the price if it does turn out to be a design flaw, but considering I haven't heard of this happening to others yet and it has happened to you three times... maybe consider plugging it into a UPS with power conditioning? If you have dirty power, that's not good for sensitive electronics. I had an APC UPS that provided surge protection plus power conditioning; great unit.
 

Baasha

Weaksauce
Joined
Feb 23, 2014
Messages
107
Gross. That's inexcusable for the price if it does turn out to be a design flaw, but considering I haven't heard of this happening to others yet and it has happened to you three times... Maybe consider plugging it in to a UPS with power conditioning? If you have dirty power, that's not good for sensitive electronics. I had an APC UPS that provided surge protection + power conditioning; great unit.

Yea, it seems too coincidental that this has happened three times. Can you suggest a really good APC UPS model that does what you mentioned? I've never used one before.
 

Lateralus

More [H]uman than Human
Joined
Aug 7, 2004
Messages
16,009
Yea, it seems to be too coincidental that this has happened three times. Can you suggest a really good APC UPS model that does what you mentioned? I've never used one before.

This would be a good solid unit with AVR (automatic voltage regulation), and similar to the one that I had:

APC UPS, 1000VA UPS Battery Backup & Surge Protector, BX1000M Backup Battery, AVR, Dataline Protection and LCD Display, Back-UPS Pro Uninterruptible Power Supply https://www.amazon.com/dp/B06VY12HW4/ref=cm_sw_r_cp_api_i_xYaZEbM2NH8ZZ

The true power conditioners that do AVR and also noise filtering are a little more expensive:

https://www.apc.com/shop/us/en/cate.../av-power-conditioners-with-battery/N-1lvvdnh

But they are specifically marketed towards protecting A/V equipment like our OLEDs and expensive home theater gear, so they probably do a better job, although unfortunately I do not have experience with one personally.
 

Baasha

Weaksauce
Joined
Feb 23, 2014
Messages
107
Thanks for the links.

So, to understand better, the UPS Battery Backup & Surge Protector BX1000M says it has only "600W" - does this mean that it provides protection for up to only 600W?

My system uses a Corsair AX1500i and my second system (to be built) will be using EVGA 1600W T2 PSU. That along with all the peripherals is what I want protection for - mainly the computer and the monitor (and my NAS device).

The 1500VA version says 900W, so I guess I'm not understanding correctly.

Regarding the more expensive units, do we need a "battery backup" or just more surge protection and voltage regulation? How would the battery backup help if I'm using a system that, say, draws around 1800W at the wall (using 2x GPUs, OC'd CPU, etc.)?

If I want to have two systems (say both with 1600W PSUs), should I get two units? If so, which ones? Also, should I get a 20A breaker w/ outlet installed or even a higher one at 30A?

Thanks again.

This would be a good solid unit with AVR (automatic voltage regulation), and similar to the one that I had:

APC UPS, 1000VA UPS Battery Backup & Surge Protector, BX1000M Backup Battery, AVR, Dataline Protection and LCD Display, Back-UPS Pro Uninterruptible Power Supply https://www.amazon.com/dp/B06VY12HW4/ref=cm_sw_r_cp_api_i_xYaZEbM2NH8ZZ

The true power conditioners that do AVR and also noise filtering are a little more expensive:

https://www.apc.com/shop/us/en/cate.../av-power-conditioners-with-battery/N-1lvvdnh

But they are specifically marketed towards protecting A/V equipment like our OLEDs and expensive home theater gear, so they probably do a better job, although unfortunately I do not have experience with one personally.
 
Joined
Apr 22, 2011
Messages
521
Haven't posted on this forum in a while but had to post in here.

Got this Alienware AW5520QF back in Oct. 2019. Worked beautifully until mid-Feb. 2020. Went into BIOS to tweak some OC settings for my CPU, hit "save and exit" and then poof, the monitor went black and never turned on again. Well, it turned "on" with the colors in the back but there was no image.

Fast forward a few days and I had to get a replacement monitor from Dell. That one worked until mid-March 2020 and then bang - again, image went black when I was just online (not gaming) and now the monitor didn't even turn on. Called Dell again and got yet another replacement.

So, this is the 3rd monitor and guess what? Today, again while just online and not gaming, I heard a LOUD "pop" sound and the image went black and the thing won't even turn on!

I had the monitor plugged into a surge protector by itself - meaning - NO OTHER DEVICES were plugged into that surge protector which went to its own socket at the wall!

Needless to say, I am HUGELY FRUSTRATED with this display - there is definitely something wrong with its power layout/design as this is the THIRD time this has happened.

1.) First time monitor was plugged into a wall socket directly

2.) Second time monitor was plugged into a surge protector with other devices (peripherals etc.)

3.) Third time monitor was plugged into a surge protector by itself with no other devices daisy-chained to it

It is hands down the best monitor I've used - I just reconnected my old Dell UP3017Q 4K OLED 30" 60Hz monitor and it feels like molasses. The image quality, 120Hz, and overall experience is second to none.

However, the monitor just crapping out on me is ridiculous and this is the THIRD time it's happened.

Has anyone else experienced similar issues?

Obviously, I'm going to get another replacement but can you guys suggest how to avoid this in the future?

One point - there was a huge thunder/lightning/rain storm last night. The whole point of the surge protector is to do just that - PROTECT - the device(s) from dying. The loud pop I heard, I bet, was a blown capacitor or some such thing.

Would really appreciate some advice/help on this. I love this monitor and want to use it always. I bought it back when it was still $4K as well.

Playing FPS games at > 120fps in 4K maxed out is a truly unbelievable experience - with the 55" OLED image it's so beautiful.

I found the included DisplayPort cable to be problematic, maybe because it's long (10ft if I remember correctly); I suspect the signal degrades, and sometimes the AW55 would lose signal. Not sure how signal integrity works, but maybe my Titan V had trouble sending a strong enough signal and maintaining the handshake. I resolved it with a short (6ft), robust DisplayPort cable from Amazon.

I think it was this one:
https://www.amazon.com/Accell-DP-1-...ords=displayport+accell&qid=1591712886&sr=8-3
 
Last edited:

kasakka

[H]ard|Gawd
Joined
Aug 25, 2008
Messages
1,931
10 feet is AFAIK about the max recommended for DisplayPort. The quality of the cable becomes very important at longer lengths, based on my experience with long 8-10m HDMI cables.
 

Seyumi

Limp Gawd
Joined
Mar 30, 2011
Messages
288
Thanks for the links.

So, to understand better, the UPS Battery Backup & Surge Protector BX1000M says it has only "600W" - does this mean that it provides protection for up to only 600W?

My system uses a Corsair AX1500i and my second system (to be built) will be using EVGA 1600W T2 PSU. That along with all the peripherals is what I want protection for - mainly the computer and the monitor (and my NAS device).

The 1500VA version says 900W so I guess I'm not understanding correctly.

Regarding the more expensive units, do we need a "battery backup" or just more surge protection and voltage regulation? How would the battery backup help if I'm using a system that say draws around 1800W at the wall (using 2x GPUs, OC'd CPU etc.) ?

If I want to have two systems (say both with 1600W PSUs), should I get two units? If so, which ones? Also, should I get a 20A breaker w/ outlet installed or even a higher one at 30A?

Thanks again.

Your situation is a bit tricky, Baasha, since you have such a high-wattage PC. Most consumer UPSes top out at 900 watts. Anything more than that and you start getting into the commercial-grade, server-rack, $1000+ realm.

Usually on these UPSes there are two groups of plugs. Half are typically "battery backup + surge protection" while the other half are "surge protection only". Either set of ports still gets the clean power you need to prevent any potential frying. Your main PC itself probably needs to go on the non-battery side, since the battery won't have enough capacity to power your PC if you're stressing the system. All your other peripherals (OLED, speakers, etc.) can probably go on the battery backup plugs.
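On the VA vs. watt question above: consumer UPS watt ratings are just the VA rating multiplied by an assumed power factor, roughly 0.6 for this product line judging by the listed specs (that 0.6 is my inference from the numbers, not a published APC constant; check the datasheet). A quick sanity check:

```python
# UPS watt ratings are the VA rating times an assumed power factor.
# The ~0.6 figure is inferred from the listed specs (1000 VA -> 600 W,
# 1500 VA -> 900 W); always confirm against the actual datasheet.

def ups_watts(va, power_factor=0.6):
    """Deliverable watts for a given VA rating and power factor."""
    return va * power_factor

print(ups_watts(1000))  # 600.0 -- matches the BX1000M's 600 W rating
print(ups_watts(1500))  # 900.0 -- matches the 1500 VA model's 900 W rating
```

So a "1000VA / 600W" unit really can only deliver about 600 watts of real load; the VA number alone overstates what it can run.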

Personally I would recommend this unit:

https://www.amazon.com/gp/product/B00429N19W/ref=ox_sc_act_title_1?smid=ATVPDKIKX0DER&psc=1

It's the newest version of the most popular UPS sold on Amazon, which is the one I used to own:

https://www.amazon.com/gp/product/B000FBK3QK/ref=ox_sc_act_title_1?smid=AJZI1JW62R4Q6&psc=1
 
Joined
Apr 22, 2011
Messages
521
Stryker:

So here's what I found out so far. When you set the monitor to Game HDR, I guess it "simulates" that the monitor supports HDR, even though we know it technically does not. So now, in games that support HDR, the HDR box in the game menu is no longer greyed out; you can actually toggle it on! I'm testing this in Assassin's Creed Odyssey right now, and it definitely makes the game look different. Not sure if "better" or not, but it is certainly noticeably different!

Update: I've tested it in a couple of games now. It works in Far Cry 5 and also makes the game look different. However, in War Thunder, when you launch it with HDR enabled in the game menu, it gives an error message like "output device is not in HDR mode". So I guess the simulated HDR doesn't work in all games.

EDIT: The text in install menus, like when installing a Windows app, seems kind of fuzzy. Any solution to this?? I read somewhere about changing ClearType or something, but I have no idea what that means.

Was HDR enabled in Windows? Curious to know if that is why the option was grayed out in games, maybe Windows wasn't reporting that it was in HDR if the Windows HDR mode wasn't enabled and therefore games also didn't recognize it. That may be worth a shot to see if you get the real HDR going instead of "simulating" HDR with the monitor presets.
 

IdiotInCharge

NVIDIA SHILL
Joined
Jun 13, 2003
Messages
14,712
using a system that say draws around 1800W at the wall (using 2x GPUs, OC'd CPU etc.) ?
Mostly, that shouldn't even be possible. You'd be stretching a lot with consumer components to even get to 1000W at the wall. Obviously you'd want the PSU rated higher than that for a number of reasons, but actual wattage in the 1000W+ range should be very difficult to hit.
 

cybereality

Supreme [H]ardness
Joined
Mar 22, 2008
Messages
6,266
No, not needed at all. This comes from a guy that has bought 1500W PSUs just to be [H]ard. It's overkill.

And yeah, I was running Tri-SLI and all sorts of OC and everything. After I did the math, I think a 1200W would have been enough. I can't imagine needing 1800W.

I'm kind of over it at this point. SLI is a bust, and even though I had 5GHz on this machine, I put it back to stock. Not that it wasn't fun, but I guess I'm a simpler man now.
 

IdiotInCharge

NVIDIA SHILL
Joined
Jun 13, 2003
Messages
14,712
I'm kind of over it at this point. SLI is a bust, and even though I had 5GHz on this machine, I put it back to stock. Not that it wasn't fun, but I guess I'm a simpler man now.

Overclocking has become a bit of a downer recently, and multi-GPU support is in the shitter while devs grapple with actually supporting current GPU features.

Not sure if overclocking will be more of a thing in the future; these days, you're more or less paying for the number of cores. As for multi-GPU, that'll likely make a comeback at some point, but not before we see a new paradigm on the OS side of things to get stuff actually lined up.

No, not needed at all. This comes from a guy that has bought 1500W PSUs just to be [H]ard. It's overkill.
I mean, I can imagine a system that could do that, even one that might be somewhat reasonable in that the power could actually be put to use, but even then only just edging up to 1000W.

Reality is, you need maybe 650W max for a single-GPU setup with any consumer or enthusiast CPU. You'd need to push the larger Threadripper or even Epyc CPUs and stack a few GPUs in there to go much further, and well, one would hope at that point you're actually doing work!
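To put rough numbers on that: a back-of-the-envelope component budget (every figure below is a typical ballpark draw I'm assuming, not a measurement) lands a high-end single-GPU box around 600-650 W peak, comfortably under 1000 W at the wall:

```python
# Ballpark peak draws (watts) for a high-end single-GPU build.
# All figures are rough typical values, not measured data.
budget = {
    "CPU (enthusiast, boosted)": 250,
    "GPU (2080 Ti class)": 280,
    "Motherboard + RAM": 60,
    "Drives, fans, USB": 40,
}

total = sum(budget.values())       # peak system draw
headroom = total * 1.25            # ~25% margin for transients

print(total, round(headroom))
```

Even with a 25% margin for load transients, a quality PSU in the 800 W class covers this; 1500-1800 W only makes sense with multiple GPUs.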
 

cybereality

Supreme [H]ardness
Joined
Mar 22, 2008
Messages
6,266
Well maybe if you are mining with 6x RTX Titans or something crazy.

I'm talking about a consumer build with 1 or even 2 video cards and nothing fancy.
 

Vega

Supreme [H]ardness
Joined
Oct 12, 2004
Messages
6,531
With my single-GPU system in sig, I can pull up to 900 watts. There is a benefit to having a higher-wattage PSU like mine (EVGA 1600W): if you keep the power draw in the middle of its range, it is at its most efficient. It's also quieter, as my PSU fan never turns on since it never has to struggle. The PSU will also last longer.

1591779202304.png
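The "middle of the range" point checks out numerically: PSUs are generally most efficient around 40-60% load (see any 80 PLUS efficiency curve), and many semi-passive units keep the fan off in that band. A trivial sketch:

```python
# PSUs are typically most efficient around 40-60% load (per 80 PLUS curves),
# and semi-passive models often keep the fan off in that band.
def load_fraction(draw_w, psu_rating_w):
    """Fraction of the PSU's rated capacity in use."""
    return draw_w / psu_rating_w

# 900 W peak draw on a 1600 W unit sits right in that band:
print(f"{load_fraction(900, 1600):.0%}")  # prints 56%
```

The same 900 W draw on a 1000 W unit would be at 90% load: hotter, louder, and less efficient.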
 

skypine27

Gawd
Joined
Apr 18, 2008
Messages
698
Was HDR enabled in Windows? Curious to know if that is why the option was grayed out in games, maybe Windows wasn't reporting that it was in HDR if the Windows HDR mode wasn't enabled and therefore games also didn't recognize it. That may be worth a shot to see if you get the real HDR going instead of "simulating" HDR with the monitor presets.

Hi fren, sorry for the slow reply, not on here much.

No, I did not enable the HDR toggle in Windows. That toggle seems misleading too; it's labeled something like "Windows HD Color: Play HDR games and apps". I don't use any apps at all, and as far as HDR games go, I believe it should be up to the game to "tell" the monitor it's HDR-capable and thus allow me to check the HDR box in the game settings.

I think it doesn't work for War Thunder because the AW monitor is not real HDR; it's just somehow half-ass simulated. It seems to work in some games - AC Odyssey works, etc. I don't know that I even have any other HDR games to test it with!
 

kasakka

[H]ard|Gawd
Joined
Aug 25, 2008
Messages
1,931
Hi fren, sorry for the slow reply, not on here much.

No, I did not enable the HDR toggle in Windows. That toggle seems misleading too; it's labeled something like "Windows HD Color: Play HDR games and apps". I don't use any apps at all, and as far as HDR games go, I believe it should be up to the game to "tell" the monitor it's HDR-capable and thus allow me to check the HDR box in the game settings.

I think it doesn't work for War Thunder because the AW monitor is not real HDR; it's just somehow half-ass simulated. It seems to work in some games - AC Odyssey works, etc. I don't know that I even have any other HDR games to test it with!

Games often have wildly different interpretations of whether HDR should be available or not. Some games will give you the option regardless of the Windows HDR toggle position, and others are like "no HDR here, boss" if it is off, which is totally the wrong way to go when most people would prefer to enable HDR on a case-by-case basis due to how awful it looks on the desktop unless you have an OLED.

You could download the Shadow of the Tomb Raider demo from Steam and try that. It has a pretty decent HDR implementation that is easy to see in the opening sequences. If I remember correctly, it requires the Windows HDR toggle to be on.

Hitman 2 and maybe Destiny 2 on the other hand let you turn on HDR regardless of the toggle. RDR2 will just hide the option completely unless the toggle is on.
 

Baasha

Weaksauce
Joined
Feb 23, 2014
Messages
107
Finally got my replacement monitor today! It took them almost THREE WEEKS to send me one.

Just set everything up and it works beautifully. I plugged it into my 900W UPS in the "Battery+Surge" section - can someone confirm that's the correct set of sockets to plug the monitor into? I plugged it in directly - here's to hoping the monitor doesn't fail. This is literally the 4th monitor now - three have failed on me - all seemingly power-related.

Anyway - what profiles do you guys recommend for work & gaming? I have it on "Standard" and it looks great. Also, in NVCP, I set 'Use NVIDIA Color Settings' and am using Highest (32-bit), 8bpc, RGB, and Full dynamic range - is this the optimal setting? (see pic)

Mostly, that shouldn't even be possible. You'd be stretching a lot with consumer components to even get 1000W at the wall. Obviously you'd want the PSU to be rated higher than that for a number of reasons, but actual wattage in the 1000W+ range should be very difficult to hit.

At my old place, I had a 20A breaker/circuit for my rig, and when I was running 4x Titan Xp OC'd along with the CPU OC'd, I regularly saw ~1500W at the wall (Kill-A-Watt), and that was just the PC - the peripherals were on a separate circuit altogether. When I used to run 4x 580 Classified (with dual PSUs), I saw 1850W at the wall on a regular basis.

With 2x GPUs OC'd now along with an OC'd CPU, when gaming and/or benchmarking, I see 1300 - 1400W at the wall (Kill-A-Watt).


i2WOhuJ.png
 
Joined
Apr 22, 2011
Messages
521
I found the included DisplayPort cable to be problematic, maybe because it's long (10ft if I remember correctly); I suspect the signal degrades, and sometimes the AW55 would lose signal. Not sure how signal integrity works, but maybe my Titan V had trouble sending a strong enough signal and maintaining the handshake. I resolved it with a short (6ft), robust DisplayPort cable from Amazon.

I think it was this one:
https://www.amazon.com/Accell-DP-1-...ords=displayport+accell&qid=1591712886&sr=8-3

I also wrapped it in Faraday Tape as well, overkill? This is [H]ard Forum right? :p

I don't want dropped frames due to dropped packets! ;p

https://www.amazon.com/gp/product/B07CRLCGCH/ref=ppx_yo_dt_b_asin_title_o00_s00?ie=UTF8&psc=1
 

skypine27

Gawd
Joined
Apr 18, 2008
Messages
698
Gents: Is there any way to kill the auto-dimming BS this monitor seems to do? It kicks in so fast that it even does it in-game, if you're sitting around not moving the mouse or keyboard (like a base-building RTS where you're waiting 5 minutes for some credits to rack up). It's annoying as hell. There is no OSD setting for it. I have monitor sleep disabled in the AW OSD (I just use the remote to turn off the power when I go AFK). But the auto-dimming really needs to go away.

DDC/CI defaults to on - do you think killing this might help?

Other than that, I really like the display still!
 
Joined
Apr 22, 2011
Messages
521
Gents: Is there any way to kill the auto-dimming BS this monitor seems to do? It kicks in so fast that it even does it in-game, if you're sitting around not moving the mouse or keyboard (like a base-building RTS where you're waiting 5 minutes for some credits to rack up). It's annoying as hell. There is no OSD setting for it. I have monitor sleep disabled in the AW OSD (I just use the remote to turn off the power when I go AFK). But the auto-dimming really needs to go away.

DDC/CI defaults to on - do you think killing this might help?

Other than that, I really like the display still!

Strange, I have not experienced any auto-dimming on my display. It'd be good to know if there is a setting that affects this so I don't enable it :yuck:

Is that with HDR on in the monitor settings?
 

skypine27

Gawd
Joined
Apr 18, 2008
Messages
698
Strange, I have not experienced any amount of auto-dimming on my display. Be good to know if there is a setting that affects this so I don't enable it :yuck:

Is that with HDR on in the monitor settings?
That's an interesting point, fren. Yes, in the monitor settings I have "Smart HDR" set to "Game" (which allows me to select HDR in some game settings, like Destiny 2 and AC: Odyssey). That might be it. (I do NOT have HDR turned on in the Windows display settings, however.)

I turned it off now and will see if it still auto-dims. Thx for the suggestion.

Edit: Nope, still does it. Shoot!
 
Last edited:
Joined
Apr 22, 2011
Messages
521
That's an interesting point, fren. Yes, in the monitor settings I have "Smart HDR" set to "Game" (which allows me to select HDR in some game settings, like Destiny 2 and AC: Odyssey). That might be it. (I do NOT have HDR turned on in the Windows display settings, however.)

I turned it off now and will see if it still auto-dims. Thx for the suggestion.

Let me know what you find.

I played with the monitor's HDR setting (I use Reference mode) and also enabled it in Windows, and found it to be useful because it enables 8-bit RGB 4:4:4 + dithering (considered equivalent in picture quality to 10-bit by some, including RTINGS). Without the HDR setting on, it only runs 8-bit RGB 4:4:4 without dithering. Useful for smoother color gradients. The Windows Display tab reflects this.
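The 8-bit + dithering point can be illustrated with a toy model: randomly alternating between the two nearest 8-bit values reproduces an in-between 10-bit level on average, which is why the combination is often judged visually close to true 10-bit. A minimal sketch (pure illustration; the monitor's actual dithering algorithm is not something I know):

```python
import random

# Toy model: dither an intermediate 10-bit value down to 8 bits.
# A 10-bit scale has 4 steps per 8-bit step, so the fractional part
# becomes the probability of rounding up. Averaged over many frames,
# the 8-bit output recovers the 10-bit level.
random.seed(0)

def dither_to_8bit(value10):
    base, frac = divmod(value10, 4)
    return base + (1 if random.random() < frac / 4 else 0)

samples = [dither_to_8bit(515) for _ in range(100_000)]
avg = sum(samples) / len(samples)
print(round(avg * 4))  # recovers the original 10-bit level, 515
```

Real panels do this spatially and temporally per pixel, but the averaging principle is the same.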

BTW, I highly recommend the cable and Faraday tape mentioned in post 475 above; I noticed significantly smoother gameplay with that combination than with the stock cable, which I believe is too long for the purpose. Dropped packets + error correction = frame skipping/lost FPS. Not to mention all the radio waves like 5G giving your sweet 4K 120Hz signal the Rona. :eek::D

"The transmission mode used by the DisplayPort main link is negotiated by the source and sink device when a connection is made, through a process called Link Training. This process determines the maximum possible speed of the connection. If the quality of the DisplayPort cable is insufficient to reliably handle HBR2 speeds for example, the DisplayPort devices will detect this and switch down to a lower mode to maintain a stable connection.[8](§2.1.1) The link can be re-negotiated at any time if a loss of synchronization is detected." - Wikipedia https://en.wikipedia.org/wiki/DisplayPort
 
Last edited: