Sweet Baby Jeebus Alienware 55" 4K120 OLED DisplayPort

One of the articles I read about this mentioned a couple of other new AW monitors dropping at the same time? Something like an AW3418DW refresh, maybe, and something else? Saw mention of 120Hz and 240Hz displays... Anybody know about those?
 
Ok I found this on another site:

"Other products include the $1,500 Alienware 34-inch Curved Gaming Monitor AW3420DW. It has an immersive 1900R curved, wide 21:9 WQHD-resolution monitor. Alienware’s first 34-inch WQHD 120Hz curved gaming monitor with fast IPS response time and IPS Nano Color technology that provides 98% DCI-P3 color coverage, along with a 120Hz (native) refresh rate and Nvidia G-Sync display technology that offers smooth and realistic images. It is available on August 28.

Alienware also has a $600 Alienware 27 Gaming Monitor AW2720HF. It has a 240Hz refresh rate and true one millisecond response time (gray-to-gray) in Extreme mode with AMD Radeon FreeSync technology. It is available in the U.S. on September 17.

Dell has the 32-inch Curved Gaming Monitor S3220DGF for $600. It has a curved 1800R screen that enhances field of vision and provides a wrap-around perspective. Gamers can experience refresh rates up to 165Hz for fast gameplay and see up to 77% more on-screen game content with QHD resolution compared to FHD. It debuts on August 28."

Hopefully the AW3420DW will allow overclocking to 140Hz+ and have HDR support. Might be a good upgrade from the AW3418DW when those crazy sales hit.
 
I called it in another thread that this, as an HDR OLED, wouldn't happen for PCs.
PCs will kill an HDR OLED over 400 nits.
I don't think it's a good idea even without HDR unless brightness is kept down.
It will look great for a while :)
I bet the peak brightness of 400 nits is for a small window for a short period of time; no way it will be full screen.
Why would a PC be any different from a game console? Plenty of movies and games can reach or get close to the 700-nit peak brightness in small areas without an issue. 400 nits for a 3% white window is absolutely pathetic.
 
Why would a PC be any different from a game console? Plenty of movies and games can reach or get close to the 700-nit peak brightness in small areas without an issue. 400 nits for a 3% white window is absolutely pathetic.
My comment is about what would happen if the display did have HDR.
You can do a lot more with a PC at the same time; it is not as safe.
There will be PC users who won't realise how fragile OLEDs are with HDR, and others who want to push it hard, not knowing what will happen.
e.g. HDR mode can be left/forced on. Simulated HDR can be left on. An HDR TV program or movie can be left paused. An HDR game can be left paused ...
 
My comment is about what would happen if the display did have HDR.
You can do a lot more with a PC at the same time; it is not as safe.

Source? This really doesn't make any sense. All HDR content, including video games, is professionally graded, and it is rarely brighter on average than SDR content; it's typically graded to average around 100 nits. The difference is that the highest and lowest points in a scene can be much higher and lower respectively, that's all.

Also, HDR content itself is still quite rare, so even using your display in HDR mode would not happen every day. Putting the Windows desktop in HDR mode does not produce extreme brightness peaks or anything like that (nor should it). I've never seen an HDR game that uses extremely high brightness in the UI, the only static elements; it would be pretty stupid if it did. Plus, basically all HDR games are console ports that are already widely played on HDR TVs using consoles.

I can't think of any evidence for HDR being 'more dangerous' on PCs in any real way. The real danger is that on a PC display, people are going to spreadsheet on it, but that doesn't have anything to do with HDR.
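For reference, those "average vs. peak" numbers fall straight out of how HDR10 encodes brightness with the PQ (SMPTE ST 2084) curve. Here's a minimal Python sketch of the PQ EOTF; the constants come from the published ST 2084 spec, and the sample code values are just illustrative:

```python
# Minimal PQ (SMPTE ST 2084) EOTF sketch: 10-bit code value -> nits.
# Constants are taken from the ST 2084 specification.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: int, bits: int = 10) -> float:
    """Convert a PQ-encoded code value to absolute luminance in cd/m^2."""
    e = code / (2 ** bits - 1)      # normalize code value to [0, 1]
    ep = e ** (1 / M2)
    return 10000 * (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)

# Illustrative samples: mid-range code values sit near 100 nits, while
# only the top of the range reaches the extreme highlight luminances.
for code in (0, 512, 769, 1023):
    print(f"code {code:4d} -> {pq_to_nits(code):8.1f} nits")
```

Half the 10-bit code range lands around 90 nits and code 769 is already about 1,000 nits, which is why "graded to average 100 nits" is entirely compatible with 700-nit highlights.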
 
Source? This really doesn't make any sense. All HDR content, including video games, is professionally graded, and it is rarely brighter on average than SDR content; it's typically graded to average around 100 nits. The difference is that the highest and lowest points in a scene can be much higher and lower respectively, that's all.

Also, HDR content itself is still quite rare, so even using your display in HDR mode would not happen every day. Putting the Windows desktop in HDR mode does not produce extreme brightness peaks or anything like that (nor should it). I've never seen an HDR game that uses extremely high brightness in the UI, the only static elements; it would be pretty stupid if it did. Plus, basically all HDR games are console ports that are already widely played on HDR TVs using consoles.

I can't think of any evidence for HDR being 'more dangerous' on PCs in any real way. The real danger is that on a PC display, people are going to spreadsheet on it, but that doesn't have anything to do with HDR.
Try quoting the rest of my post; it answers you.
 
Try quoting the rest of my post; it answers you.

Your post is just a list of your hypothetical fantasies, with zero evidence that any of those things increases the risk of burn-in. Good luck with that; you're not worth it.
 
I'm not surprised about the price, just disappointed. And I probably would still buy it if it had all the features the LG TVs do. But lacking HDR support, old HDMI ports, etc., it's not going to happen at that price.
 
Glad I bought the C9. When HDMI 2.1 GPUs come out next year, I'll grab one and do 4K120 then. I'll live with 1440p@120 for now. That price ...
 
The tech specs they released are so stupid. Who the fuck came up with this shit? They put the wrong version of DisplayPort on there, and WTF is up with the "0.5 ms response rate"?? OLED pixel response is more like 0.1 ms. They're just making shit up and it's completely inaccurate. It gives me hope that it does actually support HDR. But I really doubt I'll buy one even if it does at this point.
 
Who the hell is this monitor being marketed to?

This shouldn't have been marketed under the Alienware brand. That design aesthetic is for teenage gamers. And apparently rich teenage gamers, with that whopper of a price tag. It would have made much better sense to market it under the Dell brand as a UP monitor, ditch that bipolar color scheme, and class it up a bit. At least then I could understand professionals going for it, 1) because they can actually afford it and 2) because professionals would be okay with (maybe even prefer?) non-HDR DCI-P3, since most calibrate to 120 cd/m² anyway.

But for that price tag, this really needed to come with HDMI 2.1. Huge fail. It just baffles me. Who the hell is going to buy one of these over a C9? And we're heading into fall soon. That means deep discounts on LG's 2019 models, and we're just four months away from CES 2020.

They'll sell maybe 10 of these. All of them to rich YouTube/Twitch streamers.
 
Well, as solely a computer monitor, considering both DP 1.4 and HDMI 2.1 can do 4K/120, HDMI 2.1 in addition to DP 1.4 doesn't add much. Unless you want to hook up a game console, etc. It would be nice to have the flexibility, but it's not a deal breaker.
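To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The CVT-R2 reduced-blanking totals (~3920x2222 pixels per frame) are an assumption since exact timings vary by display, and the link overheads are the nominal 8b/10b (DP 1.4) and 16b/18b (HDMI 2.1 FRL) coding figures:

```python
# Rough check: which links carry 4K/120 RGB uncompressed?
# Assumed CVT-R2 reduced-blanking frame totals for 3840x2160 (an assumption).
H_TOTAL, V_TOTAL = 3920, 2222

DP14_GBPS   = 32.4 * 8 / 10    # HBR3 after 8b/10b coding  ~= 25.92 Gbps
HDMI21_GBPS = 48.0 * 16 / 18   # FRL after 16b/18b coding  ~= 42.67 Gbps

def required_gbps(hz: int, bits_per_channel: int) -> float:
    """Uncompressed RGB data rate including blanking, in Gbps."""
    return H_TOTAL * V_TOTAL * hz * bits_per_channel * 3 / 1e9

for bpc in (8, 10):
    need = required_gbps(120, bpc)
    print(f"4K/120 RGB {bpc}-bit: ~{need:.1f} Gbps | "
          f"DP 1.4: {'ok' if need <= DP14_GBPS else 'needs DSC/4:2:0'} | "
          f"HDMI 2.1: {'ok' if need <= HDMI21_GBPS else 'no'}")
```

By this math, DP 1.4 fits 4K/120 at 8-bit RGB uncompressed (~25.1 of 25.92 Gbps), while 10-bit HDR at 4K/120 (~31.4 Gbps) would need DSC or chroma subsampling; that's about the only place HDMI 2.1's extra headroom matters for a pure PC monitor.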
 
Glad I bought the C9. When HDMI 2.1 GPUs come out next year, I'll grab one and do 4K120 then. I'll live with 1440p@120 for now. That price ...

I'm holding off on the C9 until rtings can test what the VRR range is; they can't test it since no HDMI 2.1 source currently exists (*glares at NVIDIA*). I'm hoping it's much wider than the Samsung Q9 series' Freesync implementation (48-60Hz). I'll stick with my B6 for now.
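For context on why the window width matters: VRR only helps when the frame rate lands inside the panel's refresh window, and low-framerate compensation (repeating frames) can only extend coverage downward when the window spans at least 2x, which is how AMD's LFC requirement is usually stated. A minimal sketch, with the window figures just being the ones quoted above:

```python
# Sketch: which framerates a VRR window can actually serve.
# LFC-style frame repetition only works when hi >= 2 * lo; then every
# fps below lo has some integer multiple landing inside [lo, hi].
def vrr_covers(fps: float, lo: float, hi: float) -> bool:
    if lo <= fps <= hi:
        return True                      # natively inside the window
    return fps < lo and hi >= 2 * lo     # covered via frame repetition

for window in ((48, 60), (48, 120)):
    covered = [fps for fps in range(20, 121, 20) if vrr_covers(fps, *window)]
    print(f"{window[0]}-{window[1]}Hz window covers fps: {covered}")
```

A 48-60Hz window only covers frame rates that already sit in that narrow band, while a 48-120Hz window covers everything from the low 20s up via frame doubling.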

Well, as solely a computer monitor, considering both DP 1.4 and HDMI 2.1 can do 4K/120, HDMI 2.1 in addition to DP 1.4 doesn't add much. Unless you want to hook up a game console, etc. It would be nice to have the flexibility, but it's not a deal breaker.

The problem is when it's used with next-gen game consoles, which will almost certainly utilize HDMI 2.1.

My main argument is: why get this over something like an LG C9, which offers all this plus HDMI 2.1, VRR, and HDR, for over $2,000 less?
 
VRR has never worked over HDMI with NVIDIA cards; DisplayPort only. Now, yes, the HDMI 2.1 feature set includes VRR, but that doesn't mean NVIDIA has to support it and hurt their own G-Sync brand. Only time will tell. But being stuck at 4K/120 with NO VRR would really suck. At least with the Alienware you are guaranteed working VRR.

As for game consoles, I'm not sure the player base splurging $4K on a gaming monitor is into them much.
 
VRR has never worked over HDMI with NVIDIA cards; DisplayPort only. Now, yes, the HDMI 2.1 feature set includes VRR, but that doesn't mean NVIDIA has to support it and hurt their own G-Sync brand. Only time will tell. But being stuck at 4K/120 with NO VRR would really suck. At least with the Alienware you are guaranteed working VRR.

As for game consoles, I'm not sure the player base splurging $4K on a gaming monitor is into them much.

Considering HDMI Forum VRR is basically NVIDIA G-Sync, I have a suspicion NVIDIA will be supporting it.
 
To suggest people can just play old games to obtain 120Hz @ 4K is ... idiotic. It's skirting the truth of it all. It's underhanded; it's ducking and weaving around the reality of it all. Fake news. Just stop.
This is complete bullshit. You're making one extremely narrow-minded assumption here: that people are starting with the end goal of playing something, anything, at 4K 120Hz.

This is simply not the case. I don't think "hmm, I need to get 120Hz while playing a game at 4K… how can I make that happen? Oh, I know! Old games!" This isn't what's happening.

I am already playing old games! I enjoy older titles, as far back as the late 2000s at least. That has nothing to do with what my graphics card can output or what my screen can display. That's just what I enjoy. Is it all I enjoy? No, of course I play new titles too, but the old ones are somewhat omnipresent.

It's more like I think "hmm, I'm sitting here rendering this game at 160fps at 4K, but my monitor can't show it. Wouldn't it be nice if it could?" This is what's going on. I'm not deluding myself into believing a reality that doesn't exist. I just know what I play. I just know what my display needs to do. You're the one who seems to fail to grasp the concept of people enjoying older things.
 
VRR has never worked over HDMI with NVIDIA cards; DisplayPort only. Now, yes, the HDMI 2.1 feature set includes VRR, but that doesn't mean NVIDIA has to support it and hurt their own G-Sync brand. Only time will tell. But being stuck at 4K/120 with NO VRR would really suck. At least with the Alienware you are guaranteed working VRR.

As for game consoles, I'm not sure the player base splurging $4K on a gaming monitor is into them much.

With the latest announcements from nVidia/LG, it looks like VRR over HDMI is a go. Super bad news for this $4k Alienware monitor ... it's off to the ice floe.
 
With the latest announcements from nVidia/LG, it looks like VRR over HDMI is a go. Super bad news for this $4k Alienware monitor ... it's off to the ice floe.

Interesting how the LG OLEDs make Nvidia's official G-Sync Compatible list but the Alienware isn't listed:

https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

Curious if this $4k POS can't even make the list due to quality issues, or if Nvidia just hasn't gotten around to it yet. You would think this would make the list before LG's OLEDs, since it's such a "gamer" monitor.
 
Interesting how the LG OLEDs make Nvidia's official G-Sync Compatible list but the Alienware isn't listed:

https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

Curious if this $4k POS can't even make the list due to quality issues, or if Nvidia just hasn't gotten around to it yet. You would think this would make the list before LG's OLEDs, since it's such a "gamer" monitor.

I'm sure it'll make the list, but as it's not actually available yet, it's no surprise it's not there ...
 
Interesting how the LG OLEDs make Nvidia's official G-Sync Compatible list but the Alienware isn't listed:

https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

Curious if this $4k POS can't even make the list due to quality issues, or if Nvidia just hasn't gotten around to it yet. You would think this would make the list before LG's OLEDs, since it's such a "gamer" monitor.
Supposedly the entire G-SYNC certification process is pretty comprehensive and takes a lot of validation.
 
There are still some displays that haven't made it onto that list, like the new HP Omen X 27 240Hz 1440p display, which even has a G-Sync logo on the front of it. Even though it's a FreeSync 2 display, it's not on the list.
 
Interesting how the LG OLEDs make Nvidia's official G-Sync Compatible list but the Alienware isn't listed:

https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

Curious if this $4k POS can't even make the list due to quality issues, or if Nvidia just hasn't gotten around to it yet. You would think this would make the list before LG's OLEDs, since it's such a "gamer" monitor.

From what I've gotten out of LG, it sounds like what is really happening is that NVIDIA is implementing HDMI Forum VRR over an HDMI 2.0 connection, allowing LG's TVs (which already support HDMI Forum VRR via HDMI 2.1) to use the feature for now, until HDMI 2.1 GPUs hit the market. That would explain why *only* LG is getting this feature, as no other TVs (aside from Samsung's, which went for Freesync over HDMI) support HDMI 2.1 at present.

But yes, the Alienware now has no market; there's no justification for choosing it over an LG C9.
 
From what I've gotten out of LG, it sounds like what is really happening is that NVIDIA is implementing HDMI Forum VRR over an HDMI 2.0 connection, allowing LG's TVs (which already support HDMI Forum VRR via HDMI 2.1) to use the feature for now, until HDMI 2.1 GPUs hit the market. That would explain why *only* LG is getting this feature, as no other TVs (aside from Samsung's, which went for Freesync over HDMI) support HDMI 2.1 at present.

But yes, the Alienware now has no market; there's no justification for choosing it over an LG C9.

The Alienware can do 4K 120Hz; you can't do that with the LG yet. It's the only OLED that can until new cards come out.

I'm going to get an LG and run 1080p 120Hz in games and 4K 60 for desktop until HDMI 2.1 cards come out.
 
If they'd priced this right, they could have practically had the PC OLED market to themselves. There is obviously no way they can justify a $2K price hike for a DisplayPort 1.4 connection, so this is nothing but greed, and while I'm sure some people will buy this thing, it's practically DOA at the price they're asking.
 
The Alienware can do 4K 120Hz; you can't do that with the LG yet. It's the only OLED that can until new cards come out.

I'm going to get an LG and run 1080p 120Hz in games and 4K 60 for desktop until HDMI 2.1 cards come out.

To be fair, by NVIDIA's release cadence they should have the 3000 series around March. Can't speak for when AMD will support HDMI 2.1, though. And I know a few DP 1.4-to-HDMI 2.1 converters are on the way that, at the very least, would support all the resolutions DP 1.4 can handle.

I'm holding off on the C9 in the short term until rtings actually reviews its VRR. But I'm almost certainly getting a C9/C10 whenever HDMI 2.1 GPUs hit.
 
If they'd priced this right, they could have practically had the PC OLED market to themselves. There is obviously no way they can justify a $2K price hike for a DisplayPort 1.4 connection, so this is nothing but greed, and while I'm sure some people will buy this thing, it's practically DOA at the price they're asking.

BUT, BUT GAMEOOR RGB LIGHTS ON THE BACK!!!!!!
 
If they'd priced this right, they could have practically had the PC OLED market to themselves. There is obviously no way they can justify a $2K price hike for a DisplayPort 1.4 connection, so this is nothing but greed, and while I'm sure some people will buy this thing, it's practically DOA at the price they're asking.
No point in buying an OLED without HDR. That was their biggest mistake. I'd frankly buy a BFGD over this, price being equal.
 
The Alienware can do 4K 120Hz; you can't do that with the LG yet. It's the only OLED that can until new cards come out.

I'm going to get an LG and run 1080p 120Hz in games and 4K 60 for desktop until HDMI 2.1 cards come out.

Remember, you can do 1440p @ 120Hz on the C9.

The only reason I haven't bought the C9 yet is because I'm waiting for the official G-Sync firmware/driver first, and possibly holding off until next year due to the eARC issue on the C9s. This may not be resolved with just a firmware update and may need next year's model (it doesn't do uncompressed 5.1 or 7.1 sound over eARC, which shouldn't be happening and is quite a bummer).
 
https://www.cnet.com/reviews/alienware-aw5520qf-monitor-review/

meh review, but it's somethin'. Not sure if it has HDR...

Has some questionable statements:

https://www.cnet.com/reviews/alienware-aw5520qf-monitor-review/

"If you're familiar with Dell monitors, you'll recognize the Smart HDR modes -- in this case, game, movie, desktop and reference -- which report to Windows whether it can toggle its HDR settings. If you've chosen the game setting, Windows can enable it for games or wide-color gamut. If you want it for movies, you'll have to enable that setting instead."

"In fact, G-Sync support is slated to come to LG's TVs soon, thanks to the company's rollout of HDMI 2.1 in its 2019 TVs and Nvidia's imminent firmware upgrade to 2.1 for its RTX graphics cards."



Make this a 30-40" monitor, add G-Sync Ultimate and I'll pay the early adopter tax.
If the firmware upgrade thing is true, or even possible, then LG is the way to go.
 
https://www.cnet.com/reviews/alienware-aw5520qf-monitor-review/

meh review, but it's somethin'. Not sure if it has HDR...

Has some questionable statements:

https://www.cnet.com/reviews/alienware-aw5520qf-monitor-review/

"If you're familiar with Dell monitors, you'll recognize the Smart HDR modes -- in this case, game, movie, desktop and reference -- which report to Windows whether it can toggle its HDR settings. If you've chosen the game setting, Windows can enable it for games or wide-color gamut. If you want it for movies, you'll have to enable that setting instead."

"In fact, G-Sync support is slated to come to LG's TVs soon, thanks to the company's rollout of HDMI 2.1 in its 2019 TVs and Nvidia's imminent firmware upgrade to 2.1 for its RTX graphics cards."



Make this a 30-40" monitor, add G-Sync Ultimate and I'll pay the early adopter tax.
If the firmware upgrade thing is true, or even possible, then LG is the way to go.

A bit late to the party. The G-Sync driver update was confirmed a while back already. RTX cards will be limited to 60Hz VRR on the LG 2019 sets, but the main point is that Nvidia has confirmed HDMI VRR support, so once they launch cards with HDMI 2.1 we will have full 4K 120Hz VRR over HDMI.
 
Make this a 30-40" monitor, add G-Sync Ultimate and I'll pay the early adopter tax.
If ANYONE releases a 30”-43” OLED display, I’d pay the early adopter tax. I’m still holding out hope that LG not only has a 48” planned for next year, but a 43” too. At that size, I’d be able to use it as a monitor, which is my main use case. High refresh for games would be much appreciated, but it’s not necessary for me.
 
A bit late to the party. The G-Sync driver update was confirmed a while back already. RTX cards will be limited to 60Hz VRR on the LG 2019 sets, but the main point is that Nvidia has confirmed HDMI VRR support, so once they launch cards with HDMI 2.1 we will have full 4K 120Hz VRR over HDMI.

Must have missed the bolded text, so let me help you out with what was emphasized: Nvidia's imminent firmware upgrade to 2.1 for its RTX graphics cards.
 
I'm on here every day and haven't heard anything about an HDMI 2.1 firmware upgrade. News to me, at least.
 