Sweet Baby Jeebus Alienware 55" 4k120 OLED Displayport

Looks like Acer will release their own version of this monitor at CES 2020:

https://www.anandtech.com/show/1528...g-with-predator-cg552k-55inch-4k-oled-monitor

Don't get too excited. It's still gimped like the Alienware: 400 nits and no HDR.

HDR is a gimmick and never looks natural; you might as well use the torch mode big-box stores use. I don't use over 150 cd/m², and this thing can do 400 cd/m².

They are selling it for $3k; the AW55 can be had for $2,659.99 before tax.

https://www.dell.com/en-us/shop/new...0qf/apd/210-auds/monitors-monitor-accessories
 
HDR is a gimmick and never looks natural; you might as well use the torch mode big-box stores use. I don't use over 150 cd/m², and this thing can do 400 cd/m².

They are selling it for $3k; the AW55 can be had for $2,659.99 before tax.

https://www.dell.com/en-us/shop/new...0qf/apd/210-auds/monitors-monitor-accessories

HDR on an OLED display looks fantastic for both movies and games. The Acer version of this display is dead before it's released since HDMI 2.1 GPUs are getting released this year. Literally the only thing going for these displays is a DP 1.4 connector.

These screens are just a bad purchase when a 55" LG C9 is over a thousand bucks less and a smaller 48" C10 is coming this year.
 
HDR is a gimmick and never looks natural; you might as well use the torch mode big-box stores use. I don't use over 150 cd/m², and this thing can do 400 cd/m².

They are selling it for $3k; the AW55 can be had for $2,659.99 before tax.

https://www.dell.com/en-us/shop/new...0qf/apd/210-auds/monitors-monitor-accessories

True HDR is not a gimmick at all. The hardware being sold to a lot of people, and how it's implemented in software, can be, though.

A lot of people just don't realize how HDR is supposed to work. Partly that's because they don't understand it properly, but it's also because it's been poorly implemented in a lot of games and media playback setups, which let you change the white point/gamma. On top of that (and perhaps because of it), most "HDR" screens people own so far remap the HDR range out of its original absolute values, since they are incapable of covering the full HDR1000 scale.


If you follow HDTVTest on YouTube, he regularly shows examples of how detail is lost when a display's color brightness limits (color volume height limits) are reached: colors either clip to white or roll down to the best color brightness the screen can reach, losing gradation. Any detail past that point is lost, leaving a uniform area.

People need to realize that keeping color throughout the brightness range (colored highlights and details, a few colored light sources, reflections, etc., while the rest of the scene generally stays in SDR ranges) is not the same as taking an SDR screen's relative brightness "knob" and turning the whole screen to 11.

HDR is highlights. The average scene brightness is still quite low, in fact some people complain that it's too low since it uses absolute values designed for a home theater dim to dark viewing environment. HDR is not like SDR where the whole scene's brightness is relative and turned up or down.

An 800 nit or higher peak is impossible on OLEDs outside of a 10% window of highlights, and even then only before ABL kicks in.

A C9 OLED's HDR can only do:
Fullscreen peak/burst: 301 nits ... Fullscreen sustained: 286 nits
50% peak/burst: 530 nits ... 50% sustained: 506 nits
25% peak/burst: 845 nits ... 25% sustained: 802 nits
10% peak/burst: 855 nits ... 10% sustained: 814 nits

A Samsung Q90's HDR, for comparison, since it's LED LCD and very bright:
Fullscreen peak/burst: 536 nits ... Fullscreen sustained: 532 nits
50% peak/burst: 816 nits ... 50% sustained: 814 nits
25% peak/burst: 1275 nits ... 25% sustained: 1235 nits
10% peak/burst: 1487 nits ... 10% sustained: 1410 nits
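To put the ABL point in perspective, here's a quick Python sketch (my own arithmetic, using only the numbers quoted above) comparing how much fullscreen output each display retains relative to its 10%-window peak:

```python
# Peak/burst luminance in nits by window size, from the measurements above.
c9  = {"100%": 301, "50%": 530, "25%": 845, "10%": 855}    # LG C9 OLED
q90 = {"100%": 536, "50%": 816, "25%": 1275, "10%": 1487}  # Samsung Q90 LCD

def abl_retention(nits):
    """Fullscreen brightness as a percentage of the 10%-window peak."""
    return round(100 * nits["100%"] / nits["10%"])

print(abl_retention(c9))   # 35: the C9 keeps ~35% of its peak fullscreen
print(abl_retention(q90))  # 36: the Q90 keeps a similar fraction, from a higher ceiling
```

Interestingly, both displays hold on to roughly a third of their small-window peak fullscreen; the LCD just starts from a much higher ceiling.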

......

HDR 10,000 is the benchmark for the most realism. Even HDR 1000 as a content set point is fractional HDR. There are a lot of HDR 1000 UHD discs, a bunch at HDR 4000, and only one or two AFAIK mastered at HDR 10,000, because no one has the hardware to play them at those levels yet. Some games can also technically do HDR 10,000 when tested with color maps.


https://www.theverge.com/2014/1/6/5276934/dolby-vision-the-future-of-tv-is-really-really-bright

"The problem is that the human eye is used to seeing a much wider range in real life. The sun at noon is about 1.6 billion nits, for example, while starlight comes in at a mere .0001 nits; the highlights of sun reflecting off a car can be hundreds of times brighter than the vehicle’s hood. The human eye can see it all, but when using contemporary technology that same range of brightness can’t be accurately reproduced. You can have rich details in the blacks or the highlights, but not both.


So Dolby basically threw current reference standards away. "Our scientists and engineers said, ‘Okay, what if we don’t have the shackles of technology that’s not going to be here [in the future],’" Griffis says, "and we could design for the real target — which is the human eye?" To start the process, the company took a theatrical digital cinema projector and focused the entire image down onto a 21-inch LCD panel, turning it into a jury-rigged display made up of 20,000 nit pixels. Subjects were shown a series of pictures with highlights like the sun, and then given the option to toggle between varying levels of brightness. Dolby found that users wanted those highlights to be many hundreds of times brighter than what normal TVs can offer: the more like real life, the better."

HLG is still relative, which isn't the truest form of HDR; the truest form uses absolute values.

PQ (Perceptual quantization) gamma curve is based on the characteristics of human visual perception.

HLG (Hybrid Log-Gamma) is sort of a sloppy way to make quasi-HDR work with inadequate SDR-based systems, live unencoded video feeds, and lower quality quasi-HDR hardware. There's no reason games should be using the makeshift relative version, other than that most of the displays people use for HDR material are inadequate in the first place.
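For anyone curious what "absolute vs. relative" actually means in code, here's a minimal Python sketch of the two transfer functions, using the published SMPTE ST 2084 (PQ) and ITU-R BT.2100 (HLG) constants. PQ maps a signal value straight to an absolute luminance in nits; HLG only maps scene light to a relative signal, and the display decides how bright that ends up:

```python
import math

# SMPTE ST 2084 (PQ) EOTF constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal):
    """PQ signal in [0, 1] -> absolute luminance in nits (peak 10,000)."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

# ITU-R BT.2100 HLG OETF constants.
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(light):
    """HLG relative scene light in [0, 1] -> signal in [0, 1]; no nits anywhere."""
    if light <= 1 / 12:
        return math.sqrt(3 * light)
    return A * math.log(12 * light - B) + C

print(round(pq_eotf(1.0)))      # 10000: full-scale PQ is 10,000 nits by definition
print(round(hlg_oetf(1.0), 3))  # 1.0: full-scale HLG is just "maximum", relative
```

Note that pq_eotf(1.0) is 10,000 nits by definition, which is why HDR 10,000 keeps coming up as the full scale, while HLG has no nit values anywhere in its math.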

https://www.eizo.com/library/management/ins-and-outs-of-hdr/index2.html/
 
HDR on an OLED display looks fantastic for both movies and games. The Acer version of this display is dead before it's released since HDMI 2.1 GPUs are getting released this year. Literally the only thing going for these displays is a DP 1.4 connector.

These screens are just a bad purchase when a 55" LG C9 is over a thousand bucks less and a smaller 48" C10 is coming this year.

If you're into torch mode, which the masses are.

In its current implementation HDR is a gimmick; people are just dazzled by the unrealistically bright display because they are like bugs and fly into the light.

TVs still have more input lag than dedicated monitors, so no, they won't be the same. Close, though.

Some pay for no compromise and no waiting, because time is more valuable than money depending on the person. I work long hours and don't settle on anything I purchase, though some will. To each their own. I have a C9 and it's cool, but I can tell the difference in response with a no-input-lag monitor.

Also, who knows when GPUs with HDMI 2.1 will release; you've gotta pay for one of those to use the C9 to the fullest. Some don't have to pay because they already have a capable GPU, just without HDMI 2.1.
 
If you're into torch mode, which the masses are.

In its current implementation HDR is a gimmick; people are just dazzled by the unrealistically bright display because they are like bugs and fly into the light.

TVs still have more input lag than dedicated monitors, so no, they won't be the same. Close, though.

Some pay for no compromise and no waiting, because time is more valuable than money depending on the person. I work long hours and don't settle on anything I purchase, though some will. To each their own. I have a C9 and it's cool, but I can tell the difference in response with a no-input-lag monitor.

Also, who knows when GPUs with HDMI 2.1 will release; you've gotta pay for one of those to use the C9 to the fullest. Some don't have to pay because they already have a capable GPU, just without HDMI 2.1.

LG C9 has an input lag of 13 ms at 60 Hz and 6.8 ms at 120 Hz. That's less than a frame at either refresh rate. Total non-issue.
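The "less than a frame" claim checks out with simple arithmetic (a quick sketch, nothing display-specific):

```python
# How long one frame lasts at a given refresh rate, in milliseconds.
def frame_ms(hz):
    return 1000 / hz

# Quoted C9 input lag vs. one frame's duration:
print(round(frame_ms(60), 2))   # 16.67 ms per frame at 60 Hz; measured lag 13 ms
print(round(frame_ms(120), 2))  # 8.33 ms per frame at 120 Hz; measured lag 6.8 ms
```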

HDR is made to allow a display to have output closer to real life. Real life can have extremely dark as well as extremely bright areas, like looking at a car's headlights during the night. In HDR content you won't have the extreme brightness most of the time; instead you enjoy the wider variety of light levels it is able to represent.

You are free to spend your money on whatever you want, but I can't justify the kind of price increase they try to ask for a DP connector.
 
If you're into torch mode, which the masses are.

In its current implementation HDR is a gimmick; people are just dazzled by the unrealistically bright display because they are like bugs and fly into the light.

You keep posting, and with every single post you sound more like you don't know what you're talking about. There is no such thing as "torch mode" for HDR, because HDR uses the full brightness capability of the display to begin with. It needs to, because REAL LIFE IS NOT LIMITED TO 100 NITS. When you look outside at night, the lights out there are much brighter than anything possible in SDR content.

It sounds like you do not understand the difference between high APL and contrast. HDR is about contrast, and to have real-world levels of contrast you need both high and low brightness in the same image; that's what HDR provides.

Increasing the brightness of the whole image ('torch mode'), as in turning your TV's brightness up in SDR mode, doesn't have anything to do with HDR.

You might as well get used to HDR because everything will be HDR in 10 years and SDR will be a bad memory just like 480p.
 
HDR benefits are lost on some people when they only see it on poor displays or a badly set up system.
I saw my Dad's new lower-end Samsung QLED TV at Christmas and its HDR 1000 implementation was very poor; SDR colour reproduction was terrible too. I was shocked how bad it was.
Spot on, guys, for putting him straight about how good it can be.

Stryker7314, go have a look at HDR on a Samsung Q9FN, the newer Samsung TVs, or the latest OLEDs.
Ask for the demo torch mode to be turned off, and view it in a darkened room for best effect.
If they don't sway you then perhaps it's not for you.
I use a Q9FN and HDR is amazing at up to 2000 nits. It's so good I use the fake HDR mode (HDR+) for normal TV with brightness dialed down to normal; it is great for boosting contrast only on very bright objects without overblowing the image. It gives a very CRT-like image.
 
I have had OLED, and with it HDR, for 4 years, and HDR is lame. I own a calibrated C6, C9, FW900, and a JVC projector.

The gimmick is obviously working. Not everyone buys it, but you do, therefore it's made for folks like you. It's oversaturated, toony colors and unrealistic contrast. It caters to folks who like fantastical imagery instead of realistic imagery.
 
I have had OLED, and with it HDR, for 4 years, and HDR is lame. I own a calibrated C6, C9, FW900, and a JVC projector.

The gimmick is obviously working. Not everyone buys it, but you do, therefore it's made for folks like you. It's oversaturated, toony colors and unrealistic contrast. It caters to folks who like fantastical imagery instead of realistic imagery.
I wasn't referring to old OLED.
Projectors can't do HDR well.
It's only oversaturated, toony colours with bad equipment or a bad setup.
Perhaps you aren't doing it right.
 
It's oversaturated, toony colors and unrealistic contrast. It caters to folks who like fantastical imagery instead of realistic imagery.

The thing you don't seem to understand is that what you keep calling "unrealistic" is in fact the contrast and dynamic range you see in the actual real world. It seems very strange to vehemently insist that 6 stops of dynamic range and sRGB are all that exist in the world, as any photographer, filmmaker, or even YouTube content creator knows otherwise. But I've seen stranger.

Maybe you have some visual cognitive or eye problems, I dunno.
 
Some people drink the marketing kool-aid, some of us see through that bs, maybe you guys are part of the sheep masses it's marketed to?

Looks like neither of you speak from experience because you refer to everything but your own experience. Go get OLED first, then we'll talk.
 
Some people drink the marketing kool-aid, some of us see through that bs, maybe you guys are part of the sheep masses it's marketed to?

Looks like neither of you speak from experience because you refer to everything but your own experience. Go get OLED first, then we'll talk.
You are blind to what is already written.
I speak from experience and enjoy HDR because it is very good.
Your situation is unfortunate, not ours.
 
I own a C9. A good HDR film/show is a much better visual experience than SDR, and that's running the display in Cinema mode in a dark room, which is the lowest APL mode for HDR but also the best contrast.
 
It's in, running at 3840x1600 1:1 pixel mapping at 120 Hz for that ultrawide goodness and periphery advantage. FreeSync is working great on the Titan V.

No input latency relative to the C9. AW55 is instant.

Also tested it on TestUFO and it has less motion blur than the C9. Looks like Dell tweaked it specifically for gaming.

OLED, Ultrawide, 120hz, VRR.

Endgame. Blouses.
 
The C9 and AW55 have the same panel, so there's no difference in motion blur. OLED pixels are naturally insanely fast and there are no "overdrive" settings to tweak like on LCDs. You can't test/run a C9 at native 4K/120 Hz yet to do an apples-to-apples comparison anyway.

The C9 at 1440p/120 Hz was measured at 6.8 ms input lag. The fastest hypothetical zero-input-lag reading you could get at the center of the display, due to the 120 Hz refresh, is 4.16 ms. 2-3 ms is well within placebo range, and that assumes the AW55 has absolutely zero input lag, which is impossible.
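For reference, the 4.16 ms figure is just half of a 120 Hz refresh period: a lag tester measuring at screen center can never read lower than that, because the center isn't scanned out until halfway through the frame. A quick sketch of that arithmetic:

```python
# Lowest lag a center-of-screen tester could ever report on a zero-lag
# display: half of one refresh period, since the center pixel isn't
# scanned out until halfway through the frame.
def min_center_lag_ms(hz):
    return 1000 / hz / 2

floor_120 = min_center_lag_ms(120)
print(round(floor_120, 2))        # 4.17 ms at 120 Hz
print(round(6.8 - floor_120, 2))  # 2.63 ms of actual processing lag on the C9
```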
 
Sounds good. I'm holding out for the 48" CX and a 3080ti at some point personally.

I've also dipped into VR with an accessorized Oculus Quest this Xmas and found the best games pretty mind-blowing experiences, so I'm definitely pushing into that gaming arena now. Keeping an eye on the new Pimax 8K, and definitely interested in what VR headsets come out on more powerful standalone chips (yet still able to connect to a more powerful PC) in 2022-23 (e.g. Quest 2 and competitors with more ergonomic glasses-style designs planned).

This is a blurb about the new 48" CX in regard to gaming. The whole article is a pretty good read, even if it's just an in-person impression with some data points:
https://www.forbes.com/sites/johnarcher/2020/01/06/lg-oled65cx-oled-tv-first-impressions/

I also got to spend a few minutes just before publishing this article checking out a 48-inch OLED48CX running a PC racing game at 120fps (something LG’s new OLED TVs can support from HDMI 2.0 sources this year; you don’t need an HDMI 2.1 source like you did last year). And the results looked stunning, as silky smooth motion joined forces with the extra sharpness you get from squeezing a 4K resolution into a 48-inch screen.

LG also claims that while it’s hard to measure, input lag should be as low as 5ms or so when gaming in 120fps.

Of course there is always something better around the corner sooner or later. I'm glad there are some great toys/gear to look forward to in the next few years.
 
Screen size for different 1:1 resolutions:

3840x1600 - 52"
3840x1440 - 51"
2560x1600 - 37"
2560x1440 - 36"
1920x1200 - 28"
1920x1080 - 27"

I run my desktop at 4K and game at 3840x1600.
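Those sizes all fall out of the 55" 4K panel's pixel density. A quick Python sketch (assuming the AW55's 55-inch 3840x2160 panel; my rounding differs from the measured list by up to an inch):

```python
import math

# Assumed panel: the AW55's 55-inch 3840x2160 screen.
PANEL_DIAG_IN, PANEL_W, PANEL_H = 55, 3840, 2160
PPI = math.hypot(PANEL_W, PANEL_H) / PANEL_DIAG_IN  # ~80.1 pixels per inch

def window_diagonal_in(w, h):
    """Diagonal size in inches of a w x h region displayed 1:1 on this panel."""
    return math.hypot(w, h) / PPI

for res in [(3840, 1600), (3840, 1440), (2560, 1600), (2560, 1440),
            (1920, 1200), (1920, 1080)]:
    print(res, round(window_diagonal_in(*res), 1))
```

This gives 51.9", 51.2", 37.7", 36.7", 28.3", and 27.5", in line with the corner-to-corner measurements above once rounded.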
 
LG C9 has an input lag of 13 ms at 60 Hz and 6.8 ms at 120 Hz. That's less than a frame at either refresh rate. Total non-issue.

HDR is made to allow a display to have output closer to real life. Real life can have extremely dark as well as extremely bright areas, like looking at a car's headlights during the night. In HDR content you won't have the extreme brightness most of the time; instead you enjoy the wider variety of light levels it is able to represent.

You are free to spend your money on whatever you want, but I can't justify the kind of price increase they try to ask for a DP connector.

The question is: do I really want my monitor to get as bright as real life? I don't enjoy getting blasted in the face by high-beam HIDs at night, so I definitely don't want my PC doing that to me.
 
The question is: do I really want my monitor to get as bright as real life? I don't enjoy getting blasted in the face by high-beam HIDs at night, so I definitely don't want my PC doing that to me.

No OLEDs are capable of extreme brightness anyway; they top out around 800 nits, which is frankly no big deal. The eye-searing doesn't really happen until a few thousand nits. And yeah, I've stared into plenty of HDR sunrises and lightning strikes on my C9. It's definitely a bit of a surprise sometimes, but that's part of the enjoyment.
 
The question is: do I really want my monitor to get as bright as real life? I don't enjoy getting blasted in the face by high-beam HIDs at night, so I definitely don't want my PC doing that to me.

HDR displays are still miles away from real life brightness levels. The point is to have things look closer to real life by having a wider range of brightness to use when producing content, not to blind you at every turn.



For desktop use you would turn HDR off, as you don't want the OLED Light setting at 100 there; the automatic brightness limiter will kick in if you have a big browser window open, for example.

On PC the support for toggling HDR is really messy: a couple of games can enable it without it being enabled in Windows, while others will not show the option unless HDR is enabled on the desktop. I would vastly prefer "HDR on demand" functionality for games, so it works like on consoles and only activates when needed. While I understand why MS has done it this way (HDR support in windowed content), it does not work well for SDR content on anything but OLEDs, so I wish they provided whitelist and blacklist options for apps as an alternative to a single toggle.
 
For now, with OLED, I'd probably relegate an LG CX 48" (for example) to being a "media and gaming stage": a blacked-out desktop wallpaper with no icons or toolbar/taskbar, which when not showing media or games would be all black, timed out/screensavered, or turned off. Hopefully I'd be able to keep a few other large monitors for desktop and apps. In the long run, though, as better tech comes out over the years, everything should be HDR capable with no burn-in issues, so people can edit the HDR photos and videos they took, have HDR-capable wallpapers and interfaces, and watch HDR YouTube videos, streams, and forum pictures in a window on any of their monitors. Eventually the much taller color volume/color brightness range will be on everything, just like higher color, contrast, and resolution capabilities before it. HDR ubiquity is still a ways off, though, especially in MS Windows.
 

LG 2020 4K OLED TVs with G-Sync, 48" size, Filmmaker Mode - BX, CX, GX, WX

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1578322870

Another new feature this year is HGiG Mode, which was developed by the HDR Gaming Interest Group, comprising Sony (PlayStation), Microsoft (Xbox), and other companies. The mode was proposed after console game developers struggled to find common ground for HDR implementations in games. As such, it can be seen as a sort of standard for HDR gaming.

- "In another industry-first, 2020 LG OLED TVs feature the HDR Gaming Interest Group’s HGiG Mode so gamers can enjoy high quality visuals as game creators and developers intended when playing HDR games via consoles on their LG TVs. HGiG is a body made up of companies from the game and display industries that develops standards to improve the HDR gaming experience for consumers," said LG.

In addition, OLED TVs have received the 'Eye Comfort Display' certification from TÜV Rheinland as they are flicker-free (in SDR and HDR) and offer adjustability for blue light content.

Why so many gaming-related features? Well, you may have heard that PlayStation 5 and Xbox Series X are both coming out later this year. Like everyone else, LG wants a piece of the pie and it is positioning its new OLED TVs as the best choice for console gamers. Are they? Obviously, that will be hard to say until the next generation of game consoles come out but all of these features come on top of LG's HDMI 2.1 ports that also support 4K120 and VRR.

Other features in LG's 2020 OLED TVs include Dolby Vision IQ, a further development of the Dolby Vision HDR format that ensures "optimal picture quality as the creatives intended, no matter the ambient light environment or content genre", Dolby Atmos support, and hardware-level calibration. Like LG's new 8K TVs, the new 4K OLED models will support AV1 decoding.
 
Too bad the TVs have a weird constant auto-dimming function that is noticeable at monitor viewing distances. From a couch it's not, though.

The AW55 monitor does not have it, so playing games at a desk is no problem.

Maybe when they release the 48" panel as a monitor it won't have it.

55" to 48" is so insignificant; I'll revisit when they release an OLED monitor in the 30-inch range that's not also a TV with TV input lag.
 
Too bad the TVs have a weird constant auto-dimming function that is noticeable at monitor viewing distances. From a couch it's not, though.

The AW55 monitor does not have it, so playing games at a desk is no problem.

Maybe when they release the 48" panel as a monitor it won't have it.

55" to 48" is so insignificant; I'll revisit when they release an OLED monitor in the 30-inch range that's not also a TV with TV input lag.

In my experience it mainly applies to HDR mode. I was using my LG C9 in SDR mode on the desktop yesterday and noticed no sudden dimming despite having a bright white browser window covering most of the screen. With OLED Light set to something like 40 or 50 it did not seem to do this, so maybe they have tweaked how it functions? Apparently it should not kick in if your settings are dim enough, probably similar to what you would use on the AW55.

IMO 55" to 48" is a pretty sizable difference and pushes the PPI up enough that you don't need to put it as far back.

It's a shame that the makers releasing these large 55" "monitor" versions of the LG OLEDs don't put any real effort into adding better features to support the hefty price tags. A good PbP mode like on the Asus 43" monitors would add good value on a screen this large, for example.
 
Screen size for different 1:1 resolutions:

3840x1600 - 52"
3840x1440 - 51"
2560x1600 - 37"
2560x1440 - 36"
1920x1200 - 28"
1920x1080 - 27"

I run my desktop at 4K and game at 3840x1600.
Not enough information to make a claim of this nature.
At minimum we need the distance from the screen and ability of the eye to focus at said distance.
Are the distances to each of those the same?
If not, why?
 
Using the same brightness on both the C9 and AW55, measured with an i1Display Pro, the C9's dimming is extremely noticeable and distracting.

Wondering who else in here actually has an AW55 and can speak from experience, rather than just making guesses.
 
Not enough information to make a claim of this nature.
At minimum we need the distance from the screen and ability of the eye to focus at said distance.
Are the distances to each of those the same?
If not, why?

They are measurements of the display at those resolutions, taken the old-fashioned way, corner to corner on my AW55.

Those other questions you have are down to the personal preference/ability of the user.
 
They are measurements of the display at those resolutions, taken the old-fashioned way, corner to corner on my AW55.

Those other questions you have are down to the personal preference/ability of the user.
The size of the screen for optimal viewing depends on how far you are away from it.
You cannot blindly state a size as optimal.

edit
I think I follow what you wanted to say; sorry I dragged this out.
We understand that UHD to 1080p, or 1440p to 720p, is 1/4 the resolution, which is 1/2 the horizontal and vertical size for any particular screen when keeping 1:1; it's a mathematical certainty.
Were you trying to add anything more?
 
Using the same brightness on both the C9 and AW55, measured with an i1Display Pro, the C9's dimming is extremely noticeable and distracting.

Wondering who else in here actually has an AW55 and can speak from experience, rather than just making guesses.

l88bastard has one. Most of the rest of us either already have LG OLEDs or are waiting on the 48" CX, and aren't willing to pay the premium for the lack of dimming and the other minor advantages of the AW55. ABL used to annoy the crap out of me when I was using a 32" 1080p Sharp TV way back in the day, but I really don't find it that big of a deal anymore. YMMV though; I'm not saying it's not a legitimate complaint for you personally, especially if you have the AW55 as an alternative.
 
Using the same brightness on both the C9 and AW55, measured with an i1Display Pro, the C9's dimming is extremely noticeable and distracting.

You should specify what kind of dimming you're seeing: ASBL or ABL. You shouldn't be able to hit ABL unless your brightness preferences are really high (>150 nits).

I would believe the AW55 has less ABL though, as its HDR brightness capability is almost completely removed compared to a C9 (~260 nits at a 10% window versus ~850).
 
Hoping we get a 48" panel variant from AW with both DP 1.4 (perhaps a USB-C variant as well) and HDMI 2.1. At that point, and with the price drops the AW55 has seen, I think you'd have a lot less uncertainty from buyers. Folks legitimately wanting more of a dual-purpose TV than a monitor might well go CX. But since HDMI would allow stream boxes as well as PCs, dual-use folks who lean more to the PC side would probably go with a monitor variant. I do think LG will share the panel with the marketplace; they have big visions of how many more panels they want to sell in 2020.
 
Checking back in. First time having a monitor and not even thinking about replacing it at all. Still the only OLED 4K@120Hz VRR monitor game in town. End Game.
 
Checking back in. First time having a monitor and not even thinking about replacing it at all. Still the only OLED 4K@120Hz VRR monitor game in town. End Game.

Yeah, and TBH I am going to bet the LG 48CX and the 3080 Tis will both be delayed big time. This Alienware will be the only game in town for a LONG time.
 