LG 48CX

I would not recommend this, even though it looks fine visually. It means you will most likely accelerate any burn-in, since the display will constantly be running at a high brightness level. It's still better to switch back to SDR when you are not viewing HDR content and reduce the display's brightness.



I think the smaller 48" model would still be a lot easier to manage. My 49" super ultrawide is about the same width as a 55" 16:9 and having something that is the same width but twice as tall would be quite overwhelming even from further away.

Nothing is overwhelming if you sit far enough away. You virtually shrink the screen to your perspective the farther away you sit. That's why I posted what I figured was the minimum comfortable distance for each 16:9 screen size, with the distance getting farther as the screen gets larger. In fact, if you sat very far away it would look too small again. And if you strap a screen to your eyeballs with a VR headset, it becomes a huge FoV again (though somewhat overly large compared to the perceived ppi of current-generation headsets).

It depends on how much of a room you want to dedicate to your PC and a desk "island", potentially the whole room if it's a modestly sized room like the one I currently use as my PC room. Think of a couch view, but instead of a couch it's a PC multimedia command center/island desk.

48" TV ~~> 40" to 48" away

55" TV ~~> 46" to 55" away .. depending what I'm doing, since my island desk is on caster wheels.
 
If you already own a C9 then I don't see the point in getting a CX either. The CX is more appealing to people like me with older OLED sets.
 
Yeah, I don't think I would get the CX either if I already had a C9 and was using it as a monitor. I'm just glad I'm not doing anything until closer to the end of the year. By then we'll know for sure what the CX can and can't do, and we'll see what the 3080 Ti brings to the table. Heck, we still don't know if it will have HDMI 2.1 or not. For me it pays to wait.
 
Rtings published their review and stated that the TV does not support LFC at any resolution. It's got to be a bug, because otherwise it's a deal breaker, especially for future consoles.
Where does Rtings mention LFC in the review? Is it the "FreeSync: No" and "G-SYNC stops working at all resolutions below 40Hz" parts?
Didn't this video prove LFC actually worked for sub-40 fps via G-SYNC on the C9, even though there's no official FreeSync Premium support (which shouldn't be that different from the CX)?:

I also don't recall Rtings mentioning or making any observations related to LFC in the C9 review, for some reason...

That 4k VRR input latency...yikes.

That's pretty much a deal breaker for me.
Compared to anything else in particular, or just in general (i.e. a gaming monitor)?
I'm wondering why 4K VRR wasn't tested on the LG C9? Perhaps because its range on HDMI 2.0 sources is limited to 40-60?
On the CX, chroma needs to be set to 4:2:0 for 4K to go up to 120Hz; at higher chroma it's limited to a maximum of 60Hz (or so I would assume, given that current HDMI 2.0 sources top out at 18Gbps).

Rtings doesn't seem to specify this, and they don't specify the chroma setting used for the 4K VRR test either, unless I've missed it?
That or we don't have any control over it and it runs automatically at 4K @ 40-120 Hz VRR @ 4:2:0 if it detects an HDMI 2.0 source.

I wonder how different these reviews/tests would be with proper HDMI 2.1 sources...
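For what it's worth, the bandwidth math behind that 4:2:0 limitation is easy to rough out. A simple sketch of the raw (uncompressed) data rates; it ignores blanking intervals and link encoding overhead, so real HDMI requirements are somewhat higher than these numbers:

[CODE]
# Rough raw data-rate estimate per video mode. Real HDMI adds blanking and
# encoding overhead, so the actual link requirement is higher than shown.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def raw_gbps(width, height, hz, bit_depth, chroma):
    bits = width * height * hz * bit_depth * SAMPLES_PER_PIXEL[chroma]
    return bits / 1e9

for chroma in ("4:4:4", "4:2:0"):
    for depth in (8, 10):
        rate = raw_gbps(3840, 2160, 120, depth, chroma)
        print(f"4K120 {chroma} {depth}-bit: ~{rate:.1f} Gbps raw")
[/CODE]

4K120 at 4:2:0 works out to roughly 12-15 Gbps raw, which is why it can squeeze into an 18Gbps HDMI 2.0 link while 4:4:4 cannot.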
 
I'd find it extremely weird if the CX did not have LFC after being officially certified by Nvidia as G-SYNC Compatible. Isn't LFC mandatory to even be certified in the first place? First the black level "bug" that ended up not being a bug at all, and now this supposed "lack" of LFC that it turns out it actually has... hmmm, interesting... o_O
 

Just to be clear, when we're talking about LFC, does this only apply to syncing below 40 Hz/fps? We're still good with this TV at 40-120 Hz, right?
 

The supported variable refresh range is 40-60 Hz or 40-120 Hz, depending on the resolution and refresh rate of the mode.

However, I just did a test on my CX, and it appears to work below 40 Hz as well.
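For anyone unsure what LFC actually does: it's just frame multiplication handled on the source side. When the frame rate drops below the panel's minimum VRR rate, each frame is presented two or more times so the effective refresh rate lands back inside the supported range. A minimal sketch of the idea (the 40-120 Hz range matches what's quoted above; this is the generic concept, not LG's or Nvidia's actual implementation):

[CODE]
def lfc_refresh(fps, vrr_min=40, vrr_max=120):
    """Return (frame multiplier, effective refresh rate) for a given frame rate.

    Below the VRR floor, each frame is repeated until the scan-out rate is
    back inside the supported range; above the ceiling, the rate is capped.
    """
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return multiplier, fps * multiplier

for fps in (25, 35, 45, 90, 144):
    mult, hz = lfc_refresh(fps)
    print(f"{fps} fps -> {mult}x frame repeat, panel refreshes at {hz} Hz")
[/CODE]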
 
Windows has a slider for HDR/SDR brightness adjustment, so you will have exactly the same brightness as you would in normal SDR.

That only affects how bright SDR elements are. The display will still be running at 100% brightness, which is the real issue.
 
Hmm, that sounds physically impossible. Running at 100% brightness but not emitting light? (or am I missing something here?)
 
FYI this is thread creep, but I don't see anything about the Alienware here. Pulled the trigger on the 55". Got it direct from Dell for $3K shipped.

Two days of gaming opinions:

+ Easy to set up, once you realize the PC native resolutions are listed BELOW the TV ones!! (you need to select the PC resolution to get 120Hz and G-SYNC Compatible). I set Windows scaling to 150%; it defaults to 300%, which makes the icons huuuuge.
+ Looks awesome color-wise, and especially the blacks. They aren't kidding about OLED and black; Elite Dangerous is just wow.



- A single RTX 2080 Ti and an overclocked HEDT CPU isn't enough to drive it at 120Hz in any games.
- Huge; you need a desk the size of a conference room table (I have a desk that big so it isn't a minus for me, but it would be for others).
- Doesn't feel as snappy as the AW3418DW I came from, but that was 3440x1440 vs 4K, so I was getting a consistent 100 FPS on it.
 

Attachments: IMG_9967.jpg, IMG_9966.jpg

Here is the thread for the Alienware. Glad you're happy with it, but I'm afraid that it's already somewhat obsolete and the HDR stinks on it. The LG is better for half the cost, which is why that thread isn't very active and this one is.

Note: I'm NOT saying that it's a crappy display. Far from it. But anyone forking over $3K for that thing needs to know its limitations compared to the C9/CX. Basically, you forgot some negatives in your list and I'm not sure if you just don't know about them yet or chose to omit them. :)
 
Fair point. I've never had or even seen an HDR monitor in action, so I actually forget that it even exists. The LG C9 sounds nice, but I wanted a new monitor today...
 
Hmm, that sounds physically impossible. Running at 100% brightness but not emitting light? (or am I missing something here?)

Displays always run at 100% brightness when in HDR mode. While the brightness setting certainly works differently in this mode, it could still be more likely to trigger the automatic brightness limiter (ABL) on the LG if you have, for example, a browser window open. So it's better to run in SDR mode with reduced brightness and activate HDR mode as needed.
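On the Windows side, the HDR/SDR slider mentioned above only changes the paper-white level that SDR content is mapped to inside the HDR signal; HDR content can still ask the panel for its full range. A sketch of the commonly reported mapping (the 80 + 4 x slider formula is what people have measured on current Windows 10 builds, so treat it as an observed approximation rather than documented behaviour):

[CODE]
def sdr_white_nits(slider_value):
    """Approximate SDR paper-white level for the Windows 'SDR content
    brightness' slider (0-100). The 80 + 4*value mapping is the commonly
    reported observation, not an official specification."""
    slider_value = max(0, min(100, slider_value))
    return 80 + 4 * slider_value

for v in (0, 25, 50, 100):
    print(f"slider {v:3d} -> SDR white around {sdr_white_nits(v)} nits")
[/CODE]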
 
Here is the thread for the Alienware. Glad you're happy with it, but I'm afraid that it's already somewhat obsolete and the HDR stinks on it. The LG is better for half the cost, which is why that thread isn't very active and this one is.

Note: I'm NOT saying that it's a crappy display. Far from it. But anyone forking over $3K for that thing needs to know its limitations compared to the C9/CX. Basically, you forgot some negatives in your list and I'm not sure if you just don't know about them yet or chose to omit them. :)

2 things-
- HDR is a gimmick (it pleases people who are attracted to box-store torch mode, but it's not realistic at all). From experience, it's just blown-out PQ. I own a calibrated LG C9, an AW55, and a 2016 LG OLED I've had since 2016.
- LG's input lag/latency is TV-status, i.e. crap. Therefore it's not even close to being better. I have gamed on the C9 and it's not close to being as responsive. Long story short, these are TVs, not monitors. Not to mention you can't even run it at 4K 120Hz for who knows how long.

If you're going to counter, please tell your real-life experience, not just that you read the brochure.
A lot of folks with zero real-life experience out here downplaying the Alienware, smh... lol
 


Hold on a second.

LG CX Review: https://www.rtings.com/tv/reviews/lg/cx-oled
Dell S2716DGR Review: https://www.rtings.com/monitor/reviews/dell/s2716dgr-s2716dg

Now, let's be clear. The CX is an OLED TV, while the S2716DGR is a 1440p, 144Hz G-SYNC monitor with a TN panel and DisplayPort.

CX Response times:
1080p @ 120 Hz: 6.8 ms
1440p @ 120 Hz: 6.9 ms

s2716dgr Response times:
Total Response Time (at max refresh rate): 7.7 ms


And you're telling me that the CX cannot compete with dedicated gaming monitors with similar specs? Never mind the fact that the CX has far superior image quality and capabilities. Also, just so we're clear, LG's OLED TVs from last year, the C9 and even the cheaper B9, had exactly the same response times.

Welcome to 2020, where the best TV gaming modes can and do compete with dedicated gaming monitors.
 

Go look them up at 4K with VRR: 23.3ms, yuck. Any real-life experience? You can read the brochure - Tokyo Drift

Don't know what their deal is, but maybe the internal hardware can't keep up, and maybe that's why the AW was so expensive (hardware that can keep up at 4K 120Hz VRR), and that could be why the AW felt instant vs the C9 to me.
 

No one, and I mean no one, should be buying a TV as a monitor if they are a highly competitive or professional gamer. At the same time, highly competitive/professional gamers aren't going to be using VRR anyway; it introduces a small amount of input latency no matter what the display is. For highly competitive gamers, a high refresh rate along with backlight strobing/BFI is going to produce the clearest image.

4k with VRR... are you serious? The only sources that can produce VRR @ 4k with this TV at the moment are 4k/60. Of course response time is going to be crap; there is nothing at the moment that supports HDMI 2.1.

Experience? I moved from a Dell S2716DGR -> LG CX. That's the reason I posted it as an example. I will gladly take the near instant pixel response of an OLED compared to the comparatively smeary mess of a high-refresh TN panel, not to mention the added color accuracy and definition of the OLED.

Introducing VRR into this "response times are crap" argument is like saying a Corolla is a fast car; it is a fast car compared to a bicycle.
 

A lot of statements and opinions... no experience lol.

The Alienware 55 runs 4K VRR 120Hz without a hit; just correcting the misinformation being spread about it. OLED TVs do not, and are not better.
 
With that monitor, how about with an LG OLED? Yah know... what this entire thread is about, not some rando monitor you decided to talk about.

My current monitor is a CX 55". Do you need proof? I posted a video last night. I can post another one easily.

I also have a Samsung Q90R 65" and a Samsung NU8000 55". Need proof of those as well?

I still have the Dell monitor in my closet. Still works perfectly. Just sayin... I can do this all day.
 
I'm not interested in getting into a pissing match over it. The Alienware is unquestionably an awesome display, but many feel it's overpriced and even l88bastard (who owns one) said he will most likely switch to the 48CX.

I didn't go into the Alienware thread and shit all over it - and I even threw a like to some of your posts in it - so it would be cool if you didn't do that here.

There are pros and cons to each one, but both are awesome. And just because you feel HDR is a gimmick doesn't mean that everyone should share that opinion. When done properly, it can add a lot of detail and vibrance:

(attached HDR screenshots)
 
My current monitor is a CX 55". Do you need proof? I posted a video last night. I can post another one easily.

I also have a Samsung Q90R 65" and a Samsung NU8000 55". Need proof of those as well?

I still have the Dell monitor in my closet. Still works perfectly. Just sayin... I can do this all day.

Don't care, my posts are here to correct misinformation. It's been done. AW55 > OLED TV

It's never been about money, it's been about real-life performance. People's opinions on what's worth what are completely meaningless because it's 100% personal preference. Go ask the masses if they will pay $2000 for an OLED or $500 for a Vizio... see, doesn't matter.
 
<.....>
- HDR is a gimmick (it pleases people who are attracted to box-store torch mode, but it's not realistic at all). From experience, it's just blown-out PQ. I own a calibrated LG C9, an AW55, and a 2016 LG OLED I've had since 2016.
- LG's input lag/latency is TV-status, i.e. crap. Therefore it's not even close to being better. I have gamed on the C9 and it's not close to being as responsive. Long story short, these are TVs, not monitors. Not to mention you can't even run it at 4K 120Hz for who knows how long.

If you're going to counter, please tell your real-life experience, not just that you read the brochure.
A lot of folks with zero real-life experience out here downplaying the Alienware, smh... lol


This will be my first real HDR monitor, whether it ends up being the C9, E9, or CX. I have some fake-HDR TVs (400 nit, edge-lit).

Of course I've seen HDR in stores (including the Magnolia "dark room", not "torch mode under fluorescent lighting") running HDR material, especially on OLEDs. I've also seen a lot of posts from people who use HDR (in movies and on consoles especially, since Windows/PC gaming isn't solid overall from title to title with HDR implementation) and trusted reviewers showing and explaining it (e.g. HDTVTest), and I can't imagine they are all wrong and that it is a gimmick, from everything I've read and seen. It's also measurable with testing hardware and can be shown on graphs, tables, and especially temperature maps. I'll trust Dolby Laboratories and expert TV calibrators over you, thanks.

---------------------------------------------------

True HDR is not a gimmick at all. How the hardware is being sold to a lot of people, and how it is implemented in software, can be, though.

A lot of people just don't realize how HDR is supposed to work. Partly that's because they don't understand it properly, but it's also because it's been poorly implemented in a lot of games and media playback devices (which let you change the white point/gamma), and because most "HDR" screens people own so far remap the HDR range away from its original absolute values, since they are incapable of reaching the HDR1000 scale.


If you follow HDTVTest on YouTube, he regularly shows examples of how detail is lost when a display's *color* brightness limits (color volume ceilings) are reached: colors then clip to white, or roll off to the brightest color the screen can reach, losing gradation. So any detail that goes past that limit is lost and left as a uniform area.

People need to realize that having color detail throughout the brightness range, in colored highlights, a few colored light sources, reflections, etc., while the rest of the scene generally remains in SDR ranges, is not the same as taking an SDR screen's relative brightness "knob" and turning the whole screen up to 11.

HDR is highlights. The average scene brightness is still quite low; in fact, some people complain that it's too low, since it uses absolute values designed for a dim-to-dark home theater viewing environment. HDR is not like SDR, where the whole scene's brightness is relative and gets turned up or down.
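That "absolute values" part is literally baked into the PQ transfer function (SMPTE ST 2084): a given code value always decodes to the same number of nits, regardless of the display. Here's a small sketch of the standard PQ EOTF using the published constants, just to show that the output is in nits rather than a relative percentage:

[CODE]
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value (0..1) to
# absolute luminance in cd/m^2 (nits). Constants are from the spec.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    """Decode a PQ-encoded value in [0, 1] to luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

for code in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ signal {code:.2f} -> {pq_eotf(code):8.1f} nits")
[/CODE]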

------------------------------------

HDR Color temperature map based on HDR 10,000 in a game:

QfSF1rn.jpg



---------------------------------------

https://www.youtube.com/user/hdtvtest/videos

One of the biggest things he shows in his HDR performance videos is how much detail-in-color, or specular highlight detail, is lost at lower color brightness (aka color volume) ceilings. He also shows how detail is lost at the low end, in detail-in-blacks (detail in dark areas).

It's about breaking out of the narrow band of SDR blacks to color ceiling and showing fine degrees of detail all the way up. I'm greatly looking forward to owning a per pixel emissive HDR display.

LG C9 OLED TV Review

OLED excels because it can have color detail all the way through, up to 700nit or higher of color brightness in INDIVIDUAL pixels right next to lower color or even ultra black pixels - allowing for an insane dynamic range and contrast to give you that HDR pop.

Quote from the above HDTVTest C9 OLED review video:
"The peak brightness on paper of these FALD TVs operate in zones, rather than per pixel. So let's say if you ask an FALD LED LCD to light up a very small area with specular highlights, for example, reflections on a vehicle or fireworks against a night sky - these small bright elements are not going to come close to the measured peak brightness of 1000nits or 1500 nits simply because the FALD LED LCD TV has to balance backlight illumination between adjacent zones or else there will be too much blooming which will wash out the picture even further. This the reason why in many side by side comparisons that I, and maybe some of you, have seen - specular highlight detail on a 700nit OLED can often look brighter than even a 1600nit FALD LED LCD. When you add the total absence of blooming and glowing in dark scenes, again because of pixel level illumination, it's not surprise
that it is an OLED and not an LED LCD that has been voted as the best HDR TV by the public in our HDR shootout for two years in a row. Nevertheless I still have a soft spot for high performing LED LCDs that can present high APLscenes with more HDR impact than OLEDs which are restricted by ABL or auto brightness limiter which caps full field brightness at 150 nits for heat and power management. "
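To put that ABL point in rough numbers: the limiter makes peak luminance a function of how much of the screen is lit, which is exactly what the 2%/10%/100% "window" figures in reviews measure. A toy model below; the endpoints are loosely based on the ~800-nit small-window and ~150-nit full-field figures discussed in this thread, and the intermediate breakpoints are just interpolation guesses, not LG's actual algorithm:

[CODE]
# Toy auto-brightness-limiter (ABL) curve: peak luminance vs. the fraction
# of the screen that is lit. Endpoint values are rough C9-class figures;
# the intermediate breakpoints are assumptions, not LG's real behaviour.
ABL_POINTS = [        # (lit fraction of screen, approx. peak nits)
    (0.02, 800),
    (0.10, 800),
    (0.25, 600),
    (0.50, 300),
    (1.00, 150),
]

def peak_nits(window_fraction):
    """Linearly interpolate the allowed peak for a given lit-area fraction."""
    if window_fraction <= ABL_POINTS[0][0]:
        return ABL_POINTS[0][1]
    for (x0, y0), (x1, y1) in zip(ABL_POINTS, ABL_POINTS[1:]):
        if window_fraction <= x1:
            t = (window_fraction - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return ABL_POINTS[-1][1]

for w in (0.02, 0.10, 0.50, 1.00):
    print(f"{w:.0%} window -> ~{peak_nits(w):.0f} nits peak")
[/CODE]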

---------------------------------------------

2D Color
VR6gxX2.png

"3D" Color volume breaking through the SDR color brightness range ceiling to heights with more realism:
HDR_color-graph_3r1M8aR.png






------------------------------------------

HDR 10,000 is the benchmark for the most realism. Even HDR1000 as a content mastering point is fractional HDR. There are a lot of HDR1000 UHD discs, a bunch mastered at HDR4000, and only one or two AFAIK at 10,000, because no one has hardware that can display those levels yet. Some games can also technically do HDR10,000 when tested with color maps.


https://www.theverge.com/2014/1/6/5276934/dolby-vision-the-future-of-tv-is-really-really-bright
"The problem is that the human eye is used to seeing a much wider range in real life. The sun at noon is about 1.6 billion nits, for example, while starlight comes in at a mere .0001 nits; the highlights of sun reflecting off a car can be hundreds of times brighter than the vehicle’s hood. The human eye can see it all, but when using contemporary technology that same range of brightness can’t be accurately reproduced. You can have rich details in the blacks or the highlights, but not both.

So Dolby basically threw current reference standards away. "Our scientists and engineers said, ‘Okay, what if we don’t have the shackles of technology that’s not going to be here [in the future],’" Griffis says, "and we could design for the real target — which is the human eye?" To start the process, the company took a theatrical digital cinema projector and focused the entire image down onto a 21-inch LCD panel, turning it into a jury-rigged display made up of 20,000 nit pixels. Subjects were shown a series of pictures with highlights like the sun, and then given the option to toggle between varying levels of brightness. Dolby found that users wanted those highlights to be many hundreds of times brighter than what normal TVs can offer: the more like real life, the better.
"[/QUOTE]

HLG is still relative, which isn't the truest form of HDR; true HDR should use absolute values.

The PQ (Perceptual Quantizer) gamma curve is based on the characteristics of human visual perception.

HLG (Hybrid Log-Gamma) is sort of a sloppy way to make quasi-HDR work with inadequate SDR-based systems, live un-encoded video feeds, and lower-quality quasi-HDR hardware. There's no reason games should be using the makeshift relative version, other than that most of the displays people are using for HDR material are inadequate in the first place.
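The relative-vs-absolute difference shows up right in the transfer functions: HLG's OETF is scene-referred (the encoded 0..1 signal only becomes nits once you apply the display's own peak and system gamma), whereas the PQ EOTF sketched above decodes straight to nits. A sketch of the HLG OETF with its published BT.2100 constants:

[CODE]
import math

# HLG (Hybrid Log-Gamma) OETF per ITU-R BT.2100. Input E is relative scene
# linear light in [0, 1]; output is the non-linear signal. Note that nothing
# here is in nits -- absolute brightness depends on the display it ends up on.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    """Encode relative scene linear light (0..1) to an HLG signal value."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

for e in (0.0, 1 / 12, 0.5, 1.0):
    print(f"scene light {e:.3f} -> HLG signal {hlg_oetf(e):.3f}")
[/CODE]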

https://www.eizo.com/library/management/ins-and-outs-of-hdr/index2.html/

321001_lkfnvmb-png.png
 
"Dolby labortories and expert TV calibrators" are selling something shiny. People like shiny things. Marketing is working.

Like I said, I only posted to correct misinformation relative to the AW55. Just enjoy it, but don't downplay the AW55 because "what about muh HDR that doesn't even work properly on PC" and "it's not worth it." I can tell you it has been worth it for months, and it still outperforms these new TVs and does things these TVs still cannot. But let's make it about money and wait until X product comes out in a year for less (and still has worse performance) (with technology??? c'mon mannnnn). Let's make it about one single subjective tech that is an excellent marketing tool. Not so [H]ardcore if yah ask me ; )

Enjoy!
 
"Dolby labortories and expert TV calibrators" are selling something shiny. People like shiny things. Marketing is working.

Like I said, only posted to correct misinformation relative to the AW55. Just enjoy it, but don't downplay the AW55 because "what about muh HDR that doesn't even work properly on PC," and "it's not worth it." I can tell you it has been worth it for months, and still outperforms these new TV's, and does things these TV's still cannot. But lets make it about money and wait until X product comes out in a year for less (and still has worse performance) (with technology??? c'mon mannnnn.) Lets make it about about one single subjective tech that is an excellent marketing tool. Not so [H]ardcore if yah ask me ; )

Enjoy!

Why are you crapping on the CX vs. the AW55? These are 2 completely different products designed for 2 completely different markets. The AW55 is a dedicated computer monitor designed to be used as a monitor. The CX is a Smart TV w/ gaming capabilities baked in. The CX was not designed as a gaming product first, while the AW55 is very much a BFGD.

We're not going into the AW55 thread saying "Well, it doesn't have HDR and it doesn't have a YouTube app built in, so it's a terrible product." We shouldn't be doing this.
 

My posts corrected misinformation about the AW55. But hey, if the facts of its performance strike a chord, maybe it's a good thing I posted! And last but not least, because this is a public forum and I can! A forum is a thing of beauty!

And yes folks have gone into that thread and mentioned the HDR and port availability and future-proofing, all junk I don't care about TODAY! When a better performing DISPLAY comes out I will buy that one too!!! I own many displays, FW-900, XB270H, VG32VQ, C9, I might even pick this one up to tinker with!

TODAY the AW55 can do OLED VRR 4K @ 120Hz. Not to mention custom resolutions for that ultrawide goodness. When a better display arrives I will get that one too. No fanboi here. The better products get the mula.
 
I never said the AW55 sucked, my posts were in reply to you saying HDR is a gimmick.

"Dolby labortories and expert TV calibrators" are selling something shiny. People like shiny things. Marketing is working.

Life hits your eyes with shiny things, and with much higher colors. Without HDR you are drastically limiting your specular highlight color palette when an HDR version of the material is available. HDR is the future, like 4K and high Hz. Yes, they want to sell it, just like they want to sell higher color, higher contrast / deeper blacks, higher resolutions, higher Hz, and judder-free, tear-free handling of frame rate fluctuations. You probably bought your Alienware similarly for the mirror-opposite "unshiny thing": the black levels (and uniformity).

There is plenty of SDR material. Like I said, I'm running SDR screens now. An Alienware 55" seems great for the majority of PC games for now, and for the great majority of the past PC library that will probably never be HDR. A good non-OLED gaming monitor is great too, but the OLED will have way better blacks and per-pixel lighting/detail. I came from 60Hz screens and 850:1 black level TNs at one point, but if I can get better I will, in whatever category I can, within reasonable price ranges. People said similar things about 1080p and later the 4K resolution increase and the lack of initial material, high Hz in games, G-SYNC/VRR, etc. Growing pains. History repeats.
 

It's a gimmick. Luma already exists. Luminance per pixel already exists in non-HDR signals. But pretend like it's new.

https://www.rtings.com/tv/learn/chroma-subsampling

IN MOVIES AND TV SHOWS
Now, its importance with smaller text is undeniable, but what about with movies? 4:2:0 subsampling has been an industry standard for a long time now, and it isn't without reason. The benefits of having full color in video are debatable, especially at 4k. It would be tough to recognize the difference between a full 4:4:4 sequence and the same content in 4:2:0.

4:2:0 is almost lossless visually, which is why it can be found used in Blu-ray discs and a lot of modern video cameras. There is virtually no advantage to using 4:4:4 for consuming video content. If anything, it would raise the costs of distribution by far more than its comparative visual impact. This becomes especially true as we move towards 4k and beyond. The higher the resolution and pixel density of future displays, the less apparent subsampling artifacts become.

IN VIDEO GAMES
While some PC games that have a strong focus on text might suffer from using chroma subsampling, most of them are either designed with it in mind or implement it within the game engine. Most gamers should therefore not worry about it.
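For reference, what 4:2:0 actually does is simple to picture: the luma plane keeps full resolution while each chroma plane is stored at half resolution in both directions, i.e. one Cb and one Cr sample per 2x2 block of pixels. A minimal NumPy sketch, purely illustrative (real encoders use proper chroma filters rather than plain averaging):

[CODE]
import numpy as np

def subsample_420(ycbcr):
    """Split an HxWx3 YCbCr image into full-res Y and quarter-res Cb/Cr.

    Chroma is reduced by averaging each 2x2 block -- a crude stand-in for
    the filtering a real encoder would use.
    """
    y = ycbcr[:, :, 0]
    h, w = y.shape
    cb = ycbcr[:, :, 1].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    cr = ycbcr[:, :, 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return y, cb, cr

frame = np.random.rand(2160, 3840, 3)   # stand-in for a 4K YCbCr frame
y, cb, cr = subsample_420(frame)
full = frame.size                        # samples at 4:4:4
sub = y.size + cb.size + cr.size         # samples at 4:2:0
print(f"4:4:4 samples: {full:,}  4:2:0 samples: {sub:,}  ratio: {sub / full:.2f}")
[/CODE]

That halved chroma resolution is where the bandwidth saving comes from.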
 
My posts corrected misinformation about the AW55. But hey if the facts of their performance strike a cord maybe it's a good thing I posted! And last but not least cause this is a public forum and I can! A forum is a thing of beauty!

And yes folks have gone into that thread and mentioned the HDR and port availability and future-proofing, all junk I don't care about TODAY! When a better performing DISPLAY comes out I will buy that one too!!! I own many displays, FW-900, XB270H, VG32VQ, C9, I might even pick this one up to tinker with!

TODAY the AW55 can OLED VRR 4K @ 120hz. Not to mention custom resolutions for that ultrawide goodness. When a better display arrives I will get that one too. No fanboi here. The better products get the mula.

Again, you're comparing 2 completely different products designed for 2 completely different markets.

The AW55, a completely dedicated gaming monitor with 4K/120Hz + G-SYNC, costs $3000.
The CX 55", a Smart TV that can do 4K/120Hz with chroma 4:2:0, supports HDR10 and Dolby Vision, gets brighter, and comes with tons of additional features the AW55 doesn't have... costs $1800.

The AW55 has ONE feature that the CX 55" doesn't have at the moment: 4K/120Hz + G-SYNC. However, the CX has 4K/120Hz, or 1440p/120Hz G-SYNC w/ HDR.

I'm glad you think it's worth $1200 more, but me... personally... I don't see it. Seems like a pretty bad ROI if you ask me.


Of course, I'm not going over to the AW55 thread to proclaim the CX OLED's superiority, but hey... to each their own.
 
It's a gimmick. Luma already exists. Luminance per pixel already exists in non-HDR signals. But pretend like it's new.

https://www.rtings.com/tv/learn/chroma-subsampling

IN MOVIES AND TV SHOWS
Now, its importance with smaller text is undeniable, but what about with movies? 4:2:0 subsampling has been an industry standard for a long time now, and it isn't without reason. The benefits of having full color in video are debatable, especially at 4k. It would be tough to recognize the difference between a full 4:4:4 sequence and the same content in 4:2:0.

4:2:0 is almost lossless visually, which is why it can be found used in Blu-ray discs and a lot of modern video cameras. There is virtually no advantage to using 4:4:4 for consuming video content. If anything, it would raise the costs of distribution by far more than its comparative visual impact. This becomes especially true as we move towards 4k and beyond. The higher the resolution and pixel density of future displays, the less apparent subsampling artifacts become.

IN VIDEO GAMES
While some PC games that have a strong focus on text might suffer from using chroma subsampling, most of them are either designed with it in mind or implement it within the game engine. Most gamers should therefore not worry about it.

It's the color luminance ~ color brightness LEVEL (or volume). SDR is a hard cap. Any color hitting an SDR-range-limited display's ceiling will roll off to the highest color the SDR range is capable of, or it will clip to white - no matter what your subsampling and output are set to.

Imagine a jumbo box of crayons arranged in rows by brightness: you get about 3 rows (~300 nits) out of 7.5 to 8 rows of brighter colors when comparing the AW55 as an SDR OLED vs a C9 or CX OLED with HDR material. Any detail throughout those higher rows of highlights is lost to a blob of color on an SDR display (much like shadow detail can be lost in blacks on many TVs), or, historically on older displays, clipped to white at their SDR ceiling.

You aren't raising the entire screen to those color brightnesses. The majority of the scene remains in the SDR range typically. HDR provides more realism and details in highlights and bright areas.

..

(Remember, the nit figures describe what levels of color brightness and detail-in-color the display is capable of, not turning the whole screen up all at once.)
---------------------------------------------------------------------------------
CNET alienware 55 OLED review:

"The monitor has such relatively low brightness, that HDR barely looks better. It's rated for a maximum of 400 nits, and that's for a window of about 3% of the screen, where normally the specification is a 10% window. Typically, as the window grows, the brightness drops, and in my testing the peak was closer to 300 nits for a 2% window, and typically about 260 nits for a 10% window. It gets as low as 110 nits at full screen. "

Alienware AW55
2% window 300nit
10% window 260nit (typical)
100% window 110nit
-------------------------------------------------------------------------------------

C9
2% window 855nit peak, 814nit sustained
10% window 845nit peak, 802nit sustained
100% window 145nit peak, 138nit sustained

--------------------------------------------------------------------------------------------

CX
2% window 799nit peak, 752nit sustained
10% window 813nit peak, 775nit sustained
100% window 146nit peak, 140nit sustained

------------------------------------------------------------------------------------------

So again, as you can see from the numbers, HDR color brightness (color volume) is mostly applied to the small percentage of a scene made up of light sources and highlights. The screens are incapable of blasting peak color brightness across 100% of the screen all at once. They can, however, increase realism with much higher detail and many more gradations of color throughout highlights and brighter areas of scenes, dynamically.

SDR's color range is mapped as "2D":

VR6gxX2.png

HDR color nit range aka color volume is mapped as "3D" since it has so much more color range to graph

HDR_color-graph_3r1M8aR.png


...


You aren't getting HDR color heights (highlights, detail-in-color in bright areas) out of an SDR-range screen.

A game showing an HDR color map. The beauty is in the details, or in this case, the highlights. As mentioned before, most of the screen stays within the SDR range of values.

HDR_Game-color-map_jE1pSgZ.jpg
 
Again, you're comparing 2 completely different products designed for 2 completely different markets.

The AW55, a completely dedicated gaming monitor with 4K/120Hz + G-SYNC, costs $3000.
The CX 55", a Smart TV that can do 4K/120Hz with chroma 4:2:0, supports HDR10 and Dolby Vision, gets brighter, and comes with tons of additional features the AW55 doesn't have... costs $1800.

The AW55 has ONE feature that the CX 55" doesn't have at the moment: 4K/120Hz + G-SYNC. However, the CX has 4K/120Hz, or 1440p/120Hz G-SYNC w/ HDR.

I'm glad you think it's worth $1200 more, but me... personally... I don't see it. Seems like a pretty bad ROI if you ask me.


Of course, I'm not going over to the AW55 thread to proclaim the CX OLED's superiority, but hey... to each their own.

More than ONE feature. The CX55 has some severe shortcomings that are catastrophic dealbreakers: it can't combine all its technologies yet, and the input latency for 4K VRR, 23.3ms, means the internal hardware can't keep up. And chroma 4:2:0? lol. I can wait until the cows come home for a CXI with all these things ironed out, or I can forego the gimmicky HDR and have OLED/4K/120Hz/VRR/instant input latency TODAY! Hmmmm, decisions decisions. I have gotten 100% ROI since the day I bought the AW55. Let me know when your ROI for all that tech you cannot use TODAY becomes worth it. Paid a lot for something that cannot do it all TODAY, and if it could, which it can't, it would still have latency issues.

Also, I personally A/B'd the C9 vs the AW55 (I still own them both), and TVs will always TV with their latency.

I am coming here to correct misinformation because to each their own and I can.
 

While there are few reviews of the AW55, https://www.tomsguide.com/reviews/alienware-aw5520qf-55-inch-oled-gaming-monitor says the following:

"But there's one area where the AW5520QF fell below our expectations, and that was lag time. When tested with our Leo Bodnar input lag tester over HDMI, the AW5520QF consistently clocked a lag time of 29.5 milliseconds. While most of our tests were performed in standard picture mode, we made sure to perform lag testing across every mode the Alienware offered, but it stayed consistent in every mode. While that's not far out of line with premium OLED TVs, like the Sony A9F (27.5 ms) and the LG C9 OLED (21.2 ms), it's significantly slower than the Samsung Q60 QLED, which offered a game mode with lag times of 16.3 ms. For a gaming-focused monitor, from one of the biggest brand names in gaming, I expected better. "

Now, their lag times are higher than what the Rtings review has for the C9, so this might be at 4K 60Hz or a different way of measuring it; I don't know. But in any case, in their test the AW55's input lag was actually higher. To compare like for like, you would probably want to run the C9 and the AW55 at 1440p @ 120Hz, with the C9 in Game mode.

Personally I cannot tell a difference in input lag between my C9, ASUS PG278Q or Samsung CRG9. The differences are entirely in response times. Even my Samsung KS8000 at around 22ms input lag was fine in SDR, but the 35ms in HDR mode was noticeably worse. I think it's just not worth nitpicking about input lag or trying to turn it into some massive issue; it's low enough on the C9 and CX even at 4K 60Hz.

With next gen GPUs rumored to be releasing in September, there is just no good reason to buy the AW55 at this point. In my country the 55" LG C9 is 1500 euros cheaper. The 55" CX is only about 300 euros cheaper. The 48" CX is about 1100 euros cheaper and a more desktop friendly size.

To me, rather than correcting misinformation, it sounds like you are trying to justify your purchase to us. In the end it's the same LG OLED panel paired with different hardware for a different purpose. Enjoy your display, but please understand that those who don't own one have no good reason to buy one at this point, when we could save a massive pile of money with a 55" C9 or 48" CX, get good HDR support, and put that saved money towards an Nvidia 3080 or 3080 Ti later this year.
 

Tom's Guide. LOL
Over HDMI!!!! LMFAO!!! I AM FUCKING DYING RIGHT NOW!!!!!
Someone making it about money again. LOL

OLED/4K/120Hz/VRR/instant input latency TODAY. Enjoy the limited functionality for less $ and justifying it; after all, that's how things work with anything you purchase ; )
 
I'm not hating on the Alienware; you definitely got in on 120Hz OLED at least a year ahead of me (maybe 1.4 years overall by the time I get my OLED). It sounds like it is great quality for SDR.

I just don't agree at all that HDR is a gimmick. Far from it.

Considering the timeframe you've had, and will have, your screen before mine - pricing aside - it sounds well worth having as a screen, considering how much is available in HDR for PC gaming so far and what other 120Hz displays have been available.

I won't have an OLED for gaming until the 3080 Ti, and eventually probably a PS5, the way things are going. I'll be using it as a gaming and media stage rather than a desktop/static-apps monitor. It will be showing a lot of non-gaming content including HDR material, but also HDR games when available on PC and PS5. So I'll probably end up with a 55" C9 or E9 once the dust settles.
 
HDR is anything but a gimmick. It's just very early and the displays available to consumers can't do it very well just yet. They simply can't match 100% of the specs for now.

Add to that software issues and a lack of supported media and yeah, it's not exactly a "great" selling point I suppose. But the HDR on the LG OLED TVs (for example) isn't that bad at all (it's great actually even though not perfect) and it's not like there is a cheaper version of those TVs without HDR available anyway (I would myself probably be interested in that right now).
 