LG 48CX

I've never seen someone try so hard to justify their bad purchase before. Sounds like someone is in denial and the world only revolves around them and their personal tastes & choices.

HDR is a "gimmick"? Pretty sure RGB's on the BACK of a TV is more of a gimmick. Try playing an HDR game for 20+ hours, then play it on SDR mode. You'll instantly notice how devoid of life of the SDR version of the game is. If you don't then something is physically wrong with your vision. The rest of the world including professional TV & monitor reviewers can attest to that. Almost every single big-budget game that has come out for years is now HDR. I bet it's going to be damn near 100% when next-gen consoles hit.

If having the highest frame rate is the most important thing to you, then you should be playing on some 23" TN 300Hz+ panel and not an OLED. In fact, you should just get an old 1990s CRT monitor and be done with it. Don't forget to lower all your in-game settings to LOW to get the maximum frame rate :whistle:

This is one of the more enthusiast-level PC gaming hardware forums. Don't you think it's a bit strange that this thread has 3x the replies for a product THAT ISN'T EVEN OUT YET compared to the Alienware 55" thread, when that product has been out for 6+ months?

There are pros & cons to both products. We're all adults here; we know that. Most people opted not to spend 2x on the Alienware, which was already outdated the day it was released. Technically the LGs are superior to the Alienware in every single way; it's just Nvidia's/AMD's fault for not having HDMI 2.1 GPUs yet to use the TV to its full potential. Most people here are patient enough to wait.
 
Sorry guy, it's not a "bad" purchase. I like how you start off bashing Stryker, then go on to say "there's pros and cons to both products". Leave it alone now.
 

Oh come on. Call things as they are. The AW55 (tbh AW itself, for me) is quite the epitome of marketing. Please don't give others the impression that this is a sensible purchase right now; I think the average Joe these days has to make a well-considered choice before spending thousands of dollars.
 

Both have their merits. If AW released an "updated version" that matched the current LG OLEDs I would probably buy that instead as a PC user. Ideally with a real G-Sync chip. But we have to make do with what's actually available.
 
Both have their merits.
Yeah, the AW will still hold the 4K 120Hz @ 8-bit @ 4:4:4 @ SDR merit for about a couple more months. A pure game changer.

If AW released an "updated version" that matched the current LG OLEDs I would probably buy that instead as a PC user.
I'm sure you would also like to put an extra 1-2k bucks on top for this.

Ideally with a real G-Sync chip. But we have to make do with what's actually available.
The fact that it doesn't even come with a G-Sync module in the first place was already a joke.
 
https://insights.club-3d.com/thread/hdmi-2-1-to-displayport-1-4

Unfortunately their DP 1.4 to HDMI 2.1 adapter will not support VRR or G-Sync. But you will be able to do 4K 120Hz with HDR. Release is in June, so it might be in hands early July. 70 euro for 2-3 months before the 3080 Ti...

Confusing, since their brochure does state G-Sync and VRR bypass. Maybe I'm interpreting "bypass" incorrectly...

https://www.google.com/url?sa=t&sou...Vaw3cwPA4lv3CGh1BSZ743ZRm&cshid=1590316899512

The Club 3D CAC-1085 is the perfect solution to connect to any HDMI™ 4K120Hz ready displays. If you have a DisplayPort™ 1.4 ready PC or any other device that lacks the new HDMI™ 4K120Hz specification, the Club3D CAC-1085 will be the simple way to upgrade your device and connect to your new TV. With its DP1.4 DSC 1.2 video compression technology, this adapter is able to convert DP1.4 video signals to HDMI™2.1, supporting video display resolutions up to 3840x2160 at 120Hz creating lifelike colors and movements with HDR giving users the ultimate visual experience. Support for G-Sync, FreeSync and VRR bypass. The Adapter is powered thru an USB Type C to USB Type A cable (provided with the product).
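For what it's worth, the bandwidth arithmetic is why the DSC part of that blurb matters. A back-of-the-envelope sketch (my numbers, not Club 3D's; assuming ~6% reduced-blanking overhead and DP 1.4's 25.92 Gbit/s HBR3 payload after 8b/10b coding):

```python
# Why DP 1.4 needs DSC to carry 4K120 10-bit HDR to an HDMI 2.1 TV.
active_px_per_s = 3840 * 2160 * 120            # ~0.995 Gpx/s of active video
pixel_rate = active_px_per_s * 1.06            # assumed ~6% blanking overhead
bpp = 30                                       # 10 bits per channel, RGB/4:4:4

uncompressed_gbps = pixel_rate * bpp / 1e9     # ~31.6 Gbit/s
dp14_payload_gbps = 25.92                      # 4 lanes of HBR3 after coding

print(f"uncompressed 4K120 10-bit: {uncompressed_gbps:.1f} Gbit/s")
print(f"DP 1.4 max payload:        {dp14_payload_gbps:.2f} Gbit/s")
print(f"with ~3:1 DSC:             {uncompressed_gbps / 3:.1f} Gbit/s")
```

Uncompressed, the stream simply doesn't fit in DP 1.4; with DSC's roughly 3:1 visually-lossless compression it fits with room to spare, which is how a DP 1.4 source can feed a 4K120 HDR signal at all.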
 
Don't bother. Every other post he makes is some crusade about how HDR is fake/a gimmick despite knowledgeable people having explained to him what 'dynamic range' is in photography and filmmaking a million times. Let him live alone in his delusion that SDR is all you need lol.

The marketing is working well, I see. So far, everyone that has experienced it outside of a brochure and in real life knows it's either not implemented correctly at all (so, meh), or it just looks different but not necessarily better. Enjoy talking about the brochure, fellas ; )
 
I have seen it in person. I've also seen experts showing temperature maps of HDR color details not present in SDR material, even live videos showing it dynamically throughout. A bunch of people in this thread play HDR games (even if some are console games) and HDR movies on HDR TVs and have spoken about their experiences.

Imagine a jumbo box of multicolor crayons arranged in rows by brightness: comparing the AW55's SDR OLED to a C9 or CX OLED with HDR material, you get about 3 rows (~300 nits) out of 7.5 to 8 rows of brighter colors. Any of the detail throughout those higher rows of highlights is lost to a blob of color on an SDR display (much like shadow detail can be lost in blacks on many TVs), or, historically on older displays, clipped to white at their SDR ceiling.

You aren't raising the entire screen to those color brightnesses. The majority of the scene typically remains in the SDR range. HDR provides more realism and detail in highlights and bright areas.


https://www.resetera.com/threads/hdr-games-analysed.23587/

https://arstechnica.com/gaming/2018...es-broken-down-by-incredible-heatmap-imagery/
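Those heatmaps are mapped through the PQ curve (SMPTE ST 2084) that HDR10 uses to turn code values into absolute brightness. A minimal sketch of that transfer function; the constants are the published ST 2084 values, and the sample code values are my own picks:

```python
# SMPTE ST 2084 (PQ) EOTF: 10-bit code value -> absolute luminance in nits.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    e = (code / (2 ** bits - 1)) ** (1 / m2)           # undo the m2 exponent
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for code in (0, 512, 769, 1023):
    print(f"code {code:4d} -> {pq_to_nits(code):7.1f} nits")
# Half the code range is only ~92 nits (SDR territory), ~75% is 1000 nits,
# and the top code is the format's 10,000-nit ceiling: headroom SDR lacks.
```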


----

"HDR is a gimmick"

People have said the same about 16:9 taking over 4:3, 120Hz vs 60Hz (I was one of them), 1440p and later 4K over 1080p, and the value of G-Sync/VRR. A segment of people always dig their heels in. Growing pains.

 
"HDR is a gimmick"

People have said the same about 16:9 taking over 4:3, 120hz vs 60hz (I was one of them), 1440p and later 4k over 1080p, and the value of g-sync/vrr. A segment of people always dig their heels in. Growing pains.

As someone who thought "why the heck would anyone pay so much money for more than 60hz and gsync?", I definitely ate my words about 3 years ago. This isn't evolutionary monitor tech... it's revolutionary monitor tech. High refresh + VRR is a game changer for gamers. The difference in quality is crazy.

I feel that HDR is headed that direction, but due to the battle between standards, highly variable monitor specifications, and just different implementations, it's not there yet. The worst part is that most people are currently experiencing HDR on 400 nit or less screens with no local dimming; there's a reason their experience is trash.

HDR is only fully realized on screens with 1000+ nit peaks and tons of local dimming zones, OR an OLED TV. Unfortunately, both of these are still $1000+ at the low end, so most people either can't afford it or simply don't want to spend that sort of money. I feel like once this tech is implemented well around the $500 range, we will see a much higher adoption rate among consumers and manufacturers. It will happen eventually, just the same way you can now get a big 4K smart TV for under $300.
 
Oh come on. Call things as they are. The AW55 (tbh AW itself, for me) is quite the epitome of marketing. Please don't give others the impression that this is a sensible purchase right now; I think the average Joe these days has to make a well-considered choice before spending thousands of dollars.

Nobody said "it's a sensible purchase right now". You just couldn't leave it alone alone. lol
 

On top of that HDR simply isn't as big of a visual impact as going from 60fps to 90fps or doubling resolution. And on top of that you can legitimately make the argument that some HDR effects don't even look visually appealing whereas more resolution and higher framerates are basically an objective improvement.

I will say this, though: variable refresh really isn't a big deal unless you're into video game emulators where you need to be able to run games at different refresh rates to move smoothly. For modern PC games it simply doesn't matter much either way. Variable refresh is kind of a niche feature.
 

HDR on an OLED display with decent brightness can bring way more depth to the visuals in games; in my experience, especially in indoor environments.

As for VRR: what are you talking about? Emulators? Wtf. It's pretty much the most important thing to have for modern demanding games.
 

VRR doesn't matter for modern PC games? Why the hell not? In 99% of the games I play I can never match the fps to the monitor Hz, so VRR is important to me in that regard. I don't use emulators at all.
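A toy model makes that mismatch concrete (hypothetical 45fps on a 60Hz panel; my numbers, not anyone's benchmark). With vsync on a fixed refresh, frames can only appear on ~16.7ms boundaries, so hold times alternate between 16.7 and 33.3ms, which is exactly the judder VRR removes:

```python
# Toy cadence model: 45 fps on a fixed 60 Hz display with vsync vs VRR.
refresh_ms = 1000 / 60     # fixed scanout slots every ~16.7 ms
frame_ms = 1000 / 45       # a new frame is ready every ~22.2 ms

shown = 0.0
for i in range(6):
    ready = (i + 1) * frame_ms
    slot = -(-ready // refresh_ms) * refresh_ms    # next vsync boundary (ceil)
    print(f"frame {i}: ready {ready:6.1f} ms, shown {slot:6.1f} ms, "
          f"held {slot - shown:4.1f} ms (VRR would hold {frame_ms:.1f} ms)")
    shown = slot
```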
 
For anyone wanting to save a little bit of money (and get a larger TV), you may want to consider the B9 55" compared to the CX 48".

https://www.rtings.com/tv/tools/compare/lg-b9-vs-lg-cx/915/10619?usage=11114&threshold=0.1

I have no doubt that the CX is the better TV, but by how much? Turns out... not much at all. They're neck-and-neck for brightness and HDR performance. The CX also has 120Hz BFI. Beyond that, the only thing you're really paying for with the CX is the more color-accurate screen out of the box. However, once calibrated, they're virtually identical.

I'm starting to think that the most notable feature of the CX compared to last year's B9/C9 is... there's a 48" model. I think I have a bit of buyer's remorse now going with the CX 55". I could have saved $500 if I had gone with the B9.
 
The C9 was already down to $1200 from a few authorized resellers last November, but it's back up to $1500 for now. I'm expecting the 55" C9 and possibly the E9 to be on sale in November, down to at least $1200 if not $1000, to dump stock. That's probably what I'm going with, in a 55". Hopefully the Nvidia 3080 Ti will be out by then or around the same time, so we'll know what we're dealing with on the GPU end too.
 



1:36 - 2:00 convinced me to get the 65C9 instead of the B9; also, TruMotion works worse on the B9. But overall this is more important for watching movies than for gaming, I think. I still plan to buy the 48CX for my desk when it hopefully hits the $1000 mark next year.
 
The C9 was already down to $1200 from a few authorized resellers last November, but it's back up to $1500 for now. I'm expecting the 55" C9 and possibly the E9 to be on sale in November, down to at least $1200 if not $1000, to dump stock. That's probably what I'm going with, in a 55". Hopefully the Nvidia 3080 Ti will be out by then or around the same time, so we'll know what we're dealing with on the GPU end too.

I think the reason the price went back up was due to COVID-19. Retailers overnight lost nearly all of their traffic, so they probably saw it as a way to keep their revenue stream up a little bit. This is complete speculation on my part; I have no evidence of this. Just guessing.



1:36 - 2:00 convinced me to get the 65C9 instead of the B9; also, TruMotion works worse on the B9. But overall this is more important for watching movies than for gaming, I think. I still plan to buy the 48CX for my desk when it hopefully hits the $1000 mark next year.


I hate motion interpolation. It creates strange artifacts on screen and makes the picture look unnatural. For gaming, there isn't really a good use case for it as it just adds input latency. However, I do understand your point on this.
 

I also would never use it for gaming. I only use it at a very low level for movies, because I can't stand the 24fps stutter.
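That stutter is simple arithmetic, sketched below (assuming naive pulldown, no interpolation): 60 isn't a multiple of 24, so a 60Hz panel holds alternate film frames for 3 then 2 refreshes (the classic 3:2 pulldown), while 120Hz divides evenly into a smooth 5:5 cadence.

```python
# 24 fps film cadence on fixed-refresh displays.
for hz in (60, 120):
    refreshes = hz / 24
    even = refreshes.is_integer()
    print(f"{hz:3d} Hz: {refreshes} refreshes per film frame -> "
          f"{'even cadence' if even else 'uneven 3:2-style pulldown (judder)'}")
```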
 
I too was one of those people that said HDR was a gimmick in the beginning. I said give it two or three years. I do see now how it is becoming a real thing. Games are finally starting to take advantage of it, although they are still just a few. For me though it's full HDR or nothing. If I'm paying for HDR then give me HDR 1000 or not at all. Just how I feel about it. HDR 400 just seems so cheap and, well, not true HDR. Like I'm being ripped off and overcharged. I think the same about HDR 600 as well, but I do understand it's better than 400.

To be honest, I'm more interested in G-Sync than HDR in monitors and TVs. I'm slowly coming around. I still want to see more people with the CX 48 before I agree whether it's good or not. Professional reviews are fine, but I like to hear from actual users to form an informed opinion. It is looking good so far from what I have seen. Obviously it's not for everyone though. I have to respect that.
 
On top of that HDR simply isn't as big of a visual impact as going from 60fps to 90fps or doubling resolution. And on top of that you can legitimately make the argument that some HDR effects don't even look visually appealing whereas more resolution and higher framerates are basically an objective improvement.

I will say this, though: variable refresh really isn't a big deal unless you're into video game emulators where you need to be able to run games at different refresh rates to move smoothly. For modern PC games it simply doesn't matter much either way. Variable refresh is kind of a niche feature.
Respectfully, I couldn't disagree more. HDR 1000 is transformative and often revolutionary to me. The difference is startling. I've been PC gaming for over 20 years now and been through it all in regards to monitors, and nothing has compared to high-end HDR. 1080p to 1440p with G-Sync was a big deal, but I think I would rather have high-end HDR over it.
 
In which games/movies/shows does HDR shine the most?
 

My test case for HDR movies is Avengers: Infinity War (4k Blu-Ray), specifically 2 scenes.

  1. The introduction in "Space" for The Guardians of the Galaxy (If you don't like "Rubber Band Man", I officially hate you.)
  2. The first battle scene with Vision and Scarlet Witch, specifically the part where Vision fires off a beam through the "Mind Stone"

Both scenes are dark with flashes of light. Only OLED TVs and LED TVs with great local dimming will do well here. Otherwise, it will either look muted or over-brightened.

My test case for gaming in HDR is Final Fantasy VII Remake. However, I have found that the only way this game looks good is to turn off the "game mode" of the TV and set local dimming to max (or for OLED, turn brightness to 100). For any game that requires fast reaction speed, I turn off HDR.

I have yet to find a TV show that I like for HDR. The Grand Tour does HDR pretty well, but not great.

Hope this helps. :)
 

I've already posted this before but FF7 Remake is not the best example of an HDR game.



COD:MW is actually what you want to use as an HDR benchmark



COD:MW for me is one of the few HDR games that really blew me away; it's too bad not every game I played with HDR has given me the same feeling. I was extremely disappointed by Red Dead Redemption 2 and Monster Hunter World: Iceborne. HDR for gaming has been quite hit or miss for me.
 
I plan to re-watch Infinity War, so I will look for those scenes!

I was pleasantly surprised with the HDR in Carnival Row. There are a lot of very dark scenes with candlelight or torches. I watched it during the day in a bright room but this is one show that needs to be watched in a dark room!

I wish Picard were in HDR on platforms other than Apple TV, because I believe that would be a great HDR test.
 
In which games/movies/shows does HDR shine the most?

I haven't watched any movies in HDR, but Horizon Zero Dawn, God of War, Far Cry 5, Forza Horizon, AC Odyssey, etc.; the list goes on. I have had both the X27 and PG27UQ and many different OLEDs and recent very high-end LCD TVs. Something about the smaller 27" HDR monitors really kicks it up a gear... as much as I want the 48CX, I'm really looking forward to something like the PG32UQX once they come with HDMI 2.1.
 


FF7 Remake is not perfect, for sure, but what I like about it is that it's rendered to HDR 1000 and the majority of the game is dark scenes with bright highlights. This is completely subjective, but I think it looks amazing.

I've played it on both my CX and Q90R, and it looks gorgeous on both... again, in my humble opinion.

Also, IMO, while I'm sure Modern Warfare looks great in HDR, I would rather play it in SDR simply because it is a fast moving first person shooter. I don't want super deep blacks and extreme highlights. I'd rather have a scene which lumps information into a rather narrow color spectrum so that my eyes don't have to constantly adjust to the screen's brightness.
 
Have any of you seen The Man in the High Castle in HDR? It's used in the stupidest way. They have scenes where characters are inside a house in front of windows, with the sun shining in full blast directly through the window, so you have to squint to see the character talking. OLED contrast makes it possible to still see all the detail while the character is speaking, but WTF. Just because you can doesn't mean you should.

There are gimmicky ways to use HDR, but HDR is not just a gimmick. Bright HDR highlights can make a picture look much, much more realistic than what is possible with SDR. HDR is one of the few things in screen technology that can still be vastly improved.

Resolution is already at the point of vastly diminishing returns, and framerate is starting to get there too, but there isn't a screen even close to capable of the brightness and contrast to make you believe it's a fireplace.

In 30 years people will play those Christmas yule log videos and you'll only know it's playing on a TV because you don't smell the wood burning.
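There's simple geometry behind the diminishing-returns point. A rough sketch (hypothetical viewing distances; ~60 pixels per degree is the usual 20/20-acuity rule of thumb) shows a 48" 4K panel already sits at or past that ceiling even at desk distances:

```python
import math

# Pixels per degree for a 48" 16:9 4K panel at a few viewing distances.
diag_in, h_px = 48, 3840
width_in = diag_in * 16 / math.hypot(16, 9)        # ~41.8" picture width

for dist_in in (36, 48, 72):
    h_fov = 2 * math.degrees(math.atan(width_in / 2 / dist_in))
    print(f'{dist_in}" away: {h_fov:5.1f} deg wide, {h_px / h_fov:5.1f} px/deg')
```

At three feet it's already ~64 px/deg, right at the acuity limit; brightness and contrast have far more headroom left than resolution does.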
 
HDR is definitely the future and has no drawbacks. It is simply the new standard combination of colorspace/gamma (and rudimentary hardware quality assurance in standards greater than HDR400) that will replace older standards.
Previously, colorspace/gamma was based on the technical limitations of CRT. While CRT was amazing tech for its time (nearly a century, more if you count experimentation/early products), it had severe limitations in this regard, and when other technologies like LCD started taking over, they initially had to emulate the CRT limitations in colorspace and gamma to fit into the market at all.
For the past 1.5-2 decades, some monitor manufacturers occasionally tried to sell wide gamut monitors (one of the puzzle pieces for HDR), but while technically better, the consumer ecosystem just wasn't ready for it and these were generally considered trash consumer-wise.

The move to HDR standards is basically the last death throes of CRT.

The HDR requirements can be found here:
https://displayhdr.org/performance-criteria-cts1-1/
As mentioned, the DisplayHDR 400 requirements are so lax that they essentially mean nothing. HDR 500/600 do have a corner black level requirement, which is probably why products at these tiers are nearly non-existent (if you can build one, you could probably build an HDR 1000 for the same effort), and HDR 1000 is where it really gets good.
Some have found ways to "cheat the system", like Samsung with their 10 vertical zones on some products, easily covering the four-corners-and-middle requirements for HDR 1000.
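For quick reference, here's a hedged summary of the headline peak-luminance figure each tier is named after (from memory, not from the thread; check the displayhdr.org link above for the full CTS, which also covers black level, gamut, and dimming):

```python
# DisplayHDR tiers and the minimum peak luminance (cd/m^2) in their names.
peak_nits = {
    "DisplayHDR 400": 400,      # so lax it barely means anything, as noted above
    "DisplayHDR 500": 500,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
    "DisplayHDR 1400": 1400,
    "DisplayHDR True Black 400": 400,   # OLED-oriented tiers with strict blacks
    "DisplayHDR True Black 500": 500,
}
for tier, nits in peak_nits.items():
    print(f"{tier}: peak >= {nits} nits")
```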

Personally, after nearly 2 years of HDR 1000 on the PG27UQ, I hold HDR among the most impactful new technologies of the past decades, alongside SSDs and G-Sync.
 
HDR is as transformative as high refresh rate. Anyone who says otherwise is either blind or hasn't experienced the content on a capable display, which together covers 90% of people.

So you admit that for 90% of humans it's worthless and therefore less impactful than resolution and refresh rate.

Glad we're all in agreement!

Here's the thing. People admit that there's basically hardly any good HDR content. By definition, any tech that's completely content dependent is not going to be as impactful as tech that's content independent.

Resolution makes fucking 1993's Doom look better. Refresh rate makes 1993's Doom look better. It's content independent. It literally applies universally to all software.

HDR just isn't a big deal yet, and it won't be for probably a decade if not more.
 
So you admit that for 90% of humans it's worthless and therefore less impactful than resolution and refresh rate.

As for PC gamers, they still mainly play on monitors with crippled HDR support or without any HDR support at all. No wonder these people still don't care or even think it's worthless.

Here's the thing. People admit that there's basically hardly any good HDR content.

There is enough good HDR content to make it absolutely worth getting a capable display right now.

HDR just isn't a big deal yet, and it won't be for probably a decade if not more.

LOL.
 
You're reaching pretty hard to justify not getting glasses or spending money on a better display.
You know, my wife is blind -- I've been looking for lower resolution monitors for her :ROFLMAO:

[and threatening to drop her off at the laser surgery place...]
 

If you decide to do this, video the process... for science... the fact that she's beating your butt and it will be quite entertaining is definitely not the main reason to video her. ;)
 
Just to throw this out there, such an action would not result in virtual violence in game, but physical violence...

I'm more interested in ensuring that she can do whatever work she wishes :).

(I saw a 34" 2560x1080 panel, marketed for gaming with 144Hz VRR, but also IPS with good color response and otherwise good specs -- to me, this would probably be perfect for her)
 
You're reaching pretty hard to justify not getting glasses or spending money on a better display.

Chances are, though, glasses won't help and you are clinically blind when it comes to PQ, like many are. In that case, enjoy the fact that a $150 display looks the same to you as a $1500 one.

On another note, for people actually interested in this TV instead of crying and whining about HDR, B&H has preorders up:

https://www.bhphotovideo.com/c/product/1555062-REG/lg_oled48cxpua_cxpua_48_class_hdr.html

This means Amazon may come soon which is where I'm aiming to buy.

Oh, snap. Thanks for the heads up because I haven't been checking for this. I assumed that the whole COVID-19 thing was going to mean that these were delayed since things had been quiet.

I agree with the majority that HDR isn't fully mature yet, but it ain't gonna be 10 years. Anyone not buying a display with capable HDR is in denial or just uninformed. It's coming, and it's currently awesome when properly implemented like in the examples posted earlier.
 
So you admit that for 90% of humans it's worthless and therefore less impactful than resolution and refresh rate.

Glad we're all in agreement!

Here's the thing. People admit that there's basically hardly any good HDR content. By definition, any tech that's completely content dependent is not going to be as impactful as tech that's content independent.

Resolution makes fucking 1993's Doom look better. Refresh rate makes 1993's Doom look better. It's content independent. It literally applies universally to all software.

HDR just isn't a big deal yet, and it won't be for probably a decade if not more.



Doom, Quake, and Tomb Raider with polygon smoothing were all dependent on Glide support at first though, and OpenGL. Doing it properly also required a pass-through add-on GPU. But mainly, games needed to be patched to get polygon smoothing, so it was content dependent in that respect.

There are games that are modded to use ray tracing too, including Doom I think. And plenty of games that don't have it, so that again is content dependent.

Even high-res textures are not only dependent on having a native resolution high enough to view them at, but also on whether the game has a high-res texture pack available or included (or modded).

I'll also mention that even though 120Hz has been around a while, many games were locked at 60fps or were "broken" at higher Hz. Some still are.

---



I can remember when everyone was still watching VHS tapes. There were DVDs starting to pop up for sale at Circuit City, and a few rental places had some. There were very expensive DVD players, somewhat cost prohibitive back then, but Creative had a DVD drive with an MPEG add-on card that was more affordable if you already had a PC to install it into... so guess what - I got that DVD drive and was watching "The 5th Element", "Army of Darkness", and a few other movies and rentals over S-Video in all their 720p glory.

Then people said the same thing about 1080p content. Not necessary, won't be a thing for years, etc. etc. I can remember when one of the few TV shows at 1080p was The Tonight Show over an OTA antenna, and there were a few expensive Blu-ray discs for purchase or rental at certain stores... but guess what - I had a 1080p TV and was watching animal segments on The Tonight Show, renting or buying a few Blu-rays, and then I got VOOM satellite TV service with a handful of HD channels.

A lot of people crapped on 120Hz gaming monitors at first. I was one of them. They were a lot more expensive, not every game could even do 120Hz without breaking physics, and without a high frame rate the benefits weren't appreciable. But guess what - I got an ~$800 120Hz 1080p Samsung 27" on sale for $650 at one point and never looked back.

1440p was also considered frame-rate killing, and 27" monitors were considered expensive and not worth it compared to 1080p (and smaller) monitors at one point. G-Sync/VRR was considered a gimmick.


Wiki:
"Theaters began projecting movies at 4K resolution in 2011.[58] Sony was offering 4K projectors as early as 2004.[59] The first 4K home theater projector was released by Sony in 2012.[60]

Sony is one of the leading studios promoting UHDTV content, as of 2013 offering a little over 70 movie and television titles via digital download to a specialized player that stores and decodes the video. The large files (≈40 GB), distributed through consumer broadband connections, raise concerns about data caps.[61]

In 2014, Netflix began streaming House of Cards, Breaking Bad,[62] and "some nature documentaries" at 4K to compatible televisions with an HEVC decoder. Most 4K televisions sold in 2013 did not natively support HEVC, with most major manufacturers announcing support in 2014.[63] Amazon Studios began shooting their full-length original series and new pilots at 4K resolution in 2014.[64] They are currently available through Amazon Video.[65]

In March 2016 the first players and discs for Ultra HD Blu-ray—a physical optical disc format supporting 4K resolution and HDR at 60 frames per second—were released.[66]

On August 2, 2016 Microsoft released the Xbox One S, which supports 4K streaming and has an Ultra HD Blu-ray disc drive, but does not support 4K gaming.[67] On November 10, 2016 Sony released the PlayStation 4 Pro, which supports 4K streaming and gaming,[68] though many games use checkerboard rendering or are upscaled to 4K.[69] On November 7, 2017 Microsoft released the Xbox One X, which supports 4K streaming and gaming,[70] though not all games are rendered at native 4K.[71]"
https://en.wikipedia.org/wiki/4K_resolution#cite_note-:8-71

So 4K sets showed up for consumers around 2013, Netflix started streaming some 4K content in 2014, UHD discs arrived in 2016, and 4K content was pretty easy to get by 2016-2017.


On April 8, 2015, The HDMI Forum released version 2.0a of the HDMI Specification to enable transmission of HDR. The Specification references CEA-861.3, which in turn references the Perceptual Quantizer (PQ), which was standardized as SMPTE ST 2084.[24] The previous HDMI 2.0 version already supported the Rec. 2020 color space.

On June 24, 2015, Amazon Video was the first streaming service to offer HDR video using HDR10 Media Profile video.[90][91]

On November 17, 2015, Vudu announced that they had started offering titles in Dolby Vision.[92]

On March 1, 2016, the Blu-ray Disc Association released Ultra HD Blu-ray with mandatory support for HDR10 Media Profile video and optional support for Dolby Vision.[62][63]

On April 9, 2016, Netflix started offering both HDR10 Media Profile video and Dolby Vision.[93]

On July 6, 2016, the International Telecommunication Union (ITU) announced Rec. 2100 that defines two HDR transfer functions—HLG and PQ.[38][39]

On July 29, 2016, SKY Perfect JSAT Group announced that on October 4, they will start the world's first 4K HDR broadcasts using HLG.[94]

On September 9, 2016, Google announced Android TV 7.0, which supports Dolby Vision, HDR10, and HLG.[80][95]

On September 26, 2016, Roku announced that the Roku Premiere+ and Roku Ultra will support HDR using HDR10.[96]

On November 7, 2016, Google announced that YouTube would stream HDR videos that can be encoded with HLG or PQ.[97][82]

On November 17, 2016, the Digital Video Broadcasting (DVB) Steering Board approved UHD-1 Phase 2 with a HDR solution that supports Hybrid Log-Gamma (HLG) and Perceptual Quantizer (PQ).[77][98] The specification has been published as DVB Bluebook A157 and will be published by the ETSI as TS 101 154 v2.3.1.[77][98]

On January 2, 2017, LG Electronics USA announced that all of LG's SUPER UHD TV models now support a variety of HDR technologies, including Dolby Vision, HDR10, and HLG (Hybrid Log Gamma), and are ready to support Advanced HDR by Technicolor.

On September 12, 2017, Apple announced the Apple TV 4K with support for HDR10 and Dolby Vision, and that the iTunes Store would sell and rent 4K HDR content.[48]


There was little DVD content available, then lots of DVDs, and players became more affordable. There were few 1080p broadcasts and Blu-rays were expensive, with small areas in rental stores stocking a handful of them. Same with 4K. It didn't take a decade in each case; I think that is being pessimistic about the timeline on HDR from today's date going forward.

The thing that held back HDR, just like DVD, 1080p/Blu-ray, 4K, 120Hz gaming monitors, G-Sync monitors, and perhaps even VR, all at first... is that people want cheap, and profiteers want to sell cheap-ass LCDs to them. The fewer people buying into the tech, the less content is made for it (e.g. VR initially; it probably took a console-priced ~$400 Oculus Quest to start breaking out). Also, the smaller the adoption rate, the longer it takes to get standardized (see HD DVD vs Blu-ray). HDR is taking a bit longer because manufacturers are pushing bogus HDR TVs onto people, milking their inadequate-for-HDR displays as long as possible.
 
You're reaching pretty hard to justify not getting glasses or spending money on a better display.

Chances are, though, glasses won't help and you are clinically blind when it comes to PQ, like many are. In that case, enjoy the fact that a $150 display looks the same to you as a $1500 one.

On another note, for people actually interested in this TV instead of crying and whining about HDR, B&H has preorders up:

https://www.bhphotovideo.com/c/product/1555062-REG/lg_oled48cxpua_cxpua_48_class_hdr.html

This means Amazon may come soon which is where I'm aiming to buy.

That's funny given that I'm going to be buying a CX when it comes out (but not for HDR). Can't wait to read the psychoanalysis of that statement.
 
I could immediately understand the superiority of HDR on a 10-bit OLED display; not sure why I'd need to convince myself otherwise. Sony's first-party games and 4K/HDR animated movies really blew me away, and PC games are getting more consistent. I can certainly still enjoy SDR content on my 8-bit OLED when I have to. Honestly, it's ridiculous for anyone to argue otherwise.

Let’s create a new thread for the merits of HDR, shall we?
 