LG 48CX

Well, you technically could use it at 120Hz, BUT at 4:2:0 chroma, which equally sucks for desktop usage. I'm just going to keep using my Acer X27 as my desktop display for the foreseeable future while the CX gets used for all gaming.



Exactly. People just need to hold their horses and wait instead of crying about it when it doesn't even matter right now due to the lack of HDMI 2.1 cards in the first place.

Now I have to wait on Nvidia because LG's HDMI 2.1 isn't 48Gbps, and wait on LG for firmware updates adding uncompressed audio support, if those ever come.

So yep, I need to wait because LG fails to support the full HDMI 2.1 spec out of the gate. No buying this display "early," secure in its HDMI 2.1 capabilities, for me.

I'll have to consider a 55" C9 if they never deliver. I'm not paying $1630 for a (edit: "hobbled") HDMI 2.1 port TV when there are others with full support available.

Hobble:
  1. tie or strap together (the legs of a horse or other animal) to prevent it from straying.
    • cause (a person or animal) to limp.
      "Johnson was still hobbled slightly by an ankle injury"
    • restrict the activity or development of.
      "cotton farmers hobbled by low prices"
 
Now I have to wait on Nvidia because LG's HDMI 2.1 isn't 48Gbps, and wait on LG for firmware updates adding uncompressed audio support, if those ever come.

So yep, I need to wait because LG fails to support the full HDMI 2.1 spec out of the gate. No buying this display "early," secure in its HDMI 2.1 capabilities, for me.

I'll have to consider a 55" C9 if they never deliver. I'm not paying $1630 for a gimped HDMI 2.1 port TV when there are others with full support available.
The 2.1 spec allows for lane speeds of 10 Gbps (4 lanes x 10 Gbps = 40 Gbps), so LG is still following the HDMI standard. There is nothing "gimped" about it (can we take that word out of our lexicon?), as the link speed still fully supports the features of the TV.
 
Now I have to wait on Nvidia because LG's HDMI 2.1 isn't 48Gbps, and wait on LG for firmware updates adding uncompressed audio support, if those ever come.

So yep, I need to wait because LG fails to support the full HDMI 2.1 spec out of the gate. No buying this display "early," secure in its HDMI 2.1 capabilities, for me.

I'll have to consider a 55" C9 if they never deliver. I'm not paying $1630 for a gimped HDMI 2.1 port TV when there are others with full support available.

You are stuck waiting regardless, because even if the LG TV DID support 48Gbps, well, guess what? There are still no HDMI 2.1 cards out anyway, so how did you plan on getting 444 10-12bit if you did buy the TV immediately at launch? o_O I really don't see the big deal in waiting. As for the HDMI audio thing, yeah, I suppose that's the real bummer for those who use it.
 
One of my points is that I don't have the option of buying the CX TV "ahead of time" if it's not certain to end up with full uncompressed 10bit video source support and full uncompressed audio source support. I'm not painting myself into a corner, support-wise, on a $1630 TV/monitor and hoping for the best. If LG and Nvidia don't come through on full support for 10bit as well as audio standards pass-through, I'd consider this TV hobbled compared to the C9 models and I won't buy it at all.

HDMI 2.1 was supposed to be a sure thing, and this display was supposed to break out of all of the previous bandwidth/support limitations in a slightly more digestible size. It's disappointing that these issues are even in question, and it'll be even more disappointing if it doesn't pan out in the long run.
 
One of my points is that I don't have the option of buying the CX TV "ahead of time" if it's not certain to end up with full uncompressed 10bit video source support and full uncompressed audio source support. I'm not painting myself into a corner, support-wise, on a $1630 TV/monitor and hoping for the best. If LG and Nvidia don't come through on full support for 10bit as well as audio standards pass-through, I'd consider this TV hobbled compared to the C9 models and I won't buy it at all.

HDMI 2.1 was supposed to be a sure thing, and this display was supposed to break out of all of the previous bandwidth/support limitations in a slightly more digestible size. It's disappointing that these issues are even in question, and it'll be even more disappointing if it doesn't pan out in the long run.

The whole 10bit thing will get resolved with Ampere cards. If Ampere comes out and it doesn't, then yeah, everyone should definitely be raising their pitchforks at Nvidia, but for now, seriously, these complaints about no 444 10bit support need to stop because nobody is getting it right now anyway. WAIT until Ampere cards come out and see whether or not 444 10bit support comes with them before making any more Nvidia complaints. I agree the HDMI audio thing sucks, even though I don't use HDMI audio myself, so that's a valid thing to raise concerns about.
 
One of my points is that I don't have the option of buying the CX TV "ahead of time" if it's not certain to end up with full uncompressed 10bit video source support and full uncompressed audio source support. I'm not painting myself into a corner, support-wise, on a $1630 TV/monitor and hoping for the best. If LG and Nvidia don't come through on full support for 10bit as well as audio standards pass-through, I'd consider this TV hobbled compared to the C9 models and I won't buy it at all.

HDMI 2.1 was supposed to be a sure thing, and this display was supposed to break out of all of the previous bandwidth/support limitations in a slightly more digestible size. It's disappointing that these issues are even in question, and it'll be even more disappointing if it doesn't pan out in the long run.
What is disappointing is people getting up in arms over something that will not even affect the performance of this TV.
 
What are you people complaining about? Who cares if it is 40Gbps bandwidth? Is 12 bit color going to make a difference on a 10 bit panel? Which is what these TVs are. The audio part sucks but can be fixed if LG ever gets around to it. I am 100% positive that the next gen cards from both Nvidia and AMD will have HDMI 2.1 support. I can't wait to see it in all its glory on my C9.
 
What are you people complaining about? Who cares if it is 40Gbps bandwidth? Is 12 bit color going to make a difference on a 10 bit panel? The audio part sucks but can be fixed if LG ever gets around to it. I am 100% positive that the next gen cards from both Nvidia and AMD will have HDMI 2.1 support. I can't wait to see it in all its glory on my C9.

People are just losing their minds because Nvidia CURRENTLY doesn't support 10bit output through HDMI, yet I think they fail to realize that even if Nvidia DID have 10bit HDMI output right now at this very moment, it still doesn't freaking matter at all because you still can't get 10bit 444 without a damn HDMI 2.1 card. So instead of just waiting for the eventual support that we can all pretty much assume with good certainty is coming with the next gen cards, they'd rather start raising their pitchforks at Nvidia right now.
 
What are you people complaining about? Who cares if it is 40Gbps bandwidth? Is 12 bit color going to make a difference on a 10 bit panel? The audio part sucks but can be fixed if LG ever gets around to it. I am 100% positive that the next gen cards from both Nvidia and AMD will have HDMI 2.1 support. I can't wait to see it in all its glory on my C9.
Not trying to keep beating a dead horse, but since you might not understand the 12bit/10bit concern on the CX once HDMI 2.1 GPUs doing 4k 120Hz are out, I'll repeat it again:

-You can't show 10bit material on a 10bit display from an Nvidia card over HDMI without it being able to send 12bit instead; 10bit over HDMI is not supported by Nvidia currently.
-The LG CX will be unable to accept a 48Gbps 12bit 4k 444 120Hz signal since it is only 40Gbps, so at that point you wouldn't be able to just send a 12bit 4k 444 signal anymore like you could at 60Hz or lower.
-We are all assuming that Nvidia will support sending 10bit 4k 444 120Hz off of their HDMI 2.1 GPUs, and while it's likely, it hasn't been confirmed.

...Uncompressed video - 10bit 120Hz 444 at native rez over HDMI is not certain as of now. That is required to run the source material 1:1 rather than dithering 8bit (banding/losing some source fidelity), DSC-compressing it, lowering chroma, or lowering rez. (Rough bandwidth math is sketched below.)
...Uncompressed audio format pass-through like other HDMI 2.1 TVs is also uncertain - so you can't get unaltered uncompressed audio sources via the TV's eARC like you can on other HDMI 2.1 models.
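
To put rough numbers on that, here's a minimal back-of-the-envelope sketch (my own math, not LG's or Nvidia's figures) of the raw, uncompressed pixel data rate for 4k 120Hz 444 at different bit depths. It deliberately ignores blanking intervals and HDMI 2.1's FRL link-encoding overhead, which is what pushes the real requirements up to the numbers people usually quote: on the order of 40Gbps for 10bit and the full 48Gbps for 12bit.

Code:
# Rough uncompressed video payload estimate (pixel data only).
# Real HDMI signaling adds blanking and FRL encoding overhead on top of this,
# so treat these as lower bounds rather than exact link requirements.

def raw_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Raw pixel payload in Gbit/s for a full 4:4:4 / RGB signal."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

for bpc in (8, 10, 12):
    print(f"4k 120Hz 444 {bpc}bit: ~{raw_gbps(3840, 2160, 120, bpc):.1f} Gbps raw")

# 4k 120Hz 444 8bit: ~23.9 Gbps raw
# 4k 120Hz 444 10bit: ~29.9 Gbps raw
# 4k 120Hz 444 12bit: ~35.8 Gbps raw

That gap between the 10bit and 12bit rows, once overhead is added, is roughly the gap between the CX's 40Gbps ports and the C9's 48Gbps ports.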

-------------------------------------------------.

So yes, it'll be a long wait, where otherwise "full" HDMI 2.1 support could have been a certainty for buying the CX "ahead of time" - before the HDMI 2.1 GPUs drop (and the Ti GPUs I'd be interested in sometimes come out a bit later at that).

I might end up with a (55") C9 too vegeta. Will see how it goes time wise.
 
People are just losing their minds because Nvidia CURRENTLY doesn't support 10bit output through HDMI, yet I think they fail to realize that even if Nvidia DID have 10bit HDMI output right now at this very moment, it still doesn't freaking matter at all because you still can't get 10bit 444 without a damn HDMI 2.1 card. So instead of just waiting for the eventual support that we can all pretty much assume with good certainty is coming with the next gen cards, they'd rather start raising their pitchforks at Nvidia right now.
That's how I understood it since I bought my C9. Been waiting for the 3080 Ti. I knew I wasn't getting the best out of my C9 until then.
 
I wonder if it's a better idea to just wait until November to buy. The thought of having to use it @ 60hz on desktop for 5-6 months makes me want to puke.
And using an OLED TV for desktop with its burn-in issues doesn't?
There will still be the option to use 120Hz on the desktop, just not at 10bit.
 
And using an OLED TV for desktop with its burn-in issues doesn't?
There will still be the option to use 120Hz on the desktop, just not at 10bit.

What does burn-in have to do with any of this? Why are people obsessed with randomly interjecting with burn-in? It's already been discussed 10x too many times, and if you care about this display at all, you don't give a shit about it.
 
What does burn-in have to do with any of this? Why are people obsessed with randomly interjecting with burn-in? It's already been discussed 10x too many times, and if you care about this display at all, you don't give a shit about it.


Right. I ran my C7 65" for the first year on a GTX 750 Ti. The chroma sub-sampling was a lot less noticeable than running the desktop at 1080p native 4:4:4.

I can barely tell the difference running my GTX 960 at 4:4:4 (but I'm viewing it from ten feet away). It would barely be noticeable at normal monitor distances (unless you're viewing photos).

Video will already be chroma sub-sampled, and in video games it's going to be hard to make out those missing details (moving images are harder to make out).

If you don't view the same content constantly with high-contrast borders, you will never have burn-in on an OLED. In addition, to prevent burn-in from the Windows taskbar, I have a black screensaver set up.
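
If you'd rather script that black/blank screensaver setup than click through the Settings UI, something like this minimal sketch should do it on Windows 10 (my assumption here is the stock scrnsave.scr blank screensaver and the long-standing per-user values under HKCU\Control Panel\Desktop; you may need to sign out and back in before it takes effect):

Code:
# Minimal sketch: point Windows at the built-in blank screensaver via the
# classic per-user registry values. Assumes Windows 10 and stock scrnsave.scr.
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop", 0,
                    winreg.KEY_SET_VALUE) as key:
    # The built-in "Blank" screensaver that ships with Windows.
    winreg.SetValueEx(key, "SCRNSAVE.EXE", 0, winreg.REG_SZ,
                      r"C:\Windows\System32\scrnsave.scr")
    winreg.SetValueEx(key, "ScreenSaveActive", 0, winreg.REG_SZ, "1")
    # Idle time before it kicks in, in seconds (here: 5 minutes).
    winreg.SetValueEx(key, "ScreenSaveTimeOut", 0, winreg.REG_SZ, "300")

print("Blank screensaver configured for the current user.")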
 
What does burn-in have to do with any of this? Why are people obsessed with randomly interjecting with burn-in? It's already been discussed 10x too many times, and if you care about this display at all, you don't give a shit about it.

Or you have actual experience owning one and using it as a monitor for over two years, and you know that it's not an issue for your use case, and even if it DID happen, you can't stomach the thought of going back to dogshit LCD anyway. :wtf:
 
I'm going to use whichever one I get with a black wallpaper, transparent and hidden taskbar with even the minimized sliver removed via 3rd party app, and no desktop icons showing on the OLED. The OLED will be a gaming and media "stage", running games mostly but also movies/shows, youtube, game streams, even 4k+ art and image slideshows and perhaps some audio visualizations at times. I'll have other monitor(s) for static desktop/apps.

That's even if it has to end up being a C9 to be assured of uncompressed video and uncompressed audio.

Right. I ran my C7 65" for the first year on a GTX 750 Ti. The chroma sub-sampling was a lot less noticeable than running the desktop at 1080p native 4:4:4.

I can barely tell the difference running my GTX 960 at 4:4:4 (but I'm viewing it from ten feet away). It would barely be noticeable at normal monitor distances (unless you're viewing photos).

Video will already be chroma sub-sampled, and in video games it's going to be hard to make out those missing details (moving images are harder to make out).

If you don't view the same content constantly with high-contrast borders, you will never have burn-in on an OLED. In addition, to prevent burn-in from the taskbar, I have a black screensaver set up.

Your viewing distance would definitely matter for how visible the degradation to the original source material would be when using 8bit, 8bit dithered, or lowered chroma... something like how seeing the difference between 1080p, 1440p, 4k, and 8k native panels is relative to your viewing distance. Since these are going to be used as monitors by some of us at around a 4' viewing distance, rather than from 8' to 10' away at a couch, that could be very visible (even at couch distances people notice and complain about banding, though).

Lower chroma definitely affects text detail/fidelity, and so it will affect high-detail textures and photos, etc.; as shown in the examples below, it also affects graphics in games. Whether that bothers you or not is another matter.

--------------------------------------------------------------------------------------


8bit Banding vs 10bit on 10 bit content

"From Nvidia's own white paper in 2009: "While dithering produces a visually smooth image, the pixels no longer correlate to the source data "

From Nvidia's Studio drivers pages, showing the difference between 24bit (8bit per channel) color banding and 30bit (10bit per channel) color on a 10bit panel. You can use dithering to add "noise" as a workaround on 8bit in order to smooth/smudge/haze the banding out, but it will degrade from 1:1 "lossless" source material fidelity.

"By increasing to 30-bit color, a pixel can now be built from over 1 billion shades of color, which eliminates the abrupt changes in shades of the same color. "

[Images: Nvidia's 24-bit vs. 30-bit color banding comparison thumbnails - have to view them full size to really see it]
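
If you want to see where the banding comes from numerically, here's a tiny sketch of my own (just illustrative math, not from Nvidia's material) that quantizes a full-screen-width gradient at 8bit vs 10bit and works out how wide each visible band ends up:

Code:
# Quantize a 0..1 horizontal gradient across a 3840-pixel-wide screen and see
# how many pixels end up sharing each code value, i.e. how wide each band is.
WIDTH = 3840

def band_width(bits_per_channel):
    levels = 2 ** bits_per_channel            # 256 codes at 8bit, 1024 at 10bit
    gradient = [x / (WIDTH - 1) for x in range(WIDTH)]
    codes = [round(v * (levels - 1)) for v in gradient]
    distinct = len(set(codes))
    return distinct, WIDTH / distinct         # codes used, avg pixels per band

for bits in (8, 10):
    used, px = band_width(bits)
    print(f"{bits}bit: {used} distinct steps, ~{px:.1f} px wide bands")

# 8bit: 256 distinct steps, ~15.0 px wide bands
# 10bit: 1024 distinct steps, ~3.8 px wide bands

Fifteen-pixel-wide solid steps across a smooth sky or fog gradient are exactly the bands in those comparison images; dithering hides them by trading banding for noise, which is the "no longer correlates to the source data" part of the quote above.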

-------------------------------------------------------------------------
Chroma, Lowering Chroma, Movies, etc
-------------------------------------------------------------------------


Similarly, downgrading the signal to lower chroma means it no longer matches any 4:4:4 source material.
4:4:4 has no chroma subsampling and transports both the luminance and color data entirely. In a four-by-two array of pixels:
4:2:2 has half the chroma information of 4:4:4
4:2:0 has a quarter of the color information
(A small sketch of the sample counts per block follows below.)
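
Here's a minimal way to see those ratios; this is just the standard J:a:b sample counting for a 4x2 block, nothing specific to these displays:

Code:
# Sample counts per 4x2 block of pixels (the block the J:a:b notation describes).
# Luma (Y) is always one sample per pixel; only the chroma planes get thinned.
BLOCK_PIXELS = 8  # 4 wide x 2 tall

# (chroma samples in row 1, chroma samples in row 2) for each of Cb and Cr
SCHEMES = {"4:4:4": (4, 4), "4:2:2": (2, 2), "4:2:0": (2, 0)}

for name, (row1, row2) in SCHEMES.items():
    chroma_per_plane = row1 + row2
    total = BLOCK_PIXELS + 2 * chroma_per_plane      # Y + Cb + Cr samples
    print(f"{name}: {total} samples per block, "
          f"{chroma_per_plane / BLOCK_PIXELS:.0%} of the chroma, "
          f"{total / 24:.0%} of the 4:4:4 data")

# 4:4:4: 24 samples per block, 100% of the chroma, 100% of the 4:4:4 data
# 4:2:2: 16 samples per block, 50% of the chroma, 67% of the 4:4:4 data
# 4:2:0: 12 samples per block, 25% of the chroma, 50% of the 4:4:4 data

So at the same bit depth, 4:2:0 is literally half the data of 4:4:4, which is also roughly why HDMI 2.0's 18Gbps can fit 4k 120Hz at 4:2:0 but not at 4:4:4.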

"smoother line edges mean that text on a contrasty background might look fuzzy or unfocused, but this is also an artifact that may be visible when comparing 4:2:2 to 4:4:4 video on a sharp computer monitor. "


BenQ:
" in anything that requires a lot of fine text the difference does emerge. That’s why PCs by default use full RGB, which employs 4:4:4 as its nominal sampling technique. Even 4:2:2 begins to make small letters appear smudged, and 4:2:0 has a rather apparent ghosting or rainbow effect around text. This is important if you choose a TV as your PC monitor – make sure the set and your PC operating system or graphics card driver all support 4:4:4. "


Rtings PC Text examples at different chroma on a samsung U7100 TV:

4k 60Hz PC MODE 4:4:4 (pic)

4k 30Hz Non-PC Mode 4:2:2 (pic)

4k 60Hz Non-PC Mode 4:2:0 (pic)


------------------------------------------------------------------------------
Game Pics at different chroma in this post from 2016 (none are HDR)
https://www.neogaf.com/threads/how-important-is-4-4-4-chroma-to-you.1308591/#post-223143263

[Image: in-game chroma subsampling comparison from the linked post]

Sure, you could deal with it. We could use lower native resolution displays like 1080p or 1440p similarly... but we still want higher resolution displays, right? And HDR when possible. I know I do.

[Image: Roboto text rendering comparison at different chroma settings]


-----------------------
Movies
-----------------------


UHD movies can be 10bit, 10bit HDR. Movies are mastered in 4:2:2 but distributed via 4:2:0 (so far). The HDR10 format also allows for mastering up to 10,000 nits, but most releases are graded at HDR 1000 or HDR 4000 (for now).

----------------------------------------------------------------
http://www.acousticfrontiers.com/uhd-101-v2/
  • 12 bit color. This refers to the color bit-depth. Old Blu-Ray is 8 bit, new UHD Blu Ray can support 10 bit. It’s not clear what bit depth is being used by the streaming services, but it’s probably 8 bit.
  • 4:4:4. This refers to color sub-sampling. Whilst 4:4:4 is used in content mastering, UHD is distributed via 4:2:0.
Historically the source upsamples to 4:2:2, which is sent over HDMI, and then the display upsamples to 4:4:4 and converts to RGB. The reason for this sequence is that HDMI v1.4 and previous iterations did not support 4:2:0. HDMI2.0 does support 4:2:0 though only at 50/60 frames per second (FPS). At 24 FPS 10 bit only 4:4:4 and RGB are supported.

--------------------------------------------------------------
Linked Comment from Reddit /r Monitors
On the industry side everything is 4:2:2, except in the VFX world, where everything is 4:4:4:4 (fourth channel being alpha, and full-range RGB being used instead of Rec 601/709/2020 YCbCr) and some chroma key workflows (though 4:2:2 is good enough for a clean key 95% of the time). 4:4:4 is becoming more prevalent with the increasing availability of RAW formats, but it's not regularly worked in.
Most major TV studios still finish in 4:2:2, though some air masters were 3:1:1 on HDCAM until recently. It's not until it hits distribution that it gets knocked down to 4:2:0.
Source: been doing video engineering work for several years, including work on major network and internationally distributed pieces. 4:2:2 has been standard since the introduction of Meridien and DVCPRO 50. Every place I've ever delivered to wants ProRes 422 HQ, but if you sweet talk BO&E they'll begrudgingly accept XDCAM50 until you can push a ProRes or DCP-wrapped JPEG-2000 archival master.
 
LG CX OLED TV Unboxing + Picture Settings - HDTVTest (video posted 3 hrs ago)

 
Even if the 3000 series doesn't support 10 bit over HDMI, and with all the other stupid issues people are complaining about with this TV, it will still be the best PC gaming display you can get by far. Your only alternative is a 55" C9.
Downhyping it and pretending like you aren't going to get it is just a ploy to deter demand enough that you can get one yourself. It's going to sell out like crazy.

I don't even want one until a 3000 series comes out but I feel like I'll need to pre-order as soon as it goes on sale just to get one by then.
 
Some people requested a C9 and CX comparison video in the comments of the HDTVTest video I posted a link to earlier. Vincent, the HDTVTest guy, asked for suggestions at the end, so I'm looking forward to the possibility of that, to compare things like their tone mapping, detail enhancement in dark areas, selective near-black grey dithering (or lack thereof) as a workaround to prevent flashing, how their 8bit dithered banding compares, etc.

I'll see what LG OLEDs in the C9 and CX lines are available around Black Friday time, whether the "3080 Ti" is out yet, and how its HDMI 2.1 support reviews. A 55" C9 could indeed be my choice depending on how it all pans out, since they have 48Gbps for a 12bit "guaranteed 10bit" signal off of Nvidia GPUs, and according to Rtings the C9 already supports full uncompressed HDMI audio formats over eARC. I'd have to make adjustments for an even farther away desk island and would probably lose a little more desktop real estate on the other monitor(s) in the array, since I'd probably need to bump up the scaling a bit more too. 48" would be preferred, but I'll have to wait to see whether full uncompressed video and full uncompressed audio source material support (1:1 without alteration) is available on the CX by then.



I'm absolutely not playing any downplaying games. The HDMI 2.1 feature support I mentioned on the CX is important to me; it's not stupid. Stupid to me would be buying the CX before support of both uncompressed video (at native 120Hz 4k 10bit 444) and unaltered uncompressed audio formats is confirmed to be coming eventually. <crickets>
One of my points is that I don't have the option of buying the CX TV "ahead of time" if it's not certain to end up with full uncompressed 10bit video source support and full uncompressed audio source support. I'm not painting myself into a corner, support-wise, on a $1630 TV/monitor and hoping for the best. If LG and Nvidia don't come through on full support for 10bit as well as audio standards pass-through, I'd consider this TV hobbled compared to the C9 models and I won't buy it at all.

HDMI 2.1 was supposed to be a sure thing, and this display was supposed to break out of all of the previous bandwidth/support limitations in a slightly more digestible size. It's disappointing that these issues are even in question, and it'll be even more disappointing if it doesn't pan out in the long run.
 
Some people requested a C9 and CX comparison video in the comments of the HDTVTest video I posted a link to earlier. Vincent, the HDTVTest guy, asked for suggestions at the end, so I'm looking forward to the possibility of that, to compare things like their tone mapping, detail enhancement in dark areas, selective near-black grey dithering (or lack thereof) as a workaround to prevent flashing, how their 8bit dithered banding compares, etc.

I'll see what LG OLEDs in the C9 and CX lines are available around Black Friday time, whether the "3080 Ti" is out yet, and how its HDMI 2.1 support reviews. A 55" C9 could indeed be my choice depending on how it all pans out, since they have 48Gbps for a 12bit "guaranteed 10bit" signal off of Nvidia GPUs, and according to Rtings the C9 already supports full uncompressed HDMI audio formats over eARC. I'd have to make adjustments for an even farther away desk island and would probably lose a little more desktop real estate on the other monitor(s) in the array, since I'd probably need to bump up the scaling a bit more too. 48" would be preferred, but I'll have to wait to see whether full uncompressed video and full uncompressed audio source material support (1:1 without alteration) is available on the CX by then.



I'm absolutely not playing any downplaying games. The HDMI 2.1 feature support I mentioned on the CX is important to me; it's not stupid. Stupid to me would be buying the CX before support of both uncompressed video (at native 120Hz 4k 10bit 444) and unaltered uncompressed audio formats is confirmed to be coming eventually. <crickets>

Not stupid if I always intended to use 4k120Hz 8bit SDR :)
 
Not calling you or anyone else stupid at all.
I'm saying my priorities for "full" HDMI support for both uncompressed source video and audio 1:1, on a display that would cost me $1630 out of pocket, are not stupid after someone else called them "stupid issues". Considering they are important to me, it would be "stupid" (unwise) of me to buy ahead of time hoping for support to pull through - with zero confirmation or even mention, AFAIK, from the manufacturers about a full uncompressed video support pipeline (Nvidia/LG) or uncompressed audio format pass-through (LG). Especially if it turns out either or both don't get supported fully and/or in a timely manner after the 3080 Ti drops, while a C9 does have those capabilities already. I'll have to wait it out and see.
 
1. We don't know if the LG 48CX will be good. We're just basing it on listed specs and what the C9s are like. 2. Nvidia has not said one word about 2.1 support in the next (3000) line of cards. So many people are just hoping for these things and forgetting it doesn't yet exist. My own opinion is to wait and see if these two things actually happen. Then get excited. Pretty soon we're going to find out about the LG 48CX, but for GPUs I'm thinking Q4, and that remains to be seen.
 
1. We don't know if the LG 48CX will be good. We're just basing it on listed specs and what the C9s are like. 2. Nvidia has not said one word about 2.1 support in the next (3000) line of cards. So many people are just hoping for these things and forgetting it doesn't yet exist. My own opinion is to wait and see if these two things actually happen. Then get excited. Pretty soon we're going to find out about the LG 48CX, but for GPUs I'm thinking Q4, and that remains to be seen.

Yeah, waiting sucks. Either way, I'll still get a 3080Ti and a CX unless the C9 55 drops to a point that I just go with that, wall mounting either way. 48 and 55 are both in my opinion not worth having on a desk surface. I'll just be disappointed in Nvidia, but that's already something I'm used to.
 
Yeah, waiting sucks. Either way, I'll still get a 3080Ti and a CX unless the C9 55 drops to a point that I just go with that, wall mounting either way. 48 and 55 are both in my opinion not worth having on a desk surface. I'll just be disappointed in Nvidia, but that's already something I'm used to.
Based off how previous models went, the '19 models will be only a few hundred cheaper than the '20 ones. Keep checking Slickdeals around fall time. I got my 65" C9 for $1800 shipped, which was about the best price I'd seen for it. The '18 models were around $1500 for a 65".
 
Look, it's a bummer that the LG 48CX doesn't fully support every single feature of HDMI 2.1, but let me ask you this... what similar-sized TV actually DOES? Hell, some TVs coming out in 2020 still use HDMI 2.0b and don't even have VRR!! So what exactly are your alternatives to the CX if you want a full-fat, loaded-up HDMI 2.1 TV? Or ANY HDMI 2.1 display, whether it's a TV or monitor, that has every single feature? NONE. Zip. Zilch. I don't even see a single PC monitor in existence that has HDMI 2.1, period. So yeah, LG may suck, but so does every other TV and monitor maker out there equally. If you want the perfect display you'll just have to keep waiting. For me the 48CX is just another stopgap to something like a desktop-sized MicroLED monitor, just like my Acer X27 was.
 
Yeah, waiting sucks. Either way, I'll still get a 3080Ti and a CX unless the C9 55 drops to a point that I just go with that, wall mounting either way. 48 and 55 are both in my opinion not worth having on a desk surface. I'll just be disappointed in Nvidia, but that's already something I'm used to.

I'm definitely taking a hard look at the C9 as an option now. Hopefully there will be a C9-to-CX comparison video by HDTVTest; I'm also awaiting the Rtings CX review and seeing how a 3000 series GPU works with both, by November hopefully.

If you go back to last November on Slickdeals searching for the 55" C9 OLED, you'll see there were several times they went for $1200 from authorized resellers (when brick-and-mortar stores were charging $1600 - $1800, I think). The pricing could have been due to overproduction back then, so I'm not counting on extreme sale prices, but historically near year end ~ Black Friday, prices drop around 22% on the current year's models. So potentially $1500 x .22 = $330 off ~> $1170 for the 48CX if there are some flash-in-the-pan BF deals in November. The 55" C9 is $1500 right now, so it should have a decent drop by the end of the year; I'd expect it to hit $1200 - $1300 again at least, if not better.

The pandemic could, however, bring up different factors: supply chain issues, or price gouging with the pandemic as an excuse and standing pat on prices while counting on cashing in on stimulus checks... or joblessness and economic fallout pushing things the other way toward pricing incentives... either way it could end up breaking out of the normal year-end and BF patterns. Who knows at this point.
 
AFAIK LG has changed the pixel structure somewhat on their OLEDs over the last couple of years; does this have an effect on what the optimal "Windows font settings" are for sharp text? The main reason for asking is that I have a feeling that my 55" C7 renders text better than my 55" GX, which was not what I expected. Now, due to physical reasons, I have not been able to actually compare them side by side, but I still have that distinct feeling when both are sent the same signal. Could it be that I need to change my settings with something like Better ClearType Tuner or similar to make text on the GX look better? On the C7, I find that RGB ClearType actually looks better than Grayscale, which is not in line with what is discussed here.

Also, what about contrast values for ClearType? I found that increasing it from the default 1200 to something like 2000 makes text look better (subjectively speaking), more like Grayscale but still in RGB.
 
What is the actual difference between changing the OLED light setting and changing the energy options? Both seem to make the screen dimmer/brighter, but I would assume there must be a difference between them?

How do we actually know that the 2019 OLEDs do support 48 Gbit/s; has LG stated it officially? I'm guessing that no one has actually succeeded in sending such a signal to the TV yet, so we can't confirm it that way?
 
AFAIK LG has changed the pixel structure somewhat on their OLEDs over the last couple of years; does this have an effect on what the optimal "Windows font settings" are for sharp text? The main reason for asking is that I have a feeling that my 55" C7 renders text better than my 55" GX, which was not what I expected. Now, due to physical reasons, I have not been able to actually compare them side by side, but I still have that distinct feeling when both are sent the same signal. Could it be that I need to change my settings with something like Better ClearType Tuner or similar to make text on the GX look better? On the C7, I find that RGB ClearType actually looks better than Grayscale, which is not in line with what is discussed here.

Also, what about contrast values for ClearType? I found that increasing it from the default 1200 to something like 2000 makes text look better (subjectively speaking), more like Grayscale but still in RGB.
And this is actually, in a way, a very relevant post!!!
It seems LG uses an RGBW pixel structure on their OLED panels, making them less than perfect for desktop use... as if burn-in was not enough reason this idea was silly anyway...

Not really sure about the C7 and GX, or the C9 and especially the CX, but generally text rendering with normal ClearType won't be perfect. Generally for ClearType you want an sRGB screen; not even a proper RGB pixel layout will give the intended results if your screen has a wide gamut.

For superior text rendering you can use a program called MacType.
It unfortunately does not work with all web browsers/programs. I personally use it, and for a web browser I use Cent Browser (Chromium-based), which is compatible. Previously I used Ezgdi and even earlier Gdi++. All these programs replace GDI font rendering with FreeType2 font rendering.
I highly recommend trying it out, but it requires some tinkering with settings for the best result (though the "iOS" profile in MacType is near perfect and is what I currently use) and some web browser configuration. It also has some performance impact. Not really such a big issue, but there it is.
 
Based off how previous models went, the '19 models will be only a few hundred cheaper than the '20 ones. Keep checking Slickdeals around fall time. I got my 65" C9 for $1800 shipped, which was about the best price I'd seen for it. The '18 models were around $1500 for a 65".

Same here actually, right around the holidays. The 48" is a great size, but considering I'll wall mount regardless, I could make a C9 55" work. I'm putting a couch in my office behind my desk chair so I could game back there if I need to be further back.
 
You can also put it on one of those slim, metal, belt-like pillar stands with a foot, which a few of us linked examples of earlier in the thread, if you don't want to wall mount it. Some of those stands seem pretty nice and can probably be adjusted more, plus there's the obvious modularity vs. bolting it to a wall.

--------------------------------------------------------------------------------------------------------------
A little info on how the WRGB structure works in general, not specifically in regard to text clarity:
---------------------------------------------------------------------------------------------------------------

From what I remember, LG uses an all-white OLED sub-pixel array (a yellow and blue emitter combo) behind color filters - R, G, B, and one transparent so the white shows through as the W. They probably use the additional white OLED as a "cheat" or "hack" - workaround if you prefer - to get higher effective brightness levels, since OLEDs are so limited in brightness vs. burn-in, especially at the higher HDR color brightness levels. It's part of the reason why these OLEDs' burn-in risk is so low and why LG was able to dominate OLED.

https://www.oled-info.com/lgs-wrgb-oled-tv-sub-pixels-captured-macro-photo

"Digital Versus managed to get a macro photo showing LG's WRGB OLED TV sub pixels. LG's structure uses four white sub pixels (made from yellow and blue emitters) with color filters on top: white (unfiltered), red, green and blue. As you can see from the photo, the white sub pixel is actually larger than the colored ones. The white sub pixel is added to increase brightness and efficiency. "

[Image: macro photo of LG's WRGB OLED sub-pixels]


---------------------------------------------

[Image: LG WRGB OLED sub-pixel structure]

---------------------------------------------------

https://www.avsforum.com/forum/40-o...oes-wrgb-oled-produce-artifacts-pc-usage.html
comment excerpts:

"what the white pixel does is boost the light output of the OLED. As it increases in intensity, it begins to dilute the output from the R,G & B pixels, so colour is less saturated than it should be at higher output levels. This doesn't mean anything looks funky about the image, it just means that next to a true RGB display of similar capability an element like the sun will retain more of a yellow hue than would be seen on the OLED. "

"You wouldn't realise it just watching the content, but side by side with an otherwise like for like RGB display you would. I wouldn't call it a major flaw, just a quirk of the technology. It's because LG went the WRGB route instead of straight RGB that they're currently in the position they're in and their competitors had to abandon OLED production, so it was a smart choice.

The 2019 LGs will allow you to switch off the white subpixel (on current sets you can do it through the service menu) in order to retain colour saturation at the high-end, but peak brightness would be reduced to around 400 nits. Preferable to leave the white subpixel to do its thing for a better HDR experience. "

"
Not exactly--the white subpixel operates even in SDR mode, but it does not desaturate for the sake of brightness until you get to the really high HDR brightness levels. It's just used as another subpixel along with RGB to create a certain color like RGB alone would be combined on a more traditional display. The new feature in the 2019 OLEDs will turn off "white boost" that sacrifices saturation for the highest levels of brightness, as you say, but does not turn off the white subpixel itself."
-----------------------------------------------

RTings C9 Review:

Like all other OLEDs, the C9 uses 4 sub-pixels, but all 4 are never used at the same time. This image shows the red, white, and blue sub-pixels. You can see the green sub-pixel in our alternative pixel photo.

"The LG OLED C9 is an excellent TV to use as a monitor, as it has an excellent low input lag and outstanding response time, making your desktop experience feel responsive. However, care should be taken to avoid static user interface elements being displayed for a long time, as there's a risk of permanent burn-in. The TV can also display chroma 4:4:4 properly and it has wide viewing angles, so the image remains accurate even if you sit up close. "

--------------------------------------------------------------------

https://forums.overclockers.co.uk/t...-crazy-yes-time-to-get-some-burn-in.18866316/ (September 2019)

"reducing sharpness to zero made the text go from looking horrible to awesome. Also disable true motion and enter game mode."
---------------------------------------------------------------------

-- Not sure if that overclockers forum guy was running 4:4:4 chroma or not to start with, though. Plenty of people here have already been running OLEDs with text, so they have personal experience. The overclockers link goes into a lot of other details in a mini review too, if anyone is interested.
 
1. We don't know if the LG 48CX will be good. We're just basing it on listed specs and what the C9s are like. 2. Nvidia has not said one word about 2.1 support in the next (3000) line of cards. So many people are just hoping for these things and forgetting it doesn't yet exist. My own opinion is to wait and see if these two things actually happen. Then get excited. Pretty soon we're going to find out about the LG 48CX, but for GPUs I'm thinking Q4, and that remains to be seen.
What do you think about Nvidia not commenting on 2.1 support? We should find out tomorrow at their "Get Amped" event
 
What do you think about Nvidia not commenting on 2.1 support? We should find out tomorrow at their "Get Amped" event

Aren't they only going to talk about server stuff or Quadros/Teslas? Typically those cards don't even have an HDMI output, just a bunch of DisplayPorts.
 
What do you think about Nvidia not commenting on 2.1 support? We should find out tomorrow at their "Get Amped" event
Mostly, business as usual. They're unlikely to release information about, let alone hype, unreleased products while they still have inventory to move. And that's just business, something Nvidia is quite good at.
 
AFAIK LG has changed the pixel structure somewhat on their OLEDs over the last couple of years; does this have an effect on what the optimal "Windows font settings" are for sharp text? The main reason for asking is that I have a feeling that my 55" C7 renders text better than my 55" GX, which was not what I expected.

Also, what about contrast values for ClearType? I found that increasing it from the default 1200 to something like 2000 makes text look better (subjectively speaking), more like Grayscale but still in RGB.
Not really sure about the C7 and GX, or the C9 and especially the CX, but generally text rendering with normal ClearType won't be perfect. Generally for ClearType you want an sRGB screen; not even a proper RGB pixel layout will give the intended results if your screen has a wide gamut.

I would like to reference my post that I made back on page 7 regarding "Better ClearType Tuner":
https://hardforum.com/threads/lg-48cx.1991077/page-7#post-1044458873

Info: https://github.com/bp2008/BetterClearTypeTuner
Download: https://github.com/bp2008/BetterClearTypeTuner/releases
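
For what it's worth, if you want to check what a tuner like that is actually writing, my understanding (an assumption on my part from poking around, not anything documented for these TVs) is that the settings land in the classic font-smoothing values under HKCU\Control Panel\Desktop, with the "contrast" knob mapping to FontSmoothingGamma (the 1200/2000-style numbers mentioned above). A minimal sketch to dump them:

Code:
# Minimal sketch: dump the classic Windows font-smoothing registry values so
# you can see what ClearType tuners change. Sign out/in after editing them.
import winreg

VALUES = ("FontSmoothing", "FontSmoothingType",
          "FontSmoothingGamma", "FontSmoothingOrientation")

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
    for name in VALUES:
        try:
            value, _ = winreg.QueryValueEx(key, name)
            print(f"{name} = {value}")
        except FileNotFoundError:
            print(f"{name} is not set")

On my machine FontSmoothingType 2 means ClearType and FontSmoothingOrientation 1 means RGB sub-pixel order, but treat those mappings as assumptions and let the tuner do the actual writing.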
 
Based on my 65" LG C9, you want to try the different options on Better Cleartype Tuner. You might like the grayscale over RGB. Otherwise the TV should be set to PC mode or you will get some green fringing on text that can be hard to see if DPI scaling is used. I don't know if this is different on CX.
 
Pretty sure the yellow/blue parts are mixed up. Converting yellow to blue is not even really possible in the realm of lasers outside of (possibly) sum-frequency generation, let alone LEDs, to my knowledge.
You would typically use a blue pump source with Stokes conversion to yellow/white light... just like any white LED or anything else that's not an SLED or supercontinuum ASE source lol (not applicable here).

Basically the only negatives to this screen so far are Nvidia dragging their feet on 10bit HDMI, LG skimping on 48Gbps, the audio bullshit, a very perceptual/per-user text clarity issue (which seems mostly to be a non-issue) and burn-in.

Solutions: use AMD if they make something good enough, run 10bit, run a direct audio feed via a 2nd HDMI cable, and be mindful of content/static use.

Really, compared to shitty clamping, poor VRR, overpriced 15" 4k TN panel bullshit, shitty QC, low colour clarity, LCD panels etc. etc., this thing is a godsend. If the above issues are the worst things, really... it's not impossible to mitigate most of them. The panel itself is great to begin with, and that is the weakest point of most screens.
 
I just wanted to let you all know I've been following along as this thread has grown. Never imagined it would've become THE thread for 48CX discussion here at [H] Displays. Lots of good discussion and reference material for visitors from across the web, as I see we've attracted a few.

I've been working from home (my coronavirus command center as I like to call it) during these times, and I hope everyone's been staying well as we patiently wait for nirvana.

NVIDIA's GTC keynote will be posted on YouTube today around 6PM Pacific time, so let's see if Ampere will come with the goods we're all hoping for.
 
How would you guys use this with an adjustable-height standing desk? Just on the desk, or some kind of wall mount or pole mount behind the desk?

I'm thinking I would need a secondary LED or LCD monitor for document production for work, and maybe wall mount this above? Not sure I would be comfortable using this for drafting documents because of burn-in.
 
Pretty sure the yellow/blue parts are mixed up. Converting yellow to blue is not even really possible in the realm of lasers outside of (possibly) sum-frequency generation, let alone LEDs, to my knowledge.
You would typically use a blue pump source with Stokes conversion to yellow/white light... just like any white LED or anything else that's not an SLED or supercontinuum ASE source lol (not applicable here).

Basically the only negatives to this screen so far are Nvidia dragging their feet on 10bit HDMI, LG skimping on 48Gbps, the audio bullshit, a very perceptual/per-user text clarity issue (which seems mostly to be a non-issue) and burn-in.

Solutions: use AMD if they make something good enough, run 10bit, run a direct audio feed via a 2nd HDMI cable, and be mindful of content/static use.

Really, compared to shitty clamping, poor VRR, overpriced 15" 4k TN panel bullshit, shitty QC, low colour clarity, LCD panels etc. etc., this thing is a godsend. If the above issues are the worst things, really... it's not impossible to mitigate most of them. The panel itself is great to begin with, and that is the weakest point of most screens.

I was working from quotes saying that was how they got the white array as a base for WRGB color filtration. Many sites detail the yellow and blue emitters. To be clear, they are using a yellow emitter, and a blue emitter. They aren't converting yellow to blue.
"Digital Versus managed to get a macro photo showing LG's WRGB OLED TV sub pixels. LG's structure uses four white sub pixels (made from yellow and blue emitters) with color filters on top: white (unfiltered), red, green and blue. As you can see from the photo, the white sub pixel is actually larger than the colored ones. The white sub pixel is added to increase brightness and efficiency. "

-------------------------------------------------------------------
https://www.cnet.com/news/what-is-oled-and-what-can-it-do-for-your-tv/ November 2019

Yellow plus blue makes green (and red and cyan and magenta)
Currently, all OLED TVs are made by LG, and how they've made them is rather unusual. All TVs, to create the images you watch, use red, green, and blue mixed together to create all the colors of the rainbow (well, not quite all colors, but most). To create the colored light, LCDs use RGB color filters, while plasmas used RGB phosphors, and Samsung's short-lived OLED TV (and all their OLED-screened phones) used red, green and blue OLED elements.

LG's OLED only use two colors: a sandwich of blue and yellow OLED materials. Then, using color filters, the yellow and blue light is filtered to create red, green and blue. To add a bit more brightness, there's also a clear "white" element, too. It's a lot easier if I show you:


The steps to create an image with LG's OLED.
[Image: CNET diagram of the steps to create an image with LG's OLED]

A yellow OLED material creates yellow (i.e. red and green) light. When combined with blue (1), this creates "white" light (2). Using color filters (3) the desired sub-pixel color (including clear/white) is created (4).
Geoffrey Morrison/CNET

Though this seems odd and convoluted, it obviously works since LG is the only company that has successfully marketed large-screen OLED TVs in any numbers. This is because it's more cost-effective to make ("more" being the key word there).

The apparent downsides, such as light output and color accuracy, don't seem to be issues. Sure, they're not as bright as the brightest LCDs, but they are still very bright, and the current models have the same color saturation as the best LCDs.


-----------------------------------------------------------------
https://www.oled-info.com/reports-say-lgd-aims-change-its-woled-tv-structure-yb-rgb Dated article but details the tech.

Reports from China suggest that LG Display is considering changing the basic structure of its white OLED panels (WOLED) used in LGD's OLED TVs. LGD is currently using yellow and blue OLED materials to create a white OLED, but now LGD may switch to an RGB based mix.

It's not clear from the Chinese reports (which are unverified yet, of course) - but it's likely that LGD will not switch to a direct-emission RGB structure, but rather use the RGB materials to create a white OLED and remain with a color-filter based design. Switching from Y/W to R/G/B may enable LGD to achieve higher color purity - and so a larger color gamut, and may also be more efficient.

LGD's WRGB architecture - which creates 4 sub pixels using color filters (red, green, blue and non-filtered) to create a colored image from a single white OLED pixel - is less efficient and less color-pure compared to a real RGB sub-pixel architecture, but WOLED displays are much easier to produce as there's less need for subpixel patterning.


------------------------------------------------------------
https://www.oled-info.com/qa-cynoras-ceo-discuss-companys-new-blue-5-emitter March 5, 2020

-------------------------------------------------------------

The main point being they are all made into a white array with clear/unfiltered (white), R, G, and B subpixel filters above. I wasn't listing it as a negative, outside of the highest HDR brightness levels perhaps, rather showing that they likely utilize the additional large clear/white spot in the filter as a subpixel in order to boost effective brightness to our eyes without having to boost the output of the OLED emitters as much. Cutting down on the actual brightness levels and heat of the OLEDs while still getting effective color brightness levels is probably one of the reasons that burn-in isn't as much of an issue on LG OLEDs, especially with the brightness levels ~ %windows they are able to hit in HDR now. Before LG developed this method, I believe the issue was that a non-WRGB OLED's blue subpixel emitter would wear out unevenly, much earlier than the others, and as I said, the overall output per color probably had to be higher without the added large white subpixel.

As the reviews I quoted said, LG's multi-layer WRGB tech is one of the reasons LG OLEDs are reliable enough and why LG OLED became dominant, so I wouldn't consider it a negative overall. It's a neat "hack"/workaround technology as the main tool against burn-in danger levels and OLED wear, and it works. (There are further protections using ABL, %-window restrictions, pixel shifting, logo detection/dimming, OLED wear-evening cycles, etc. too, of course.) The white subpixel structure was, however, brought up by others in the thread as a potential negative in regard to text rendering for regular desktop use, which is why it's been added to the conversation currently in that respect.
 