LG 48CX

It's also worth noting that this 10bit 4:4:4 limitation does not exist on AMD GPUs.

So if RDNA2 scales up well from console-sized "big APU" chips to full-fat discrete GPUs, then that'd certainly be an option.

And hey, we all know that [current year whenever you're reading this] is the year of the Linux desktop, and those kernel-level drivers make AMD GPUs a much better proposition than Nvidia (not to mention that AMD's Linux drivers seemed to avoid the wonky 5700/XT driver woes).
 
It seems rather pointless to worry about when we don't even have HDMI 2.1 GPUs yet. Personally I'm not buying anything for the desktop based around HDMI 2.1 until I see the GPU announcements.
always something around the corner....

https://www.tweaktown.com/news/7229...d-prevent-them-from-getting-bigger/index.html
According to a new report from DigiTimes, NVIDIA has "already pre-booked TSMC's 5nm production capacity in 2021" while "also in discussion with Samsung for smaller volume orders". The reason? To "prevent AMD from getting any bigger, NVIDIA has decided to catch up, even leapfrog AMD, in adopting TSMC's 7nm and 5nm EUV nodes".

Now remember, there's no one going on the record here so take this all with a grain of salt. There are some big power moves going on, I'm sure we can all see that by now -- but this is all just so very interesting.

If LG OLED displays didn't have HDMI 2.1 and 7nm HDMI 2.1 GPUs weren't imminent, I wouldn't even consider a display upgrade in 2020, personally.
 
It's also worth noting that this 10bit 4:4:4 limitation does not exist on AMD GPUs.

So if RDNA2 scales up well from console-sized "big APU" chips to full-fat discrete GPUs, then that'd certainly be an option.

And hey, we all know that [current year whenever you're reading this] is the year of the Linux desktop, and those kernel-level drivers make AMD GPUs a much better proposition than Nvidia (not to mention that AMD's Linux drivers seemed to avoid the wonky 5700/XT driver woes).

Has anyone tested 4k120 444 10bit on an AMD card? I have an HTPC that I stuck a 1060 in for VP9 support going to my CX65 (bought before I was aware of the 40Gbps nerf), and now I'm wishing I went AMD. I have to drop 4k to 24Hz to get RGB/4:4:4 at 12bit (10bit not selectable).

Also I saw some people discussing this early but nobody gave their results - if you duplicate display and send one HDMI out to CX and one HDMI out to AVR does this work fluidly?
 
It's been confirmed AMD cards do 10-bit just fine; it's just some type of BS software lock Nvidia does for their gaming GPUs (for HDMI only, not DisplayPort) unless you pony up for their professional line.

I know this is a moot point if the Nvidia RTX 3000 series don't come out this year or don't fully support HDMI 2.1 anyway, but it's really starting to seem that the CX is actually a downgrade from the C9. LG even took out the DTS license in the CX, which is a big problem for a lot of people who have terabytes of ripped movies on a USB drive that no longer play with the proper audio.

I have no intention of using the better CX BFI, which lowers the brightness of the display, as I want max brightness. I really doubt the slightly faster processor is doing anything for PC gaming PQ when all that BS processing is still what PC gamers avoid to reduce input lag & artifacts. I'm pretty sure using game mode reduces most dependency on the internal processor for "enhanced" effects. Most of my current & future games also support HDR, so I don't think I would take advantage of the CX's 4:2:0 8-bit 4K 120Hz mode that can be done over HDMI 2.0 (unlike on the current C9), as I don't believe you can even do HDR in that mode.

Still waiting for some better reviews such as RTINGS, but the current reviews do not show any improvements beyond the usual margin of error or placebo effect. I was dead set on even spending additional $ to change my 55" C9 to a 48" CX, even though this would seem like a downgrade or a sidegrade to most, but again I'm not so sure anymore.
 
it's really starting to seem that the CX is actually a downgrade from the C9
The big draw for most of us is that it comes in a 48" size. A lot of us, including me, would even prefer it smaller at 43", and some would like 32 - 40". Jumping from a 48", which already requires a lot of viewing distance compromises to use comfortably, up to a 55" is a considerable difference, not just in physical size but in the PPI relative to your perspective at a given viewing distance.

Regarding your questions about Nvidia gaming GPUs not sending 10bit but being able to send 12bit - HDMI 2.1 does include DSC (Display Stream Compression) support, which can compress a 12bit signal down into 10bit's worth of bandwidth. I don't know exactly how it works. It would be nice if we could set the Nvidia GPU to output 12bit and have DSC automatically apply ONLY as much compression as needed to hit 10bit bandwidth, essentially converting the 12bit signal to 10bit.

----------------------------------------------------------------------------------------



- if you duplicate display and send one HDMI out to CX and one HDMI out to AVR does this work fluidly?

I was talking about sending
4k 120Hz 4:4:4 (10bit??) VRR + uncompressed audio formats from an Nvidia (3000 series eventually) GPU
to the HDMI IN on the LG CX, then passing the uncompressed audio through the LG CX's eARC HDMI output
to a modern eARC capable receiver (a receiver doesn't need HDMI 2.1 ports to offer eARC; it works over an 18Gbps port).

That would avoid having to buy an HDMI 2.1 4k 120Hz capable receiver and passing the video through the receiver at all, as well as any resulting added input lag. Duplicating the video signal, using a receiver as a clone of a monitor, would limit you to the max Hz of the receiver in clone mode. VRR and HDR could also fail to work or have issues. You COULD send an HDMI output to the receiver as an additional "ghost" or "fake" monitor (as if part of an extended desktop) and get HDMI sound that way. However, the resolution of the fake monitor, even if set to a small rez, could cause problems with layouts and mouse movement (and the aforementioned clone mode would be 4k, so even worse). With my setup, whenever I'd turn the receiver off it would shift all of the window positions I have DisplayFusion memorize, as well as misplacing window positions when I use hotkeys I set up in DisplayFusion. So for me that is not going to work either. You could also run into HDCP issues sending audio to a different monitor, making a screen go blank.

Note that eARC is backwards compatible with slightly older receivers with "ARC" input, but the bandwidth is much lower on ARC vs eARC, so it doesn't support the same uncompressed audio formats.

 
The big draw for most of us is that it comes in a 48" size. A lot of us, including me, would even prefer it smaller at 43", and some would like 32 - 40". Jumping from a 48", which already requires a lot of viewing distance compromises to use comfortably, up to a 55" is a considerable difference, not just in physical size but in the PPI relative to your perspective at a given viewing distance.

Exactly. I would have already bought a C9 a long time ago if there was a 48" version. I'll be happy but also annoyed if they release a 43" in 2021 as I'll be tempted to sell my 48" and buy that. I've never sold a used TV, but I'm going to assume resale is not good, especially having to pay for shipping. lol
 
Still too big, and 40Gbps instead of 48.

Sadly I have to pass.

Maybe Sony's version will fix the HDMI limitation.
 
It's been confirmed AMD cards do 10-bit just fine; it's just some type of BS software lock Nvidia does for their gaming GPUs (for HDMI only, not DisplayPort) unless you pony up for their professional line.

I know this is a moot point if the Nvidia RTX 3000 series don't come out this year or don't fully support HDMI 2.1 anyway, but it's really starting to seem that the CX is actually a downgrade from the C9. LG even took out the DTS license in the CX, which is a big problem for a lot of people who have terabytes of ripped movies on a USB drive that no longer play with the proper audio.

I have no intention of using the better CX BFI, which lowers the brightness of the display, as I want max brightness. I really doubt the slightly faster processor is doing anything for PC gaming PQ when all that BS processing is still what PC gamers avoid to reduce input lag & artifacts. I'm pretty sure using game mode reduces most dependency on the internal processor for "enhanced" effects. Most of my current & future games also support HDR, so I don't think I would take advantage of the CX's 4:2:0 8-bit 4K 120Hz mode that can be done over HDMI 2.0 (unlike on the current C9), as I don't believe you can even do HDR in that mode.

Still waiting for some better reviews such as RTINGS, but the current reviews do not show any improvements beyond the usual margin of error or placebo effect. I was dead set on even spending additional $ to change my 55" C9 to a 48" CX, even though this would seem like a downgrade or a sidegrade to most, but again I'm not so sure anymore.

Complaining about 40 instead of 48 is dumb.
The CX panel maxes out at 10bit 4k 120Hz. There is no benefit to accepting more than a 40 Gbps signal. It can't do anything with more than 40 Gbps. It's not going to downscale 12bit content and make it look better than if you just sent it 10bit content.
Saying the CX doesn't fully support HDMI 2.1 because of that is like saying it doesn't fully support HDMI 2.1 because it doesn't display 8k.
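For anyone who wants to sanity-check the 40 vs 48 Gbps numbers, here's a minimal back-of-the-envelope sketch. It assumes the standard CTA-861 4K120 timing (4400x2250 total pixels including blanking) and HDMI 2.1's 16b/18b FRL coding overhead; those are my assumptions, not spec quotes:

```python
# Rough HDMI 2.1 link-rate estimate for 4K120 RGB/4:4:4 at various bit depths.
def hdmi_data_rate_gbps(h_total=4400, v_total=2250, refresh=120,
                        bits_per_channel=10, channels=3, frl_overhead=18 / 16):
    pixel_clock = h_total * v_total * refresh            # pixels per second
    payload = pixel_clock * bits_per_channel * channels  # raw video bits per second
    return payload * frl_overhead / 1e9                  # approx. link rate in Gbps

for bpc in (8, 10, 12):
    print(f"4K120 4:4:4 {bpc}-bit: ~{hdmi_data_rate_gbps(bits_per_channel=bpc):.1f} Gbps")
# prints roughly 32.1, 40.1 and 48.1 Gbps -- i.e. 10-bit just fits the CX's
# 40 Gbps link, while 12-bit is what actually needs the full 48 Gbps.
```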


The lack of DTS support sucks though.
I use my 65" B7 to watch 4k HDR DTS movies through LG smartshare all the time. That's not really a problem for the CX 48" version because I'm only going to use it as a monitor, but I was going to get a 77" CX to replace my 65" B7. Instead of spending the extra money on a 77" CX maybe I'll just get a C9 or B9 instead, or maybe I'll just wait another year. It's definitely a turn off on "upgrading" when you actually lose some functionality.
 
So this is kind of a problem for Nvidia PC gamers with the CX series, which may make the C9 the better option. After reading the last 200 posts on the AVS forums:

We already know about the whole 48 to 40 Gbps HDMI 2.1 downgrade that LG ninja’d and admitted within this week. This means 4K 120Hz 4:4:4 10-bit max instead of 4K 120Hz 4:4:4 12-bit max. Since these panels are only naturally 10-bit anyway, this isn’t the end of the world, BUT:

Nvidia gaming cards CURRENTLY only allow you to choose 8-bit or 12-bit color when doing full 4:4:4 RGB over HDMI. Only their professional lines allow 10-bit. You can get 10-bit to show, but only after some type of downgrade such as 4:2:2 or 4:2:0, or using YCbCr color instead of RGB. This can even be tested at lower resolutions such as 4K/30 so as not to saturate the current HDMI 2.0 spec.

8-bit HDR gaming WILL introduce banding and noise for overall lower PQ.

long story short, unless LG and/or Nvidia makes changes to their products:

C9 Series = 4K 120hz 4:4:4 RGB 12-bit Max
CX Series = 4K 120hz 4:4:4 RGB 8-bit Max

I think I may be sticking to my C9 for PC/Nvidia only gaming.

note: even though the panels are 10-bit anyway, some people claim there’s still some benefits of using 12-bit, maybe think of it like downsampling a higher resolution texture or something into a lower resolution panel.


So use 12 bit at 4:2:0. problem solved.

I seriously doubt you'll notice the chroma difference at TV viewing distances anyway? Folks running 4k HDR gaming on current TVs already have to do this trick!

Also, I thought that Nvidia enabled 10-bit color for full-screen apps back in 2016? That would essentially cover games!
 
As a TV, I don't care about DTS support on the TV because I have an AV receiver to handle that. As a monitor, I also don't care, because the TV won't be receiving audio anyway.

If I really wanted to do 7.1 or more from my desktop using the CX as a monitor... I'd get a USB sound card to output the audio and smaller discrete amps for the speakers.

I get wanting to have the TV connected directly to the video card and then having it send audio to the receiver, it does simplify the topology in some cases, but this isn't really a hard or even necessary requirement.
 
Yes, I don't care if you don't get DTS on the TV's apps or off of an HDD plugged directly into the TV (again using the TV apps) - but it should pass through uncompressed HDMI audio to the receiver over eARC.

I get that some people won't care and will just use S/PDIF/optical for compressed audio from a PC, but afaik the only way to get uncompressed HDMI 2.1 sound output to a receiver is through an HDMI eARC input on the receiver. I don't know of any USB sound devices that have an eARC port that passes uncompressed audio formats through.

You could probably still use the "ghost" / "fake" monitor trick to output sound from your (Nvidia) GPU to the receiver, but that is a wonky way to do it and it causes issues with memorized window placements, since whenever you turn the receiver off (for example when using an HDMI sound device like an Astro MixAmp at times instead), an active monitor's worth of extended desktop disappears. More importantly for large multi-monitor setups like mine, or at least what I'd like to do, it takes away an entire display output from your GPU that would otherwise be available for another monitor.

That said, since you CAN get 7.1 uncompressed PCM audio using a CRU EDID hack, and it seemed like RTINGS got full 7.1 DTS audio passed through to a C9, it would seem like uncompressed HD audio can be done via passthrough. Hopefully also Dolby Atmos and the DTS formats eventually.

According to RTINGS, the LG C9 does have eARC passthrough support working from a PC. Perhaps there is still hope for full uncompressed passthrough on the LG CX (but the C9 has 48Gbps HDMI 2.1). I'm looking forward to RTINGS doing a review of the CX.

https://www.avsforum.com/forum/40-o...s-general/3072900-lg-c9-earc-info-thread.html

However, early adopters of C9 have discovered the following issues with LGs implementation of the eARC feature;
  • The LG implementation ignores the media handles for PCM 5.1 and PCM 7.1 audio, which means it is not possible to pass uncompressed HD audio from sources like game titles on consoles like Xbox/PS4 that send HD audio uncompressed. There is no technical reason this shouldn't work (and it does work on competitor televisions from Sony); this is just an omission on LG's part in supporting the formats. This issue was first reported in rtings.com's review of the LG C9.
  • Owners of 2017 Denon products have reported that their AVRs are not recognized by the LG C9 as being eARC capable devices. It is reported that 2017 Denons also have this issue with other brands' televisions, so possibly this issue can only be fixed by Denon, or Denon and display makers will have to collaborate on a fix.
  • It has been confirmed that LG C9 operates properly with eARC delivery when HDMI CEC is turned off on the source (TV) and destination (AVR). This is accomplished by removing HDMI configuration for target AVR in the LG C9 Connections Manager (reset configuration) and disabling ARC and TV control in the Denon/Marantz unit.... then enabling ARC and eARC w/passthrough in the C9 HDMI audio settings. It is unknown if this is functional across all AVR brands but strongly indicates that LG has properly implemented the feature so that it can be turned on independent of use of HDMI control (HDMI CEC).

---------------------------------------------------------------------

https://www.avsforum.com/forum/40-o...020-lg-cx-gx-owners-thread-no-price-talk.html

Features that are delivered:


  • 5.1/7.1 LPCM passthrough via eARC - this works only via a CRU EDID hack - so not a user-friendly feature


Expected/Announced features to be delivered via firmware updates:

  • 5.1/7.1 LPCM passthrough via eARC - the CX HDMI EDID mentions only support for 2.0 LPCM - once CRU is used to add PCM 5.1 to the TV's EDID, you may be able to get surround sound (via eARC) - so it seems that both C9 and CX will get the firmware update in Q2-2020 - as previously announced
  • AMD FreeSync compatibility - will be available via a future firmware upgrade at the end of 2020;
  • HDCP 2.3 - the CX is still at HDCP 2.2 level - HDCP 2.3 will be available via a future firmware update, probably at the same time with the C9;
  • Dolby Vision glows at brightness 50 - LG is aware and is working for a fix;
  • DTS/DTS-HD audio is not passed via eARC - LG was notified about this issue, hopefully, they will provide a fix via a future firmware update;
  • Full 48Gbps support - as of April 2020 the "12 Gbps on 4 lanes" from the Max Fixed Rate Link EDID is missing (the C9 has it) - if we believe the EDID values, the 2020 models support only 40Gbps links, not full 48Gbps;


Features that will never be delivered via firmware updates:
  • WebOS apps support for lossless audio - there will be no WebOS app support for lossless HD audio soundtracks such as TrueHD, TrueHD+Atmos, DTS-HD HR, DTS-HD MA or DTS:X - this is the same situation as for the 2019 models; the Alpha9 SoC has only an ARC-capable audio output, so it cannot send lossless audio back to the receiver/soundbar even if they are eARC capable - firmware updates cannot modify hardware limitations;
  • DTS and DTS-HD support for USB and HDMI sources - the internal decoder is missing from the factory (as announced in the documentation) and probably will never be added to the 2020 generation - what is puzzling is that DTS and DTS-HD are not even permitted to pass through via ARC/eARC;
 
So use 12 bit at 4:2:0. problem solved.

I seriously doubt you'll notice the chroma difference at TV viewing distances anyway? Folks running 4k HDR gaming on current TVs already have to do this trick!

Also, I thought that Nvidia enabled 10-bit color for full-screen apps back in 2016? That would essentially cover games!

The HDMI 2.1 standard allows for DSC (Display Stream Compression), which can compress 12bit or 10bit signals downward. It was mentioned in the HDTVtest video linked here:





So theoretically HDMI 2.1 DSC could be used to compress a 12bit 120Hz 4k 4:4:4 signal sent from an Nvidia GPU down to what 10bit 120Hz 4k 444 requires, without losing anything compared to sending 10bit natively (for example, rather than running 4:2:0 or running 8bit). I'm not sure if DSC can be used to "cut" 12bit down to exactly the size of 10bit or if it always compresses even more than that, however. I'm also uncertain as to whether 12bit DSC to 10bit will actually be made available to us through an HDMI 2.1 Nvidia GPU's drivers and LG's hardware capability and firmware. That remains to be seen.
 
10bit 444 is overkill for games; I don't know what the big deal is here as most PC gamers have been doing 8bit forever......and console gamers are still doing 2bit lol

My Alienware 55" "only" does 8-bit 4k120 or 10-bit 422 and has been amazing, so I am pretty pumped for the 48" CX

$1500 is a bit on the high end for the 48" CX, but you are getting a lot for your money....certainly more for your money than those $2,000 Acer X27 panels were. If people want the full potential of HDMI 2.1 they can pick up a C9 right now for cheap! But as is, 10bit 444 is plenty good enough.
 
So use 12 bit at 4:2:0. problem solved.

I seriously doubt you'll notice the chroma difference at TV viewing distances anyway? Folks running 4k HDR gaming on current TVs already have to do this trick!

As someone who never cared about this stuff before but wanted to upgrade my TV (after my plasma went down) and use all of its advertised features - I'm not exactly a videophile and I can tell you I definitely see the difference between 444 and 420. My wife, who cares even less than I do, can also tell.
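If anyone wants to eyeball roughly what 4:2:0 does to fine colored detail without touching GPU settings, here's a crude simulation (a sketch assuming OpenCV and NumPy are installed; it just throws away 3/4 of the chroma samples and scales them back up, which is approximately what 4:2:0 subsampling does):

```python
import cv2
import numpy as np

# Red text on a blue background is a worst case for chroma subsampling.
img = np.zeros((240, 960, 3), np.uint8)
img[:] = (255, 0, 0)                                   # blue background (OpenCV is BGR)
cv2.putText(img, "4:2:0 chroma test", (20, 150),
            cv2.FONT_HERSHEY_SIMPLEX, 2.5, (0, 0, 255), 3)

ycc = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)
y, cr, cb = cv2.split(ycc)

# 4:2:0 keeps full-resolution luma but only 1/4 of the chroma samples.
half = lambda c: cv2.resize(c, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_AREA)
up = lambda c: cv2.resize(c, (y.shape[1], y.shape[0]), interpolation=cv2.INTER_LINEAR)
sim_420 = cv2.cvtColor(cv2.merge([y, up(half(cr)), up(half(cb))]), cv2.COLOR_YCrCb2BGR)

cv2.imwrite("chroma_444.png", img)               # original, i.e. 4:4:4
cv2.imwrite("chroma_420_simulated.png", sim_420)
```

Zoom into the text edges in the two output files and the color fringing shows up immediately; on moving game content at couch distance it's obviously less dramatic.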

Duplicating the video signal, using a receiver as a clone of a monitor, would limit you to the max Hz of the receiver in clone mode. VRR and HDR could also fail to work or have issues.

Yes, eARC should work with a capable receiver EXCEPT for DTS. Because DTS fell off, you can't even pass DTS through eARC (or ARC for that matter). That's why I'm hoping to buy an HDMI 2.1 capable receiver (if they ever come out, sounds like LG ones soonish?) and duplicate the video, and cross my fingers that that works well at 120Hz 4k 444 10bit.

Complaining about 40 instead of 48 is dumb.
The CX panel maxes out at 10bit 4k 120Hz. There is no benefit to accepting more than a 40 Gbps signal. It can't do anything with more than 40 Gbps. It's not going to downscale 12bit content and make it look better than if you just sent it 10bit content.
Saying the CX doesn't fully support HDMI 2.1 because of that is like saying it doesn't fully support HDMI 2.1 because it doesn't display 8k.

There are some extreme people who will advocate that you get fewer rounding errors by sending 12bit to a 10bit TV (unless you have hardware that converts it first?). But I agree in the sense that I doubt even a trained eye could ever tell the difference - HOWEVER, given that the CX currently has banding issues (the flashbang scene in The Mandalorian EP5, the ocean floor scenes in Oceans: Our Blue Planet), if these issues never get fixed with an update, 12bit may have provided a solution.
 
I agree that a 10bit 444 120Hz 4k pipeline would be fine, but I was replying specifically to this information posted by Seyumi:


Nvidia gaming cards CURRENTLY only allow you to choose 8-bit or 12-bit color when doing full 4:4:4 RGB over HDMI. Only their professional lines allow 10-bit.


It's been confirmed AMD cards do 10-bit just fine; it's just some type of BS software lock Nvidia does for their gaming GPUs (for HDMI only, not DisplayPort) unless you pony up for their professional line.

I know this is a moot point if the Nvidia RTX 3000 series don't come out this year or don't fully support HDMI 2.1 anyway, but it's really starting to seem that the CX is actually a downgrade from the C9. LG even took out the DTS license in the CX, which is a big problem for a lot of people who have terabytes of ripped movies on a USB drive that no longer play with the proper audio.

I have no intention of using the better CX BFI, which lowers the brightness of the display, as I want max brightness. I really doubt the slightly faster processor is doing anything for PC gaming PQ when all that BS processing is still what PC gamers avoid to reduce input lag & artifacts. I'm pretty sure using game mode reduces most dependency on the internal processor for "enhanced" effects. Most of my current & future games also support HDR, so I don't think I would take advantage of the CX's 4:2:0 8-bit 4K 120Hz mode that can be done over HDMI 2.0 (unlike on the current C9), as I don't believe you can even do HDR in that mode.

Still waiting for some better reviews such as RTINGS, but the current reviews do not show any improvements beyond the usual margin of error or placebo effect. I was dead set on even spending additional $ to change my 55" C9 to a 48" CX, even though this would seem like a downgrade or a sidegrade to most, but again I'm not so sure anymore.


--------------------------------------------

We already know about the whole 48 to 40 Gbps HDMI 2.1 downgrade that LG ninja’d and admitted within this week. This means 4K 120Hz 4:4:4 10-bit max instead of 4K 120Hz 4:4:4 12-bit max. Since these panels are only naturally 10-bit anyway, this isn’t the end of the world, BUT:

Nvidia gaming cards CURRENTLY only allow you to choose 8-bit or 12-bit color when doing full 4:4:4 RGB over HDMI. Only their professional lines allow 10-bit. You can get 10-bit to show, but only after some type of downgrade such as 4:2:2 or 4:2:0, or using YCbCr color instead of RGB. This can even be tested at lower resolutions such as 4K/30 so as not to saturate the current HDMI 2.0 spec.

8-bit HDR gaming WILL introduce banding and noise for overall lower PQ.

long story short, unless LG and/or Nvidia makes changes to their products:

C9 Series = 4K 120hz 4:4:4 RGB 12-bit Max
CX Series = 4K 120hz 4:4:4 RGB 8-bit Max

I think I may be sticking to my C9 for PC/Nvidia only gaming.

note: even though the panels are 10-bit anyway, some people claim there’s still some benefits of using 12-bit, maybe think of it like downsampling a higher resolution texture or something into a lower resolution panel.


------------------------------------------------------------------------

The HDMI 2.1 standard allows for DSC (Display Stream Compression), which can compress 12bit or 10bit signals downward. It was mentioned in the HDTVtest video linked here:




So theoretically HDMI 2.1 DSC could be used to compress a 12bit 120Hz 4k 4:4:4 signal sent from an Nvidia GPU down to what 10bit 120Hz 4k 444 requires, without losing anything compared to sending 10bit natively (for example, rather than running 4:2:0 or running 8bit). I'm not sure if DSC can be used to "cut" 12bit down to exactly the size of 10bit or if it always compresses even more than that, however. I'm also uncertain as to whether 12bit DSC to 10bit will actually be made available to us through an HDMI 2.1 Nvidia GPU's drivers and LG's hardware capability and firmware. That remains to be seen.
 
Well, as I said, since HDMI 2.1 can have DSC (Display Stream Compression), potentially you could send a 12bit 444 signal from an Nvidia GPU that gets compressed down to the 10bit that the LG CX supports and can accept, instead of dropping the Nvidia output to 8bit or 4:2:0. I hope RTINGS in their LG CX review, and HDTVtest in further videos, examine the 12bit 444 -> DSC -> 10bit 444 possibility in detail. We probably won't know the full potential until reviewers get HDMI 2.1 GPUs to test with, though.
 
Yes, eARC should work with a capable receiver EXCEPT for DTS. Because DTS fell off, you can't even pass DTS through eARC (or ARC for that matter). That's why I'm hoping to buy an HDMI 2.1 capable receiver (if they ever come out, sounds like LG ones soonish?) and duplicate the video, and cross my fingers that that works well at 120Hz 4k 444 10bit.

If you use two separate HDMI connections, one for video, and one for audio, there is no need to duplicate the display, unless you are extremely bothered by having a phantom display. TBH I ran my audio this way for like 5 years to a 1080p AVR and it was fine.
 
If you use two separate HDMI connections, one for video, and one for audio, there is no need to duplicate the display, unless you are extremely bothered by having a phantom display. TBH I ran my audio this way for like 5 years to a 1080p AVR and it was fine.

I'm confused: how do you have a phantom display if you aren't duplicating the display? You mean you run the phantom display at a different res?
 
I was gonna post this to hot deals but I decided this thread is more worthy of the hot deal.

Have a Microcenter near you?

Microcenter just marked all of their 55" C9's down to $999 ... I bought one. These are still $1500+ everywhere else.

This is pretty damn hot.

I'll be returning my CX tomorrow so if you're in the KC area, look for an open box CX online.

https://www.microcenter.com/product...a-hd-hdr-smart-oled-tv-w--thinq---refurbished

This set also has nearly all the features of the CX .... Save yourself ... A LOT ... of money.
 
I was gonna post this to hot deals but I decided this thread is more worthy of the hot deal.

Have a Microcenter near you?

Microcenter just marked all of their 55" C9's down to $999 ... I bought one. These are still $1500+ everywhere else.

https://www.microcenter.com/product...a-hd-hdr-smart-oled-tv-w--thinq---refurbished

This set also has nearly all the features of the CX .... Save yourself ... A LOT ... of money.
Very nice for those with one! Looks like it's a refurb tho not new?
 
I'm confused: how do you have a phantom display if you aren't duplicating the display? You mean you run the phantom display at a different res?

Yes, you just let Windows treat it as a secondary monitor (which is what it is). You can use the audio transport of HDMI even if you're not using that secondary display for anything. You just select the HDMI device as your audio out in sound settings.

Just to be clear about this, Windows treats each HDMI device as a separate display AND separate audio device. So if you have two HDMIs connected, you have two HDMI audio devices, and also two displays. And if you use the "Extend" display mode rather than the "Duplicate", you can run the 2nd display device at whatever resolution you want. It has no effect on the audio.
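To illustrate that last point, here's a tiny sketch (assuming the Python sounddevice package; the device name in the comment is made up) showing that each HDMI output is simply its own audio endpoint you can enumerate and select, regardless of what the "display" side is doing:

```python
import sounddevice as sd  # pip install sounddevice

# List every output-capable endpoint; each HDMI connection shows up separately.
for idx, dev in enumerate(sd.query_devices()):
    if dev["max_output_channels"] > 0:
        print(f"{idx}: {dev['name']} ({dev['max_output_channels']} channels)")

# Then route audio to the AVR's HDMI endpoint by name (example name is hypothetical):
# sd.default.device = "AVR-X3600H (NVIDIA High Definition Audio)"
```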
 
I think the biggest beef here is that HDMI 2.1 was supposed to be the solution to all our current problems. Today we have to choose between several pros & cons in regards to resolution, refresh rate, color bit depth, etc., because you can’t have it all with either HDMI 2.0b or DP 1.4 due to bandwidth limitations.

With the 48 to 40 Gbps downgrade, we still can’t have our cake and eat it too. We yet again need to fiddle with settings and sacrifice something, and some people’s choices are different than others. This can probably be solved with a TV software update if LG chooses; maybe the full bandwidth can be unlocked while the TV is set in “game mode” if enough people are loud enough about it.
 
I think the biggest beef here is that HDMI 2.1 was supposed to be the solution to all our current problems. Today we have to choose between several pros & cons in regards to resolution, refresh rate, color bit depth, etc., because you can’t have it all with either HDMI 2.0b or DP 1.4 due to bandwidth limitations.

With the 48 to 40 Gbps downgrade, we still can’t have our cake and eat it too. We yet again need to fiddle with settings and sacrifice something, and some people’s choices are different than others. This can probably be solved with a TV software update if LG chooses; maybe the full bandwidth can be unlocked while the TV is set in “game mode” if enough people are loud enough about it.

I agree but:
Nvidia gaming cards CURRENTLY only allow you to choose 8-bit or 12-bit color when doing full 4:4:4 RGB over HDMI. Only their professional lines allow 10-bit. You can get 10-bit to show, but only after some type of downgrade such as 4:2:2 or 4:2:0, or using YCbCr color instead of RGB. This can even be tested at lower resolutions such as 4K/30 so as not to saturate the current HDMI 2.0 spec.

8-bit HDR gaming WILL introduce banding and noise for overall lower PQ.

long story short, unless LG and/or Nvidia makes changes to their products:

C9 Series = 4K 120hz 4:4:4 RGB 12-bit Max
CX Series = 4K 120hz 4:4:4 RGB 8-bit Max


Nvidia apparently locks away 10bit output from gaming GPUs
----------------------------------------------------------------------------------------------------


40Gbps would be enough to do 4k 120Hz 444 10bit HDR + uncompressed audio, and the LG CX is a 10bit panel. So if Nvidia would enable sending a 10bit signal from their gaming GPUs, that part would be good enough for everyone other than people who perhaps stubbornly want to "supersample" their color bit depth by sending 12bit color to a 10bit panel (or have some device or software that can only send 12bit rather than 10bit?). As it is now, you apparently have to send 8bit 444 or 10bit 4:2:0 to the 10bit panel because Nvidia keeps 10bit unavailable.

HDMI DSC can theoretically compress a 12bit signal down to 10bit but would it OVER compress it?
---------------------------------------------------------------------------------------------------------------------------------------------------------------


I mentioned before that it could be possible for manufacturers to use HDMI 2.1 DSC (display stream compression) to compress 12bit down to 10bit for the LG CX to accept, but now that I look at it more, the only DSC compression numbers that I've seen seem to compress the signal A LOT, not just enough to squeeze down to 10bit. In the graph I posted from the HDTVtest video, both 48Gbps 12bit and 40Gbps 10bit were compressed down to 18Gbps. That makes me question whether you could compress 12bit 48Gbps down "just enough" to 40Gbps rather than compressing it more than necessary down to 18Gbps.

LG Doesn't support uncompressed HDMI 2.1 audio pass-through on the LG CX currently
---------------------------------------------------------------------------------------------------------------------------------------------


LG would need to enable 7.1 uncompressed PCM audio passthrough and enable DTS-HD and Atmos (7.1, 9.2 uncompressed) passthrough to get to full functionality on the audio. You can get 7.1 uncompressed PCM using a CRU EDID hack, so it seems possible that a firmware upgrade could enable it by default.

You can do the "phantom" or "fake" monitor trick running a 2nd hdmi cable but that wastes a display output for people who want to run a bunch of monitors. It can also cause window placements to change every time you turn off what is usually a very hot and power hungry receiver. This can screw up current window positions, memorized and hotkeyed window positions by apps like displayfusion, etc. The "virtual" monitor also exists in your extended desktop array so your mouse can slide off onto it though that is rare if you offset the virtual monitor to be on an upper corner and make it a smaller resolution. Finally, you can potentially run into problems when trying to run hdcp video content if you run the audio to a separate virtual monitor - which can cause your screen to go black/blank.
 
I agree but:



Nvidia apparently locks away 10bit output from gaming GPUs
----------------------------------------------------------------------------------------------------


40Gbps would be enough to do 4k 120Hz 444 10bit HDR + uncompressed audio, and the LG CX is a 10bit panel. So if Nvidia would enable sending a 10bit signal from their gaming GPUs, that part would be good enough for everyone other than people who perhaps stubbornly want to "supersample" their color bit depth by sending 12bit color to a 10bit panel (or have some device or software that can only send 12bit rather than 10bit?). As it is now, you apparently have to send 8bit 444 or 10bit 4:2:0 to the 10bit panel because Nvidia keeps 10bit unavailable.

HDMI DSC can theoretically compress a 12bit signal down to 10bit but would it OVER compress it?
---------------------------------------------------------------------------------------------------------------------------------------------------------------


I mentioned before that it could be possible for manufacturers to use HDMI 2.1 DSC (display stream compression) to compress 12bit down to 10bit for the LG CX to accept, but now that I look at it more, the only DSC compression numbers that I've seen seem to compress the signal A LOT, not just enough to squeeze down to 10bit. In the graph I posted from the HDTVtest video, both 48Gbps 12bit and 40Gbps 10bit were compressed down to 18Gbps. That makes me question whether you could compress 12bit 48Gbps down "just enough" to 40Gbps rather than compressing it more than necessary down to 18Gbps.

LG Doesn't support uncompressed HDMI 2.1 audio pass-through on the LG CX currently
---------------------------------------------------------------------------------------------------------------------------------------------


LG would need to enable 7.1 uncompressed PCM audio passthrough and enable DTS-HD and Atmos (7.1, 9.2 uncompressed) passthrough to get to full functionality on the audio. You can get 7.1 uncompressed PCM using a CRU EDID hack, so it seems possible that a firmware upgrade could enable it by default.

You can do the "phantom" or "fake" monitor trick running a 2nd hdmi cable but that wastes a display output for people who want to run a bunch of monitors. It can also cause window placements to change every time you turn off what is usually a very hot and power hungry receiver. This can screw up current window positions, memorized and hotkeyed window positions by apps like displayfusion, etc. The "virtual" monitor also exists in your extended desktop array so your mouse can slide off onto it though that is rare if you offset the virtual monitor to be on an upper corner and make it a smaller resolution. Finally, you can potentially run into problems when trying to run hdcp video content if you run the audio to a separate virtual monitor - which can cause your screen to go black/blank.
DSC is lossless compression. The image should look exactly the same with it on or off.
 
I'll believe that when I see it..

I'd love to see zoomed-in tests of colored text on backgrounds and of high-detail textures and video:

8bit 4:4:4, and 10bit at 4:2:2 and 4:2:0 full/uncompressed, compared to 12bit 4:4:4 using DSC compressed 3:1.
 
DSC is lossless compression. The image should look exactly the same with it on or off.
It cannot be true lossless compression.
VESA claims it is "visually lossless", which only means they did a bunch of tests with various test pictures and people generally could not tell when it was used.

DSC operates on a per-line basis, so it should add only a minuscule amount of input lag, and how visible the compression is depends on the content of the whole line; most images should look very good. In video you mostly have gradients, and color changes within a line are not very sharp. In a typical desktop scenario you do have sharp color changes within a line, but you usually also have a lot of pixels with the same or similar color/pattern, and those are easily compressible.

DSC will probably only be visible on specially prepared test images designed to exploit its weaknesses. It should still be better than dropping color resolution and/or bit depth.
 
I'd think it would be better if you could use DSC just enough to compress 12bit down to 10bit's bandwidth. That's only about 1/6th compression, needed to save 8Gbps, rather than the massive 3:1 compression ratio I see referenced everywhere, which compresses 48Gbps down to 18Gbps.

Or, Nvidia could just start supporting 10bit output on gaming GPUs :yuck:

- and LG could start supporting passing through uncompressed HDMI audio formats over eARC on the CX series....
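To put rough numbers on that "only 1/6th" point (just arithmetic, reusing the estimated ~48.1 and ~40.1 Gbps figures from the bandwidth sketch earlier in the thread):

```python
# Compression ratio actually needed to fit 12-bit into the CX's 40 Gbps link,
# versus the ratio implied by the HDTVtest chart (everything squeezed to 18 Gbps).
needed = 48.1 / 40.1
charted = 48.0 / 18.0
print(f"needed:  ~{needed:.2f}:1")   # ~1.20:1
print(f"charted: ~{charted:.2f}:1")  # ~2.67:1
```

Whether DSC can be configured to compress only that little, rather than always running at its full ratio, is exactly the open question.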
 
I'd think it would be better if you could use DSC just enough to compress 12bit down to 10bit's bandwidth. That's only about 1/6th compression, needed to save 8Gbps, rather than the massive 3:1 compression ratio I see referenced everywhere, which compresses 48Gbps down to 18Gbps.

Or, Nvidia could just start supporting 10bit output on gaming GPUs :yuck:

- and LG could start supporting passing through uncompressed HDMI audio formats over eARC on the CX series....

If Nvidia never supported 10bit on GeForce cards, then what's the whole point of the Acer X27 being able to choose 98Hz 10bit? I've always stayed at 120Hz 8bit so this didn't matter to me, but are you saying selecting 10bit doesn't actually give you 10bit?
 
Doing some fact checking on this atm...

Nvidia gaming cards CURRENTLY only allow you to choose 8-bit or 12-bit color when doing full 4:4:4 RGB over HDMI. Only their professional lines allow 10-bit. You can get 10-bit to show, but only after some type of downgrade such as 4:2:2 or 4:2:0, or using YCbCr color instead of RGB. This can even be tested at lower resolutions such as 4K/30 so as not to saturate the current HDMI 2.0 spec.

8-bit HDR gaming WILL introduce banding and noise for overall lower PQ.

long story short, unless LG and/or Nvidia makes changes to their products:

C9 Series = 4K 120hz 4:4:4 RGB 12-bit Max
CX Series = 4K 120hz 4:4:4 RGB 8-bit Max

According to this article, RTX cards do support 10bit since a driver update in July 2019:

https://www.cnet.com/news/nvidia-studio-drivers-deliver-geforce-30-bit-color-unto-photoshop-and-more/

"I never thought I'd see the day: Until today you had to spring for a pricey Nvidia Quadro workstation graphics card to properly view your shiny ray-traced renders or accurately grade HDR video in professional applications such as Adobe Photoshop and Premiere. Now that 30-bit support comes down to more affordable GeForce and Titan cards. And not just the RTX models -- "across all Nvidia product lines and GPUs. "

"But Photoshop and Premiere use OpenGL to communicate with the graphics card, at least for color rendering, and the specific API calls to use deep color have only worked with Quadro cards. That can sting when you spent over $1,000 on a GTX 1080 Ti. "


-------------------------

https://www.engadget.com/2019-07-29-nvidia-studio-laptops-10-bit-photoshop.html?



" The latest Studio driver, due to be released shortly, will support 10-bit color for all GPUs in Adobe Photoshop CC, Premier CC and other OpenGL-powered apps. That's a pretty big deal, because up until now, you needed to buy a costly Quadro RTX card to get the most out of your fancy 10-bit HDR monitor for photo or video editing. Now, you get the same feature with GeForce RTX, too."


So it sounds like you need an RTX card and to switch to a Studio driver instead of a gaming driver?

The point Seyumi was making was that if the LG CX can't accept the 12bit signal an Nvidia GPU is sending, and no Nvidia gaming driver can send 10bit 444 120Hz 4k, then you'd be limited to 8bit or lower chroma.
 
" The latest Studio driver, due to be released shortly, will support 10-bit color for all GPUs in Adobe Photoshop CC, Premier CC and other OpenGL-powered apps. That's a pretty big deal, because up until now, you needed to buy a costly Quadro RTX card to get the most out of your fancy 10-bit HDR monitor for photo or video editing. Now, you get the same feature with GeForce RTX, too."


So it sounds like you need an RTX card and to switch to a Studio driver instead of a gaming driver?

The point Seyumi was making was that if the LG CX can't accept the 12bit signal an Nvidia GPU is sending, and no Nvidia gaming driver can send 10bit 444 120Hz 4k, then you'd be limited to 8bit or lower chroma.

Yes, you can do this with the Studio driver, but it seems to enable that support only for select apps. The Studio driver gets released far less often than the standard drivers.

I would not worry about this at all, as I expect Nvidia will fix this by the time HDMI 2.1 GPUs are out, or soon after; there isn't really any reason not to support 10-bit color over HDMI. Considering they have worked with LG on the HDMI 2.1 VRR/G-Sync support, you'd think they would want to make sure that those TVs display the best image they can. We don't even know if it is an actual issue yet.
 
Yes.

Nvidia could just start supporting 10bit output in gaming GPUs' gaming drivers :yuck:

- and LG could start supporting passing through uncompressed HDMI audio formats over its eARC on the CX series....

Hopefully they will both come through.
 
Lol, how do you even know they don't? Are you saying if I go and select 98Hz 10bit right now I will NOT actually get 10bit unless I switch to a Studio driver? If that's gonna be the case, why not just grey out the 10bit option unless I do switch to a Studio driver?
 
Lol, how do you even know they don't? Are you saying if I go and select 98Hz 10bit right now I will NOT actually get 10bit unless I switch to a Studio driver? If that's gonna be the case, why not just grey out the 10bit option unless I do switch to a Studio driver?
Are you talking about HDMI connection?
What do you have checked under Digital color format?
 
Are you talking about HDMI connection?
What do you have checked under Digital color format?

No, I'm using DP 1.4 but it's RGB 444. At 120Hz it's 8bit, and at 98Hz you have the option to select 10bit. Can someone just link me a test pattern that easily shows a difference between 8bit and 10bit? That way we can find out if Nvidia really does or does not support 10bit color on a regular GeForce driver. Hell, I'll even test it out on my GTX 1080 Ti too. If Nvidia supports 10bit color through DP....I don't see why they'd suddenly limit it to 8bit over HDMI.

From the TFTC review of the Asus PG27UQ, which has the same limitations:


It would make no sense for Nvidia to support 10bit over DP 1.4 but then drop support for it over HDMI 2.1. If I can run 98Hz 444 10bit over DP 1.4, then I should be able to run 120Hz 444 10bit over HDMI 2.1 once it's out. That is, of course, unless Nvidia doesn't actually output a 10bit signal even when you do select it.
 
If that's gonna be the case, why not just grey out the 10bit option unless I do switch to a Studio driver?


https://www.cnet.com/news/nvidia-st...geforce-30-bit-color-unto-photoshop-and-more/ (July 29, 2019)

"Photoshop has long given you the option to turn on a 30-bit color pipe between it and the graphics card. But if you enabled it on a system with a consumer-targeted GeForce or Titan graphics card, it didn't do anything. That's why there's always been such confusion as to whether you could display 30-bit color with a GeForce card. I mean, there's a check box and you can check it! "

----------------------------------------------------------

"To properly take advantage of this, you still need all the other elements -- a color-accurate display capable of 30-bit (aka 10-bit) color, for one. The ability to handle a 30-bit data stream is actually pretty common now -- most displays claiming to be able to decode HDR video, which requires a 10-bit transform, can do it -- but you won't see much of a difference without a true 10-bit panel, which are still pretty rare among nonprofessionals.

That's because most people associate insufficient bit depth with banding, the appearance of visually distinguishable borders between what should be smoothly graduated color. Monitors have gotten good at disguising banding artifacts by visually dithering the borders between colors where necessary. But when you're grading HDR video or painting on 3D renders, for example, dithering doesn't cut it.

And the extra precision is surely welcome when your doctor is trying to tell the difference between a tumor and a shadow on his cheap system. From Nvidia's own white paper in 2009: "While dithering produces a visually smooth image, the pixels no longer correlate to the source data. This matters in mission-critical applications like diagnostic imaging where a tumor may only be one or two pixels big."
 
https://www.cnet.com/news/nvidia-st...geforce-30-bit-color-unto-photoshop-and-more/

"Photoshop has long given you the option to turn on a 30-bit color pipe between it and the graphics card. But if you enabled it on a system with a consumer-targeted GeForce or Titan graphics card, it didn't do anything. That's why there's always been such confusion as to whether you could display 30-bit color with a GeForce card. I mean, there's a check box and you can check it! "

So you're saying selecting 10bit doesn't actually give you 10bit then? OK then, let's put that to the test. If you can link me a test pattern (or a game, as TFTC states it's only useful for games) that easily shows whether the GPU is outputting 8 or 10 bit, then I'll give it a shot.
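Not a calibrated pattern, but here's a quick sketch of the kind of image that would show it (assuming NumPy and imageio are installed): a very shallow 16-bit gray ramp. Caveat: the SDR desktop and most image viewers composite at 8 bits, so you'd need a 10-bit aware viewer or a full-screen/HDR pipeline for the 10-bit path to actually look different.

```python
import numpy as np
import imageio.v2 as imageio

W, H = 3840, 2160
# A gray ramp spanning only 1/16th of the full range: an 8-bit pipeline can
# only show ~16 distinct steps across the screen, a 10-bit one ~64.
ramp = np.linspace(0.5, 0.5 + 1.0 / 16, W)

img16 = np.tile(np.round(ramp * 65535).astype(np.uint16), (H, 1))
imageio.imwrite("ramp_16bit.png", img16)   # 16-bit grayscale PNG

img8 = np.tile(np.round(ramp * 255).astype(np.uint8), (H, 1))
imageio.imwrite("ramp_8bit.png", img8)     # 8-bit reference for comparison
```

If both files band identically on your setup, somewhere in the chain you're only getting 8 bits.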
 