LG 48CX

And you've only accomplished this while using Tidal and still have to reset it every time you boot? Also what specific receiver are you using?
Denon AVR X3400H

However, I just ran the Dolby channel check movie and it is only sending 2.1 (nothing going to the center and surround speakers), so maybe it isn't working. It has been a while since I tested this.
 
This is not correct. I have 5.1 Dolby Surround to my Denon right now through eArc.

EDIT: I made a few posts about this earlier in the thread with a picture here.
The CX EDID doesn't report 5.1 / 7.1 channel PCM so it simply doesn't appear as a selectable speaker configuration in Windows. Are you sure it's sending multi-channel LPCM and not bitstreaming a Dolby format?

Maybe the EDID changes when the TV is connected to an eARC AVR.
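One way to check what a display's EDID actually advertises is to decode the Short Audio Descriptors from the CEA extension block of an EDID dump. A rough sketch of the decoding (the example bytes below are hypothetical, not from a real CX):

```python
# Decode CEA-861 Short Audio Descriptors (SADs) from an EDID CEA
# extension's audio data block. Each SAD is 3 bytes:
#   byte 0: bits 6-3 = audio format code, bits 2-0 = (max channels - 1)
#   byte 1: supported sample-rate bits
#   byte 2: for LPCM, supported bit-depth bits
AUDIO_FORMATS = {1: "LPCM", 2: "AC-3", 7: "DTS", 10: "Dolby Digital Plus"}

def decode_sads(raw: bytes) -> list:
    """Return (format name, max channels) for each 3-byte SAD."""
    sads = []
    for i in range(0, len(raw) - 2, 3):
        code = (raw[i] >> 3) & 0x0F
        channels = (raw[i] & 0x07) + 1
        sads.append((AUDIO_FORMATS.get(code, f"format {code}"), channels))
    return sads

# Hypothetical example block: 2-channel LPCM plus 6-channel (5.1) AC-3.
example = bytes([0x09, 0x07, 0x07,   # LPCM, 2ch
                 0x15, 0x07, 0x50])  # AC-3, 6ch
print(decode_sads(example))  # [('LPCM', 2), ('AC-3', 6)]
```

If the TV re-advertises a different EDID on the eARC path when an AVR is attached, a dump taken in that state would show different SADs, which would settle the question above.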
 
I tried eARC again with my x4700h....still garbage. lol

It's possible to make it kinda work sometimes, but just now I played an Atmos movie through Kodi for a few minutes (it was fine), then played a couple games for about an hour (still fine), but then sitting here on the desktop with no sound playing it started bugging out, with the TV flipping between TV Speaker and ARC. If I remember correctly from when I first bought this, I can probably shut things down and restart and be OK again for some period of time... sometimes a couple hours and sometimes 5 minutes.

I bought a displayport to hdmi adapter a long time ago and never looked back. eARC just doesn't work with the CX. That said, I think I had regular ARC working just fine in the past but I paid too much money for all this stuff to have lesser audio.

I certainly WISH I could try all this with a 3090...maybe next year when I finally find one in stock I'll try eARC again.
 
It would really suck if they can't get this working properly on the LG CX TVs. Theoretically eARC should work as I outlined below, so I hope they get it fixed eventually.
-------------------------------------------------------------------------------

I agree that passing video through a receiver is a bad idea. It will always add input lag; receivers even have lip-sync delay timing settings as an option for worst-case scenarios, which should tell you something. I can't imagine a 4k hdmi 2.1 receiver handling video signals without input lag while also handling 120hz VRR and all of the other features seamlessly.

You should theoretically be able to pass a digital audio signal via HDMI 2.1 through an eARC-capable TV and out of the TV's eARC hdmi port to an eARC-capable receiver... as if the receiver were just a DAC and amp receiving an audio line out, processing and amplifying it to your speakers.

Where DTS isn't supported, you should be able to transcode a DTS audio track on the fly in Plex to uncompressed PCM before you send it to the TV. You can probably also convert DTS audio to PCM on the fly using mpc-hc in Windows if not using Plex. You could probably transcode those movies or streaming services from other hardware sources too, for example using an Nvidia Shield set to transcode to PCM on a home theater TV, in scenarios such as where you have a CX in your living room with a surround system and no PC connected to it.
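As a sketch of what such a transcode step boils down to, here is a hypothetical ffmpeg invocation (built as a command list, not executed) that decodes the audio track to uncompressed 24-bit PCM while leaving video and subtitles untouched; the filenames are placeholders:

```python
# Sketch: build an ffmpeg command that copies video/subs untouched and
# decodes the audio track to uncompressed 24-bit PCM (which an eARC
# path can carry even when the TV can't bitstream DTS).
def dts_to_pcm_cmd(src: str, dst: str) -> list:
    return [
        "ffmpeg", "-i", src,
        "-map", "0",            # keep all streams from the input
        "-c:v", "copy",         # don't re-encode video
        "-c:s", "copy",         # keep subtitles as-is
        "-c:a", "pcm_s24le",    # decode audio to 24-bit little-endian PCM
        dst,
    ]

cmd = dts_to_pcm_cmd("movie.mkv", "movie_pcm.mkv")
print(" ".join(cmd))
```

Plex and mpc-hc do this kind of decode internally; the sketch just shows the shape of the operation.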

For PC gaming, most games are probably PCM in the end after their audio API.
PC Gaming Surround Sound Roundup (updated October 2020)
Anything in a modern surround format would probably be Dolby formats up to Dolby Atmos~TrueHD rather than DTS. The only problem with converting to PCM is that it probably strips the Atmos ceiling-speaker data that was added to TrueHD. If you had a legit Atmos ceiling-speaker setup you could just turn off the transcoding for that movie anyway, since it would only be needed to pass audio from "legacy" DTS-only titles. (I doubt most people using these TVs as a PC gaming monitor would have, or pay extra for, ceiling speakers and an Atmos surround receiver over a 5.1 or 7.1 non-Atmos setup anyway.)

Those uncompressed hdmi audio tracks are mostly for movies with Dolby TrueHD tracks or Atmos (which is TrueHD + extra ceiling-speaker data), though there are some games listed below (and perhaps others) that have Atmos~TrueHD support now.

https://www.pocket-lint.com/games/n...r-gaming-what-is-it-and-what-games-support-it

Beyond that, using your TV for input switching allows you to keep OSD settings per input, which is cleaner and more handy in my opinion, provided you have enough inputs on your TV. Another big factor is that you can get a ~$400 + tax eARC-capable audio receiver that doesn't do all of the 4k hdmi 2.1 video processing stuff found in receivers that cost $1200 - $1500.

As far as I know, the only ways to get full uncompressed audio formats without input lag or receiver video processing are:
... using eARC OUT to a capable receiver's eARC IN
... using a SHARC off of the TV's eARC OUT to a non-ARC receiver's HDMI IN
or
... using the "fake" monitor trick off of another hdmi output (or displayport to hdmi adapter) on your PC. This method drops the "fake" monitor out of the array every time you turn the receiver off, so it will screw up window positions and window position management software every time you turn the receiver on or off. To me that is a very clunky workaround and not valid for my purposes. Nvidia could probably fix this by allowing for a screenless/pixel-free audio device off of one of the video outputs, but they have never bothered to.

...EDIT: Perhaps in the long run I can swap out one of my 43" side monitors for an hdmi 2.1 eARC-capable one if such gets released someday (one that actually has working eARC passthrough, perhaps even including DTS passthrough), using that display as the audio device essentially. That way I wouldn't have to make an "invisible" monitor that disappears every time I turn off my receiver.

Overall, the passthrough method of eARC out from the TV is theoretically the cleanest way to do audio out that can still carry uncompressed HDMI audio formats, unlike spdif/optical, which can't. It is the cleanest from the processing standpoint and also for managing sources and inputs, storing or "loading" the optimal OSD settings relevant to each source. Once you set up a source input's settings on the TV the first time, you'd usually just switch inputs on the TV after that with the settings already stored, rather than having to swap OSD settings on the TV every time you swap the receiver to a different source (e.g. PC and a console). eARC can also control the receiver's volume if set up properly, so it eliminates having to use the receiver remote for the most part.
 
Let's say you paid $50 for a gold plated HDMI Monster cable even though you knew there wasn't a chance it would make any difference over the $5 one. Let's then say you found out that the $50 cable doesn't have gold plating and is actually an exact duplicate of the $5 cable. You are saying that because there is no real-world functional difference you wouldn't be upset in the slightest?

We still have a ways to go until we elucidate all of the consequences of not getting C9-quality 48 Gbps HDMI 2.1. Obviously in theory, even accounting for improved rounding errors from a 12-bit signal, there should not be a real-world difference on a 10-bit display. But I'm just saying that right now our budget HDMI 2.1 ports are beginning to get that shit smell, and I hope I'm wrong and some updates magically fix everything.

By broken I mean I highly doubt the consoles will be fully HDMI 2.1 ready at launch. We shall see.

It's more like you thought you were getting 24 karat gold plating but only got 20 karat, while Monster advertised neither.

Not having full HDMI 2.1 bandwidth has so far caused no issues. There have been all kinds of speculation that turned out to be simply bugs in the CX series firmware. With more devices coming to market that only support 40 Gbps, there is unlikely to be anything on the market that makes good use of the full bandwidth. By the time there is a need for the full 48 Gbps there had better be native 12-bit panels, content that uses them (Dolby Vision can, but will actual content be made with that support?), etc. I fully expect that nobody but the most anal users will notice a difference vs 10-bit either.
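For a rough sense of where the 40 vs 48 Gbps line falls, here is a back-of-envelope calculation using the standard 4K120 timing (total pixels including blanking) and ignoring FRL coding overhead, so the figures are approximate:

```python
# Back-of-envelope HDMI bandwidth check for the standard 4K120 timing
# (4400 x 2250 total pixels incl. blanking, i.e. a 1.188 GHz pixel clock).
# FRL coding overhead is ignored, so these are rough figures.
H_TOTAL, V_TOTAL, REFRESH = 4400, 2250, 120

def gbps(bits_per_pixel: int) -> float:
    """Raw pixel-data rate in Gbps for the timing above."""
    return H_TOTAL * V_TOTAL * REFRESH * bits_per_pixel / 1e9

for depth in (8, 10, 12):
    rate = gbps(depth * 3)  # RGB: 3 components per pixel
    print(f"4K120 RGB {depth}-bit: {rate:.2f} Gbps "
          f"(fits 40 Gbps: {rate <= 40}, fits 48 Gbps: {rate <= 48})")
```

By this estimate 10-bit lands around 35.6 Gbps (inside the CX's 40 Gbps) while 12-bit needs roughly 42.8 Gbps, which is the only case that would call for the full 48 Gbps link.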

By the time there is a need for the full bandwidth we are talking about 4K 240 Hz displays and probably Nvidia 50 series GPUs to be able to make any use of that refresh rate.

It's just plain not worth worrying about with what we have now.

I also expect that MS and Sony are testing the HDMI 2.1 capabilities and that they will work just fine. It might take time for games to catch up and make any good use of them though as most console games are still using things like fixed framerates.
 
Hoping it is all just firmware issues and not hardware limitations. If the C9 had the same issues, that would seem to indicate they are not related to the corner-cutting on the HDMI bandwidth hardware.

As it stands now the remaining issues seem to be (unless I am forgetting something):

... 120hz 4k 444 10bit VRR range has stuttering
... eARC 5.1/7.1 reliability/stability is not solid at all (dropping to, or only capable of, stereo; connection not remaining solid, requiring reconnects or restarts)
... slight raising of black depth in areas with near-blacks, or some near-black flashing, with VRR active
... slight raising of blacks with Dolby Vision HDR
 
The CX EDID doesn't report 5.1 / 7.1 channel PCM so it simply doesn't appear as a selectable speaker configuration in Windows. Are you sure it's sending multi-channel LPCM and not bitstreaming a Dolby format?

Maybe the EDID changes when the TV is connected to an eARC AVR.
These are my selectable speaker configurations:
[screenshot: the selectable speaker configurations in Windows]


I changed it to Stereo and then back to 5.1 but now the AVR won't allow Dolby to be selected as a sound mode. ugh :(
 
From what I've heard from HDTVTest and AVSForum, raised blacks happen in Dolby Vision HDR, and there was mention of raised near-blacks in HDR10.
Here are a few examples from the video below showing the raised blacks in Dolby Vision vs non-Dolby Vision:
[two screenshots: raised blacks in Dolby Vision vs non-Dolby Vision]

There are also reports that slightly raised blacks (1 notch on a "can you see this" contrast-setting adjustment in games) happen in and adjacent to near-black areas when VRR is active. Since the latest firmware update a few people have reported that near-blacks have been flashing or flickering on their CX too. I think the flashing was with a 3000 series GPU if I'm not mistaken.

Apparently some time ago, when the C9 was the latest model, a firmware fix was applied to address near-blacks flashing by dithering the near-blacks (even if the rest of the screen was not dithered, I think). However, the dithering must have been aggressive because it resulted in a loss of detail, so LG changed this in a later firmware to a flattening of blacks as a workaround. According to what I've read and listened to online, when VRR is active (on both the C9 and the CX) that black-flattening "near black fix" workaround is bypassed. This reportedly results in slightly raised blacks in near-black areas. Since it is not the whole screen but rather just the near-black areas of the screen throughout scenes, changing the overall gamma or other settings would alter the rest of the image too, so just making the whole screen darker isn't a perfect fix either. However, the difference is not overtly noticeable to everyone, especially those not viewing in dark rooms, so it doesn't seem to be getting a lot of attention so far and probably isn't a huge deal to everyone. Still, it would be nice if it were fixed.
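As a toy illustration of the trade-off described above (not LG's actual algorithm), quantizing a dim ramp to coarse steps either crushes it to black or, with dithering, preserves the average level at the cost of pixel-level noise:

```python
import random

# Toy illustration of why panels dither near-blacks: quantizing a dim
# ramp to coarse steps either crushes it (truncation) or preserves the
# average level (random dithering), at the cost of pixel-level noise.
random.seed(0)
STEP = 4  # coarse quantizer step, a stand-in for limited near-black precision

def truncate(v: float) -> int:
    return int(v // STEP) * STEP

def dither(v: float) -> int:
    return int((v + random.uniform(0, STEP)) // STEP) * STEP

ramp = [i / 10 for i in range(40)]  # near-black values 0.0 .. 3.9
flat = [truncate(v) for v in ramp]
noisy = [dither(v) for v in ramp]

print("truncated average:", sum(flat) / len(flat))    # crushed to 0
print("dithered average: ", sum(noisy) / len(noisy))  # near the true ~1.95
```

Every truncated value collapses to 0, while the dithered values bounce between 0 and 4 so that their average tracks the original signal level.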

The Dolby Vision raised (or "elevated") blacks apparently affect the whole screen. Vincent reported on this again, and also mentioned raised near-blacks in HDR10, at this timestamp back on firmware 03.10.20 in August 2020:

"Obviously there are other bugs or say, shortcomings that LG needs to work on. Lets say if you had elevated blacks in Dolby Vison, if you had elevated near-blacks in HDR10 content - you know there will be panel to panel variation so sometimes it may imporve it, sometimes it may not but when I applied the firmware here it didn't really make any change and also the raised near black gamma in VRR mode, you know, that all remains unchanged.. then I can reassure you that LG is on the case. You know, they are aware of these issues and hopefully they will be able to fix these issues - or at least- minimize these issues in the next major firmware."
 
Hoping it is all just firmware issues and not hardware limitations. If the C9 had the same issues, that would seem to indicate they are not related to the corner-cutting on the HDMI bandwidth hardware.

As it stands now the remaining issues seem to be (unless I am forgetting something):

... 120hz 4k 444 10bit VRR range has stuttering
... near-black flashing with VRR active
I think this is a glitch with the "instant game response" mode not triggering properly sometimes; turning it off and on again fixes these issues for me. I need to do it sometimes when turning the TV on, but not always.
Definitely perfectly smooth in the 100-120 fps range.
Turning off instant game response also glitches, as it says it has been launched.. :D
(cx 55)
Edit: in HDR, G-Sync seems to not work at all? Only tried it in Destiny 2, where it doesn't work in SDR either. In WoW and the WoW beta it works fine once enabled/stable.
 
It's more like you thought you were getting 24 karat gold plating but only got 20 karat, while Monster advertised neither.

Not having full HDMI 2.1 bandwidth has so far caused no issues. There have been all kinds of speculation that turned out to be simply bugs in the CX series firmware. With more devices coming to market that only support 40 Gbps, there is unlikely to be anything on the market that makes good use of the full bandwidth. By the time there is a need for the full 48 Gbps there had better be native 12-bit panels, content that uses them (Dolby Vision can, but will actual content be made with that support?), etc. I fully expect that nobody but the most anal users will notice a difference vs 10-bit either.

By the time there is a need for the full bandwidth we are talking about 4K 240 Hz displays and probably Nvidia 50 series GPUs to be able to make any use of that refresh rate.

It's just plain not worth worrying about with what we have now.

I also expect that MS and Sony are testing the HDMI 2.1 capabilities and that they will work just fine. It might take time for games to catch up and make any good use of them though as most console games are still using things like fixed framerates.
In my opinion, you cannot say that it "has so far caused no issues" until you can actually test it.

I have already resigned myself to HTPC dual 2.1 out to the CX and a ghost-screen AVR, with the PS5/Series X going 2.1 to the TV then eARC to the AVR. Unfortunately I still can't buy anything.
 
Just bought a new 65CX to replace the 65B6 in the living room. Excited for the 120Hz panel and HDMI 2.1 support!
 
For some odd reason 03.11.26 has been pulled from all sites, Korea and even the US (where it finally made it a few days ago).
 
I'm putting together a new build in the near future. Is the LG CX series pretty much the go to if I'm only going to have one display for TV and PC Monitor? Everything seems to suggest that. I'm considering a 48 or 55 wall mounted.
 
I'm putting together a new build in the near future. Is the LG CX series pretty much the go to if I'm only going to have one display for TV and PC Monitor? Everything seems to suggest that. I'm considering a 48 or 55 wall mounted.
Yes!
 
PSA TO Y'ALL......(except Elvn lmao)

Make sure you set Colorimetry in the hidden menu to BT2020, otherwise colors are washed out ass!
If you don't know how to get to the hidden menu, see post 4734 above.


[screenshot: the hidden-menu Colorimetry setting]
 
Didn't make much of a diff here, if at all.
Must've had it set to BT2020 manually without hard-setting the flag.
 
PSA TO Y'ALL......(except Elvn lmao)

Make sure you set Colorimetry in the hidden menu to BT2020, otherwise colors are washed out ass!
If you don't know how to get to the hidden menu, see post 4734 above.


[screenshot: the hidden-menu Colorimetry setting]
You do not need to do this. The display detects a change to BT2020 just fine. If it doesn't, try going to 4K 60 Hz and back to 4K 120 Hz again. BT2020 should not be forced for anything but HDR mode, and not even there.
 
In my opinion, you cannot say that it "has so far caused no issues" until you can actually test it.

I have already resigned myself to HTPC dual 2.1 out to the CX and a ghost-screen AVR, with the PS5/Series X going 2.1 to the TV then eARC to the AVR. Unfortunately I still can't buy anything.
I have been running 4K 120 Hz with the Club 3D adapter for a few months now. All the issues have been either adapter or TV firmware related, not bandwidth related.
 
Hello,

Locking the FPS at 100 or below doesn't stop the stuttering on my TV...
If you can test it in Forza Horizon 4, the stuttering is visible...
I am wondering if this issue is related to the cable itself?
Is there a certification label for HDMI 2.1 cables?
 
How is the Club 3D adapter today? Does it allow for trouble-free (no stuttering, no lag spikes, or artifacts of any kind) 4k 120hz 4:4:4 HDR gaming on a 2080Ti?
 
How is the Club 3D adapter today? Does it allow for trouble-free (no stuttering, no lag spikes, or artifacts of any kind) 4k 120hz 4:4:4 HDR gaming on a 2080Ti?
It does not support G-Sync (at least not yet), but it has been surprisingly trouble-free on the v1.03 adapter firmware since the 03.11.25 TV firmware update. Far removed from requiring constant power cycling like it did earlier.

The only issue that still remains is that when enabling HDR at 4K 120 Hz it might output the wrong color space instead of BT2020. This causes washed-out colors. It seems to be an issue with the adapter, as it happened with both the C9 and CX. The solution is to simply toggle back to 4K 60 Hz and then back to 4K 120 Hz, a few times if it doesn't stick on the first try. I use the ColorControl app to do this with a few clicks. I haven't seen this issue occur in games, only on the desktop.
 
some updates on eARC from avsforum:

https://www.avsforum.com/threads/2020-lg-cx–gx-dedicated-gaming-thread-consoles-and-pc.3138274/post-60164474
I am one who has had issues getting things to work. It basically works now by setting Windows to output Atmos. I occasionally get some handshake issues that require turning eARC off then on again on the TV. It's not every time, though, so I'm sure it can be fixed in updates. Windows does not let me output LPCM 5.1/7.1, but everything works fine via Atmos, so that's good enough for me. On the console side, the Series X will support Atmos so I would always have that on; for the PS5, we will see if they use bitstream for their positional audio tech or if, like the PS4, LPCM is the best method. I'm confident that consoles will work differently than PC HDMI out.


https://www.avsforum.com/threads/2020-lg-cx–gx-dedicated-gaming-thread-consoles-and-pc.3138274/post-60165220
FallenUndying said:
So I noticed now that when switching to something like a game, or when I was watching that Disney WoW video, my speakers would give off a horrible clipping sound (a popping sound), and it was really loud just now starting God of War. Any ideas?
Might be the ARC turning on and off constantly. I have seen that, and you basically have to force ARC off then on again.

I got this from my PC, but not too loud. It ended up switching between TV speakers and ARC constantly.

So it's not working reliably yet. It drops out from time to time like a broken signal and isn't working with uncompressed PCM 5.1/7.1 for everyone. PCM could be an issue, since a workaround for the lack of DTS support could be transcoding to uncompressed PCM. Hopefully they'll fix it and make the handshaking/signal consistent and reliable in a future firmware.

I'm hopeful for a timely fix since it sort of works inconsistently as it is now.
 
Just unpacked my 55CX, and holy mother of screens, that thing is huge (and I'm coming from a 43" already).

Now my question:
1) Shouldn't I be able to do 4k60 10bit RGB on HDMI 2.0 (2070 Super)?

I can only select 4k60 8bit RGB under "native" resolutions or only 4k120 (and 4k100) 10bit 4:2:0 under PC settings.

I actually can't even set 1440p60 10bit as the control panel won't allow me to..

Any suggestions?


2) what's actually the latest FW for european models? Kinda stuck on .10.20, shouldn't the .10.25 already be out in EU?
 
Just unpacked my 55CX, and holy mother of screens, that thing is huge (and I'm coming from a 43" already).

Now my question:
1) Shouldn't I be able to do 4k60 10bit RGB on HDMI 2.0 (2070 Super)?

I can only select 4k60 8bit RGB under "native" resolutions or only 4k120 (and 4k100) 10bit 4:2:0 under PC settings.

I actually can't even set 1440p60 10bit as the control panel won't allow me to..

Any suggestions?


2) what's actually the latest FW for european models? Kinda stuck on .10.20, shouldn't the .10.25 already be out in EU?
On hdmi 2.0 you only get 60Hz 8bit RGB. For 10bit you need to drop down in refresh rate, but you can do that if you want to watch video content, for example.

10.25 and 10.26 have been out, but judging by previous replies in the thread they have possibly been recalled (likely since they don't work properly).
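For reference, a quick back-of-envelope on why 4K60 10-bit RGB doesn't fit HDMI 2.0: the 18 Gbps TMDS link uses 8b/10b coding, leaving 14.4 Gbps for pixel data, and the standard 4K60 timing (including blanking) needs more than that at 10 bits:

```python
# Why HDMI 2.0 tops out at 4K60 8-bit RGB: the 18 Gbps link uses 8b/10b
# TMDS coding, leaving 14.4 Gbps for pixel data. The standard 4K60 timing
# is 4400 x 2250 total pixels (incl. blanking) at a 594 MHz pixel clock.
TMDS_DATA_GBPS = 18.0 * 8 / 10        # 14.4 Gbps usable for data
PIXELS_PER_SEC = 4400 * 2250 * 60     # 594,000,000 pixels/s

for depth in (8, 10):
    rate = PIXELS_PER_SEC * depth * 3 / 1e9  # RGB: 3 components per pixel
    print(f"4K60 RGB {depth}-bit: {rate:.3f} Gbps, "
          f"fits HDMI 2.0: {rate <= TMDS_DATA_GBPS}")
```

8-bit lands at about 14.26 Gbps, just under the limit, while 10-bit needs roughly 17.8 Gbps, which is why the control panel only offers 4K60 RGB at 8 bit (or 10-bit only with chroma subsampling or a lower refresh rate).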
 
Regarding eARC and Windows: I just read this Reddit post and it is the opposite of what I thought was correct for the last 10 years or so. Setting Windows to stereo seems bad for movies.

EDIT: Point is that I guess I have no clue whether my eArc is working lol
 
I just received an OTA firmware update for 03.11.10 ... seems mine totally skipped 03.10.25 and 03.10.26. Anyone else?
 
On hdmi 2.0 you only get 60Hz 8bit RGB. For 10bit you need to drop down in refresh rate, but you can do that if you want to watch video content, for example.

10.25 and 10.26 have been out, but judging by previous replies in the thread they have possibly been recalled (likely since they don't work properly).
Thanks, yeah I'm an idiot. Forgot I ran my previous monitor over DP..
 
Hmmm... the one I have is 03.11.26 actually.

Actually, I may have misspoken. I read steiNetti's post and assumed that the recent updates everyone has been doing manually were 03.10.25 and 03.10.26, but prior to this update I was on 03.10.44, so it must have been 03.11.25 and 03.11.26 that I was thinking of. So according to Snyda, I'm now up to the latest one on LG's page - not the newest ones that were pulled, but still upgraded from the one I was on previously. Initially I thought that I had surpassed the ones that were pulled and there was already a newer version out, but it appears that isn't the case.
 
03.11.26 is gone from the LG download page - it says 03.11.10 now. Strange. "Serious" problem identified and a "roll-back" performed?

That's what it says on the website? Serious problem/roll-back? Maybe it has something to do with the G-Sync stutters, who knows. Guess they didn't 100% fix it with 11.26.
 
Any way to enable G-Sync (or G-Sync compatible) with that display over HDMI 2.0 (2070S)?

Tried FreeSync Premium on and off on the TV, and even though I can choose "G-Sync Compatible" (with either of the TV settings) in the control panel and activate fullscreen and windowed G-Sync in the settings, I can't use it in the Pendulum demo and I also get screen tearing at 60Hz..
 
Has anyone messed around with custom resolutions? I wanted to try 3200x1800, but when I try to use CRU I get a black screen, and the same with the Nvidia control panel.

any help would be appreciated!
 