LG 48CX

So even if it had the additional bandwidth, the panel itself doesn’t support 12 bit?

It doesn’t look like Sony’s offerings go up to 120Hz at 4K, so the LG does seem to be the best gaming display at the moment.
 
So even if it had the additional bandwidth, the panel itself doesn’t support 12 bit?

It doesn’t look like Sony’s offerings go up to 120Hz at 4K, so the LG does seem to be the best gaming display at the moment.
This is correct. Sony makes some amazing TVs for movie watching, but for gaming, LG has it with G-Sync, 4K120, 120 Hz BFI, and a host of other features not found on competing sets.
 
So even if it had the additional bandwidth, the panel itself doesn’t support 12 bit?

It doesn’t look like Sony’s offerings go up to 120Hz at 4K, so the LG does seem to be the best gaming display at the moment.
There are no consumer 12bit panels to my knowledge.
 
So they downgraded the freakin ports from last year's model and it is not full 48 Gbps HDMI 2.1. Fine for the bedroom but I am not buying another one for gaming. I'm sure it is killer for gaming but for $1,500 it should have the full spec. I really can't believe they did that.

The new video cards can utilize the full 48 Gbps, correct?
What's the point of adding the bandwidth necessary for 4k120 12-bit when you have a 10-bit panel? It's a complete non-issue; there's nothing lost whatsoever by using lower bandwidth ports. The only *potential* issue would have been if NVIDIA didn't expose 10-bit over HDMI, which they did, making this a complete non-issue.

The reason they did it was likely due to cost; higher yields on slower-speed ports. Same reason the XBXS and PS5 are using 40 Gbps ports.
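For anyone wondering where the 40 vs 48 Gbps figures actually come from, here's a rough back-of-the-envelope calculation in Python. It assumes the common CTA-861 timing for 4K120 (4400 x 2250 total pixels including blanking, roughly a 1188 MHz pixel clock) and the 16b/18b FRL encoding overhead, so treat the exact numbers as approximate rather than official spec math.

```python
# Rough estimate of the HDMI 2.1 link rate needed for 4K 120 Hz 4:4:4.
# Assumes the common CTA-861 timing for 3840x2160@120 (4400x2250 total,
# ~1188 MHz pixel clock) and 16b/18b FRL encoding overhead; approximate only.

H_TOTAL, V_TOTAL, REFRESH = 4400, 2250, 120       # active + blanking
PIXEL_CLOCK = H_TOTAL * V_TOTAL * REFRESH         # ~1.188e9 pixels per second
FRL_OVERHEAD = 18 / 16                            # 16b/18b line encoding

def link_rate_gbps(bits_per_component: int) -> float:
    bits_per_pixel = bits_per_component * 3       # 4:4:4, three components per pixel
    return PIXEL_CLOCK * bits_per_pixel * FRL_OVERHEAD / 1e9

for bpc in (8, 10, 12):
    print(f"{bpc}-bit 4:4:4 4K120 ~ {link_rate_gbps(bpc):.1f} Gbps")

# Approximate output:
#  8-bit  ~ 32.1 Gbps  -> fits easily in 40 Gbps
# 10-bit  ~ 40.1 Gbps  -> just fits the CX / console 40 Gbps ports
# 12-bit  ~ 48.1 Gbps  -> needs the full 48 Gbps (C9)
```

Which lines up with 10-bit 4:4:4 being about what a 40 Gbps port can carry at 4K120, and 12-bit being the only thing the lower-bandwidth ports give up.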
 
What's the point of adding the bandwidth necessary for 4k120 12-bit when you have a 10-bit panel? It's a complete non-issue; there's nothing lost whatsoever by using lower bandwidth ports. The only *potential* issue would have been if NVIDIA didn't expose 10-bit over HDMI, which they did, making this a complete non-issue.

The reason they did it was likely due to cost; higher yields on slower-speed ports. Same reason the XBXS and PS5 are using 40 Gbps ports.
The only argument I've heard is that they think sending a 12-bit signal will allow the TV to do processing with more data and make the image look better. It's a pretty weak argument. The only time you use 4K@120Hz is for games, which wouldn't benefit from it. You have all processing turned off for games to minimize input lag, and even if you left the processing on, games wouldn't look better. All the fancy processing is there to compensate for low framerate, low resolution, and the limitations of cameras. And if you want to send 12-bit for those, just set your refresh rate lower and you have plenty of bandwidth to do so.
 
The extra bandwidth would be pointless for picture quality, but 144Hz would be neat.
I feel that these OLEDs are now as near perfect as display tech has ever been, and if I upgrade in the future it will be for a higher refresh rate.
 
The only argument I've heard is that they think sending a 12-bit signal will allow the TV to do processing with more data and make the image look better. It's a pretty weak argument. The only time you use 4K@120Hz is for games, which wouldn't benefit from it. You have all processing turned off for games to minimize input lag, and even if you left the processing on, games wouldn't look better. All the fancy processing is there to compensate for low framerate, low resolution, and the limitations of cameras. And if you want to send 12-bit for those, just set your refresh rate lower and you have plenty of bandwidth to do so.
Yeah this argument has never made any sense and screams placebo. If the TV can display colors 1 and 2, and you send it 1.7, it gets rounded and displays 2, just like if you sent it 2 in the first place.
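To make that rounding point concrete, here's a toy sketch (purely hypothetical values, not how any particular TV actually processes the signal): whether the source quantizes to 10-bit before sending or the panel rounds a 12-bit input down to its native 10 bits, the displayed level ends up the same.

```python
# Toy illustration of the rounding argument (hypothetical example, not a model
# of any real TV's processing pipeline): a 10-bit panel only has 1024 levels,
# so a 12-bit input just collapses back onto those same levels.

def via_12bit(value: float) -> int:
    """Quantize a normalized 0..1 value to 12-bit, then round to 10-bit at the panel."""
    code_12 = round(value * 4095)
    return round(code_12 / 4)

def direct_10bit(value: float) -> int:
    """Quantize the same 0..1 value straight to 10-bit at the source."""
    return round(value * 1023)

# The post's "color 1.7" example, expressed as a value between 10-bit levels 1 and 2.
x = 1.7 / 1023
print(via_12bit(x), direct_10bit(x))   # both print 2
```

In principle the TV could do fancier things with the extra bits (dithering, processing), but as noted above that processing is exactly what gets turned off for gaming.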
 
I updated to the latest firmware on the TV and with my 3080 MSI Trio X I get some stuttering at 4K 120, even when I'm at 113 fps, so my question is simple: what is the exact range of G-Sync support at 4K 120 full RGB? If I lock the fps at 100 in the NVIDIA control panel I don't get stuttering... Also, is it possible that the stutter is due to the HDMI 2.1 cable? Is there any good HDMI 2.1 certification?
I have this one:
https://www.amazon.fr/Belkin-Câble-...eywords=belkin+hdmi+2.1&qid=1602400404&sr=8-5
Does it have enough bandwidth?
 
What does he mean it has downgraded HDMI 2.1? Are you telling me the CX is worse than the C9?
I own both the CX 48" and the C9 65".

The difference in HDMI 2.1 bandwidth between the two results in exactly one difference: No 12-bit color on the CX series. The C9 is capable of 4K 120 Hz 4:4:4 12-bit HDR while CX is capable of 4K 120 Hz 4:4:4 10-bit HDR.

But both the CX and C9 series have a 10-bit panel. So they can't show 12-bit content in the first place. All the C9 can do with the extra bits is use them in internal processing, which in theory could allow for better image quality. The practical difference however from what I have seen is exactly zero. I can't tell any image quality difference between the two.

It's also worth mentioning that a lot of other HDMI 2.1 devices are limited to 40 Gbps bandwidth. The new Denon AV receivers are like this and so is the Xbox Series X. PS5 probably too. A Denon representative mentioned in an interview that this is to reduce costs. They could have gone and developed a 48 Gbps controller but felt that spending was better served elsewhere because the practical difference is not worth it.

I repeat, don't watch QuantumTV. He is extremely biased and will do anything to show OLEDs in bad light to get views. He has been doing this bullshit for a long time now.
 
FYI - Ruipro is fixing their cables!

 
LG decided to ship the 48” to Aus finally due to the high demand. First batch is only available through LG at a high premium (can’t blame them for taking advantage of the situation). So maybe in January I can join the ranks of the owners if the price normalises here.
 
In a game where you can maintain 120 FPS, BFI is fucking amazing and everyone should game with it on in that situation.
I think you need to maintain ~200-300 fps for it to be worth turning off G-Sync/V-Sync and not have BFI screw things up.
Half-Life 1 and CS 1.6 at 1080p never looked more beautiful thanks to BFI....... after sunset or in a basement

AFAIK the Series X and likely PS5 are also 40Gbps devices.
AFAIK the Series X and PS5 are also going to have HDMI 2.1 broken on arrival.


The new Denon AV receivers are like this and so is the Xbox Series X. PS5 probably too. A Denon representative mentioned in an interview that this is to reduce costs. They could have gone and developed a 48 Gbps controller but felt that spending was better served elsewhere because the practical difference is not worth it.
The Denon AV receivers do not have functional HDMI 2.1 yet.



Sorry to be such a debbie downer but I see a lot of people here conflating specs and theory with actual functionality.

And the real controversy here is: if all these manufacturers were in cahoots about only doing 40 Gbps on HDMI 2.1, why did we have to wait so long to be told by LG that we didn't have 48 Gbps?
 
It's probably been mentioned here before, but I will ask anyway. Do we know of any good HDMI 2.1 cables currently on the market?
 
What's all this nonsense about HDMI 2.1 cables? More marketing BS to sell new cables. It's LITERALLY the same cable, 19 pins.

The rate is increasing only from 6 Gbps to 12 Gbps for each of the three existing data wire triplets (two data, one shield).

The new standard re-tasks a wire triplet that had been dedicated to clock signals as a fourth 12 Gbps data conduit. The clock signals will now piggyback as metadata. This is all done in a new arrangement called Fixed Rate Link (FRL), which is used for uncompressed data streams.

Literally any high quality HDMI cable of the past (and likely one you already have) that has decent shielding, would work fine with HDMI 2.1 speeds.
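To put rough numbers on that lane change (headline figures only, as a sketch rather than anything quoted from the spec):

```python
# Quick lane math for the TMDS -> FRL change described above (headline figures,
# approximate).

# HDMI 2.0 TMDS: three data wire triplets at up to 6 Gbps each, plus a clock triplet.
tmds_lanes, tmds_gbps = 3, 6
print("HDMI 2.0 max:", tmds_lanes * tmds_gbps, "Gbps")     # 18 Gbps

# HDMI 2.1 FRL: the clock triplet is re-tasked as a fourth data lane, each lane
# running at up to 12 Gbps, with clocking recovered from the data itself.
frl_lanes, frl_gbps = 4, 12
print("HDMI 2.1 FRL max:", frl_lanes * frl_gbps, "Gbps")   # 48 Gbps
```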
 
I think you need to maintain ~200-300 fps for it to be worth turning off G-Sync/V-Sync and not have BFI screw things up.
Half-Life 1 and CS 1.6 at 1080p never looked more beautiful thanks to BFI....... after sunset or in a basement
Why on earth would you need 200-300FPS on a 120Hz panel? I don't think you know what G-Sync actually does... What is BFI "screwing up" at 120fps...?
 
What's all this nonsense about HDMI 2.1 cables? More marketing BS to sell new cables. It's LITERALLY the same cable, 19 pins.

The rate is increasing only from 6 Gbps to 12 Gbps for each of the three existing data wire triplets (two data, one shield).

The new standard re-tasks a wire triplet that had been dedicated to clock signals as a fourth 12 Gbps data conduit. The clock signals will now piggyback as metadata. This is all done in a new arrangement called Fixed Rate Link (FRL), which is used for uncompressed data streams.

Literally any high quality HDMI cable of the past (and likely one you already have) that has decent shielding, would work fine with HDMI 2.1 speeds.
"Has decent shielding" is one of the key factors here. As is the quality of the wires - depending on the length.

I have several low-cost junk HDMI cables here that won't even do 1080p60 at 5m+ and one cable that won't even do 4K60 at 2m.

But basically what you say: any decent quality cable should do, especially at 2m or below. If you need a new one anyway, I wouldn't skimp on the extra $3; get one that claims to do 48 Gbps and just return it if it turns out to be junk.
 
AFAIK the Series X and PS5 are also going to have HDMI 2.1 broken on arrival.

The Denon AV receivers do not have functional HDMI 2.1 yet.

What exactly do you mean by "broken" here? Surely the new Denon series will work just fine with HDMI 2.1? Or are you saying they need firmware updates to enable that functionality and that if you hook them up to a LG CX or C9 you won't get 4K 120 Hz etc?
 
LG decided to ship the 48” to Aus finally due to the high demand. First batch is only available through LG at a high premium (can’t blame them for taking advantage of the situation). So maybe in January I can join the ranks of the owners if the price normalises here.
I've been trying to buy one in Spain for nearly 3 months and nowhere has stock. I knew demand would be high but I never anticipated how low the supply would be
 
What exactly do you mean by "broken" here? Surely the new Denon series will work just fine with HDMI 2.1? Or are you saying they need firmware updates to enable that functionality and that if you hook them up to a LG CX or C9 you won't get 4K 120 Hz etc?
Just more 40Gbps fearmongering I think.
 
I think you need to maintain ~200-300 fps for it to be worth turning off G-Sync/V-Sync and not have BFI screw things up.
Half-Life 1 and CS 1.6 at 1080p never looked more beautiful thanks to BFI....... after sunset or in a basement


AFAIK the Series X and PS5 are also going to have HDMI 2.1 broken on arrival.



The Denon AV receivers do not have functional HDMI 2.1 yet.



Sorry to be such a debbie downer but I see a lot of people here conflating specs and theory with actual functionality.

And the real controversy here is: if all these manufacturers were in cahoots about only doing 40 Gbps on HDMI 2.1, why did we have to wait so long to be told by LG that we didn't have 48 Gbps?

What.

https://www.whathifi.com/us/reviews/denon-avr-x2700h

That doesn't have functional HDMI 2.1?
 
What.

https://www.whathifi.com/us/reviews/denon-avr-x2700h

That doesn't have functional HDMI 2.1?
It doesn't even have functional VRR pass-through yet.

Most of the receivers will not have HDMI 2.1 officially enabled for months and it will only be on one of the inputs.

If you even know somebody that has a CX, a 2020 Denon, and an Ampere card, let me know.

If the panels don’t do 12bit anyway I fail to see why this was ever an issue.
Let's say you paid $50 for a gold-plated Monster HDMI cable even though you knew there wasn't a chance it would make any difference over the $5 one. Let's then say you found out that the $50 cable doesn't have gold plating and is actually an exact duplicate of the $5 cable. You are saying that because there is no real-world functional difference you wouldn't be upset in the slightest?

We still have a ways to go until we elucidate all of the consequences of not getting 48 Gbps, C9-quality HDMI 2.1. Obviously in theory, even accounting for improved rounding errors from a 12-bit signal, there should not be a real-world difference on a 10-bit display. But I'm just saying that right now, our budget HDMI 2.1 ports are beginning to get that shit smell, and I hope I'm wrong and some updates magically fix everything.

By broken I mean I highly doubt the consoles will be fully HDMI 2.1 ready at launch. We shall see.
 
Serious question. Why would anyone try to passthru a VRR signal on a HDMI 2.1 receiver when eARC is a thing (and there aren't any HDMI 2.1 receivers that don't also support eARC)? Even if everything works perfectly you're adding latency to the signal you don't need to.
 
Serious question. Why would anyone try to passthru a VRR signal on a HDMI 2.1 receiver when eARC is a thing (and there aren't any HDMI 2.1 receivers that don't also support eARC)? Even if everything works perfectly you're adding latency to the signal you don't need to.
I hate to answer a question with a question, but what's the point of getting an HDMI 2.1 receiver if you are just going to use eARC? We got into this because I said HDMI 2.1 isn't functional yet on Denon; I didn't say I recommend using it. To more directly answer your question, there are issues with eARC on the CX, like DTS support, etc.
 
I hate to answer a question with a question, but what's the point of getting an HDMI 2.1 receiver if you are just going to use eARC? We got into this because I said HDMI 2.1 isn't functional yet on Denon; I didn't say I recommend using it. To more directly answer your question, there are issues with eARC on the CX, like DTS support, etc.

Because eARC and HDMI 2.1 were introduced the same year on most receivers...so if you get an eARC receiver you have an HDMI 2.1 receiver and vice versa. I highly doubt anyone that has one of the few eARC receivers that doesn't support HDMI 2.1 is going out and buying an HDMI 2.1 receiver, they shouldn't. And I thought the CX "DTS issue" purely related to the built in apps and not external sources, so not relevant to eARC vs. passthru.
 
Because eARC and HDMI 2.1 were introduced the same year on most receivers...so if you get an eARC receiver you have an HDMI 2.1 receiver and vice versa. I highly doubt anyone that has one of the few eARC receivers that doesn't support HDMI 2.1 is going out and buying an HDMI 2.1 receiver, they shouldn't. And I thought the CX "DTS issue" purely related to the built in apps and not external sources, so not relevant to eARC vs. passthru.
Everything you just said is wrong in the universe that I exist in (except for that people shouldn't go buy a 2.1 if they have eARC).
 
Because eARC and HDMI 2.1 were introduced the same year on most receivers...so if you get an eARC receiver you have an HDMI 2.1 receiver and vice versa. I highly doubt anyone that has one of the few eARC receivers that doesn't support HDMI 2.1 is going out and buying an HDMI 2.1 receiver, they shouldn't. And I thought the CX "DTS issue" purely related to the built in apps and not external sources, so not relevant to eARC vs. passthru.
The EDID on the CX exposes only Stereo and not 7.1. I'm not sure if modding the EDID will let it successfully passthrough 7.1 via eARC. It won't passthrough DTS via eARC even with an EDID mod.

A HDMI 2.1 AVR with VRR passthrough is the only option. If G-SYNC / FreeSync falls back to standard VRR through the AVR, we may be forced back to using extended desktop with a second HDMI cable for audio, in which case HDMI 2.0 is sufficient.
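For anyone who wants to verify what their TV's EDID actually advertises for audio, here's a rough sketch of pulling the Short Audio Descriptors out of a CTA-861 extension block. This is a generic illustration based on the public CTA-861 data block layout, not a tool anyone here posted; the file path in the usage note is just an example of where Linux typically exposes the raw EDID.

```python
# Rough sketch: list the Short Audio Descriptors (SADs) a display advertises in
# its CTA-861 EDID extension block. Generic illustration of the public layout,
# not a vendor tool; error handling omitted.

AUDIO_FORMATS = {1: "LPCM", 2: "AC-3", 7: "DTS", 10: "E-AC-3", 11: "DTS-HD"}

def list_audio_descriptors(edid: bytes) -> None:
    # Extension blocks follow the 128-byte base block; CTA-861 blocks have tag 0x02.
    for base in range(128, len(edid), 128):
        block = edid[base:base + 128]
        if len(block) < 128 or block[0] != 0x02:
            continue
        dtd_offset = block[2]          # where detailed timing descriptors begin
        i = 4                          # data block collection starts at byte 4
        while i < dtd_offset:
            tag, length = block[i] >> 5, block[i] & 0x1F
            if tag == 1:               # Audio Data Block: 3-byte SADs
                for j in range(i + 1, i + 1 + length, 3):
                    fmt = (block[j] >> 3) & 0x0F
                    channels = (block[j] & 0x07) + 1
                    print(AUDIO_FORMATS.get(fmt, f"format {fmt}"), f"{channels} ch")
            i += 1 + length

# Example usage on Linux (path varies by connector/GPU):
#   with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
#       list_audio_descriptors(f.read())
```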
 
The EDID on the CX exposes only Stereo and not 7.1. I'm not sure if modding the EDID will let it successfully passthrough 7.1 via eARC. It won't passthrough DTS via eARC even with an EDID mod.

A HDMI 2.1 AVR with VRR passthrough is the only option. If G-SYNC / FreeSync falls back to standard VRR through the AVR, we may be forced back to using extended desktop with a second HDMI cable for audio, in which case HDMI 2.0 is sufficient.
Interesting, I haven't tested (no eARC receiver) but some people on AVS say otherwise. Are there any reasonably common DTS external sources that can't just decode to LPCM on their own?
 
Not sure why people want eARC so bad. It’s still crap lol. Still no proper bitstream for all audio formats. Fine for stereo or something but not if you want the best sound from games or blu-ray. I can still hear a difference on just Netflix let alone from my Plex server with full bit rate blu-ray and 4k rips.
 
Not sure why people want eARC so bad. It’s still crap lol. Still no proper bitstream for all audio formats. Fine for stereo or something but not if you want the best sound from games or blu-ray. I can still hear a difference on just Netflix let alone from my Plex server with full bit rate blu-ray and 4k rips.
Because I (and I assume others) want my audio receiver to stay the hell away from video, lol.
 
If the panels don’t do 12bit anyway I fail to see why this was ever an issue. 🤔
The *potential* issue was that older NVIDIA drivers only supported 8-bit and 12-bit over HDMI, so there was a possibility 40Gbps ports would have been limited to 4k120 8-bit output. A bunch of us raised a stink on the NVIDIA forums, and they eventually added 10-bit over HDMI making things a non-issue.
 
OK, just wall-mounted my new 48CX... This thing is perfect for a seating distance of 4' (which is a bit shorter than the recommended distance of 4.8' by THX).

Happy to report that my unit passed all tests with flying colors: no dead pixels and no noticeable uniformity issue (banding, vignetting, blotches, etc.).
This is my 3rd OLED and it is the cleanest. My 2017 55" had banding and my 2019 65" had some vignetting... nothing of the sort here. This is in line with the fact that larger OLEDs (65", 77") are more prone to uniformity issues than their smaller siblings.

Now, just need to find a 3080 to unlock the full potential of this beast.
 
The EDID on the CX exposes only Stereo and not 7.1. I'm not sure if modding the EDID will let it successfully passthrough 7.1 via eARC. It won't passthrough DTS via eARC even with an EDID mod.

A HDMI 2.1 AVR with VRR passthrough is the only option. If G-SYNC / FreeSync falls back to standard VRR through the AVR, we may be forced back to using extended desktop with a second HDMI cable for audio, in which case HDMI 2.0 is sufficient.
This is not correct. I have 5.1 Dolby Surround to my Denon right now through eARC.

EDIT: I made a few posts about this earlier in the thread with a picture here.
 
This is not correct. I have 5.1 Dolby Surround to my Denon right now through eARC.

EDIT: I made a few posts about this earlier in the thread with a picture here.

And you've only accomplished this while using Tidal and still have to reset it every time you boot? Also what specific receiver are you using?
 