LG 48CX

Have they actually said anything about supporting HDMI 2.1 in upcoming video cards?


No, but given the unprecedented exception of adding G-Sync Compatible support over HDMI for these displays, it's strongly implied that Ampere will bring this update!

Much like Nvidia introduced their 4:2:0 chroma hack to get 4K 60 Hz on Kepler cards around six months before Maxwell 2.0's release, they're setting the stage here with VRR support! There's never a guarantee on when you can expect an interface update, but you can take signs of interest from Nvidia as a primer for the real deal.

AMD has completely lost interest in pushing interface technology for TVs, so you'll have to wait for Nvidia.
 
Considering these OLED TVs use 10bit panels and the precision of 10bit is already pretty darned high, I question the benefit of 12bit...
Most stills cameras capture 14 bits per channel, so there's likely some utility there.

One big reason I can think of is banding between very similar color values.
 
Well, finally got the 55" out of the box. Had a tinder date of all people help me :) She was really cool about it. Just extra hands to make sure you do not fuck up pulling things out.

I tried to take some pics with my Google Pixel 2 XL and they look fine, but I'm gonna dig out a better camera, a Panasonic Lumix GX85, and try and get some really nice photos.

This thing is amazing. The last 2 sets I've had for PC gaming / desktop were the 49" KS8000 and the 55" NU8000. Both very nice sets.

The colors and contrast on this set are next level.

I also decided to build a new PC so haven't been able to game yet. Maybe tonight.

I hate to admit this after all the shit I've talked, but I'm thinking the 48" would be better.
 
This thing is amazing. The last 2 sets I've had for PC gaming / desktop were the 49" KS8000 and the 55" NU8000. Both very nice sets.

The colors and contrast on this set are next level.

Yes sir. You've finally seen the light, and while you're correct that the NU8000 is a great TV by LCD standards, it just can't compare in terms of IQ - I just didn't want to shit on your toy. :)

I hate to admit this after all the shit I've talked, but I'm thinking the 48" would be better.

Well...you're within the return period so you could go that route, or buy the 48" when it comes out and replace a garbage LCD TV elsewhere in the house with the 55" OLED! I like that plan. I'd much rather watch movies from my couch than my desk and computer chair, so when the 48CX comes out it'll replace my 55B7 as my monitor and the B7 will replace an old 50" Philips that we have in the basement. It's darker down there with no windows behind the couch, so really a perfect environment for an OLED.
 
Most stills cameras capture 14 bits per channel, so there's likely some utility there.

One big reason I can think of is banding between very similar color values.
The benefit of higher bit depths is different for compressed video data than it is for uncompressed video transmission.

Banding is a common artifact of not having a high enough bitrate, and increasing the bit depth improves compression efficiency, allowing you to use a lower bitrate for greater quality - this is why anime fansubs have been using 10bit for over a decade now despite encoding from 8bit sources, and why AV1's greater-than-8bit mode is actually a 16bit pipeline internally.
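
As a quick toy illustration of the quantization side of this (my own sketch, nothing to do with any particular encoder) - quantize a smooth ramp at a given bit depth and count how many distinct steps survive; fewer steps across the same range is exactly what shows up on screen as banding:

Code:
# Quantize an ideal smooth gradient at different bit depths and count the
# distinct levels that remain - fewer levels over the same range = banding.
import numpy as np

gradient = np.linspace(0.0, 1.0, 100_000)  # an ideal, continuous ramp

for bits in (8, 10, 12):
    levels = 2 ** bits
    quantized = np.round(gradient * (levels - 1)) / (levels - 1)
    steps = np.unique(quantized).size
    max_err = np.abs(quantized - gradient).max()
    print(f"{bits:2d}-bit: {steps:5d} distinct levels, worst-case error {max_err:.6f}")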

So I still question the benefit of inputting 12bit uncompressed HDMI signals to a 10bit panel.
 
The benefit of higher bit depths is different for compressed video data than it is for uncompressed video transmission.

Banding is a common artifact of not having a high enough bitrate, and increasing the bit depth improves compression efficiency, allowing you to use a lower bitrate for greater quality - this is why anime fansubs have been using 10bit for over a decade now despite encoding from 8bit sources, and why AV1's greater-than-8bit mode is actually a 16bit pipeline internally.

So I still question the benefit of inputting 12bit uncompressed HDMI signals to a 10bit panel.
Isn't the extra 2 bits (12bit) used for HDR addressing?
 
Isn't the extra 2 bits (12bit) used for HDR addressing?

I cannot say how it works for the likes of Dolby Vision, but my impression is that HGIG handles its equivalent of adaptive metadata on the GPU side, before a video signal is even output. That's why games are perfectly fine outputting plain old non-adaptive HDR10: the video signal being output is generated to fit your display's capabilities, rather than the display being fed a single video signal with metadata and having to adjust the result to fit its capabilities itself.
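
Purely to illustrate the idea (my own rough sketch - not the actual HGIG algorithm or any game's real tone mapper): the game knows the display's peak brightness, rolls scene luminance off toward that peak, and then encodes the result with the ordinary HDR10 PQ curve, so the display just receives plain HDR10 it can show as-is.

Code:
# Toy GPU-side tone mapping sketch (illustrative only, not the HGIG spec):
# roll scene luminance toward a known display peak, then encode with the
# standard HDR10 PQ (SMPTE ST 2084) curve.
def pq_encode(nits: float) -> float:
    """Map absolute luminance (cd/m^2) to a 0..1 PQ signal value."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

def tone_map(scene_nits: float, display_peak: float = 800.0) -> float:
    """Leave everything below ~75% of the (assumed) peak alone, then ease
    anything brighter asymptotically toward the peak instead of clipping."""
    knee = 0.75 * display_peak
    if scene_nits <= knee:
        return scene_nits
    excess = scene_nits - knee
    headroom = display_peak - knee
    return knee + headroom * excess / (excess + headroom)

for nits in (100, 600, 1000, 4000, 10000):
    mapped = tone_map(nits)
    print(f"{nits:5d} nits in scene -> {mapped:6.1f} nits on panel, PQ code {pq_encode(mapped):.3f}")

The point being that anything brighter than the display can actually show gets eased toward its peak on the GPU side instead of being clipped (or guessed at) by the TV.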
 
Apparently the CX models aren't supporting FULL hdmi 2.1 bandwidth (see https://www.forbes.com/sites/johnar...d-tvs-dont-support-full-hdmi-21/#4cbbae466276). Is this something to be concerned about for any current or near future use-cases?

I mentioned this earlier in the thread in the quotes from AVS forum threads. It's important for the ability to send full uncompressed HDMI audio formats over the HDMI cable (for eARC back to a receiver/sound system) along with the full video formats (4k 120hz VRR, 4:4:4 chroma, 10 bit HDR), all on the same single cable.

  • Full 48Gbps support - as of April 2020 the "12 Gbps on 4 lanes" from the Max Fixed Rate Link EDID is missing (the C9 has it) - if we believe the EDID values, the 2020 models support only 40Gbps links, not full 48Gbps;


Have you tried any PCM 7.1, Atmos or other uncompressed audio movies? What about uncompressed from an Xbox or PS4?

Personally I don't care if the built-in smart TV apps, or connecting a USB HDD with a movie directly to the TV, don't support uncompressed audio formats - but I do care whether I can pass uncompressed HDMI audio from my PC through the TV and out of the eARC HDMI port to an eARC-capable receiver correctly.

---------------------------------------------------------------------

https://www.avsforum.com/forum/40-o...s-general/3072900-lg-c9-earc-info-thread.html

However, early adopters of C9 have discovered the following issues with LGs implementation of the eARC feature;
  • The LG implementation ignores the media handles for PCM 5.1 and PCM 7.1 audio, which means it is not possible to pass uncompressed HD audio from devices like game titles on consoles like Xbox/PS4 that send HD audio uncompressed. There is no technical reason this shouldn't work (and does work on competitor televisions from Sony) this is just an omission on LGs part in supporting the formats. This issue was first reported in rtings.com review of LG C9.
  • Owners of 2017 Denon products have reported that their AVRs are not recognized by LG C9 as being eARC capable devices. It is reported that 2017 Denons also have this issue with other brand televisions so possibly this issue can only be fixed by Denon or that Denon and display makers will have to collaborate on a fix.
  • It has been confirmed that LG C9 operates properly with eARC delivery when HDMI CEC is turned off on the source (TV) and destination (AVR). This is accomplished by removing HDMI configuration for target AVR in the LG C9 Connections Manager (reset configuration) and disabling ARC and TV control in the Denon/Marantz unit.... then enabling ARC and eARC w/passthrough in the C9 HDMI audio settings. It is unknown if this is functional across all AVR brands but strongly indicates that LG has properly implemented the feature so that it can be turned on independent of use of HDMI control (HDMI CEC).

---------------------------------------------------------------------

https://www.avsforum.com/forum/40-o...020-lg-cx-gx-owners-thread-no-price-talk.html

Features that are delivered:


  • 5.1/7.1 LPCM passthrough via eARC - this works only via a CRU EDID hack - so not a user-friendly feature


Expected/Announced features to be delivered via firmware updates:

  • 5.1/7.1 LPCM passthrough via eARC - the CX HDMI EDID mentions only support for 2.0 LPCM - once CRU is used to add PCM 5.1 to the TV's EDID, you may be able to get surround sound (via eARC) - so it seems that both C9 and CX will get the firmware update in Q2-2020 - as previously announced
  • AMD FreeSync compatibility - will be available via a future firmware upgrade at the end of 2020;
  • HDCP 2.3 - the CX is still at HDCP 2.2 level - HDCP 2.3 will be available via a future firmware update, probably at the same time with the C9;
  • Dolby Vision glows at brightness 50 - LG is aware and is working for a fix;
  • DTS/DTS-HD audio is not passed via eARC - LG was notified about this issue, hopefully, they will provide a fix via a future firmware update;
  • Full 48Gbps support - as of April 2020 the "12 Gbps on 4 lanes" from the Max Fixed Rate Link EDID is missing (the C9 has it) - if we believe the EDID values, the 2020 models support only 40Gbps links, not full 48Gbps;


Features that will never be delivered via firmware updates:
  • WebOS apps support for lossless audio - there will be no WebOS app support for lossless HD audio soundtracks such as TrueHD, TrueHD+Atmos, DTS-HD HR, DTS-HD MA or DTS:X - this is the same situation as for the 2019 models; The Alpha9 SoC has only an ARC capable audio output, so it cannot send lossless audio back to the receiver/soundbar, even if they are eARC capable, firmware updates cannot modify hardware limitations;
  • DTS and DTS-HD support for USB and HDMI sources - the internal decoder is missing from the factory (as announced in the documentation) and probably it will never be added to the 2020 generation - what is puzzling is that DTS and DTS-HD is not even permitted to passthrough via ARC/eARC;

PCM 192 kHz is good... however your 1080p Blu-ray isn't pushing the same overall bandwidth as 4k + uncompressed audio. It sounds like they need to fix the missing "12 Gbps on 4 lanes" entry in the EDID for a full 48Gbps, which could make a difference in having enough bandwidth for 4k content plus uncompressed sound being input to the TV (with the sound being passed through the eARC output unaltered, if they get it working properly). The fact that you can use a CRU EDID hack to get 7.1 PCM working shows that it should be possible by default with a later firmware update.

Hopefully they will also fix the DTS-HD pass-through in time but it sounds like they haven't for the C9 yet either.



HDMI VRR and Nvidia G-sync aren't the same as Freesync. I'm taking this to mean that LG displays work with "G-sync compatible" HDMI VRR but not with AMD's Freesync implementation just yet.

**NVIDIA G-SYNC compatible with RTX 20 and GTX 16 graphics card series.
***G-SYNC included only in ZX, GX, CX and BX models. FreeSync included in ZX model; software update required for GX, CX and BX models. FreeSync may not be available at the time of purchase of this product.
 
https://www.avsforum.com/forum/90-receivers-amps-processors/3061882-where-all-hdmi-2-1-avrs-15.html

Yesterday, 02:27 PM avernar

Quote:
Originally Posted by Billy Williams

Just read an article on Forbes about LG not really supporting 2.1 this year. That makes me even less likely to upgrade anything. I am looking forward to the new consoles but I think my current setup will be sufficient. Seems 2.1 has a ways to go before it's really standard.
EndQuote


HDMI 2.1 is a standard. The issue is that people incorrectly think it's an all or nothing standard. As dfa973 mentioned it's a grab bag of features and those features also may have options. Here's the speed options for the FRL (Fixed Rate Link) feature:

3 Gbps per lane on 3 lanes (9Gbps)
3 and 6 Gbps per lane on 3 lanes (9Gbps, 18Gbps)
3 and 6 Gbps per lane on 3 lanes (9Gbps, 18Gbps), 6 Gbps on 4 lanes (24Gbps)
3 and 6 Gbps per lane on 3 lanes (9Gbps, 18Gbps), 6 and 8 Gbps on 4 lanes (24Gbps, 32Gbps)
3 and 6 Gbps per lane on 3 lanes (9Gbps, 18Gbps), 6, 8 and 10 Gbps on 4 lanes (24Gbps, 32Gbps, 40Gbps)
3 and 6 Gbps per lane on 3 lanes (9Gbps, 18Gbps), 6, 8, 10 and 12 Gbps on 4 lanes (24Gbps, 32Gbps, 40Gbps, 48Gbps)

As you can see they range from 9Gbps to 48Gbps. All of them are standard.

That is why I keep repeating myself that "HDMI 2.1" by itself is meaningless. You need to see what features are supported and how they are supported.

-----------------------------------------

Today, 01:51 AM onesolo

2.1 is a failure from the start!! The standard took too long to finalize from HDMI, too long for OEMs to implement, nothing is mandatory... HDMI org sucks!! The only thing going for them is that there is no rivalry.
 
The bandwidth limits on the new CX suck, but it's close enough. They mention wanting to reserve resources on the new CPU to handle AI... can someone confirm whether there is a physical limit to the data rate based on the HDMI 2.1 hardware installed, or merely a limit on processing capacity?

I ask because my basic assumption is that people hooking this up to their PC in game mode are not going to be using the chip's AI functions at that time (is this wrong? I wish someone would do a full goddamn review of the new capabilities of the TV to let us all know). So if it's merely CPU/GPU resources on the chip being conserved, then maybe full bandwidth could be enabled in certain modes where that excess processing isn't taking place in the first place. If it's gimped hardware, then it is what it is and we just have to accept the limits.

No 12bit @ 120Hz without some compromise, although that is kind of a rarefied feature set. Not going to stop me from getting the 48-inch CX; 10bit 120Hz is good enough for me.
 
I ask because my basic assumption is that people hooking this up to their PC in game mode are not going to be using the chip's AI functions at that time (is this wrong? I wish someone would do a full goddamn review of the new capabilities of the TV to let us all know).
Hopefully this is the case; the processing that TVs do, aside from the presence of an actual tuner, is really what makes a display a TV.

"Monitor mode" should be simply "display whatever you're fed without molestation".
 
There is an AI tone mapping feature for HDR content that brings up details otherwise lost, since OLED HDR can't match the absolute values of content mastered at 1000 nits. I'm not sure if it's as necessary for, or even works with, SDR content, but from what I remember seeing in an HDR test YouTube review and another review online it makes a big difference. Not sure if it works with game mode or if it adds input lag either though, sorry. Just thought I'd mention it.

I'm also concerned about uncompressed HDMI audio fitting in with 10bit 4:4:4 120hz HDR, as those take a lot of bandwidth.
 
There are other ways to get audio to a system than via HDMI. 2.5mm jacks, optical, XLR via DAC etc etc...
Why is the HDMI audio so important? Is it so you just run one cable? Sorry for the ignorance I've always used good old analogue audio to the amps via a long cable.
 
There are other ways to get audio to a system than via HDMI. 2.5mm jacks, optical, XLR via DAC etc etc...
Why is the HDMI audio so important? Is it so you just run one cable? Sorry for the ignorance I've always used good old analogue audio to the amps via a long cable.

Could be for people who want to hook up to a dolby atmos receiver maybe. Personally I've always used USB from my PC to my external Objective O2/ODAC then out to my headphones so I've never passed audio through HDMI.
 
There are other ways to get audio to a system than via HDMI. 2.5mm jacks, optical, XLR via DAC etc etc...
Why is the HDMI audio so important? Is it so you just run one cable? Sorry for the ignorance I've always used good old analogue audio to the amps via a long cable.
On the desktop, not a big deal; for home theater setups, more important when you're using a receiver and switching between inputs where necessary.

In a typical home theater topology, all audio / video streams go to the receiver via HDMI, then only one cable goes to the TV.
 
I'm also concerned about uncompressed HDMI audio fitting in with 10bit 4:4:4 120hz HDR, as those take a lot of bandwidth.
In a typical home theater topology, all audio / video streams go to the receiver via HDMI, then only one cable goes to the TV.

This is a complete non-issue bandwidth-wise. Uncompressed audio consumes an absolutely minuscule amount of bandwidth compared to video - go open up an LPCM WAV file in foobar2000 and note the bitrate that it displays in the status bar.

Even 6-channel LPCM at 32-bit float 192KHz is only ~37Mbps, the same data rate as 8-channel LPCM at 24bit 192KHz or 16-channel LPCM at 24bit 96KHz - all bitrates that could fit over an internet connection (though not my 10Mbps connection), let alone an HDMI cable.

By comparison, the more typical 8-channel LPCM at 24bit 96KHz is only ~18Mbps (barely beyond USB 1.1 levels of bandwidth), and 6-channel LPCM at 24bit 48KHz is merely ~7Mbps.
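
If anyone wants to double check those numbers, uncompressed LPCM is just channels x bit depth x sample rate:

Code:
# LPCM bitrate = channels x bits per sample x sample rate
def lpcm_mbps(channels: int, bits: int, sample_rate_hz: int) -> float:
    return channels * bits * sample_rate_hz / 1_000_000

for ch, bits, rate in [(6, 32, 192_000), (8, 24, 192_000), (8, 24, 96_000), (6, 24, 48_000)]:
    print(f"{ch}ch {bits}-bit {rate // 1000}kHz LPCM = {lpcm_mbps(ch, bits, rate):.1f} Mbps")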
 
This is a complete non-issue bandwidth-wise. Uncompressed audio consumes an absolutely minuscule amount of bandwidth compared to video - go open up an LPCM WAV file in foobar2000 and note the bitrate that it displays in the status bar.

Even 6-channel LPCM at 32-bit float 192KHz is only ~37Mbps, the same data rate as 8-channel LPCM at 24bit 192KHz or 16-channel LPCM at 24bit 96KHz - all bitrates that could fit over an internet connection (though not my 10Mbps connection), let alone an HDMI cable.

By comparison, the more typical 8-channel LPCM at 24bit 96KHz is only ~18Mbps (barely beyond USB 1.1 levels of bandwidth), and 6-channel LPCM at 24bit 48KHz is merely ~7Mbps.

I was wondering if the top HDMI speed step being left off was related to LG not supporting uncompressed 7.1 PCM as well as not supporting DTS-HD. At first it would seem like it's the EDID and not a lack of bandwidth, since you can hack the EDID with CRU... but then I haven't heard of anyone pushing the limits yet since we have no HDMI 2.1 output sources yet.

  • Full 48Gbps support - as of April 2020 the "12 Gbps on 4 lanes" from the Max Fixed Rate Link EDID is missing (the C9 has it) - if we believe the EDID values, the 2020 models support only 40Gbps links, not full 48Gbps;

For those wondering about hdmi sound:

https://www.whathifi.com/us/advice/...dmi-which-is-the-best-audio-connection-to-use

Like coaxial, one of the issues with optical is that it doesn’t have enough bandwidth for the lossless audio formats such as Dolby TrueHD or DTS-HD Master Audio soundtracks found on most Blu-rays. An optical connection also can’t support more than two channels of uncompressed PCM audio

To be clear, I want LG to support what eARC and HDMI 2.1 are supposed to be able to do: passing through the uncompressed audio formats via eARC --> OUT while still being able to do full 4k 120hz 4:4:4 VRR, HDR (10bit) + those audio formats on its HDMI 2.1 <-- IN.

[image: HDMI ARC/eARC guide diagram]
 
To be clear, I want LG to support what eARC and HDMI 2.1 are supposed to be able to do: passing through the uncompressed audio formats via eARC --> OUT while still being able to do full 4k 120hz 4:4:4 VRR, HDR (10bit) + those audio formats on its HDMI 2.1 <-- IN.
I'm assuming that the topology is something like: PC GPU HDMI outputs an audio / video stream to the TV on one TV HDMI input, then the TV passes HDMI sound to a receiver, regardless of format, without messing with it.
 
I was wondering if the top HDMI speed step being left off was related to LG not supporting uncompressed 7.1 PCM as well as not supporting DTS-HD.

As a practical matter, the TV needs to be able to decode anything it can pass through. Otherwise you'd get no sound in some cases, which isn't acceptable in a consumer product. So besides the technical issues, there's also license fees to consider.
 
Yes, for example: 3080 Ti HDMI 2.1 OUT ---> 4k 444 120hz VRR HDR + uncompressed audio ---> LG CX OLED HDMI 2.1 IN ... LG CX eARC HDMI 2.1 OUT ---> compatible eARC receiver or other eARC sound device (a receiver does not have to be HDMI 2.1 in order to have an eARC IN port).


Apparently I wouldn't even be able to use eARC off of the LG CX until I got a receiver with eARC rather than the ARC-only one I have, though HDMI 2.1 eARC is backwards compatible with HDMI ARC receivers.
If I wanted to upgrade to a receiver with 37Mbps audio via eARC, there is a Denon on sale on Amazon for ~$400 currently that can do 7.1 uncompressed formats over eARC, for example:

[image: Amazon listing for the Denon eARC-capable receiver]


I was hoping for this eARC capability eventually since LG is supposed to have HDMI 2.1, but if I had to I could continue using compressed audio over optical or ARC (not eARC) for the life of the CX if necessary. I'd consider using one of the other TVs in my array for the audio pass-through, but they aren't HDMI 2.1 so they're limited to ARC instead of eARC. Also, just so you know, HDCP video sources could screw things up when using a different display for audio, even if I had an HDMI 2.1 TV alongside the LG CX.

There are so many formats/standards, and according to that AVS forum quote a few replies back there seem to be at least 6 HDMI 2.1 bandwidth tiers out there (one full 48Gbps and 5 watered-down ones).
https://www.avsforum.com/forum/90-receivers-amps-processors/3061882-where-all-hdmi-2-1-avrs-15.html

Yesterday, 02:27 PM avernar

Quote:
Originally Posted by Billy Williams

Just read an article on Forbes about LG not really supporting 2.1 this year. That makes me even less likely to upgrade anything. I am looking forward to the new consoles but I think my current setup will be sufficient. Seems 2.1 has a ways to go before it's really standard.
EndQuote


HDMI 2.1 is a standard. The issue is that people incorrectly think it's an all or nothing standard. As dfa973 mentioned it's a grab bag of features and those features also may have options. Here's the speed options for the FRL (Fixed Rate Link) feature:

3 Gbps per lane on 3 lanes (9Gbps)
3 and 6 Gbps per lane on 3 lanes (9Gbps, 18Gbps)
3 and 6 Gbps per lane on 3 lanes (9Gbps, 18Gbps), 6 Gbps on 4 lanes (24Gbps)
3 and 6 Gbps per lane on 3 lanes (9Gbps, 18Gbps), 6 and 8 Gbps on 4 lanes (24Gbps, 32Gbps)
3 and 6 Gbps per lane on 3 lanes (9Gbps, 18Gbps), 6, 8 and 10 Gbps on 4 lanes (24Gbps, 32Gbps, 40Gbps)
3 and 6 Gbps per lane on 3 lanes (9Gbps, 18Gbps), 6, 8, 10 and 12 Gbps on 4 lanes (24Gbps, 32Gbps, 40Gbps, 48Gbps)

As you can see they range from 9Gbps to 48Gbps. All of them are standard.

That is why I keep repeating myself that "HDMI 2.1" by itself is meaningless. You need to see what features are supported and how they are supported.

-----------------------------------------

Today, 01:51 AM onesolo

2.1 is a failure from the start!! The standard took too long to finalize from HDMI, too long for OEMs to implement, nothing is mandatory... HDMI org sucks!! The only thing going for them is that there is no rivalry.

According to RTINGS, the LG C9 does have eARC passthrough support working from a PC. Perhaps there is still hope for full uncompressed pass-through on the LG CX (though the C9 does have 48Gbps HDMI 2.1). I'm looking forward to RTINGS doing a review of the CX.
[image: RTINGS LG C9 eARC passthrough test results]

The RTINGS 5.1 surround audio passthrough section includes testing for eARC (and Dolby ATMOS DTS-HD eARC pass-through which I believe goes up to 7.2 - 9.2 uncompressed surround) on a long list of reviewed TVs now:
"Update 02/28/2019: Added tests for 'eARC Support', 'Dolby Atmos via TrueHD via eARC' and 'DTS:X via DTS-HD MA via eARC', as part of Test Bench 1.3. "
 
Yes, for example: 3080 Ti HDMI 2.1 OUT ---> 4k 444 120hz VRR HDR + uncompressed audio ---> LG CX OLED HDMI 2.1 IN ... LG CX eARC HDMI 2.1 OUT ---> compatible eARC receiver or other eARC sound device (a receiver does not have to be HDMI 2.1 in order to have an eARC IN port).
Please tell me more about 3080Ti.
I have somehow missed its launch and features.
;)
 
I'm planning ahead for both a GPU and an eARC-capable receiver to get the most out of the LG CX's capabilities - but yes, I'm assuming a 3080 Ti will have HDMI 2.1, that I will own one, and that I'll be getting 120hz 4k 444 VRR, so that is the goal scenario pipeline I replied with. :cool:

You could have assumed that the RTINGS eARC test of the LG C9 OLED's pass-through capability (which I posted the image of and the link to) was using a current GPU though. :rolleyes:


https://www.rtings.com/tv/tests/inputs/5-1-surround-audio-passthrough

Dolby Atmos via Dolby TrueHD passthrough via HDMI eARC

What it is: Whether the TV can receive and pass a Dolby Atmos signal to a receiver via HDMI eARC, when Dolby TrueHD is used as the carrier signal.
When it matters: Blu-rays and video games with Dolby Atmos audio.

Dolby Atmos via Dolby TrueHD passthrough via HDMI eARC capability means a TV can accept a Dolby Atmos signal from a source device, when Dolby TrueHD is used as the carrier signal, and then pass that along to the receiver over an HDMI eARC (Enhanced ARC) connection. This is only really an important test if you want to do this exact connection and transfer in your setup.

To test for Dolby Atmos via TrueHD passthrough via eARC, we connect a PC to the TV via HDMI and play a Dolby Atmos via TrueHD test file (mkv video) using the MPC-HC media player software, set to bitstream audio. We then output the TV’s audio to our 5.1.2 receiver via HDMI eARC and see whether the receiver receives a Dolby Atmos via TrueHD signal, or if it is downgraded to normal Dolby Digital.

DTS:X via DTS-HD MA passthrough via HDMI eARC

What it is: Whether the TV can receive and pass a DTS:X signal to a receiver via HDMI eARC, when DTS-HD MA is used as the carrier signal.
When it matters: Blu-rays and video games with DTS:X audio.

DTS:X via DTS-HD MA passthrough via HDMI eARC capability means a TV can accept a DTS:X signal from a source device, when DTS-HD MA is used as the carrier signal, and then pass that along to the receiver over an HDMI eARC (Enhanced ARC) connection. This is only really an important test if you want to do this exact connection and transfer in your setup.

To test for DTS:X via MA passthrough via eARC, we connect a PC to the TV via HDMI and play a DTS:X via MA test file (mkv video) using the MPC-HC media player software, set to bitstream audio. We then output the TV’s audio to our 5.1.2 receiver via HDMI eARC and see whether the receiver receives a DTS:X via DTS-HD MA signal, or if it is downgraded to normal DTS.
 
Hmm, this TV looks very interesting for gaming, but I'm not sure if I'm $1750 interested (48" price in Sweden).
Maybe next year.
 

I'm not sure exactly how to put this into words but there's basically a paradigm shift that can, and I believe should take place if you move to such a large screen. For the most part, I've learned to only use somewhere around that 3840x1600 worth of pixels in the middle of the screen (and maybe more like 3400x1600). It's just something I started doing because it works best. The same type of thing had to occur when moving from a single monitor to multi-monitor, and then from multi-monitor to ultrawide.

Much of peoples' "It's way too big" belief is coming from thinking that they are going to format their windows on the giant screen the same way as they do on their small screen. It's understandable. It's just something you have to try out before you realize that it can work.

This is a really good point. For me, I think it comes down to software support... or lack thereof to allow the new paradigm to make sense.

I'd love OS and display support to enable the following scenario:
- a 36" diagonal 16:9 working space centered on a 48CX-like display for day-to-day where I want to sit close and not swivel my head constantly
- this working space is the default for 2D windowed GUIs, games etc., and mouse cursor movement is bounded to this "window"
- the periphery is low information density semi-static display space... think widgets (a la the old Vista sidebars) or other pinned content... along with the appropriate HMI to interact with that space... or a static background that blends in with the wall behind it... or to provide the illusory effects enabled by display-in-display, e.g. gentle rotation or panning of the central window in, for example, a simulator
- it's easy to switch from this "inset window" mode to a full-screen environment (and back) when for example watching a full screen video where I don't mind taking advantage of the full width, am more likely to sit back or watch at a distance, and don't want widgets cluttering the field of view

If I had this level of software support, I'd be more willing to incorporate a 48" or 55" computer monitor into my life.
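
The "mouse cursor bounded to the working window" part is actually something you can fake today - here's a rough Windows-only sketch using the Win32 ClipCursor call (the coordinates are hypothetical, for a centered ~3200x1800 area on a 3840x2160 desktop, and untested on my end):

Code:
# Confine the mouse to a centered "working window" via Win32 ClipCursor.
# Hypothetical coordinates for a 3840x2160 desktop; Windows-only.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32
work_area = wintypes.RECT(320, 180, 3520, 1980)   # left, top, right, bottom

user32.ClipCursor(ctypes.byref(work_area))        # confine the cursor
# ... work inside the centered area ...
user32.ClipCursor(None)                           # release the confinement

Windows can drop the clip on various system events, so a real tool would have to keep reapplying it - which is exactly why I'd rather see proper OS/display support for this.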
 
Hmm, this TV looks very interesting for gaming, but i'm not sure if I'm $1750 interested (48" price in Sweden).
Maybe next year.
Historically they drop about 22% by Black Friday, but it's still steep considering what the 55" and 65" LG C9s and CXs go for. If it's $1500 USD at release and I pay 8.7% tax here ~> $1631. $1500 minus 22% ~> $1170, + 8.7% tax ~> $1272.

Waiting that long, the 7nm Nvidia GPUs should be out and should have HDMI 2.1 (4k 444 120hz, VRR, 10bit HDR, HGIG HDR for consoles, eARC-capable uncompressed audio). Potentially a $228 savings would be appreciable if the 48 CX price drops similarly.

Overall I'm looking at
-LG CX ~ $1300+
-3080 Ti SC (hybrid?) ~ $1200 - $1400 ??
-Denon 7.1 eArc capable receiver (optional) ~ $400 + tax

I'll see if I can sell some hardware but realistically it might be more like black friday for the LG 48 CX and new yr/tax return season for Nvidia 7nm Ti GPU.
..Will see how it goes and what the economic fallout of the pandemic looks like.. :sick::eek::unsure:

----------------------------------------------------------
This is a really good point. For me, I think it comes down to software support... or lack thereof to allow the new paradigm to make sense.

I'd love OS and display support to enable the following scenario:
- a 36" diagonal 16:9 working space centered on a 48CX-like display for day-to-day where I want to sit close and not swivel my head constantly
- this working space is the default for 2D windowed GUIs, games etc., and mouse cursor movement is bounded to this "window"
- the periphery is low information density semi-static display space... think widgets (a la the old Vista sidebars) or other pinned content... along with the appropriate HMI to interact with that space... or a static background that blends in with the wall behind it... or to provide the illusory effects enabled by display-in-display, e.g. gentle rotation or panning of the central window in, for example, a simulator
- it's easy to switch from this "inset window" mode to a full-screen environment (and back) when for example watching a full screen video where I don't mind taking advantage of the full width, am more likely to sit back or watch at a distance, and don't want widgets cluttering the field of view

If I had this level of software support, I'd be more willing to incorporate a 48" or 55" computer monitor into my life.

I've said before here in threads that I'd love to have more or less a wall of extreme-resolution monitor where I could set my game screen 1:1 pixel wherever I wanted, and at whatever size I wanted, on the screen, with other app windows at other parts of the "screen wall".

That said, having used virtual desktop windows on a VR headset, I can see that with an abundance of high-ppi resolution, even narrow-FoV glasses as AR/VR screen replacements (rather than large headsets with wide FoV for VR environments) will likely be a thing. This is most likely one of, if not the, last gaming monitor I'm going to buy going forward if things continue the way I'm hoping. I won't be spending $1500+ on a monitor AND $1000 - $1500 - $2000 on VR + AR headsets and gear once VR and AR graduate to much higher fidelity + mixed reality functionality.

At the correct distance and used as a multimedia stage this LG is going to be great. At this point it's just a matter of when I'll break down and buy it which could be more toward xmas unfortunately if I'm going to be wise about it.
 
I'd love OS and display support to enable the following scenario:
Part of this is certainly OS, but it's also GPU driver, display, and application-dependent.

Just look at the mess that scaling, HDR, and VRR have been. Scaling is getting good, VRR has been mostly figured out, but HDR?

That's still a shitshow.
 
If I can find a way to completely hide the taskbar then I think I'll buy day one. TranslucentTB still leaves the sliver of color that Windows 10 uses to highlight the open app/window.
I'm confused - is there something problematic with just using Windows' built-in "auto-hide the taskbar" setting?
 
I'm confused - is there something problematic with just using Windows' built-in "auto-hide the taskbar" setting?

It leaves a tiny sliver of the accent color visible where the currently selected app resides. Completely stupid implementation. I guess you can set accent color to black but still...
 
If I can find a way to completely hide the taskbar then I think I'll buy day one. TranslucentTB still leaves the sliver of color that Windows 10 uses to highlight the open app/window.

What makes you think that would ever burn-in? The rtings test proved that semi-transparent and low brightness static elements would take >10K hours to cause any burn-in, if they ever do at all.
 
What makes you think that would ever burn-in? The rtings test proved that semi-transparent and low brightness static elements would take >10K hours to cause any burn-in, if they ever do at all.

It's just one or two pixels wide at the very bottom, so I wasn't thinking of burn-in at all - just that it makes no sense to have it visible at all.
 
It's just one or two pixels wide at the very bottom, so I wasn't thinking of burn-in at all - just that it makes no sense to have it visible at all.

Oh, well, whatever. I've always assumed it's there so that if you want to use the mouseover stuff on the taskbar or right click on the focused window, you don't have to unhide the taskbar and THEN search for the right thing. They probably did some usage research on that.
 
DisplayFusion Pro has a full-featured taskbar that includes all of the Windows taskbar functions and more. I've been using DisplayFusion Pro for years for all of its multi-monitor functions. The DisplayFusion Pro taskbar completely hides itself when set to auto-hide.
https://www.displayfusion.com/Features/Taskbar/

Thread from 2020 with the displayfusion author referencing the 2nd thread as being a valid workaround:
https://www.displayfusion.com/Discu...f39s/?ID=46c7fa88-585f-463a-adec-1acaebb5a1b9

The original displayfusion forum thread with the author detailing how to hide the windows taskbar on the primary monitor completely in order to replace it with displayfusion's taskbar there:
https://www.displayfusion.com/Discu...kbar/?ID=3d0cad5b-2ab0-4b2d-acbc-cd2b45389c38


------------------------------------------------------

Taskbar Magic is referenced by most of the pages that mention completely hiding the default Windows taskbar. You can then add the full-featured and completely hideable DisplayFusion taskbar in its place (or set the primary monitor's apps to show on the DisplayFusion taskbars on other monitors in an array).

https://www.404techsupport.com/2015/09/16/hide-taskbar-start-button-startup/
[image: Taskbar Magic options window]
From the Options window, you can manually choose to show or hide the Taskbar, move it to one of the four sides, and reserve a certain number of pixels from the monitor edge. You want to reserve 45 pixels if you are using RocketDock. You will also want to configure the Taskbar to auto-hide so that Windows does not needlessly reserve the full height of the Taskbar as well.

Taskbar Magic is quite old but still quite functional for this unusual need. The other options that I found had different default behaviors or quirks that didn’t really solve the problem. I was happy I found Taskbar Magic and so was the client.



https://superuser.com/questions/219605/how-to-completely-disable-the-windows-taskbar/641719

reddit r/windows/comments/atj5l2/is_there_a_way_to_hide_the_taskbar_completely/

-------------------------------------------------------

You can also use Taskbar Hider, which binds showing/hiding the taskbar to a hotkey toggle, but I don't think this one gets rid of the sliver of taskbar showing by itself.

http://www.itsamples.com/taskbar-hider.html

------------------------------------------------------

Use any at your own risk but they seem to be working methods.

https://www.dropbox.com/s/480hjn1n0thpl3j/Taskbar-Magic.zip

https://www.displayfusion.com/
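
If you'd rather not run an old third-party binary, the same basic trick these tools use (hiding Explorer's taskbar window outright) can be sketched in a few lines of Python via the Win32 API - untested on my end, so the same "use at your own risk" disclaimer applies:

Code:
# Rough DIY version of what Taskbar Magic does: hide the Explorer taskbar
# window itself via the Win32 API. Windows-only; run again with SW_SHOW to restore.
import ctypes

user32 = ctypes.windll.user32
SW_HIDE, SW_SHOW = 0, 5

taskbar = user32.FindWindowW("Shell_TrayWnd", None)   # the primary taskbar
if taskbar:
    user32.ShowWindow(taskbar, SW_HIDE)               # swap SW_HIDE for SW_SHOW to bring it back

secondary = user32.FindWindowW("Shell_SecondaryTrayWnd", None)   # taskbars on other monitors, if any
if secondary:
    user32.ShowWindow(secondary, SW_HIDE)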
 
It leaves a tiny sliver of the accent color visible where the currently selected app resides. Completely stupid implementation. I guess you can set accent color to black but still...
Absolutely nutty idea - what happens if you make a custom resolution of something like 3840x2161?

I ask because feeding a 1920x1081 signal to my 1080p TV from an AMD GPU, whether with GPU scaling disabled or enabled (albeit set to the "don't up/downscale" setting), actually results in cropping off a row of pixels, and while I've never tested the max I can go, I have done up to at least 1920x1084 (I forget if that resolution cuts off 4 rows on a single side or 2 rows each on the top and bottom).
 
So kind of a problem for Nvidia PC gamers for the CX series which may make the C9 better. After reading the last 200 posts on the AVS forums:

We already know about the whole 48 to 40 Gbps HDMI 2.1 downgrade that LG ninja'd in and admitted to this week. This means 4K 120hz 4:4:4 10-bit max instead of 4K 120hz 4:4:4 12-bit max. Since these panels are only natively 10-bit anyway this isn't the end of the world, BUT:

Nvidia gaming cards CURRENTLY only allow you to choose 8-bit or 12-bit color when doing full 4:4:4 RGB over HDMI. Only their professional line allows 10-bit. You can get 10-bit to show, but only after some type of downgrade such as 4:2:2 or 4:2:0, or by using YCbCr color instead of RGB. This can even be tested at lower resolutions such as 4K/30 so as not to saturate the current HDMI 2.0 spec.

8-bit HDR gaming WILL introduce banding and noise for overall lower PQ.

long story short, unless LG and/or Nvidia makes changes to their products:

C9 Series = 4K 120hz 4:4:4 RGB 12-bit Max
CX Series = 4K 120hz 4:4:4 RGB 8-bit Max

I think I may be sticking to my C9 for PC/Nvidia only gaming.

note: even though the panels are 10-bit anyway, some people claim there’s still some benefits of using 12-bit, maybe think of it like downsampling a higher resolution texture or something into a lower resolution panel.
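
Some rough math on why 40Gbps tops out at 10-bit (my own back-of-the-envelope numbers, assuming the standard CTA-861 4K 120hz timing and HDMI 2.1's 16b/18b FRL coding, ignoring FEC and DSC):

Code:
# Uncompressed 4K 120Hz 4:4:4 video rate vs. required FRL link rate.
# Assumes the standard CTA-861 timing (4400 x 2250 total) and 16b/18b FRL coding.
PIXEL_CLOCK_HZ = 4400 * 2250 * 120        # ~1.188 GHz for 3840x2160 @ 120 Hz

for bits_per_channel in (8, 10, 12):
    payload_gbps = PIXEL_CLOCK_HZ * 3 * bits_per_channel / 1e9   # RGB / 4:4:4
    link_gbps = payload_gbps * 18 / 16                           # FRL coding overhead
    print(f"{bits_per_channel:2d}-bit 4:4:4 @ 4K120: ~{payload_gbps:.1f} Gbps payload, "
          f"needs roughly a {link_gbps:.1f} Gbps FRL link")

So 10-bit 4:4:4 120hz sits right at the edge of the 40Gbps tier, while 12-bit clearly needs the full 48Gbps - which lines up with the C9 vs CX breakdown above.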
 
So kind of a problem for Nvidia PC gamers for the CX series which may make the C9 better. After reading the last 200 posts on the AVS forums:

We already know about the whole 48 to 40 Gbps HDMI 2.1 downgrade that LG ninja'd in and admitted to this week. This means 4K 120hz 4:4:4 10-bit max instead of 4K 120hz 4:4:4 12-bit max. Since these panels are only natively 10-bit anyway this isn't the end of the world, BUT:

Nvidia gaming cards CURRENTLY only allow you to choose 8-bit or 12-bit color when doing full 4:4:4 RGB over HDMI. Only their professional line allows 10-bit. You can get 10-bit to show, but only after some type of downgrade such as 4:2:2 or 4:2:0, or by using YCbCr color instead of RGB. This can even be tested at lower resolutions such as 4K/30 so as not to saturate the current HDMI 2.0 spec.

8-bit HDR gaming WILL introduce banding and noise for overall lower PQ.

long story short, unless LG and/or Nvidia makes changes to their products:

C9 Series = 4K 120hz 4:4:4 RGB 12-bit Max
CX Series = 4K 120hz 4:4:4 RGB 8-bit Max

I think I may be sticking to my C9 for PC/Nvidia only gaming.

This should be an easy fix since DisplayPort works just fine with a 10-bit option. I hope Nvidia does something about it sooner rather than later, or that the 12-bit option manages to correctly drop to 10-bit on these displays. As far as I know the professional drivers allow you to use 10-bit color everywhere, whereas the standard drivers only support 10-bit in games and possibly apps that use HDR. My understanding is that if you install the Studio drivers you can use non-pro GPUs at 10-bit in select apps like, say, Photoshop or Premiere Pro. It's a completely artificial and bullshit limitation.
 
It seems rather pointless to worry about when we don't even have HDMI 2.1 GPUs yet. Personally I'm not buying anything for the desktop based around HDMI 2.1 until I see the GPU announcements.
 