LG 48CX

I got my Club3D adapter.

I'm able to do 120 Hz and RGB. If I mess something up in the NVIDIA settings, it will go to a black screen and I just need to wait for it to revert to the previous setting. At worst I need to unplug and replug the HDMI... so I don't need to restart the PC at all. At first I thought it made the screen look weird, but it actually reset the TV settings, so I had to redo them, and then it looked fine.
 
Actually 8-bit dithered RGB at 120 Hz is still better than 10-bit RGB. You get less banding from a temporally dithered lower bit-depth signal, especially at high refresh rates. If the source were to render to a 12-bit+ surface and output a 10-bit dithered signal, that would be better than 8-bit dithered.

Dithering adds noise. Kind of like anti-aliasing in a way. While it might look better, it is technically less accurate.
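To make the tradeoff in the two posts above concrete, here is a minimal NumPy sketch (my own illustration, not anything from the thread; the ramp values and noise amplitude are arbitrary): quantizing a slow gradient to 8-bit produces a handful of flat steps (banding), quantizing it to 8-bit with random dither trades those steps for fine noise whose local average tracks the gradient, and 10-bit simply has finer steps.

```python
import numpy as np

# A slow, smooth gradient stored as values in [0, 1].
ramp = np.linspace(0.30, 0.35, 4096)

def quantize(signal, bits, dither=False):
    """Quantize a [0, 1] signal to the given bit depth.
    With dither=True, add +/- 0.5 LSB of random noise before rounding,
    which trades visible banding for fine noise."""
    levels = 2 ** bits - 1
    x = signal * levels
    if dither:
        x = x + np.random.uniform(-0.5, 0.5, size=x.shape)
    return np.clip(np.round(x), 0, levels) / levels

for label, out in [
    ("8-bit, no dither",  quantize(ramp, 8)),
    ("8-bit, dithered",   quantize(ramp, 8, dither=True)),
    ("10-bit, no dither", quantize(ramp, 10)),
]:
    err = out - ramp
    print(f"{label:17s}  distinct levels: {len(np.unique(out)):3d}  "
          f"RMS error: {np.sqrt(np.mean(err ** 2)):.6f}")

# Typical outcome: the undithered 8-bit ramp collapses into ~14 flat steps
# (banding); the dithered 8-bit ramp uses the same few codes but its local
# average follows the ramp, at the cost of per-pixel noise (higher RMS
# error, i.e. "less accurate"); the 10-bit ramp has ~4x finer steps.
```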




Partial transcript from the video with Doug from B&H photo video:
"You can actually visualize this if you look at a waveform monitor. As you add curves or adjust levels, you're stretching the data of the video out, revealing gaps in the image. Banding is what happens when these gaps become too prominent"
[Image: o2Z5EIx.jpg]



"But shooting in 10-bit avoids this. Now all of that information from the sensor can be defined in 1024 steps, allowing for more drastic color grades, and smooth gradients in places like skies."

"Simulated example: Let's say that Rec. 709 has 'regular old blue' while P3 or Rec. 2020 have 'brilliant blue' (I just made those up). 8-bit color can still display 'brilliant blue', but it's going to have fewer shades between that and 'regular old blue'. The larger your gamut, the more challenging it is for an 8-bit mode to accurately represent it. This is why HDR requires a minimum of 10-bits. "

"But that still does not answer how nearly every form of video gets away with it."

"I know this is going to sound weird, but believe it or not,
for most types of video, 8-bit works just fine. Provided you have the bitrate, all it takes, is a little bit of noise. (Lower bitrate = higher chance of banding EVEN WITH HIGHER BIT DEPTHS)"

"Don't believe me? Watch what happens when I add just a small bit of noise to this image."


Noise example:
[Image: iYDrcxx.jpg]


Original with banding (not at the original resolution unfortunately, but click it to see it at the max I captured):
[Image: QDuxYuO.jpg]

Version after adding noise (not at full resolution unfortunately, but you can click the image to see the max res I captured it at):
[Image: LhN3Whe.jpg]

"The banding is a lot less noticeable"



"And just like we could see the gaps in the waveform, you can see the noise filling-in those gaps."

[Image: JcWuwkD.jpg]

"When professional content is finished in 10-bits or more, the version you end up seeing comes from that source. It's a lot like how a 4k image looks really sharp and clean when downscaled to 1080p. The process in this case is called 'dithering' and it blends colors together using noise. On top of this, most movies, even those shot digitally, have a form of sensor noise or film grain that hides the limits of 8-bit recording."

"Lastly, just because you shoot 10-bit, doesn't mean you need a 10-bit monitor to appreciate the gains. Yes, it's true -
the most accurate representation and benefits of 10-bit can only be seen on a 10bit display. However, shooting and especially grading in higher bit depths still gives you added flexibility that you see in the final image. Those cleaner gradients for example, are still visible in the 8bit monitor that you or anyone else has. In face, it's the monitor or graphics card you have that is dithering the image before you even see it.

"So if you aren't doing heavy grading, or better yet, if you're not using a log mode at all, shooting in 8-bits isn't all that bad. That said, if you shoot in log modes, and want to perform extensive, creative color grades, you will definitely want to shoot in 10-bit. And if you want to produce HDR contents, you're actually required to shoot in at least 10-bits!"

From what I can gather from what that video is teaching - 8-bit does look "fine" or "good enough", with a sort of anti-aliasing effect or color noise...
...but he also went on to say that the 8-bit version is working from 10-bit material and making noise out of those 10-bit source colors, almost like how 4K downsampled to 1080p looks better than native 1080p. That might give credence to the claim some people make that 12-bit material downsampled to 10-bit would look marginally better than 10-bit to 10-bit - a higher source material "color resolution" to "downsample" from (as Monstieur suggested).

That said, at least if you are forced to use 8-bit (dithered) you won't see aggressive banding, since it will be masked by noise. There will be some small % of detail lost to light noise, just like adding a % noise filter in Photoshop.
 
I suppose that's a little bit reassuring in the event NVIDIA actually decides not to support 10-bit full RGB over HDMI with Ampere. I do expect them to come around but you never know.
 
I mean even if Club3D's own HDMI cable works with the CAC-1085, HDR still looks washed out, and HDR is for me the entire reason for buying the adapter (4K/120/HDR). It seems like right now its only benefit is getting 120 Hz on the desktop and avoiding 4:2:0 text quality.
I was able to run the same. No washed-out colors with HDR enabled, but I didn't want to wait for Club3D cables and returned the adapter. I will wait for a more vanilla plug-and-play solution with Ampere, which should also prove useful for getting decent frame rates at 4K HDR in recent titles.

Just checking, what do you guys mean by "washed out colors" when HDR is enabled? Is this in Windows or in a game that uses HDR? I'm asking because Windows with HDR on will always look washed out compared to with HDR off. There's a slider in the HDR settings to make it look less washed out, but it's still not great. The Windows UI is built for 8-bit, and anything more is currently garbage. The only time HDR will look better is when using specific media players or games that explicitly take advantage of HDR.
 
I have an important request. Can someone PLEASE do a semi in-depth review of the Club3D adapter and Club3D HDMI cable?

There is literally no real "consistent" information out there. I am mostly hearing the bad, BUT with a few people reporting good results if the proper cable is used. I am interested in hearing from someone who is unbiased and honest, not from people who have unrealistic expectations and a general attitude of "tech sourness" and will just trash this adapter.

What I would like to know about is setup, daily usage, and any software / hardware changes that have to constantly be made daily or hourly depending on use - gaming, for example.

I am trying to decide if I should buy this.
I'll let you know when my adapter gets here next week. Ordered a couple of "48 Gbps" HDMI cables to test alongside my current "18 Gbps" certified ones with this adapter. Curious to see what happens.
 
Just checking, what do you guys mean by "washed out colors" when HDR is enabled? Is this in Windows or in a game that uses HDR? I'm asking because Windows with HDR on will always look washed out compared to with HDR off. There's a slider in the HDR settings to make it look less washed out, but it's still not great. The Windows UI is built for 8-bit, and anything more is currently garbage. The only time HDR will look better is when using specific media players or games that explicitly take advantage of HDR.

I mean the entire screen turns gross with HDR enabled with the adapter like it has a grey overcast. HDR does not look washed out at all without the adapter and just appears like a more vibrant and bright SDR to me.
 
I mean the entire screen turns gross with HDR enabled with the adapter like it has a grey overcast. HDR does not look washed out at all without the adapter and just appears like a more vibrant and bright SDR to me.

Yea, without using the Adapter, HDR doesn't look washed out at all.
 
Yea, without using the Adapter, HDR doesn't look washed out at all.

Just tried out HDR, and yeah the desktop color is washed out, but media seems OK. Also, the adapter is pretty finicky... sometimes I have to unplug/replug when switching to HDR, but sometimes it works fine.
 
Just tried out HDR, and yeah the desktop color is washed out, but media seems OK. Also, the adapter is pretty finicky... sometimes I have to unplug/replug when switching to HDR, but sometimes it works fine.

Yea I think Club 3D just rushed out this product because they knew that if they didn't, they wouldn't be able to sell them, since the new 3000 series cards with HDMI 2.1 are coming out soon.
 
Yea I think Club 3D just rushed out this product because they knew that if they didn't, they wouldn't be able to sell them, since the new 3000 series cards with HDMI 2.1 are coming out soon.

Nah, I just think people don't know what the hell they are doing, or what settings to use. It's specifically meant for a C9 and RTX/Navi cards. The NVIDIA drivers should be 450.22, and a proper cable helps.

Half the posts I've read are people trying to use 6 meter cables on a CX and wondering why it wasn't working lol.
 
Anyone losing signal after the TV goes to sleep? I'll leave the PC for 30 minutes and come back to turn the TV on and I'm greeted with a no signal screen until I restart PC.
 
Nah, I just think people don't know what the hell they are doing, or what settings to use. It's specifically meant for a C9 and RTX/Navi cards. The NVIDIA drivers should be 450.22, and a proper cable helps.

Half the posts I've read are people trying to use 6 meter cables on a CX and wondering why it wasn't working lol.

I can remember when I ran 25' HDMI/DVI and DisplayPort cables (and active USB) to my desk from a storage room through my basement ceiling. I also had a 50' one run to my living room TV.

I think this is the new ultra HDMI 2.1 cable:

[Image: HDMI2.1-adapter-jk-1.jpg]
 
Dithering adds noise. Kind of like anti-aliasing in a way. There will be some small % of detail lost to light noise, just like adding a % noise filter in Photoshop.
No it doesn't - this is temporal dithering in a display signal, not spatial dithering in a static image or video file. That video is ignorant misinformation - they are talking about the source content (which needs to be true 10-bit+), not the signal to the display. The frames in the source content are static images, where your photoshop analogy applies. Temporal dithering causes flickering (at low refresh rates), not banding.

Temporal dithering alternates colours of the same pixel in-place, using shades that average to the target colour, in every frame sent to the display, so the net wavelength from that single pixel reaching your eye is the desired colour. At 30 Hz it can cause visible flickering, but at 120 Hz the eye's persistence blends the wavelengths completely. It cannot be detected even with a spectrophotometer unless the meter has nanosecond-level temporal resolution. The colour reproduced is accurate - even more than a native 10-bit image which is potentially truncated.
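A toy sketch of the mechanism described above, assuming a simple 4-frame FRC cycle (real GPU/panel dithering uses more elaborate spatio-temporal patterns, but the averaging principle is the same; the target code below is arbitrary):

```python
import numpy as np

def frc_frames(target_10bit, cycle=4):
    """Temporally dither one 10-bit code (0..1023) to 8-bit codes (0..255).

    The 10-bit target sits between two adjacent 8-bit codes. Over a short
    cycle of frames the pixel alternates between those two codes so that
    the time-average equals the 10-bit value."""
    exact = target_10bit / 4.0                  # 10-bit value on the 8-bit scale
    low = int(np.floor(exact))
    n_high = int(round((exact - low) * cycle))  # frames that show the upper code
    return [low + 1] * n_high + [low] * (cycle - n_high)

target = 601                                    # arbitrary 10-bit code
frames = frc_frames(target)
print("8-bit frames sent:         ", frames)           # [151, 150, 150, 150]
print("time-average (8-bit scale):", np.mean(frames))  # 150.25
print("target on 8-bit scale:     ", target / 4.0)     # 150.25
```

At 120 Hz that 4-frame cycle repeats 30 times per second, which is the point made above about flicker only really being a concern at low refresh rates.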
 
Really the $55.99 is worth it if I can get the text to look a lot less blurry. I work numerous hours on the CX and the fuzzy text is starting to annoy me.

Toggle pixel shift off and on. That fixes text blurriness at 4K 120 Hz 4:2:0. The adapter will get rid of fringing issues. The blurriness at 4K 120 Hz, and pixel shift always turning itself back on after a restart, are bugs in LG's firmware that they hopefully fix sooner rather than later.
 
No it doesn't - this is temporal dithering in a display signal, not spatial dithering in a static image or video file. That video is ignorant misinformation - they are talking about the source content (which needs to be true 10-bit+), not the signal to the display. The frames in the source content are static images, where your photoshop analogy applies. Temporal dithering causes flickering (at low refresh rates), not banding.

Temporal dithering alternates colours of the same pixel in-place, using shades that average to the target colour, in every frame sent to the display, so the net wavelength from that single pixel reaching your eye is the desired colour. At 30 Hz it can cause visible flickering, but at 120 Hz the eye's persistence blends the wavelengths completely. It cannot be detected even with a spectrophotometer unless the meter has nanosecond-level temporal resolution. The colour reproduced is accurate - even more than a native 10-bit image which is potentially truncated.

Are you saying that, for HDR games at least, all we need is 8-bit RGB? So people have been complaining about NVIDIA's lack of 10-bit RGB over HDMI for nothing this entire time?
 
No it doesn't - this is temporal dithering in a display signal, not spatial dithering in a static image or video file. That video is ignorant misinformation - they are talking about the source content (which needs to be true 10-bit+), not the signal to the display. The frames in the source content are static images, where your photoshop analogy applies. Temporal dithering causes flickering (at low refresh rates), not banding.

Temporal dithering alternates colours of the same pixel in-place, using shades that average to the target colour, in every frame sent to the display, so the net wavelength from that single pixel reaching your eye is the desired colour. At 30 Hz it can cause visible flickering, but at 120 Hz the eye's persistence blends the wavelengths completely. It cannot be detected even with a spectrophotometer unless the meter has nanosecond-level temporal resolution. The colour reproduced is accurate - even more than a native 10-bit image which is potentially truncated.

I did read up about temporal dithering at some point, but I still heard it referred to as "noise"... the noise is an analogy for masking the banding. I've also heard it called noise when talking about the near-black flashing fix LG did.

So is the temporal dithering being done on the GPU or on the display in the case of the CX?

And following that logic, do you still think an 8-bit dithered display from a 10-bit signal would look better than a 10-bit signal without dithering? In that case, would a 12-bit signal "downsampled" to 10-bit dithered look better than a 12-bit or 10-bit signal without dithering?

The banding seems to get finer and finer at higher color bit depths, similar to increased PPI/resolution... so when you say "better", do you mean that the temporal dithering is blending out that finer, more pixel-accurate color banding?

Isn't averaging a color value sort of approximate? As you said, blending two values instead of showing one specific color - so muddying it slightly? I mean, flash-blending two lesser colors to approximate a higher color... if it weren't noisy or still slightly messy, sort of aliased... why wouldn't it look just like the higher bit-depth version (i.e. the much reduced/finer banding of a pure 10-bit signal to a 10-bit panel)?

....

Here are some more examples of the "noise" analogy in a type of temporal dithering. In this case he is trying to use an A and B image to flash between, so that a screenshot would only capture one of them, as a method of preventing accurate screenshots. If it was running fast enough it would appear solid. To me a lot of the pictures look flashy, and on some I can even see the scattered dark-pixel image flash. I realize this isn't the same way very near colors are temporally dithered, or as fast, but I found it interesting.

http://persistent.info/web-experiments/temporal-dithering/

[Image: QrSoT5db6C790dfWWA0jX3FjdaRTu70AwbnfhhSvlTxw3uP4Xs.gif]

If you go to that web page and select "File", you can upload your own pic and turn the setting to finer grain but this is the smallest it will let you go. The image in the upper right corner would be flashing super fast on his site but I can still see artifacts at whatever rate that is.

[Image: UjQhZHm.jpg]

edit: Incidentally I did get the sliders on that site able to go down to 1px and 1 variance.
 
I think Monstieur is referring more to 8-bit full chroma vs 10/12-bit with chroma subsampling. In this case 8-bit could look the same when temporal dithering is applied, despite being able to represent fewer colors. Chroma subsampling throws out color information to reduce bandwidth, because we are not good at seeing small color differences, so even 10-bit could end up with the same or worse visual quality than 8-bit + FRC.

Most 8 vs 10+ bit discussions seem to be about color banding more than anything.
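For anyone unclear on what 4:2:2 actually throws away, here is a small NumPy sketch (approximate BT.709 luma coefficients and a deliberately worst-case test pattern of my own choosing): per-pixel luma is kept, while chroma is averaged across horizontal pairs, which is exactly the kind of fine colored detail, like text fringes, that gets smeared.

```python
import numpy as np

# BT.709 luma coefficients (approximate); chroma here is just scaled B-Y
# and R-Y, enough to show what horizontal 2:1 chroma averaging does.
KR, KG, KB = 0.2126, 0.7152, 0.0722

def rgb_to_ycc(rgb):
    y = KR * rgb[..., 0] + KG * rgb[..., 1] + KB * rgb[..., 2]
    cb = (rgb[..., 2] - y) / (2 * (1 - KB))
    cr = (rgb[..., 0] - y) / (2 * (1 - KR))
    return y, cb, cr

def ycc_to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return np.stack([r, g, b], axis=-1)

def subsample_422(y, cb, cr):
    """4:2:2 - keep luma at full resolution, average chroma in horizontal
    pairs, then repeat each averaged chroma sample for both pixels."""
    cb2 = cb.reshape(-1, 2).mean(axis=1).repeat(2)
    cr2 = cr.reshape(-1, 2).mean(axis=1).repeat(2)
    return y, cb2, cr2

# Worst case for chroma subsampling: single-pixel-wide colored detail,
# e.g. alternating red / cyan columns (think colored text fringing).
row = np.tile(np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 1.0]]), (4, 1))

out = ycc_to_rgb(*subsample_422(*rgb_to_ycc(row)))
print("original pixels:\n", row)
print("after 4:2:2 round trip:\n", np.round(out, 3))
# Brightness per pixel is preserved, but the red/cyan alternation averages
# out to neutral chroma, leaving a grey-looking stripe - which is what
# 4:2:2 text fringing looks like on fine colored edges.
```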
 
I just received my CAC-1085 and did some testing with a 3 m HDMI 48 Gbps cable. I'll get the CAC-1372 cable tomorrow.
  1. The GPU sees it as a HDMI connection and not DisplayPort.
  2. The EDID of the CX is duplicated without any difference visible in CRU, including the FreeSync Range data block, but G-SYNC does not work.
  3. With the 3 m HDMI cable I get "No Signal" or corruption when changing display modes. I have to unplug the USB cable to fix it.
  4. I get washed out HDR at 4K (60 Hz / 120 Hz) (8-bit RGB / 10-bit RGB / 12-bit YCbCr422). Only 10-bit YCbCr422 does not have this issue. 1440p 120 Hz 10-bit RGB also works. Once in a while the other formats work correctly. This is not fixed by setting the Black Level to Low on the TV. Changing to Limited in NVCP results in a doubly washed out look. I think this is a cable issue - the adapter could be failing to negotiate RGB and falling back to YCbCr422 on the HDMI side, but the GPU thinks it's using RGB. If so, the GPU must be incorrectly using Gamma 2.2 for a YCbCr signal. This is plausible because the adapter is decoding DSC. The PG27UQ & X27 had this washed out look when using YCbCr gamma on the DisplayPort input.
  5. 60 Hz 8-bit RGB works. HDR is still subject to the washed out issue.
  6. 60 Hz 10-bit RGB results in 4:2:2 on the CX even in PC mode. I suspect the CX cannot render 4:4:4 at anything above 8-bit 60 Hz. It could also be an issue with the GPU / DSC / CAC-1085.
  7. 60 Hz 12-bit RGB results in 4:2:0 on the CX.
  8. 120 Hz 8-bit RGB results in 4:2:2 on the CX.
  9. 120 Hz 10-bit RGB results in 4:2:2 on the CX.
  10. 120 Hz 12-bit RGB results in 4:2:0 on the CX.
  11. CVT-RBv2 timings for 4K 120 Hz 8-bit RGB (to bypass DSC) does not work on the RTX 2080 Ti. Others have reported it does work on the 1080 Ti.
The only reliable mode is 4K 120 Hz 10-bit YCbCr422.
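For context on why DSC is in the picture at all, here is a back-of-the-envelope data-rate calculation for the modes in that list (active pixels only, ignoring blanking, so real link requirements are several percent higher; the usable-payload figures per link are commonly quoted approximations, not exact spec numbers):

```python
# Rough uncompressed data rates for the modes tested above
# (active pixels only; blanking intervals add several percent more).

LINKS_GBPS = {
    "HDMI 2.0 (TMDS 18G)": 14.4,   # ~18 Gbit/s raw, 8b/10b -> ~14.4 usable
    "DP 1.4 (HBR3 x4)":    25.92,  # 32.4 Gbit/s raw, 8b/10b -> 25.92 usable
    "HDMI 2.1 (FRL 48G)":  42.67,  # 48 Gbit/s raw, 16b/18b -> ~42.67 usable
}

def data_rate_gbps(width, height, hz, bits_per_channel, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

modes = [
    ("4K 60 Hz 8-bit RGB",   3840, 2160,  60,  8),
    ("4K 60 Hz 10-bit RGB",  3840, 2160,  60, 10),
    ("4K 120 Hz 8-bit RGB",  3840, 2160, 120,  8),
    ("4K 120 Hz 10-bit RGB", 3840, 2160, 120, 10),
    ("4K 120 Hz 12-bit RGB", 3840, 2160, 120, 12),
]

for name, w, h, hz, bpc in modes:
    rate = data_rate_gbps(w, h, hz, bpc)
    fits = [link for link, cap in LINKS_GBPS.items() if rate <= cap]
    print(f"{name:22s} ~{rate:5.1f} Gbit/s  fits uncompressed on: {fits}")

# Note: the "fits" check counts active pixels only. With real blanking,
# 4K 120 Hz 10-bit/12-bit RGB cannot fit in DP 1.4 uncompressed, and even
# 8-bit only squeezes in with reduced CVT-RBv2 blanking (item 11 above),
# which is why the GPU engages DSC before the CAC-1085 converts the
# stream to HDMI 2.1 FRL.
```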
 
I just received my CAC-1085 and did some testing with a 3 m HDMI 48 Gbps cable. I'll get the CAC-1372 cable tomorrow.
  1. The GPU sees it as a HDMI connection and not DisplayPort.
  2. The EDID of the CX is duplicated without any difference visible in CRU, including the FreeSync Range data block, but G-SYNC does not work.
  3. With the 3 m HDMI cable I get "No Signal" or corruption when changing display modes. I have to unplug the USB cable to fix it.
  4. I get washed out HDR at 4K (60 Hz / 120 Hz) (8-bit RGB / 10-bit RGB / 12-bit YCbCr422). Only 10-bit YCbCr422 does not have this issue. 1440p 120 Hz 10-bit RGB also works. Once in a while the other formats work correctly. This is not fixed by setting the Black Level to Low on the TV. Changing to Limited in NVCP results in a doubly washed out look. I think this is a cable issue - the adapter could be failing to negotiate RGB and falling back to YCbCr422 on the HDMI side, but the GPU thinks it's using RGB. If so, the GPU must be incorrectly using Gamma 2.2 for a YCbCr signal. This is plausible because the adapter is decoding DSC. The PG27UQ & X27 had this washed out look when using YCbCr gamma on the DisplayPort input.
  5. 60 Hz 8-bit RGB works. HDR is still subject to the washed out issue.
  6. 60 Hz 10-bit RGB results in 4:2:2 on the CX even in PC mode. I suspect the CX cannot render 4:4:4 at anything above 8-bit 60 Hz. It could also be an issue with the GPU / DSC / CAC-1085.
  7. 60 Hz 12-bit RGB results in 4:2:0 on the CX.
  8. 120 Hz 8-bit RGB results in 4:2:2 on the CX.
  9. 120 Hz 10-bit RGB results in 4:2:2 on the CX.
  10. 120 Hz 12-bit RGB results in 4:2:0 on the CX.
  11. CVT-RBv2 timings for 4K 120 Hz 8-bit RGB (to bypass DSC) does not work on the RTX 2080 Ti. Others have reported it does work on the 1080 Ti.
The only reliable mode is 4K 120 Hz 10-bit YCbCr422.

I assume that HDMI 2.1 is more picky about cable length and quality than HDMI 2.0, considering how much higher bandwidth it requires. I hope the Club3D cable solves these issues.

How are you verifying the color spaces? Can you check this somewhere on the TV?
 
Nah, I just think people don't know what the hell they are doing, or what settings to use. It's specifically meant for a C9 and RTX/Navi cards. The NVIDIA drivers should be 450.22, and a proper cable helps. Half the posts I've read are people trying to use 6 meter cables on a CX and wondering why it wasn't working lol.
Might be the case with plenty of noobs around here.... but no. The adapter just sucks. CX here with RTX, 45x.xx drivers, Win10 2004 build, 3 and 6ft 48Gbps 8/10K HDMI cables. Perfect 4K and 4K HDR @ 120Hz @10/12bpp Full RGB, but a meh experience with resolution switching across PC modes, apps, games, etc.

The Club3D cables, both spec-wise and cross-section-wise, have nothing on the ones I am using, and yet they appear to 'fix' the issues.

If this thing can't readily accept a switch from 1080p @ 60Hz to 4K @ 120Hz and back to 1440p @ 100Hz, then it's useless.
It will be deprecated in 2-3 months' time anyway.
12-pin connector or not.
 
Perfect 4K and 4K HDR @ 120Hz @10/12bpp Full RGB
Does the CX downsample RGB to 4:2:2 above 8-bit 60 Hz? Is the chroma-444.png test image clear when viewed at 100% display scaling? You can press Ctrl + 0 twice in the Windows 10 Photos app for a pixel perfect image without reducing your display scaling.
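If you want to generate a single-pixel chroma test pattern of your own rather than hunting down chroma-444.png, something like this works (a sketch assuming Pillow is installed; the pattern, colors, and filename are my own choices, not the actual test image referenced above). Alternating 1-pixel stripes of saturated complementary colors stay crisp at 4:4:4 and visibly smear or wash toward grey when the chain drops to 4:2:2 / 4:2:0.

```python
import numpy as np
from PIL import Image  # pip install Pillow

W, H = 512, 256
img = np.zeros((H, W, 3), dtype=np.uint8)

# Left half: alternating 1-px red/cyan columns (stresses horizontal chroma,
# which both 4:2:2 and 4:2:0 halve).
img[:, 0:W // 2:2] = (255, 0, 0)
img[:, 1:W // 2:2] = (0, 255, 255)

# Right half: alternating 1-px magenta/green rows (stresses vertical chroma,
# which only 4:2:0 halves).
img[0:H:2, W // 2:] = (255, 0, 255)
img[1:H:2, W // 2:] = (0, 255, 0)

Image.fromarray(img).save("chroma_test.png")
# View at 100% scaling (Ctrl + 0 twice in the Photos app, as noted above):
# sharp, fully saturated stripes mean 4:4:4; blurred or desaturated stripes
# mean chroma is being subsampled somewhere in the chain.
```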
 
3 and 6ft 48Gbps 8/10K HDMI cables.

Like I just said, CX and 3 meter cable lol. 2 m is the maximum, 1 m is preferred. You're transferring 48 Gbit/s, remember - that's 6000 MB per second. It's as fast as or faster than an SSD. A DVD is 4700 MB, so you are transferring more than a DVD per second.
 
Like I just said, CX and 3 meter cable lol. 2 m is the maximum, 1 m is preferred. You're transferring 48 Gbit/s, remember - that's 6000 MB per second. It's as fast as or faster than an SSD. A DVD is 4700 MB, so you are transferring more than a DVD per second.

3 feet is about 1m. 6ft is 2.
 
I just received my CAC-1085 and did some testing with a 3 m HDMI 48 Gbps cable. I'll get the CAC-1372 cable tomorrow.
  1. The GPU sees it as a HDMI connection and not DisplayPort.
  2. The EDID of the CX is duplicated without any difference visible in CRU, including the FreeSync Range data block, but G-SYNC does not work.
  3. With the 3 m HDMI cable I get "No Signal" or corruption when changing display modes. I have to unplug the USB cable to fix it.
  4. I get washed out HDR at 4K (60 Hz / 120 Hz) (8-bit RGB / 10-bit RGB / 12-bit YCbCr422). Only 10-bit YCbCr422 does not have this issue. 1440p 120 Hz 10-bit RGB also works. Once in a while the other formats work correctly. This is not fixed by setting the Black Level to Low on the TV. Changing to Limited in NVCP results in a doubly washed out look. I think this is a cable issue - the adapter could be failing to negotiate RGB and falling back to YCbCr422 on the HDMI side, but the GPU thinks it's using RGB. If so, the GPU must be incorrectly using Gamma 2.2 for a YCbCr signal. This is plausible because the adapter is decoding DSC. The PG27UQ & X27 had this washed out look when using YCbCr gamma on the DisplayPort input.
  5. 60 Hz 8-bit RGB works. HDR is still subject to the washed out issue.
  6. 60 Hz 10-bit RGB results in 4:2:2 on the CX even in PC mode. I suspect the CX cannot render 4:4:4 at anything above 8-bit 60 Hz. It could also be an issue with the GPU / DSC / CAC-1085.
  7. 60 Hz 12-bit RGB results in 4:2:0 on the CX.
  8. 120 Hz 8-bit RGB results in 4:2:2 on the CX.
  9. 120 Hz 10-bit RGB results in 4:2:2 on the CX.
  10. 120 Hz 12-bit RGB results in 4:2:0 on the CX.
  11. CVT-RBv2 timings for 4K 120 Hz 8-bit RGB (to bypass DSC) does not work on the RTX 2080 Ti. Others have reported it does work on the 1080 Ti.
The only reliable mode is 4K 120 Hz 10-bit YCbCr422.

I can confirm that 4K 120 Hz 10-bit YCbCr422 has no washed-out colors.
Someone with a 1 m cable should try the other modes.
 
I think Monstieur is referring more to 8-bit full chroma vs 10/12-bit with chroma subsampling. In this case 8-bit could look the same when temporal dithering is applied, despite being able to represent fewer colors. Chroma subsampling throws out color information to reduce bandwidth, because we are not good at seeing small color differences, so even 10-bit could end up with the same or worse visual quality than 8-bit + FRC.

Most 8 vs 10+ bit discussions seem to be about color banding more than anything.

Makes sense, given the comparisons of lower chroma and the workarounds people are forced to use until HDMI 2.1 GPUs are out.

I found some other good info about FRC/temporal dithering overall so thought I'd post it.

==========================================
More dithering info from blurbusters.com
https://forums.blurbusters.com/viewtopic.php?t=6799
==========================================

Re: Why is FRC dithering so widely used?
by Chief Blur Buster » 08 May 2020, 15:06

Aldagar wrote:
07 May 2020, 13:10
And what advantages does FRC give to standard sRGB monitors? Maybe it improves accuracy by reducing quantization errors?
Banding is visible even at 10 bits on a 10,000-nit HDR monitor.

FRC is still advantageous even at 8bits (to generate 10bits) and 10bits (to generate 12bits).

Banding artifacts appear in smoky haze, sky, or other gradients. Adding extra bits can eliminate that.


Aldagar wrote:
08 May 2020, 06:53
Very interesting. So, am I wrong to assume that FRC dithering is affected by resolution and refresh rate, specially considering some LCD technologies like IPS and VA have very slow pixel response times?
Human vision already has natural noise (but the brain filters most of that out). The same problem affects camera sensors.

The important job is to make sure FRC/temporal dithering is below the noise floor of human vision. Also, at high resolution, FRC pixels are so fine that it's almost like invisible spatial dithering instead of temporal dithering. Also, FRC is dithering only between adjacent colors in the color gamut, not between widely-spaced colors (like some technologies such as DLP have to).

Aldagar wrote:
08 May 2020, 06:53
Also, I'm wondering if FRC looks worse on motion than on static images. I can see some kind of "noise" in darker shades on my 8 bit + FRC and 6 bit + FRC monitors, both IPS and 60Hz. And from what you have explained, even on a true 8 bit panel receiving an 8 bit source, there can still be dithering caused by software or GPU drivers.
Yes, motion amplifies temporal dithering visibility.

It's a bigger problem for more binary dithering (DLP/plasma) and a much smaller problem for FRC dithering (LCD) because of the distance between dithered color pairs in the color gamut. DLP's noisy darks and plasma's christmas-tree-effect pixels in darks are (at a similar viewing distance for a similar size / similar resolution display) usually more noticeable dither noise than LCD FRC on noisy dim shades.

Mind you, turning off FRC (or defeating FRC via software-based black frame insertion) will mean two different near-shades end up looking identical.
Slight darn-near-invisible noise is preferable to banding/contouring/identical shades.

Also, the noise in dim colors is not necessarily always FRC-based on all panels. It's definitely almost always the case for 6-bit+FRC though. There can be noise in the pixel drivers (poor electronics in the panel driver sometimes create noise much akin to analog VGA noise, even for a digital input, but this is rare nowadays fortunately). Voltage jitter during voltage inversion algorithms, or other mundane reasons for pixel noise to exist. Early color LCDs (25 years ago) often had a bit of analog-like pixel noise independently of FRC noise, but most of this is now far below the human noise floor.
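A small numeric illustration of that last point about the "distance" between dithered color pairs, under the simplifying assumption of a linear brightness response (the 25% target shade is an arbitrary example): LCD-style FRC flickers between two adjacent 8-bit codes, while a binary modulator like DLP or plasma flickers between full on and full off, so its per-frame swing away from the target shade is hundreds of times larger even though the time-averages match.

```python
# Per-frame deviation from the target shade for two kinds of temporal
# dithering, assuming (simplistically) a linear brightness response.

target = 0.25                         # desired shade, arbitrary 25% grey

# LCD-style FRC: alternate between the two nearest 8-bit codes.
lo, hi = 63 / 255, 64 / 255           # adjacent codes bracketing the target
frc_deviation = max(abs(target - lo), abs(target - hi))

# Binary (DLP/plasma-style) modulation: each frame is full on or full off,
# duty-cycled so the average still comes out at 25%.
binary_deviation = max(abs(target - 0.0), abs(target - 1.0))

print(f"FRC per-frame deviation:    {frc_deviation:.4f}")    # ~0.003 (sub-LSB)
print(f"binary per-frame deviation: {binary_deviation:.4f}") # 0.7500
# Both schemes average to the same shade over time, but the binary one
# swings a few hundred times further from it every frame, which is why its
# dither noise is far easier to see in dark scenes.
```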
 
Anyone losing signal after the TV goes to sleep? I'll leave the PC for 30 minutes and come back to turn the TV on and I'm greeted with a no signal screen until I restart PC.
Yeah. I do a Ctrl+Alt+Del and it wakes back up.
 
Well that's pretty lame. Never had this no signal issue with my C9.

Hopefully a firmware update fixes it because it's sporadic and doesn't occur every time.
 
Anyone losing signal after the TV goes to sleep? I'll leave the PC for 30 minutes and come back to turn the TV on and I'm greeted with a no signal screen until I restart PC.
Haven't seen it at all either on my Macbook Pro or my PC with 2080 Ti and Windows 10 2004.
 
Actually 8-bit dithered RGB at 120 Hz is still better than 10-bit RGB. You get less banding from a temporally dithered lower bit-depth signal, especially at high refresh rates. If the source were to render to a 12-bit+ surface and output a 10-bit dithered signal, that would be better than 8-bit dithered.

Say what? Sources?

Windows / NVIDIA drivers don't even dither by default. There is a supposed registry hack for dithering; not sure if it's a 100% solution. I'd still like to see reliable sources that say a lower bit depth dithered is better than a native higher bit depth. Everything I've ever read says the opposite.
 
I'm not so sure about 8-bit RGB dithered being better than 10-bit RGB, but from what I've tested between 8-bit RGB and 10-bit 4:2:2 in HDR, I cannot tell any difference at all. So at the very least 8-bit RGB seems to be on par with 10-bit 4:2:2, meaning there's zero reason to ever switch color depth away from 8-bit RGB. I'd rather just leave it at that setting for both desktop and games.
 
I'm not so sure about 8-bit RGB dithered being better than 10-bit RGB, but from what I've tested between 8-bit RGB and 10-bit 4:2:2 in HDR, I cannot tell any difference at all. So at the very least 8-bit RGB seems to be on par with 10-bit 4:2:2, meaning there's zero reason to ever switch color depth away from 8-bit RGB. I'd rather just leave it at that setting for both desktop and games.
At 120 Hz, even 8-bit RGB is subsampled to 4:2:2. So clear text may be impossible at 120 Hz.

I'm currently running 120 Hz 10-bit YCbCr422 with the adapter because it's the only format that works 100% of the time without randomly getting washed out in HDR. Sometimes 120 Hz 8-bit / 10-bit RGB doesn't get washed out.
 
With the adapter at least, the CX does not display 60 Hz 10-bit RGB without subsampling to 4:2:2 internally. At 120 Hz, even 8-bit RGB is subsampled to 4:2:2. So clear text may be impossible at 120 Hz.

I'm currently running 120 Hz 10-bit YCbCr422 with the adapter because it's the only format that works 100% of the time without randomly getting washed out in HDR. Sometimes 120 Hz 8-bit /10-bit RGB doesn't get washed out, but the CX subsamples it to 4:2:2 anyway.

I don't have the Club3D adapter. I'm just using a straight HDMI connection from my PC to my CX. I tested some HDR games with 8-bit RGB and 10-bit 4:2:2 and between the two modes I saw no differences. Unless the CX is still subsampling it to 4:2:2??
 