LG 48CX

I swear, if Ampere does not have HDMI 2.1 I'm gonna sell this CX because I can't handle a 60 Hz desktop much longer.

It's a real possibility, like when the R9 Fury released with HDMI 1.4 ports and I had to use an active adapter.
It's 99% gonna have HDMI 2.1 (the remaining 1% is just confirmation directly from Nvidia on Sept 1).

"With GeForce RTX Ampere, NVIDIA introduces 2nd Generation Ray Tracing Cores and 3rd Generation Tensor Cores. The new cards will also support PCI Express 4.0. Additionally, Ampere supports HDMI 2.1 and DisplayPort 1.4a display connectors."

Have a read: https://videocardz.com/newz/nvidia-geforce-rtx-3090-and-geforce-rtx-3080-specifications-leaked
 

My prayers have been answered. It's unfortunate there's no DP 2.0, though.
 
Lack of DP 2.0 is interesting because it means we will be using DP 1.4 with DSC for quite some time now. It also means we probably won't see much improvement in 4K+ res displays for a couple of years, e.g. ultrawide 5120x2160 at above 60 Hz. HDMI 2.1 might even become the display connector to use on desktop monitors at this rate.

It would be really nice if the new GPUs had two HDMI 2.1 and two DP + USB-C.

Also, looking at those specs, I'm now pretty sure we will see a 3080 Ti/Super and 3070 Super sometime next year, or perhaps sooner to combat Big Navi.
 

Yeah, double HDMI would be nice; I'm sure some non-reference cards will have that. I think GPU USB-C (VirtualLink, or whatever they called it) is a failed experiment at this point and going away though.
 

DP 2.0 wouldn't benefit the CX though, and most likely it won't benefit any TV ever. Unless OLED monitors become a real thing and use DP 2.0 for higher refresh rates, I'm never going back to an LCD monitor just for DP 2.0.


Monitors have been evolving at a snail's pace anyways.
 
It boils down to a 3080 or 3090 for me. Depends on price and the performance gap between the two.

Don't really care about the launch 3080s being limited to 10GB since I'll have sold it and bought the latest and greatest before it even matters.
 

Based on what I've read so far, the rumor is $799 for the 3080 and $1399 for the 3090. The 3090 is going to have to have a massive performance boost over the 3080 in order to justify a $600 difference (if rumors are true). So, IDK. Strange that they didn't slot a 3080Ti in at a $1000-$1100 price point, but I suppose a 3080Ti or Super may come during the refresh cycle as kasakka alluded to above.

Really hoping the 3080 markedly outperforms a 2080Ti at the lower price point, as I really don't want to pay ~$1500 after tax for a GPU, but I guess we'll see how things shake out soon enough. The 3080 is going to be significantly faster than my 1080Ti at any rate, so I might be tempted to go with that now so I can get 4:4:4 @ 120Hz and then a Ti/Super later if they provide a large boost that's closer to how the 3090 performs at launch.
 

So you're going to buy a 3080 and then a 3080 Ti? Even if you sell the 3080 for a good amount ($500?), you are now spending almost the same as just buying a 3090 right from the start.
 

The lack of DP 2.0 is very odd, as that means no uncompressed 4K at 10-bit 120 Hz on the DP connector, whereas it will be possible on HDMI 2.1. Come to think of it, I've never seen a card where HDMI had more bandwidth than DP.
 

Do any DP 2.0 displays exist yet?

And I imagine if it becomes an actual problem active adapters for HDMI 2.1->DP 2.0 will come out to give 48Gbps bandwidth to a DP 2.0 display at least.
 

DP 2.0 is supposed to offer 80 Gbps of bandwidth though, so even with HDMI 2.1 adapters you are still far off from that.
 

I know, but 48 Gbps at least gets over the 4K 120 Hz 10-bit 4:4:4 hump that DP 1.4 can't clear. I haven't done the math, but I think 48 Gbps would also get you to 4K 144 Hz 10-bit 4:4:4 if/when there are displays that support that.

And it'll take more than a 3090 to push demanding games above 4K 144 FPS... so by the time the lack of DP 2.0 bandwidth is a problem you'll need a new GPU anyway.
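Back-of-the-envelope, it does work out. A rough sketch (the blanking figures are assumed reduced-blanking approximations, and HDMI 2.1 FRL carries roughly 16/18 of its 48 Gbps as video payload):

Code:
# Rough check: does 4K 120/144 Hz 10-bit 4:4:4 fit in HDMI 2.1's 48 Gbps FRL link?
# Blanking figures below are assumed reduced-blanking approximations.

def required_gbps(h_active, v_active, refresh_hz, bpc, h_blank=80, v_blank=62):
    bits_per_pixel = bpc * 3  # 4:4:4 / RGB: three full-resolution components per pixel
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz * bits_per_pixel / 1e9

HDMI21_EFFECTIVE_GBPS = 48 * 16 / 18  # ~42.7 Gbps of video payload after 16b/18b coding

for hz in (120, 144):
    need = required_gbps(3840, 2160, hz, bpc=10)
    verdict = "fits" if need <= HDMI21_EFFECTIVE_GBPS else "does NOT fit"
    print(f"4K {hz} Hz 10-bit 4:4:4 needs ~{need:.1f} Gbps -> {verdict}")

That lands around 31 Gbps at 120 Hz and 38 Gbps at 144 Hz, so both stay under HDMI 2.1's usable payload, while DP 1.4's ~26 Gbps can't carry either uncompressed.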
 

I didn’t say that I was going to do anything. I was just spitballing... I’m waiting for confirmed price points and benchmarks/reviews. I see what you’re saying, though. But if it turns out that the 3090 is only 10-15% better than the 3080, that’s gonna be a hard nope. If that’s the case then I’ll get the 3080 and then re-evaluate when the next cards come out. If the 3090 destroys the 3080 then yeah, it would be beneficial to buy it day one assuming you’re willing to keep signaling to Nvidia that price is no object as they start the development of a $2K 4090. :meh:
 

I mean, the prices are high, sure, but that's how it is with no serious competition. Intel was selling 8-core CPUs for $1000 not too long ago. So I don't really blame Nvidia for jacking up prices; any company would.
 

I mean, TITANs have been around for how many generations now? Think of the 3090 as a Titan if it makes you feel better ¯\_(ツ)_/¯. It'll be cheap for a TITAN!

Even if they do release an actual Ampere TITAN card, the basically-confirmed-rumored 3090 specs are so close to a full GA102 chip that an uncut TITAN would barely be any better.
 

I know, man. I really wish that AMD could pull another Ryzen and be competitive at the top end. Otherwise this crap is going to continue until sales start to fall off at whatever price ceiling causes gamers in large enough numbers to say “Nah.”



Ha, yeah I suppose that is one way of looking at it. :D
 
If I'm not mistaken, it's already been confirmed that HDMI 2.1 will be integrated with Ampere.
 
Based on the leaked specs the 30-35% performance gap between the 3080 and 3090 will probably hold true. Anything less makes no sense based on Nvidia's past releases (1080 to 1080 Ti, 2080 to 2080 Ti). Is that worth $600 when the 3080 is already rumored to be 30% faster than a 2080 Ti? Probably not but gotta wait and see.

The only difference this time around is that it might not sit at the top for 2 years like the 2080 Ti did. Given the huge TDP, a refresh is inevitable, so those who, like me, recently sold 2080 Tis for basically what they paid 2 years ago might not get the same opportunity.

The 2080 Ti was the most no-brainer purchase I've ever made. I paid $1200-ish in 2018 and basically used/abused it for 2 years, with an out-of-pocket cost of $100 to "rent" it after selling.
 

And yet the games that will benefit most from 144+ FPS, e.g. Fortnite and CSGO, will likely run just fine at those frame rates on a 3090.

So it still seems weird that the 30xx cards won't be able to capitalize on the esports market, but maybe AMD also won't put DP 2.0 in their GPUs this year.
 

DP 1.4 + DSC is capable of outputting 4K 120 Hz 10-bit 4:4:4 with HDR, as evidenced by the Club3D adapter. So that is not necessarily an issue, but DP 2.0 would allow for above 4K high refresh rate displays, new ultrawide formats (e.g. 8K x 2160) and so on. It's going to be a chicken and egg situation where DP 2.0 displays won't come to market when there are no DP 2.0 sources available. This of course has zero bearing on TVs like the LG OLEDs which will keep using HDMI 2.1 but means less innovation and competition on the desktop display market unless they start adopting HDMI 2.1 as well. And before someone says "oh but I will only use large OLEDs so I don't care", for a lot of other people this is not going to be the case. More options is good.
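For what it's worth, a rough calculation (assuming DP 1.4 HBR3's ~25.9 Gbps payload after 8b/10b coding and a typical visually lossless DSC target of 12 bits per pixel) shows why DSC gets DP 1.4 over the 4K 120 Hz 10-bit line:

Code:
# Rough check: 4K 120 Hz 10-bit 4:4:4 on DP 1.4, with and without DSC.
# The blanking totals and the 12 bpp DSC target are assumptions, not measured values.

DP14_EFFECTIVE_GBPS = 4 * 8.1 * 8 / 10  # 4 lanes of HBR3 with 8b/10b coding -> ~25.9 Gbps

def stream_gbps(bits_per_pixel, h_total=3920, v_total=2222, refresh_hz=120):
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

uncompressed = stream_gbps(30)  # 10 bpc x 3 components
with_dsc = stream_gbps(12)      # typical compressed DSC target

print(f"Uncompressed: ~{uncompressed:.1f} Gbps vs ~{DP14_EFFECTIVE_GBPS:.1f} Gbps link -> too much")
print(f"With DSC:     ~{with_dsc:.1f} Gbps -> fits with headroom")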

The new GPUs look to only have one HDMI 2.1 port as is; maybe some vendors will do models with two ports, but those will most likely be more expensive models. Two ports would be convenient for quickly switching between VRR and BFI on the CX 48, with each input using the lowest-input-lag Game mode. There is a very slight difference in input lag using other modes with Instant Game Response. The other option is to use the ColorControl app to operate the LG TV menus and toggle BFI on; that works too.
 
Hi guys. I just got my 48CX in. I have to get a VESA adapter before I can test this on my freestanding Obutto monitor stand. But I need to ask about HDMI cables.

I want to achieve 4K @ 120Hz, plus hopefully HDR. I did purchase the Club3D DisplayPort-to-HDMI adapter that I learned about in this thread; that gets here next week. But if this 48CX sings for gaming, I will end up buying two more for triple 48CXs for my race sim rig, which is going to be very wide. My PC tower will be under the right screen, so I need to get a decent-length cable for where the far left 48CX would end up (unfortunately the 48CX's HDMI inputs are at the far rear left of the unit).

Are there 12' or 15' cables that are good enough for 4k @120hz + HDR? I probably need a 12' minimum for the left screen (15' would do it for sure), and will be able to use a 9' for the center, and a 6' for the right.

I see a high speed 9.6' cable on Club 3d's site. But I am going to need longer than that for the left 48CX.

Thanks for any help!

EDIT: wondering if these are options for my longer left side TV run to the computer...
https://www.amazon.com/CABLEDECONN-Optical-Supports-HDCP2-2-Projectors/dp/B00XJESNQW/ref=sr_1_20?dchild=1&keywords=club+3d+hdmi+cable&qid=1598709931&sr=8-20&th=1

https://www.amazon.com/BIFALE-Suppo...20hz+hdmi+cable+15+feet&qid=1598713183&sr=8-3
 

I'm in the same predicament. My PC is roughly 3m away from the left side of the CX. Based on Club3D's cable length recommendations of 1.8m for 4K/120hz, it's up in the air whether longer cables will work until we have hardware to test.
 
If anyone's interested, I've created a new version of my ColorControl-app and it's available on GitHub now: https://github.com/Maassoft/ColorControl
This 1.1.0.0 version adds:
  • Automatic power on/off functionality for LG TVs
  • Extended NVIDIA controller context menu to allow setting color settings manually. Includes bit depth, format, dynamic range and color space. Note that not all combinations are allowed; invalid ones will result in an error message, and some may also result in a purple screen (RGB format with YUV color space and vice versa)
  • Checkbox to fix blurry/rainbow fonts in Chrome. This is only necessary if you want to use ClearType in Windows, but grayscale anti-aliasing in Chrome. If you have ClearType disabled, just leave this checkbox unchecked.
  • Option to set a shortcut to the blank screen saver
 
I have a question: can full-screen dithering be enabled on the LG TV itself rather than on the (Nvidia) GPU? And if so, would that allow a dev like you to activate and de-activate it on the display itself? Is there any way your app might also (or instead) support 10-bit dithering on the GPU, like the app below?

Earlier in the thread I posted a link to an app that is supposed to do color profiles and enable dithering on Nvidia GPUs, but the way it activates the dithering could be better. It uses a checkbox, and you can't see the dithering test unless you disable and re-enable it; at the same time it says you might have to set the app to run at Windows startup and reboot in order to see if it works.

I always got an error code enabling the dithering anyway but that might be because I'm still running SLI and on 1000 series until I get a 3090. I'll have to try it again with the more modern and singular gpu later this year (on a new OLED to boot). I am hopeful that nvidia someday will add a 10bit + dithering option to their driver controls on their gaming gpus as well but I won't hold my breath. (Someone else in this and the AW55 thread said they got the dithering to work with the app though, as well as people in other forum threads talking about the app).

Nvidia apparently does support full dithering control by the user, just not on their gaming cards in Windows. So theoretically, if they wanted to, they could add both 10-bit over HDMI and dithering controls at some point. (Maybe someday we'll also get a 10-bit internal LUT for Windows instead of the current 8-bit one, unlike Linux's apparently 11-bit LUT.)

Here is a recent method from 4 months ago that seems good, though I haven't tried it myself on any of my monitors. I don't represent that app at all, so use at your own risk. It was made not just for dithering but for ICC color profiles (color management in the May Windows update apparently has some bugs, though). However, it does have dithering controls built in, so you should be able to use it just for that. The most recent readme update was from January 2020 and the current exe date lists as 2-18-2020.
https://bitbucket.org/CalibrationTools/calibration-tools/src/master/

https://bitbucket.org/CalibrationTools/calibration-tools/downloads/


Calibration Tools
Features

  • Create and edit monitor calibration files: adjust backlight, white balance, overall gamma, shadow detail gamma, 10-band gamma, s-curve, highlights.
  • Enforce separate day/night files with automatic dawn/dusk times and gradual transition during twilight.
  • Test for calibration hijacking in games, and solve with ReShade or borderless full screen windowed mode.
  • Enforce per-game calibration files.
  • Enable dithering on Nvidia GPUs for smoother gradients.
  • Portable, malware-free, ad-free, nag-free.
System Requirements

  • 64-bit Windows, Vista or later.
  • Logged in as a Windows administrator.
  • Video card supporting gamma tables for monitor calibration.
  • DirectX June 2010 update (www.microsoft.com/en-us/download/details.aspx?id=8109).
  • Note: Windows 10 version 1903 (May 2019) contains bugs in its colour management system which affect Calibration Tools. For possible solutions, see here.


---------------------------
https://www.nvidia.com/en-us/geforc...-to-port-dithering-from-nvidia-x-ser/2421010/
---------------------------
User - sonicblue83
You can try the app below which applies it through the Nvidia API instead of the reg key.

https://tiny.cc/CalibrationTools

https://i1.lensdump.com/i/j10v6m.png

If the Nvidia API status is success (code 0) click Check Gradient. If the gradient is unsmooth, try a hard reboot. If it's still unsmooth, see this post:
https://hub.displaycal.net/forums/topic/windows-10-1903-please-read/

The app also includes some automatic behaviour to try and prevent dithering from dropping out:
1. Disabling dithering at Windows shutdown
2. Refreshing dithering at display mode change
3. Refreshing dithering at display hardware change
4. Refreshing dithering at entering/exiting low power state
5. Refreshing dithering at game launch/exit

1 seems to be necessary to prevent it from dropping out after a reboot.
Only 1, 3 and 4 are enabled by default, as 2 and 5 have caveats.
You can still enable 2 and 5 through the Config.ini file, although it's not recommended. For details open the Readme file and see Config.ini settings - Dithering.
If for some reason the Enable Dithering checkbox is greyed out, try DitherForceSupport=1 in Config.ini.

Mozgus
Nice! This works quite well. Shame it can't apply it to individual monitors though. Have to unload this config when I want to use my 2nd monitor.

sonicblue83
Yes, unfortunately per-monitor calibration is not supported (see Readme file Q&A General section for explanation). However you can still control which monitors receive the calibration, if that is of use to you. For example to only apply calibration to monitor 1, set MultiMonitorDeviceContext=1 in Config.ini. To only apply to monitors 1 and 3, MultiMonitorDeviceContext=1,3 To apply to 1 and 3, but unload monitor 2, MultiMonitorDeviceContext=1,-2,3 (minus means unload). Then restart the app and use the Hijack Test tool to confirm the intended monitors are(n't) turning green. Dithering is always applied to all monitors and that behaviour cannot be changed. If you only want to use the app for dithering, and use another app for calibration, just don't select any calibration files to begin with. If you have already selected calibration files and want to stop applying them, right click tray icon -> Factory Reset.

--------------------

Looking forward to messing with your app, it looks good. Thanks.
 

You really won't know until you try a longer cable, but they do recommend a max 2m cable. That doesn't mean longer cables won't work, but based on my experience running 8-10m HDMI 2.0 cables to my TV, their quality will be more of a factor. The HDMI 4 input poking out the back is probably your best bet for getting cables to reach; the port is the same as the rest, just in a different location.

Also when you get the adapter, contact Club3D asking for a newer firmware. It does a lot to make it work better but is still far from perfect.
 
I've been trying to use the new ColorControl version. Shutting the TV off works, but it never wakes. The log would indicate it's sending the magic packet fine.

I have WOL set up in HomeAssistant as well, and that does work.
 
Thanks for testing.
What LG model do you have? If the log is indicating it's sending the magic packet, there should also be a MAC-address in the log. Could you check if it's equivalent to the address that is listed under the connection properties on the TV?
By the way, maybe it's better to post issues with the app here: https://github.com/Maassoft/ColorControl/issues
I think most of the readers here will appreciate that :)
 
48CX. I fired up Wireshark and, despite what the log says, the app never sends the packet.

I am capturing packets from HomeAssistant, which works fine.

I do have quite a bit happening on this box network-wise; is it possible to specify adapter binding somewhere?
 
Do you see a line in the log similar to this:

Code:
2020-08-30 21:08:23.9285|DEBUG|ColorControl.WOL|Opening network device: \Device\NPF_{B19EF94B-392C-4A85-8094-9F4A9EA82131}

If not, maybe I'm filtering out too many network interfaces. Currently I skip all adapters that don't have a gateway address specified. Maybe I should not do that, but I wanted to omit all "virtual" adapters that likely wouldn't be connected with the TV anyway.
I can disable this filter in the next release. I don't think sending the magic packet over all network interfaces is going to cause a problem.
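For anyone curious what is actually being sent: a WOL magic packet is just six 0xFF bytes followed by the TV's MAC address repeated 16 times, broadcast over UDP (commonly to port 9). A minimal sketch, not ColorControl's actual code, with the MAC below as a placeholder; binding to a specific local IP is one way to force a particular adapter on a multi-homed box:

Code:
# Minimal Wake-on-LAN sketch (hypothetical; not ColorControl's implementation).
# Magic packet = 6 x 0xFF + target MAC repeated 16 times, sent as a UDP broadcast.
import socket

def send_magic_packet(mac, bind_ip=None):
    payload = bytes.fromhex("FF" * 6 + mac.replace(":", "").replace("-", "") * 16)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        if bind_ip:
            s.bind((bind_ip, 0))  # send out of the NIC that owns this IP
        s.sendto(payload, ("255.255.255.255", 9))

# Example: use the MAC listed under the TV's network connection properties.
send_magic_packet("AA:BB:CC:DD:EE:FF")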
 

Ok, thanks so much for the tip!
 
How can I get my CX to remember the last device it used? For some reason it randomly turns my PS4 on first when I power on my PC.
 
Can anyone speak to the severity of the near-black gamma shift issues w/ gsync enabled? I’m ok with minor issues here and there, but some comments I’ve read seem to indicate it’s pretty bad.

I'm close to pulling the trigger on a CX, with this being the only issue holding me back. I would be using a 48" exclusively for PC gaming with one of the upcoming Ampere cards.
 

I have yet to encounter this near-black issue. I've already done testing on it using in-game gamma adjustment sliders and it shows no raised near-blacks. However, now that I've been playing Guild Wars 2 quite a bit, what I HAVE been experiencing is some subtle flickering in that game in dark scenes at low frame rates. It kinda sucks because I can't run the game at high fps all the time due to poor optimization, so if this is a low-frame-rate thing and the display doesn't like sitting around the low end of the G-Sync window, where it has to transition in and out of LFC, then that's a real bummer. But as far as near-blacks being raised and staying raised with VRR on... no, I haven't encountered this.
 

I've also never seen this. I haven't gone explicitly looking for it, but given how black real black is, raised black would stick out in contrast to the bezel and I think I'd notice it.
 
I'll let you know once I get my Ampere card since I can't do VRR on this TV with a 1080 ti. I'll do some extensive testing, in a dark room, SDR mode.

HDR is too much of a mess for me to deal with for now and I have no clue how to calibrate it.
 