LG 48CX

From TFTC's review of the Samsung G7:

[attached screenshot from the review]


I believe this is the same kind of flickering that I am experiencing in Guild Wars 2 on my CX, where running at frame rates near the bottom end of the VRR window and transitioning in and out of LFC is causing it. Not sure if LG can fix this with a firmware update if the issue is gsync itself.
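For anyone unfamiliar with what I mean by transitioning in and out of LFC, here's a rough Python sketch of how I understand low framerate compensation to behave. The 40-120 Hz window and the multiplier logic are just assumptions for illustration, not anything LG or NVIDIA has documented:

Code:
# Toy model of low framerate compensation (LFC). The 40-120 Hz window is an
# assumption for illustration; the real multiplier logic is up to the
# G-SYNC / driver implementation.
VRR_MIN_HZ = 40
VRR_MAX_HZ = 120

def panel_refresh_hz(fps: float) -> float:
    """Refresh rate the panel would run at for a given game frame rate."""
    if fps >= VRR_MIN_HZ:
        # Inside the VRR window: refresh simply tracks the frame rate.
        return min(fps, VRR_MAX_HZ)
    # Below the window: repeat each frame enough times to land back inside it.
    multiplier = 2
    while fps * multiplier < VRR_MIN_HZ:
        multiplier += 1
    return fps * multiplier

# Hovering around the bottom of the window flips between the two behaviours:
for fps in (45, 41, 39, 38, 41):
    print(f"{fps} fps -> panel at {panel_refresh_hz(fps):.0f} Hz")

The point is that a frame rate hovering around the bottom of the window makes the panel jump between roughly 40 Hz and roughly 78 Hz refresh, and if the panel's gamma isn't identical across refresh rates, that jump shows up as brightness flicker.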
 
Just going by the avsforum posts it sounded like it was a thing. Vincent at HDTVtest mentioned it in some of his LG OLED videos.
According to all of those sources:

-- There was a near-black flashing problem.
-- LG released a near-black fix which applied dithering to the near-black levels (of color/black). I think it only applied/forced dithering to those values as a sort of "noise" to mask the flashing.
-- LG then realized that detail was being lost in scenes due to the dithering, so it switched to a black flattening method rather than a dithering method, in order to avoid the flickering without losing detail (a toy sketch of the difference between the two approaches is below this list).
-- Reports from Vincent of HDTVTest and some people on AVSForum's LG OLED gaming thread are that the black flattening "fix"/workaround is being bypassed when VRR is active, ever since VRR was added in (September?) 2019.
-- Some claimed this makes near-black areas, for example a white character on a black background, show a lightened black region around the white objects in the scene. If this were true, it would be localized raised blacks, so adjusting your gamma would not be a perfect way to compensate for it.
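Purely to illustrate the difference between the two approaches described above, here's a toy sketch. The near-black cutoff and the noise amount are completely made up, since LG hasn't published how either workaround actually works:

Code:
import random

# Toy illustration only: NEAR_BLACK_MAX and the noise amplitude are made up,
# and LG hasn't published how either workaround actually works.
NEAR_BLACK_MAX = 8  # assumed "near black" range, in 8-bit code values

def dither_workaround(level: int) -> int:
    # Mask the flashing by adding a little noise to near-black values.
    if 0 < level <= NEAR_BLACK_MAX:
        return max(0, min(255, level + random.choice((-1, 0, 1))))
    return level

def flatten_workaround(level: int) -> int:
    # Avoid the problematic range by mapping it to black (a guess at the mechanism).
    return 0 if level <= NEAR_BLACK_MAX else level

shadow_gradient = [0, 1, 2, 3, 5, 8, 12, 40]
print("original :", shadow_gradient)
print("dithered :", [dither_workaround(v) for v in shadow_gradient])
print("flattened:", [flatten_workaround(v) for v in shadow_gradient])

Whether the real flattening is as blunt as crushing everything to zero, I have no idea; this is just to show what the two kinds of workaround look like in principle.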

Not having one yet, I can't really weigh in one way or the other from personal experience. I do tend to trust Vincent Teoh of HDTVTest though, and he said the workaround was being bypassed with VRR active on the C9s and CXs.
If it's so minimal that you have to really look hard for it (outside of something like MisaSparkul's flashing issue), the raised-black "halo"/outline areas probably wouldn't bother me too much, but hopefully LG will fix it in a future firmware.

I don't know if you are following along in that thread or if it's just coincidence that you brought it up now since they are talking about it currently again here:
https://www.avsforum.com/threads/2020-lg-cx–gx-dedicated-gaming-thread-consoles-and-pc.3138274/page-59

edit: yes I see your same nickname questioning on that avsforum thread
 
When this near-black VRR issue was first brought up, people were making it seem like some super obvious, easy-to-spot issue that sticks out like a sore thumb and completely ruins the CX. If that were really the case then I'm pretty sure many of us would have noticed it already, this is [H] after all. The fact that it is hardly being brought up means either that the issue is actually very subtle and the ones who were first complaining about it were greatly exaggerating, or that not everyone has this problem. I already did the silly little test with gamma sliders where I launched the game with Instant Game Response turned off, adjusted the gamma slider so that the reference icon was just one tick away from being visible, flipped Instant Game Response back on, and the icon still remained invisible. So at this point I don't think I have the VRR problem, and probably neither do many others. However, I just watched Umbrella Academy S2 on Netflix, which is a Dolby Vision show, and I definitely have raised blacks with DV.

 
My understanding was that everyone has it, but the gamma settings you use can effectively hide it.

But I don't know, there is also a lot of variance from panel to panel.
 
My understanding was that everyone has it, but the gamma settings you use can effectively hide it.

But I don't know, there is also a lot of variance from panel to panel.

I don't think so. AVSForum shows near-black gamma being elevated when using 2.2 gamma and HDTVTest shows it being raised when using 2.4 gamma. Most people are going to be using 2.2 or 2.4 gamma on their CXs. I was using 2.4, but I can re-test it using 2.2. Regardless of whether everyone has it or not, it's pretty obvious that this issue is a lot more subtle than it was initially made out to be, as if it were so obvious and bad that it makes games "look like shit" and turns these into "OLEDs without true blacks", etc. Something that obvious would not go unnoticed around here.

[attached screenshots]
 
2.4 is way too dark since most content (except a few movies here and there) is mastered for 2.2 so it would be hard to notice with that I think.
 
2.4 is way too dark since most content (except a few movies here and there) is mastered for 2.2 so it would be hard to notice with that I think.

I was using 2.4 since RTings says that's the best target for dark rooms, which is what I use my CX in. But yeah I can redo the test with 2.2 gamma instead.


[screenshot of the RTINGS recommendation]


"a gamma of 2.4 is also appropriate for very dark rooms"
 
Gamma is a headache. You'll find different conflicting recommendations everywhere. I just like 2.2 as a middle ground personally. Even in a dark room with 2.4 there is too much loss of shadow detail for my liking. Except in some Blu-rays and games that look washed out with 2.2, lol. But those are rare.
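To put some rough numbers on why a slightly raised black floor is harder to spot at 2.4: this is just the standard power-law gamma curve applied to a few near-black code values, with an assumed 100-nit SDR white for scale:

Code:
# Relative luminance of a few near-black 8-bit levels under a pure power-law
# gamma. The 100-nit peak white is just an assumption to make the numbers concrete.
PEAK_NITS = 100.0

def nits(level_8bit: int, gamma: float) -> float:
    return PEAK_NITS * (level_8bit / 255.0) ** gamma

for level in (5, 10, 20):
    print(f"level {level:>2}: gamma 2.2 -> {nits(level, 2.2):.4f} nits, "
          f"gamma 2.4 -> {nits(level, 2.4):.4f} nits")

Those shadow steps come out at roughly half the brightness at 2.4 compared to 2.2, so any elevation in that range is going to be that much harder to see in a dark room.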
 
Looks like GAINWARD-branded 3090s and 3080s were posted to their website too early and subsequently removed. Screengrabs were captured, however. HDMI 2.1 is confirmed on both. Interestingly, the listing for the 3080 states two 8-pin connectors for the power requirement.

https://www.tweaktown.com/news/7485...rce-rtx-3090-3080-phoenix-official/index.html

I just can't wait for Nvidia to finally add 10-bit output over HDMI so everyone can finally shut up about it lol.
 
From TFTC's review of the Samsung G7:

[attached screenshot from the review]

I believe this is the same kind of flickering that I am experiencing in Guild Wars 2 on my CX, where running at frame rates near the bottom end of the VRR window and transitioning in and out of LFC is causing it. Not sure if LG can fix this with a firmware update if the issue is gsync itself.

On my C9 I once saw some flickering in the multiplayer menu of COD MW. I have never seen it while playing. People are complaining about it while playing the new Control DLC, for example, which has very dark scenes. And there too I see no flashing or flickering, even at different gamma levels.
 
Founders Editions are using the new 12-pin. The leaked non-Founders cards so far have been regular 2x 8-pin. I'm surprised they aren't doing 2x8 + 1x6-pin at least for a 350 W TDP card, if not 3x 8-pin. My 2080 Ti is 2x8 + 1x6, and there were 3x8 2080 Tis.

I'd expect a lot of AIB partners to go 2x8pin since that's more "normal".

Honestly, I forget what my PSU even has plug wise; I'm not even sure I have 2 8-pin connectors. :/
 
Founders Editions are using the new 12-pin. The leaked non-Founders cards so far have been regular 2x 8-pin. I'm surprised they aren't doing 2x8 + 1x6-pin at least for a 350 W TDP card, if not 3x 8-pin. My 2080 Ti is 2x8 + 1x6, and there were 3x8 2080 Tis.

I skipped the 2000 series entirely because to me it was incremental and thus a mediocre upgrade. Compare that to a die shrink, HDMI 2.1, and other features in the 3000 series. I'll be switching from two 1080 Ti SC Hybrids to a single 3090 Neptune if all goes well. Even if the Neptune took 400 W it would still be 100 W less than what my current cards could potentially use. I have four spare Molex connectors from those two cards, so as long as I can get an adapter if necessary I should be good.

Btw, Asus released some 3000 series specs with dual HDMI 2.1 outs:
https://videocardz.com/press-release/asus-announces-geforce-rtx-30-series

[screenshot of the Asus spec listing]

Nvidia thing starts shortly but will prob be a lot of bravado and walking down memory lane at first:
https://www.nvidia.com/en-us/geforce/special-event/
 
Guys, I can't decide on a 3080 or 3090 at launch. We know for a fact a 3080 Ti with 20GB will eventually come. I estimate that a 40% increase in FPS over my 2080 Ti is required to game carefree at 4K/60+ in basically every game. I'm banking on the 3080 being 25% faster than a 2080 Ti, so maybe a 3090 is necessary.

GPU launches are always so exciting.
 
I'm leaning towards the 3090 but there are still lots of unknowns. The huge size and not much higher wattage, combined with the fact that they insisted on how quiet their cooler is (they even call it "SILENCER"), definitely has my full attention. But it's 2x the price of the 3080; I still need to know the performance delta to make the most reasonable decision.
 
Pretty sure they omitted its performance because of how close it is to the 3080. Estimates of only 15-20% on top of the 3080 are probably true and would look horrible on a PR chart.
 
Pretty sure they omitted its performance because of how close it is to the 3080. Estimates of only 15-20% on top of the 3080 are probably true and would look horrible on a PR chart.
They didn't omit its performance -- they omitted the whole product SKU entirely. As of right now, the 3080Ti doesn't exist.
 
Pretty sure they omitted its performance because of how close it is to the 3080. Estimates of only 15-20% on top of the 3080 are probably true and would look horrible on a PR chart.

Doubt it. 20% more than a 3080 while charging double the price? More than double actually, 699 vs 1499, so I really doubt it's only going to be 20% more, but hey, we won't have to wait long to find out.



Yeah, based on this Digital Foundry video I doubt that even more. The 2080 Ti isn't that much faster than a 2080, and the jump from 2080 to 3080 is massive, which would make for a significant improvement over a 2080 Ti.
[benchmark charts from the Digital Foundry video]


A 2080 Ti is about what, 30% faster than a 2080? The 3080 seems to be around 70-80% faster than a 2080, so that would make it roughly 30-40% faster than a 2080 Ti; now add the additional performance gains of the 3090 and we are looking at a massive leap over the 2080 Ti here. As for the 3090 over the 3080, I kinda wanna say 25-30%. Of course we need a much wider set of games to be tested by a lot more people to be sure, but this is a great start.
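For what it's worth, the back-of-envelope math on those rough estimates comes out like this (2080 as the 1.00x baseline; these are the guessed figures from above, not benchmarks):

Code:
# Back-of-envelope relative performance using the rough estimates above
# (2080 = 1.00x baseline). Guesses, not measured numbers.
r_2080    = 1.00
r_2080ti  = 1.30   # "~30% faster than a 2080"
r_3080_lo = 1.70   # "~70-80% faster than a 2080"
r_3080_hi = 1.80

print(f"3080 vs 2080 Ti: +{(r_3080_lo / r_2080ti - 1) * 100:.0f}% "
      f"to +{(r_3080_hi / r_2080ti - 1) * 100:.0f}%")

So somewhere in the low-to-high 30s percent over the 2080 Ti, before even getting to whatever the 3090 adds on top.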

EDIT: Nvm, just realized the 3090 and 3080 share the same GA102 die with the same transistor count, unlike the 2080 Ti and 2080, which do not. Guess SoCali's original estimate of 15-20% might be right on the money then. I was under the impression that the 3090 used a bigger die than the 3080, like the 2080 Ti vs the 2080.
 
Man, the 3080/3090 is going to make this thing shine. Glorious times ahead. I can't wait to unlock the rest of the CX's abilities...and it's nice to be able to put the HDMI 2.1 thing to bed.
 
So far the Asus cards are the ones I'm seeing with 2x HDMI connections (and triple 8pin power), so an Asus card will be what I buy (and then a waterblock for it).
 
Curious about what you guys want to use the dual HDMI for? Is it a VR rig + CX?

Yup, this. Also, since I watercool, my only performance constraint is power (assuming the silicon is stable), and on Turing cards more power connectors = higher max TDPs, which was the real performance ceiling, so triple vs. double 8-pin may be meaningful.
 
I've decided I'm going to get the 3080 first, and if it isn't enough for the CX 55 4k120, I'll sell it and grab a 3090.

I'm with you. Doing a bit more reading, I realized the 3090 and 3080 share the same GA102 die, whereas the 2080 Ti and 2080 do not; one is TU102 while the other is the smaller TU104. This leads me to believe that the 3090 and 3080 will be a lot closer in performance, as SoCali pointed out, perhaps with only a 15% difference between the two. I think I'll just go with the 3080 as well unless the 3090 ends up being more than 20% faster.
 
I think I will go with an FE for the first time since it now has a 0 RPM mode and the rear fan does exhaust out of the case.

3080 + PS5 is the route I'll go. Based on Eurogamer's hands-on, it's roughly 33-48% faster than a 2080 Ti depending on the game. Spending 2x the price for another 20-25% sounds nuts. My brain tells me 3090 but my wallet says 3080. If only there were something in between.
 
If anyone's interested, I've created a new version of my ColorControl-app and it's available on GitHub now: https://github.com/Maassoft/ColorControl
This 1.1.0.0 version adds:
  • Automatic power on/off functionality for LG TVs
  • Extended NVIDIA controller context menu to allow setting color settings manually. Includes bit depth, format, dynamic range and color space. Note that not all combinations are allowed; invalid ones will result in an error message. Some may also result in a purple screen (RGB format with a YUV color space and vice versa).
  • Checkbox to fix blurry/rainbow fonts in Chrome. This is only necessary if you want to use ClearType in Windows, but grayscale anti-aliasing in Chrome. If you have ClearType disabled, just leave this checkbox unchecked.
  • Option to set a shortcut to the blank screen saver

I just fired this up to try out and it isn't able to find my TV. I have quite a bit going on network-wise, but the TV and computer are on the same subnet and can talk (I can ping it and I could control calibration stuff on it over the network via CalMan). I have a feeling it's a network binding issue, is there some way to bind the app to a specific NIC?

EDIT: Cancel this, I figured it out based on the Github issues and how it finds the TV based on the DLNA device existing. I added it and it's detecting now!
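In case anyone else hits the same thing, here's roughly the kind of check worth doing first: a bare-bones SSDP probe (the discovery protocol DLNA uses) just to see whether the TV answers the PC at all. This is a generic sketch, not anything from ColorControl's code, and the multicast/interface handling is simplified:

Code:
# Bare-bones SSDP (DLNA discovery) probe: send an M-SEARCH and print whoever
# answers. If the TV never shows up here, the app probably can't discover it
# either. Generic sketch, not ColorControl's actual discovery code.
import socket

MSEARCH = (
    "M-SEARCH * HTTP/1.1\r\n"
    "HOST: 239.255.255.250:1900\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"
    "ST: ssdp:all\r\n"
    "\r\n"
).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3.0)
# If you have several NICs, bind to the one on the TV's subnet, e.g.:
# sock.bind(("192.168.1.50", 0))  # hypothetical PC address
sock.sendto(MSEARCH, ("239.255.255.250", 1900))

try:
    while True:
        data, addr = sock.recvfrom(4096)
        print(addr[0], "->", data.decode(errors="replace").splitlines()[0])
except socket.timeout:
    pass

If the TV answers here but the app still can't find it, it's more likely an app/config issue than a network one.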
 
My brain tells me 3090 but my wallet says 3080. If only there was something in between.

There likely will be if you’re willing to wait on the 3080Ti/Super...but yeah, I feel yah. The 3090 isn’t a good value, but it’s the one that will push the most frames at 4K which matters more for the CX. I’m still leaning toward the 3080 since even that will be a huge upgrade from my 1080Ti but we will see after the reviews come out.
 
The consensus seems to be that the 3080 is the direct replacement for the 2080 Ti and the 3090 is akin to the Titan. A 3080 Ti or a 3080 Super could happen (probably next year), but the 3090 will be the top card at $1500. So if you want the best, the 3090 is it. They are also saying that the jump from the 2080 to the 3080 is the same as the jump from the 980 Ti to the 1080 Ti. It's that big of a deal. I guess Nvidia wants to make sure they have something better than AMD's Big Navi performance-wise.
 
I just fired this up to try out and it isn't able to find my TV. I have quite a bit going on network-wise, but the TV and computer are on the same subnet and can talk (I can ping it and I could control calibration stuff on it over the network via CalMan). I have a feeling it's a network binding issue, is there some way to bind the app to a specific NIC?

EDIT: Cancel this, I figured it out based on the Github issues and how it finds the TV based on the DLNA device existing. I added it and it's detecting now!

What did you do exactly to make it work?
 
This is a community service announcement for LG OLED C9 owners. Turn ... OFF ... automatic firmware updates NOW.

With the 3xxx series launching in a few short weeks, I'm seeing speculation on various forums that LG may nerf, disrupt, change, or implement "fixes" to these 2019 sets and their full 48Gbps HDMI 2.1 performance in some as-yet-unknown ways. It's absolutely better to be safe than sorry.

This logic of course stems from the fact that LG could have added 4K@120Hz 4:2:2 support along with FreeSync support to the 2019 C9, etc., but for "market"-focused reasons did not. This is, at the end of the day, an unfair game LG has chosen to play with its 2019 sets and owners. Simply unacceptable.

It's being suggested that now that HDMI 2.1 is launching on both Nvidia and AMD cards, LG may pull off a "firmware" update to give their newer 2020 sets clear performance and feature advantages over the 2019 sets.

It's generally understood and accepted now within the community of large format display users that the C9 has some advantages over the CX series.

I not only turned off automatic firmware updates but I have also blocked my LG C9 at the router level. No data for you!
 