LG 48CX

mirkendargen confirmed on the previous page that the Zeskit 6’ from Amazon works...there are probably others, though. Wish we had a confirmed list of brands and lengths but my plan was to try the Zeskit if my current high speed cable doesn’t work. I think it’s a Mediabridge and it solved the sparkling issues that I was having at 4K/60Hz. Worth a shot I suppose.
Yup I've never had any screen blanking with the Zeskit. I had the white column artifacts and thought that was a cable problem, but it was a false alarm and caused by the HDMI port not being set to PC (forgot to set it again after changing GPUs).
 
mirkendargen confirmed on the previous page that the Zeskit 6’ from Amazon works...there are probably others, though. Wish we had a confirmed list of brands and lengths but my plan was to try the Zeskit if my current high speed cable doesn’t work. I think it’s a Mediabridge and it solved the sparkling issues that I was having at 4K/60Hz. Worth a shot I suppose.
I actually bought the 10ft version of this Zeskit when my 3080 first arrived over a month ago and it wouldn't work at 4K/120Hz. I'm now using a random cable from a box full of cables, but just today it started blanking out, so I'm not sure if it's the cable or the drivers. Are you guys using the 6.5ft one?

My PS5 should be here Friday which ships with a supposedly legit HDMI 2.1 cable so I'll try that first.
 
Personally, I think the VRR raised blacks is a pretty minor issue. If you are near 120Hz the gamma shift will be extremely minimal. I've never personally noticed any problem across multiple games and wouldn't have been aware of it if I hadn't seen the reports. I haven't tried this yet, but if your game is running at 60 fps like Control, can't you just run your monitor at 60Hz so the gamma is calibrated correctly? Even with this issue, you won't find a monitor with better PQ right now.

Now the g-sync stuttering is a real problem that needs to get sorted. I tried the workaround suggested earlier (uncheck "enable for selected display"), but this actually seems to disable g-sync completely. It's smooth but I get tearing as a result.

I also find the rtings input delay measurements pretty weird. Why would VRR have higher input lag than non-VRR? They must have messed that up somehow. Anyways, I've found for competitive gaming that you can just run 1440p or 1080p non-scaled and the input delay is non-existent (but I can't really notice it at 4k either).
 
If you're looking for a cable, I can confirm the KabelDirekt UltraHD 3m cable works well.
 
Last edited:
Personally, I think the VRR raised blacks is a pretty minor issue. If you are near 120Hz the gamma shift will be extremely minimal. I've never personally noticed any problem across multiple games and wouldn't have been aware of it if I hadn't seen the reports. I haven't tried this yet, but if your game is running at 60 fps like Control, can't you just run your monitor at 60Hz so the gamma is calibrated correctly? Even with this issue, you won't find a monitor with better PQ right now.

Now the g-sync stuttering is a real problem that needs to get sorted. I tried the workaround suggested earlier (uncheck "enable for selected display"), but this actually seems to disable g-sync completely. It's smooth but I get tearing as a result.

I also find the rtings input delay measurements pretty weird. Why would VRR have higher input lag than non-VRR? They must have messed that up somehow. Anyways, I've found for competitive gaming that you can just run 1440p or 1080p non-scaled and the input delay is non-existent (but I can't really notice it at 4k either).
Running at 60hz does not fix it. The panel still runs 120hz internally as it does at any refresh rate (don't ask me how the 100hz mode works though, as that's not a multiple of 120).

Anyway I agree it's pretty minor, most people will not notice it - but they might notice flashing in dark games with a framerate that fluctuates very wildly and drops low enough. Even with VRR I frequently cap my frames way below the refresh rate to get a more consistent experience anyway - but of course people are free to do it their own way. For example in the latest Assassin's Creed games or Control I cap my frames at 60 even though some scenes will give me 100+ frames - because the average uncapped framerate is around 60 (with some dips below that) so I prefer to make it stable - this way I get used to a certain feeling for that particular game and it's more immersive. Can also help with noise and heat in summer :)

The workaround from the German guy posted earlier works superbly for the games you know you'll be playing at around 60fps (hello Control, hello AC). It doesn't work too badly at 100+fps either; since it makes near-black tones darker rather than brighter, it never looks ugly (but yeah, at 100-120fps you'll crush some dark details).
 
Last edited:
I'm pretty sure the gamma is correctly calibrated at 60Hz, isn't it? If I game at 60Hz gamma looks fine. The issue is running at a particular refresh rate with VRR enabled - the gamma won't calibrate correctly for the actual refresh rate as it drops - it is stuck at the max refresh. So I'd expect 60Hz VRR to work better than 120Hz VRR since it will use the 60Hz gamma curve. But you're saying with VRR enabled, it always uses the 120Hz gamma curve? That sounds like something that should be easy to fix though, but maybe it's just the way the VRR module works. If so that would be pretty annoying. I will have to test this out myself more to see.
 
LG explicitly said gamma is calibrated at 120hz only. It was posted earlier in this thread.

edit: https://www.oled-a.org/lgersquos-48rdquo-oled-attracting-game-monitor-buyers_9620.html

And yeah I have tested 60hz VRR myself (just like the German guy did in his video) and the gamma issue is obvious if you know where to look.

If that's true it means any 60 fps content would have gamma shift issues, wouldn't it? Most consoles run at 60Hz Vsync for example. These have raised blacks as well?
 
If that's true it means any 60 fps content would have gamma shift issues, wouldn't it? Most consoles run at 60Hz Vsync for example. These have raised blacks as well?
I think only in combination with VRR though.

If not, it would also mean it has to be broken on basically everything that runs at 60Hz, including Netflix, Prime and other apps that don't do a refresh rate switch..

Or it may only be a problem in game mode.. then you could work around it by using another picture mode for non-lag-sensitive single-player games..
 
No it's only VRR. There is zero issue with fixed refresh rate mode (any hz & any fps). No matter the refresh rate you select, the TV runs internally at 120hz (this is common practice, it's not anything new). But in VRR it can't always run at 120hz, that's when the issue occurs.
 
If the panel is only refreshing at 60hz, and the gamma is fine, that means it's not just calibrated for 120Hz. It must be calibrated to run at these other refresh rates already, or we would see gamma shift problems at all refresh rates not 120Hz. Unless you're saying that it refreshes the pixels at 120Hz even if the input is 60Hz/24Hz, etc. but I don't think that's the case. That's why I was thinking it seems the problem is directly related to what refresh rate you run VRR at, so running VRR at 60Hz in theory should use the 60Hz gamma curve. Unless there is just something buggered w/ VRR completely so it always uses the 120Hz gamma curve regardless of what refresh rate you set the panel to.
 
Yeah without VRR the TV is always processing the image (refreshing the pixels as you said) at 120hz, no matter the refresh rate you select. So the optimized 120hz gamma curve fully applies.

But with VRR enabled the actual refresh rate will change and can drop low enough to cause problems. Edit: this guy explains how VRR works better. Since the issue has been shown on LCDs, it appears the only reason we are talking about it now is because OLED blacks (and by extension near-blacks*) are so low that for the first time the issue is becoming visible. (not sure though)

*for the record absolute blacks are not affected because the pixels are just turned off then. It's the near-black shades that are affected because they require a very subtle and very precise amount of voltage to look correct on OLED panels.
 
Last edited:
With HDMI 2.1, is it now preferable to set your HDMI input to PC mode vs game mode? I remember over the summer people were recommending Game Mode with HDMI 2.0, but I don't remember why...
 
Anyways, I've found for competitive gaming that you can just run 1440p or 1080p non-scaled and the input delay is non-existent (but I can't really notice it at 4k either).

If the input lag is 13.x ms or less on the LG CX, and you run frame rate averages like:

------------------------
100fpsHz average
-------------------------
(70) - 85fpsHz <<<<<<< 100 ave >>>>>>>> 115 (130) fpsHz

14.3ms - 11.8ms <<<<< 10ms ave >>>>>>>> 8.7ms (117fps cap = 8.5ms)

------------------------
75fpsHz average
------------------------
(40) - 60fpsHz <<<<<<<< 75 ave >>>>> 90 (105) fpsHz

25ms - 16.7ms <<<<<<<< 13.3ms >>>>> 11.1ms ( to 9.52)

--------------------------

At a 100fpsHz average, 2/3 of your frame rate graph is being delivered at nearly your 13.x ms input lag in local single player games.

At a 75fpsHz average, that average is practically even with the input lag of the LG CX, so again 2/3 of the graph is delivered at or slower than the input lag in local single player games.

Delivered at, or especially slower than, the input lag means you aren't seeing any new game world / action state "slices" any sooner than your input lag or report rate, and that doesn't even take into account your reaction time, which is always a non-zero number no matter how good you think you are.
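If you want to sanity check those frame time numbers, here's a rough Python sketch of the same arithmetic (the fps values and the ~13ms CX input lag figure are just the assumptions I used above, not measurements):

Code:
# Rough sketch of the frame-time-vs-input-lag comparison above.
def frame_time_ms(fps):
    """Time to deliver one frame at a given frame rate, in milliseconds."""
    return 1000.0 / fps

display_input_lag_ms = 13.0  # assumed ~13.x ms LG CX input lag from this post

for fps in (70, 85, 100, 115, 117, 40, 60, 75, 90, 105):
    ft = frame_time_ms(fps)
    relation = "faster than" if ft < display_input_lag_ms else "at or slower than"
    print(f"{fps:>3} fps -> {ft:5.1f} ms per frame ({relation} the input lag)")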

Since you said competitively I'm assuming you mean online gaming rather than local single player gameplay leaderboards. The whole chain of online latency/player pings and the server code interpolation means you would not see a new world action state "slice" for many frames later, dwarfing your local input periods.

Your interp value of 15.6ms on a 128 tick server (or much, much worse outside of the best 128 tick games) is alone larger than your input lag on a LG CX, before adding your ping time. It's not cut and dried, with latency compensation sort of rubberbanding things, and with game physics quirks like World of Warcraft's spell casting running at a lower tick rate than the rest of the game, but just from the raw numbers you can see that you would be way behind your input lag from any competitive-edge perspective in online gaming, even before considering everyone else's latency chains resolving (both teammates and enemies) and the quality of the game's network code. If you were playing competitively in an arena or at a party all on the same LAN on a very high tick server that would be different, but marketing "360Hz" 4ms-input-lag monitors as if online game matches are delivering and receiving at that rate is false advertising.

https://hardforum.com/threads/lg-48cx.1991077/post-1044799485
For example a good server game with 128 tick servers and using interp ratio 2 (to avoid huge 250ms hits on missed packets) would have 15.6ms interpolation + 25ms to 40ms (your ping). So say 41ms to 56ms just for your own actions, not counting lag compensation between other players. Let's say 56ms for now on the higher 128 tick servers (though most games use much lower tick rates). 56ms is 6.6 frames of time on a 120hz monitor at 117fps solid (8.5ms per frame). So you aren't seeing new world updates for 6 or 7 frames, maybe worse depending on how that syncs with your next local (8.5ms) frame draw.

On a more traditional 64 tick, 22 tick, or 12 tick online game the numbers go up by a lot:

128tick at interp_2 = 15ms + (25 - 40ms) ping = 40ms to 55ms ~~~~~~~~~> 5 to 7 frames before new action/world state data is shown (at 117fps solid)
64 tick at interp_2 = 31.2ms + (25 - 40ms) ping = 56ms to 71ms ~~~~~~~~> 7 to 8 frames
22 tick at interp_2 = 90ms + (20 - 40ms) ping = 110ms to 130ms ~~~~~~~~> 13 to 15 frames
12 tick at interp_2 = 166ms + (20 - 40ms)ping = 186ms to 206ms ~~~~~~~> 22 to 24 frames

If you set interp_1 then the tick interpolation time would be halved (minus 1 frame, 2 frames, 5 frames, 10 frames respectively) - but any lost packet at all would hit you with a 250ms delay /8.5ms per frame = 29 frames.
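Quick and dirty Python version of that math, if anyone wants to plug in their own numbers (the 25-40ms ping range, interp ratio 2 and the 8.5ms frame time from a solid 117fps cap are just the assumptions from this post):

Code:
# Rough sketch of the tick rate / interpolation / ping math above.
def interp_ms(tick_rate, interp_ratio=2):
    """Client view interpolation time: interp_ratio / updaterate, in ms."""
    return interp_ratio / tick_rate * 1000.0

def frames_behind(tick_rate, ping_ms, frame_ms=8.5, interp_ratio=2):
    """Local frames drawn before a new world/action state slice shows up."""
    return (interp_ms(tick_rate, interp_ratio) + ping_ms) / frame_ms

for tick in (128, 64, 22, 12):
    print(f"{tick:>3} tick: interp {interp_ms(tick):5.1f} ms, "
          f"{frames_behind(tick, 25):4.1f} to {frames_behind(tick, 40):4.1f} "
          f"frames behind at 117fps solid")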

Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)

For reference, tick rates of some common online games:

Valorant: 128tick
Specific paid matchmaking services like ESEA: 128tick
CSGO ("normal" servers): 64tick
Overwatch: 63
Fortnite: 60 (I think, used to be 20 or 30)
COD Modern Warfare mp lobbies: 22tick
COD Modern Warfare custom lobbies: 12tick


Some have guessed that League of Legends tick rate is around 30.
ESO PvE / PvP: ??
WoW PvE / PvP ?? .. World of warcraft processes spells at a lower 'tick rate' so is a bit more complicated, but overall the tick rate probably isn't that great.
https://us.forums.blizzard.com/en/wow/t/is-classic-getting-dedicated-physical-servers/167546/81




https://happygamer.com/modern-warfa...or-a-game-that-wants-to-be-competitive-50270/
 
Last edited:
I have a stupid question or comment. Through the years, I have always gamed at 1080 and 1440p without Gsync and freesync. Sometimes when playing on a 75hz refresh monitor (most professional monitors are 75hz), I would turn on Vsync only. On other gaming monitors (144-165), I simply leave Gsync and/or freesync off.
You guys state there is flickering with Gsync/Freesync? Why not just leave it off? Am I missing something?
Surely your graphics cards can push 120 fps without breaking a sweat? I do not understand the problem?

Another point. When you activate 1440p in windowed mode, what does the screen size become? For example on a 32" 1440p monitor when you activate a lower resolution (1080) the screen size goes from 32" to 24". ??
 
Last edited:
4K is super demanding, so no.. on a lot of modern demanding games you are lucky to get a 60 or 75fps average with all the graphical bells and whistles turned on, and that is with a top end GPU. With your scenario, you would have to push 120fps SOLID.. that is, your minimum frame rate would have to be 120fps to avoid any frame rate fluctuation. Fluctuating frame rates with no compensating tech cause judder, and going over the top Hz of the monitor in (actual ranging, not just average) frame rate causes tearing. Using V-sync adds a lot of input lag, and double-buffered V-sync can cut your frame rate in half when your frame rate can't match your monitor's max refresh rate.
 
Last edited:
With HDMI 2.1, is it now preferable to set your HDMI input to PC mode vs game mode? I remember over the summer people were recommending Game Mode with HDMI 2.0, but I don't remember why...
These are different things. You want the picture mode to be "Game" still (there is no "PC" here), and you want the label on the input in the "Home" menu to be "PC".
 
These are different things. You want the picture mode to be "Game" still (there is no "PC" here), and you want the label on the input in the "Home" menu to be "PC".

Yes, thank you. I meant PC vs Console in the HDMI input screen. Thanks!
 
@Whoever just deleted the gsync-off-no-tearing post: If you turn gsync off and vsync on you'll just have VSync at > 120Hz.
 
This monitor is pretty amazing. With G-sync off (to avoid the stuttering), I can barely notice the tearing. Is that because of the way OLED refreshes the screen? G-sync on and off is incredibly subtle (but I can notice it if I actively look for it). I'm guessing this is why people think the workaround of toggling "enable for selected display" actually works, because they don't even notice the tearing.

Compare this to LCDs - where I found the tearing to be SUPER noticeable even at high refresh rates of 165Hz, and games unplayable w/o g-sync. I'm not even sure I need g-sync on this screen.

Regarding input lags and online multiplayer - your latency etc. doesn't really matter when almost all games implement some form of rollback netcode. You are basically playing locally with rollback netcode, so all that really matters is your local input delay.

Rollback aka rubberbanding is attempting to keep you up to the tick rate to compensate for ping times; it's not compensating to keep everyone else up to your local fps-Hz rate. At best you would theoretically get the tick rate of the server, which with interp 2 on a 128 tick server is 15.6ms, but that's not even how the formula works since your ping time contributes to it. More likely, in the end you are getting several frames of your local game between each server state update. For example the game server could be sending you game action slices only every ~55ms considering interpolation time + ping, even if you are running 120fpsHz or 360fpsHz. On servers with worse tick rates than 128 tick, which is most games, the interp time itself is much worse to start with before your ping time is factored in.

So you are running 15ms interp + your ping time on the best 128tick servers, with some averaging done by the client between 3 frames (current plus 7ms x 2 at interp2) that has been "buffered". By comparison, a LAN game typically has a latency of 3ms to 7ms.


If you’re now sitting there and asking yourself how only 128 or much worse 64 ticks/s produce a playable game that is not stuttering crazily when receiving only 128 updates a second, you deserve a cookie. The answer is interpolation. When your game client receives a package from the server, it doesn’t simply show you the updated game world right away. This would result in everyone breakdancing in 128 or 64 tick intervals across the map. Rather, it waits a set interpolation time called “lerp”, whose name probably originated by a network engineer stepping on a frog.

During this time, a set number of further packages arrived on the client’s side containing more updated ticks from the server. Through these ticks, the client is able to interpolate what has happened between these two points in time and display this assumption to the player (don’t get mad yet). Interpolation time is determined by the simple equation


cl_interp = cl_interp_ratio / cl_updaterate


So in our 128 tick server example from above, on otherwise default settings this would mean: you receive a new packet every 7.8 Milliseconds (cl_updaterate 128) but the server waits until you received a third packet (cl_interp_ratio 2) before displaying the information, making the interpolation time 15.6 Milliseconds for this example. On the other hand, a client running cl_interp_ratio 1 is presented with a renewed state of the game every 7.8 Milliseconds – assuming all other hardware and software variables are optimal.


Of course, from everything we’ve learned in our long online gaming history we assume that a lower number in front of the ms sign is always preferable. But, you already guessed it, things aren’t so easy this time around as bad connections and lag compensation come into the picture.


Again, the people with unreliable connections are better off accepting higher interp times, as the game client requires a new package of information from the server precisely at the interpolation time to update your game. If the second package is lost, the client waits 250ms for another package before flashing that red warning message in the top right corner of the screen.


For someone who experiences any packet loss at all, it is safer to set cl_interp_ratio to 2, especially since you regain the "lost" time in the lag compensation.

.... So you are loading multiple ticks sent (with your latency), which on the best tick servers means a new packet + 7.8ms x 2 = ~15.6ms of tick frames, then figuring out what happened between them. This is like buffering a few frames and making a frame out of the difference between them.


Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)

So you pull the trigger or peek around a corner = current server time (say 100min:00sec:00ms) minus (packet latency (25ms to 40ms typical) + client view interp (15ms)) =

128tick at interp_2 = 15ms + (25 - 40ms) ping = 40ms to 55ms ~~~~~~~~~> 5 to 7 frames before new action/world state data is shown / new action is registered (at 117fps solid)
64 tick at interp_2 = 31.2ms + (25 - 40ms) ping = 56ms to 71ms ~~~~~~~~> 7 to 8 frames
22 tick at interp_2 = 90ms + (20 - 40ms) ping = 110ms to 130ms ~~~~~~~~> 13 to 15 frames
12 tick at interp_2 = 166ms + (20 - 40ms)ping = 186ms to 206ms ~~~~~~~> 22 to 24 frames


Lag Compensation


The inevitable conclusion from the preceding segment and also the fact that all players on the server have a ping is, that everything you see on your screen has happened on the server already a few Milliseconds in the past.


Let’s leave any philosophical and Einsteinian implications of this to the side for the moment to focus on how a playable game is produced from this situation in which you don’t have to pre-aim your crosshair in front of the enemy.


The process responsible for this is lag compensation in which the server accounts for both ping and interpolation timings through the formula:


Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)
Put into English, this means that once you pull the trigger and this information package gets sent to the server, the server then goes back from the current server time (the time the pull-the-trigger package was received) by your ping plus your interpolation time. Only then is it determined whether the client hit the shot or not.

... So the rollback you are talking about
-starts from the current server time (the server time logged after your latency sent packet was received)
-minus
- your ping time (25ms - 40ms) plus your interpolation time (128tick = 15ms, 64 tick = 31ms, 22 tick = 90ms, 12 tick = 166ms)

There's no factoring in of 120fpsHz's 8.3ms frames or 360fpsHz's 2.7ms frames. You are bound by the tick rate. The best effective tick is closer to 60fpsHz, so higher fpsHz isn't doing anything to better that. If anything your latency is the factor (at least until there are higher tick servers someday).
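For anyone who wants to poke at the lag compensation formula itself, here's a small Python sketch; the 30ms ping and 128 tick / interp ratio 2 numbers are just example assumptions, not measurements:

Code:
# Sketch of the lag compensation formula quoted above:
#   Command Execution Time = Current Server Time - (Packet Latency + Client View Interpolation)
def command_execution_time_ms(current_server_time_ms, packet_latency_ms,
                              tick_rate, interp_ratio=2):
    """Timestamp the server rewinds to before deciding whether your shot hit."""
    client_view_interp_ms = interp_ratio / tick_rate * 1000.0
    return current_server_time_ms - (packet_latency_ms + client_view_interp_ms)

# Trigger packet received at server time t = 0 (for simplicity), 30ms ping, 128 tick:
rewind_to = command_execution_time_ms(0.0, 30.0, 128)
print(f"Server evaluates the shot about {abs(rewind_to):.1f} ms in the past")  # ~45.6 ms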

Imagine you want to reach the boxes on top mid on Mirage while an AWP is perched in window. You jump out of the cover of the wall, fly and land safe behind the boxes. In the moment you land, the AWP shot goes off and you somehow die and end up right on the edge of the boxes, a good half meter away from where you stood on your screen. In the German scene, you would have just been “interped”, even though “being lag compensated” might be the more accurate term (I acknowledge that is way more clunky and less easy to complain about).

As the peeking CT moves into the gap of the double doors, his lag compensated hitbox and model are still behind the door, giving the Terrorist no real chance to respond. However, it is imperative in this scenario for the peeking player to actually hit (and in most cases kill) his opponent in the time it takes the server to compute all executed commands and the appropriate lag compensation. Of course, the showcased example is taken with a ping of 150ms, which is unrealistically high for most people, artificially lengthening that time.


Should any of you reading this have the good fortune to play on LAN one day, you should keep in mind that peeker's advantage is solely dependent on lag compensation, a big part of which is made up by a player's ping. With the typical LAN connection ping of 3-7ms, peeker's advantage is practically non-existent. Together with many other factors, this is one of the reasons why CS:GO has to be played differently in certain aspects on LAN than on the internet.

--------------------------

https://www.reddit.com/r/Overwatch/comments/3u5kfg/everything_you_need_to_know_about_tick_rate/

Note: In an example where two players shoot each other, and both shots are hits, the game may behave differently. In some games, e.g. CSGO, if the first shot arriving at the server kills the target, any subsequent shots by that player that arrive at the server later will be ignored. In this case, there cannot be any "mutual kills", where both players shoot within 1 tick and both die. In Overwatch, mutual kills are possible. There is a tradeoff here.

  • If you use the CSGO model, people with better latency have a significant advantage, and it may seem like "Oh I shot that guy before I died, but he didn't die!" in some cases. You may even hear your gun go "bang" before you die, and still not do any damage.
  • If you use the current Overwatch model, tiny differences in reaction time matter less. I.e. if the server tick rate is 64 for example, if Player A shoots 15ms faster than player B, but they both do so within the same 15.6ms tick, they will both die.
  • If lag compensation is overtuned, it will result in "I shot behind the target and still hit him"
  • If it is undertuned, it results in "I need to lead the target to hit them".

So you are never being fed new information, or having your new actions register, fast enough (compensated across [[ two received 7.8ms tick action slices (~15.6ms) + your local current frame state ]] on the client) to make 13.x ms of input lag a competitive factor in online gaming imo, even with a ping of zero and especially with online latency factored in.
 
Last edited:
4K is super demanding, so no.. on a lot of modern demanding games you are lucky to get a 60 or 75fps average with all the graphical bells and whistles turned on, and that is with a top end GPU. With your scenario, you would have to push 120fps SOLID.. that is, your minimum frame rate would have to be 120fps to avoid any frame rate fluctuation. Fluctuating frame rates with no compensating tech cause judder, and going over the top Hz of the monitor in (actual ranging, not just average) frame rate causes tearing. Using V-sync adds a lot of input lag, and double-buffered V-sync can cut your frame rate in half when your frame rate can't match your monitor's max refresh rate.
This is not true at all; V-sync will not add any input lag if you cap your framerate correctly. You can find multiple tests about that. V-sync only adds input lag when it's used without G-sync.
https://hardforum.com/data/attachment-files/2019/01/298588_Q0SEx8F.png
 
This is not true at all; V-sync will not add any input lag if you cap your framerate correctly. You can find multiple tests about that. V-sync only adds input lag when it's used without G-sync.
https://hardforum.com/data/attachment-files/2019/01/298588_Q0SEx8F.png

Correct.. I was replying to this
I have a stupid question or comment. Through the years, I have always gamed at 1080 and 1440p without Gsync and freesync. Sometimes when playing on a 75hz refresh monitor (most professional monitors are 75hz), I would turn on Vsync only. On other gaming monitors (144-165), I simply leave Gsync and/or freesync off.
You guys state there is flickering with Gsync/Freesync? Why not just leave it off? Am I missing something?
Surely your graphics cards can push 120 fps without breaking a sweat? I do not understand the problem?

Another point. When you activate 1440p in windowed mode, what does the screen size become? For example on a 32" 1440p monitor when you activate a lower resolution (1080) the screen size goes from 32" to 24". ??
 
Last edited:
No issue so far at 4K 120hz 4:4:4 using the following combination on my 3080 and 48CX setup
1) IBRA 2.1 HDMI 10ft Cable
2) Monoprice 48Gbps HDMI 3ft extension
 
Just posted this on the AVS CX gaming thread, but I've been following both of these threads since purchasing the CX.

I’ve had my CX for a week now and have done a lot of messing around with a 3070, and here is my experience.

I am using the Zeskit 6ft HDMI 2.1 cable and I do get random lost-signal issues here and there. It is usually when I am changing Nvidia control panel settings: 4:4:4 to 4:2:0, 10bit to 8bit, or G-sync on or off. Sometimes it comes back after 15 to 20 seconds, sometimes I can power the CX off and on and it returns. Unplugging and replugging the HDMI always makes the signal come back. This doesn't happen every time and it is fine more often than not. It doesn't happen during normal steady use, so it shouldn't be a problem long term, especially once the G-sync stutter fix firmware is out and I no longer need to use 4:2:0 8bit to get non-stuttering G-sync.

As for black levels and VRR, it isn't only an issue with flickering... the flicker is just more noticeable when the frame rate dips for whatever reason. In Control, pausing the game for instance, or the frames dipping in a dark area, the gamma shifts and it is annoying. BUT even though with VRR I can get a stable enough 60fps at 4K/DLSS, and yes the game looks great and the picture is overall awesome, you are still dealing with a noticeable CONSTANT raise in gamma.

You can easily see the difference in the same scene by popping out and switching G-sync off. Popping back into the game, only then did I notice it was back to perfect black. It's not something you easily pick up on; it took a pitch black room for me to even be able to pick it out. Fired up Gears 5 and got the same result. Playing during the day I don't think I would notice, and with VRR smoothing things out it's a trade-off I will make for now.

Testing what people have said about dropping brightness to compensate...
With the raised black in VRR I could see steps down in what should have been perfect black all the way to 47; going to 46 is when I could no longer see the true black area of the screen get any darker. Switching back and forth again, VRR off vs on, the blacks are now right where they need to be. Of course at the expense of some black crush. Nothing that's a dealbreaker, but still a shame to have to make compromises to get an accurate picture because features the TV is advertised as having are not where they need to be.

I am going to play around and find something that works for me, all issues aside the CX is beautiful and it is a joy to play on. I think console users should not be worried. Most games are going to be at a more steady 30/60 and in most cases you can have VRR off, have a beautiful image and still enjoy all the benefits of an oled.

I know there are journalists covering this issue in some news stories, but I wish more of the outlets all touting this as the best/perfect next gen gaming TV would update them and let people know of the issues. Even the RTINGS CX vs Vizio OLED comparison video posted yesterday glosses over the issue completely. Don't quote me, but when they get to VRR on the CX they say it works perfectly without a hitch.

That all being said I still love it and look forward to the gsync firmware. The VRR / gamma issues are a bummer but I can do without it and still be extremely happy to own this TV.
 
Been testing my RTX 3090 FTW3 Ultra that I got yesterday and I've been really enjoying it. I think there might still be some firmware issues because I have been experiencing cut-outs when changing the color or G-Sync settings on the TV. Otherwise I haven't noticed the raised blacks on the VRR or any stuttering during my usual routine. If anything, everything looks quite amazing, and a great step up from my C6 OLED.
 
Those of you with HDMI 2.1, are you going with RGB 10 bit or 444? What are you setting your dynamic output in NVCP as well as black level for either?
 
Those of you with HDMI 2.1, are you going with RGB 10 bit or 444? What are you setting your dynamic output in NVCP as well as black level for either?
I'm at 10 bit 444 with HDR enabled. OLED light is set at 65.

I recently tried switching the Input (via the Home Panel) to PC, and it made my text blurry. Previously the text appeared crisp.

Not quite sure what the issue is. I have the Screen Shift off. I suppose I'll keep playing with the settings.
 
Some guy on reddit says we can get the new firmware through the "engineer mode", but I don't know exactly what that means ???
He also says the new firmware corrects the stuttering issue with G-sync.
So can anyone explain to me how, and whether it is safe?
I have an extra code for Watch Dogs: Legion (redemption on GeForce Experience >> must have a 3080 or 3090); if you want to trade, message me.
 
I'm at 10 bit 444 with HDR enabled. OLED light is set at 65.

I recently tried switching the Input (via the Home Panel) to PC, and it made my text blurry. Previously the text appeared crisp.

Not quite sure what the issue is. I have the Screen Shift off. I suppose I'll keep playing with the settings.
If PC mode is off then you are not getting 4:4:4 output; the display will turn it into 4:2:2. If text becomes blurry make sure pixel shift is off. Make sure Quickstart+ is off too, or pixel shift will turn itself back on. It's a bug LG hasn't bothered to fix.

Some guy on reddit says we can get the new firmware through the "engineer mode", but I don't know exactly what that means ???
He also says the new firmware corrects the stuttering issue with G-sync.
So can anyone explain to me how, and whether it is safe?
I have an extra code for Watch Dogs: Legion (redemption on GeForce Experience >> must have a 3080 or 3090); if you want to trade, message me.
This can be done via the service menu, which requires the service remote or Android phone with IR blaster and LG service remote app. Poking in the service menu is not safe.
 
Screen Shift doesn't seem to zoom the image anymore even on 03.11.25 - it only pans the image.
Can confirm so at least they fixed that. I guess the display not retaining the screen shift off state when quickstart+ is enabled is still broken though?
 
I ended up turning Quickstart+ off and leaving it off. Yeah the TV boots noticeably faster with it enabled, but I can tolerate a few extra seconds per "session" for the other benefits it brings leaving it disabled. Fixing that "bug", if that's what it is, was really low on my priority list compared to the VRR/G-Sync stuff. I'm glad that LG remains committed to working hard on steady firmware updates. Someone said it earlier but the HDMI 2.1 implementation combined with all of the various gaming features was uncharted territory and I'm not surprised that everything wasn't working 100% perfectly out of the box with the 3xxx series cards. That being said, it was STILL a best-in-class display before, and they've worked to deliver the features as promised. Like a couple of people have said, anything else is a nitpick at this point. I bought the 48CX at launch and feel that it was money well spent (and due to GPU supply issues I'm still not able to take advantage of all of its features). It will be interesting to see how the AMD 6xxx series GPUs do with this display (and in general - drivers, etc.).
 
I ended up turning Quickstart+ off and leaving it off. Yeah the TV boots noticeably faster with it enabled, but I can tolerate a few extra seconds per "session" for the other benefits it brings leaving it disabled. Fixing that "bug", if that's what it is, was really low on my priority list compared to the VRR/G-Sync stuff. I'm glad that LG remains committed to working hard on steady firmware updates. Someone said it earlier but the HDMI 2.1 implementation combined with all of the various gaming features was uncharted territory and I'm not surprised that everything wasn't working 100% perfectly out of the box with the 3xxx series cards. That being said, it was STILL a best-in-class display before, and they've worked to deliver the features as promised. Like a couple of people have said, anything else is a nitpick at this point. I bought the 48CX at launch and feel that it was money well spent (and due to GPU supply issues I'm still not able to take advantage of all of its features). It will be interesting to see how the AMD 6xxx series GPUs do with this display (and in general - drivers, etc.).
Yeah it's the bottom of the list of my problems too. The only reason I care about it at all is with Quickstart+ enabled turning off the TV doesn't trigger an "unplug" event on the display and rearrange all your windows. Since I'm obviously turning the display off when I'm leaving the computer, it's an annoyance, albeit a minor one.
 