LG 48CX

Next question is, is there any decent adapter out there to convert DisplayPort to HDMI 2.1 and maintain 4K 120 Hz and 10-bit color? My EVGA 3090 only has one HDMI 2.1 port, and I have both an LG CX 48 and a C9 65 in the same room.
Yes, the Club3D CAC-1085. I use this with a 2080 Ti and it works great now with the latest firmware on adapter and TV. VRR is the only feature that is missing, I don't know if they will ever make it work.

For USB-C you could look into the CableMatters "48 Gbps" USB-C to HDMI 2.1 adapter if you want things even easier. I think you can get it on Amazon in the US, they said March for EU. Again VRR is not possible with this.
 
Nope - DisplayPort 1.4 is only capable of 32 Gbps - you need 40 Gbps to do 4K120 RGB 10-bit. That's why people like the AIB cards with 2 HDMI ports.

I believe the Club3D adapter supports DSC, so that allows you to get 4K120 10-bit RGB over DP 1.4 to HDMI 2.1.
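To sanity-check the bandwidth claims in the two posts above, here's a rough Python sketch. The raster timing and per-link payload rates are my own approximations (CTA-861 4400x2250 total raster, 8b/10b coding on DP, 16b/18b on HDMI FRL), not figures from this thread:

```python
# Rough bandwidth math for 3840x2160 @ 120 Hz, 10-bit RGB (my approximations).
# CTA-861 timing uses a total raster of 4400 x 2250 pixels at this mode.
pixel_clock_hz = 4400 * 2250 * 120           # ~1.188 GHz pixel clock
bits_per_pixel = 3 * 10                      # RGB, 10 bits per channel

uncompressed_gbps = pixel_clock_hz * bits_per_pixel / 1e9
print(f"Uncompressed stream:   {uncompressed_gbps:.1f} Gbps")        # ~35.6 Gbps

# Approximate usable payload after link encoding overhead:
dp14_hbr3_gbps   = 32.4 * 8 / 10             # 4 x 8.1 Gbps lanes, 8b/10b  -> ~25.9 Gbps
hdmi21_frl6_gbps = 48.0 * 16 / 18            # 4 x 12 Gbps lanes, 16b/18b  -> ~42.7 Gbps
print(f"DP 1.4 payload:        {dp14_hbr3_gbps:.1f} Gbps  -> too small without DSC")
print(f"HDMI 2.1 FRL payload:  {hdmi21_frl6_gbps:.1f} Gbps  -> fits uncompressed")

# With DSC at roughly 3:1, the DP 1.4 leg of a CAC-1085-style adapter only
# has to carry ~12 Gbps, which is how 4K120 10-bit RGB gets through.
print(f"With ~3:1 DSC:         {uncompressed_gbps / 3:.1f} Gbps")
```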
 
Game mode has lower input lag than any other mode. You can mostly match cinema home simply by copying the settings from that preset to game mode. I would recommend copying the ones from the expert modes though.

For HDR, just leave brightness on the TV at 100, and in games it's a case-by-case thing depending on how the game handles HDR configuration. HDTVTest recommends that instead of stopping at the "where some logo is almost invisible" point in games, you go one step up from there, to where it is invisible, so you are not limiting the brightness/black level ranges.

Put it in Game mode.

Just to clarify, I'm not talking about Game Response, that's already enabled. I'm talking about the picture mode setting named Game; other settings are Cinema Home, Cinema, Vivid, etc.
 
Just to clarify, I'm not talking about Game Response, that's already enabled. I'm talking about the picture mode setting named Game; other settings are Cinema Home, Cinema, Vivid, etc.
Putting the Picture Mode to the Game setting cleared up all my issues with text rendering on my Win 10 system.
 
Putting the Picture Mode to the Game setting cleared up all my issues with text rendering on my Win 10 system.

Interesting, I don't notice much of a difference. What I don't like about the game setting is it seems whites and light colors are really drab. I've tried playing around with it, but can't figure it out. The closest I've come is if I enable HGIG the desktop is drab, but then in games it's much more accurate. But in Cinema mode you can only do tone mapping, no HGIG, with tone mapping on the desktop looks much nicer. I guess I'll just use Cinema mode for the desktop and game for games.
 
I've found wired works best for both. Also, if you're doing something CPU-heavy on the computer it can cause some stutters. It's also possible the HDD is your issue, or if you're using the HDD for other things (like downloading another movie) at the same time you're watching, that can cause issues.

Following the above recommendations I almost never get any stuttering and I'm using a pretty old PC and old HDDs. And almost everything I watch is the highest quality 4k HDR you can find. Even Gemini Man at 60 fps which I highly recommend watching.
I've played the 4K HDR movie using the USB HDD directly on PC fine with no stutters (via Plex). It only stutters with SmartShare. So it can't be the HDD. Also, I'm using 5900X + 3080, so my PC is plenty powerful. I'm still leaning toward a network lag causing the stutters. I'll find some free time to test SmartShare via ethernet on the TV (even if it's just 100Mbps) to see if it was the wifi interference causing the stutters.

For HDR, just leave brightness on the TV at 100, and in games it's a case-by-case thing depending on how the game handles HDR configuration. HDTVTest recommends that instead of stopping at the "where some logo is almost invisible" point in games, you go one step up from there, to where it is invisible, so you are not limiting the brightness/black level ranges.
Is that a typo and you mean leave OLED Light at 100? Brightness by default is 50 iirc.
 
Interesting, I don't notice much of a difference. What I don't like about the game setting is it seems whites and light colors are really drab. I've tried playing around with it, but can't figure it out. The closest I've come is if I enable HGIG the desktop is drab, but then in games it's much more accurate. But in Cinema mode you can only do tone mapping, no HGIG, with tone mapping on the desktop looks much nicer. I guess I'll just use Cinema mode for the desktop and game for games.
The desktop is the worst place to have tone mapping on. It will negate your brightness setting and make white blinding no matter what.
 
They do appear as display driver crashes in the Event Viewer for me.
We all have the exact same issue.
As you can see, driver crash, then a restart 7 seconds later.

Additional note: just like gaming and video playback, it never crashed while in a Microsoft Teams meeting.

Yep, indeed I'm also having driver crashes when I get the blackouts. Bad news: I tried the Belkin cable and within a few minutes I had 2 crashes seconds apart, so I guess that is going back to Amazon. This seems more like an issue with Nvidia than with LG though.

 
The desktop is the worst place to have tone mapping on. It will negate your brightness setting and make white blinding no matter what.

Interesting, I feel like it looks better with tone mapping on. With tone mapping off anything white or lighter looks really drab, even with OLED Light at 100.
 
Yes, the Club3D CAC-1085. I use this with a 2080 Ti and it works great now with the latest firmware on adapter and TV. VRR is the only feature that is missing, I don't know if they will ever make it work.

For USB-C you could look into the CableMatters "48 Gbps" USB-C to HDMI 2.1 adapter if you want things even easier. I think you can get it on Amazon in the US, they said March for EU. Again VRR is not possible with this.
Unfortunately this EVGA 3090 just has 3x DP 1.4 and 1x HDMI 2.1 out. I wish it had a USB-C and another HDMI 2.1.
 
Checking Google quickly, it seems this issue also happens to people using HDMI 2.1's full bandwidth (RTX 3000, 4K 120 Hz 12-bit) on Samsung TVs.
One of them said Windows version 20H2 is the cause; he rolled back to an older version and seemingly had no issues.
Nothing is certain. The best bet is to report the bug to MS and/or Nvidia, and wait for further updates.

If this is a widespread problem with HDMI 2.1 & some RTX cards, the upcoming wave of HDMI 2.1 monitors will probably make this issue more apparent.

This is the nvidia thread about this

I also had a similar issue with a Samsung Q80T on the HDMI 2.1 port. I didn't get to test it much because after a day the TV refused to run 4K 120 Hz and HDR at the same time, so I exchanged it for the CX at that point. But now I wonder if the issue really was my 3070.
 
Interesting, I feel like it looks better with tone mapping on. With tone mapping off anything white or lighter looks really drab, even with OLED Light at 100.
This is the correct appearance. The brightness is controlled with the HDR / SDR brightness balance slider in Windows, not OLED Light which must always be 100.
 
I found a game to test my RTX 3080 at 4K120 10-bit RGB w/ GSYNC (capped framerate at 115 FPS).

Forza Horizon 4.

All settings maxed, MSAA 4x (to keep framerate higher than 8x), motion blur off.



All I have to say is SWEET JESUS!!! Rock solid 100-115 FPS, no stuttering, and not a frame tear in sight. Unbelievably colorful, smooth motion, incredibly responsive... I don't say this lightly, but the RTX 3080 + CX is gaming perfection in Forza Horizon 4. Same experience in Doom Eternal as well.

Unbelievable. Just unbelievable. LG... you done good. Not much else I can ask for in a gaming display.
 
I use an RTSS global cap at 117 because it's dead-on accurate and perfectly reliable, unlike Nvidia's, which completely fails for half the games I play (and I make custom profiles for games I want to play at 60 fps or whatever).

If I use an in-game limiter I cap it at 115 because they often let the framerate fluctuate around the target, so it's just extra safety - AND it also means that I don't need to turn off RTSS, which I leave in the background 24/7 (RTSS will not do ANYTHING below 117 fps so there is no conflict and I can still get the benefit of the slightly lower lag from the in-game limiter).
 
I found a game to test my RTX 3080 at 4K120 10-bit RGB w/ GSYNC (capped framerate at 115 FPS).

Forza Horizon 4.

All settings maxed, MSAA 4x (to keep framerate higher than 8x), motion blur off.



All I have to say is SWEET JESUS!!! Rock solid 100-115 FPS, no stuttering, and not a frame tear in sight. Unbelievably colorful, smooth motion, incredibly responsive... I don't say this lightly, but the RTX 3080 + CX is gaming perfection in Forza Horizon 4. Same experience in Doom Eternal as well.

Unbelievable. Just unbelievable. LG... you done good. Not much else I can ask for in a gaming display.

First game I played on the CX was Doom Eternal in HDR... it was glorious! I've been playing Control lately, with everything on maxed out settings, ray tracing at high, DLSS on. Even though it's around 60FPS, it feels smooth and is just amazing. The CX rocks!
 
Does the Game video mode do anything special for playing games? I rather prefer the Cinema Home mode, but wasn't sure if I was sacrificing anything by not using the Game video mode. I do have Ultra HDMI and Game Response enabled. Also, where do you all set your in-game HDR brightness? I've seen some mention a 5% window, but on Rtings they only have 2/10/25/50/100% windows.
Game mode has lower input lag than any other mode. You can mostly match cinema home simply by copying the settings from that preset to game mode. I would recommend copying the ones from the expert modes though.

For HDR, just leave brightness on the TV at 100, and in games it's a case-by-case thing depending on how the game handles HDR configuration. HDTVTest recommends that instead of stopping at the "where some logo is almost invisible" point in games, you go one step up from there, to where it is invisible, so you are not limiting the brightness/black level ranges.

Some additional input from the avs forum LG CX gaming thread here that helped to clear some things up for me:
https://www.avsforum.com/threads/2020-lg-cx–gx-dedicated-gaming-thread-consoles-and-pc.3138274/page-67

macmane said:

I thought whatever device that's hooked up to the TV has to support instant game response, e.g. the Xbox One X. I know the instant game response banner pops up on the X1X but not the PS4, since it doesn't support it.
Duc Vu said:

Yes. Basically instant game response is just a toggle; once it is enabled, the TV will go into a state where every picture mode has low input lag. If it is disabled, all picture modes will go back to having high input lag again, except game mode.
Duc Vu said:

Yes, the toggle will only work when the device supports auto low latency mode (ALLM). Even when you enable instant game response in the TV menu, if your device does not support ALLM, like the PS4 for example, the TV won't go into the low input lag state.
One thing to note is that only game mode has HGiG under the dynamic tone mapping option. So if you want to use HGiG (which is still a vague standard that is not widely adopted yet), you still have to switch to game mode. Otherwise, use whatever other picture mode you want and enjoy that low input lag.
..

macmane said:

BraveHeart88 said:
So what you're saying is that when I connect my PS4 Pro to my CX, only Game mode gives me low input lag, and if I switch to any other picture mode I won't have low input lag even though Instant Game Response is enabled. But when I connect the X1X to the CX with Instant Game Response enabled, I can use any picture mode and still have the same low input lag that the regular Game mode would give me, because the X1X supports ALLM. Am I correct?
Yes
 
Don't know if it's related but...

I replaced my Sound Blaster Z card with the external Sound BlasterX G6. I removed the SB drivers with DDU and uninstalled the SB Control Center. Hooked up the G6 via USB and did not install the driver or software.

Not only did it fix my audio issues (my Beyerdynamic DT 770 Pro 250 ohm headphones sound amazing) but I had no video driver failure. Maybe it's a fluke, but so far so good.
 
I use an RTSS global cap at 117 because it's dead-on accurate and perfectly reliable, unlike Nvidia's, which completely fails for half the games I play (and I make custom profiles for games I want to play at 60 fps or whatever).

If I use an in-game limiter I cap it at 115 because they often let the framerate fluctuate around the target, so it's just extra safety - AND it also means that I don't need to turn off RTSS, which I leave in the background 24/7 (RTSS will not do ANYTHING below 117 fps so there is no conflict and I can still get the benefit of the slightly lower lag from the in-game limiter).

I haven't had any issues using NVCP, at least none that I've noticed. I cap it globally at 115 FPS.
 
I'm using RTSS for Jedi: Fallen Order at 57 fps since my current GPU only does 60 Hz. When I get a 3090 I'll cap at 117 using RTSS. From what I read on blurbusters.com, the lowest-input-lag methods were in-game caps or RTSS. The Nvidia one introduced a little input lag. Maybe that's changed since, but RTSS is just as easy to use (and has some other optional features besides), so I wouldn't bother switching to Nvidia's method.
 
I haven't had any issues using NVCP, at least none that I've noticed
I read about RTSS vs NVCP vs in-game capping for the first time and the preferred order seems to be:
1) in-game for near-zero added latency
2) RTSS for 0-1 frames of added latency
3) NVCP for 2-6 frames of added latency
 
I'm using RTSS for Jedi: Fallen Order at 57 fps since my current GPU only does 60 Hz. When I get a 3090 I'll cap at 117 using RTSS. From what I read on blurbusters.com, the lowest-input-lag methods were in-game caps or RTSS. The Nvidia one introduced a little input lag. Maybe that's changed since, but RTSS is just as easy to use (and has some other optional features besides), so I wouldn't bother switching to Nvidia's method.

I read about RTSS vs NVCP vs in-game capping for the first time and the preferred order seems to be:
1) in-game for near-zero added latency
2) RTSS for 0-1 frames of added latency
3) NVCP for 2-6 frames of added latency

Thanks guys, I would not have guessed a framerate cap would introduce input lag. I run RTSS anyway so I might as well set it there.
 
Dated article (2017) but unless it's changed for nvidia's limiter this was how it used to work:

https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/12//

RTSS is a CPU-level FPS limiter, and introduces up to 1 frame of delay, whereas Nvidia Inspector uses a driver-level FPS limiter, which introduces 2 or more frames of delay. See G-SYNC 101: In-game vs. External FPS Limiters for complete details, along with input latency tests comparing the two external solutions against an in-game limiter.


EDIT: it has indeed changed, apparently (fairly recently, starting with a driver released a year ago, 1-6-2020)... so it's a matter of preference as long as you set it up properly. I'm used to using RTSS and it has some other neat features besides, so I'll probably not bother switching to the Nvidia method. An in-game limiter usually has the lowest input lag, but that's not always available in every game.

*As of Nvidia driver version 441.87, Nvidia has made an official framerate limiting method available in the NVCP; labeled “Max Frame Rate,” it is a CPU-level FPS limiter, and as such, is comparable to the RTSS framerate limiter in both frametime performance and added delay. The Nvidia framerate limiting solutions tested below are legacy, and their results do not apply to the “Max Frame Rate” limiter.

from the comments at the bottom of the page: https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/12/
It does not. The limiter tested on that page is legacy.

The new limiter has been tested by other sources such as Battle(non)sense, and has been found to be comparable to RTSS in both added latency and frametime performance.

I don’t know when you last checked this page, but I recently added a disclaimer to this effect:
https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/11/
when an in-game framerate limiter isn’t available, should I use RTSS or this new setting?
I included it as an “OR” option quite a while back on this page:
https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/

Seeing as it’s comparable to RTSS, it’s basically up to preference; try both.

NVIDIA Control Panel: <1 Frame Delay

As of Nvidia driver version 441.87, Nvidia has made an official framerate limiting method available in the NVIDIA Control Panel, labeled "Max Frame Rate."

To set a framerate limit, navigate to the "Manage 3D settings" section in the NVCP, locate the "Max Frame Rate" entry, select "On," set the desired limit, select "OK," and finally select the "Apply" button after it appears in the lower right corner of the NVCP window.
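If it helps to picture what a "CPU-level FPS limiter" like RTSS or the NVCP Max Frame Rate setting conceptually does, here's a toy Python loop. It's purely illustrative (my own sketch, not anything from RTSS or Nvidia): the submitting thread is simply held until the next frame slot, which is why such limiters add at most about one frame of delay while keeping frametimes flat.

```python
import time

# Illustrative CPU-level frame limiter: hold the render/submit thread so
# frames are presented no faster than the target interval.
TARGET_FPS = 117
FRAME_INTERVAL = 1.0 / TARGET_FPS            # ~8.547 ms per frame

def render_and_present():
    time.sleep(0.004)                        # stand-in for the game's frame work

next_deadline = time.perf_counter()
for frame in range(10):                      # pretend game loop
    render_and_present()

    next_deadline += FRAME_INTERVAL
    # Coarse sleep for most of the wait, then spin the last ~2 ms for
    # precision - roughly how low-jitter limiters keep frametimes even.
    while (remaining := next_deadline - time.perf_counter()) > 0:
        if remaining > 0.002:
            time.sleep(remaining - 0.002)

    print(f"frame {frame}: paced to {FRAME_INTERVAL * 1000:.3f} ms")
```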

 
I'm pretty sure RTSS and the Nvidia frame rate limiter use an identical method. Both seem to work equally reliably for me, and both fail in the same games for me. So I just use NVCP now.
 
I don't understand why everyone is playing 4K movies in Windows. I always use the Plex app built into the TV. It does HDR and Dolby Vision. Granted, I have a 42 TB local server and 700 TB in the cloud. I have no issues using the app locally or streaming from the cloud. The new Plex server even does tone mapping from 4K HDR to 1080p SDR; it works well and I use it when streaming to older TVs.

Though I'd rather run everything via my PC, the idea that in order to get full features I'd have to use a set-top box or smart TV kioskware device is not new to me. When Netflix first got 1080p it wouldn't do 1080p on PC initially; later it wouldn't do 4K on PC while it would from consoles, smart TVs, set-top boxes, etc.

So I finally broke down and looked up all the apps in the LG ecosystem. The Plex app seems to work great so far and its interface is fast, whereas the LG webOS Emby app is very laggy and slow. That could just be because it's populating itself for the first time, though I doubt it. I had been using Plex and Emby as DLNA servers, which the LG did pick up, but playback seemed iffy that way, with some titles unplayable and others stopping during playback.

I've added the LG OS apps for:
- my HDHomeRun OTA antenna (for HD local news mostly and national breaking-event reports, etc.)
- WeatherNation (not sure if it's any good but I added it)
- Plex
- Emby (will see how that goes interface-lag-wise and if it works with HDR later)
- Netflix
- Prime Video
- Twitch
- YouTube

That's all I need, more or less. I did notice that there is no Tidal app, which I would use, and I do prefer to use a 3rd party app for Twitch, but I can just use the PC on the TV for that stuff. The important thing is I can get a fast, non-laggy local library index and a few online streaming apps that can play 4K HDR titles with a proper curve and without hiccups.
 
EDIT: it has indeed changed, apparently (fairly recently, starting with a driver released a year ago, 1-6-2020)... so it's a matter of preference as long as you set it up properly. I'm used to using RTSS and it has some other neat features besides, so I'll probably not bother switching to the Nvidia method. An in-game limiter usually has the lowest input lag, but that's not always available in every game.
Thanks for the update. I had also read the older article. So use of NVCP is equivalent to RTSS frame rate limiting.
 
Is capping below the max G-SYNC rate recommended? I saw another post capping at 117?

Yes, you want to cap a little below the max refresh rate, because once you exceed the G-SYNC range it defaults to V-Sync on or off (depending on your settings). So you either get tearing, stuttering, or extra input lag, as if you weren't using G-SYNC. This is the case when using any G-SYNC monitor.
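Quick arithmetic on why people land on 115-117 fps rather than 120 (my own sketch, assuming a 120 Hz panel): the point is to leave some frametime headroom so spikes don't push you out of the VRR window.

```python
# Frametime headroom left by different caps on a 120 Hz VRR display.
REFRESH_HZ = 120

def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

for cap in (120, 118, 117, 115):
    headroom = frametime_ms(cap) - frametime_ms(REFRESH_HZ)
    print(f"cap {cap:3d} fps: {frametime_ms(cap):6.3f} ms/frame, "
          f"{headroom:5.3f} ms slack vs the {REFRESH_HZ} Hz scanout")

# At exactly 120 fps there is zero slack, so any frametime spike leaves the
# G-SYNC range and falls back to V-Sync (extra lag) or tearing; a 115-117
# cap keeps a fraction of a millisecond of margin on every frame.
```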
 
I'm pretty sure RTSS and the Nvidia frame rate limiter use an identical method. Both seem to work equally reliably for me, and both fail in the same games for me. So I just use NVCP now.
RTSS has an "application detection level" setting. I put that on the most aggressive level, High, and then it works in 100% of games (I own hundreds) and apps (warning: it can also kick in for video players and browsers and such, so you might want to make exclusion profiles!). In theory it could trip anti-cheat systems in online games, but I've yet to experience that. I believe the application is popular enough not to have to worry about it. And of course you can always make exclusions :)

Nvidia's new limiter offers the same smoothness and performance, but for example does not work in Control (DX12) and many other titles that I've tried it with. I don't even think it's API related because it also fails in really random old content like UT 2004...
 
Some additional input from the avs forum LG CX gaming thread here that helped to clear some things up for me:
https://www.avsforum.com/threads/2020-lg-cx–gx-dedicated-gaming-thread-consoles-and-pc.3138274/page-67

Interesting. So if ALLM is supported then any video mode gives the low input lag (if game response is also enabled), but if ALLM is not supported, like the PS5, then only the game mode will give low input lag. I asked on there, but I'll ask here as well, does the 3070 for PC gaming support ALLM?
 
Interesting. So if ALLM is supported then any video mode gives the low input lag (if game response is also enabled), but if ALLM is not supported, like the PS5, then only the game mode will give low input lag. I asked on there, but I'll ask here as well, does the 3070 for PC gaming support ALLM?
I haven't tried it in several firmware versions, but this wasn't true with PC usage previously. You could enable instant game response and g-sync (ALLM as far as the TV is concerned) and every mode but game still had noticeable input lag on the mouse cursor.
 
Yes, you want to cap a little below the max refresh rate, because once you exceed the G-SYNC range it defaults to V-Sync on or off (depending on your settings). So you either get tearing, stuttering, or extra input lag, as if you weren't using G-SYNC. This is the case when using any G-SYNC monitor.
If you turn V-Sync on with G-SYNC it's always on regardless of the framerate. The input lag just isn't as bad below the refresh rate cap because of how the back buffers are handled. G-SYNC still needs to wait for the scanout of the display to finish before pushing forward another image, otherwise you would still get tearing and judder. V-Sync off means G-SYNC isn't waiting for the scanout anymore. It's why V-Sync on in the NVCP is recommended to be used with G-SYNC.
 
I have an issue with my 55CX: in the Nvidia Control Panel the monitor is not referred to as a G-SYNC certified monitor... With the last Nvidia driver it was not saying that. Do you see that?
 
I have an issue with my 55CX: in the Nvidia Control Panel the monitor is not referred to as a G-SYNC certified monitor... With the last Nvidia driver it was not saying that. Do you see that?
There’s a post on Reddit right now discussing some issues with the latest nvidia driver, g-sync recognition being one of them. I honestly didn’t read the post very carefully because I’m not on the latest driver. Google it, I’m sure something will come up.
 
Interesting. So if ALLM is supported then any video mode gives the low input lag (if game response is also enabled), but if ALLM is not supported, like the PS5, then only the game mode will give low input lag. I asked on there, but I'll ask here as well, does the 3070 for PC gaming support ALLM?
I haven't tried it in several firmware versions, but this wasn't true with PC usage previously. You could enable instant game response and g-sync (ALLM as far as the TV is concerned) and every mode but game still had noticeable input lag on the mouse cursor.

From what I've read you have to turn on instant game response in the TV OSD in order for "G-SYNC" to show up in the Nvidia drivers in Windows. So I think LG instant game response = HDMI 2.1 ALLM + HDMI 2.1 VRR / G-SYNC on the PC side. So it should work with your 3070 with "G-SYNC" enabled.

From 4 months ago, when VRR raised blacks were still an issue:
https://www.avsforum.com/threads/2020-lg-cx–gx-dedicated-gaming-thread-consoles-and-pc.3138274/post-60104161

Duc Vu


Correct me if I'm wrong, but I think the near-black gamma shift issue is there regardless of whether the game supports VRR or not. Once you enable instant game response, it will be there, because instant game response is simply ALLM and VRR bundled together. What this means is ALLM and VRR always go hand in hand on these TVs.

According to the guide below, besides that, Nvidia has both a low latency mode that can be set for everything in the drivers, as well as a better one called Reflex, which is only supported in certain games.

https://www.nvidia.com/en-us/geforce/guides/system-latency-optimization-guide/


NVIDIA Reflex is more effective at reducing latency and operates independently of NVIDIA Ultra low latency mode. If both NVIDIA Reflex and the Ultra Low Latency mode are enabled, NVIDIA Reflex will override Ultra Low Latency functionality.
Turn on Exclusive Fullscreen - If possible, always be in Exclusive Fullscreen mode. This will bypass the Windows compositor that adds latency.

In recent Windows updates, the latency of borderless windowed (windowed fullscreen) mode has slightly improved, but based on our tests we still recommend the fullscreen setting.

https://www.pocket-lint.com/apps/news/nvidia/153634-what-is-nvidia-reflex-and-how-does-it-work
 
It's a discussion that might be considered a little off topic, but the input lag of the LG CX compared to some other traditional gaming monitors is often brought up, as well as questions about the input lag in different modes and with different features enabled on the TV. While I'd like very low input lag overall, I think the effect of the tiniest differences might be greatly exaggerated, especially in relation to online games.

Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)
Put into English, this means that once you pull the trigger and that information packet gets sent to the server, the server then goes back from the current server time (the time the trigger-pull packet was received) by your ping plus your interpolation time. Only then is it determined whether the client hit the shot or not.
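A toy worked example of that formula with made-up numbers (30 ms packet latency, interp_2 on a 64-tick server), just to show what the server is actually rewinding to:

```python
# Command Execution Time = Current Server Time - (Packet Latency + Client View Interpolation)
def command_execution_time(server_time_ms, packet_latency_ms, interp_ms):
    return server_time_ms - (packet_latency_ms + interp_ms)

server_time = 10_000.0            # ms of match time when the "fire" packet arrives
packet_latency = 30.0             # ms, made-up one-way latency
interp = 2 / 64 * 1000            # interp_2 on a 64-tick server = 31.25 ms

rewind_to = command_execution_time(server_time, packet_latency, interp)
print(f"Server judges the shot against the world as it was at t = {rewind_to:.2f} ms, "
      f"i.e. {server_time - rewind_to:.2f} ms in the past.")
```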

Regarding input lag in relation to online games... The timing of the whole human + hardware + software chain would actually be even worse than my quote below. I was just listing some of the factors' amounts here and trying to plug some of them in:

you can't react to what you can't see yet:
If you were playing on a LAN against other players on the same LAN, on a server on the same LAN, you'd get latency of 3 ms to 7 ms from the server. Marketing for monitors acts like you are being served 144, 240, or 360 game world states per second in online game competitions. That's just false advertising... (and with lower frame rate ranges, and thus longer ms per frame, for most people than their peak Hz in most games besides - especially at 4K resolution).
...
Valorant: 128tick
Specific paid matchmaking services like ESEA: 128tick
CSGO ("normal" servers): 64tick
Overwatch: 63
Fortnite: 60 (I think, used to be 20 or 30)
COD Modern Warfare mp lobbies: 22tick
COD Modern Warfare custom lobbies: 12tick

128 tick at interp_2 = 15 ms + (25-40 ms) ping = 40-55 ms -> 5 to 7 frames behind that have to be accounted for (at a solid 117 fps)

64 tick at interp_2 = 31.2 ms + (25-40 ms) ping = 56-71 ms -> 7 to 8 frames

22 tick at interp_2 = 90 ms + (20-40 ms) ping = 110-130 ms -> 13 to 15 frames

12 tick at interp_2 = 166 ms + (20-40 ms) ping = 186-206 ms -> 22 to 24 frames
...
At best, if you ignore for a moment your added 25 to 40 ms of ping-time netcode rollback/rubberbanding guesswork (which you can't really ignore, since it's three to four 10 ms frames' worth of smudge factor at 100 fps), you are being served game states on the best tick servers at closer to 55 Hz tick-wise, which is over 15 ms per frame. Then add your reaction time of 150 ms at the very best, being extremely generous (more like up to 250 ms), plus say 4 ms to 13 ms of monitor input lag depending on your screen.

So say 16 ms tick + 180 ms reaction + monitor input lag (4 ms or 13 ms) = 200 ms (4 ms monitor) vs 209 ms (13 ms monitor), with another 25-40 ms of netcode rollback guesswork muddying the result. So according to that, you are waiting for the next action/state-change frame, then seeing it and reacting to it, in roughly 1/5th of a second vs 1/5th of a second + 0.009 seconds, all smeared by ping-time rollbacks and proprietary netcode decisions, limitations, and failings. That pretty much washes out such tiny differences, especially with imperfect netcode interpolating/rolling back multiple allies' and enemies' ping times.
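For anyone who wants to poke at those assumptions, here's a small sketch reproducing the arithmetic above (interp_2, a solid 117 fps, the upper-end ping figure, and the 16 ms / 180 ms / 4-13 ms totals; these are the post's rough numbers, not measurements):

```python
# Reproduce the "frames behind" arithmetic for various server tick rates.
FRAME_MS = 1000 / 117                         # ~8.55 ms per frame at a solid 117 fps

def frames_behind(tick_rate, ping_ms):
    interp_ms = 2 / tick_rate * 1000          # interp_2 = two ticks of interpolation
    total_ms = interp_ms + ping_ms
    return total_ms, total_ms / FRAME_MS

for tick in (128, 64, 22, 12):
    total, frames = frames_behind(tick, ping_ms=40)   # upper-end ping
    print(f"{tick:3d} tick: ~{total:5.1f} ms of server-side delay -> ~{frames:4.1f} frames at 117 fps")

# Stack the human + display side on top (16 ms best-case serve interval,
# 180 ms reaction, 4 vs 13 ms of monitor input lag):
for monitor_lag_ms in (4, 13):
    total = 16 + 180 + monitor_lag_ms
    print(f"monitor lag {monitor_lag_ms:2d} ms -> ~{total} ms to see a new game state and react to it")
```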

--------------------------------------

That's not including some of the other factors in the system latency end of the chain...
[chart: peripheral/system latency breakdown from Nvidia's latency optimization guide]

Some of the latency-reducing methods, like running fullscreen exclusive and using Reflex, can reduce some of the extra hardware/system latency shown in the bar graph above, some of which is above and beyond the factors I listed in the online gaming quote. Reductions could be something like minus 7 ms from running fullscreen exclusive mode (guessing based on the chart), and on Reflex-supported games minus 6 to 18 ms depending on the game. I don't know if those reductions are cumulative or not.

I'm trying to get a clearer picture of how it all plays out. That's at least a grouping of a lot of the factors.
 
There’s a post on Reddit right now discussing some issues with the latest nvidia driver, g-sync recognition being one of them. I honestly didn’t read the post very carefully because I’m not on the latest driver. Google it, I’m sure something will come up.

The latest driver was buggy for me (G-SYNC not really working, 120 Hz not working), but swapping back and forth between scaling modes ended up fixing it anyway. Now it's all working perfectly like before.
 
Yeah, for me Nvidia driver quality has fallen off a cliff starting with version 457. I'm not even willing to install the latest ones at this point and stick with an older one that works for me. On anything 457+, DSC just breaks completely with my Club3D adapter.
 
It's a discussion that might be considered a little off topic, but the input lag of the LG CX compared to some other traditional gaming monitors is often brought up, as well as questions about the input lag in different modes and with different features enabled on the TV. While I'd like very low input lag overall, I think the effect of the tiniest differences might be greatly exaggerated, especially in relation to online games.

I don't think the argument holds with rollback netcode. With rollback, which favors the client, you are playing locally for all intents and purposes. The game engine renders a head; your ability to headshot it is directly based on your local input lag only, not your online ping, 99% of the time. The only effect tick rate and ping have is that the "99%" figure will adjust due to rollback having to kick in. At some point of high enough tick rate and low enough ping, you hit diminishing returns where 99.99% of the time your client action is always successful. So for comp gaming, once you have good enough ping (for most games, that is less than 100 ms with good rollback implementations), local input delay is the thing to improve.

However local input delay has diminishing returns... I think I'm pretty sensitive to input delay, and I have 0 issues w/ the delay on the CX. But I don't play CSGO either :).
 