LG 48CX

Those 2 things will improve motion clarity, sure, but a 240Hz refresh rate improves overall responsiveness to your mouse input vs 120Hz. Both are important factors in tracking targets, so ideally a 240Hz OLED in a more suitable size like 24-27" would definitely be the ultimate competitive gaming monitor. Funny how panel makers would rather push 360Hz IPS panels instead...

It's a 4ms difference. I'd be curious to see a competitive player try both over, say, the course of a week.
 

Anyone testing would have to make sure that they are running 120fps (115 or 117fps capped) as a minimum, not an average, in order to get 8.3ms per frame at 120fps (or 8.7ms at 115fps) for the upper limit of what the monitor can do. However, a more realistic test would be a 90 to 100fps average, where people are using higher graphics settings on demanding games, relying on VRR to ride a roller coaster of frame rates.

-------------------------------------------
If someone goes overboard on graphics settings, or has a modest GPU and cranks the graphics up at 4K resolution so that they are getting, say, 75fps average, they would then be getting frame durations in ranges something like:

40fps = 25ms / 60fps = 16.6ms <<< 75fps = 13.3ms >>> 90fps = 11.1ms / 105fps = 9.52ms
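
For anyone who wants to sanity-check those numbers, it's just 1000 divided by the frame rate. A minimal sketch, assuming Python:

```python
# Frame duration in milliseconds at a given frame rate: 1000 / fps.
def frame_time_ms(fps: float) -> float:
    """Duration of one displayed frame in milliseconds."""
    return 1000.0 / fps

for fps in (40, 60, 75, 90, 105, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
```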

------------------------------------------
https://win.gg/news/4379/explaining-tick-rates-in-fps-games-difference-between-64-and-128-tick
  • CSGO official matchmaking: 64-tick
  • CSGO on FACEIT: 128-tick
  • CSGO on ESEA: 128-tick

Valorant tick rates:
  • Valorant official matchmaking: 128-tick

Call of Duty: Modern Warfare tick rates:
  • COD multiplayer lobbies: 22-tick
  • COD custom games: 12-tick

While that sounds fast, many CSGO players have monitors capable of running at 144Hz. In simple terms, the monitor can show a player 144 updates per second, but Valve's servers only give the computer 64 frames total in that time. This mismatch between the server's information getting to the computer and leaving the server can result in more than a few issues. These can include screen tearing, a feeling like the player is being shot while protected behind cover, and general lag effects.
---------------------------------------------------

You'd think that a tick of 128 would be 7.8ms and a tick of 64 would be 15.6ms, but it's not that simple... (see the quotes below)

----------------------------------------------------


http://team-dignitas.net/articles/b...-not-the-reason-why-you-just-missed-that-shot

Interpolation. When your game client receives a package from the server, it doesn't simply show you the updated game world right away. That would result in everyone breakdancing across the map in 128- or 64-tick intervals. Rather, it waits a set interpolation time called "lerp", whose name probably originated from a network engineer stepping on a frog.

During this time, a set number of further packages arrive on the client's side containing more updated ticks from the server. Through these ticks, the client is able to interpolate what has happened between these two points in time and display this assumption to the player (don't get mad yet). Interpolation time is determined by the simple equation


cl_interp = cl_interp_ratio / cl_updaterate
So in our 128-tick server example from above, on otherwise default settings this would mean: you receive a new packet every 7.8 milliseconds (cl_updaterate 128), but your client waits until you have received a third packet (cl_interp_ratio 2) before displaying the information, making the interpolation time 15.6 milliseconds for this example. On the other hand, a client running cl_interp_ratio 1 is presented with a renewed state of the game every 7.8 milliseconds - assuming all other hardware and software variables are optimal.
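
To make the quoted equation concrete, here is a minimal sketch, assuming Python; the parameter names mirror the console variables from the article:

```python
# Client interpolation delay per the quoted Source engine formula:
# cl_interp = cl_interp_ratio / cl_updaterate (in seconds; x1000 for ms).
def interp_delay_ms(cl_interp_ratio: int, cl_updaterate: int) -> float:
    """Interpolation delay in milliseconds."""
    return 1000.0 * cl_interp_ratio / cl_updaterate

print(interp_delay_ms(2, 128))  # 15.625 ms -- the article's default 128-tick example
print(interp_delay_ms(1, 128))  # 7.8125 ms -- cl_interp_ratio 1
print(interp_delay_ms(2, 64))   # 31.25 ms  -- 64-tick matchmaking at the default ratio
```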


Of course, from everything we've learned in our long online gaming history, we assume that a lower number in front of the ms sign is always preferable. But, you already guessed it, things aren't so easy this time around, as bad connections and lag compensation come into the picture.

Again, people with unreliable connections are better off accepting higher interp times, as the game client requires a new package of information from the server precisely at the interpolation time to update your game. If the second package is lost, the client waits 250ms for another package before flashing that red warning message in the top right corner of the screen.


For someone who experiences any package loss at all, it is safer to set cl_interp_ratio to 2, especially since you regain the "lost" time in lag compensation.

Lag Compensation


The inevitable conclusion from the preceding segment, and from the fact that all players on the server have a ping, is that everything you see on your screen has already happened on the server a few milliseconds in the past.


Let’s leave any philosophical and Einsteinian implications of this to the side for the moment to focus on how a playable game is produced from this situation in which you don’t have to pre-aim your crosshair in front of the enemy.


The process responsible for this is lag compensation, in which the server accounts for both ping and interpolation timings through the formula:


Command Execution Time = Current Server Time - (Packet Latency + Client View Interpolation)


Put into English, this means that once you pull the trigger and this information package gets sent to the server, the server then goes back from the current server time (the time the pull-the-trigger package was received) by your ping plus your interpolation time. Only then is it determined whether the client hit the shot or not.
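
Worked through with made-up numbers (a minimal sketch, assuming Python; the 25ms latency is purely illustrative):

```python
# Lag compensation rewind per the quoted formula:
# Command Execution Time = Current Server Time - (Packet Latency + Client View Interpolation)
def command_execution_time_ms(server_time_ms: float,
                              packet_latency_ms: float,
                              interp_ms: float) -> float:
    """The moment in the past the server rewinds to when validating a shot."""
    return server_time_ms - (packet_latency_ms + interp_ms)

# Trigger packet received at server time 10000 ms, one-way latency 25 ms,
# interpolation 15.6 ms (128-tick, cl_interp_ratio 2):
print(command_execution_time_ms(10000, 25, 15.6))  # 9959.4 -- ~40.6 ms in the past
```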


Combining all of these factors, the tiny ms differences on the LG are really moot, especially when arguing latency for online (rather than LAN) gameplay without solid frame rates that never dip below the max Hz of the monitor.

Unless you are running extreme frame rate MINIMUMS rather than "decently high compared to 60fps/Hz" AVERAGES, I doubt you are going to get much improvement in real-world scenarios - especially online gameplay, for the above reasons. Most people use VRR as a way to squeeze in some extra graphics eye candy, bumping up their graphics settings and counting on G-Sync to keep things running smoothly even with a frame rate graph that has a poor low end (and in some cases a pretty poor middle, depending).

It would have to be a pretty simple/easy-to-render game in order to get very high frame rates at 4K resolution.
At a 100fps common low, that would be 10ms per frame. If you got 115fps solid (not average) as a frame rate cap, it would be 8.7ms per frame.
For 240Hz you'd probably need to be near a 220fps minimum. Capping at 235fps to avoid overages would be at best 4.25ms per frame.
For 360Hz you'd need to be around a 340fps minimum to really stay in that Hz's zone. 355fps solid would be 2.81ms per frame.

If you read through what I quoted about tick rates on servers, you'd see that you'll still likely be getting 15.6ms per tick on a 128-tick server unless you had pristine ping and were willing to risk 250ms delays whenever your 2nd package is lost. For most games' 64-tick, 22-tick, and 12-tick servers, your ms from the server would be much longer.

" Put into English this means that once you pull the trigger and this information package gets sent to the server, it then goes back from the current server time (the time the pulling the trigger package was received) by your ping plus your interpolation time. Only then it is determined if the client hit the shot or not. "

Then consider that everyone else in an online game is subject to the same lag compensation formulas.

I really think the response-time gain from extreme Hz and mouse polling is overblown considering all of that, unless you are playing LAN games only (or single-player local games) at very high frame rates, rather than using VRR to ride a roller coaster of frame rates that sit moderate or low in the middle and low end of a frame rate graph. I'm guessing most people buying an HDR-capable 4K OLED are buying it for some serious eye candy, not to play a very high frame rate competitive game at low settings (or one that is low-graphics by design). What I do agree with is that at very high frame rates on very high refresh rate monitors, the sample-and-hold blur would be reduced (without suffering the tradeoffs of BFI). That is not only advantageous for image clarity while moving, for targeting purposes, but also aesthetically.
 

Lol, you are overthinking this, dude. 240Hz may not necessarily give you any sort of "input lag" advantage over other players online, but your own mouse movement on your own screen on your own PC is going to feel smoother and more responsive when playing at 240fps/240Hz compared to 120fps/120Hz, which MIGHT help make tracking targets easier. That was literally all I was getting at. I never mentioned anything about having some sort of edge over other players due to less input lag.
 
You can't operate on what you can't see, so I doubt that aspect is true for online games, considering the server update pipeline between all players and how long it takes until you get that new world-state data.

What very high frame rates + very high Hz can do is make the screen smear less (or, at extreme fps+Hz someday, not at all) when you are doing 1st/3rd-person FoV movement, moving the whole game world around in your viewport... but again, only if you are filling those Hz with frame rates in those ranges.
 

What? Are you trying to say that playing games at 240fps on a 240Hz monitor, your overall mouse movement and responsiveness won't feel any smoother vs 120fps on a 120Hz monitor, just because you're playing online? Sure, I guess I can't "see" that it's smoother, but I can certainly feel it. Or are you trying to say that due to the nature of how online games work, tracking ability is limited by the network, and that fps/Hz has absolutely nothing to do with how well you can track targets?
 
I'm saying what those articles said:

" " Put into English this means that once
you pull the trigger and this information package gets sent to the server
, (??? ms)
it then goes back from the current server time (the time the pulling the trigger package was received) by your ping (??? ms)
plus your interpolation time. (15.6ms on a 128 tick server, way more on most game's 64 tick , 22, tick and 12 tick servers)
Only then it is determined if the client hit the shot or not. " "

So you won't see the effective game-world update state until your shot travels to the server, then goes back from the current server time (the time the trigger package was received) by your ping (??? ms) plus your interpolation time (15.6ms itself, or much more in a lot of games). That goes for everything you do, and for everything everyone else is doing.

I'm not saying higher fps+Hz doesn't add motion articulation - better pathing that is way more cohesive in relation to the "game server state" - in single-player games, and probably in LAN games. I'm saying it's doubtful that very high Hz + fps and your mouse Hz are giving you much gain, if anything, response-wise in online games, considering the server update pipeline between all players and how long until you get that new world-state data. I'm guessing, therefore, that scoring gains due to responsiveness are negligible, if present at all, outside of LAN games and single-player games.

However, I also said that FoV movement in 1st/3rd-person games smears the whole game world in your viewport at low fps+Hz, so very high fps+Hz reduces that, and by a lot - 240, 360, or 480fps+Hz gets it down to very little fuzziness. (1000fps at 1000Hz, even with interpolation/duplication, would be 1ms of persistence, like an FW900 graphics-professional CRT, for essentially "zero blur" without using BFI.) That could be useful for seeing things that would otherwise be smeared out temporarily, but it is also a huge aesthetic gain, since texture detail, depth via bump mapping, and in-game text are all affected during FoV movement.
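
The persistence figures in this thread are just the per-frame hold time. A minimal sketch, assuming Python (the lit_fraction knob for BFI is my own illustrative parameter, not an LG setting):

```python
# Sample-and-hold persistence: each frame is held ~1000/fps ms when the
# frame rate fills the refresh rate; BFI shortens the hold by however much
# of each frame the panel actually stays lit.
def persistence_ms(fps: float, lit_fraction: float = 1.0) -> float:
    """Approximate per-frame hold time in milliseconds."""
    return (1000.0 / fps) * lit_fraction

print(persistence_ms(120))   # ~8.3 ms
print(persistence_ms(240))   # ~4.2 ms
print(persistence_ms(360))   # ~2.8 ms
print(persistence_ms(1000))  # 1.0 ms -- the "essentially zero blur" case above
```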
 

Mouse Hz at 240Hz is probably negligible, true, but then I have to say that the increase in motion clarity from a very fast 240Hz TN to a 120Hz OLED is also negligible. Both still suffer from sample-and-hold motion blur regardless. Sure, you could enable BFI on the OLED, but then you can also enable backlight strobing on the TN, and once again that makes any response time gains going from a fast TN to an OLED negligible. So to the original question of which is better for competitive online games, 120Hz OLED or 240Hz TN: I guess at the end of the day both options have their advantages over the other, and like my original post stated, a 240Hz OLED would be ideal. But between a 120Hz OLED and a 240Hz TN, you probably can't go wrong with either one for fast online games. I'm sure they both work fine for 99% of people; it just depends on whether you'd take the improvement in pixel response time or the improvement in mouse responsiveness, but both are negligible anyway. I'll keep using my 240Hz because it's just a more comfortable size for me.
 
The blur reduction, assuming you were getting ~240fps on a 240Hz monitor (or thereabouts if capping), while incremental, should be appreciable compared to 120fps/Hz, even though still not pristine clarity - ~4ms persistence vs ~8ms persistence at 120fps/Hz, that is. With lower or variable frame rate graphs the difference would probably not be as obvious all of the time or in every game. With slower response time LCDs (especially VAs, for example) the difference might not be as appreciable compared to a 120fps/Hz OLED either. Both are in practice way better than 60fps/Hz smearing though; they drop it down to more of a "soften" blur - think of a lower vibration amount from a picture vibrating on a saw table or something. Sample-and-hold blur seems most prominent when mouse-looking, movement-keying, and gamepad panning at speed. At slower, casual viewport speeds it's not as bad, nor when flicking from point A to point B like a gnat. The same goes for text in a window you wave around on a 120Hz desktop.

When combined with the smooth feeling and path articulation that the increased number of shown frames gives to the motion definition, it is quite a difference overall (compared to 60fps/Hz), even if you are appreciating how it looks and feels aesthetically rather than attempting to get scoring gains or advantages. I don't have a 240fps/Hz setup, so I can't really comment personally on whether the motion definition aspect has diminishing returns past 120 or 144fps/Hz.

This image from blurbusters.com gives an idea of the different amounts of screen blur at different fps+Hz rates, but consider that it would be the entire game world blurring during FoV movement - objects/architecture/geology, high-detail textures, depth via bump mapping, in-game text, etc., not just a simple cel-shaded-looking UFO bitmap:

[Blur Busters UFO motion blur comparison image]

As for advantages, you'd probably get more advantage response-wise in single-player games (or LAN games), considering the server-state/game-world update chain is slow by comparison. Advantage in single player isn't a bad thing to me; some single-player games are quite challenging and require fast, accurate timing at higher difficulties. Overall I'm more interested in the effects on aesthetics personally.
 
I'll have to compare the CX running at 120Hz to my Omen X27; then I'll be able to see if a fast 240Hz TN can beat out a 120Hz OLED when it comes to sample-and-hold blur. I'm thinking it can, but I could be wrong.
 

And I think the key is to test with BFI enabled as well. BFI on an OLED vs a TN will look very different, I imagine. Though I don't think people understand that it's pretty important for your frame rate to match your refresh rate to really capture the advantages of BFI. So you'd have to cap your frame rate at 120 on the OLED and rarely drop below it.
 
Higher-refresh monitors will always be better than the 48CX as far as competitive FPS gaming goes. If you are the type that plays a broad variety of PC games and wants a large screen, then something like the 48CX will be good. However, we really need to see how Nvidia's new GPUs work with the 48CX before we can get final answers. Nvidia needs to give us full HDMI 2.1 support to make TVs like the CX48 really shine. Just my own opinion of course.

First off, nobody is going to buy the CX48 for competitive gaming. What exactly are you saying? I think people will buy it just to enjoy it however they see fit.
 

Someone earlier asked how the CX48 would be for competitive gaming like CSGO. I mean, if you can get used to its sheer size, then I guess it could work. I personally believe a smaller 240Hz monitor would be far superior - if not for the increased mouse responsiveness, then at least for the increase in sample-and-hold motion clarity that 240Hz provides over 120Hz. But then some would argue that an OLED at 120Hz would have superior motion clarity to a TN at 240Hz due to OLED's true 0.1ms response time. I'll just have to wait until I get my CX in to see if that's really the case or not.
 
Y'all are micro-debating all of this minutiae (and rightfully so; this is [H] after all) and here I am just waiting for the thing to be delivered, because it's going to be one of the best gaming displays available. I don't game competitively, but compared to my B7, which I've enjoyed immensely, it's going to be a massive upgrade.
 
Higher-refresh monitors will always be better than the 48CX as far as competitive FPS gaming goes. If you are the type that plays a broad variety of PC games and wants a large screen, then something like the 48CX will be good. However, we really need to see how Nvidia's new GPUs work with the 48CX before we can get final answers. Nvidia needs to give us full HDMI 2.1 support to make TVs like the CX48 really shine. Just my own opinion of course.

Well, this is where it gets a bit tricky. The difference between something like 120/144Hz and 240Hz is marginal at best IRL. I own both and can spot the difference, and while it does make things smoother, I would be lying if I said my stats were better on the 240Hz. Of course, I am far from a pro player myself, but even the pros seem to agree. And to that we have to add the fact that no other panel type (in mass production, at least) comes close to the pixel response time of an OLED, AFAIK.
 
Gotta wonder about that; one of the reasons to get higher-refresh LCD panels is the motion resolution upgrade, but this is tied to the slow transition times of LCD technology. I expect the best LCDs to be better in terms of absolute motion handling and responsiveness to user input, but the compromises involved (TN panels, low contrast, washed-out colors...) might not make for a better overall competitive solution.

Yes, this is where it gets interesting. I mostly play games like CSGO, and although I am far from a pro player, I would like to think that I am not too bad either. I have compared my 55" GX to my 27" 1080p@240Hz TN monitor and honestly can't really feel that much of a difference, although I have only compared them with actual gameplay and from memory. For obvious reasons, it's hard to play on more than one screen at the same time, and it's hard to rule out other factors like RNG, fatigue, etc.
 
Someone earlier asked how the CX48 would be for competitive gaming like CSGO. I mean, if you can get used to its sheer size, then I guess it could work. I personally believe a smaller 240Hz monitor would be far superior - if not for the increased mouse responsiveness, then at least for the increase in sample-and-hold motion clarity that 240Hz provides over 120Hz. But then some would argue that an OLED at 120Hz would have superior motion clarity to a TN at 240Hz due to OLED's true 0.1ms response time. I'll just have to wait until I get my CX in to see if that's really the case or not.

Was probably me then, as I have asked it before, but at the time few people actually had this or any other gen-10 OLED. The size difference is no real problem, as you can use a custom resolution and disable scaling (that's what I do for these kinds of games on my 55" GX). Of course, not many put a 48" or even a 55" on their desk, so comparisons are somewhat scarce.
 
If you read through what I quoted about tick rates on servers, you'd see that you'll still likely be getting 15.6ms per tick on a 128-tick server unless you had pristine ping and were willing to risk 250ms delays whenever your 2nd package is lost. For most games' 64-tick, 22-tick, and 12-tick servers, your ms from the server would be much longer.

"Put into English, this means that once you pull the trigger and this information package gets sent to the server, the server then goes back from the current server time (the time the pull-the-trigger package was received) by your ping plus your interpolation time. Only then is it determined whether the client hit the shot or not."

Then consider that everyone else in an online game is subject to the same lag compensation formulas.

I really think the response-time gain from extreme Hz and mouse polling is overblown considering all of that, unless you are playing LAN games only (or single-player local games) at very high frame rates, rather than using VRR to ride a roller coaster of frame rates that sit moderate or low in the middle and low end of a frame rate graph. I'm guessing most people buying an HDR-capable 4K OLED are buying it for some serious eye candy, not to play a very high frame rate competitive game at low settings (or one that is low-graphics by design). What I do agree with is that at very high frame rates on very high refresh rate monitors, the sample-and-hold blur would be reduced (without suffering the tradeoffs of BFI). That is not only advantageous for image clarity while moving, for targeting purposes, but also aesthetically.

This is very true and a fact that is often overlooked. Add to that netcode that is often less than ideal, plus RNG, and I wonder if the difference is even measurable.
 
Is it possible to have BFI along with the lowest possible input lag (which I assume is PC mode at 120Hz)?

Should be no problem. BFI is a separate setting you can use in any mode.

As for competitive gaming, I would not get the LG 48" for that - not if your plan is to play FPS games in fullscreen. It's just too big for that. I think it's great for immersion in a lot of games, but in a fast-paced game where graphics mostly don't matter and you are more concerned with seeing and reacting to things, the sheer size of the display is going to be a factor. Now, if we ignore the requirements to run it at 120Hz (HDMI 2.1 GPU, etc.), running with 1:1 scaling at 1080p or 1440p would probably shrink it to a more usable size for gaming.

It's just way more cost-effective and straightforward to buy a 24" 1080p 240Hz display for that purpose.
 
Improwise asked about competitive FPS gaming. The key word here is competitive - not thinking he meant the average gamer. The 48CX should be fine for the average gamer, but for more serious FPS gaming, higher refresh rates are best, or at least that's according to the experts and professional gamers. I imagine there are some other variables in there, but I am no expert on the subject.
 
Should be no problem. BFI is a separate setting you can use in any mode.

As for competitive gaming, I would not get the LG 48" for that - not if your plan is to play FPS games in fullscreen. It's just too big for that. I think it's great for immersion in a lot of games, but in a fast-paced game where graphics mostly don't matter and you are more concerned with seeing and reacting to things, the sheer size of the display is going to be a factor. Now, if we ignore the requirements to run it at 120Hz (HDMI 2.1 GPU, etc.), running with 1:1 scaling at 1080p or 1440p would probably shrink it to a more usable size for gaming.

It's just way more cost-effective and straightforward to buy a 24" 1080p 240Hz display for that purpose.

Not sure where people have gotten the idea that you need HDMI 2.1 to run at 120Hz; it's simply not true. I run at 1080p@120Hz (or even lower resolutions) with no scaling when running CSGO, etc. Running 4K@120Hz on the other hand (at least with 4:4:4) is another story.

I actually did try running fullscreen, and it was better than expected; unfortunately, it also clearly revealed the micro-stuttering in an old game like CSGO. But that is perhaps another discussion.
 
Improwise asked about competitive FPS gaming. The key word here is competitive - not thinking he meant the average gamer. The 48CX should be fine for the average gamer, but for more serious FPS gaming, higher refresh rates are best, or at least that's according to the experts and professional gamers. I imagine there are some other variables in there, but I am no expert on the subject.

Competitive is a tricky word; I guess we really should say "reaction-based" or something like that. I just play for fun, but when I do, I am naturally competitive, as it would be kind of pointless not to be in such a game. It's not like it has a spellbinding story or breathtaking graphics :)
 
Posting this as a heads up in case anything like this happens to anyone:

https://www.avsforum.com/forum/40-o...aming-thread-consoles-pc-23.html#post59871620

I had one issue with eARC and Xbox One that I think I solved. Figured I'd post here if anyone had similar issues.

My setup is Xbox One X -> LG CX 65 -> LG SN11RG.

With Atmos enabled on the Xbox One X, games were randomly losing sound. Sometimes it was just the game, other times both the game and Xbox menu sounds. I thought it was an issue with the CX passing the Atmos until I turned off channel upmixing in the Dolby Access app on Xbox. Upmixing made non-Atmos content sound weird anyway.

HGiG is supposed to prevent overlapping or duplicated HDR tone-mapping settings at least, but with so many different devices, apps, settings, and features in the mix, there can still be other things that shouldn't be enabled in both the display and audio chains.
 
Not sure where people have gotten the idea that you need HDMI 2.1 to run at 120Hz; it's simply not true. I run at 1080p@120Hz (or even lower resolutions) with no scaling when running CSGO, etc. Running 4K@120Hz on the other hand (at least with 4:4:4) is another story.

I actually did try running fullscreen, and it was better than expected; unfortunately, it also clearly revealed the micro-stuttering in an old game like CSGO. But that is perhaps another discussion.

Correct me if I am wrong, but my understanding is that 1:1 scaling does not work without using GPU scaling, which in turn limits you to 60Hz without an HDMI 2.1 GPU. Otherwise the LG OLEDs will scale to fullscreen at 1080p@120Hz or 1440p@120Hz. The idea was that for competitive gaming you would want a smaller-than-fullscreen picture, hence 1:1 scaling at a lower res.
 

Only certain resolutions will be scaled to fullscreen if scaling is performed by the display. AFAIK there is no problem running at 120Hz with GPU scaling. It is a bit of a minefield though :)

(I don't think there is a way to see on the TV itself what frequency it considers the input to be, but the difference between 60Hz and 120Hz is so noticeable that I think few would miss it anyway.)
 
First off, nobody is going to buy the CX48 for competitive gaming. What exactly are you saying? I think people will buy it just to enjoy it however they see fit.

Wait .... what? Who says? You says? LMAO.

First of all, some facts: the people who compete professionally are less than a fraction of a fraction. It's nothing. I am assuming you're talking about professional play? I am trying to put your comment into a logical perspective that would make sense.

Home users, on the other hand, do play games competitively in terms of wanting wins, high scores, personal accolades, satisfaction, etc.

I had the CX for nearly 3 weeks before I returned it, and I logged MANY MANY MANY games of COD Warzone on that display. I now have the C9 55" and, again, I'm advancing my wins. This display is incredible, and it's extremely usable for professionals, let alone the average home user like myself. Also, skill can negate and mitigate technical advantages all day long. This includes latency, FPS, input lag, etc. There is a point at the very tip-top where this may not be true, but we are talking at such a high level that it would include the best players in the world.

I have 66 or 67 wins in Battle Royale. I'm in the top 3% WORLD, or in the 40,000K range out of 65+ million players... let me go look.

Ok, 67 wins top 3%.

https://cod.tracker.gg/warzone/profile/battlenet/Meat#11825/overview

And I am 51 years old, 52 in Aug.

On a pie chart, strategic critical thought would be the largest contributor to my success. Meaning, I don't run around and button-mash, arcading it out like most do. The better players, I just out-think and out-execute. To be fair, my hand-eye coordination is not the best, so a player at my skill level (critical thought and execution) will mostly outplay me if they are younger and more agile. The 55" 120 FOV def helps, however, along with the native 120Hz. I have serious doubts that 240Hz would help me, or the lowest of the low latency, etc.

A lot of what I read here honestly just does not matter at the end of the day. You guys stressing over fractions of a second is a young man's game. I promise, when you get to 40-50 years of age, your "mature" brain will filter all of that nonsense out. Meaning, at some point, you will all realize that great hardware is good enough and chasing the best numbers doesn't mean SHIT.

This is a great display.

If you young guys don't have the money for these big-boy toys, those 27" displays are still great. If I were 19 and had young eyes, I might be OK with rocking a 240Hz 27" display.
 
If you got it from them at the higher price, just contact them and they'll refund the difference. Just did it.
I didn't buy it yet. I was just pointing out that it seems like BB raised the price just to be able to state that it is now $100 cheaper :/
 
Could not wait any longer; put the order in at Best Buy, store pickup set for Jul 29. Hopefully it is ready sooner.
 
The 55" 120 FOV def helps however along with the native 120hz. I have serious doubts that 240hz would help me or the lowest of the low latency, etc.

I've been increasing FOV in games since getting the 55CX. My online gaming scores have objectively improved. I don't think people should worry about screen size or input lag hurting their "competitive" experience with these displays. Just sit at the right distance and you're good to go.
 
If you want the "fastest" gaming experience: 120Hz + BFI @ 120Hz + NO G-Sync + NO FPS cap.
If you want the "smoothest" gaming experience: 120Hz + G-Sync + FPS cap @ 115fps.
 

Disagree.

I think the smoothest way to go is 120Hz + BFI + cap to 120fps - where frame rate = refresh rate AND you have BFI on.

The key is to tweak your game settings so you rarely drop below 120.

Alternatively, you could also run a lower refresh rate with BFI. One person on another forum told me this is possible. We need more people to confirm this though.

So you could run, say, 90hz or 100hz with BFI, if your CPU or GPU won't let you keep a locked 120fps. Then cap to whatever your refresh rate is.

And we have a really good form of VSync in RTSS now, called "scanline sync", where VSync is technically off, but your FPS is capped to precisely your refresh rate, and the "tearing line" is controlled and kept in one spot - even off-screen in the blanking interval if you want.
 
Scanline sync is great, but you do need a lot of performance headroom for it to work well. Easier said than done at 4K in many recent games, even with future hardware.
 
The best settings for this display, if you have the horsepower, will be: 4K - 120Hz - BFI "High" - FPS cap set to 119.993 via RTSS - VSync set to "On" in-game.

That way you don't get the VSync-on input lag, and the FPS is still so closely synced for BFI that the occasional single micro-stutter is minutes apart and not noticeable, and you have no screen tearing. 4.16ms MPRT motion clarity.
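
To see why that oddly specific 119.993 cap keeps stutters minutes apart, a minimal sketch, assuming Python:

```python
# The cap and the 120 Hz scanout drift apart by their rate difference, so a
# duplicated frame only appears once every 1 / (refresh_hz - cap_fps) seconds.
refresh_hz = 120.0
cap_fps = 119.993

seconds_per_stutter = 1.0 / (refresh_hz - cap_fps)
print(seconds_per_stutter)  # ~142.9 s -- one micro-stutter every ~2.4 minutes
```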
 

Why would you not get VSync input lag in this situation? Just enabling it should cause it.
 