LG 48CX

I've read some interesting info here:
https://www.reddit.com/r/Overwatch/comments/3u5kfg/everything_you_need_to_know_about_tick_rate/

Without lag compensation (or with poor lag compensation), you would have to lead your target in order to hit them, since your client computer is seeing a delayed version of the game world. Essentially what lag compensation is doing, is interpreting the actions it receives from the client, such as firing a shot, as if the action had occurred in the past.

The difference between the server game state and the client game state or "Client Delay" as we will call it can be summarized as: ClientDelay = (1/2*Latency)+InterpolationDelay
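To make that formula concrete, here's a quick sketch (Python, with made-up example numbers - the 40ms ping and two-tick interpolation buffer are just assumptions for illustration):

```python
# Minimal sketch of the ClientDelay formula above (example numbers are made up).
def client_delay_ms(latency_ms: float, interpolation_delay_ms: float) -> float:
    """ClientDelay = (1/2 * Latency) + InterpolationDelay"""
    return 0.5 * latency_ms + interpolation_delay_ms

# e.g. 40ms round-trip ping and an interpolation buffer of two ticks on a 64-tick server
print(client_delay_ms(40, 2 * 1000 / 64))  # ~51ms behind the authoritative server state
```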
..
An example of lag compensation in action:

  • Player A sees player B approaching a corner.
  • Player A fires a shot, the client sends the action to the server.
  • Server receives the action X ms later, where X is half of Player A's latency.
  • The server then looks into the past (into a memory buffer), of where player B was at the time player A took the shot. In a basic example, the server would go back (Xms+Player A's interpolation delay) to match what Player A was seeing at the time, but other values are possible depending on how the programmer wants the lag compensation to behave.
  • The server decides whether the shot was a hit. For a shot to be considered a hit, it must align with a hitbox on the player model. In this example, the server considers it a hit, even though on Player B's screen it might look like he's already behind the wall. The time difference between what Player B sees and the time at which the server considers the shot to have taken place is equal to: (1/2 Player A latency + 1/2 Player B latency + TimeSinceLastTick)
  • In the next "Tick" the server updates both clients as to the outcome. Player A sees the hit indicator (X) on their crosshair, Player B sees their life decrease, or they die.
Note: In an example where two players shoot each other, and both shots are hits, the game may behave differently. In some games, e.g. CS:GO, if the first shot arriving at the server kills the target, any subsequent shots by that player that arrive at the server later will be ignored. In this case, there cannot be any "mutual kills", where both players shoot within 1 tick and both die. In Overwatch, mutual kills are possible. There is a tradeoff here.
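Here's a rough, hypothetical sketch of the server-side rewind described above (the class/field names and the simple distance hit test are my own simplifications - real engines store and rewind full hitboxes and interpolate between snapshots):

```python
import bisect
from dataclasses import dataclass

# Simplified sketch of server-side lag compensation ("rewind"): the server keeps a
# short history of each player's position per tick, and when a shot arrives it tests
# the hit against where the target WAS at roughly the time the shooter saw them.

@dataclass
class Snapshot:
    time_ms: float
    position: tuple  # (x, y, z)

class LagCompensator:
    def __init__(self):
        self.history = {}  # player_id -> list of Snapshots, oldest first

    def record(self, player_id, time_ms, position):
        self.history.setdefault(player_id, []).append(Snapshot(time_ms, position))

    def position_at(self, player_id, time_ms):
        # Nearest stored snapshot at or after time_ms (no interpolation, for brevity).
        snaps = self.history[player_id]
        times = [s.time_ms for s in snaps]
        i = min(max(bisect.bisect_left(times, time_ms), 0), len(snaps) - 1)
        return snaps[i].position

    def resolve_shot(self, server_time_ms, shooter_latency_ms, interp_delay_ms,
                     target_id, aim_point, hit_radius=0.5):
        # Rewind by half the shooter's round-trip latency plus their interpolation
        # delay - i.e. roughly what the shooter was seeing when they clicked.
        rewind_ms = 0.5 * shooter_latency_ms + interp_delay_ms
        past_pos = self.position_at(target_id, server_time_ms - rewind_ms)
        dist = sum((a - b) ** 2 for a, b in zip(aim_point, past_pos)) ** 0.5
        return dist <= hit_radius  # hit if the aim point lands within the old hitbox
```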
...
So there is tuning going on...
  • If you use the CSGO model, people with better latency have a significant advantage, and it may seem like "Oh I shot that guy before I died, but he didn't die!" in some cases. You may even hear your gun go "bang" before you die, and still not do any damage.
  • If you use the current Overwatch model, tiny differences in reaction time matter less. For example, at a server tick rate of 64, if Player A shoots 15ms sooner than Player B but both shots land within the same 15.6ms tick, they will both die.
  • If lag compensation is overtuned, it will result in "I shot behind the target and still hit him"
  • If it is undertuned, it results in "I need to lead the target to hit them".


So if I'm reading it right you'd have to see
....the frame you'd be reacting to, delivered at your local frame rate first (though it is continually being altered/resolved by the true server state overwriting your local one, via the server's lag compensation code tied to its tick rate)
....then add your reaction time, e.g. 150ms, more like 180ms - 250ms with some variance (this is a huge number frames-wise: a ~.180 second reaction spans ~21 frames at 117fps!)
....then add your mouse, keyboard, and main PC case hardware components' input lag. According to the nvidia charts posted earlier, system latency can be 33ms to 77ms across the three games it listed. That's another ~4 to 9 frames at 117fps (.033 to .077 seconds).
.. and then add your 4ms (a gaming monitor, .004 seconds) or 13ms (LG CX, .013 seconds) of input lag on your display - a .009 second difference.

So is this quasi-accurate?
--------------------------------------
... ~8.5ms local frame time (if at a SOLID 117fps), but throughout "overwritten" by things resolving on the server after being compared against rolled-back/compensated latencies (pings), a ~15ms tick interpolation, and the return trip time... that result shows you a registered server game state update, it hits your eyeballs, and you decide to react to it...
... + 180ms human reaction time (if being generous; it varies throughout a game) + 27 to 51ms system latency in nvidia reflex supported games + 4ms or 13ms from a gaming monitor or the LG CX display.
...

I don't think the online game cares when you see the result on your screen - display input lag doesn't factor into how the server resolves actions - but it matters in the overall flow, because you usually can't react well to what you haven't seen yet (e.g. is your crosshair visibly on the target yet to your eyes?). Some players might compensate for even a high input lag screen with best guesses on the fly in their heads and get by.

It seems to me that your system latency (~30 - 77ms), the comparatively very large and variable reaction time (~180ms +/-), the 20 - 40ms variable ping plus tick interval (15ms, 33ms or worse), and the resulting compensation being shuffled into your frame state stream would dwarf the difference between a 4ms (.004 second) and a 13ms (an additional .009 seconds by comparison) input lag screen.

Tick interval ~.0156 seconds on good 64-tick servers; many games run lower tick rates that are multiples worse (.031sec at 32 tick, .050sec at 20 tick)
Ping times .020 seconds to .040 seconds typically.
Latency as related to action delivery: after its delivery time relative to the tick, half of it is compensated for (rewind time) to resolve when the action registered on the server. The other half of the latency is the return trip time (.010 sec to .020 sec at 20ms to 40ms ping). So everything on the server is really happening in the past.
Human reaction time .180 seconds +/-
4ms gaming monitor vs 13ms OLED = .009 seconds added difference
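Adding those rough figures up end to end (a back-of-the-envelope sketch - every number here is just an estimate quoted above, not a measurement):

```python
# Back-of-the-envelope sum of the chain described above, all values in milliseconds.
chain = {
    "local frame delivery (117 fps solid)": 1000 / 117,      # ~8.5
    "human reaction time": 180.0,                             # varies, ~150-250+
    "system latency (Reflex-supported game)": 40.0,           # quoted range ~27-51
    "display input lag (LG CX)": 13.0,                        # vs 4 for a gaming monitor
    "half of ping (client -> server)": 15.0,                  # 20-40ms round trip
    "average wait for the next 64-tick update": 0.5 * 1000 / 64,
}
total = sum(chain.values())
print(f"total ~{total:.0f} ms; the 9 ms display difference is ~{9 / total:.1%} of the chain")
```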






Generally, a higher tick-rate server will yield a smoother, more accurate interaction between players, but it is important to consider other factors here. If we compare a tick rate of 64 (CSGO matchmaking) with a tick rate of 20 (alleged tick rate of Overwatch Beta servers), the largest delay due to the difference in tick rate that you could possibly perceive is 35ms. The average would be 17.5ms. For most people this isn't perceivable, but experienced gamers who have played on servers of different tick rates can usually tell the difference between a 10 or 20 tick server and a 64 tick one.

Keep in mind that a higher tick rate server will not change how lag compensation behaves, so you will still experience times where you ran around the corner and died. 64 tick servers will not fix that.
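The 35ms / 17.5ms figures fall straight out of the tick intervals - a quick sanity check (assuming the worst case is waiting one full extra tick interval):

```python
# Quick check of the quoted 64-tick vs 20-tick delay difference.
interval_64 = 1000 / 64            # ~15.6ms between server updates
interval_20 = 1000 / 20            # 50ms between server updates
worst_case_extra = interval_20 - interval_64   # ~34.4ms ("35ms" above)
average_extra = worst_case_extra / 2           # ~17.2ms ("17.5ms" above)
print(worst_case_extra, average_extra)
```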
...
 
I'm using RTSS for Jedi: Fallen Order at 57fps since my current gpu only does 60hz. When I get a 3090 I'll cap at 117 using RTSS. From what I read on blurbusters.com, the lowest input lag methods were in-game caps or RTSS. The nvidia one introduced a little input lag. Maybe that's changed since, but RTSS is just as easy to set up (and has some other optional features besides) so I wouldn't bother switching to nvidia's method.
EDIT: it has indeed changed apparently (starting with a driver released a year ago, 1-6-2020)... so it's a matter of preference as long as you set it up properly. I'm used to using RTSS and it has some other neat features besides, so I'll probably not bother switching to the nvidia method. An in-game limiter usually has the lowest input lag, but that's not always made available in every game.
I wanted to ask you about this. I used RTSS a long time ago, but haven't had it installed in quite some time. A couple of days ago, I upgraded to the newest version of MSI Afterburner and ticked the box to install RTSS. You mention some other neat/optional features in there. Besides the frame limiter, is there anything else that I should be using/enabling/changing in RTSS to maximize optimal enjoyment of my GPU/panel?

Thanks in advance.
 

I mostly use it as a frame rate limiter, but it has good text-based readout overlays. On an OLED I'd normally only use them with a toggle hotkey - peek at them for a bit and toggle them back hidden, for obvious reasons - but with multiple monitors there are ways to place RTSS readouts and graphs on secondary screens while showing data from the primary screen. These overlays could potentially trigger some online games' anti-cheat code though, so I'd only use them in single-player games unless I was sure they worked with, or were whitelisted by, a particular game's anti-cheat.

https://forums.guru3d.com/threads/multi-monitor-support-for-osd.432963/






There are plenty of other programs (aida64, hwinfo64...) that can monitor hardware too, and rainmeter skins and things like that which plug into them.




It has some other capabilities but there are other apps and suites that duplicate most of them:
-overclock gpu
-set up custom fan speeds
-screenshots/capture game recordings
-hardware monitoring (already mentioned this, real-time but can bench/graph)
 
That overwatch post is over complicating things I think...

Rollback netcode was invented by John Carmack in QuakeWorld. If you remember the pushlatency command, this basically set how far in the past your client was compared to the server state. E.g. /pushlatency -200 meant you were playing the game 200ms in the past.

Games like Overwatch hardcode this value, I think it's 100 for Overwatch.

That means every client is playing 100ms in the past with respect to the authoritative server game state. The server takes care of handling conflicts between clients, and sending the authoritative state to everyone. If there is an inconsistency, the client performs a rollback. The vast majority of the time there is never a rollback - human reaction time and physics engines pretty much dictate this (if you are moving in Overwatch, it takes more than 100ms to stop your momentum generally). So the client is basically always playing with 0 latency, as long as its physical latency is less than the "pushlatency" amount. I don't think I ever experienced a noticeable rollback in Overwatch, it's very rare.
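A heavily simplified, hypothetical sketch of that predict-then-rollback loop (the "game state" here is just a number; a real game snapshots and re-simulates the whole world):

```python
# Sketch of client-side prediction with rollback. The client simulates ahead with its
# own inputs; if an authoritative server state ever disagrees, it rewinds to that
# state and re-applies the inputs the server hasn't confirmed yet.

def simulate(state: float, user_input: float) -> float:
    """One fixed tick of local simulation (placeholder physics)."""
    return state + user_input

class PredictingClient:
    def __init__(self):
        self.state = 0.0
        self.pending = []  # (tick, input) pairs the server hasn't acknowledged

    def local_tick(self, tick, user_input):
        # Predict immediately, so the player feels zero input latency.
        self.pending.append((tick, user_input))
        self.state = simulate(self.state, user_input)

    def on_server_state(self, server_tick, server_state):
        # Drop inputs the server has already applied.
        self.pending = [(t, i) for (t, i) in self.pending if t > server_tick]
        predicted = self.state
        # Rollback: adopt the authoritative state, then re-apply unconfirmed inputs.
        state = server_state
        for _, user_input in self.pending:
            state = simulate(state, user_input)
        self.state = state
        # Usually predicted == state, so no visible correction ever happens.
        return abs(predicted - state) > 1e-6
```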

Rollback netcode is one of the greatest things ever invented for gamers - all hail Carmack for blessing us mere mortals.

You are right that 8ms in the grand scheme of things is going to be very hard to notice. I am skeptical of arguments about human reaction time though, because what people don't take into consideration is muscle memory. Our minds, once you practice enough, are very good at essentially hitting timing windows by rote. Think of a high-level piano player: they aren't hitting notes with 200ms delay. They are starting their movements in anticipation, ahead of when a note is supposed to come out. Same with competitive gamers, in both CSGO and fighting games - you can basically hit frame-perfect timings (16ms windows) with enough practice. The problem with input delay on the screen is that it can throw off your muscle memory substantially. If your screen suddenly adds 30ms of delay, all your muscle memory is messed up. I think you can get used to it, but this is usually what people mean when they say they can "feel delay" on their screens. So in that respect, if you can simply minimize your input delay so it's not noticeable, then you can use many different screens with similar results - not throwing away 5 years of CSGO experience because your input delay on the screen sucks.
 
I mostly use it as a frame rate limiter but it has good text based readout overlays. ...

Thanks! I just wanted to make sure I wasn't missing anything really noteworthy. I do normally use other apps for those things, but good to know regardless.
 
I'm not saying it's a bad system considering what we have to work with. I'm saying it's gapped, with some elasticity, and that by comparison to all of those numbers a +/- .009 second difference across all the solid rates and variables is not likely to count in your favor or against you.


"Yes, it's marketing. The mean for human reaction times is 250-270ms, the website you linked has a stats page where it shows their mean of all tests is 273ms. They also used to have a graph showing the distribution of results, it looked like this. You can see that 150ms is pretty much the top 1%. I don't think the "average gamers" reside in the top 1%. "


"The average reaction time for humans is 0.25 seconds to a visual stimulus, 0.17 for an audio stimulus, and 0.15 seconds for a touch stimulus.". So yes, the reaction time is indeed 150ms... for touch stimulus... "
--------------------------------------------------


https://humanbenchmark.com/tests/reactiontime

In addition to measuring your reaction time, this test is affected by the latency of your computer and monitor. Using a fast computer and low latency / high framerate monitor will improve your score.

Scores in this test are faster than the aim trainer test, because you can react instantly without moving the cursor.

This is discussed in further detail on the statistics page. While an average human reaction time may fall between 200-250ms, your computer could be adding 10-50ms on top. Some modern TVs add as much as 150ms!

If you want, you can keep track of your scores, and see your full history of reaction times.
Just perform at least 5 clicks and then save.

-------------------------------------------------


https://blurbusters.com/human-reflex-input-lag-and-the-limits-of-human-reaction-time/2/

To see a classic example of a simple visual reaction task, try your hand at the human benchmark reaction time test. One of the cool things about that site is that you can see some of the statistics. The all time average (median) is around 270 ms, but this includes input lag. In particular, it includes the delay between the moment the program instructs the display to change color (which is presumably when the timer starts), and the moment the pixels change color on your display. It also includes the delay between the moment you press the mouse button, and the moment the program receives the signal from your button press (which is when the timer stops).

If you look at the distribution, you can see that some people are performing at around 150 ms. These are probably younger folks who have excellent reflexes and are on good hardware (it’s also possible that some people are using clever methods to cheat), but 150 ms does seem to be in the ballpark of the limits of human reaction time to a visual stimulus (at least when it comes to pressing buttons with a finger), although there may be a few people who can push this limit lower.
We also respond faster to acoustic stimuli than visual stimuli. Here’s a great explanation from reddit. Basically, the idea is that converting photons to neural signals takes longer than it does to convert pressure waves to neural signals. Because of this, acoustic reaction times are around 30 ms faster.

In many of these experiments, the way the reaction is actually measured can have an impact on the final result. For example, in the humanbenchmark test, there is a slight delay between the moment your finger actually starts to move, and the moment the button actually "clicks". And depending upon the USB polling rate of the mouse, there could be as much as an 8 ms delay between the moment the button clicks, and the moment the signal from the mouse is registered.

To get a more accurate picture of reaction times, many labs use specialized equipment to get a more precise idea of when the finger (or other body part) starts to actually move. For example, passive optical tracking systems (where retroreflective markers are placed on the target object, such as a finger) or tiny inertial sensors attached to the object are two ways to measure the position of an object across time. Some studies use surface electromyography (EMG) to measure the electrical activity in the muscles themselves.
Surface EMG is an accurate way to measure reflex, but it doesn’t take into account the time between the moment the muscles activate and the moment that force is actually produced across the joint in question (this time is called the electromechanical delay). So while measuring EMG reaction times is a great way to get rid of the “noise” involved in measuring things like button clicks, it doesn’t give us a true picture of how long it takes someone to produce a useful reaction—that is, a reaction that actually allows us to produce a physical response to the environment around us.

One of the posters on the Blur Busters forums (‘flood’, who is also responsible for designing this gem) has reported human benchmark scores of around 140 ms, and can regularly get scores below 160 ms. Here is a video of him sniping bots in CSGO.
 
Ho
Lee
Shit. Are you kidding me, LG? I have a lowly Nano81(I love it!) And I was looking in the settings and noticed next to Ethernet it said (100Mbps)
Hm? I look in my router's settings, set the port to gig, and LG still says 100mbit? Searching around, I come to find out that my $800 TV, purchased in the year two thousand and twenty, has a "fast ethernet" port on it. Are you fucking serious?
Search this thread and the $1,500, 48" LG CX also does not come with gig ethernet? WTF. Why even put the port on if you're that cheap? The first hit on duckduckgo was someone saying they can't actually stream lossless bluray via plex because it hits 125Mbps.
Just a baffling and appallingly cheap move by LG. But hey, we have baked in ads on our TV to "help make their products better" so I guess that makes up for it?

How much more expensive is a gig ethernet jack than a fast ethernet one? It wouldn't have surprised me to learn that 100mbit jacks are actually more expensive due to them being out of production for the last ten years.

Also, if you're plugged in to ethernet and didn't disable the wireless NIC in the TV, it defaults to WiFi (at least if your WAP supports 5GHz).
 
Slightly off-topic but since this appears to be the dedicated OLED thread:

Do we know if the 2019 C9 series are also getting all these recent major fixes? (VRR stutter bug, elevated black fix, etc.) Seems like all the news on fixes is just for the CX series. I haven't gotten my 3090 yet (freaking stock man...), and wondering if I'm going to be left out in the cold unless I get a CX or just wait for a C1. I would browse the AVS forums, but there's like 10,000 pages worth of replies....hard to find anything. My firmware version is 05.00.03 (North America - USA model). Sorry, LG doesn't seem to log a good firmware to firmware version history or anything. Thanks!
 
I wanted to ask you about this. I used RTSS a long time ago, but haven't had it installed in quite some time. A couple of days ago, I upgraded to the newest version of MSI Afterburner and ticked the box to install RTSS. You mention some other neat/optional features in there. Besides the frame limiter, is there anything else that I should be using/enabling/changing in RTSS to maximize optimal enjoyment of my GPU/panel?

Thanks in advance.

Scanline-sync is an option that is pretty crucial if you're trying to use BFI.
 
From what I've read you have to turn on Instant Game Response in the TV OSD in order for "G-sync" to show up in the nvidia drivers in Windows. So I think LG Instant Game Response = HDMI 2.1 ALLM + PC HDMI 2.1 VRR / G-sync. So it should work with your 3070 with "G-sync" enabled.
Thanks. So you are saying that with game response enabled, G-sync enabled then I should be able to use any of the video modes (for example Cinema) with the lowest input lag? Or would I still have to use the "game" video mode?

Also oddly enough in Nvidia control panel with their newest driver it's now saying my LG isn't validated as G-sync compatible, where on the previous driver I didn't have a message like that. It still allows you to check off to use the display as G-sync though, which I did check on.
 
Just try it. It should be extremely apparent moving the mouse on the desktop if there's increased input latency or not.
 

I keep my wifi turned off on the tv ever since I set it up. I just now swapped a usb3 to ethernet adapter I keep in my backpack to the usb3 port on the tv. Both the "Wired Connection (Ethernet)" and the "WiFi connection" in the TV OSD show as "not connected now". I just launched plex in WebOS and it loaded a video file so it seems to be working. I haven't done any bandwidth testing though.

Using this:
https://www.amazon.com/Plugable-Ethernet-Gigabit-10-100-1000-Compatible/dp/B00AQM8586

Some of the reviews say it reduces the speed on macs but overall the reviews from linux and PC are positive so I'll have to see how it goes on the TV.

On Speedtest by Ookla I got
22 Mbps down, 240 Mbps up on a single test.
That's using the WebOS web browser. The web page seemed a little clunky though, and it's the first time I've used the web browser on the tv.

On speakeasy using the same WebOS internet browser I got
36 Mbps down, 246 Mbps up.
4.5 MB/s down, 30.7 MB/s up.

edit:
I tested again on a different server and got
48.2 Mbps down ~> 6 MB/s


On my pc I get 10x faster or more down on fios gigabit, which is around equal to the higher upload speed results I posted above.
On steam and some other services I can usually get higher.

The point here is the same server test on the TV using my usb3.0 ethernet adapter is 10x slower down for some reason but the uploads seem fine.
I haven't run any LAN speed tests and I didn't test the built-in ethernet port before I switched, to compare. On paper, if the built-in ethernet port is 100mbit it would have a theoretical max of 12.5 MB/sec (maybe less real world).

So these download speeds are very poor (6 MB/sec) on my usb3 adapter, but the upload speeds are 31 MB/s compared to the theoretical max of 100mbit being 12.5 MB/sec.
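For reference, the conversion behind those numbers is just dividing the line rate by 8 (ignoring protocol overhead):

```python
# Megabits per second -> megabytes per second (8 bits per byte, overhead ignored).
def mbps_to_megabytes_per_sec(mbps: float) -> float:
    return mbps / 8

print(mbps_to_megabytes_per_sec(48.2))   # ~6.0 MB/s  (the slow download result)
print(mbps_to_megabytes_per_sec(246))    # ~30.8 MB/s (the upload result)
print(mbps_to_megabytes_per_sec(100))    # 12.5 MB/s  (theoretical max of a 100mbit port)
```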


edit: with the TV's wifi on (5GHz, right next to the router) I got
47 Mbps down, 219 Mbps up, which is about the same download speed as the best I got on the usb3.0 ethernet adapter, so maybe there is some bottleneck in the TV itself.

I might have to look into getting a 2019 Shield with its gigabit ethernet port eventually.
 

I'm still running 1080ti sc hybrids in sli until I get the 3090 I want so I'll leave that up to 3000 series owners on hdmi 2.1 to play with for now. :rolleyes:
 
Windows HDR is detected as BT.2020 fine for me.

Some people were claiming something about washed out colors unless BT.2020 was forced in the HDMI signaling override. Question is, what if those "washed out colors" was actually the accurate image while forcing BT.2020 was just giving major oversaturation that was more pleasing to the eye?
 
I've also noticed that text looks fairly crappy on my CX as well, even after playing with ClearType or trying to just turn it off. My issue is that letters, especially lighter colored letters, have a lot of ghosting and halo around them.
Do we have any thread/info discussing settings for optimal text clarity?
I could live with one setting for my daily office use and one that I apply for the free/fun time.
 
Some people were claiming something about washed out colors unless BT.2020 was forced in the HDMI signaling override. Question is, what if those "washed out colors" was actually the accurate image while forcing BT.2020 was just giving major oversaturation that was more pleasing to the eye?
No it was clearly handling the color space wrong and that caused washed out colors. It was fixed in a firmware update and now just works. If I remember correctly it did not even do this all the time, just occasionally.

At this point on the latest firmware you don't need to access that menu.
 
Honestly at this point it gets hard for me to say what is optimal. On MacOS it's easy, just turn off font smoothing and everything looks great. On Windows it's trickier.

I have been running with the grayscale font smoothing setting for a good while but for testing I just turned RGB back on and adjusted the contrast to 1600 in Better Cleartype Tuner and might actually prefer how that looks. I don't know if LG has changed something in the processing on the TV because I before felt RGB was worse.

I suggest playing around with the Better Cleartype Tuner settings and settling for whatever looks good to you. Turning Cleartype off is not an option IMO as it looks very jagged.

The higher scaling level you use, the better it will handle things. I run mine at 125% in Windows, 120% on MacOS.

You need to have PC mode on (Home dashboard, set icon of input to PC, name can be anything) and make sure the display is running at 8 or 10 bit RGB. Anything else will run with chroma subsampling.
 

Wow, I didn't even know of this app - with your settings text looks really nice. My only complaint is that with the CX (I have a feeling it's all OLEDs) I still get quite a bit of ghosting.
 
What do you mean by ghosting? What kind of settings are you running on the TV? Are you using DPI scaling in Windows? What apps are you using to compare text quality? Are you comparing with another monitor?

Chrome loves to ignore text rendering settings for example while Firefox doesn't, there's also a difference in more legacy GDI apps vs more modern stuff.
 
Just a reminder since the text quality question often comes up. With a screen this size your text quality can be greatly impacted by your view distance.
That pic he took is from very close to the screen though, too. How far away is he sitting? Sitting too close in relation to the ppi = a lower PPD, so it will always look bad. (Using text sizes too small in relation to the ppi will also look bad.)


...27" 3840x2160 at 2.0' away is 73.5 ppd (and 53 deg horizontal viewing angle)
...48" 3840x2160 at at 42.5" away is 73.5 ppd

...27" 3840x2160 is 57.9 ppd at 18" (1.5') away
...48" 3840x2160 is 57.9 ppd at 32" away


The same goes for sitting too near at a desk to a 31.5" 1440p screen like the VA I upgraded from, and it's why VR resolutions are still an issue currently (even though they use lenses to change the focal point).
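For anyone who wants to check their own setup, here's a small sketch of the PPD math behind those numbers (flat-screen approximation, 16:9 aspect assumed):

```python
import math

# Approximate pixels-per-degree and horizontal viewing angle for a flat 16:9 screen.
def ppd(diagonal_in: float, horiz_res: int, distance_in: float):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # horizontal width of a 16:9 panel
    angle_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return horiz_res / angle_deg, angle_deg

print(ppd(48, 3840, 42.5))   # ~73 ppd at ~52 deg  (the 48" example above)
print(ppd(27, 3840, 24))     # ~73 ppd              (27" 4K at 2 feet)
print(ppd(48, 3840, 32))     # ~58 ppd, noticeably coarser
```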
 
What do you mean by ghosting? What kind of settings are you running on the TV? Are you using DPI scaling in Windows? What apps are you using to compare text quality? Are you comparing with another monitor?

Chrome loves to ignore text rendering settings for example while Firefox doesn't, there's also a difference in more legacy GDI apps vs more modern stuff.

White text seems to be slightly doubled and there is a very slight black halo around the text. I don't think I'm describing it well at all, though - it's almost like the letters are out of focus, yet they aren't, and the letters themselves are very sharp. This happens to varying degrees depending on how light the text is, with white being the worst. I've played around with everything: OLED backlight, contrast, brightness, etc. I am using DPI scaling, currently at 200%, but I've played with different levels. The main apps I'm using are browsers, mainly MS Edge (Chromium), but after your comment I did try Firefox and didn't see any difference. Nvidia settings 4k/120hz, 10bit RGB/full. HDR is on as I'm too lazy to turn it on/off every single time I play a game. Game response is on, as is HDMI Ultra. I mainly use Vivid for desktop, but if I switch to Game it still looks the same.

The text looks really good though, no jaggies, especially after putting it on 1600 RGB contrast. This is in comparison to my old LG LED, as well as the Q80T I had for a few days - that's why I was assuming it was something inherent to OLED. I do sit pretty close to it, but even if I am back a couple of feet I still see this. I double checked with a few friends/family to make sure it wasn't my eyes.
 
imo 38" to 41" minimum for this panel which is over 3' to 3.5', not 2'... or you are going to be making some compromises.
... THX sweet spot of 50deg wide viewing angle for movie viewing starts at 44.4" away for the 48 inch diagonal screen.

THX's standard optimal viewing angle for movies is 45 to 50 degrees of width. On a 48" 16:9, 50deg starts at ~ 44.4" screen to eyeballs. This results in ~ 76 PPD.

e.g. 54 degree viewing angle and the same ~71 to 72 PPD at 4k on all of them at those distances would be:

48" 16:9 display = 41" view distance ( ~ 3.5 ' )
42" 16:9 display = 35.9" view distance ( ~ 3' )
31.5" 16:9 display = 27" view distance ( ~ 2.4')
27" 16:9 display = 23" view distance ( ~ 2')
I'll also add... don't sit too close to it. I had a 32" 1440p and the text and anything else would show edges if I tried to sit 1' to 1.5' away. When I sat at a more appropriate distance for its ppi, its PPD and the default font sizes looked great. Also in the same vein, don't try to make the fonts smaller than the ppi's limitations just because you are sitting too close, as that will be worse too.

THX's standard optimal viewing angle for movies is 45 to 50 degrees of width. On a 48" 16:9, 50deg starts at ~ 44.4" screen to eyeballs. This results in ~ 76 PPD.


That's for movies though, to see the whole scene. It's a good baseline but I sit closer for some games at 38" to 41" away. I wouldn't bother sitting closer than that (and then complain about the ppi and font edges).

The PPD at that THX 50 degree angle is the same as what the ppi would look like at all of these monitor sizes at the distances below. For general usage and gaming I would sit closer (putting a little more of the screen's extents into the periphery), but this gives some idea of the distance ratio between all of these screen sizes, PPD-wise, at ~ 76 PPD.



You can adjust that tool to compare nearer.
e.g. 54 degree viewing angle and the same ~71 to 72 PPD at 4k on all of them at those distances would be:

48" 16:9 display = 41" view distance ( ~ 3.5 ' )
42" 16:9 display = 35.9" view distance ( ~ 3' )
31.5" 16:9 display = 27" view distance ( ~ 2.4')
27" 16:9 display = 23" view distance ( ~ 2')
 
What do you mean by ghosting?
I'll give some examples using tools we all have access to: Nvidia control panel, Task Manager, and Windows Explorer:

* I am using 100% scaling at the moment

* Ghosting for my screen is like a horizontally shifted echo of the text, shifted to the left (in my case). It looks like the echoes/halos you see when they test IPS/VA panels with moving objects to measure pixel lag / refresh times

* On small text elements that leads to echoes overlapping with the next letters and that makes it hard to pick up visually

* Have a special look at the photo of the Nvidia control panel: the blue header called "Ultra-HD, HD, SD" has a strong echo, whereas the fonts for the resolutions are crystal clear. So physically IT IS POSSIBLE to have perfect font sharpness, but the rendering in many places seems to generate fonts that are displayed with echoes.

I tried changing to no avail:

* 120HZ -> 60Hz
* Gsync on/off
* RGB/YCbCr444
* 10/8 bit

* Sharpness to "0" on TV and all other options tried with on/off setting
 

Attachments: 2021-01-31_19h15_06.png, 2021-01-31_19h18_35.png, 2021-01-31_19h16_54.png (screenshots)
imo 38" to 48" for this panel which is over 3' to 4'.
assuming it was something inherent to OLED.
OLED does have an extra clear subpixel that shows white through it... WRGB = White+RGB. This boosts effective brightness (to conserve OLED lifespan and guard against burn-in) but sacrifices accuracy as you get farther away from SDR range.

Personally I'd sit farther away in the first place, at a PPD more comparable to a traditional desktop monitor as I listed above. See how that looks to you. Other than that I would use dark themes wherever possible: Windows themes and browser themes, a color changer addon for Firefox and Chrome (can change text and/or background color and intensity on the fly and it remembers it per site), Turn Off the Lights for Firefox and Chrome (brightness slider per page), dark themes for all apps where possible. However, in my opinion the best usage scenario is using a side monitor and keeping the OLED as a media stage for games and movies, videos, streams, etc.


I set Turn Off the Lights to "automatically dim the background on the following sites" so it affects all sites at the default dimming level I set in the general settings. I then build a whitelist for sites like hardforum that are fine without it.


I also:
...always show the dimness level bar when the lights go out (under Advanced -> dimness level bar). You can click to toggle the dimness bar on a page, "minimizing" it to a square.
...set Turn Off the Lights to "Enable clicking hyperlinks when the screen is darkened" in Advanced -> mouse settings. Then you can operate on the page while it is dimmed without breaking out of the dimness overlay.
 
I suggest playing around with the Better Cleartype Tuner settings and settling for whatever looks good to you. Turning Cleartype off is not an option IMO as it looks very jagged.

Wow, that was an awesome suggestion! Font quality has noticeably improved. I chose Greyscale and the font "Georgia".
 
The font choice has no effect on anything but the preview, though. It's best to look at how text renders in real apps. Some will change just by toggling the Better ClearType Tuner options, others need to be restarted for the change to take effect. Some might require logging out and back in.
 
Best thing for fonts is to set scaling to 150% minimum. 100% just looks bad no matter what in my opinion. Cleartype really needs the extra pixels to work its magic.
 
Best thing for fonts is to set scaling to 150% minimum. 100% just looks bad no matter what in my opinion. Cleartype really needs the extra pixels to work its magic.
For 150% I would want quite a bit more viewing distance. I feel 120-125% is about ideal for 1m viewing distance. I feel that is enough to improve the font rendering too.
 
Try increasing viewing distance. I tried looking at the same apps and really had to stick my face in the screen to see a slight red tint on the text. For me it does not look anywhere near as bad, so try resetting ClearType settings from the Windows built-in ClearType tuner.
 
Sharpness set to 0 on this TV is soft. 20 is the neutral setting, if I remember correctly. This is also showing the issue with the fourth white subpixel we have been talking about. You can address the text issue by using ClearType.
 
Is there a source for this? I haven't seen it anywhere else.
 
I have seen Vincent Teoh and others saying 0 is neutral. To my eyes, in game mode + PC mode there is no difference between 0 and 10 anyway. It's like the sharpness setting is bypassed. Haven't tried much higher values, though.
 
I know it's that way on some VA TVs like my Samsung NU6900, which has neutral sharpness at 10 rather than zero.


One of the comments below says pretty much what you are saying and with some examples.


According to this page, the default sharpness on some LG OLEDs is set to 10, and Filmmaker Mode on the CX is at 10 sharpness - but bypassing processing with no edge enhancement = zero, for video watching as a more proper TV.

It might come down to a matter of preference for some people, though.

https://www.avsforum.com/threads/20...nd-user-settings-no-price-talk.3113174/page-3

FILMMAKERMODE should disable all processing which changes the picture, like edge enhancement.

No edge enhancement is sharpness 0 for LG TVs. 10 is the standard value for most modes but this is not the bypass setting without processing.

https://www.reddit.com/r/OLED/comments/hhnrga/so_is_proper_sharpness_10_or_0/

JDSP_:
There isn't a diff between 0 and 10.
My post on this the other day:
There is no diff between sharpening at 0 or 10 - you can make some pixel-perfect images and, as long as you have a macro lens, you can test it.
There isn't a single diff in the pixel output from 0 to 14. At 15, with a single-pixel line on a black background, the panel will start to turn the surrounding red and green subpixels on to increase the brightness of that pixel, thus making more contrast and for it to appear sharper (however, as it's a simple filter, it just highlights exactly what it's doing).
(Open the images in a new tab and switch between them to easily see the diff, or in this case no diff.)

exodus_cl:
You're the best judge of that, but if you want an opinion: PC should be 0, everything else 10.

EyeWasAbducted:
0 for 4K content, 10 for HD. I leave mine at 0 for everything.
 
150% scaling is the only real solution for text sharpness. It's the best compromise.
I have seen Vincent Teoh and others saying 0 is neutral. To my eyes, in game mode + PC mode there is no difference between 0 and 10 anyway. It's like the sharpness setting is bypassed. Haven't tried much higher values, though.
At native resolution (3840x2160), Sharpness 10 seems to do nothing, while higher values do sharpen the image. At 1080p, it's pretty clear Sharpness 10 performs anti-aliasing and Sharpness 0 is neutral. This video demonstrates it conclusively.
 
Ah yes, thanks for reminding me of that - the sharpness setting at 10 does do something at 1080p specifically. Won't matter to most people here I imagine, but good to keep in mind nonetheless.
 
It always amazes me that we always see the flaws - or should I say look for them - rather than just enjoying this monitor for what it is: an absolute beast of a display. For anyone on the fence, just buy it; you will not be disappointed. If this monitor does not work for you, or you are not completely blown away, then maybe your negativity is something you should look at. Enjoy it for what it is, not for what it is not. You can always return it. 🙂
 
Yeah, that bottom reddit quote said as much too if you expanded it, but that video verifies it.

EyeWasAbducted:
0 for 4K content, 10 for HD. I leave mine at 0 for everything.

From TFTCentral review:

Within the chosen preset mode you're using for the PC connection (e.g. the 'Expert (dark room)' mode as recommended in our calibration section below), change the sharpness control to 0 - you don't need the screen to artificially sharpen the image here, as the PC provides the relevant resolution input at 3840 x 2160. If you want to sharpen movies etc. you are probably better off doing that with the software player on the PC than using the screen's built-in sharpening system.
 
It always amazes me that we always see the flaws - or should I say look for them - rather than just enjoying this monitor for what it is: an absolute beast of a display. For anyone on the fence, just buy it; you will not be disappointed. If this monitor does not work for you, or you are not completely blown away, then maybe your negativity is something you should look at. Enjoy it for what it is, not for what it is not. You can always return it. 🙂

Some convos were complaining, especially before the firmware fixes which fixed a number of big issues. However a lot of the conversations are about figuring out how to get the most out of what the display can do - via settings and configurations. Tuning it up and hearing other people's usage scenarios and layouts in a help group is part of what it's all about for me. Also learning more about how the technologies work, including their limitations in any given model or generation.

I'm extremely happy with this TV. It's my first large OLED - my others are a phone and an 8.5" tablet. It's also my first real HDR display, and it's stunning in HDR combined with the "infinite" black depth per pixel.

That's even without a 3000 series gpu yet, unfortunately, but I don't regret my purchase even if I have to wait until the next gen of TVs is out to get a gpu. At this rate it wouldn't surprise me if it's just an ongoing thing and I'm still without a gpu when they are feeding 3080 Tis to scalpers.

I actually had a 3090 not just in my cart but in a fully processed order on Newegg, with an order number and a "congratulations" message informing me that there might be a 2 - 3 day delay due to covid issues etc. Then later that night I got an email notification that the order was canceled due to insufficient stock. So I think there is a serious issue, or a scam, going on with ordering gpus, given the amount of money scalpers can get.
 