LG 48CX

So Rtings retested the CX with an HDMI 2.1 signal recently.

The input lag for 4K VRR is 13.9ms?

I find that a bit high compared to what I was expecting.
On an FALD, ~13 ms seems to be the typical minimum latency at 4K 120 Hz for HDR + VRR. It needs one frame to process the HDR luminance for the FALD with backlight response set to the fastest mode. That shouldn't be required for OLED, but there may be a hardware limitation in VRR mode.
 
If there isn't some additional variable involved that you aren't aware of, then same TV, same cable type, different GPUs points to something wrong in the GPUs or Nvidia drivers, not the TV. Both GPUs are doing HDMI VRR.
Are you sure? I thought only the 3000 series could do HDMI VRR since it requires HDMI 2.1

EDIT: HDMI VRR with HDMI 2.1 supports 4K@120, but HDMI 2.0 supports it at 60. Is that correct?
 
So Rtings retested the CX with an HDMI 2.1 signal recently.

The input lag for 4K VRR is 13.9ms?

I find that a bit high compared to what I was expecting.

It's REALLY not going to matter unless you are a top 100 level competitive player or something, in which case you will not even be using a CX as your gaming display to begin with but probably some 1080p 240Hz one instead. I can guarantee 99% of people on this forum won't benefit from the CX having 1ms of input lag vs the measured ~10-13ms from various sites. Sure, the lower number is better on paper and looks more appealing, but in the real world it makes a negligible difference. I can still frag out pretty hard in a fast paced shooter like COD on my CX even with its "high" input lag.

 
It's strange, but I tested a few games and there was stuttering using VRR; then I enabled vsync and the stuttering was gone in all of them.
In Apex Legends with vsync I got capped at 115 fps max, no issues.
In Horizon Zero Dawn I got stuttering in the benchmark even at 80 fps; turned on vsync and it was gone.
 
It's REALLY not going to matter unless you are a top 100 level competitive player or something, in which case you will not even be using a CX as your gaming display to begin with but probably some 1080p 240Hz one instead. I can guarantee 99% of people on this forum won't benefit from the CX having 1ms of input lag vs the measured ~10-13ms from various sites. Sure, the lower number is better on paper and looks more appealing, but in the real world it makes a negligible difference. I can still frag out pretty hard in a fast paced shooter like COD on my CX even with its "high" input lag.

I actually think I may be in that small % that can notice a difference of 7ms.
 
And Vincent measured 7ms at 4K 120Hz. But yeah, Rtings and TFT Central measured around 13.

Make of that what you will but yes I play fast paced online shooters and have zero issues. I could measure it with my 240fps camera (which means +/-4ms) at some point but I really haven't felt compelled to since I'm performing at least as good if not better than on my gaming LCDs with 1-2ms input lag.
 
And Vincent measured 7ms at 4K 120Hz. But yeah, Rtings and TFT Central measured around 13.

Make of that what you will but yes I play fast paced online shooters and have zero issues. I could measure it with my 240fps camera (which means +/-4ms) at some point but I really haven't felt compelled to since I'm performing at least as good if not better than on my gaming LCDs with 1-2ms input lag.

Rtings update puts 4K 120 Hz at about 7ms but at 13 when VRR is enabled, which is odd when both 1080p and 1440p have even lower input lag with VRR than without it. Might be a bug or something specific to how the display handles 4K.

I've played all through Doom Eternal (as an example of a game with the kind of action you would see in MP shooters too) and never had any issues as long as Game mode was used. Using any other mode with Instant game response enabled increased input lag a bit, just enough to feel a difference but even that would not have been a problem for having a good gaming experience. I should test it again with the latest updates.

To me the input lag on the LG OLEDs is low enough to be a complete non-issue for any gaming.
 
If you go back and look at my other replies you'll see the comparatively huge delays in world action states sent and received when playing online games. If you are within a frame locally and can't feel the input lag in single player games, you are not going to get any competitive edge with higher Hz from a world action state delivery point playing online games. It's literally many frames before you get the next update on the best tick servers, and most are worse ticks. Maybe if you were playing all LAN players on the same LAN with a fast tick server, but otherwise it's really moot competitive-wise other than aesthetics. I suppose you could argue a little more blur reduction could help slightly at higher Hz as long as you are supplying enough fps to fill those Hz, which most people and games aren't... but the marketing is not focusing on that and is acting like you are going to be served online game frames/ticks at "360hz"... hah. It's hard to blame people much for believing it when that's what they are being told in advertisements.
If you run 120fps solid (not average) you'd see new game world data on a local game every 8.3ms.
Or 8.55ms at 117fps capped.
However, most people on a 4K screen will be using VRR with lower bottom-end frame rate ranges to get better graphics settings (outside of a few very high frame rate and/or older games), so they would be seeing per-frame times ranging from 14ms (70fps) to 8.5ms (117fps) at a 90 or 100fpsHz average, if they can even hit that average.
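Just to show the arithmetic behind those per-frame numbers, here's a quick sketch (plain Python, frame time is just 1000 / fps; the fps values are the examples from above, nothing CX-specific):

```python
# Frame time is just 1000 / fps. The fps values below are the examples from this
# post (70fps low end, 117fps capped, 120fps solid).

def frame_time_ms(fps: float) -> float:
    """Milliseconds between new locally rendered frames at a solid (not average) rate."""
    return 1000.0 / fps

for fps in (70, 90, 100, 117, 120):
    print(f"{fps:>3} fps solid -> {frame_time_ms(fps):.2f} ms per frame")

# 120 fps -> 8.33 ms, 117 fps -> 8.55 ms, 70 fps -> 14.29 ms, which is where the
# ~8.3 ms / ~8.5 ms / ~14 ms figures above come from.
```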

Online games aren't sending you new world/action states to react to for a much longer time, based on tick-rate related delays (15ms and usually much longer for most games) combined with the ping time of every player (usually 20ms - 40ms when possible). You don't want a long reaction time locally after the server state is sent back to you, and you don't want to feel like your lip-sync is off so to speak, but considering the whole chain (time to send your action to the server, server calculation, interpolation, time to send back) and the margins on both ends between multiple players, I think there's a limit to how small in ms it really matters after a point - for online games in particular, since you can't really react to what hasn't processed yet or what you haven't seen happen yet, and neither can anyone else playing on the server.

So locally, you can't see new world/action states (new unique pages in an animation flip book) more often than every 8.5ms at 117fps on a 120Hz monitor.
Then in an online game the whole loop of your action, the server processing it, and sending that new server action state back to you (including everyone else's actions, ping+interpolation factored) is magnitudes longer.
For example, a well-served game on 128 tick servers using interp ratio 2 (to avoid huge 250ms hits on missed packets) would have 15.6ms interpolation + 25ms to 40ms (your ping). So say 41ms to 56ms just for your own actions, not counting lag compensation between other players. Let's say 56ms for now on the higher 128 tick servers (though most games run much longer ticks). 56ms is 6.6 frames of time on a 120Hz monitor at 117fps solid (8.5ms per frame). So you aren't seeing new world updates for 6 or 7 frames, maybe worse in relation to syncing with your next local (8.5ms) frame draw.

On a more traditional 64 tick, 22 tick, or 12 tick online game the numbers go up by a lot:

128 tick at interp_2 = 15.6ms + (25 - 40ms) ping = 41ms to 56ms ~~~~~~~~~> 5 to 7 frames before new action/world state data is shown (at 117fps solid)
64 tick at interp_2 = 31.2ms + (25 - 40ms) ping = 56ms to 71ms ~~~~~~~~> 7 to 8 frames
22 tick at interp_2 = 90ms + (20 - 40ms) ping = 110ms to 130ms ~~~~~~~~> 13 to 15 frames
12 tick at interp_2 = 166ms + (20 - 40ms) ping = 186ms to 206ms ~~~~~~~> 22 to 24 frames

If you set interp_1 then the interpolation time would be halved (shaving roughly 1, 2, 5, and 10 frames off those figures respectively) - but any lost packet at all would hit you with a 250ms delay, which at 8.5ms per frame is about 29 frames.
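If anyone wants to sanity check that table, here's a rough sketch that reproduces it. It assumes interpolation time = interp ratio / tick rate (per the article quoted further down), the same rough ping ranges I used above, and a solid 117fps locally:

```python
# Rough reproduction of the table above. Assumptions: interpolation time is
# interp_ratio / tickrate (per the article quoted further down), the ping ranges
# are the same rough guesses used above, and the local display runs a solid
# 117 fps (~8.5 ms per frame).

FRAME_MS = 1000.0 / 117  # ~8.55 ms per local frame

def server_state_delay_ms(tickrate: int, ping_ms: float, interp_ratio: int = 2) -> float:
    """Interpolation delay plus ping before a new server state reaches you."""
    interp_ms = interp_ratio / tickrate * 1000.0
    return interp_ms + ping_ms

for tick, (ping_lo, ping_hi) in [(128, (25, 40)), (64, (25, 40)),
                                 (22, (20, 40)), (12, (20, 40))]:
    lo = server_state_delay_ms(tick, ping_lo)
    hi = server_state_delay_ms(tick, ping_hi)
    print(f"{tick:>3} tick, interp_2: {lo:.0f}-{hi:.0f} ms "
          f"= roughly {lo / FRAME_MS:.0f} to {hi / FRAME_MS:.0f} local frames")
```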

Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)
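And a toy illustration of that formula, with made-up numbers roughly matching the 128 tick / interp_2 example above (this is just my sketch, not anyone's actual server code):

```python
# Toy illustration of the formula above. All numbers are made up, roughly matching
# the 128 tick / interp_2 example earlier in the post.

def command_execution_time(server_time_ms: float,
                           packet_latency_ms: float,
                           client_interp_ms: float) -> float:
    """Command Execution Time = Current Server Time - (Packet Latency + Client View Interpolation)"""
    return server_time_ms - (packet_latency_ms + client_interp_ms)

# Your "fire" packet arrives when the server clock reads 100,000 ms; 30 ms of
# packet latency and 15.6 ms of interpolation (128 tick, cl_interp_ratio 2).
t = command_execution_time(100_000.0, 30.0, 15.6)
print(f"The hit is evaluated against the world state at t = {t:.1f} ms")  # 99954.4
```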

For reference, tick rates of some common online games:

Valorant: 128tick
Specific paid matchmaking services like ESEA: 128tick
CSGO ("normal" servers): 64tick
Overwatch: 63
Fortnite: 60 (I think, used to be 20 or 30)
COD Modern Warfare mp lobbies: 22tick
COD Modern Warfare custom lobbies: 12tick


Some have guessed that League of Legends tick rate is around 30.
ESO PvE / PvP: ??
WoW PvE / PvP: ?? ... World of Warcraft processes spells at a lower "tick rate", so it's a bit more complicated, but overall the tick rate probably isn't that great.
https://us.forums.blizzard.com/en/wow/t/is-classic-getting-dedicated-physical-servers/167546/81




https://happygamer.com/modern-warfa...or-a-game-that-wants-to-be-competitive-50270/


Quoting my post again about tick rates in games for more detailed info and referenced links:

If you read through what I quoted about tick rates on servers, you'd see that you'll still likely be getting 15.6 ms of interpolation delay on a 128 tick server unless you had pristine ping and were willing to risk 250ms delays whenever your 2nd packet is lost. For most games' 64 tick, 22 tick, and 12 tick servers, your delay from the server would be much longer.

" Put into English this means that once you pull the trigger and this information package gets sent to the server, it then goes back from the current server time (the time the pulling the trigger package was received) by your ping plus your interpolation time. Only then it is determined if the client hit the shot or not. "

Then consider that everyone else in an online game is subject to the same lag compensation formulas.

I really think the response time argument around extreme Hz and mouse usage is overblown considering all of that, unless you are playing LAN games only (or single player local games) and at very high frame rates, rather than using VRR to ride a roller coaster of frame rates that are moderate or low in the middle and low end of a frame rate graph. I'm guessing most people buying an HDR-capable 4K OLED are buying it for some serious eye candy, not for playing a very high frame rate competitive game at low settings or one that is low-graphics by design. What I do agree with is that at very high frame rates on very high frame rate monitors, the sample-and-hold blur would be reduced (without suffering the tradeoffs of BFI). That is not only advantageous for image clarity while moving for targeting purposes, but also aesthetically.

Anyone testing would have to make sure that they are running 120fps (115 or 117fps capped) as a minimum, no averages, in order to get 8.3ms per frame at 120fps (or about 8.7ms at 115fps) for the upper limit of what the monitor can do. However, a more realistic test would be a 90 to 100fps average where people are using higher graphics settings on demanding games, relying on VRR to ride a roller coaster of frame rates.

-------------------------------------------
If someone goes overboard on graphics settings or has a modest GPU and cranks up the graphics at 4K resolution so that they are getting, say, a 75fps average, they would then be seeing frame durations somewhere in these ranges:

25ms / 16.6ms <<< 13.3ms >>> 11.1ms / 9.52ms

at 40fps / 60fps <<< 75fps >>> 90fps / 105fps

------------------------------------------
https://win.gg/news/4379/explaining-tick-rates-in-fps-games-difference-between-64-and-128-tick
  • CSGO official matchmaking: 64-tick
  • CSGO on FACEIT: 128-tick
  • CSGO on ESEA: 128-tick

Valorant tick rates:
  • Valorant official matchmaking: 128-tick

Call of Duty: Modern Warfare tick rates:
  • COD multiplayer lobbies: 22-tick
  • COD custom games: 12-tick
While that sounds fast, many CSGO players have monitors capable of running at 144Hz. In simple terms, the monitor can show a player 144 updates per second, but Valve's servers only give the computer 64 frames total in that time. This mismatch in the server's information getting to the computer and leaving the server can result in more than a few issues. These can include screen tearing, a feeling like the player is being shot when protected behind cover, and general lag effects.
---------------------------------------------------

You'd think that a tick of 128 would be 7.8ms and a tick of 64 would be 15.6ms, but it's not that simple... (see the quotes below)

----------------------------------------------------


http://team-dignitas.net/articles/b...-not-the-reason-why-you-just-missed-that-shot

interpolation. When your game client receives a package from the server, it doesn’t simply show you the updated game world right away. This would result in everyone breakdancing in 128 or 64 tick intervals across the map. Rather, it waits a set interpolation time called “lerp”, whose name probably originated by a network engineer stepping on a frog.

During this time, a set number of further packages arrived on the client’s side containing more updated ticks from the server. Through these ticks, the client is able to interpolate what has happened between these two points in time and display this assumption to the player (don’t get mad yet). Interpolation time is determined by the simple equation

cl_interp = cl_interp_ratio / cl_updaterate
So in our 128 tick server example from above, on otherwise default settings this would mean: You receive a new packet every 7.8 Milliseconds (cl_updaterate 128), but the server waits until you have received a third packet (cl_interp_ratio 2) before displaying the information, making the interpolation time 15.6 Milliseconds for this example. On the other hand, a client running cl_interp_ratio 1 is presented with a renewed state of the game every 7.8 Milliseconds – assuming all other hardware and software variables are optimal.



Of course, from everything we’ve learned in our long online gaming history we assume that a lower number in front of the ms sign is always preferable. But, you already guessed it, things aren’t so easy this time around as bad connections and lag compensation come into the picture.

Again, the people with unreliable connections are better off to accept higher interp times, as the game client requires a new package of information from the server precisely at the interpolation time to update your game. If the second package is lost, the client waits 250ms on another package before flashing that red warning message in the top right corner of the screen.


For someone who tends to experience any package loss pretty much ever, it is safer to set cl_interp_ratio to 2, especially since you regain the “lost” time in the lag compensation.

Lag Compensation


The inevitable conclusion from the preceding segment and also the fact that all players on the server have a ping is, that everything you see on your screen has happened on the server already a few Milliseconds in the past.


Let’s leave any philosophical and Einsteinian implications of this to the side for the moment to focus on how a playable game is produced from this situation in which you don’t have to pre-aim your crosshair in front of the enemy.


The process responsible for this is lag compensation in which the server accounts for both ping and interpolation timings through the formula:


Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)


Put into English this means that once you pull the trigger and this information package gets sent to the server, it then goes back from the current server time (the time the pulling the trigger package was received) by your ping plus your interpolation time. Only then it is determined if the client hit the shot or not.


Combining all of these factors - these tiny ms differences on the LG are really moot, especially if we're arguing latency for online (rather than LAN) gameplay and without using solid frame rates that never dip below the max Hz of the monitor.
 
Are you sure? I thought only the 3000 series could do HDMI VRR since it requires HDMI 2.1

EDIT: HDMI VRR with HDMI 2.1 supports 4K@120, but HDMI 2.0 supports it at 60. Is that correct?
I am sure. HDMI VRR is a feature that was finalized around the same time as HDMI 2.1, but nothing about it requires HDMI 2.1 bandwidth, and a signal that fits in HDMI 2.0 bandwidth (like 4K 120Hz 4:2:0) can have VRR just fine on HDMI 2.0. How do you think Turing cards did VRR on this TV and got it labeled G-Sync Compatible? Lots of us ran that way for months.
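Rough back-of-the-envelope on why that 4K 120Hz 4:2:0 signal fits, if anyone's curious. I'm assuming the standard CTA-861 4K120 timing (4400 x 2250 total pixels including blanking) and HDMI 2.0's 18 Gbps TMDS rate with 8b/10b coding, so treat this as a sketch rather than a spec quote:

```python
# Back-of-the-envelope check that 4K 120Hz 4:2:0 8-bit fits in HDMI 2.0 bandwidth.
# My assumptions, not spec text: CTA-861 4K120 timing with 4400 x 2250 total pixels
# (including blanking), 12 bits per pixel on average for 4:2:0 8-bit, and HDMI 2.0's
# 18 Gbps TMDS rate carrying 80% payload due to 8b/10b coding.

H_TOTAL, V_TOTAL, REFRESH_HZ = 4400, 2250, 120
BITS_PER_PIXEL_420_8BIT = 12          # 8-bit luma plus chroma shared across 4 pixels

pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH_HZ           # ~1.188 GHz
signal_gbps = pixel_clock_hz * BITS_PER_PIXEL_420_8BIT / 1e9
hdmi20_payload_gbps = 18.0 * 8 / 10                       # ~14.4 Gbps usable

print(f"4K120 4:2:0 8-bit: ~{signal_gbps:.2f} Gbps vs ~{hdmi20_payload_gbps:.1f} Gbps on HDMI 2.0")
# ~14.26 Gbps vs ~14.4 Gbps -- it just squeaks in, which is why Turing could do it.
```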
 
And some interesting stuff regarding the gamma shift: https://www.reddit.com/r/OLED_Gaming/comments/jnivu5/seeing_gamma_shift_might_not_get_sorted_for/

Which confirms that it is in practice negligible at 100-120fps, so it will be a near-perfect display once the stutter is fixed.

Yes, but those are frame rate averages, not solid frame rates. That means you can still drop 30fps below your average point of 100, plus hit a few deeper potholes, throughout the game.

For example, this graph of Far Cry New Dawn at 4K getting around a 92fps average. Just adjust the graph upward in your head as many steps as you need for 100 or 120: [farcry-3090-details.jpg]

Or scale this one upward from 75.5 fps average on Metro: Exodus....
[small_metro-rtx-3090.jpg]

Both graphs are from https://hothardware.com/reviews/nvidia-geforce-rtx-3090-bfgpu-review?page=4


If you adjust the Metro one's fps scale to show a ~100fps average on the same graph, just as a simple hypothetical, it looks like this below. Of course the graph could be different when actually running at that rate, but for the sake of argument let's assume it was the same:

[metro-rtx-3090_graph-edited-to-100fps-ave.jpg]


As you can see, you'd still be dropping into the 80fps range, with 60fps-and-below "fps potholes".

VRR will still blend these varying Hz smoothly, but what will the raised blacks be like? Hopefully you are right and the raised blacks won't be as obvious (or as grey) at these kinds of 100 - 110fps average rates where about 1/3 of the graph is below 100.

VRR is still very useful at a 100fps average, but a lot of people probably want to use VRR at more like a 75fpsHz average. So if you are "forced" to run a 100fpsHz average or better to avoid issues, it would be pretty limiting for a lot of people, especially at 4K resolution. On the top end, I wouldn't turn VRR off unless it was on a game with insanely high fps (considering fps potholes), where I could keep the worst minimums at 117fpsHz.
 
Yeah, but still - what's the alternative? 120 zone FALD with generally worse blacks than even the "raised blacks" on the CX, halos, no VRR and/or no 120Hz support at all?

Right now it's imho either:
A) turn off VRR
B) run at 60Hz VRR
C) live with raised blacks
D) get a 4k€ Alienware OLED that doesn't even have HDMI 2.1 and worse peak brightness (and probably the same raised black issues..?)
E) wait for LG C11 to maybe fix it
F) wait for microLED (miniLED isn't going to cut it against OLED)

Imho it's extreme nitpicking in 2020 for that price point as there is no alternative.

Not saying it is irrelevant, it IS an issue. But it shouldn't be a dealbreaker (vs other current gen TVs/monitors) as ALL current alternatives are worse in many other ways, especially blacks.

If you want 4k120 VRR and perfect black in 2020 there currently is no way around LG.
 
yes I'll deal with/live with raised blacks as long as they fix the stuttering.


I'd just like to know what the raised blacks look like (how grey does it become) at a 100fps or 110fps AVERAGE where the rate is still dropping way beneath that in the ~100fpsHz graph. (Also curious how raised they are at a 75fps average, for example.)

confirms that it is in practice negligible at 100-120fps, so it will be a near-perfect display once the stutter is fixed.

Saying it's not an issue AT 100 or 120fps (solid) wouldn't mean much since the frame rate variance is what VRR is for in the first place.
 
I am sure. HDMI VRR is a feature that was finalized around the same time as HDMI 2.1, but nothing about it requires HDMI 2.1 bandwidth, and a signal that fits in HDMI 2.0 bandwidth (like 4K 120Hz 4:2:0) can have VRR just fine on HDMI 2.0. How do you think Turing cards did VRR on this TV and got it labeled G-Sync Compatible? Lots of us ran that way for months.
That is what I thought. I had this all figured out back when Nvidia added VRR ability to the 2000 series but now so many posts in this thread saying they are waiting for their 3000 series card to try this out have me confused.

So all the posts from people saying we need 3000 series cards are only talking about 4k@120 4:4:4?

I am still using a Titan Xp until there's stock from either Nvidia or AMD.
 
I think nVidia only introduced VRR/G-Sync compatible over HDMI on the 2xxx series.

I know I got an overpriced 2070S as opposed to a 1080 Ti because of some sort of G-Sync compatibility concerns.
 
That is what I thought. I had this all figured out back when Nvidia added VRR ability to the 2000 series but now so many posts in this thread saying they are waiting for their 3000 series card to try this out have me confused.

So all the posts from people saying we need 3000 series cards are only talking about 4k@120 4:4:4?

I am still using a Titan Xp until there's stock from either Nvidia or AMD.
Yeah the core thing was being able to do 4k 120hz 4:4:4 HDR. Some people got the Club3d DP1.4->HDMI2.1 adapter that let them run 4k 120hz 4:4:4 HDR...but didn't support VRR. They are probably the ones you saw saying "waiting for 3000 series for VRR!!!"

And yeah it was ONLY on Turing cards.
 
Yeah, but still - what's the alternative? 120 zone FALD with generally worse blacks than even the "raised blacks" on the CX, halos, no VRR and/or no 120Hz support at all?

Right now it's imho either:
A) turn off VRR
B) run at 60Hz VRR
C) live with raised blacks
D) get a 4k€ Alienware OLED that doesn't even have HDMI 2.1 and worse peak brightness (and probably the same raised black issues..?)
E) wait for LG C11 to maybe fix it
F) wait for microLED (miniLED isn't going to cut it against OLED)

Imho it's extreme nitpicking in 2020 for that price point as there is no alternative.

Not saying it is irrelevant, it IS an issue. But it shouldn't be a dealbreaker (vs other current gen TVs/monitors) as ALL current alternatives are worse in many other ways, especially blacks.

If you want 4k120 VRR and perfect black in 2020 there currently is no way around LG.

G) Turn brightness down 2 clicks
H) Calibrate the TV following the German guide
I) Keep frame rates high enough that it's barely noticeable
 
Yeah the core thing was being able to do 4k 120hz 4:4:4 HDR. Some people got the Club3d DP1.4->HDMI2.1 adapter that let them run 4k 120hz 4:4:4 HDR...but didn't support VRR. They are probably the ones you saw saying "waiting for 3000 series for VRR!!!"

And yeah it was ONLY on Turing cards.
OK, glad I had this correct in the first place. The "waiting for 3000 series for VRR!!!" posts had me confused.
 
G) Turn brightness down 2 clicks
H) Calibrate the TV following the German guide
I) Keep frame rates high enough that it's barely noticeable
At what framerates have you found it becomes noticeable? The only time I've seen it happen is in the Dead by Daylight vid I posted earlier, and in the post-benchmark screen of Shadow of the Tomb Raider if you mouse over the graphs. Mousing over those graphs drops the game to 8 fps and is a great way to exaggeratedly show the problem. But I'm curious to know the real-world, in-game number of fps where we'll see the gamma shift more. Stay above 70? 90? 110?
 
At what framerates have you found it becomes noticeable? The only time I've seen it happen is in the Dead by Daylight vid I posted earlier, and in the post-benchmark screen of Shadow of the Tomb Raider if you mouse over the graphs. Mousing over those graphs drops the game to 8 fps and is a great way to exaggeratedly show the problem. But I'm curious to know the real-world, in-game number of fps where we'll see the gamma shift more. Stay above 70? 90? 110?

It's heavily going to depend on the game. I would imagine Control's color palette makes the issue super easy to spot, while in other games it will be more difficult to notice. I've been playing a bunch of older/easier-to-run titles on my CX with a 2080 Ti, so I've been averaging over 100fps in those games and that has been working well for me.
 
My frame rate is typically between 60 and 120 fps, and I've never really noticed the raised blacks. Maybe it's apparent at even lower frame rates, but at that point LFC should be multiplying the frame rate to above 60 fps anyway.
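For anyone curious how that works, here's a tiny sketch of the LFC idea. The 40-120 Hz range is just my assumption for illustration, and real drivers handle the multiplier switching more gracefully:

```python
# Tiny sketch of the Low Framerate Compensation (LFC) idea: when fps drops below
# the panel's VRR minimum, the driver shows each frame multiple times so the
# refresh rate it asks for lands back inside the VRR window.
# The 40-120 Hz range is an assumption for illustration, not a measured spec.

VRR_MIN_HZ, VRR_MAX_HZ = 40, 120

def lfc_refresh(fps: float) -> tuple[int, float]:
    """Return (frame repeat count, effective panel refresh rate) for a given fps."""
    repeats = 1
    while fps * repeats < VRR_MIN_HZ and fps * (repeats + 1) <= VRR_MAX_HZ:
        repeats += 1
    return repeats, fps * repeats

for fps in (25, 35, 45, 90):
    n, hz = lfc_refresh(fps)
    print(f"{fps} fps -> show each frame {n}x -> panel refreshes at {hz:.0f} Hz")
```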
 
You realize this is the CX forum right? You get all our hopes up until you realize it has nothing to do w/ the tv that's the focus of this thread - lol!
The C9 and CX are having some of the same issues. Firmwares are currently being updated.
 
Having serious doubts about this display now.

Going to wait for firmware updates now.

I want 4K 444 HDR + VRR
otherwise I will wait out another generation.

EDIT: Well I am an idiot, I just had to impulse buy it. Now it's on pre-order.

Hope I am not let down...

:)
 
Having serious doubts about this display now.

Going to wait for firmware updates now.

I want 4K 444 HDR + VRR
otherwise I will wait out another generation.

EDIT: Well I am an idiot, I just had to impulse buy it. Now it's on pre-order.

Hope I am not let down...

:)
Yeah, I know that feeling. "I'm gonna wait, I'm gonna wait.." Next thing I remember was the checkout dialog 😂

I don't regret my decision coming from a 43" 4K60 VA monitor. The LG CX is just fantastic and we're nitpicking at a very high level here.
 
I do not doubt there are issues, but I have not noticed any and really enjoy the display. I do however get an incompatible format message on the display occasionally and have to reboot it. Any thoughts on a fix for that?
 
The C9 and CX are having some of the same issues. Firmwares are currently being updated.
For what it's worth, that's the FW that supposedly fixes the G-Sync issues on the C9. And over at AVS there are a few people claiming the beta firmware now out for the CX fixes the problems, with an expected release around the third/fourth week of November.

Fingers crossed.
 
For what it's worth, that's the FW that supposedly fixes the G-Sync issues on the C9. And over at AVS there are a few people claiming the beta firmware now out for the CX fixes the problems, with an expected release around the third/fourth week of November.

Fingers crossed.
Is there any way to apply to become a part of the firmware beta?
 
Meanwhile in other news, I just mounted my LG CX 48 on this floor stand/trolley. I previously had the TV on a heavy duty monitor arm using a VESA 100x100 -> 300x200 adapter but that limited me to the depth of my desk. With the floor stand I can push the display back to about 100-120 cm which feels like a more optimal distance for this behemoth compared to my previous 85-90 cm. The stand seems good quality and was easy to assemble and mount the TV. The only problem is that when the TV is set at desk height, the pillar it is mounted on pokes up from behind the TV like a chimney. I might have to eventually saw it shorter to hide it.

I chose this stand because it can be put pretty close to a wall thanks to its base design and it had enough adjustment range to go low enough to sit level with my desk plus had tilt option to set it at the exact right angle. The included shelf was useful for hiding my USB switch and power strip as well as some other cables.
 
Those of you with 3000 series cards... was the only way to get 120Hz to select the "custom" PC resolution, and not under the Ultra HD listings? RGB and 10 bpc are the max via HDMI 2.1, correct? What are the ideal settings otherwise (both TV and Nvidia Control Panel)?
 
I previously had the TV on a heavy duty monitor arm using a VESA 100x100 -> 300x200 adapter but that limited me to the depth of my desk.
kasakka - can you let me know what desk mount you used previously? I have a deep, custom-made butcher block desk and the TV on stand is just right for distance, but I'd like to get the TV off the desk (for potentially putting a sound bar beneath it at some point).
 
Meanwhile in other news, I just mounted my LG CX 48 on this floor stand/trolley. I previously had the TV on a heavy duty monitor arm using a VESA 100x100 -> 300x200 adapter but that limited me to the depth of my desk. With the floor stand I can push the display back to about 100-120 cm which feels like a more optimal distance for this behemoth compared to my previous 85-90 cm. The stand seems good quality and was easy to assemble and mount the TV. The only problem is that when the TV is set at desk height, the pillar it is mounted on pokes up from behind the TV like a chimney. I might have to eventually saw it shorter to hide it.

I chose this stand because it can be put pretty close to a wall thanks to its base design and it had enough adjustment range to go low enough to sit level with my desk plus had tilt option to set it at the exact right angle. The included shelf was useful for hiding my USB switch and power strip as well as some other cables.

The cheapest option is to just put another table behind your table and put the LG on that; you can use whatever piece of furniture has a good height. If needed, just make a bigger gap between the two tables. I am also now at about ±110cm, which is quite okay.
 
kasakka - can you let me know what desk mount you used previously? I have a deep, custom-made butcher block desk and the TV on stand is just right for distance, but I'd like to get the TV off the desk (for potentially putting a sound bar beneath it at some point).
I used a Multibrackets VESA HD gas lift arm paired with a VESA 100x100 to 300x200 adapter off Amazon (sold under multiple brands). I don't recommend it: its tilt can't handle heavy/big displays, and support from the company never bothered to answer my emails. The actual product is otherwise nice and easy to assemble, the tilt is just no good.

I would go wall mount if you can. With the position mine is in that was not an option so the floorstand was the next best thing.
 
I'm getting pretty frequent black screen blanking after updating to the latest Nvidia driver with a 3080 but it could be my cable. Any suggestions for a cable?
 
Hey all, I'm trying to play videos that are on my PC on my LG TV. I'm getting "no photo video files exsist in storage device"

What gives? The same videos show up fine on a USB stick when attached to the TV, so it's not that. The TV is hooked in through ethernet, though trying on WiFi didn't change anything. The PC is connected over WiFi.

The files themselves are on a separate drive, but it can see subfolders on that drive, it just doesn't see the .mkv or mpg files inside those folders.


Love the TV btw!
 
I'm getting pretty frequent black screen blanking after updating to the latest Nvidia driver with a 3080 but it could be my cable. Any suggestions for a cable?

mirkendargen confirmed on the previous page that the Zeskit 6’ from Amazon works...there are probably others, though. Wish we had a confirmed list of brands and lengths but my plan was to try the Zeskit if my current high speed cable doesn’t work. I think it’s a Mediabridge and it solved the sparkling issues that I was having at 4K/60Hz. Worth a shot I suppose.
 
I get flickering like crazy since the Destiny 2 Beyond Light update. Bummer. I don't see frame dips happening, but it's flickering like there are a bunch of them.
 