LG 48CX

So I don't have any extreme viewing angles with them. I usually play games in windowed mode on a single screen at a time with a YouTube video playing as well, and I dual boot one of my desktops to do work across all three displays. If I play fullscreen, it is on only one monitor at a time.

The way I have them positioned, each one isn't blocking the others. It honestly works out really well. I use multiple cameras for OBS, and for multiple angles on work calls if I need to switch to a different laptop/desktop.

I really have quite a lot going on with my setup, which I'll link a picture of below.

View attachment 348866


Those riser desks are amazing... at least for me anyway, as I'm a 4lb Chihuahua (haha) and most furniture is made for bigger dogs!
 
Just saw this in the news. Finally a fix for this annoying CX "problem" when using it with a multi-monitor setup!

https://www.theverge.com/2021/4/28/...-apps-rearranging-sleep-resume-fix-directx-12

Hopefully this actually fixes that issue without causing other issues. It's annoying, but it's easy to deal with.
In the past I've experienced weird issues where I come back and a window doesn't show up on any monitor anymore, which is a real pain to deal with. Sometimes it can't be fixed by the Windows key + arrow trick. Even worse, I've had programs that somehow saved a starting window location that isn't on any screen, and they couldn't be fixed by restarting them or even reinstalling... I had to go delete registry entries or something.
 
Got my 48" CX today, and I'm very happy with the clarity and how much nicer games look as well. Coming from a 34" Ultrawide Alienware previously.
 

Attachments

  • 4-30.jpg (551.8 KB)
I too will be coming from a 34" ultrawide when I get my 48CX. What's it like to see the sky and ground again when gaming? God I miss that 16:9 view.
The Alienware was my first ultrawide. I had a 32" MSI display previously and I was very happy with the Alienware for its clarity, but I did miss the 16:9.

Welcome to the club! Such a beautiful screen!
Thanks! It is very beautiful, and after using it I don't see how I could ever go back to a traditional "monitor".
 
Just saw this in the news. Finally a fix for this annoying CX "problem" when using it with a multi-monitor setup!

https://www.theverge.com/2021/4/28/...-apps-rearranging-sleep-resume-fix-directx-12
Hopefully this actually fixes that issue without causing other issues. It's annoying, but it's easy to deal with.
In the past I've experienced weird issues where I come back and a window doesn't show up on any monitor anymore, which is a real pain to deal with. Sometimes it can't be fixed by the Windows key + arrow trick. Even worse, I've had programs that somehow saved a starting window location that isn't on any screen, and they couldn't be fixed by restarting them or even reinstalling... I had to go delete registry entries or something.

Sounds promising, thanks for the news link. For window placement/management I use the DisplayFusion Pro app in a few ways. The easy way: after opening all of my most often used apps and placing them on my monitors, I have DisplayFusion save a window placement profile (that I name) using the right-click menu. This can be hotkeyed, so with one quick hotkey all of my windows get shuffled back to their tiled positions like a dealer dealing them out. With my setup I went a step further and bound that hotkey to a DisplayFusion button, so I just have to hit one button and my windows all go back to their "prime" locations. That works even if only one has been moved.

I go deeper than that, assigning different DisplayFusion functions to a bunch of apps, each with its own button (and icon) on my Stream Deck. That way I can press a button several times for one app: it checks to see if the app is open. If not open, it opens it and places it in what I chose as its home location. If it's open and minimized, it restores it to that same location. If it's open and not minimized, it minimizes it. So I can quickly cycle through those until the window is open and where it should be, or to minimize/restore it.

I put a bunch of other misc apps on there in subsets of buttons/hotkeys: some commonly used Windows functions/menus, music apps and controls, launch Steam or close and re-open Steam, toggle Steam in and out of Big Picture mode, and a bunch of other things like toggling between audio outputs (headphones on a USB DAC or the LG CX audio), which is very handy, mute/unmute mic, etc.

I have some DisplayFusion hotkeys linked to Stream Deck buttons for generic window placement too, so whatever window is active gets moved to wherever that button/hotkey is assigned. I made some simple icons for the Stream Deck buttons so it's obvious where the active window will be placed. This comes in handy for popping a window between monitors, for setting windows to exact placements stacked on top of each other, or for placing a window on top of an area already assigned for something else so it won't overlap other windows on the monitor, etc. For example, I usually keep three equally sized windows stacked on my left portrait-mode monitor, or I set it to one window on the bottom 2/3 and leave the top 1/3 for something else.

The default saved window position profile feature is easy and handy enough by itself though. I just like the extra control.
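The saved-profile idea boils down to a mapping from app to rectangle. A minimal Python sketch of the concept (the class and all names are mine, not DisplayFusion's actual API):

```python
# Toy sketch of a "saved window position profile": save each app's
# rectangle once, then restore them all with a single call.

class WindowProfile:
    """Maps an app name to its saved (x, y, width, height) rectangle."""

    def __init__(self):
        self.saved = {}

    def save(self, windows):
        # windows: dict of app name -> current (x, y, w, h)
        self.saved = dict(windows)

    def restore(self, windows):
        # Return each known window to its saved spot; apps not in the
        # profile keep their current position.
        return {app: self.saved.get(app, rect) for app, rect in windows.items()}

profile = WindowProfile()
profile.save({"browser": (0, 0, 1920, 1080), "chat": (1920, 0, 800, 1080)})

# "chat" got dragged somewhere else; one restore call puts it back.
moved = {"browser": (0, 0, 1920, 1080), "chat": (500, 500, 800, 1080)}
print(profile.restore(moved))
```

The one-button hotkey is essentially this restore call plus the OS-level window moves.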

What I did notice, and what I suspect that Microsoft update won't fix, is that when I turn my monitors back on the taskbar is always back on the LG CX as primary. I like to drag it to the top of the right side monitor so that there is no taskbar on my primary (OLED) monitor at all (even though I use translucent taskbar and taskbarhider). Every time I turn all of my monitors back on when I get home or wake up, I have to drag that bar over again. A minor inconvenience, but still mildly annoying.
 
Last edited:
Hopefully this actually fixes that issue without causing other issues. It's annoying, but it's easy to deal with.

I do hope the new Microsoft fix doesn't break anything else like you said, or force my windows around contrary to what I do with them.

In the past I've experienced weird issues where I come back and a window doesn't show up on any monitor anymore, which is a real pain to deal with. Sometimes it can't be fixed by the Windows key + arrow trick. Even worse, I've had programs that somehow saved a starting window location that isn't on any screen, and they couldn't be fixed by restarting them or even reinstalling... I had to go delete registry entries or something.

The generic "Saved Window Position Profile" in DisplayFusion would probably bring that window back to where you saved it in the profile initially.

Having specific app locations set up as functions in DisplayFusion will pop them back to where they were set, too.

Otherwise, if you can find the taskbar icon for the app while the app itself is "off screen" somewhere, you can try hovering your mouse over that icon until the running thumbnail preview appears above it. Then carefully move your mouse onto that thumbnail and right-click it to pop up a menu. Choose "Maximize" and the window should fill one of your screens. From there you can drag it out of the maximized state by its title bar and resize it afterward. It's best to close and reopen it at some point after that so it forgets the last position, but at least you can use the app for the rest of the session and save what you are doing, etc.

DisplayFusion has function templates for setting the window position of a specific app, so that is the best way for me to take control of window positions, but the right-click trick works. I don't have DisplayFusion on every PC I use (nor set up for random misc app windows I don't normally use), so the trick can still come in handy even if it's a very rare occurrence.
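Under the hood, rescuing a lost window is just clamping its rectangle back inside the visible desktop. A rough Python sketch of that math (pure arithmetic, no Windows API calls; the function name is mine):

```python
# Hedged sketch: nudge an off-screen window rectangle back into view.
# Rects are (left, top, width, height); desktop is the virtual-desktop
# bounding box in the same format.

def bring_on_screen(rect, desktop):
    x, y, w, h = rect
    dx, dy, dw, dh = desktop
    # Clamp the origin so the window sits inside the desktop bounds.
    x = max(dx, min(x, dx + dw - w))
    y = max(dy, min(y, dy + dh - h))
    return (x, y, w, h)

# A window that saved its position at x = -5000 (off every monitor):
print(bring_on_screen((-5000, 300, 1280, 720), (0, 0, 3840, 2160)))
# -> (0, 300, 1280, 720)
```

Window managers and tools like DisplayFusion do essentially this clamp whenever they "restore" a window whose saved position no longer exists.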
 
Last edited:
I have a question about the gamma changing with hz on cx/c1.

When I play a 60fps-locked game on PC, is it better to run it at 120 Hz and let FreeSync/G-SYNC/LFC drop the refresh to 60, or to set 60 Hz specifically?
Or run 120 Hz, turn off FreeSync, and enable half-rate vsync?
And what about the input lag booster that says it doubles frames to 120 Hz? Would it be useful on console?

Same question for PS5 btw, just without VRR since it does not support it... but I can enable the "prevent input delay: boost" option, whatever that really does.
 
Last edited:
That is correct. When games use virtual cameras (HOR+), like most CGI authoring suites' virtual cameras do by default, a wider aspect ratio will always show more of the scene, just like a real lens.

You can of course run an even larger physical-size ultrawide resolution (depending on the game or workarounds) with completely-off-emitter black bars on an OLED TV too.

Running 3840x1600 at 1:1 pixels edge to edge across the ~41.8 inch wide screen, the 48 CX in ultrawide is equivalent to around a 45.3 inch diagonal 24:10 ultrawide screen.
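The math behind that equivalence is simple pixel-pitch geometry. A quick Python check (assuming square pixels and the ~41.8 inch panel width mentioned above):

```python
import math

# Equivalent diagonal of an ultrawide region run 1:1 on a larger panel:
# convert pixels to inches via the panel's pixel density, then take the
# diagonal of the active area.

def diagonal_inches(width_px, height_px, panel_width_in, panel_width_px=3840):
    ppi = panel_width_px / panel_width_in   # pixels per inch (square pixels)
    w_in = width_px / ppi
    h_in = height_px / ppi
    return math.hypot(w_in, h_in)

# 3840x1600 on the 48CX's ~41.8" wide panel:
print(round(diagonal_inches(3840, 1600, 41.8), 1))  # -> 45.3
```

The same function with the full 3840x2160 gives back roughly the panel's native 48" diagonal, which is a decent sanity check.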
 
I might buy a shield just so I can get "Watch together" in Plex. It's so annoying the webos Plex doesn't have that feature. How is the shield experience?
 
The new Resident Evil 8 demo came out and seems a tiny bit washed out. I manage 70-120 fps at 4K with HDR, so probably the VRR gamma is at fault.
I am still thinking about the best way to play 60fps-locked games on PC and PS5 with an LG OLED. Would it look better if I disabled VRR and left the screen at 120 Hz? Or switched to 60 Hz without VRR? What about the input lag booster, which maybe forces it to run at 120 Hz?
Sorry, it's a repeat from my previous post :p

As for the game being a bit washed out, maybe it's sRGB vs Rec. 709?
 
I might buy a shield just so I can get "Watch together" in Plex. It's so annoying the webos Plex doesn't have that feature. How is the shield experience?
I got a Shield for a better Plex experience but have been a bit underwhelmed. It's great in Plex...but Netflix has a TON of stutter and looks somehow lower res than the builtin app. I'm not sure it's worth it for only Plex.
 
I am still thinking about the best way to play 60fps-locked games on PC and PS5 with an LG OLED. Would it look better if I disabled VRR and left the screen at 120 Hz? Or switched to 60 Hz without VRR? What about the input lag booster, which maybe forces it to run at 120 Hz?

I would think that all you'd have to do is disable VRR while keeping 120 Hz to avoid the gamma issues and call it a day. That's also probably the most convenient method, since you can simply set a game profile in your GPU drivers to have VRR disabled for that specific game.

According to RTINGS, the input lag for 60 Hz with boost is pretty much exactly double that of a native 120 Hz input, so there shouldn't be any benefit to changing to 60 Hz unless you want to disable boost in order to use black frame insertion (which also wouldn't do much good at 120 Hz when the game itself is running at 60fps - it'll result in a sort of double-image effect). Not to mention black frame insertion increases the input lag by around a frame or two on the CX (no idea about the C1).
 
I got a Shield for a better Plex experience but have been a bit underwhelmed. It's great in Plex...but Netflix has a TON of stutter and looks somehow lower res than the builtin app. I'm not sure it's worth it for only Plex.

Yeah, I think I'll just use my browser for Watch Together, no big deal. Not worth $200 just for Plex, I think. Maybe if I can get a good deal on one. I have a PS5 too, but its Plex app is spectacularly bad.
 
This looks like it's beside the kitchen, which suggests this is the dining area. That's serious.

What material is the desktop made out of on the standing desk? I plan to get a standing desk for a 48" LG OLED C1 TV. So far I know I need something wide enough to hold a TV and a second monitor (at an angle), and strong enough as well. A mounting arm is likely.
So that kitchen area is a secondary one in my basement. The primary one is upstairs.

The standing desk is from Vari, and I got the black one. Not sure what the material is.
 
On the TV side, without a Shield, you can disable DTS in the Plex webOS app so it will play AAC sound on/from the TV, but I found the TV's playback buggy on some titles: it would drop to a black screen partway through, or show a shortened timeline as if the video was truncated - which, for all practical purposes, it is in this scenario; it just cuts off at that point.

Transcoding, which is what happens when you disable DTS in the Plex webOS player, always means some minor fidelity loss. It's not the same as remuxing a title beforehand 1:1. I'm also not certain that it isn't transcoding the video as well when you disable DTS sound that way.

The Shield is good if you have a bunch of DTS-sound titles and a receiver, because you can route the Shield through the sound system and then to the TV "the old fashioned way" instead of using the eARC out from the TV, at least when you are using the Shield to watch a video. Besides seamless 1:1 (non-transcoded) DTS audio playback, the Shield also has a full gigabit connection, full USB 3.0 ports, 5 GHz WiFi, very fast operation, and a lot of other features like high-quality AI upscaling, a more robust Google Play store full of apps compared to the app store on most TVs (and a bunch of other capabilities with gaming/emulation/streaming, Android side-loaded apps, etc. if you are into that). But yes, you are paying a decent amount for all of those capabilities if you go that route, ~$200 USD.

From TechRadar:

Who’s it for?

Premium Cord-Cutters and Streamers: The new Nvidia Shield is a premium 4K HDR/Dolby Vision streaming player. It’s not the cheapest gateway into 4K streaming, but it is one of the most powerful. If you’re looking to dabble in high-end formats, this is the player to get.

Discerning Cinephiles: Because AI upscaling really focuses on the small details in a scene (the lettering on a bag of chips, the wrinkles in someone’s face, etc…) relatively observant cinephiles will get the maximum enjoyment from the upscaling features of the new Shield.



https://www.legitreviews.com/nvidia-shield-tv-2019-review-better-media-streaming_216195

NVIDIA SHIELD Key Specifications:

Processor: NVIDIA Tegra X1+ 64-bit Mobile Processor w/256-core NVIDIA Maxwell GPU
Memory: 2GB of RAM
Storage: 8GB internal (portion used for OS and system software) w/ microSD Expansion
Wireless: 802.11ac 2×2 MIMO 2.4GHz and 5GHz Wi-Fi / Bluetooth 5.0 + LE
Wired: 1x Gigabit Ethernet
OS: Android 9.0 powered by Android TV w/ Google Chromecast 4K built-in
Power: 40W power Adapter (5-10W typical power consumption)

NVIDIA SHIELD Video Support:

AI-enhanced upscaling for 720p/1080p to 4K up to 30 FPS
Up to 4K HDR playback at 60 FPS (H.265/HEVC)
Up to 4K playback at 60 FPS (VP8, VP9, H.264, MPEG1/2)
Up to 1080p playback at 60 FPS (H.263, MJPEG, MPEG4, WMV9/VC1)
Format/Container support: Xvid/ DivX/ASF/AVI/MKV/MOV/M2TS/MPEG-TS/MP4/WEB-M

NVIDIA SHIELD Audio Support:

Dolby Audio (Dolby Digital, Dolby Digital Plus, Dolby Atmos)
DTS-X surround sound (pass-through) over HDMI
High-resolution audio playback up to 24-bit/192 kHz over HDMI and USB
High-resolution audio up-sample to 24-bit/192 kHz over USB
Audio support: AAC, AAC+, eAAC+, MP3, WAVE, AMR, OGG Vorbis, FLAC, PCM, WMA, WMA-Pro, WMA-Lossless, Dolby Digital Plus, Dolby Atmos, Dolby TrueHD (pass-through), DTS-X (pass-through), and DTS-HD (pass-through)
 
Last edited:
I got a Shield for a better Plex experience but have been a bit underwhelmed. It's great in Plex...but Netflix has a TON of stutter and looks somehow lower res than the builtin app. I'm not sure it's worth it for only Plex.

Are you using it wired or wireless? Netflix buffering is what I would suspect at first, but idk. I don't watch Netflix on the Shield that's on my LG CX, but I've watched it plenty on my ~2016 4K Vizio FALD VA (no HDR) in my living room over a gigabit LAN connection with zero issues in the past, for things like Stranger Things, Umbrella Academy, etc., and I do pay for the 4K Netflix tier so it's full bandwidth.

I'm very particular about stuff like that and would notice and find it extremely aggravating, so it sounds like something particular to your setup.

----------------------------------------------------------------------------

YouTube HDR is not working on the Shields though, but that is a separate issue because they lack AV1 decoding hardware. :/

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1588740730

Netflix began streaming in AV1 in February 2020, and now YouTube has flipped the switch for compatible Android TV devices through the latest software update. YouTube has been experimenting with AV1 in PC browsers for some time.

At this time, compatible Android TV devices include those running on Broadcom's BCM72190/72180 and Realtek's RTD1311/RTD1319 chipsets, according to XDA Developers, who first spotted the change.

We were not able to pull down AV1 video streams on our 2019 Nvidia Shield, which is hardly surprising as it does not have hardware support for AV1. At this time you are unlikely to see any major changes.

However, in the long term AV1 support could bring 4K and HDR streaming to many more devices and streaming services, following the slow roll-out of the royalty-based HEVC format as well as VP9, the two codecs currently used for 4K HDR streaming. The royalty-free AV1 format is backed by Apple, Facebook, Netflix, LG, Microsoft, Samsung and many others. LG and Samsung are also introducing the first TVs with AV1 support this year.
--------------------------------------------------

I can always just switch between things with the CX's remote holding the mic button:
..."youtube" opens the webOS YouTube app directly for full YouTube HDR. A problem with a lot of YouTube HDR channels is that they have a persistent static logo advertising their channel/brand though.
..."switch to PC" (or "hdmi 3") to get back to PC
....and "shield" (which is what I named its input) to get back to the Shield.

I do have to swap HDMI sources using my receiver's remote when going to/from the Shield though, since I'm not porting the Shield's audio through the TV, in order to get the audio formats the Shield supports (especially DTS formats). A minor inconvenience. In the old days I had to swap between sources manually all the time on the receiver (PlayStation, Blu-ray player, set-top box, PC), so only having to do it to/from the Shield when I decide to use it isn't really a problem for me. I'd only be doing it when playing a whole video, so it's not like I'd be swapping back and forth many times.
 
Last edited:
I have a question about the gamma changing with hz on cx/c1.

When I play a 60fps-locked game on PC, is it better to run it at 120 Hz and let FreeSync/G-SYNC/LFC drop the refresh to 60, or to set 60 Hz specifically?
Or run 120 Hz, turn off FreeSync, and enable half-rate vsync?
And what about the input lag booster that says it doubles frames to 120 Hz? Would it be useful on console?

Same question for PS5 btw, just without VRR since it does not support it... but I can enable the "prevent input delay: boost" option, whatever that really does.
On PC you would set the refresh rate to 120 Hz and let VRR or G-SYNC do its thing. On PS5 you would set it to 120 Hz and let the console do its thing.
The new Resident Evil demo 8 came out and seems a tiny bit washed out. I manage 70-120 fps 4k with HDR so probably vrr gamma is at fault.
I am still thinking about the best way to play 60fps-locked games on PC and PS5 with an LG OLED. Would it look better if I disabled VRR and left the screen at 120 Hz? Or switched to 60 Hz without VRR? What about the input lag booster, which maybe forces it to run at 120 Hz?
Sorry, it's a repeat from my previous post :p

As for the game being a bit washed out, maybe it's sRGB vs Rec. 709?
If RE8 or any game looks washed out, then you need to change the HDR calibration in the game. For RE8, follow the instructions to make the inner box disappear and make the red and blue marks equal length. If you have HDR enabled, it shouldn't be possible to change the color space in the game's options.

On PC, make sure you enabled HDR in the Windows display settings and that you have the color settings in the NVIDIA Control Panel set to "Use default color settings."

On PS5, make sure to run the console's built-in HDR calibration tool and follow the instructions, except on the last step (3/3), where you want to set it as dark as it will go no matter what is shown on screen.

On the TV make sure you have Deep Color enabled for all the HDMI inputs being used.
 
Who has seen the C1 rtings review?

Is it worth upgrading my CX to a C1, mainly because I use it for gaming and the C1 has 1ms less input delay than the CX? Also, why does the CX have 13ms input delay @ 4K VRR but the C1 only 5.8ms, according to RTINGS?
 
Who has seen the C1 rtings review?

Is it worth upgrading my CX to a C1, mainly because I use it for gaming and the C1 has 1ms less input delay than the CX? Also, why does the CX have 13ms input delay @ 4K VRR but the C1 only 5.8ms, according to RTINGS?
Since I have not seen any discounts on the CX, you may as well get a C1 if it's available.

On input lag, it is due to improved signal processing and the new "Prevent input delay" feature.
 
Are you using it wired or wireless? Netflix buffering is what I would suspect at first, but idk. I don't watch Netflix on the Shield that's on my LG CX, but I've watched it plenty on my ~2016 4K Vizio FALD VA (no HDR) in my living room over a gigabit LAN connection with zero issues in the past, for things like Stranger Things, Umbrella Academy, etc., and I do pay for the 4K Netflix tier so it's full bandwidth.

I'm very particular about stuff like that and would notice and find it extremely aggravating, so it sounds like something particular to your setup.

----------------------------------------------------------------------------

YouTube HDR is not working on the Shields though, but that is a separate issue because they lack AV1 decoding hardware. :/

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1588740730




--------------------------------------------------

I can always just switch between things with the CX's remote holding the mic button:
..."youtube" opens the webOS YouTube app directly for full YouTube HDR. A problem with a lot of YouTube HDR channels is that they have a persistent static logo advertising their channel/brand though.
..."switch to PC" (or "hdmi 3") to get back to PC
....and "shield" (which is what I named its input) to get back to the Shield.

I do have to swap HDMI sources using my receiver's remote when going to/from the Shield though, since I'm not porting the Shield's audio through the TV, in order to get the audio formats the Shield supports (especially DTS formats). A minor inconvenience. In the old days I had to swap between sources manually all the time on the receiver (PlayStation, Blu-ray player, set-top box, PC), so only having to do it to/from the Shield when I decide to use it isn't really a problem for me. I'd only be doing it when playing a whole video, so it's not like I'd be swapping back and forth many times.
Wired. It's not a bandwidth problem; there's no buffering. It's just a lot of judder/stutter/frame-pacing problems. It's probably a combination of the Android TV Netflix app and Android TV itself, nothing Shield-specific. If you Google it, you see tons of people reporting the same problem with no fixes.
 
To start with, unless you are running 120fps minimum (8.3ms per frame), that input lag is probably not going to matter. 100fps solid is ~10ms per frame, which is still close. Not many people are getting that at solid rates in gorgeous, high-detail modern games at very-high-to-ultra settings anyway, even with DLSS 2.0-capable games and GPUs. That's why they are using VRR: for the fluctuations, and the ability to ride a roller coaster of frame rates in order to get better eye candy without judder - because otherwise just capping the frame rate on the high end would be enough.


You generally can't act on what you can't see - on world states the game's frames haven't delivered yet. Then take a dive into how tick rates and netcode work in online games, and you are at best around a theoretical 15ms per frame of new world-state data delivered in the chain. In reality it's way muddier than that, cobbled together between all players online: varying ping times, selective/arbitrary netcode decisions on events by the devs, poor netcode in many cases, most games having much worse tick rates to begin with, and in some games different (worse) tick rates for abilities/spells compared to the tick rate for "mechanical" actions.

So you are getting 15ms or so between frames on the highest-tick-rate servers - like 65 Hz interpolated - but due to netcode probably worse than that online. That's not a single-player game locally, that's not other games' tick rates, which are way worse, and it's not taking your actual local frame delivery over the VRR roller coaster into account in most graphics-showcase games, especially at 4K.

Then there is reaction time. Most people tested locally on gaming PCs get about 180-200ms reaction time (or worse). Some people can get close to the touch-sensitive reaction time of 150ms at times, or under 160ms regularly, when fed a visual reaction test. So that's 0.150 to 0.240 seconds of reaction time against, generously, ~65 game states per second delivered online (worse in reality, much worse on poorer-tick servers) ~ 0.015 second ticks at the theoretical interpolated optimum, with your screen lag at ~0.017 seconds or less.

0.180 seconds of reaction time equates to about 21 frames locally at 117fps solid (8.5ms per frame), and ~18 local frames at a 150ms reaction time. So 18-21 frames of reaction time for the fastest people in the 117fps-solid best-case scenario. 100fps -> 18 frames, 75fps -> 14 frames.

At the ~65 Hz (15ms) theoretical interpolated optimum on the best current tick-rate games (ignoring netcode muddying), 0.180 seconds equals 12 world-state deliveries/frames worth of reaction time; 0.150 seconds is ~10 interpolated ticks.

When you take all of these factors into account (especially netcode on top of tick-rate systems), 0.001 seconds of difference between the two TVs is practically nothing. Even a 4ms gaming panel (0.004s) vs a 0.017s OLED panel is small compared to the 0.015s theoretical best-case online server delivery chain (after it subtracts/compares your ping times and goes through netcode muddying) - where it is almost always more than that, and on lower-tick servers a ton more - while you are getting, say, 75-117fps locally (0.013s to ~0.009s) and a whopping 0.150 to 0.240 second reaction time every time you are shown a series of new game-world states.

Input lag, 4ms gaming panel: ............................... 0.004 second
Input lag, 16-17ms TV: ..................................... 0.017 second
Highest-tick server theoretical (interpolated) max, ~65 Hz:  0.015 second per tick/delivered frame after netcode (higher in reality, magnitudes higher on other game servers)
Local frame rate using VRR, e.g. 75-117fps: ................ 0.013 to 0.009 second roller coaster
Fast-to-fastest gamer reaction times: ...................... 0.180 to 0.150 second
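The comparison above can be sanity-checked with a one-liner: how many delivered frames or ticks fit inside a given reaction time. A small Python sketch:

```python
# How many new frames / world states are shown during a reaction window
# at a given delivery rate (local fps or server tick rate).

def events_within(reaction_s, rate_hz):
    return round(reaction_s * rate_hz)

print(events_within(0.180, 117))        # local frames at 117fps solid -> 21
print(events_within(0.180, 100))        # -> 18
print(events_within(0.150, 117))        # fastest reactions -> 18
print(events_within(0.180, 1 / 0.015))  # ~65 Hz interpolated tick -> 12
```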


128tick at interp_2 = .015 seconds
64 tick at interp_2 = .031 seconds
22 tick at interp_2 = .090 seconds
12 tick at interp_2 = .166 seconds

If you set interp_1, the interpolation time would be halved (saving 1 frame, 2 frames, 5 frames, or 10 frames respectively), but any lost packet at all would hit you with a 250ms delay; at 8.5ms per frame that's 29 frames.
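Those figures follow from view-interpolation delay being roughly interp ratio / tickrate (the Source-engine cl_interp_ratio idea). A quick Python check:

```python
# Client view-interpolation delay ~= interp_ratio / tickrate.

def interp_delay(tickrate, interp_ratio=2):
    return interp_ratio / tickrate

for tick in (128, 64, 22, 12):
    print(f"{tick:>3} tick, interp_2: {interp_delay(tick) * 1000:.1f} ms")
# -> 15.6, 31.2, 90.9, 166.7 ms
```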

Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)
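That formula can be written out directly (variable names are mine; the example values are hypothetical):

```python
# Command Execution Time = Current Server Time - (Packet Latency + Client View Interpolation)

def command_execution_time(server_time, packet_latency, view_interp):
    return server_time - (packet_latency + view_interp)

# Server at t = 100.000 s, 30 ms packet latency, interp_2 on a 64 tick server:
print(command_execution_time(100.000, 0.030, 2 / 64))  # roughly 99.939 s
```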


For reference, tick rates of some common online games:

Valorant: 128tick
Specific paid matchmaking services like ESEA: 128tick
CSGO ("normal" servers): 64tick
Overwatch: 63
Fortnite: 60 (I think, used to be 20 or 30)
COD Modern Warfare mp lobbies: 22tick
COD Modern Warfare custom lobbies: 12tick


Some have guessed that League of Legends tick rate is around 30.
ESO PvE / PvP: ??
WoW PvE / PvP: ?? ... World of Warcraft processes spells at a lower "tick rate", so it's a bit more complicated, but overall the tick rate probably isn't that great.
https://us.forums.blizzard.com/en/wow/t/is-classic-getting-dedicated-physical-servers/167546/81


https://happygamer.com/modern-warfa...or-a-game-that-wants-to-be-competitive-50270/


---------------------------------------------

Keep in mind that a higher tickrate server will not change how lag compensation behaves, so you will still experience times where you ran around the corner and died.
-----------------------
An example of lag compensation in action:
  • Player A sees player B approaching a corner.
  • Player A fires a shot, the client sends the action to the server.
  • Server receives the action X ms later, where X is half of Player A's latency.
  • The server then looks into the past (into a memory buffer), of where player B was at the time player A took the shot. In a basic example, the server would go back (Xms+Player A's interpolation delay) to match what Player A was seeing at the time, but other values are possible depending on how the programmer wants the lag compensation to behave.
  • The server decides whether the shot was a hit. For a shot to be considered a hit, it must align with a hitbox on the player model. In this example, the server considers it a hit. Even though on Player B's screen it might look like he's already behind the wall, the time difference between what Player B sees and the time at which the server considers the shot to have taken place is equal to: (1/2 Player A latency + 1/2 Player B latency + time since last tick)
  • In the next "Tick" the server updates both clients as to the outcome. Player A sees the hit indicator (X) on their crosshair, Player B sees their life decrease, or they die.
Note: In an example where two players shoot each other and both shots are hits, the game may behave differently. In some games, e.g. CSGO, if the first shot arriving at the server kills the target, any subsequent shots by that player that arrive at the server later will be ignored. In this case, there cannot be any "mutual kills", where both players shoot within 1 tick and both die. In Overwatch, mutual kills are possible. There is a tradeoff here.
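The rewind step above can be sketched as a lookup into a position-history buffer. A toy Python example (function names and all numbers are hypothetical):

```python
# Toy sketch of lag compensation: the server looks back
# (half the shooter's round-trip latency + their interpolation delay)
# into a position-history buffer to judge the shot.

def rewind_time(shooter_rtt, interp_delay):
    return shooter_rtt / 2 + interp_delay

# Position history: server time (s) -> where player B was.
history = {0.000: (10, 0), 0.015: (11, 0), 0.030: (12, 0), 0.045: (13, 0)}

def judge_shot(server_time, shooter_rtt, interp_delay):
    t = server_time - rewind_time(shooter_rtt, interp_delay)
    # Use the newest snapshot at or before the rewound time.
    past = max(ts for ts in history if ts <= t)
    return history[past]

# With 60 ms RTT and ~31 ms interp, a shot judged at server time 0.090 s
# is checked against where player B stood ~61 ms earlier.
print(judge_shot(0.090, 0.060, 0.031))  # -> (11, 0)
```

This is why "I was already behind the wall" deaths happen: the server deliberately judges the shot against the past position the shooter was seeing.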
----------------------------------------
  • If you use the CSGO model, people with better latency have a significant advantage, and it may seem like "Oh I shot that guy before I died, but he didn't die!" in some cases. You may even hear your gun go "bang" before you die, and still not do any damage.
  • If you use the current Overwatch model, tiny differences in reaction time matter less. I.e. if the server tick rate is 64 for example, if Player A shoots 15ms faster than player B, but they both do so within the same 15.6ms tick, they will both die.
  • If lag compensation is overtuned, it will result in "I shot behind the target and still hit him"
  • If it is undertuned, it results in "I need to lead the target to hit them".
 
Last edited:
Greentoe is probably selling 48CX's below that. I just got a 77CX for $2800 including tax/shipping for example.
 
On the TV side without a shield - you can disable dts in the plex webOS app so it will play AAC sound on/from the tv

...

Transcoding, which is what happens when you disable dts in the plex webOS player, is always some minor fidelity loss.

I've never used plex, but is there any reason multi-channel PCM can't be used instead of AAC so that it's actually lossless? Or something like FLAC?

At least, with a USB drive plugged directly into an E9 OLED, the built-in WebOS player is able to decode multi-channel PCM and multi-channel FLAC without issue.


I found the TV's playback buggy on some titles: it would drop to a black screen partway through, or show a shortened timeline as if the video had been truncated. For all practical purposes it is truncated in this scenario, since playback just cuts off at that point.

What if you use one of those USB ethernet adapters that are known to work with the TV to get a 300mbps wired connection?
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
I've never used plex, but is there any reason multi-channel PCM can't be used instead of AAC so that it's actually lossless? Or something like FLAC?

At least, with a USB drive plugged directly into an E9 OLED, the built-in WebOS player is able to decode multi-channel PCM and multi-channel FLAC without issue.




What if you use one of those USB ethernet adapters that are known to work with the TV to get a 300mbps wired connection?
The built in pictures and video app refused to decode DTS for me to anything (it refused to do Atmos also strangely). Plex on Shield does decode DTS-MA into multichannel PCM over eARC though. I connect my Shield this way because my receiver doesn't support Dolby Vision.
 
The built in pictures and video app refused to decode DTS for me to anything (it refused to do Atmos also strangely). Plex on Shield does decode DTS-MA into multichannel PCM over eARC though. I connect my Shield this way because my receiver doesn't support Dolby Vision.
Maybe it wasn't entirely clear, but yeah, I know the CX can't decode DTS unlike the E9 - that's why I'm asking about transcoding to a supported lossless format like PCM or FLAC: contrary to what elvn says, transcoding does not inherently mean a loss of quality. It only does if you transcode to lossy formats (Opus, AAC, etc.) rather than lossless formats (PCM, FLAC, etc.).
 
Maybe it wasn't entirely clear, but yeah, I know the CX can't decode DTS unlike the E9 - that's why I'm asking about transcoding to a supported lossless format like PCM or FLAC: contrary to what elvn says, transcoding does not inherently mean a loss of quality. It only does if you transcode to lossy formats (Opus, AAC, etc.) rather than lossless formats (PCM, FLAC, etc.).
To my knowledge, no. It seems to only be something the client can do (decode to 7.1PCM rather than bitstream the encoded format). It would definitely be useful to be able to do it on the server side.
 
Do you reckon next year they may bring 144Hz to their OLED range?
It's my impression that BFI at 120Hz on the CX results in a given pixel being turned off for the same amount of time that it's turned on, meaning that the panel is already natively 240Hz.

So assuming that this is accurate, then going straight to 240Hz input seems like the more likely scenario.
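Assuming that 50/50 on/off behavior is accurate, the arithmetic is simple: each BFI phase at 120Hz lasts exactly one period of a 240Hz refresh. A quick sketch:

```python
def bfi_phase_ms(refresh_hz, duty_on=0.5):
    """Length of each 'on' (and 'off') phase when BFI runs at the given
    refresh rate with a 50% duty cycle."""
    return (1000 / refresh_hz) * duty_on

# Each BFI phase at 120 Hz equals one full period at 240 Hz (~4.17 ms),
# which is why a 50% duty cycle implies the panel is driven at 240 Hz.
print(bfi_phase_ms(120))   # ~4.17 ms
print(1000 / 240)          # ~4.17 ms
```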
 
Transcoding will always have some minor loss. It is not the same as remuxing 1:1 beforehand. Whether you can notice personally, whether it is noticeable in testing, or whether that "not the original 1:1" fact bothers you or not are other questions. Plenty of people watch streaming services which are much more watered down (compressed/bit rate) with no issues for example. I think when plex transcodes the audio it might also be doing the video by default too. Besides transcoding, the plex WebOS app was crashing or seeing a file as truncated partway through several videos I played among other things. I already have a shield on my living room tv so I sprang for another on the CX so I didn't have to worry about anything other than swapping my receiver input when I switch to the shield.

-------------------

Regarding BFI, I'm not sure what method LG uses, but some BFI implementations do a rolling scan independent of the refresh rate.

https://forums.blurbusters.com/viewtopic.php?t=6310#p47203
 
Transcoding will always have some minor loss. It is not the same as remuxing 1:1 beforehand. Whether you can notice personally, whether it is noticeable in testing, or whether that "not the original 1:1" fact bothers you or not are other questions. Plenty of people watch streaming services which are much more watered down (compressed/bit rate) with no issues for example. I think when plex transcodes the audio it might also be doing the video by default too. Besides transcoding, the plex WebOS app was crashing or seeing a file as truncated partway through several videos I played among other things. I already have a shield on my living room tv so I sprang for another on the CX so I didn't have to worry about anything other than swapping my receiver input when I switch to the shield.

-------------------

Regarding BFI, I'm not sure what method LG uses, but some BFI implementations do a rolling scan independent of the refresh rate.

https://forums.blurbusters.com/viewtopic.php?t=6310#p47203
This is not true. If you take a zipped file, unzip it, then rar it, was there "loss"? No, they are lossless forms of encoding or they literally wouldn't work. PCM is the same as the unzipped file, FLAC is the same as the rar.
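The zip/rar analogy is easy to demonstrate with any lossless codec. A minimal sketch using Python's zlib as a stand-in for FLAC:

```python
import zlib

# The zip/rar analogy in miniature: run a byte buffer through a lossless
# codec and back, and confirm it is bit-identical. Lossless audio codecs
# like FLAC make the same guarantee for PCM samples (zlib here is just a
# convenient stand-in).
original = bytes(range(256)) * 64          # stand-in for raw PCM samples
compressed = zlib.compress(original)       # analogous to PCM -> FLAC
restored = zlib.decompress(compressed)     # analogous to FLAC -> PCM

print(restored == original)                # True: no loss
print(len(compressed) < len(original))     # True: and it's smaller
```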
 
It's possible to transcode to a higher codec, but I don't think that's generally happening on the fly from something like plex, as opposed to remuxing something beforehand.

Lossless remuxing vs typical transcoding on the fly are different.

https://video.stackexchange.com/questions/21618/does-transcoding-always-mean-quality-loss

Generally, H.264 and H.265 (as well as others like VP9) are lossy codecs, at least in their default settings with most encoders. This means that whenever you re-encode from one to another (or even in the same codec), you throw away information.


Whether this information loss is visible or not depends on your source material and the chosen settings, of course.


To combat the loss of information, you often need to choose a higher bitrate than what your input is, or a higher constant-quality setting than the default of the respective encoders. There is no "general principle" here – it really depends on what the default bitrates or quality targets are for the encoders you're using.
 
It's possible to transcode to a higher codec, but I don't think that's generally happening on the fly from something like plex, as opposed to remuxing something beforehand.

Lossless remuxing vs typical transcoding on the fly are different.

https://video.stackexchange.com/questions/21618/does-transcoding-always-mean-quality-loss
We're talking audio, and transcoding, not remuxing. Lossless audio codecs have been a solved problem for a decade or two, and even completely uncompressed audio is a manageable amount of bandwidth/storage. Video is very different; bit-perfect lossless video will probably never be a thing due to the storage/bandwidth needed. For video, perceived quality will always be better compressing a higher-resolution source with a lossy algo than keeping a lossless lower-resolution source, so that's the strategy that will always be used up to the maximum "acceptable" level of storage/bandwidth. A 24FPS, 12-bit color 720p video would be ~400Mbps, for example. Who would want to watch that over a 4k HEVC video with a ~20Mbps bitrate?
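For what it's worth, that ~400Mbps figure works out if you assume 4:2:0 chroma subsampling (an assumption on my part; full 4:4:4 would roughly double it). A quick sanity check:

```python
def raw_bitrate_mbps(width, height, bits_per_pixel, fps):
    """Bitrate of completely uncompressed video, in Mbps."""
    return width * height * bits_per_pixel * fps / 1e6

# 12-bit 4:2:0 averages 12 + 12/4 + 12/4 = 18 bits per pixel
print(raw_bitrate_mbps(1280, 720, 18, 24))   # ~398 Mbps
```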
 
There is pass-through, decoding, transcoding, remuxing.

I'm not saying a FLAC container passed through with plex isn't lossless or something else like a receiver decoding passed through data.

https://www.reddit.com/r/PleX/comments/4nzdff/can_plex_stream_flac_losslessly/
Plex Media Server supports FLAC just fine, and clients like Plex Media Player and Open Plex Home Theatre do as well. Even the iOS client does, I believe.
Whether or not you'll be able to stream lossless FLAC files without transcoding depends entirely on what hardware you're using, and which release of the Plex client.


I'm saying plex transcoding (not just streaming) on the fly, changing to a different codec. And when you do that with the audio, I'm not sure whether plex doesn't automatically transcode the video too.


https://www.winxdvd.com/video-converter/plex-transcoding.htm
Plex supports pretty much all media formats[1]. However, not all content can be Direct Played on your client device, because your device always has specific requirements for media encoding format, container format, and resolution. Some content needs Direct Stream or Transcode. Direct Stream is used when the Plex client can support the video and audio codec formats but can't support the container format. Transcoding is used when the Plex client can't read either the video or audio format or can't support the resolution, bitrate, etc.[2]

Edit: according to some replies on other forums I looked up, Plex can still direct stream the video in an mp4/mkv even if it transcodes and converts the DTS audio separately. So that concern might be out, at least, though I haven't found it confirmed elsewhere yet. The general idea in all of the plex conversations is to avoid transcoding wherever possible.

Either way, the built-in webOS plex player was dropping playback to a black screen partway through several different titles more than once, acting like the file had been truncated there, so I'd rather just use a shield that passes DTS tracks to my receiver normally without transcoding, even if it cost me some money.
 
This is not true. If you take a zipped file, unzip it, then rar it, was there "loss"? No, they are lossless forms of encoding or they literally wouldn't work. PCM is the same as the unzipped file, FLAC is the same as the rar.

Yes, there is no loss if you take the streams out of a container or change the container without manufacturing whole new streams from the original source (or if you take the time to carefully manufacture a new file beforehand rather than doing it on the fly). I think there may just be some confusion over the terms we're each using.

support.plex.tv

Remuxing, in our context, refers to the process of changing the “container” format used for a given file. For example from MP4 to MKV or from AVI to MKV. It also allows adding or removing of content streams as needed. Remuxing differs from Transcoding in that remuxing a file simply repackages the existing streams while transcoding actually creates new ones from a source.

Transcoding speed/quality
Your Plex Media Server's default value is Very fast. Slower values can give you better video quality and somewhat smaller file sizes, but will generally take significantly longer to complete the processing.

Most users will not want to change this, but those who have a particularly powerful server or who don’t mind much longer processing times might choose a higher quality (slower) value.

There is also hardware/GPU-enabled transcoding, which is faster but can be a little less precise (e.g. occasionally a little more artifacting in dark scenes with a lot of motion, according to some reports). Plex's hardware transcoding was in some cases worse than software, and it had issues at first with hardware HDR transcoding, HDR tonemapping to SDR, etc., but they have been updating it.

I never said transcoding isn't usable, and it's probably not noticeable to a less discerning eye, but it's not 1:1 direct play, especially if you are using plex's default transcode speed.

"Direct Play" (pass-through) > direct stream (change/break the container type and pass the readable video/audio stream types already inside) > transcode (create new streams on the fly from the unplayable streams)
 