LG 48CX

Yes, there's no loss if you take the streams out of the container, or change the container, without manufacturing whole new streams from the original source (or if you take a long time creating a new file rather than doing it on the fly). I think there might just be some confusion over the terms we're each using.

support.plex.tv: Transcoding speed/quality


There is also hardware/GPU-enabled transcoding, which is faster but can be a little less precise (e.g. a little more artifacting occasionally in dark scenes with a lot of motion, according to some reports). Plex's hardware transcoding was in some cases worse than software, and at first they had issues with hardware HDR transcoding, HDR-to-SDR tonemapping, etc., but they have been updating it.

I never said transcoding isn't usable, and it's probably not noticeable to the less discerning eye, but it's not 1:1 direct play, especially if you are using Plex's default transcode speed.

"Direct Play" (pass-through) > Direct Stream (change/break the container type and pass along the already-readable video/audio streams inside) > Transcode (create new streams on the fly from the unplayable ones)
No, I think there's confusion about lossless codecs. TrueHD/DTS-HD MA/FLAC are all lossless codecs. You can transcode between them an infinite number of times, and the PCM they all decode to will be bit-identical, just like zipping/raring a file. It's absolutely still transcoding, not remuxing; the stream is decoded from one format and encoded into another. But there is no loss whatsoever. The wish is that Plex could transcode to FLAC (Dolby TrueHD/DTS-HD MA encoders aren't freely available) on the fly, or decode to 7.1 PCM on the server side to send to the client, and be more universally compatible when things like CX TVs without DTS decode capability are in the mix.
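To see the "zipping/raring" point in action, here's a minimal sketch (assuming ffmpeg is on your PATH; the file names are hypothetical) that transcodes a lossless track to FLAC, then decodes both to raw PCM and compares checksums. A truly lossless chain yields identical hashes:

```python
import subprocess

def pcm_md5(path: str) -> str:
    """Decode the first audio stream to raw 24-bit PCM and return its MD5."""
    out = subprocess.run(
        ["ffmpeg", "-v", "error", "-i", path,
         "-map", "0:a:0",          # first audio stream only
         "-c:a", "pcm_s24le",      # decode to raw PCM
         "-f", "md5", "-"],        # ffmpeg's md5 muxer prints the hash
        capture_output=True, text=True, check=True)
    return out.stdout.strip()      # e.g. "MD5=9b2c..."

# Transcode the lossless track to FLAC (still a transcode, still lossless)...
subprocess.run(["ffmpeg", "-v", "error", "-i", "movie.mkv",
                "-map", "0:a:0", "-c:a", "flac", "audio.flac"], check=True)

# ...and verify the decoded PCM is bit-identical to the original's.
assert pcm_md5("movie.mkv") == pcm_md5("audio.flac")
```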
 
I get what you are saying now, thanks. I'm assuming from what you said that lossless audio, being smaller and maybe by the nature of the codec and decode, can be transcoded fast enough (while remaining lossless) on the fly by Plex, as opposed to full-quality video, which takes some loss by default when Plex transcodes it on the fly.

The Plex server can be forced to transcode to AAC if you disable DTS in the Plex client, so I had been trying that at first. What I said about transcoding vs. quality, as opposed to what happens when remuxing the container type (like changing from a RAR full of a few files to a ZIP of those files, at least container-wise), holds true on the video end (rather than the audio end, *when it's a lossless audio codec*) with Plex transcoding. That's what stuck in my head and what I posted quotes and examples of. So my concern is/was that I heard somewhere that forcing DTS transcoding, which kicks Plex into transcoding mode, might also transcode the video stream automatically. If that were the case, the video quality could be degraded slightly when the forced DTS transcoding "trick" was used to get some audio off a DTS title through the LG CX's webOS Plex client. Now that I've looked it up more, it might not be transcoding the video when you do that, but I'm not positive.

Where I posted (*in regard to streams that aren't lossless audio formats):
Remuxing, in our context, refers to the process of changing the “container” format used for a given file. For example from MP4 to MKV or from AVI to MKV. It also allows adding or removing of content streams as needed. Remuxing differs from Transcoding in that remuxing a file simply repackages the existing streams while transcoding actually creates new ones from a source.
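A quick hedged sketch of that distinction (assuming ffmpeg is installed; the file names are hypothetical): a remux just copies the existing streams into a new container, while a transcode creates brand-new ones:

```python
import subprocess

# Remux: repackage the existing streams into a new container, bit-for-bit.
# "-c copy" means nothing is decoded or re-encoded, so it's fast and lossless.
subprocess.run(["ffmpeg", "-i", "movie.avi", "-map", "0",
                "-c", "copy", "movie.mkv"], check=True)

# Transcode: create new streams from the source (lossy for the video here).
subprocess.run(["ffmpeg", "-i", "movie.avi",
                "-c:v", "libx264", "-crf", "18",  # new H.264 video stream
                "-c:a", "aac", "-b:a", "256k",    # new AAC audio stream
                "transcoded.mkv"], check=True)
```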

Transcoding speed/quality

Your Plex Media Server's default value is Very fast. Slower values can give you better video quality and somewhat smaller file sizes, but will generally take significantly longer to complete the processing.

Most users will not want to change this, but those who have a particularly powerful server or who don’t mind much longer processing times might choose a higher quality (slower) value.

In any event, when I did the forced DTS-to-AAC transcode, several titles acted as if they were truncated partway through in the webOS Plex client. To the client, the video file ended there, so even after the videos abruptly ended to a black screen I couldn't resume them. Compared to the Shield, the indexing was a little slower too (not unusable though). All things considered, coming from Nvidia Shields that pretty much pass through anything (and have gigabit connections, USB 3 ports, etc.), I decided to go with another Shield on my LG CX and just pass everything (including DTS) through to my receiver unchanged so I don't have to worry about all of that. The only thing the Shield lacks is YouTube HDR, but I can tell the LG remote to bounce back to webOS YouTube anytime. Most of the YouTube HDR content has bad static logos promoting the channel/brand anyway, so I don't like leaving those on a loop.
 
Plex can transcode video/audio independently. It also does an on-the-fly remux if it can. I think when people use the term transcode, they usually mean lossy encoding. Plex always says "transcode" in the server dashboard even if it is doing a lossless remux, which is sort of annoying. I think Tautulli might show it properly, but I haven't tested.

For anyone w/ the nvidia shield, how's the AI upscaling w/ plex? Does anyone know how it is implemented? Is it like DLSS? In mpv you can use similar upscaling shaders, which work really well for certain types of content, but I think the nvidia thing is different. I wonder if that feature would make it worthwhile for 1080p content.
 
That would explain how it would be hard to determine from just looking at the dashboard. Thanks for the clarifications overall, hhkb and mirkendargen. Unfortunately, even if forcing Plex to transcode the DTS to AAC kept full-fidelity sound and left the MP4/video untouched (direct play, or at least direct stream/remux), I was still getting bad performance issues on occasion. The worst of those was the truncated-file issue: dropping out of playback with no ability to resume, since the LG webOS Plex client thinks the file ends there after it happens. It still might be a valid workaround for some people, but the performance results in my case were not solid enough for my liking. I also just love the overall capabilities of the feature-rich Nvidia Shields that just work and pass through just about everything (I also find the lack of gigabit on the LG annoying), so even at a decent price the Shield wasn't that big a leap for me to pull the trigger on.


=================================================

Shield AI upscaling --
Is it like DLSS? Kind of. It is Nvidia's machine-learning AI upscaling, but DLSS is used in a few different ways from what I've read. On the quality setting it sounds pretty similar, if you consider DLSS's reduced render resolution a similar starting point to a video starting at 1080p on a 4K screen. If you prioritize performance in DLSS 2.0 it will look worse, though.

(from a random GeForce forums post:)
DLSS on the quality setting runs at essentially half the resolution and then upscales it. The quality algorithm is quite good, and with DLSS 2.0 you would be hard-pressed to find a difference for the performance boost you receive. Performance, on the other hand, takes an image that's 1/4 resolution and upscales it. You will notice it, but in cases where the performance is terrible it may be the only way the game is playable.

https://blogs.nvidia.com/blog/2020/02/03/what-is-ai-upscaling/
Traditional upscaling starts with a low-resolution image and tries to improve its visual quality at higher resolutions. AI upscaling takes a different approach: Given a low-resolution image, a deep learning model predicts a high-resolution image that would downscale to look like the original, low-resolution image.

To predict the upscaled images with high accuracy, a neural network model must be trained on countless images. The deployed AI model can then take low-resolution video and produce incredible sharpness and enhanced details no traditional scaler can recreate. Edges look sharper, hair looks scruffier and landscapes pop with striking clarity.
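That "downscales back to the original" framing is easy to sanity-check yourself. A rough sketch (assuming Pillow and NumPy are installed; the file names are hypothetical): downscale the upscaler's output and measure how far it drifts from the low-res input:

```python
import numpy as np
from PIL import Image  # pip install Pillow numpy

lowres = Image.open("lowres.png")        # hypothetical low-res source
highres = Image.open("ai_upscaled.png")  # hypothetical AI-upscaled output

# The model's goal: its high-res prediction, downscaled, should match the input.
roundtrip = highres.resize(lowres.size, Image.LANCZOS)
err = np.mean(np.abs(np.asarray(roundtrip, float) - np.asarray(lowres, float)))
print(f"mean absolute round-trip error: {err:.2f} (0 = perfectly consistent)")
```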


However, the Shield's AI upscaling lacks the hardware that a full Nvidia GPU has for DLSS:
https://www.pcgamer.com/if-the-nint...upport-i-want-it-in-my-new-nvidia-shield-too/

The Shield TV Pro already supports AI upscaling for video, although it lacks the hardware to handle DLSS locally itself, nor can it match the fidelity that DLSS is able to achieve. Right now all the super sampling is handled on the server side for GeForce Now game streaming. A newer chip could handle a form of DLSS upscaling within the Shield itself, meaning that you wouldn't need such a phat internet pipe to play games at 4K.

If the Tensor Cores in a new Ampere-based Shield could be used for a content-agnostic DLSS-analogue that worked on simple video streams, rather than needing to be added on a per-game basis, then you would only need a low-res stream from the source. The updated Shield could then do all the super-smart super sampling on the client end.

Though what that might do for latency we're not entirely sure.

Nvidia has released numerous iterations of its SoC offerings since the Tegra X1 was first introduced, including the Pascal-based Tegra X2, the Volta-based Xavier, and most recently Orin.

Orin would appear to have the digital makeup needed to deliver on the promise set out by the Bloomberg rumour. Orin was first announced at the GPU Technology Conference 2018, where Nvidia boasted it had 17 billion transistors and 12 ARM Hercules cores.

Orin is Ampere-based and therefore has access to the Tensor cores necessary to weave the DLSS magic. Not only that, but while the Tegra X1 has 256 CUDA Cores, Orin has 2048 CUDA Cores.

The assumption was that Orin was destined for the vehicle market, but if these rumours for the Nintendo Switch Pro are true, then it looks like a lot of the hard work would be done for a new Shield as well. A Shield capable of streaming using GeForce Now and upscaling to 4K at high refresh rates at the same time.

https://www.androidpolice.com/2020/...rce-now-and-gamestream-graphics-apk-download/

say you're taking advantage of cloud-streamed games and you're really looking for that extra punch of detail? Perhaps it's best to own last year's Nvidia Shield TV Pro — the complementary Nvidia Games app has been updated to enable AI upscaling on the company's GeForce Now and GameStream platforms.

Once the update's installed, AI upscaling can be toggled in a new, dedicated settings menu on the 2019 Pro. We got word from Nvidia a few weeks back that this very update would enable GeForce Now games to scale up to 4K at 60fps.

Nvidia spokesperson Jordan Dodge noted in a tweet that the company has tested and found upscale-generated lag to max out at around 1 or 2 frames.


AI upscaling on the shield - This guy seemed wowed by it. heh.

https://www.reddit.com/r/ShieldAndroidTV/comments/dqa16g/ai_upscaling_wow/
I just bought the new NVIDIA Shield TV (cigar tube). My first Shield. I've used or owned an Amazon Fire TV Stick 4K, Roku Streaming Stick + and Apple TV on my Vizio M558-G1 4K set.
Like many, I was skeptical about the new AI Upscaling feature that uses AI to make 1080 content look closer to 4K. I just figured it was NVIDIA's way to try to snow consumers to try the new Shield instead of Apple TV 4K.
Wrong. This AI Upscaling is legit. Very legit.
I just turned on a documentary in 1080 on Netflix with AI Upscaler on. Damned if it didn't look like 4K. Far more crisp and vivid than the upscaler in any other streaming stick or box I've used, including Apple 4K TV.
If you watch a lot of 1080 stuff on Netflix or Amazon Prime Video, this feature is a game-changer. I'm not a fanboy of any platform or a Reddit plant for NVIDIA. I'm just a regular consumer who is blown away by this feature.
The AI Upscaling only handles 30 fps content, so everything I stream on YouTube TV unfortunately doesn't get this treatment. But it still upscales very nicely with the new Shield, just as well as Apple TV 4K to my eyes.

----------------------

This link has some side-by-side/split comparison images (rough photos taken with a camera) of an Nvidia Shield TV's demo mode:

https://www.talkandroid.com/guides/android-tv-guides/nvidia-shield-tv-ai-upscaling/

If you’re watching standard 1080p content with a 1920 x 1080 resolution on a 4K television with 3840 x 2160 resolution, that means you’re putting about 2 million pixels onto a display that can show about 8 million pixels. Now, with zero upscaling you’d just see a small box of your TV show that only takes up 25% of the center of your screen, surrounded by gigantic black bars on all sides. But obviously that would be a pretty terrible experience, so instead of showing a 1:1 image of a 1080p file, your TV/streaming box tries to “fill in” those remaining 6 million pixels so you get a full-screen image.

That content can be filled in by just stretching out the pixels to fill the screen, which looks terrible, or by upscaling it. Upscaling takes that content and tries to “guess” what the nearby pixels should look like to give you a crisper, clearer image. For the most part now even cheaper TVs and boxes do this reasonably well, but NVIDIA’s solution uses AI and machine learning to take significantly more educated guesses about those surrounding pixels

It’s also important to keep in mind that this only works on 1080p and 720p content at 30FPS, so you won’t be able to use it on a lot of YouTube videos or things that are already in 4K. But if you do watch a lot of standard 1080p content, then this excellent upscaling might just be enough to tip you over into buying an NVIDIA Shield TV over whatever else you had on your list.
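For a rough feel of the "stretching vs. upscaling" difference described above, here's a tiny sketch (assuming Pillow is installed; frame.png is a hypothetical 1080p grab). The AI scaler then goes a step beyond the interpolated guess:

```python
from PIL import Image  # pip install Pillow

frame = Image.open("frame.png")  # hypothetical 1920x1080 source frame

# "Stretching out the pixels": each source pixel just becomes a 2x2 block.
stretched = frame.resize((3840, 2160), Image.NEAREST)

# Traditional upscaling: interpolate between neighbors for a smoother guess.
upscaled = frame.resize((3840, 2160), Image.LANCZOS)

stretched.save("stretched.png")
upscaled.save("upscaled.png")
```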

---------------------------------------
Some basic info here, with a simulation of the AI upscaling on a gecko photo:
https://blogs.nvidia.com/blog/2020/02/03/what-is-ai-upscaling/

========================

https://hackaday.com/2021/04/05/ai-upscaling-and-the-future-of-content-delivery/

While these may be early days, it seems pretty clear that machine learning systems like Deep Learning Super Sampling hold a lot of promise for gaming. But the idea isn’t limited to just video games. There’s also a big push towards using similar algorithms to enhance older films and television shows for which no higher resolution version exists. Both proprietary and open software is now available that leverages the computational power of modern GPUs to upscale still images as well as video.
The software AI upscaling they mention is horribly slow on the fly (1 to 2 fps), but it's still an interesting read.
 
It can't be exactly DLSS, because DLSS requires the rendering engine to present motion vectors so that it can estimate movement. That obviously wouldn't be available from a dumb video stream. Some elements could be shared though.
 
The pcgamer article I linked said DLSS (quality setting) is better (especially for upscaling games, which I think is what it's talking about there), but they both use AI upscaling/deep learning.

https://www.pcgamer.com/if-the-nint...upport-i-want-it-in-my-new-nvidia-shield-too/
The Shield TV Pro already supports AI upscaling for video, although it lacks the hardware to handle DLSS locally itself, nor can it match the fidelity that DLSS is able to achieve.

Regarding vector-based DLSS: in that case it sounds like how time warp in VR projects the next frame from the vector/head movement. DLSS is only supported on certain games that have been "digested," though, which makes it seem less on-the-fly and more pre-learned where games are concerned. So even if it's smart enough to project the next frame from a vector, if it hasn't learned the game enough beforehand it doesn't sound like it can AI-enhance some random game on the fly. I'll have to find out whether DLSS support for a game depends on running the game through deep learning alone, or whether the game has to specifically send vector data/handles to DLSS or something. The final result is probably much better fidelity with DLSS-capable content than Shield AI upscaling, all factors considered.

It makes me wonder if you could theoretically run a particular video through deep learning a bunch of times beforehand, like a long render, as an option to improve it even more for that specific title. Though it could be that the AI upscaling available to the consumer can't learn any more (can't improve on how much it improves) for a specific video/title unless Nvidia directly supported the title, feeding it to deep learning ahead of time themselves in order to publish support for that movie; i.e. a directly supported AI-upscaling movie list, like the Dolby Vision supported movie library (pre-baked mastered tone mapping) or DLSS-supported games. I think their goal is to keep the photo and video upscaling generic for general usage, though some articles mention the potential for applying it to old movies and resolutions beforehand for new releases.

That software AI upscaling mentioned in that hackaday article makes me wonder if you could "render" AI upscaling of a movie title similarly yourself over a long time and bake it in somehow.
 
That obviously wouldn't be available from a dumb video stream

You could theoretically take a "dumb" movie title, where DLSS would be blind to any vectors, and map virtual cameras to it to duplicate the movement of the actual camera used in filming, scene by scene.

That could either be done painstakingly scene by scene by hand, or maybe by using AI to do a rough first pass of where the virtual camera is and how it moves in each scene, which you (mastering professionals) would later tweak scene by scene. That's done in at least some CGI scenes in order to make a composite where the CGI elements match even when the real camera is moving; the virtual world/CGI layer's virtual camera then tracks exactly the same as the real-world one. They capture motion data from digital cameras now as they film in the first place, kind of like they mo-cap actors, at least in CGI scenes, but that could be done with the cameras throughout if they wanted to. Deep learning/AI upscaling might already be figuring out a rough guesstimate of that movement in some fashion and working from it, though.

Whether you mapped virtual cameras to each scene's existing camera movement and zoom in a pre-existing title, or you captured that during filming, you could end up with the vector data DLSS needs. In the case of a movie, rather than a game, the vectors would always be the same once learned. Even a live video feed could potentially have a camera with motion-capture/state data sent live to DLSS/AI upscaling systems if it were needed for some reason (transmitting a lower resolution, for example).

I'd be curious to see a side-by-side comparison of a recorded video of a very graphically detailed game being AI-upscaled on the fly from a base res vs. the same game played in real time using DLSS 2.0 from the same base res (e.g. 1080p to 4K in both cases). Then theoretically add a third version with the game's virtual-camera vectors applied to DLSS on the recorded video. Also the same comparison with a live video feed of a real-life event vs. a recording of it, then the same recording plus the camera motion-capture/vector data.

A little off topic, just intriguing to me.
 
DLSS 1.0 is based on a dedicated training model for each game that supported it (and was ok but not great). DLSS 2.0+ doesn't need training for each game, it just has a universal model done by Nvidia and needs motion vectors provided by the engine (and looks amazing).

Yes, theoretically you could run one frame behind in a video to achieve the same thing; this is exactly how the motion smoothing (frame interpolation) done by TVs themselves works.
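A back-of-the-napkin sketch of that one-frame-behind idea (assuming OpenCV and NumPy are installed; purely illustrative, not what the Shield or any TV actually runs): estimate per-pixel motion vectors between two frames with optical flow, then warp halfway along them to synthesize the in-between frame:

```python
import cv2
import numpy as np

prev = cv2.imread("frame_n.png")   # hypothetical consecutive frames
nxt = cv2.imread("frame_n1.png")

# Estimate per-pixel motion vectors - the data a game engine hands to DLSS,
# which a dumb video stream lacks and therefore has to guess like this.
flow = cv2.calcOpticalFlowFarneback(
    cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY),
    cv2.cvtColor(nxt, cv2.COLOR_BGR2GRAY),
    None, 0.5, 3, 15, 3, 5, 1.2, 0)

# Crudely warp the previous frame half a step along the flow: an in-between
# frame, which is why an interpolating TV must run at least one frame behind.
h, w = flow.shape[:2]
grid = np.indices((h, w)).astype(np.float32)
midframe = cv2.remap(prev,
                     grid[1] + flow[..., 0] * 0.5,  # x sample coordinates
                     grid[0] + flow[..., 1] * 0.5,  # y sample coordinates
                     cv2.INTER_LINEAR)
cv2.imwrite("frame_n_half.png", midframe)
```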
 
Anyone know the power usage for the LG CX 48? The UPS mine is hooked up to gave up the ghost; it was an older 900W unit and I don't think I need to go that big since this is the only thing plugged in now.
 
Anyone know the power usage for the LG CX 48? The UPS mine is hooked up to gave up the ghost; it was an older 900W unit and I don't think I need to go that big since this is the only thing plugged in now.
Dude. It's like 50 watts, maybe. Therefore you have plenty of capacity for a three-screen setup. Get on it!
 
My bad, it's 50W with a black screen. About 90W average. You have the capacity for only 8 more screens. 7 to be safe. ;)
Hehe, it was an old recycled UPS I was using off the test bench. I would move those off critical testing duty every two or three years and put them on something else that could use them. We get weird power and lightning all the time out here. Lightning ate a big UPS and a 1200W Seasonic PSU last month.
 
Hehe, it was an old recycled UPS I was using off the test bench. I would move those off critical testing duty every two or three years and put them on something else that could use them. We get weird power and lightning all the time out here. Lightning ate a big UPS and a 1200W Seasonic PSU last month.
Oof! That stinks. We used to live near a grain elevator. I would lose a NIC if I didn't unplug my cable during thunderstorms.
 
Hehe, it was an old recycled UPS I was using off the test bench. I would move those off critical testing duty every two or three years and put them on something else that could use them. We get weird power and lightning all the time out here. Lightning ate a big UPS and a 1200W Seasonic PSU last month.

You might want to consider springing for a line conditioner, at least for your PC, at some point. I know some UPSes have line conditioning, but a separate unit seems more robust (and they supposedly warranty your gear up to ~ $25k, though I've never had to use that). I use one ahead of a UPS on my PC and one on my home theater for the more fragile electronics and the things I run off generator power in an outage, plus a few other things just plugged into the line conditioners and not the uninterruptible power supplies. That doesn't help access-wise if the FiOS line goes down, though, of course.


https://www.tripplite.com/2400w-120...tion-avr-ac-surge-protection-6-outlets~LC2400

https://www.amazon.com/Tripp-Lite-LC2400-Conditioner-Outlet/dp/B0000514OG

https://www.newegg.com/apc-h15blk-power-conditioner-surge-protector/p/N82E16882303008

There are also these:
https://musiccritic.com/equipment/studio/power-conditioner/

I don't use these but they exist:

7 of the Best Ethernet Surge Protectors Reviewed for 2021
https://gagthesurge.com/ethernet-surge-protectors/

----------------

There are whole-house line conditioners too, but I've never gone that far. Those would help protect the boards in appliances along with everything else. My parents lost a fridge once due to a storm's outages while using their generator. The fridge is on a mini conditioner of some kind now, behind the fridge.

Even if your electronics don't outright die, brownouts and surges can decrease the lifespan of things. Running on a generator is worse; it fluctuates briefly any time the load changes considerably, unless you have a line conditioner built into your house or at least at your devices.
 
Ordered a shield off of Amazon Warehouse, and just got it today. Apart from a busted ethernet cable making me think it was broken, so far so good. The AI upscaling is really great, actually. It is very similar to https://github.com/haasn/FSRCNN-TensorFlow, which I was using with mpv before I moved everything to Plex. Instead of the slightly blurred upscale in the default TV Plex app, things look nice and sharp, similar to native 1080p. For all I know they are using the same underlying algorithm. I haven't had luck getting my DV files to play yet though.
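For reference, the mpv route works by pointing mpv at a prescaler shader in mpv.conf; something like this (the shader path/filename is a placeholder for wherever you saved the generated .glsl file):

```
# mpv.conf - run an FSRCNN prescaler shader when upscaling (path is a placeholder)
glsl-shaders="~~/shaders/FSRCNN_x2.glsl"
profile=gpu-hq
```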
 
Glad you like it. I just used the upscaling on a 1080p fantasy movie that isn't available in 4K (yet, if ever) and it looked great.

I'm using Dolby Vision on the Shield after the Shield update prompted me to turn it on in the Shield settings. I did have to turn the Dolby Vision slider off and back on once to get it to stick, though. The Dolby Vision caption will pop up in the upper right corner after that. You can also hit the three dots "..." on the LG remote and move the cursor to the lower right of the menu overlay, under TV functions where it says "information," and the Dolby Vision logo will be there with the other info if it's working.

I'm using Emby currently, but Plex updated to work with Dolby Vision in MKV containers some time ago, so it should work. Maybe you just have to turn it off and on again in the Shield settings. When I first started using HDR mode all of the time on the LG in Windows 10 I had to turn it off and on a few times, but that hasn't been the case in a long time now.
 
On PC you would set the refresh rate to 120 Hz and let VRR or G-SYNC do its thing. On PS5 you would set to 120 Hz and let the console do its thing.

If RE8 or any game looks washed out then you need to change the HDR calibration in the game. For RE8 follow the instructions to make the inner box disappear and make the red and blue marks equal length. If you have HDR enabled then it shouldn't be possible to change the color space in the options for the game.

On PC make sure you enabled HDR in the Windows Display Settings and that you have color settings in the NVIDIA control panel set to Use default color settings.

On PS5 make sure to run the console's built in HDR calibration tool and follow the instructions. Except on the last step (3/3) you want to set it as dark as it will go no matter what is shown on screen.

On the TV make sure you have Deep Color enabled for all the HDMI inputs being used.
Thanks. So I can do full RGB 120 Hz on PS5 for games running 60 fps? Use the input lag booster then, or not?
So on the CX/C1 the gamma is only correct at 120 Hz (let's say without VRR, so fixed 120 Hz). If I set the PC res or PS5 to 4K 60 Hz, will it have wrong gamma, or does the TV have a few internal voltage modes for fixed refresh rates? I understand VRR starts from 120 Hz down, but maybe the fixed modes are all correct?
And the way the input lag booster 120 Hz "cheat" works with all of this is doubly confusing :p
Maybe on PS5, if most games are 60 Hz, it is good to use some frame insertion?
 
Anyone know the power usage for the LG CX 48? The UPS mine is hooked up to gave up the ghost; it was an older 900W unit and I don't think I need to go that big since this is the only thing plugged in now.
Is it just the batteries? They are usually easy and cheaper to replace.
 
Thanks. So I can do full RGB 120 Hz on PS5 for games running 60 fps? Use the input lag booster then, or not?
So on the CX/C1 the gamma is only correct at 120 Hz (let's say without VRR, so fixed 120 Hz). If I set the PC res or PS5 to 4K 60 Hz, will it have wrong gamma, or does the TV have a few internal voltage modes for fixed refresh rates? I understand VRR starts from 120 Hz down, but maybe the fixed modes are all correct?
And the way the input lag booster 120 Hz "cheat" works with all of this is doubly confusing :p
Maybe on PS5, if most games are 60 Hz, it is good to use some frame insertion?
Just stop overthinking it. I run my CX at 4K 120 Hz pretty much at all times on PC. PS5 will change to 4K 60 Hz or 4K 120 Hz depending if the game supports 120 Hz.

BFI is only viable to use for SDR games as it reduces HDR highlights way too much.
 
To save me from digging through 180+ pages: how has the CX been working after all this time for PC display purposes and gaming with G-Sync? I know there used to be a bunch of quirks when these first came out. Has anyone experienced burn-in problems yet with heavy browser usage? Currently I have the Sammy curved 43" 4K for my game rig (the one that was touted quite a bit here on the forums several years ago) and I was using an LG 32" 4K monitor for my daily computer. I then got the Asus XG43VQ to replace the LG, but it seems it was too close, or my eyes just have too much of an issue with the 1200p res and text.

I currently have the Asus sitting in front of my Sammy to get a feel for a SUW display in this location and to have it further away from me to see if it helps with the eyestrain, which has improved quite a bit but isn't perfect yet. If my Sammy weren't still on the desk the Asus would be another foot away; it's currently at 3'. I will say I do like the feel of the SUW and could probably easily go for an even wider display, but at a higher resolution.

I'm looking at the CX 48 as a possibility if others using it as a PC monitor can confirm whether the font is crisp or not. My Sammy has meh font rendering, and I couldn't see myself looking at the same quality of font on a daily basis if I switch to a single display for my daily.
I'm leaning more toward the G9 certified refurbished with a 2-year general warranty, or the AOC AGON AG493UCX, which is new and has a 4-year pixel/panel warranty. If I went with the CX, it's new and has a 4-year burn-in warranty but is $400 more than the other two, which are $1,000 each.

Excuse the mess; a bomb went off trying to test the Asus on the other section of desk and running USB cables to a switch for one K/M for two computers, without making it a permanent solution yet.
 

Attachment: asus.jpg
To save me from digging through 180+ pages... [snip]
I have the 48" CX with a factory clocked RTX 3070 (soon to be 3080). In short, the display is fantastic: text is crisp, HDR gaming is the real deal. It was easy to set up and has been solid ever since.

I have no burn-in issues and I don't expect any. Hide your desktop and choose dark colors for your toolbar. Set a screen saver (just in case) but practice turning off your unit when you leave for periods over 15 minutes.

It really is the best monitor I have ever used (over the last 25 years).
 
I have the 48" CX with a factory clocked RTX 3070 (soon to be 3080). In short, the display is fantastic... [snip]
I normally have browser windows open, especially with trying to cop both of those cards for each system (1080 FTW on the daily and 2080 Ti on the game rig currently), but I do have the display set to turn off in Windows after 5 minutes. Is there a need to actually power off the display during the day if it's already set to turn off with the Windows power settings? I also have a dark theme, so the top bars of all windows and the taskbar are black/gray.
 
To save me from digging through 180+ pages... [snip]

The topic of text crispness is so subjective that it's kinda hard to give an answer. Best I can do is give you some points of comparison. The text on the CX is obviously worse than my Acer X27 4K IPS panel, but it is actually better than an old 32" 1440p VA panel I had a few years ago, despite the PPI of the two displays (the LG CX and that old VA panel) being nearly identical. I would say as long as you have PC mode enabled on the TV and ClearType tuned to your taste in Windows, the text is usable even for long work sessions. It's not the greatest, but it is also not the worst.
 
I have the 48" CX with a factory clocked RTX 3070 (soon to be 3080). In short, the display is fantastic... [snip]

Ditto on all this. Particularly the parts about HDR and this being the best display I’ve ever used. It’s not perfect but it’s damn good at so many things.
 
To save me from digging through 180+ pages

Here, I dug up a few replies of mine that give my take on a few things. I waited to buy until most of the issues (all the major ones, really) were fixed with firmware patches. This is by far the best multimedia- and gaming-oriented screen I've owned; I'm extremely happy with it.

There are a lot of repeat Q&As in this thread as more people interested in buying make inquiries. Which is fine, but if you search this thread for things like text, viewing distance, screensaver, burn in, burn-in, etc., you'd find most of the answers you are looking for, and people's opinions in the surrounding replies.

I wouldn't trust operating system screen savers.

From about one page back in this thread:

screen savers / avoiding static content where possible

https://hardforum.com/threads/lg-48cx.1991077/post-1044971422
...Taskbar hider set to a hotkey to lock the taskbar away (show/hide toggle). Translucent taskbar app to make it and its edge see-through.

...Dark themes in Windows and browsers/browser add-ons ("color changer", "turn off the lights") to make the backgrounds dark on a per-site basis.
...True black (empty of art) desktop wallpaper.

...Low (relative to SDR levels) HDR brightness on the desktop using the HDR color control menu slider.
...Keep all the burn-in reduction tech on (like Derangel said), including ABL, ASBL, logo dimming, etc.

...Activate the remote's voice functions so that you can hold down the mic button and tell it to "turn off the screen" (which leaves everything running and just turns off the emitters after a 5 second countdown). I do that any time I walk away from my pc (afk) even if just for a minute because I can get sidetracked and/or forget that I left a game running in a static/paused state on the tv.

...Don't rely on PC/Android/etc. system screensavers, because systems/apps/the video device itself can crash or freeze on rare occasions. Crashed-app notification windows can take the top layer above a screen saver. Getting stuck on the BIOS screen (or even just the logon screen) after a spontaneous reboot can happen too. You might want to change your logon screen for this reason as well; a lot of 3rd-party apps let you customize the logon screen.

...I think the best thing you can do besides that (other than buying at Best Buy and adding the Best Buy warranty that reportedly covers burn-in, ~$66/yr over 5 yrs) is to use a 2nd monitor for static desktop/apps and keep the OLED's usage scenario as a gaming/video/multimedia "stage".

Screen dimming will turn on by default if you leave a static screen on for too long, though, and it's pretty dim, so that will definitely help already, aside from doing all of the above. That doesn't happen when people turn off dimming while trying to use the OLED as a static app/desktop monitor. If you keep a set of desktop settings that stays well below the ABL threshold you will avoid ABL, but you won't avoid ASBL unless you go into the service menu and disable it.

---------------------------------------------------------------

I'm pretty sure this info from Rtings.com's C9 OLED review would still apply:
The C9 has a new Peak Brightness setting, which adjusts how the ABL performs. Setting this to 'Off' results in most scenes being displayed at around 303 cd/m², unless the entire screen is bright, in which case the luminosity drops to around 139 cd/m². Increasing this setting to 'Low', 'Med', or 'High' increases the peak brightness of small highlights. If ABL bothers you, setting the contrast to '80' and setting Peak Brightness to 'Off' essentially disables ABL, but the peak brightness is quite a bit lower (246-258 cd/m² in all scenes).

Otherwise, just keeping HDR on all of the time on the desktop and turning the HDR color slider down low enough should do something similar, and it wouldn't affect the color brightness scaling for HDR content/metadata, so that is what most people do, I think. I keep the HDR slider down to where SDR images still pop, so more like an SDR level of color brightness/nits, but some people set this brightness slider very low because they use a lot of static text-based stuff on their screen, since it is usually their only screen in that case.
-----------
-----------
View distance vs. text quality and AA

https://hardforum.com/threads/lg-48cx.1991077/post-1044972495
The 20/20 vision threshold is 60 PPD, which starts at (meaning no closer than) a 33.5" viewing distance / 64 degree viewing angle for a 48" 16:9 4K screen (and starts at ~1.5' on a 27" 4K).

Sitting any closer will mean much poorer text and aliasing. You can try to compensate with aggressive AA and tweak subpixel sampling on text, but it's still not optimal.

While 33.5" / 60 PPD / 64 degrees is the nearest you can sit while still within the 20/20 vision threshold, personally I think what's best for this screen is:

38" view distance: 66.6 PPD, 58 degree horizontal viewing angle
41" view distance: 72 PPD, 54 degrees
44.4" view distance: 76 PPD, 50 degrees
48" view distance: 81.5 PPD, 47 degrees

--------

WOLED contributes to text issues like others have said, but if you are sitting closer than the 20/20 vision threshold, your text subsampling and graphics aliasing are going to be bad regardless, requiring more aggressive ClearType workarounds and more aggressive AA in games just to get back to what you'd be seeing at 33.5" to 48" away.
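Those numbers fall out of basic trigonometry if you want to check them or plug in your own screen size and distance; a quick sketch (Python standard library only):

```python
import math

def ppd(diag_in: float, h_px: int, v_px: int, dist_in: float):
    """Horizontal viewing angle (degrees) and pixels per degree, flat 16:9 screen."""
    aspect = h_px / v_px
    width = diag_in * aspect / math.hypot(aspect, 1.0)  # screen width in inches
    angle = 2 * math.degrees(math.atan(width / (2 * dist_in)))
    return angle, h_px / angle

for d in (33.5, 38, 41, 44.4, 48):
    angle, p = ppd(48, 3840, 2160, d)
    print(f'{d}" away: {angle:.0f} deg horizontal, {p:.1f} PPD')
# 33.5" away: 64 deg horizontal, 60.0 PPD  <- the 20/20 (60 PPD) threshold above
```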
 
New desk really helps keep that 48" manageable! And CoinopsX on the AtGames Legends Ultimate is amazing, with nearly 2000 arcade titles... it's like childhood in a cabinet! And Golden Tee Fore doesn't help my at-home work productivity either.
 
Just stop overthinking it. I run my CX at 4K 120 Hz pretty much at all times on PC. PS5 will change to 4K 60 Hz or 4K 120 Hz depending if the game supports 120 Hz.

BFI is only viable to use for SDR games as it reduces HDR highlights way too much.
That's good advice and I know it.
If the PS5 sets itself to 4K 60 Hz (for a 60 fps game), then maybe if I enable the lag booster the screen will run 60 fps at 120 Hz but with good gamma ;P
 
Anyone else having gamma issues running Metro: Exodus EE on the CX? I have the in-game gamma slider at 0, and still the only way I can avoid grey banding/ come close to inky blacks is by setting the TV's brightness to 40. No issues with other dark games using the recommended setting of 50.
 
Anyone else having gamma issues running Metro: Exodus EE on the CX? I have the in-game gamma slider at 0, and still the only way I can avoid grey banding/ come close to inky blacks is by setting the TV's brightness to 40. No issues with other dark games using the recommended setting of 50.
Try spamming the green button and see what it says for the picture mode etc. Or check the HDMI diagnostics screen (111111 on remote with Connections -> Programmes and tuning highlighted).

It might be going to the wrong black level, and sometimes in HDR I've had the display output the wrong color space. Usually this happens just on the desktop rather than in games, though.
 
Anyone else having gamma issues running Metro: Exodus EE on the CX? I have the in-game gamma slider at 0, and still the only way I can avoid grey banding/ come close to inky blacks is by setting the TV's brightness to 40. No issues with other dark games using the recommended setting of 50.
Are you running the TV in PC mode with black level at full range? Does the color space in the NVIDIA control panel show RGB or YCbCr444 when HDR is enabled? Is the dynamic range set to full?
 
TIL the Live+ feature on the LG CX is a glorified screen logger. It captures images from your TV and sends them to remote servers to be matched for ad tracking purposes. Can't believe such a feature exists, wtf. Anyways, make sure you turn it off.
 
Also I spent some more time with the nvidia shield pro's AI upscaling, and it isn't as good as I thought it was initially. It oversharpens things way too much. Even on the Low setting. I ended up disabling it in favor of just using "enhanced" upscaling. The mpv ML based upscalers are much better. I'm surprised nvidia doesn't adopt those, since they are simply shaders.
 
Also I spent some more time with the nvidia shield pro's AI upscaling, and it isn't as good as I thought it was initially. It oversharpens things way too much. Even on the Low setting. I ended up disabling it in favor of just using "enhanced" upscaling. The mpv ML based upscalers are much better. I'm surprised nvidia doesn't adopt those, since they are simply shaders.

I like it so far but I only use it for 1080p content so I'm not using it that often. Compared to the original 1080p it's way better. Some people prefer a more softened look though. You might be able to make a named set of settings on the tv itself that are less sharp for when you are using ai upscaling. I've never tried doing that myself, at least not yet.
 