LG 48CX

I'd wait until next year, when HDMI 2.1 graphics cards are readily available and LG presents the new CXI series with updates to the current 48"s. Who knows, maybe they'll even announce a smaller OLED size. Unlikely, of course, but who knows.
I thought this was [H]. You're supposed to say buy the CX now and buy the CXI next year.
 
Hey thanks!

So to be clear - if I have G-SYNC on but the game goes above 120 FPS, I could see some tearing (odd that I have not). I was under the mistaken impression that G-SYNC just handled this and there would be no tearing regardless of framerate.
If you cap your FPS below 120, so you are always in the GSYNC range, why do you have to force VSYNC on in the control panel?
It used to, but people complained about the automatic frametime compensation and scanout sync, so NVIDIA separated them from G-SYNC. Forcing V-Sync on re-enables both of these features. Without V-Sync you can get judder. The FPS cap is to prevent the monitor from hitting its max refresh rate and incurring V-Sync input lag.
If you cap the framerate without forcing V-Sync you can get some rare tearing, because the V-Sync option adds a frametime compensation mechanism to G-SYNC. That said, it may or may not be noticeable, so it doesn't matter too much. But enabling V-Sync doesn't hurt at all, so you may as well just do it.

Check out https://blurbusters.com/ if you want more in-depth explanations.
V-Sync also enables scanout sync, because a monitor can only output images so fast.
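To make the capping logic concrete, here's a rough sketch of what a frame limiter (RTSS or an in-game cap) effectively does - purely illustrative, not any actual limiter's code; the cap value and render_frame stub are made up:

```python
import time

TARGET_FPS = 117              # a few FPS under the 120 Hz ceiling keeps you in the VRR range
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    """Stand-in for the game's actual render/present call."""
    pass

deadline = time.perf_counter()
while True:
    render_frame()
    deadline += FRAME_TIME
    # Sleep off the rest of the frame budget so frames are never presented
    # faster than the cap; otherwise the display hits max refresh and you
    # fall back to V-Sync behavior (input lag) or tearing.
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)
    else:
        deadline = time.perf_counter()  # missed the budget; resync the clock
```

Real limiters busy-wait the last fraction of a millisecond for accuracy, but the principle is the same: keep frametimes long enough that the display never leaves its G-SYNC window.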
 
If 2021 models are a decent improvement, I have no problem selling my CX for $800-900 and upgrading. My conspiracy theory is that the 40Gbps ports on the CX were intentional, to leave a path for improvement in the 2021 models, since it appears those will be another rehash, just with more software features + full-fat HDMI 2.1 ports.

The new RTX series is like 2 months away, and once I get that and my 4K/120Hz there's very little incentive to upgrade unless a 43" arrives or they manage to finally hit 1000 nits.

I can't remember the last time I was content with a display before getting this one, but let's see how I feel 6 months from now.
 
If 2021 models are a decent improvement, I have no problem selling my CX for $800-900 and upgrading. My conspiracy theory is that the 40Gbps ports on the CX were intentional, to leave a path for improvement in the 2021 models, since it appears those will be another rehash, just with more software features + full-fat HDMI 2.1 ports.

The new RTX series is like 2 months away, and once I get that and my 4K/120Hz there's very little incentive to upgrade unless a 43" arrives or they manage to finally hit 1000 nits.

I can't remember the last time I was content with a display before getting this one, but let's see how I feel 6 months from now.
Honestly, the only thing I could ask for on these displays is a DisplayPort. Beyond that, I'm super content with my CX 55.
 
If 2021 models are a decent improvement, I have no problem selling my CX for $800-900 and upgrading. My conspiracy theory is that the 40Gbps ports on the CX were intentional, to leave a path for improvement in the 2021 models, since it appears those will be another rehash, just with more software features + full-fat HDMI 2.1 ports.

The new RTX series is like 2 months away, and once I get that and my 4K/120Hz there's very little incentive to upgrade unless a 43" arrives or they manage to finally hit 1000 nits.

I can't remember the last time I was content with a display before getting this one, but let's see how I feel 6 months from now.

48Gbps wouldn't matter unless the new 2021 panels were capable of 12-bit color anyway.
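For anyone who wants the napkin math behind that, here's a quick sketch (raw active-pixel rate only; real HDMI FRL adds blanking and encoding overhead, which is what pushes 12-bit past the 40Gbps ports):

```python
def raw_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    """Uncompressed active-pixel data rate, ignoring blanking/FRL overhead."""
    return width * height * fps * bits_per_channel * channels / 1e9

# 4K @ 120 Hz, full RGB/4:4:4, at different bit depths
for bpc in (8, 10, 12):
    print(f"{bpc}-bit: {raw_rate_gbps(3840, 2160, 120, bpc):.1f} Gbps")

# 8-bit:  23.9 Gbps
# 10-bit: 29.9 Gbps -> fits the CX's 40Gbps ports with room for overhead
# 12-bit: 35.8 Gbps -> needs the full 48Gbps once overhead is added
```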
 
If they were to make an 8k 77", they COULD release a ~38" 4k by cutting them. No idea if/when that is on the radar though.

Who gives a shit? While it would be great to have a 38" form factor, enjoy the 48CX for now, it's seriously amazing!

The 48CX has increased not only my length but girth as well, the wife is very satisfied!
 
So despite the i1 Pro saying all three screens are at 6500K, there is clearly a visible difference between the three. Now, the question is whether the OLED is miscalibrated by the i1 Pro? I'd say likely not, because I had the very same issue back in the day trying to get CCFL and LED monitors to match, and GB-r LED and W-LED ones to match as well. For some reason, different backlight types end up producing different-'looking' whites, though after colorimeter corrections the images do look very close despite this issue. Thoughts? The side screens are very close/identical since both are WLED sRGB screens.
Looking at the picture, only the display on the left looks like correct whites to me, while the right one is reddish, probably due to viewing angles. The OLED does look very blue. Have you tried calibrating it just visually and then checking what the i1 Pro thinks it is seeing? You could also try a different placement of the calibrator and see if that has an effect.

For me, just running a single display in Game mode with Warm 2, contrast 90, OLED Light 30-40 looks alright visually; grayscale does not look tinted to my eyes, but I will see if I can improve the situation with a Spyder5 Pro I have from work.
That is definitely blue. You don't need the other two monitors to notice. You probably need to correct for WOLED since the colorimeter is calibrated for standard RGB/BGR pixels. The extra white subpixel in the LG is probably throwing it off. Just for shiggles try turning the blue component of the white balance on the TV down by 10-20.
It seems like all our CXs are definitely too blue, at least that's what I got in the Game Mode picture. I also had to dial the blue down to -11 and the red up to +3 just to get my white point balanced out to 6500K.

FWIW, when I had my C7 professionally calibrated a few years back, it was calibrated on the ‘Warm 2’ setting. Coincidence? Maybe not.

But yeah, your picture is on the cool side.
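If anyone wants to sanity-check their meter's 6500K readings, McCamy's approximation converts a measured CIE xy chromaticity to a correlated color temperature. Quick illustrative sketch (standard published formula, not anything out of the Calman or i1 software):

```python
def cct_mccamy(x, y):
    """Approximate correlated color temperature (K) from CIE 1931 xy
    chromaticity, using McCamy's cubic approximation."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 white point should land near 6500 K:
print(cct_mccamy(0.3127, 0.3290))  # ~6505 K
```

Worth remembering that two displays can measure the same CCT and still look different side by side: CCT collapses two-dimensional chromaticity to a single number, and a WOLED's spectrum is different enough from a standard LCD's that the meter needs a WOLED-specific correction on top of that.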
 
Any improvements to the CX are going to be very marginal gains at best for the next few years: probably small brightness gains for HDR year over year, better image processing via better processors, more resistance to burn-in, etc. I don't think there will be any HUGE leaps over a CX for a while. Just buy one now and enjoy the hell out of it. I personally can only see myself upgrading to an 8K version of this (because TVs will skip 5K/6K) way later down the road, once GPU power has progressed enough to push 8K at 120Hz.
 
So my experience with Autocal so far...

1. Spyder5 sucks at darks, it starts to struggle and have inconsistent results below 25% gray, and it gets real bad below 10% gray. It also doesn't have an OLED meter mode. I have a friend of a friend with an i1Display Pro I'm going to try and borrow.
2. Calman is crashing during color calibration after a while with a .NET error. I haven't put a ton of time into figuring this out yet, in case just using the i1D fixes it.
3. The way it interfaces with the TV is cool; you can even reset the settings on a picture mode or adjust OLED Light from the computer while measuring luminosity, so you don't have to deal with the remote menus.
4. You need to have an input going to the TV even when the AutoCal is going, even though that input isn't displayed. This is actually handy because it lets you ensure VRR is engaged when you're doing the calibration because it seems VRR/non-VRR picture settings are separate.
5. When a calibration is saved, Gamma and White Balance->Color temperature become grayed out. Your calibration supersedes them.
 
I decided to just keep my 55" C9. One, it only cost me $999 + tax. Two, I can sell it for that amount very easily. Three, at 55" it will be an easier sell to an individual/family than the smaller 48". Four, I have the full 48Gbps... right? Not sure that even matters. Five, I don't have to pump another several hundred dollars into the purchase.

We should see our new 3080 Tis within the next 90 days.
 
60 hours in and not a single regret about buying this at MSRP (I hardly buy anything at MSRP!!)
 

I decided to just keep my 55" C9. One, it only cost me $999 + tax. Two, I can sell it for that amount very easily. Three, at 55" it will be an easier sell to an individual/family than the smaller 48". Four, I have the full 48Gbps... right? Not sure that even matters. Five, I don't have to pump another several hundred dollars into the purchase.

We should see our new 3080 Tis within the next 90 days.

I'm looking for a "3090" Ti with more VRAM, if the rumor that they are making one ends up true, and if they make a hybrid one, even better. So that might take a while longer, and, as has been said, it has to be in stock and not re-sold at even higher middle-man gouging prices.

55" is do-able; 48" to 55" is my min-to-max range, to my eyeballs. I have not yet ruled it out for my setup. As long as you aren't crazy about BFI and you don't mind sitting back even farther, it doesn't sound like you'd be missing anything with a C9. I'm going to be interested to see what's up with the Vizio 55" OLEDs when they are available, too.

I'll be happy to have a good OLED in November, and hopefully be using that OLED + a top-tier 3000 series (assuming they have HDMI 2.1) by the end of 2020 if possible.

Incidentally, I don't really re-sell stuff much unless it's to someone I know. I don't have a P.O. box or UPS box currently. Call me paranoid, but I don't want people getting my address. In the past I was willing to use local police stations that are set up as online-auction transfer sites to sell stuff locally, but considering the various issues currently, I wouldn't even bother doing that. So most things become hand-me-downs to other rigs or other areas of the house, get given to my nephews, or go into storage if they don't die before then.

I often hear people saying they are going to get 90% of what they paid for their products when selling used goods. They post that they got a lot of money for selling them. I don't know if I believe that in every single case. A 55" C9 was $1200 last November. It went back up to $1500, but now the CXs are out, and eventually the $1300 (retail, not on sale) 55" Vizios. I mean, if you can get a grand for it used (and a used OLED at that), more power to you, I guess. Personally, as a metric for used goods I base prices on 60% of the current retail price at the time of sale, not what the product cost the person when they bought it. That includes my stuff if I were selling anything. I mean, anyone could ask more, but that would be my line. Paying much more than that, I'd rather just buy new: less risk, and unused. For less than that I'd usually rather just keep the item. But that's just me.
 
I just fired up Mass Effect Andromeda. The graphics on this game are insane. Just freakin wow! 4k with this type of contrast blows any other display away. Period.
 
I am really torn on this TV as a monitor vs some other current and future monitors. I have a Radeon VII and a Threadripper 1920x CPU. I've been tinkering for fun with an LG UN7300 43 inch TV as a monitor and 4k is cool, running 3840x1620 ultrawide is cool sometimes as well. Seeing some reviews of Samsung Odyssey G7 and G9 though have me considering that route too. For my personal TV in the house we have a 65 inch LG C8 so I'm well aware of the benefits of OLED. Just having trouble pulling the trigger on anything right now.
 
60 hours in and not a single regret about buying this at msrp (I hardly buy anything at msrp!!)
Is that a Pearlsmith stand? If so, I have the same one. Very nice looking and drops the CX right on my desk. I removed the rubber "legs" to drop it even lower.
 
I gave in and grabbed a 48" CX to replace my 55" C9.

Early impressions:
1. Wow, this thing is SMALL! It just absolutely makes more sense as a monitor, though. I fired up a couple FPS games with HUD components in the corners and I can tell that I'm able to track things better.
2. 4k/120 @ 4:2:0 with text isn't as bad as what I've seen at some friends' houses with cheap monitors, but it's obviously not what you want when reading text for more than a few minutes (see the sketch after this list for why).
3. If playing a game, I don't think I could pick out 4:4:4 vs 4:2:0 unless the 2 screens were side by side...and even then, I'm not sure.
4. BFI is amazing...just what I remember from my old Acer & BenQ 3d monitors with ULMB.

5. After a brief delay in my typing because I decided to turn on my receiver connected via eARC....when I did that, my PC rebooted. lol (So I guess I have some issues to address with the CX)
5.5 No quick fix. I've definitely got some settings configured at the moment that are incompatible. If I power off my receiver, I'm ok.
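Since point 2 comes up a lot: here's a rough simulation of what 4:2:0 does to an image, which is why fringing shows up on colored text. Illustrative only - a simple box filter with assumed BT.709-style weights, not whatever the TV's scaler actually does:

```python
import numpy as np

def simulate_420(rgb):
    """Roughly simulate 4:2:0 chroma subsampling on an HxWx3 float RGB image:
    luma stays full resolution, chroma is averaged over 2x2 blocks."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # full-res luma
    cb, cr = b - y, r - y                       # chroma difference signals

    def down_up(c):
        # Average 2x2 blocks, then repeat back up to full size:
        # this quarter-res chroma is what smears colored text edges.
        h, w = c.shape[0] // 2 * 2, c.shape[1] // 2 * 2
        blocks = c[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        return np.repeat(np.repeat(blocks, 2, axis=0), 2, axis=1)

    cb, cr = down_up(cb), down_up(cr)
    y = y[:cb.shape[0], :cb.shape[1]]
    r2, b2 = y + cr, y + cb
    g2 = (y - 0.2126 * r2 - 0.0722 * b2) / 0.7152
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0.0, 1.0)
```

Black-on-white text survives this almost untouched (it lives in the full-res luma channel), which is also why 4:2:0 looks fine in games and video but falls apart on colored text and UI elements.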
 
Ok so people have been telling me to play Death Stranding in HDR. In order to do so, windows HDR needs to be toggled on so I did that and....WTF my eyes instantly got fried to a crisp by the ridiculous brightness. Is Windows HDR still a POS to this day? The game looks great but that desktop though...


EDIT: As pointed out by many, Windows now has a brightness slider for SDR. Mine was defaulted to max, and combined with OLED Light 100, well, that's why my eyes burned out. Adjusting the brightness slider fixed it up. Cut me some slack here, it's been a long time since I activated Windows HDR.
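For anyone else caught out by this: in HDR mode that slider just sets the luminance Windows maps SDR white to. If you want to check it programmatically, Windows reports it through DISPLAYCONFIG_SDR_WHITE_LEVEL (via DisplayConfigGetDeviceInfo), where a value of 1000 equals the nominal 80 nits; here's the trivial conversion (sketch - the example values are just plausible slider positions, not measured ones):

```python
def sdr_white_nits(sdr_white_level):
    """Convert the SDRWhiteLevel value Windows reports (1000 == 80 nits)
    into absolute luminance in nits."""
    return sdr_white_level / 1000.0 * 80.0

print(sdr_white_nits(1000))  # 80.0 nits  - slider at the bottom
print(sdr_white_nits(3000))  # 240.0 nits - a high slider position, eye-searing on OLED
```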
 
Ok so people have been telling me to play Death Stranding in HDR. In order to do so, windows HDR needs to be toggled on so I did that and....WTF my eyes instantly got fried to a crisp by the ridiculous brightness. Is Windows HDR still a POS to this day? The game looks great but that desktop though...

I had the same experience the first time I turned it on, which is why I wonder if the people who have OLED Light set high enough to trigger ABL during desktop use wear sunglasses at their computer or something...
 
Well, I decided to go back to SDR for Death Stranding. Don't get me wrong, HDR was good; the extended range from the darkest darks to the brightest brights really adds extra pop to the game, but it came at too high of a cost, which was going back to 60Hz. I was getting 90-100fps at 4K with DLSS, so I just wasn't willing to sacrifice that for HDR.
 
Well, I decided to go back to SDR for Death Stranding. Don't get me wrong, HDR was good; the extended range from the darkest darks to the brightest brights really adds extra pop to the game, but it came at too high of a cost, which was going back to 60Hz. I was getting 90-100fps at 4K with DLSS, so I just wasn't willing to sacrifice that for HDR.

But why do you need 90-100fps in THIS game :D
 
I bought my C9 for $1150 in October last year and sold it a few months later for $1000. It was just too damn big.

Don't get me wrong this 48" is still too damn big but much more manageable.

I couldn't resist and ended up buying Death Stranding. IMO the game benefits from HDR much more than it does from an extra 20-30fps, plus sticking to 60Hz means no need for DLSS.
 
I bought my C9 for $1150 in October last year and sold it a few months later for $1000. It was just too damn big.

Don't get me wrong this 48" is still too damn big but much more manageable.

I couldn't resist and ended up buying Death Stranding. IMO the game benefits from HDR much more than it does from an extra 20-30fps, plus sticking to 60Hz means no need for DLSS.

But DLSS quality looks better than native + TAA. Clearly less jaggies and shimmering. Resolution looks exactly the same to me.
 
But why do you need 90-100fps in THIS game :D

Panning the camera around at 60Hz is an absolute eyesore. At 100fps I can at least pan the camera around without everything turning into a complete blob. Contrary to popular belief, high fps doesn't just benefit twitch shooters. In fact, I might just use DLSS Performance if it means a perfect 120fps lock for BFI. Again, I love the HDR on it, but I also love having perfect CRT-like motion during camera movement too; I just value one over the other a little bit more.
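The blob has simple math behind it, by the way: with sample-and-hold, tracked motion smears by roughly scroll speed times the time each frame stays lit. Quick sketch (960 px/s is just an example panning speed, and the ~2ms BFI persistence is a rough assumption for the CX's high setting, not a measured figure):

```python
def motion_blur_px(speed_px_per_s, persistence_s):
    """Perceived smear while eye-tracking: speed times frame persistence."""
    return speed_px_per_s * persistence_s

print(motion_blur_px(960, 1 / 60))   # 16.0 px of smear at 60 Hz sample-and-hold
print(motion_blur_px(960, 1 / 120))  # 8.0 px at 120 Hz - better, still visible
print(motion_blur_px(960, 0.002))    # ~1.9 px with ~2 ms BFI persistence
```

Same reasoning Blur Busters uses: halving persistence halves the blur, which is why BFI at 120Hz can look cleaner than plain 120fps ever will.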
 
It just occurred to me that OLED pixel response + brightness + 120Hz BFI would be the greatest 3D Vision display of all time... I hunted down my emitter and glasses, figured out the current way to install 3D Vision on newer Nvidia drivers, and got a game to launch, but had sync/ghosting issues... I'll fiddle with it a bit more tomorrow and see if it's something that can be fixed or not.
 
So despite the i1 Pro saying all three screens are at 6500K, there is clearly a visible difference between the three. Now, the question is whether the OLED is miscalibrated by the i1 Pro? I'd say likely not, because I had the very same issue back in the day trying to get CCFL and LED monitors to match, and GB-r LED and W-LED ones to match as well. For some reason, different backlight types end up producing different-'looking' whites, though after colorimeter corrections the images do look very close despite this issue. Thoughts? The side screens are very close/identical since both are WLED sRGB screens.
The i1 Pro needs a color profile to be able to calibrate an LG WOLED - as of last year none shipped with an appropriate profile, I believe. Only Calman provides a profile for LG WOLEDs that works correctly.
 
The i1 Pro needs a color profile to be able to calibrate an LG WOLED - as of last year none shipped with an appropriate profile, I believe. Only Calman provides a profile for LG WOLEDs that works correctly.
I've used Lightspace/Colourspace to calibrate my 65" C9. More expensive than Calman, but the license gets updates for the life of the software and can be used with all brands. It can also upload a 3D LUT profile.
 
Ok so people have been telling me to play Death Stranding in HDR. In order to do so, windows HDR needs to be toggled on so I did that and....WTF my eyes instantly got fried to a crisp by the ridiculous brightness. Is Windows HDR still a POS to this day? The game looks great but that desktop though...

Windows desktop HDR sounds horrible... What the hell? It should be in SDR range (which would show up as monochrome in a color temperature map) unless you had an HDR image as your wallpaper or something... at least it should, if desktop HDR were done right by MS instead of blasting everything bright. It sounds like they are way behind in desktop HDR.
- How does this look when you just have a black wallpaper and no taskbar?
- Do things like Steam and game launchers look blastingly bright?
- Do game menus, etc. look super bright? I'd think the game switching from desktop to DirectX wouldn't, but I have to ask. Also, does it work properly in windowed mode + fullscreen like G-SYNC does?

It just occurred to me that OLED pixel response + brightness + 120Hz BFI would be the greatest 3D Vision display of all time... I hunted down my emitter and glasses, figured out the current way to install 3D Vision on newer Nvidia drivers, and got a game to launch, but had sync/ghosting issues... I'll fiddle with it a bit more tomorrow and see if it's something that can be fixed or not.

The ultimate 3D vision now is really a VR headset; it's just that most don't have a high enough resolution to avoid the screen door becoming more obvious in long-shot scenes and at virtual distances past around 20' or so in games. When you use a VR headset it displays a different image per eye, so the 3D effect is pretty much holographic. The Pimax "8K" headsets are among the only 4K-per-eye headsets available so far. They are a small company with trouble fulfilling orders, some build quality issues, support issues... growing pains perhaps, idk. Plus the full kit with the headset and controllers is quite expensive. Eventually other headset mfgs should start shipping much higher resolution headsets too; Pimax is just attempting to capitalize on being first, I think. Headset designs should get a little less bulky in the next gen, and in the long run a lot less bulky with VR+AR/mixed-reality goggles and glasses. These techs require interpolation tricks like frame doubling (cutting anything under 90fps to 45fps and then doubling it), and reprojection tricks where they guess where the FoV is moving to and slide the whole game world slightly, like a giant texture/projection, sort of. Oculus as well as AMD and Nvidia are all working on AI upscaling, like Nvidia's DLSS, so that should help a lot.

BFI is incompatible with HDR. In the long run HDR will be ubiquitous and fully supported. Eventually it will become standard for games and apps to support HDR. YouTube uploads and streaming services will have HDR on their content (there is some available already), everyone's cameras will have HDR for photos and videos (some already do, optionally). Even digital art will have HDR. HDR will trump 4K/8K resolution (and BFI) imo, especially as display technology (higher color volumes and better emitter or backlighting tech) and HDR support get better and HDR becomes standard content-wise. Think of how people looked at old TN screens that were washed out, or some less vibrantly colored VA screens. SDR displays have up to 400 nits of color brightness rows in their crayon box, while current HDR displays have 800 nits (OLED) or even up to 1500 nits (FALD LCD) of color volume, two to three times+ the number of rows of brighter colored crayons, so they are able to show more realistic bright highlights and light sources and, importantly, details throughout areas of color instead of clipping or rolling down to a lower ceiling. To be fair, FALD LCDs don't really hit those highest peak spec numbers in contrasted scenes, because they have to offset areas by making "dim halos" vs "glow halos" due to an inadequate number of zones compared to the pixels, but you get the point. In the future we'll have HDR 4000 and up to HDR 10,000 displays. Hopefully we'll also be on the road toward 1000fps (with some kinds of interpolation and AI upscaling) x 1000Hz for "zero" blur too.
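As an aside on the HDR 10,000 number: it isn't arbitrary. The PQ curve HDR10 uses (SMPTE ST 2084) is defined up to exactly 10,000 nits, so today's 800-1500 nit panels only render the bottom of the curve and tone-map the rest. Sketch of the EOTF using the published constants:

```python
# SMPTE ST 2084 (PQ) EOTF: PQ-encoded code value in [0,1] -> luminance in nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(e):
    """Decode a PQ signal value in [0,1] to absolute luminance (nits)."""
    ep = e ** (1.0 / M2)
    return 10000.0 * (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1.0 / M1)

print(pq_eotf(1.0))  # 10000 nits: the ceiling of the format
print(pq_eotf(0.5))  # ~92 nits: half the code range is still under 100 nits
```

Half the signal range sitting below ~100 nits is the whole point of PQ: code values are allocated perceptually, so the headroom above what current panels can do is already baked into the format.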
 
Windows desktop HDR sounds horrible... What the hell? It should be in SDR range (which would show up as monochrome in a color temperature map) unless you had an HDR image as your wallpaper or something... at least it should, if desktop HDR were done right by MS instead of blasting everything bright. It sounds like they are way behind in desktop HDR.
- How does this look when you just have a black wallpaper and no taskbar?
- Do things like Steam and game launchers look blastingly bright?
- Do game menus, etc. look super bright? I'd think the game switching from desktop to DirectX wouldn't, but I have to ask. Also, does it work properly in windowed mode + fullscreen like G-SYNC does?



The ultimate 3D vision now is really a VR headset; it's just that most don't have a high enough resolution to avoid the screen door becoming more obvious in long-shot scenes and at virtual distances past around 20' or so in games. When you use a VR headset it displays a different image per eye, so the 3D effect is pretty much holographic. The Pimax "8K" headsets are among the only 4K-per-eye headsets available so far. They are a small company with trouble fulfilling orders, some build quality issues, support issues... growing pains perhaps, idk. Plus the full kit with the headset and controllers is quite expensive. Eventually other headset mfgs should start shipping much higher resolution headsets too; Pimax is just attempting to capitalize on being first, I think. Headset designs should get a little less bulky in the next gen, and in the long run a lot less bulky with VR+AR/mixed-reality goggles and glasses. These techs require interpolation tricks like frame doubling (cutting anything under 90fps to 45fps and then doubling it), and reprojection tricks where they guess where the FoV is moving to and slide the whole game world slightly, like a giant texture/projection, sort of. Oculus as well as AMD and Nvidia are all working on AI upscaling, like Nvidia's DLSS, so that should help a lot.

BFI is incompatible with HDR. In the long run HDR will be ubiquitous and fully supported. Eventually it will become standard for games and apps to support HDR. YouTube uploads and streaming services will have HDR on their content (there is some available already), everyone's cameras will have HDR for photos and videos (some already do, optionally). Even digital art will have HDR. HDR will trump 4K/8K resolution (and BFI) imo, especially as display technology (higher color volumes and better emitter or backlighting tech) and HDR support get better and HDR becomes standard content-wise. Think of how people looked at old TN screens that were washed out, or some less vibrantly colored VA screens. SDR displays have up to 400 nits of color brightness rows in their crayon box, while current HDR displays have 800 nits (OLED) or even up to 1500 nits (FALD LCD) of color volume, two to three times+ the number of rows of brighter colored crayons, so they are able to show more realistic bright highlights and light sources and, importantly, details throughout areas of color instead of clipping or rolling down to a lower ceiling. To be fair, FALD LCDs don't really hit those highest peak spec numbers in contrasted scenes, because they have to offset areas by making "dim halos" vs "glow halos" due to an inadequate number of zones compared to the pixels, but you get the point. In the future we'll have HDR 4000 and up to HDR 10,000 displays. Hopefully we'll also be on the road toward 1000fps (with some kinds of interpolation and AI upscaling) x 1000Hz for "zero" blur too.
Windows HDR has worked fine and as expected on the desktop for me since 1903. It has to be an issue with the connected display.
 
There's nothing "incompatible" about HDR and BFI; I just tried it. BFI reduces overall brightness, sure, but different displays already have different overall brightness. A CX OLED in HDR with BFI on High would basically be HDR400, which is normally not great when we're talking about cheap LCD monitors... except it would have infinite contrast and perfect motion.
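You can even put rough numbers on that: BFI trades duty cycle for persistence, so average brightness scales with the fraction of each refresh the panel is actually lit. Sketch (the ~750 nit peak and ~50% duty cycle for BFI High are both assumptions for illustration, not measured CX figures):

```python
def effective_nits(peak_nits, lit_fraction):
    """Average luminance when BFI blanks the panel for part of each refresh."""
    return peak_nits * lit_fraction

print(effective_nits(750, 0.5))   # 375.0 nits - roughly the HDR400 ballpark above
print(effective_nits(750, 0.25))  # 187.5 nits - a more aggressive BFI setting
```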
 
Windows HDR support seems to be different depending on the display you use.

It would seem like there should be a straightforward way to map from SDR to HDR, but the Windows implementation leaves something to be desired.
 