LG 48CX

Joined
Jul 13, 2012
Messages
15
After reading through the discussion regarding WOLED sub-pixel arrangement and ClearType, I felt the need to dig out my ancient HardForum account just to share the following ClearType tweaking utility. IMO it beats the pants off of Windows' built-in ClearType tuner, since it even includes keep-it-simple-stupid toggles for full-pixel greyscale font anti-aliasing, RGB ClearType, BGR ClearType, and disabling ClearType and font anti-aliasing altogether:

Info: https://github.com/bp2008/BetterClearTypeTuner
Download: https://github.com/bp2008/BetterClearTypeTuner/releases

Better ClearType Tuner.png



One of the 48" LGs is going in my arcade cabinet. Been looking for a suitable monitor replacement for a LONG time. It'll fit in there mounted vertically well, and I can run the games at 2160x1620 (4:3 aspect ratio).
I actually run a vertically-stacked multi-monitor setup with an LCD monitor mounted on top of a CRT monitor (I have a good amount of vertical space but minimal horizontal space). As someone who isn't all that picky about resolution, I've thought about doing the very same thing: taking the 48" OLED and rotating it 90 degrees to replace said multi-monitor setup.

And it was this thought of rotating an LG OLED 90 degrees that made me look into an easier way to disable ClearType and/or use full-pixel greyscale font anti-aliasing, because AFAICT there's no way that ClearType with its default settings is going to look any good on a 90-degree rotated LG OLED -- sub-pixel rendering assumes horizontal RGB stripes, and rotating the panel turns the sub-pixels vertical (on top of WOLED's already non-standard WRGB layout).
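For reference, those toggles mostly map to a few standard registry values under Control Panel\Desktop. Here's a minimal Python sketch of my understanding of them (the utility may well do more than this, so treat it as illustrative only):

```python
# Minimal sketch: the standard font-smoothing registry values that tools
# like BetterClearTypeTuner toggle (my understanding; illustrative only).
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                    0, winreg.KEY_SET_VALUE) as key:
    # "2" = font smoothing on, "0" = no font anti-aliasing at all.
    winreg.SetValueEx(key, "FontSmoothing", 0, winreg.REG_SZ, "2")
    # 1 = full-pixel greyscale smoothing, 2 = ClearType (sub-pixel).
    winreg.SetValueEx(key, "FontSmoothingType", 0, winreg.REG_DWORD, 1)
    # 1 = RGB sub-pixel order, 0 = BGR (only matters with ClearType on).
    winreg.SetValueEx(key, "FontSmoothingOrientation", 0, winreg.REG_DWORD, 1)

# Running apps generally need a WM_SETTINGCHANGE broadcast (or a sign-out)
# before they pick the change up.
```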
 
Last edited:

kasakka

[H]ard|Gawd
Joined
Aug 25, 2008
Messages
1,444
I got my 65" C9 around the same time from BB and I use it as my living room set. We're selling the house by the end of the year and moving, so I don't want to wall-mount it yet. My ONLY complaint with this TV, and it's a small one, is how low it sits on the stand. I understand why it was built this way, but I had originally planned on getting a soundbar, and with the TV sitting so low I'm thinking it would block the bottom of the screen. So now my mind has shifted to spending a shit ton more on an actual, legitimate Atmos surround setup -- which my wife hates the idea of, from both a cost and loudness perspective.
The stand in general is a bit shit. The one on my Samsung was far easier to install, takes up less space in the back, and sits higher. Samsung's One Connect box is also way better for managing inputs.
 

N4CR

Supreme [H]ardness
Joined
Oct 17, 2011
Messages
4,464
I use M-Audio powered monitors for my PC speakers - what do you recommend for a 2.1 setup that actually sounds really good? Anything with ARC input?
M-Audio or similar are perfectly fine. Just get a sub that fits the room or DIY one.
The other suggestion of running optical from the TV might not be a bad idea either.
 
Joined
Mar 23, 2013
Messages
941
So when people say 120hz BFI, do they mean the panel runs at 120hz to give you BFI on 60fps games, or can the panel actually go up to 240hz to insert frames in 120fps signals?
 

Keller1

Weaksauce
Joined
Dec 10, 2013
Messages
65
With the news of 120Hz BFI, these are just no-contest, really. If the NVIDIA G-Sync thing last year didn't tip us off, then LG having the 48" hooked up to a literal gaming rig at CES should have -- it's pretty telling that they're absolutely aware of who's gonna be getting these.
So when people say 120hz BFI, do they mean the panel runs at 120hz to give you BFI on 60fps games, or can the panel actually go up to 240hz to insert frames in 120fps signals?
The latter.

It does a rolling scan at 120Hz, with half of the rolling scan being a black frame.
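Quick back-of-the-envelope math on what that buys you (my arithmetic, assuming the 50/50 split described above):

```python
# Persistence math for 120Hz rolling-scan BFI with a 50% black portion
# (the 50% figure is from the post above; illustrative arithmetic only).
refresh_hz = 120
frame_time_ms = 1000 / refresh_hz              # ~8.33 ms per refresh
lit_fraction = 0.5                             # half the scan is black
persistence_ms = frame_time_ms * lit_fraction  # ~4.17 ms of light per frame
print(f"{frame_time_ms:.2f} ms frame, {persistence_ms:.2f} ms persistence")
```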
 

AngryLobster

Clueless
Joined
Aug 24, 2016
Messages
191
99% chance Gsync and BFI will not function simultaneously so it's completely worthless for me.

Also the brightness penalty was pretty obvious in CES videos.
 

Sancus

Gawd
Joined
Jun 1, 2013
Messages
996
99% chance Gsync and BFI will not function simultaneously so it's completely worthless for me.
I dunno, I think this varies based on what you're playing. For example, if I'm playing an FPS with no HDR support (OW, for instance), I might turn BFI on and play at 1080p integer-upscaled to 4K if I need to peg 120fps. That should be a better experience than anything short of a CRT. And some FPSes (CSGO, probably?) can easily peg 120fps even at 4K. Hell, maybe you can peg OW at 120fps with a 2080 Ti or the upcoming 3080 Ti -- it's certainly not difficult to drive.

Of course, if you're playing graphically intensive new AAA games, you aren't easily pegging 120fps at 4K, but in that case you may care about HDR, and it probably won't be an FPS, so who cares about BFI then. I mean, sure, BFI is a nice-to-have for something like Cyberpunk 2077, but given a choice between BFI and HDR I'll take HDR 100% of the time and twice on Sundays. And it's unlikely BFI will be compatible with HDR for a very long time, if ever.
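(For anyone unclear on why integer upscaling avoids blur: it just replicates each source pixel into an exact NxN block instead of interpolating. A quick illustrative sketch, not any particular driver's implementation:)

```python
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor integer scale: each pixel becomes a factor x factor block."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# 1080p RGB frame -> 2160p: every pixel maps to a crisp 2x2 block, no filtering.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_2160p = integer_upscale(frame_1080p, 2)
assert frame_2160p.shape == (2160, 3840, 3)
```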
 

MrDeas

n00b
Joined
May 11, 2012
Messages
20
That looks absolutely sick. I just hope the 6ms input latency is legit or better, as it needs to be functional with mouse and keyboard for it to be a PC monitor replacement.

G-Sync, image quality, and the fast pixel response are a dream. I don't think G-Sync and BFI work together in any monitor today, but it would still dominate everything else regardless.
 

gamerk2

[H]ard|Gawd
Joined
Jul 9, 2012
Messages
1,700
Out of curiosity, what's the point of BFI with G-Sync (or any other form of VRR)? After all, isn't the point of VRR to get around fixed refresh rates?
 

AngryLobster

Clueless
Joined
Aug 24, 2016
Messages
191
I dunno, I think this varies based on what you're playing. For example, if I'm playing an FPS with no HDR support (OW, for instance), I might turn BFI on and play at 1080p integer-upscaled to 4K if I need to peg 120fps. That should be a better experience than anything short of a CRT. And some FPSes (CSGO, probably?) can easily peg 120fps even at 4K. Hell, maybe you can peg OW at 120fps with a 2080 Ti or the upcoming 3080 Ti -- it's certainly not difficult to drive.

Of course, if you're playing graphically intensive new AAA games, you aren't easily pegging 120fps at 4K, but in that case you may care about HDR, and it probably won't be an FPS, so who cares about BFI then. I mean, sure, BFI is a nice-to-have for something like Cyberpunk 2077, but given a choice between BFI and HDR I'll take HDR 100% of the time and twice on Sundays. And it's unlikely BFI will be compatible with HDR for a very long time, if ever.
Yeah, I would never in a million years consider using my C9 at anything but native 4K. I bought it for the image quality, and all non-native resolutions on a display that size look absolutely atrocious.

To add to that, I bought it to drool at eye candy, not to play stuff like Overwatch/League where the TV's strengths basically go to waste.

I know a guy who bought an X35, turns off the FALD, and only plays League with a 2080 Ti. What a waste.
 
Joined
May 30, 2016
Messages
43
Out of curiosity, what's the point of BFI with G-Sync (or any other form of VRR)? After all, isn't the point of VRR to get around fixed refresh rates?
To reduce sample-and-hold motion blur, which has nothing to do with variable vs fixed refresh rate. Read more on blurbusters.com if you are interested.
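The rule of thumb from there (my paraphrase): perceived blur scales with how long each frame stays lit times how fast your eyes are tracking, e.g.:

```python
# Sample-and-hold blur rule of thumb: blur (px) ~= tracking speed * persistence.
speed_px_per_s = 960           # e.g. a pan crossing a 4K screen in 4 seconds
full_persistence_s = 1 / 120   # plain 120Hz sample-and-hold
bfi_persistence_s = 0.5 / 120  # 120Hz with 50% black frame insertion

print(speed_px_per_s * full_persistence_s)  # ~8 px of smear
print(speed_px_per_s * bfi_persistence_s)   # ~4 px -- BFI halves the blur
```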
 

kasakka

[H]ard|Gawd
Joined
Aug 25, 2008
Messages
1,444

Some more questions answered, mostly regarding the movie/TV stuff. More importantly, the CX 48" will be just as good as the larger models, so it's expected to have the exact same feature set -- which is really awesome, considering manufacturers have often pared down the feature set as the size goes down.
 

GameLifter

Limp Gawd
Joined
Sep 4, 2014
Messages
362
I'm going to replace my C6 with a CX later this year. The new features it has are exciting. Not sure if I want to stay at 55" or go 65". One thing I'll miss about the C6 is stereoscopic 3-D but I don't use it too much. When I do it's awesome though.
 

N4CR

Supreme [H]ardness
Joined
Oct 17, 2011
Messages
4,464
Don't see how BFI can't work with HDR. Rise time on LEDs is in the nanosecond range.
Is it image persistence that's the issue, e.g. can't hit the required nits? Still better than SDR if so...
 

SH1

[H]ard|Gawd
Joined
Jun 30, 2006
Messages
1,410
I'm going to replace my C6 with a CX later this year. The new features it has are exciting. Not sure if I want to stay at 55" or go 65". One thing I'll miss about the C6 is stereoscopic 3-D but I don't use it too much. When I do it's awesome though.
65" -- I don't know anyone who regrets getting a larger size TV. If you're sitting farther way even that can seem too small.

(And maybe keep the C6 in another other room for stereoscopic?)
 

Armenius

Fully [H]
Joined
Jan 28, 2014
Messages
21,309
Anyone know the total height of the 48CX unit?
Praying it will fit inside a 2x4 kallax unit...
From the 48" diagonal we know the screen is going to be about 41.84" x 23.5". Who knows what it will be with the frame, but even though the frame is small it is still going to be a tight fit. I can find out how much it adds to my C8 at home later.
 

GameLifter

Limp Gawd
Joined
Sep 4, 2014
Messages
362
65" -- I don't know anyone who regrets getting a larger size TV. If you're sitting farther way even that can seem too small.

(And maybe keep the C6 in another other room for stereoscopic?)
Yeah, right now I'm heavily leaning towards the 65" since it would make the upgrade even more worthwhile. I've considered moving my C6 into my room after getting the CX, but then I would need to move my Samsung plasma out of there and put it somewhere or sell it. I've also considered selling the C6 and getting a plasma that does stereoscopic 3D, but those don't seem too common, and since they're older, getting one could be risky. One thing's for sure: I'm going to go on a stereoscopic 3D gaming spree before I get the CX, to enjoy it for a little while longer.
 

mikeo

Gawd
Joined
May 17, 2006
Messages
622
Anyone else thinking about the 8K NanoCell ones coming out over the OLEDs? I won't be buying any of them until a video card supports HDMI 2.1 though.
 

Vega

Supreme [H]ardness
Joined
Oct 12, 2004
Messages
6,369
NanoCell? Oh good lord, had to look it up. More marketing BS for regular old crappy LCD.

I think I'm going to buy some cheap Chinese LCDs and re-sell them as "Magnetic Ion Pulse TVs" and see how many people buy this "new" TV tech. I'll make a fortune!
 

imsirovic5

Weaksauce
Joined
Jun 21, 2011
Messages
78
Anyone else thinking about the 8K NanoCell ones coming out over the OLEDs? I won't be buying any of them until a video card supports HDMI 2.1 though.
Given my preference for running games at native resolution, I suspect it will take 5 to 10 years for me to consider an 8K display. But who knows what will happen in that timeframe in the VR space, given potential achievements in foveated rendering coupled with other advances. Monitors themselves may become obsolete in a decade.
 

mikeo

Gawd
Joined
May 17, 2006
Messages
622
I expect it would have been pretty expensive so might as well put that money towards a HDMI 2.1 GPU.
Yeah, probably still a lot cheaper than buying the Alienware OLED over an LG though, and potentially still the only option for 4K120 over HDMI.
 

Sancus

Gawd
Joined
Jun 1, 2013
Messages
996
Weird that they never released this one:

https://www.anandtech.com/show/1453...s-rtd2173-displayport-14-to-hdmi-21-converter

Anyone using a 9 series OLED as a monitor would buy one as it would finally provide 4k120.
They demonstrated a prototype chip, not an actual product. For manufacturers to even start prototyping actual adapters, the chip would need to be finished and commercially available. So there may still be products coming eventually, but yeah, it was never going to be released very quickly.
 
Joined
Jul 13, 2012
Messages
15
Yeah, I would never in a million years consider using my C9 at anything but native 4K. I bought it for the image quality, and all non-native resolutions on a display that size look absolutely atrocious.
Quick question - are you referring to the blur introduced by upscaling, or are you referring to just raw pixel size?

I ask because 1080p with integer scaling would eliminate the majority of the blur from upscaling (or, if using 4:4:4, all of the blur from upscaling), but would give you pretty low PPI.

(you'd possibly still get a teeny bit of blur if using 4:2:0 chroma subsampling because, despite 4:2:0 at 2160p having a 1080p chroma resolution, the actual chroma pixels are not mapped directly on the same grid as the luma pixels are but rather tend to be offset by half of a luma pixel)


It's also worth mentioning, though, that if you disable GPU scaling and let the TV do the upscaling, you're going to get considerably better results, because the upscaling on mid- to high-end TVs absolutely thrashes the upscaling built into monitors as well as GPUs (except for those couple of rare monitors that automatically do integer scaling when possible).

Of course, if you want to do integer scaling with the likes of 1080p and 720p on these OLED TVs, then GPU scaling is required, but integer scaling obviously wouldn't work with the likes of 1440p on a 2160p 4K display (though it would work on a 4320p 8K display).
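A simple divisibility check shows which combinations integer-scale cleanly (illustrative sketch):

```python
# Does src integer-scale into dst? Needs the same whole-number factor on both axes.
def integer_factor(src, dst):
    (sw, sh), (dw, dh) = src, dst
    if dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh:
        return dw // sw
    return None

panel_4k, panel_8k = (3840, 2160), (7680, 4320)
for name, res in [("720p", (1280, 720)), ("1080p", (1920, 1080)), ("1440p", (2560, 1440))]:
    print(name, integer_factor(res, panel_4k), integer_factor(res, panel_8k))
# 720p: 3x into 4K, 6x into 8K; 1080p: 2x / 4x; 1440p: None into 4K, 3x into 8K
```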
 
Last edited:

MistaSparkul

[H]ard|Gawd
Joined
Jul 5, 2012
Messages
1,076
Quick question - are you referring to the blur introduced by upscaling, or are you referring to just raw pixel size?

I ask because 1080p with integer scaling would eliminate the majority of the blur from upscaling (or, if using 4:4:4, all of the blur from upscaling), but would give you pretty low PPI.

(you'd possibly still get a teeny bit of blur if using 4:2:0 chroma subsampling because, despite 4:2:0 at 2160p having a 1080p chroma resolution, the actual chroma pixels are not mapped directly on the same grid as the luma pixels are but rather tend to be offset by half of a luma pixel)


It's also worth mentioning, though, that if you disable GPU scaling and let the TV do the upscaling, you're going to get considerably better results, because the upscaling on mid- to high-end TVs absolutely thrashes the upscaling built into monitors as well as GPUs (except for those couple of rare monitors that automatically do integer scaling when possible).

Of course, if you want to do integer scaling with the likes of 1080p and 720p on these OLED TVs, then GPU scaling is required, but integer scaling obviously wouldn't work with the likes of 1440p on a 2160p 4K display (though it would work on a 4320p 8K display).
I have tried all forms of scaling, ranging from integer scaling to in-game resolution scaling to the TV's built-in scaling, and no option looked good enough on a 55" display. Heck, even on my Acer X27 it doesn't look good enough, because I am now sitting much closer to my display and can notice the reduction in clarity at that closer distance. LCDs simply don't do non-native resolutions well, unlike CRTs, which could display a wide range of resolutions and still look great. Perhaps if you sat far away enough then yeah, you could resort to upscaling instead.
 

tybert7

2[H]4U
Joined
Aug 23, 2007
Messages
2,655
I'm going to replace my C6 with a CX later this year. The new features it has are exciting. Not sure if I want to stay at 55" or go 65". One thing I'll miss about the C6 is stereoscopic 3-D but I don't use it too much. When I do it's awesome though.
Couldn't this just be implemented via a graphics card update in game mode?
 
Joined
Jul 13, 2012
Messages
15
I have tried all forms of scaling, ranging from integer scaling to in-game resolution scaling to the TV's built-in scaling, and no option looked good enough on a 55" display. Heck, even on my Acer X27 it doesn't look good enough, because I am now sitting much closer to my display and can notice the reduction in clarity at that closer distance.
So at least for you, the issue is just low PPI rather than any sort of blur introduced by actual upscaling.


LCDs simply don't do non-native resolutions well, unlike CRTs, which could display a wide range of resolutions and still look great. Perhaps if you sat far away enough then yeah, you could resort to upscaling instead.
I think you may be mixing up and/or conflating two different aspects of CRTs.


CRTs being multisync means they have no single native resolution, so both 1280x960 and wonky resolutions like 1224x918 will display with near-identical clarity (near-identical because, thanks to dot pitch, pixels do get a bit fuzzier as you increase the resolution on a CRT monitor).

But additionally, the phosphor-and-electron-beam nature of a CRT means you not only get scanlines between each horizontal line of pixels but also end up with pixels that aren't perfectly square and are somewhat soft, kind of like pseudo anti-aliasing. This combination of scanlines and pseudo anti-aliasing means that lower resolutions like 800x600 and 640x480 look much less pixelated than they do on an LCD or OLED. And at larger screen sizes (like the legendary 24" FW900 CRT), these benefits can apply to higher resolutions as well, like 1920x1200.


So while integer scaling can make up for the lack of multisync on an LCD or OLED, it cannot replicate the scanlines and phosphor "smoothness", and it sounds to me like the latter is the issue you're describing.
 
Last edited:

GameLifter

Limp Gawd
Joined
Sep 4, 2014
Messages
362
Couldn't this just be implemented via a graphics card update in game mode?
From what I know, it can't. The C6 and other passive 3D screens need an extra layer on the screen for the 3D to work, and probably extra processing. When the 2017 OLEDs came out, they removed the passive 3D layer so the screen could get brighter for HDR, or at least that was the reasoning they gave. Now that HDR is the new big thing, newer TVs don't have stereoscopic 3D anymore, which is understandable, but when done right 3D can be pretty awesome.
 

setz3r

n00b
Joined
Aug 10, 2016
Messages
17
I'm thinking the only issue I would be concerned about with this TV for desktop monitor use (besides the obvious burn-in issue) is the ABL feature. They gotta make sure it can be fully disabled. In past versions of LG OLEDs, it has always existed to some degree, right? I have an older 65" C7 which I sometimes use for computer desktop use and gaming, and it seemed I could never fully get rid of the ABL. As nice as it is for probably reducing my power bill and certainly minimizing burn-in risk, it needs to be able to be fully shut off. I hope LG is aware of this.
 

Sancus

Gawd
Joined
Jun 1, 2013
Messages
996
I'm thinking the only issue I would be concerned about with this TV for desktop monitor use (besides the obvious burn-in issue) is the ABL feature. They gotta make sure it can be fully disabled. In past versions of LG OLEDs, it has always existed to some degree, right? I have an older 65" C7 which I sometimes use for computer desktop use and gaming, and it seemed I could never fully get rid of the ABL. As nice as it is for probably reducing my power bill and certainly minimizing burn-in risk, it needs to be able to be fully shut off. I hope LG is aware of this.
You understand that's impossible, right? The whole TV would probably catch on fire if you turned ABL off and then sent it a signal to display a full white screen at 800 nits. Assuming you didn't immediately blow the power supply, which you would.

C9 ABL kicks in at 151 nits for full-screen white and 317 nits for a 50% window; the CX might be improved a little bit, but it will probably have the same limitations. It's unrealistic to expect otherwise, and it doesn't prevent desktop use at all. Use dark modes -- they look better anyway.
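Incidentally, those two numbers are consistent with a roughly fixed panel power budget (my arithmetic, assuming nits scale inversely with lit area once ABL kicks in):

```python
# Sanity check on the ABL figures above: nits * lit area ~= constant (assumption).
nits_at_50pct_window = 317
predicted_full_screen = nits_at_50pct_window * 0.5  # double the area, halve the nits
print(predicted_full_screen)  # ~158 nits, vs. the ~151 quoted -- close enough
```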
 