LG 48CX

99% chance Gsync and BFI will not function simultaneously, so it's completely worthless for me.

I dunno, I think this varies based on what you're playing. For example, if I'm playing an FPS with no HDR support (OW, for instance), I might turn BFI on and play at 1080p integer upscaled to 4K if I need to peg 120fps. That should be a better experience than anything short of a CRT. And some FPSes (CSGO probably?) can easily peg 120fps even at 4K. Hell, maybe you can peg OW at 120fps with a 2080 Ti or upcoming 3080 Ti -- it's certainly not difficult to drive.

Of course, if you're playing graphically intensive, new AAA games, you aren't easily pegging 120fps at 4K, but in that case you may care about HDR, and it probably won't be an FPS, so who cares about BFI then. I mean, sure, BFI is a nice to have for something like Cyberpunk 2077, but given a choice between BFI and HDR I'll take HDR 100% of the time and twice on Sundays. And it's unlikely BFI will be compatible with HDR for a very long time if ever.
 
That looks absolutely sick. I just hope the 6ms input latency is legit or better, as it needs to be functional with mouse and keyboard for it to be a PC monitor replacement.

G-Sync, image quality, and the fast pixel response are a dream. I don't think G-Sync and BFI work together in any monitor today; besides, it would still dominate everything else.
 
Out of curiosity, what's the point of BFI with Gsync (or any other form of VRR)? After all, isn't the point of VRR to get around fixed refresh rates?
 
Out of curiosity, what's the point of BFI with Gsync (or any other form of VRR)? After all, isn't the point of VRR to get around fixed refresh rates?
To reduce sample-and-hold motion blur, which has nothing to do with variable vs fixed refresh rate. Read more on blurbusters.com if you are interested.
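(Back-of-the-envelope, in case the persistence link isn't obvious: perceived blur scales with how long each frame stays lit while your eye tracks motion, independent of whether the refresh rate is fixed or variable. The numbers below, including the 50% blanking duty cycle, are purely illustrative assumptions.)

```python
# Illustrative only -- assumes BFI blanks ~50% of each refresh, which is a made-up figure.
speed_px_per_s = 960                  # how fast the tracked object pans across the screen
refresh_hz = 120

hold_persistence_s = 1 / refresh_hz   # sample-and-hold: frame stays lit for the whole refresh
bfi_persistence_s = 0.5 / refresh_hz  # BFI: frame lit for only part of the refresh

print(speed_px_per_s * hold_persistence_s)  # ~8 px of perceived smear
print(speed_px_per_s * bfi_persistence_s)   # ~4 px of perceived smear
```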
 


Some more questions answered, but mostly regarding the movie/TV stuff. The more important one is that the 48" CX will be just as good as the larger models, so it is also expected to have the exact same feature set, which is really awesome considering manufacturers have often pared down the feature set as the size goes down.
 
I'm going to replace my C6 with a CX later this year. The new features it has are exciting. Not sure if I want to stay at 55" or go 65". One thing I'll miss about the C6 is stereoscopic 3-D but I don't use it too much. When I do it's awesome though.
 
Don't see how BFI can't work with HDR. Rise time on LEDs is in the nanosecond range.
Is it persistence of the image that is the issue, e.g. can't hit the required nits? Still better than SDR if so...
 
I'm going to replace my C6 with a CX later this year. The new features it has are exciting. Not sure if I want to stay at 55" or go 65". One thing I'll miss about the C6 is stereoscopic 3-D but I don't use it too much. When I do it's awesome though.

65" -- I don't know anyone who regrets getting a larger size TV. If you're sitting farther way even that can seem too small.

(And maybe keep the C6 in another room for stereoscopic?)
 
Anyone know the total height of the 48CX unit?
Praying it will fit inside a 2x4 kallax unit...
From the 48" diagonal we know the screen is going to be about 41.84" x 23.5". Who knows what it will be with the frame, but even though the frame is small it is still going to be a tight fit. I can find out how much it adds to my C8 at home later.
 
65" -- I don't know anyone who regrets getting a larger size TV. If you're sitting farther way even that can seem too small.

(And maybe keep the C6 in another room for stereoscopic?)

Yeah, right now I'm heavily leaning towards the 65" since it would make the upgrade even more worthwhile. I've considered moving my C6 into my room after getting the CX but then I would need to move my Samsung plasma out of there and put it somewhere or sell it. I've also considered selling the C6 and getting a plasma that does stereoscopic 3D but those don't seem too common and since they are older, getting one could be risky. One thing for sure is that I'm going to go on a stereoscopic 3D gaming spree before I get the CX to enjoy it for a little while longer.
 
Anyone else thinking about the 8K NanoCell ones coming out over the OLEDs? I won't be buying any of them until a video card supports HDMI 2.1 though.
 
NanoCell? Oh good lord, I had to look it up. More marketing BS for regular old crappy LCD.

I think I'm going to buy some cheap Chinese LCD's and re-sell them as "Magnetic Ion Pulse TVs" and see how many people buy this "new" TV tech. I'll make a fortune!
 
Anyone else thinking about the 8K NanoCell ones coming out over the OLEDs? I won't be buying any of them until a video card supports HDMI 2.1 though.

Given my preference to run games at native resolution, I suspect it will take 5 to 10 years for me to consider an 8K display. But who knows what will happen in that timeframe in the VR space, given potential achievements in foveated rendering coupled with other advances. Monitors themselves may become obsolete in a decade.
 
I expect it would have been pretty expensive, so might as well put that money towards an HDMI 2.1 GPU.

Yeah, probably still a lot cheaper than buying the Alienware OLED over an LG though, and potentially still the only option for 4K120 over HDMI.
 
Weird that they never released this one:

https://www.anandtech.com/show/1453...s-rtd2173-displayport-14-to-hdmi-21-converter

Anyone using a 9 series OLED as a monitor would buy one, as it would finally provide 4K120.

They demonstrated a prototype chip, not even an actual product. For manufacturers to even start prototyping actual adapters, the chip would need to be finished and commercially available. So there may still be products coming eventually, but yeah, it was never going to be released very quickly.
 
Yeah, I would never in a million years consider using my C9 at anything but native 4K. I bought it for the image quality, and all non-native resolutions on a display that size look absolutely atrocious.

Quick question - are you referring to the blur introduced by upscaling, or are you referring to just raw pixel size?

I ask because 1080p with integer scaling would eliminate the majority of the blur from upscaling (or, if using 4:4:4, all of the blur from upscaling), but would give you pretty low PPI.

(you'd possibly still get a teeny bit of blur if using 4:2:0 chroma subsampling because, despite 4:2:0 at 2160p having a 1080p chroma resolution, the actual chroma pixels are not mapped directly on the same grid as the luma pixels are but rather tend to be offset by half of a luma pixel)


It's also worth mentioning though that, if you disable GPU scaling and let the TV do the upscaling, then you're going to get considerably better results because the upscaling on mid to high-end TVs absolutely thrashes the upscaling built into monitors as well as GPUs (except for those couple rare monitors that automatically do integer scaling when possible).

Of course, if you want to do integer scaling with the likes of 1080p and 720p on these OLED TVs, then the use of GPU scaling is required, but integer scaling obviously wouldn't work with the likes of 1440p on a 2160p 4K display (though it would work on a 4320p 8K display).
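(For anyone who wants to see why integer scaling adds no blur and why the factors have to be whole numbers, here's a toy NumPy sketch -- purely illustrative, and nothing to do with how the TV or GPU driver actually implements it:)

```python
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    # Nearest-neighbor: each source pixel becomes a factor x factor block,
    # so no interpolation (and therefore no blur) is introduced.
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

src = np.zeros((1080, 1920, 3), dtype=np.uint8)  # a 1080p RGB frame
dst = integer_upscale(src, 2)                    # 1080p -> 2160p is an exact 2x factor
print(dst.shape)                                 # (2160, 3840, 3)

# 1440p -> 2160p would need a 1.5x factor, which is why it only works as
# integer scaling on a 4320p (8K) panel, where the factor is a clean 3x.
```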
 
Quick question - are you referring to the blur introduced by upscaling, or are you referring to just raw pixel size?

I ask because 1080p with integer scaling would eliminate the majority of the blur from upscaling (or, if using 4:4:4, all of the blur from upscaling), but would give you pretty low PPI.

(you'd possibly still get a teeny bit of blur if using 4:2:0 chroma subsampling because, despite 4:2:0 at 2160p having a 1080p chroma resolution, the actual chroma pixels are not mapped directly on the same grid as the luma pixels are but rather tend to be offset by half of a luma pixel)


It's also worth mentioning though that, if you disable GPU scaling and let the TV do the upscaling, then you're going to get considerably better results because the upscaling on mid to high-end TVs absolutely thrashes the upscaling built into monitors as well as GPUs (except for those couple rare monitors that automatically do integer scaling when possible).

Of course, if you want to do integer scaling with the likes of 1080p and 720p on these OLED TVs, then the use of GPU scaling is required, but integer scaling obviously wouldn't work with the likes of 1440p on a 2160p 4K display (though it would work on a 4320p 8K display).

I have tried all forms of scaling, ranging from integer scaling to in-game resolution scaling to built-in TV scaling, and no option looked good enough on a 55" display. Heck, even on my Acer X27 it doesn't look good enough because I am now sitting much closer to my display and can notice the reduction in clarity at that closer distance. LCDs simply don't do non-native resolutions well, unlike CRTs, which could display a wide range of resolutions and still look great. Perhaps if you sat far away enough then yeah, you could resort to upscaling instead.
 
I'm going to replace my C6 with a CX later this year. The new features it has are exciting. Not sure if I want to stay at 55" or go 65". One thing I'll miss about the C6 is stereoscopic 3-D but I don't use it too much. When I do it's awesome though.

Couldn't this just be implemented via a graphics card update in game mode?
 
I have tried all forms of scaling, ranging from integer scaling to in-game resolution scaling to built-in TV scaling, and no option looked good enough on a 55" display. Heck, even on my Acer X27 it doesn't look good enough because I am now sitting much closer to my display and can notice the reduction in clarity at that closer distance

So at least for you, the issue is just low PPI rather than any sort of blur introduced by actual upscaling.


LCDs simply don't do non-native resolutions well, unlike CRTs, which could display a wide range of resolutions and still look great. Perhaps if you sat far away enough then yeah, you could resort to upscaling instead.

I think you may be mixing up and/or consolidating two different aspects of CRTs.


CRTs being multisync means that they have no single native resolution, so both 1280x960 and wonky resolutions like 1224x918 will display with near identical clarity (near identical because your dot pitch and pixel clarity does get a bit fuzzier as you increase your resolution on a CRT monitor).

But additionally, the phosphor and electron beam nature of a CRT means that you not only get scanlines between each horizontal line of pixels but end up with pixels that aren't perfectly square and are somewhat soft, kind of like pseudo anti-aliasing. This combination of scanlines and pseudo anti-aliasing means that lower resolutions like 800x600 and 640x480 look much less pixelated than they do on an LCD or OLED. And at larger screen sizes (like the legendary FW900 24" CRT) these benefits of a CRT can apply to higher resolutions as well like 1920x1200 and such.


So while integer scaling can make up for the lack of multisync on an LCD or OLED, it cannot make up for the lack of scanlines and phosphor "smoothness", and to me it sounds like it's the latter that is the issue you are describing.
 
Couldn't this just be implemented via a graphics card update in game mode?

From what I know, it can't. The C6 and other passive 3D screens need an extra layer on the screen for the 3D to work, and probably extra processing. When the 2017 OLEDs came out they removed the passive 3D layer so the screen could get brighter for HDR, or at least that was the reasoning they gave. Now that HDR is the new big thing, newer TVs don't have stereoscopic 3D anymore, which is understandable, but when done right 3D can be pretty awesome.
 
I'm thinking the only issue I would be concerned about with this TV for desktop monitor use (besides the obvious burn-in issue) is the ABL feature. They gotta make sure it can be fully disabled. In past versions of LG OLEDs it always existed to some degree? I have an older 65" C7 which I sometimes use for computer desktop use and gaming, and it seemed I could never fully get rid of the ABL. As nice as it is for probably reducing my power bill and certainly minimizing burn-in risk, it needs to be able to be fully shut off. I hope LG is aware of this.
 
I'm thinking the only issue I would be concerned about with this TV for desktop monitor use (besides the obvious burn-in issue) is the ABL feature. They gotta make sure it can be fully disabled. In past versions of LG OLEDs it always existed to some degree? I have an older 65" C7 which I sometimes use for computer desktop use and gaming, and it seemed I could never fully get rid of the ABL. As nice as it is for probably reducing my power bill and certainly minimizing burn-in risk, it needs to be able to be fully shut off. I hope LG is aware of this.

You understand that's impossible, right? The whole TV would probably catch on fire if you turned ABL off and then sent it a signal that it should display a full white screen at 800 nits. Assuming you didn't immediately blow the power supply, which you would.

C9 ABL limits full-screen white to 151 nits and a 50% window to 317 nits; the CX might be improved a little bit, but it will probably have the same limitations. It's unrealistic to expect otherwise, and it doesn't prevent desktop use at all. Use dark modes, they look better anyways.
 
You understand that's impossible, right? The whole TV would probably catch on fire if you turned ABL off and then sent it a signal that it should display a full white screen at 800 nits. Assuming you didn't immediately blow the power supply, which you would.

C9 ABL limits full-screen white to 151 nits and a 50% window to 317 nits; the CX might be improved a little bit, but it will probably have the same limitations. It's unrealistic to expect otherwise, and it doesn't prevent desktop use at all. Use dark modes, they look better anyways.

Ok, fair enough. Admittedly I haven't spent too much time with this yet. So you set it dark enough and then the ABL just won't even get triggered or activated at all? I'm not all that hung up on peak brightness either.
 
Ok, fair enough. Admittedly I haven't spent too much time with this yet. So you set it dark enough and then the ABL just won't even get triggered or activated at all? I'm not all that hung up on peak brightness either.

Yeah, I mean, it's possible they will offer some configuration of ABL for SDR modes. If you calibrate the display at 100 or 120 nits (which are the usual standards for darkened-room viewing), ABL should never trigger. But I will also say that the way ABL was applied on older OLEDs was faster and more disruptive/obvious than on newer ones, which tend to just slowly dim brightness over a long period of time.

I'm probably wrong to say that it will never be an issue, in HDR modes it probably will show up sometimes in some games and stuff. On the desktop it shouldn't be a problem unless you set brightness high and use a lot of full screen white backgrounds.
 
newer TVs don't have stereoscopic 3D anymore, which is understandable, but when done right 3D can be pretty awesome.

It was never really 'done right'; either you had active shutter glasses that required twice the framerate on the TV -- at least twice, because many including myself got headaches with the tech as it was -- or you used passive glasses that cut light and resolution in half.
 
It was never really 'done right'; either you had active shutter glasses that required twice the framerate on the TV -- at least twice, because many including myself got headaches with the tech as it was -- or you used passive glasses that cut light and resolution in half.

I've tried 3D on both active and passive displays and both have their pros and cons. With active 3D the resolution isn't cut in half but the glasses required batteries, were expensive, and I also got headaches from them. Passive 3D doesn't give me headaches and the glasses don't need batteries and are cheap but like you said, resolution and brightness take a hit. When I mentioned 3D being "done right" I was referring to it being implemented well in a movie or game but I should have clarified before.

As for it being implemented well on the TV itself, the OLED C6 passive 3D isn't perfect but I still find it to be great. The deep blacks of OLED seem to make the 3D effect look deeper due to a ridiculously high contrast. The 4K resolution can also help mitigate the resolution being cut in half with passive 3D. One example is playing Gears of War 3 on Xbox One X. Since it runs at 4K on the One X the split in resolution is mitigated enough to where the picture in 3D still looks crisp. It's too bad we didn't get much 4K 3D content, at least that I know of.
 
Apologies for deleting and re-posting, but I made key edits and I really wanted to make sure nobody was replying to the previous version of my post.


80% resolution scaling on a 27 inch 4K display isn't low PPI.

From the context, it was my understanding that the "it" in the phrase "on my Acer X27 it doesn't look good enough" was referring to 1080p integer scaling rather than native 4K:
I have tried all forms of scaling, ranging from integer scaling to in-game resolution scaling to built-in TV scaling, and no option looked good enough on a 55" display. Heck, even on my Acer X27 it doesn't look good enough because I am now sitting much closer to my display and can notice the reduction in clarity at that closer distance


It's not like 1080p on a 27" monitor was ever particularly great PPI-wise anyway (I mean, that's why the majority of enthusiasts preferred having at least 1440p on 27" monitors).
 
Apologies for deleting and re-posting, but I made key edits and I really wanted to make sure nobody was replying to the previous version of my post.




From the context, it was my understanding that the "it" in the phrase "on my Acer X27 it doesn't look good enough" was referring to 1080p integer scaling rather than native 4K:



It's not like 1080p on a 27" monitor was ever particularly great PPI-wise anyway (I mean, that's why the majority of enthusiasts preferred having at least 1440p on 27" monitors).

Nah, what I meant by "it" was all forms of upscaling, not just integer scaling specifically. The degradation in texture quality is obvious once you step down from native 4K, whether you use the in-game engine upscaler, the TV upscaler, or even DLSS. And yeah, integer scaling from 4K would just mean playing at 1080p and having a very nasty pixelated image overall. Like I said though, if you were to use upscaling on your TV and then sit far away, to the point where you would be hard pressed to notice it, then I guess it would work out.
 
Does anyone know whether it's safe to mount OLED screens in portrait/vertical mode? I'd assume that it should work fine, but I can't seem to find a lot of information about it.
 
Nah, what I meant by "it" was all forms of upscaling, not just integer scaling specifically. The degradation in texture quality is obvious once you step down from native 4K, whether you use the in-game engine upscaler, the TV upscaler, or even DLSS. And yeah, integer scaling from 4K would just mean playing at 1080p and having a very nasty pixelated image overall. Like I said though, if you were to use upscaling on your TV and then sit far away, to the point where you would be hard pressed to notice it, then I guess it would work out.

My experience is that most resolutions between 1440p and 4K, when combined with image sharpening, give good results. I have mostly used RDR2 as my test game for this, as it has easy resolution scale options and a high amount of small details. At 80% 4K scale + image sharpening you would be hard pressed to tell the difference in anything but framerate when actually playing the game, which means you will be in movement most of the time and will not be looking at fine details with a magnifying glass, but you will appreciate the increase in framerate. Until we have far more powerful GPUs, I think these are very acceptable compromises.
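(Rough numbers behind that, purely as a back-of-the-envelope sanity check rather than a benchmark: 80% scale on a 4K target renders roughly 3072x1728, i.e. only about two-thirds of the pixels to shade, which is where most of the framerate headroom comes from.)

```python
# Illustrative pixel-count math only; real framerate gains depend on how shader-bound the game is.
native_w, native_h = 3840, 2160
scale = 0.8

scaled_w, scaled_h = round(native_w * scale), round(native_h * scale)  # 3072 x 1728
ratio = (scaled_w * scaled_h) / (native_w * native_h)                  # ~0.64
print(scaled_w, scaled_h, round(ratio, 2))
```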

1080p with integer scaling is not an option for me; it just loses too much detail overall and makes modern games look somewhat last-gen to me.
 