LG 48CX

I'm ultimately becoming more and more upset over this purchase.

1) OLED motion is inferior to plasma and CRT without BFI, and BFI only functions well enough on the CX to give you a taste of what could be
2) Raised Dolby Vision blacks
3) Non-functioning HDR because HDMI 2.1 isn't working
4) Can't purchase a receiver yet because HDMI 2.1 isn't working
5) True blacks on OLED are only a thing if you like them crushed
6) Banding ruining many shows and movies because HDMI 2.1 isn't working and no Ampere cards are available (halo bands, not vertical bands)
7) Input lag, while very good in some settings, can skyrocket (4K resolutions), and no idea if this will be fixed by HDMI 2.1 because HDMI 2.1 isn't working

At least the sound coming out of the TV is reasonable and my panel uniformity is reasonable. At this point I feel dumb for not just buying a used OLED, as the CX is still not fully functional with no light at the end of the tunnel.

Highly recommend waiting if you are thinking about buying.

To me none of these are big enough issues to make a big deal about. Let's break them down.

1) OLED really only has sample-and-hold blur, so for this type of display it does a great job. I don't feel motion is an issue on it at all. I gave my Panasonic plasma to my parents, and sure, it looks great, but it's only 1080p and doesn't do HDR.
2) May get fixed eventually.
3) HDMI 2.1 does work and HDR does work, with some chroma subsampling issues at the time of writing. Again, not an issue for games or movies, since movies are generally 10-bit 4:2:0 and you won't notice 4:2:2 in games.
4) Same as above. As long as a receiver has HDMI 2.1 ports you should be good.
5) Don't know about this one. Looks fine to me.
6) Again, I haven't seen this be an issue. I would expect banding to be more prevalent on streaming services due to the crappy bitrates of some of them.
7) I can't tell any difference in input lag when running in game mode between 1080p, 1440p, or 4K. It's just not worth worrying about, and it certainly does not "skyrocket". If you can feel a difference then you have keener senses than I do.
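Since a couple of the points above come down to chroma subsampling and bandwidth, here's a rough back-of-the-envelope sketch of how much data each pixel format actually carries. This is only illustrative (active pixels, no blanking overhead), not the exact HDMI link math:

```python
# Rough data-rate comparison for chroma subsampling formats.
# RGB/4:4:4 carries 3 full samples per pixel; 4:2:2 averages 2; 4:2:0 averages 1.5.
SAMPLES_PER_PIXEL = {"RGB/4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def data_rate_gbps(width, height, refresh_hz, bit_depth, fmt):
    """Uncompressed active-pixel data rate in Gbit/s (ignores blanking)."""
    bpp = bit_depth * SAMPLES_PER_PIXEL[fmt]
    return width * height * refresh_hz * bpp / 1e9

for fmt in SAMPLES_PER_PIXEL:
    rate = data_rate_gbps(3840, 2160, 120, 10, fmt)
    print(f"4K120 10-bit {fmt}: {rate:.1f} Gbit/s")
```

You can see why 4:2:0 is what fits through constrained links: it halves the data rate versus full 4:4:4 while mostly affecting fine colored detail like small red text.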

I sometimes feel like some of you can't be satisfied with anything except a non-existent ideal display. Everything has its compromises, and you can freely pick which ones are OK for you.
 
Anyone got 3840x1600 resolution working with 120Hz? If so what settings are you using? I want to run it for some specific games but I just get no signal when clicking Test.
[attached screenshot: custom resolution settings]

For me these exact settings work perfectly fine for adding 3840x1600 @ 120 Hz using the Club3D adapter, so I would wait for firmware or driver fixes. Setting the desktop to that resolution tends to drop it out of HDR, but you can get 3840x1600 and HDR working together in games just fine.
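For anyone fighting the custom resolution, the rough timing math looks like this. This is a simplified approximation of VESA CVT reduced blanking, not the full algorithm (a tool like CRU computes proper timings), just to show why 3840x1600 @ 120 Hz blows past the ~600 MHz pixel clock of plain HDMI 2.0 and needs the DSC adapter:

```python
def cvt_rb_pixel_clock_hz(h_active, v_active, refresh_hz):
    """Approximate CVT reduced-blanking pixel clock: fixed 160-px horizontal
    blanking, smallest vertical blanking spanning >= 460 us per frame."""
    h_total = h_active + 160
    v_blank = 7  # start near the spec minimum and grow until the time budget fits
    while v_blank / ((v_active + v_blank) * refresh_hz) < 460e-6:
        v_blank += 1
    return h_total * (v_active + v_blank) * refresh_hz

clock = cvt_rb_pixel_clock_hz(3840, 1600, 120)
print(f"~{clock / 1e6:.0f} MHz pixel clock")  # well above HDMI 2.0's 600 MHz TMDS limit
```

So getting "no signal" without the adapter (or without reduced custom timings) is expected; the mode simply doesn't fit in an HDMI 2.0 link uncompressed.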
 
In a dark room I can see even squares 1, 2, and 3. Remember that they are going to be far harder to see than on an LCD though, because of the infinite contrast.


Thanks for the tip about blocking ambient light from the brighter squares. I'll try that in the evening.

I don't quite get the point about true blacks making the darkest squares harder to see than on an LCD though.

Is this because human eyes have lower sensitivity at low light values (ie. close shades of gray are easier to distinguish than close shades of black)?
 
For me these exact settings work perfectly fine for adding 3840x1600 @ 120 Hz using the Club3D adapter, so I would wait for firmware or driver fixes. Setting the desktop to that resolution tends to drop it out of HDR, but you can get 3840x1600 and HDR working together in games just fine.

Does the adapter support gsync? If not, do you know if there are plans to?
 
Does the adapter support gsync? If not, do you know if there are plans to?

It does not, but Club3D posted a "G-Sync works!" pic on Twitter, then a bit later said "no wait, our mistake, not working". So I guess they are actually working on it and have gotten to the point where they can enable it, but it does not actually work. It might need the LG CX firmware update first. G-Sync is just not an option with the adapter at the moment, and the reported EDID says the VRR range is zero when it is 40-120 Hz over HDMI 2.0.

So fingers crossed they can get it working and issue a firmware update.
 
For me these exact settings work perfectly fine for adding 3840x1600 @ 120 Hz using the Club3D adapter, so I would wait for firmware or driver fixes. Setting the desktop to that resolution tends to drop it out of HDR, but you can get 3840x1600 and HDR working together in games just fine.
Then it should work with HDMI 2.1 also? I'm running 8-bit (chroma subsampling issue with 10-bit) and without HDR.
 
Then it should work with HDMI 2.1 also? I'm running 8-bit (chroma subsampling issue with 10-bit) and without HDR.

Yes, it should; no idea why you get a blank screen. Also, the chroma subsampling issue exists at 8-bit too. Basically any 100-120 Hz resolution except 1440p has the problem; 8 vs. 10 bit does not matter.
 
Anyone have recommendations for HDR PC games that look incredible on the CX? My 48" arrives Wednesday, and I’m really looking for something with that “wow” factor. Will be running @ 4K with an RTX 3090 that I just installed this weekend.

Also, is there a difference between selecting 4:4:4 or RGB when gaming?
 
Anyone have recommendations for HDR PC games that look incredible on the CX? My 48" arrives Wednesday, and I’m really looking for something with that “wow” factor. Will be running @ 4K with an RTX 3090 that I just installed this weekend.

Also, is there a difference between selecting 4:4:4 or RGB when gaming?

I love how Doom Eternal looks in HDR.
 
Yes, it should; no idea why you get a blank screen. Also, the chroma subsampling issue exists at 8-bit too. Basically any 100-120 Hz resolution except 1440p has the problem; 8 vs. 10 bit does not matter.
Guess I'll wait for the new firmware and new Nvidia drivers then. Running 8-bit instead of 10-bit, text does look better for me in Windows right now.
 
Thanks for the tip about blocking ambient light from the brighter squares. I'll try that in the evening.

I don't quite get the point about true blacks making the darkest squares harder to see than on an LCD though.

Is this because human eyes have lower sensitivity at low light values (ie. close shades of gray are easier to distinguish than close shades of black)?

Yeah, because OLED blacks are so deep (they go down to absolute black). The human eye is not so good at distinguishing very dark details if you have lights blasting into your eyes at the same time. With LCD it's not just blacks that are raised, but also the (very) near blacks to some extent.
 
I was exaggerating a bit out of frustration in my last post. But the bottom line is we have been waiting some time for fixes and compatible tech, and those that have come have revealed more and more problems.

Maybe a firmware update has since fixed the 4K input lag since I last tried gaming on this, but considering you can't even use VRR right now to test it... well, more waiting and seeing.

Being disappointed with OLED motion is partly on me for not doing my due diligence first, but I thought BFI would be more functional than it is. I really notice the sample-and-hold artifacting in most content, and really, anyone can go observe it with the UFO test. Turn on BFI and you'll see close to what CRT and plasma deliver. BFI both darkens the image and adds input lag (not TOO terribly, admittedly), and you can't even use it in many gaming settings.

Yes, there is never a perfect display (except for the FW900, if you disregard its weight). However, at least with tech like plasma you knew what you were buying and could use it for the appropriate need. With the CX series there have been both false promises (48Gbps) and greatly delayed promises.

Until HDMI 2.1 is hooked up there's a lot of sacrificing one feature to gain another (banding at 4K 120 Hz), and considering the issues going on with Ampere right now, I'm not about to fork over cash for a video card that's also having issues just to maybe get a few things working on 2.1.

Is this a solid display with likely little risk of burn-in and great input lag when it isn't bugging out? Yes. Were features like BFI and 48Gbps way overhyped and basically falsely advertised? Yes.


Might as well add to the gripe list while I'm at it:
8) Can't permanently turn off pixel shift.
9) The latest firmware update causes the TV to keep popping up a menu asking me if my HTPC is an Apple device.
 
Anyone got 3840x1600 resolution working with 120Hz? If so what settings are you using? I want to run it for some specific games but I just get no signal when clicking Test.
[attached screenshot: custom resolution settings]

This is the exact same thing I am fighting. Not interested in running the display at 3840x2160 until PS5 comes. I want 3840x1600@120hz full time during PC gaming. I ordered a Club3D adapter to test with my 2080 Ti and 3080 but I am sure things will change (hopefully for the better) when the CX firmware fixes start hitting!
 
The lack of G-Sync support and Instant Game Response sucks, but all you can do is be patient. It is nice to know that LG and Nvidia are aware of the issue and working on it.

The improved image clarity with the RTX 3080 @ 4K120 RGB 10-bit has been real nice and works a million times better than the Club3D adapter did with my 2080 Ti.

Most of the negative comments about the 48CX from unsatisfied owners seem to be coming from the ultrawide crowd, who are just weird people to begin with anyway... I mean really, no big surprise there honestly, with their freakish hammerheads and such!

 
Yes, there is never a perfect display (except for the FW900, if you disregard its weight).

And it's a very small screen, which is a huge issue. There is no way I'd go back to such a tiny screen. Even a 27" LCD and the comparable ultrawides only have a 13" tall screen; the FW900 is 22.5" viewable and only around 12" tall. Trinitron CRTs also have alignment/damper lines in them. The FW900 also has convergence issues and focus issues, meaning having to crack open the massive, high-energy-capacitor- and electronics-laden chassis to tweak internally and keep things tight as the settings "loosen". There's a lack of per-pixel crispness as they age ("fuzzy/blob edges", like playing consoles on an old CRT TV). They age into blooming and/or fading over time, and there's nothing you can do about it (without replacing parts or the whole CRT) until they eventually die. They also suck power on startup with a blast like an EMP, are very sensitive to electrical noise in the lines and to magnetic interference (like speakers), and have other issues like radiation, eye fatigue, and no HDR with its much-extended color volume palette, which will be more important as time goes on. There are also potential signal support considerations from GPUs. They also have a fairly long warm-up period before their full saturation, contrast, and sharpness are at their best again after they've been off or in standby.

So I wouldn't call them perfect by a long shot. What they do have is 1 px of motion blur, for essentially "zero" blur during high-speed FoV movement / controller panning / camera movement, but that comes with a lot of tradeoffs.

At 120 fps on a 120 Hz LCD/OLED you get a 50% blur reduction, down to a "fuzzy" vibration-type blur during FoV movement instead of the smearing blur you get in 60 Hz and low-frame-rate gameplay scenarios. That up to 8 ms / 8 px of "fuzzy blur inside the lines" is most evident when moving the FoV at high speed, so how much clarity is lost depends on how fast you are moving your FoV around at any given time. It also depends on staying at a solid 120 fps; at lower or varying frame rate averages, you aren't going to stay at the 50% blur reduction (vs. a 60 Hz baseline) in motion clarity.
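The persistence numbers above are easy to sketch: sample-and-hold blur is roughly eye-tracking speed multiplied by how long each frame stays lit. A minimal model (the 960 px/s tracking speed is just the conventional UFO-test value, my assumption, not something measured on the CX):

```python
def hold_blur_px(track_speed_px_s, refresh_hz, lit_fraction=1.0):
    """Sample-and-hold blur width in pixels: eye-tracking speed
    times the fraction of the refresh period each frame is lit."""
    return track_speed_px_s * lit_fraction / refresh_hz

print(hold_blur_px(960, 60))        # 16.0 px of smear at 60 Hz
print(hold_blur_px(960, 120))       # 8.0 px at 120 Hz -- the 50% reduction
print(hold_blur_px(960, 120, 0.5))  # 4.0 px with a 50%-duty BFI on top
```

The same formula shows why BFI darkens the image: halving the lit fraction halves both the blur width and the light output per frame.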
 
And it's a very small screen, which is a huge issue. There is no way I'd go back to such a tiny screen. Even a 27" LCD and the comparable ultrawides only have a 13" tall screen; the FW900 is 22.5" viewable and only around 12" tall. Trinitron CRTs also have alignment/damper lines in them. The FW900 also has convergence issues and focus issues, meaning having to crack open the massive, high-energy-capacitor- and electronics-laden chassis to tweak internally and keep things tight as the settings "loosen". There's a lack of per-pixel crispness as they age ("fuzzy/blob edges", like playing consoles on an old CRT TV). They age into blooming and/or fading over time, and there's nothing you can do about it (without replacing parts or the whole CRT) until they eventually die. They also suck power on startup with a blast like an EMP, are very sensitive to electrical noise in the lines and to magnetic interference (like speakers), and have other issues like radiation, eye fatigue, and no HDR with its much-extended color volume palette, which will be more important as time goes on. There are also potential signal support considerations from GPUs. They also have a fairly long warm-up period before their full saturation, contrast, and sharpness are at their best again after they've been off or in standby.

So I wouldn't call them perfect by a long shot. What they do have is 1 px of motion blur, for essentially "zero" blur during high-speed FoV movement / controller panning / camera movement, but that comes with a lot of tradeoffs.

At 120 fps on a 120 Hz LCD/OLED you get a 50% blur reduction, down to a "fuzzy" vibration-type blur during FoV movement instead of the smearing blur you get in 60 Hz and low-frame-rate gameplay scenarios. That up to 8 ms / 8 px of "fuzzy blur inside the lines" is most evident when moving the FoV at high speed, so how much clarity is lost depends on how fast you are moving your FoV around at any given time. It also depends on staying at a solid 120 fps; at lower or varying frame rate averages, you aren't going to stay at the 50% blur reduction (vs. a 60 Hz baseline) in motion clarity.

You're talking about worn FW900s. CRTs, when younger or less used, just basically work, at least for casual usage. Back in the day there didn't use to be a forum like this filled with endless frustrations, I don't think. Sometimes someone would get a lemon, but generally computer displays just kept getting better and better, similar to other advances in computing. And then suddenly it became more about weight and thinness, and raw PQ and motion quality took a back seat. Years later it still sucks for enthusiast usage, and it's awful that that happened. It is kind of fun that the gaming press has apparently rediscovered CRTs more recently, but it's kind of too late now.

That said, OLED is awesome. Hopefully they get the incompatibilities sorted. Is the BFI really not usable? I don't mind it being darker. However, is it being unavailable in some game modes, or adding input lag, actually a thing?
 
You're talking about worn FW900s. CRTs, when younger or less used, just basically work, at least for casual usage. Back in the day there didn't use to be a forum like this filled with endless frustrations, I don't think. Sometimes someone would get a lemon, but generally computer displays just kept getting better and better, similar to other advances in computing. And then suddenly it became more about weight and thinness, and raw PQ and motion quality took a back seat. Years later it still sucks for enthusiast usage, and it's awful that that happened. It is kind of fun that the gaming press has apparently rediscovered CRTs more recently, but it's kind of too late now.

That said, OLED is awesome. Hopefully they get the incompatibilities sorted. Is the BFI really not usable? I don't mind it being darker. However, is it being unavailable in some game modes, or adding input lag, actually a thing?

CRTs have plenty of issues of their own, but they are with focus, the DACs of GPUs, image geometry, etc. Personally I'd take an LCD over them any time. I have a couple of low-res reference-grade CRTs I use for retro gaming, and they are great for that purpose.

I don't agree that BFI causes any issues with gaming. As long as it is set to Auto or Medium it works well for me and is not too dim. To me its main drawbacks are that it is not quick to activate and that it does not work together with VRR. I would also not use it with HDR because it dims highlights a lot, whereas in SDR the dimming is not noticeable.

I have tried playing Doom Eternal with and without BFI and notice no input lag difference. To me the biggest difference you will see in input lag on the CX series is if you use anything other than Game mode. Even with Instant Game Response enabled I can tell a slight difference between Game and for example Expert modes. It does not mean the other modes are unusable, just that there is an input lag you can perceive.
 
If LG was advertising their 2020 OLED X series as a G-Sync compatible display, they should have added a DisplayPort as well. Most likely, if they had, this HDMI 2.1 RGB/G-Sync fiasco wouldn't have happened. I believe a DP port is more PC-friendly than an HDMI port. I really don't understand why TV manufacturers don't even add a DP port on their TVs if they want to cater to the PC community.
 
If LG was advertising their 2020 OLED X series as a G-Sync compatible display, they should have added a DisplayPort as well. Most likely, if they had, this HDMI 2.1 RGB/G-Sync fiasco wouldn't have happened. I believe a DP port is more PC-friendly than an HDMI port. I really don't understand why TV manufacturers don't even add a DP port on their TVs if they want to cater to the PC community.

It's an extra development and hardware cost, and DP 1.4 does not have anywhere near as much bandwidth as HDMI 2.1. The issues are all just problems with the firmware rather than anything with the port itself. HDMI 2.1 is still brand new in actual devices, so there are bound to be some compatibility issues initially. That's the price of early adoption like LG has done on their TVs.
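To put rough numbers on the bandwidth point, here's a sketch comparing usable link rates (after line coding) against 4K120 10-bit RGB. These are active-pixel figures only, so real-world margins are tighter, but the ordering is what matters:

```python
# Usable payload bandwidth after line-coding overhead, in Gbit/s.
links = {
    "HDMI 2.0 (TMDS, 8b/10b)": 18.0 * 8 / 10,   # 14.4
    "DP 1.4 (HBR3, 8b/10b)": 32.4 * 8 / 10,     # 25.92
    "HDMI 2.1 (FRL, 16b/18b)": 48.0 * 16 / 18,  # ~42.67
}

# 4K120 10-bit RGB, active pixels only (no blanking).
need_gbps = 3840 * 2160 * 120 * 10 * 3 / 1e9

for name, gbps in links.items():
    verdict = "fits" if gbps >= need_gbps else "needs DSC or chroma subsampling"
    print(f"{name}: {gbps:.2f} Gbit/s -> 4K120 10-bit RGB {verdict}")
```

So DP 1.4 can only carry this mode with DSC or subsampling, while HDMI 2.1 FRL carries it uncompressed, which is presumably why LG standardized on the one port.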
 
If LG was advertising their 2020 OLED X series as a G-Sync compatible display, they should have added a DisplayPort as well. Most likely, if they had, this HDMI 2.1 RGB/G-Sync fiasco wouldn't have happened. I believe a DP port is more PC-friendly than an HDMI port. I really don't understand why TV manufacturers don't even add a DP port on their TVs if they want to cater to the PC community.

I agree with you, and let's standardize on one blasted interface and be done with it, like we did on PC peripherals where everything is USB for the most part.

Don't get me started on aspect ratios... it's going to get even worse now with 16:9 vs. ultrawide.
 
If LG was advertising their 2020 OLED X series as a G-Sync compatible display, they should have added a DisplayPort as well. Most likely, if they had, this HDMI 2.1 RGB/G-Sync fiasco wouldn't have happened. I believe a DP port is more PC-friendly than an HDMI port. I really don't understand why TV manufacturers don't even add a DP port on their TVs if they want to cater to the PC community.

By using HDMI 2.1, they can use the same port to support up to 8K120 (with DSC) across their product lines. Also remember that the devices that connect to the TV are largely HDMI anyway, and fewer HDMI ports can limit connectivity options (I need 5; I currently have to physically move an HDMI cable). Given that, it doesn't make sense to add DP 1.4 ports.
 
Still enjoying my 48CX; I would not go back to a CRT or anything LCD ever again. I was helping my wife with a quest, well, doing it for her on her computer. She has a 4K 7000-something Samsung 40 inch that I hate. I can never let her use my computer or I'll have to buy another 48CX, lol. The motion, and everything being washed out... the difference was night and day. 60 Hz response with the mouse is night and day. I run my CX at 4K@120Hz all the time, using a 5700 XT on water. The 4:2:0 doesn't bother me much; I can see it in World of Warcraft in red text. Whenever I get my hands on a new card it will be a non-issue. I have VRR on with my 5700 XT and haven't noticed any issues with it. Seems like I'm the only person in here using an AMD card with this TV. How is the Club3D adapter now? I would really like to get 4:4:4 @ 4K120Hz. Looks like I'll be waiting a few months, maybe into next year, before I can get a proper card with a water block. If they can get VRR working with it that would be great.
 
Still enjoying my 48CX; I would not go back to a CRT or anything LCD ever again. I was helping my wife with a quest, well, doing it for her on her computer. She has a 4K 7000-something Samsung 40 inch that I hate. I can never let her use my computer or I'll have to buy another 48CX, lol. The motion, and everything being washed out... the difference was night and day. 60 Hz response with the mouse is night and day. I run my CX at 4K@120Hz all the time, using a 5700 XT on water. The 4:2:0 doesn't bother me much; I can see it in World of Warcraft in red text. Whenever I get my hands on a new card it will be a non-issue. I have VRR on with my 5700 XT and haven't noticed any issues with it. Seems like I'm the only person in here using an AMD card with this TV. How is the Club3D adapter now? I would really like to get 4:4:4 @ 4K120Hz. Looks like I'll be waiting a few months, maybe into next year, before I can get a proper card with a water block. If they can get VRR working with it that would be great.

That's how I feel as well. I'm not going to tell other people what they should use, but I know what works for me. Small red text is currently the only thing that suffers in my use case, and I rarely encounter that. And I could get around it by changing my display settings but I don't, because it's such a minor issue. I can wait for the fix.

Going back to LCD is simply not an option for me. I've had ultrawide monitors and don't like them...at least not nearly as much as a large 4K. The game changed when I got my first Samsung 40" 4K. For me, there's no going back after experiencing that level of immersion. And there's no going back to LCD after experiencing OLED contrast/blacks/motion. And now that we finally have 4K @ 120Hz OLED...next level. There are just some teething issues to work through. If that means waiting for the CXI then so be it, but I'm hoping that our existing TVs will get the updates they need to fully work with these new HDMI 2.1 GPUs. Like I said earlier, the hardware is capable and all of the existing issues should be solvable through software/firmware updates. And, we know that LG is actively working on it.

I'm not part of some cult, I just know what looks best to me. I've used a bunch of different displays over the years and this feels like end-game stuff, at least until Micro LED is a thing (and I'm sure it won't be as perfect as everyone thinks, at least at first). If someone tries the CX and is frustrated by it not being perfect for everything, then that's fine. I've been doing large 4K, and OLED, for years now so it's not like I'm still in the initial lovestruck phase...except I am. It's like being in a relationship where you still get butterflies years after being with that person. Every time I fire a game up, I am flabbergasted by the image quality and motion handling. If others don't see that, or if it's not enough to overcome the other minor quibbles then who am I to tell them that they shouldn't go back to a small 27" LCD or an ultrawide? Not all of us have the same preferences in cars, women, clothing, or food...why should it be any different for tech? I am just doing a disservice to myself if I let anyone else's criticisms take away from MY enjoyment of what is the most stunning display I've had the pleasure of using.
 
The future of DisplayPort (DisplayPort 2.0) is USB-C. Neither cards nor devices support it yet, but it exceeds HDMI 2.1's specs. Strangely, I feel like this is the best bet for TV inclusion in the future (2022+), given USB4 and DP are converging on USB-C, but I do not expect it to ever supplant HDMI 2.x on TVs. I think if 8K TVs gain in popularity, if anything we will have another round of HDMI updates in 2-3 years.
 
The future of DisplayPort (DisplayPort 2.0) is USB-C. Neither cards nor devices support it yet, but it exceeds HDMI 2.1's specs. Strangely, I feel like this is the best bet for TV inclusion in the future (2022+), given USB4 and DP are converging on USB-C, but I do not expect it to ever supplant HDMI 2.x on TVs. I think if 8K TVs gain in popularity, if anything we will have another round of HDMI updates in 2-3 years.

Given HDMI 2.1 already supports 8K120, I doubt we'll see another spec update for a while. And TVs aren't doing away with HDMI, given that's what all the devices people connect to them use. DP was and always will be a PC-specific connector.
 
Still enjoying my 48CX; I would not go back to a CRT or anything LCD ever again. I was helping my wife with a quest, well, doing it for her on her computer. She has a 4K 7000-something Samsung 40 inch that I hate. I can never let her use my computer or I'll have to buy another 48CX, lol. The motion, and everything being washed out... the difference was night and day. 60 Hz response with the mouse is night and day. I run my CX at 4K@120Hz all the time, using a 5700 XT on water. The 4:2:0 doesn't bother me much; I can see it in World of Warcraft in red text. Whenever I get my hands on a new card it will be a non-issue. I have VRR on with my 5700 XT and haven't noticed any issues with it. Seems like I'm the only person in here using an AMD card with this TV. How is the Club3D adapter now? I would really like to get 4:4:4 @ 4K120Hz. Looks like I'll be waiting a few months, maybe into next year, before I can get a proper card with a water block. If they can get VRR working with it that would be great.

The Club3D adapter works fairly well now. There are still occasional issues, but mine has been fine for now. I hope they can get VRR working. FYI, it's not worth buying if you don't have a GPU that supports Display Stream Compression; I don't know if the 5700 does.
 
TL;DR - I have problems with X1X VRR using an HDMI Matrix Switcher on a CX.

Hey all, so I have a 55" CX in the same room with a 65" Vizio and my gaming devices (PC, XBOX, etc...), I wanted to be able to choose which TV gets the display output (as my 55" setup/area is single player only) so I picked up an HDMI Matrix switcher (this one - https://www.easycoolav.com/products/hdmi-matrix-4x2-matrix-4-in-2-out-18gbps-mx42hs). All my testing went well, I'm seeing HDR, I'm seeing 4K/60, etc... until I hit the X1X and ALLM and VRR. With the matrix set to copy HDMI OUT 1 EDID data (which is the LG), ALLM and VRR options (understandably) became selectable. In the XBOX Guide, everything is fine, but load up Fortnite and "Instant Game Response" pops up on the TV, and then the TV starts to periodically blank every few seconds.

In my experience, screen blanking like this usually indicates something with the cable, like it's under-spec or weak/damaged in some way.

However, if I have the overlay showing, it stops blanking. If I turn off VRR in the X1X settings (but leave ALLM on) it stops blanking. If I connect the X1X directly to the TV with either of the 2 cables (both hdmi 2.1 cables), it works fine.

So... what do you think it is about the matrix switcher that could be causing the problem? It *should* be copying the EDID verbatim; it claims to support a full 18 Gbps and says HDMI 2.0 (but it must be 2.0a given it supports 4K/60Hz 4:4:4 + HDR/DV), and I can't find anything online that clearly states any HDMI bandwidth requirements to support VRR. Maybe it's stripping some data that's supposed to get back to the X1X?

Thoughts?
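For reference, here's my rough math on the bandwidth angle. VRR itself shouldn't add meaningful bandwidth, but the 4K60 modes in play already sit close to an 18 Gbps switcher's usable ceiling (active pixels only in this sketch; real timings with blanking are tighter still), which is one reason HDR at 4K60 over HDMI 2.0 gear usually ends up at 4:2:2/4:2:0. That makes me suspect the EDID/handshake path rather than raw throughput:

```python
TMDS_USABLE_GBPS = 18.0 * 8 / 10  # 14.4 Gbit/s payload after 8b/10b coding

def active_rate_gbps(w, h, hz, bits_per_sample, samples_per_px):
    """Active-pixel data rate in Gbit/s, ignoring blanking overhead."""
    return w * h * hz * bits_per_sample * samples_per_px / 1e9

modes = {
    "4K60 8-bit RGB/4:4:4": active_rate_gbps(3840, 2160, 60, 8, 3),
    "4K60 10-bit RGB/4:4:4": active_rate_gbps(3840, 2160, 60, 10, 3),
    "4K60 10-bit 4:2:2": active_rate_gbps(3840, 2160, 60, 10, 2),
}
for name, gbps in modes.items():
    fits = "fits" if gbps <= TMDS_USABLE_GBPS else "does not fit"
    print(f"{name}: {gbps:.2f} Gbit/s -> {fits} within 14.4 Gbit/s usable")
```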
 
Anyone have recommendations for HDR PC games that look incredible on the CX? My 48" arrives Wednesday, and I’m really looking for something with that “wow” factor. Will be running @ 4K with an RTX 3090 that I just installed this weekend.

Also, is there a difference between selecting 4:4:4 or RGB when gaming?

CoD:MW is special according to this thread.
 
The Club3D adapter works fairly well now. There are still occasional issues, but mine has been fine for now. I hope they can get VRR working. FYI, it's not worth buying if you don't have a GPU that supports Display Stream Compression; I don't know if the 5700 does.

You got a link to the new firmware for it?
That AVS link from two months ago is expired :-(
 
Tales from triple CX 48s...

I just got my three CXs to work in SLI Surround at 7680x1440 120 Hz HDR on my Pascal Titan Xp's (one HDMI, 2 DPs).

Loaded up Metro 2033, which I have not played before. It finally worked across three monitors in SLI Surround.

And the left screen lags a good 100 ms behind the other two, which are fine.

They also somehow ran at 60 Hz, despite 120 on the desktop.

Truly disappointing. Trying to get DOOM going in Surround with no luck so far.

Ugh. More to come.
 
The lack of G-Sync support and Instant Game Response sucks, but all you can do is be patient. It is nice to know that LG and Nvidia are aware of the issue and working on it.

The improved image clarity with the RTX 3080 @ 4K120 RGB 10-bit has been real nice and works a million times better than the Club3D adapter did with my 2080 Ti.

Most of the negative comments about the 48CX from unsatisfied owners seem to be coming from the ultrawide crowd, who are just weird people to begin with anyway... I mean really, no big surprise there honestly, with their freakish hammerheads and such!



I'll never understand the ultrawide stans. They get so excited by a wide display, seemingly not caring that it's a display with the vertical space cut off, giving less view.

Reminds me of an old Steven Seagal skit from MADtv.

 
TL;DR - I have problems with X1X VRR using an HDMI Matrix Switcher on a CX.

Hey all, so I have a 55" CX in the same room with a 65" Vizio and my gaming devices (PC, XBOX, etc...), I wanted to be able to choose which TV gets the display output (as my 55" setup/area is single player only) so I picked up an HDMI Matrix switcher (this one - https://www.easycoolav.com/products/hdmi-matrix-4x2-matrix-4-in-2-out-18gbps-mx42hs). All my testing went well, I'm seeing HDR, I'm seeing 4K/60, etc... until I hit the X1X and ALLM and VRR. With the matrix set to copy HDMI OUT 1 EDID data (which is the LG), ALLM and VRR options (understandably) became selectable. In the XBOX Guide, everything is fine, but load up Fortnite and "Instant Game Response" pops up on the TV, and then the TV starts to periodically blank every few seconds.

In my experience, screen blanking like this usually indicates something with the cable, like it's under-spec or weak/damaged in some way.

However, if I have the overlay showing, it stops blanking. If I turn off VRR in the X1X settings (but leave ALLM on) it stops blanking. If I connect the X1X directly to the TV with either of the 2 cables (both hdmi 2.1 cables), it works fine.

So... what do you think it is about the matrix switcher that could be causing the problem? It *should* be copying the EDID verbatim; it claims to support a full 18 Gbps and says HDMI 2.0 (but it must be 2.0a given it supports 4K/60Hz 4:4:4 + HDR/DV), and I can't find anything online that clearly states any HDMI bandwidth requirements to support VRR. Maybe it's stripping some data that's supposed to get back to the X1X?

Thoughts?

HDMI has HDCP copy protection, which other display cables and connections don't. I'm not educated about how it works exactly, but personally I think this anti-piracy tech can cause a lot of handshaking problems. If it detects what it thinks is a bad scenario it can even blank another HDMI monitor's signal entirely; I'm supposing that is to prevent hardware that duplicates HDCP-protected content. I'm thinking that if it's confused, or somehow gets an inadequate result at first, it can get into a stutter mode. That is all just me guessing at why it happens; however, I can say for certain that this is what happens after the issue occurs:
After a TV is turned on or awakened and exhibits the blinking (or no signal) issue, unplugging the physical cable from the blinking monitor and plugging it back in to re-initialize the HDMI handshaking usually clears it up. Since nothing has changed and it works 100% after forcing the handshaking/initialization to start over, it leads me to believe it's an HDMI handshaking signal issue. That is from running multiple HDMI displays off of a GPU, so I can't say whether your splitter is failing to clone and mimic the displays well enough to fool HDCP, or if any data is missing in the transfer.

Lately I've been leaving my hdmi screens on a blank screen saver and my gaming monitor on a dark screen saver, then turning the hdmi TVs off using the remote. If I'm going to be away a long time I turn the displayport gaming monitor off manually (which has no remote, so a little less convenient). When I come back, I turn the screens back on and give them a moment before I wake the PC from screensaver mode. Once I get an OLED, all three of my screens will be TVs, so I'll be able to turn them on and off via two remotes since my other two screens are both the exact same model and remote. I'm probably going to get a two-hdmi-output 3090 model so that I only have to rely on a single dp adapter rather than two as well.

Anyone have recommendations for HDR PC games that look incredible on the CX? My 48" arrives Wednesday, and I’m really looking for something with that “wow” factor. Will be running @ 4K with an RTX 3090 that I just installed this weekend.

Also, is there a difference between selecting 4:4:4 or RGB when gaming?
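On the 4:4:4 vs RGB question: both are full-resolution signals, so their raw payload is identical; they differ only in how color is encoded, not in resolution or bandwidth. Only the subsampled YCbCr modes (4:2:2, 4:2:0) actually save bandwidth. A simplified sketch of per-pixel sample counts (8-bit, ignoring blanking and link-encoding overhead):

```python
# Per-pixel sample counts for common pixel formats (simplified model).
w, h = 3840, 2160
pixels = w * h

def samples_per_frame(chroma_per_4: int) -> float:
    """Luma (or each of R/G/B) is always full resolution; chroma_per_4 is
    the number of chroma sample pairs carried per 4 luma samples
    (4 -> 4:4:4, 2 -> 4:2:2, 1 -> 4:2:0, simplified)."""
    luma = pixels
    chroma = pixels * chroma_per_4 / 4 * 2   # Cb and Cr together
    return luma + chroma

rgb_or_444 = samples_per_frame(4)   # 3 samples/pixel -- same as RGB
yuv_422 = samples_per_frame(2)      # 2 samples/pixel
yuv_420 = samples_per_frame(1)      # 1.5 samples/pixel
print(rgb_or_444 / pixels, yuv_422 / pixels, yuv_420 / pixels)  # 3.0 2.0 1.5
```

So picking 4:4:4 vs RGB mostly comes down to which one your TV handles with less processing, not image quality.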

The potential benchmark for DLSS 2.0 at 4K, with very high graphical fidelity resembling downsampled 5K or 8K and at much higher frame rates (+RTX optionally, but with somewhat lower frame rates), should be Cyberpunk 2077. CD Projekt Red said in a Twitter post that they planned HDR support for the title too, but idk if that feature is being actively promoted or has been implemented (yet?).

Formerly set for April 16, 2020, then September 17, we now know that the Cyberpunk 2077 release date will fall on November 19...

edit... more than a month away, damn..
 
Last edited:
I'll never understand the ultrawide stans. They get so excited by a wide display, seemingly not caring that it's a display with the vertical space cut off, giving less view.

reminds me of an old Steven Seagal skit from mad tv



Most games use HOR+, which is based on the vertical. This is because games, like most CGI creation suites, use virtual cameras that emulate a real lens. With a real lens, any wider aspect ratio is going to show more of the scene, not less. The main place ultrawides cut off real estate is on the desktop/apps, not the vast majority of games with their virtual cameras. However, you could get a much larger ultrawide by running an ultrawide rez on an OLED TV. This would be especially good since the black letterboxing bars on OLEDs should be emitters that are turned off, so ultra black. If it still bothered you enough, you could paint your wall black behind the TV or hang a black tapestry (as long as you shortened it enough and had enough space vs power plugs and such in regard to fire risk, of course). This could be useful for sitting closer for immersion in racing or flying games, for example.

[Images: HOR+ game scenes compared across aspect ratios; Eyefinity aspect-ratio configurations visualized]
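The HOR+ behavior is just the standard perspective-projection relationship: the engine fixes the vertical FOV and derives the horizontal FOV from it and the aspect ratio. A quick sketch of the generic formula (not tied to any particular engine):

```python
import math

def hfov_from_vfov(vfov_deg: float, aspect: float) -> float:
    """Horizontal FOV implied by a fixed vertical FOV (HOR+ scaling)."""
    half_v = math.radians(vfov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half_v) * aspect))

# Same 60-degree vertical FOV -- a wider aspect only ever adds to the sides:
for name, aspect in [("16:9", 16 / 9), ("21:9", 21 / 9), ("32:9", 32 / 9)]:
    print(f"{name}: {hfov_from_vfov(60, aspect):.1f} deg horizontal")
```

Vertical FOV stays at 60 degrees in every case; only the horizontal widens.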
 
As to FoV: the most important (for me) actually is vertical FOV. That being said: I'm talking triple screens.

Ultrawide vs 3x40" triples in simracing was night and day before VR came along. The UW monitors had the horizontal FOV, but even at 60cm it was like looking out of a tank.

Right now, for casual gaming (open world titles, sports titles, console gaming, AAA, ...) I think 16:9 40"+ is the best compromise. I've been using a 43" Momentum for 2 years now and I couldn't ever go back to something smaller. The only reason why I'm reluctant to buy a 48CX for 1200EUR is thinking that it may be too small if I'd ever want to move it to the bedroom or something like that..
 
I got a call back from LG today and they asked me for my serial number. She said their system still does not show a new firmware update, even though I pointed her to the LG press release saying an update was now available for the LG C9 addressing the RTX 3xxx series video cards and LG's 2019 G-Sync feature. Super frustrating. I just installed the new nVidia drivers, which turn on G-Sync by default if your TV supports the feature, and upon reboot: massive black bar right down the middle of the screen. Luckily I was able to reboot and disable G-Sync.
 
I got a call back from LG today and they asked me for my serial number. She said their system still does not show a new firmware update, even though I pointed her to the LG press release saying an update was now available for the LG C9 addressing the RTX 3xxx series video cards and LG's 2019 G-Sync feature. Super frustrating. I just installed the new nVidia drivers, which turn on G-Sync by default if your TV supports the feature, and upon reboot: massive black bar right down the middle of the screen. Luckily I was able to reboot and disable G-Sync.
I too get this same issue right now as well.

Huh just got a firmware update for my tv. Going to version 3.10.41. Not sure if this is the G-Sync fix that was supposed to come out?
 
Racing games? Sure. Flying games, absolutely not. You want as much real estate as you can get.

Don't confuse how a lens and a virtual lens works to show a scene - with actual vertical resolution/pixels. HOR+ games show more game world (in wider resolutions) as the images I linked in my previous post plainly (flying plane-ly?) show.

In HOR+ games, which are the vast majority of games, and which use virtual lenses (like the virtual cameras in the CGI creation suites that made them typically do), the vertical scene height is the exact same in every aspect ratio at the same viewing distance. The wider aspect ratios equate to "wider virtual lenses" and therefore will always show more game world on the sides at the same virtual camera distance in any HOR+ game. Those pictures show it very clearly.


As to FoV: the most important (for me) actually is vertical FOV. That being said: I'm talking triple screens.

Ultrawide vs 3x40" triples in simracing was night and day before VR came along. The UW monitors had the horizontal FOV, but even at 60cm it was like looking out of a tank.

Right now, for casual gaming (open world titles, sports titles, console gaming, AAA, ...) I think 16:9 40"+ is the best compromise. I've been using a 43" Momentum for 2 years now and I couldn't ever go back to something smaller. The only reason why I'm reluctant to buy a 48CX for 1200EUR is thinking that it may be too small if I'd ever want to move it to the bedroom or something like that..

I agree that your "windshield" is a tiny belt on ~13" tall screens, which is why I mentioned playing in uw resolution on a big OLED TV instead as an option, at least for some games... for those of us who don't want low frame rates on a high Hz capable monitor when using triple monitors or who don't want to see bezels (or both). In fact, you'd get higher frame rates in uw rez on a 4k TV incidentally since it's obviously a little less than 4k rez.

The difference there that you are talking about is the physical height of the screens, not how much more vertical game world is shown - since in HOR+ games that never changes at the same camera view distance.
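On the frame-rate side, the gain from a letterboxed ultrawide rez is roughly proportional to the pixels you skip, assuming the game is GPU-bound. A rough sketch using 3840x1600 (a common 21:9-ish height) as a hypothetical letterbox on a 4K panel:

```python
# Rough pixel-count comparison: a letterboxed 21:9-ish resolution on a
# 4K panel renders fewer pixels, so frame rates should rise roughly in
# proportion (assuming the game is GPU/fill-rate bound).
full_4k = 3840 * 2160
uw = 3840 * 1600          # hypothetical letterboxed ultrawide height
ratio = uw / full_4k
print(round(ratio, 3))    # ~0.741 -> roughly a quarter fewer pixels to render
```

That's on top of the letterbox bars being fully-off OLED emitters.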
 
Last edited:
I too get this same issue right now as well.

Huh just got a firmware update for my tv. Going to version 3.10.41. Not sure if this is the G-Sync fix that was supposed to come out?

3.10.41 is old at this point. Mine is at 3.10.44 but that also was ages ago. I'm guessing they will have the new firmware out sometime in October.
 