LG 48CX

1080 Ti is a good card, no doubt. I've held onto mine for my 2nd PC this entire time, but I was not happy with the performance I was getting at 4K on my Acer X27. The 2080 Ti was roughly a 35% average performance uplift over a 1080 Ti at 4K. That meant the difference between getting 50-60 fps and getting 70-80, and actually making some use of >60Hz on my monitor.
 
Yes, I stuck with 1440p until I considered 4K at over 60Hz more meaningful and achievable at native resolution (and until display hardware was even capable of it).

This will be a big milestone for me on that front, and for OLED, and for that matter some HDR.

------------

I usually shoot for a 100fps average on a 120Hz+ display, for something like a:

(70)/85fps <<< 100fps >>> 115 (130) fps range, or better.

I'll have to see what I can get with a 3090 on some games. I could always tweak settings if necessary to dial in, especially the obvious of turning down/off raytracing. DLSS on supported titles could also be huge.

-----------

I'm still holding out to see November LG OLED deals and what Vizio's OLED offering is like, but I'm keeping my eye out for TV stands ahead of time. According to reviews this one is sturdy. Between the wheels and the 15 degree tilt up/down, it looks like it would offer a lot of adjustment even if it's not an articulating arm that extends outward. I'd have it stuck behind a bench-like desk anyway, so I'd move my peripheral island desk on wheels forward or back if I needed to for a racing game or something, rather than moving the OLED forward out of line with the other monitor(s) in the array alongside it.

https://www.amazon.com/1homefurnit-Mobile-Display-Trolley-Locking/dp/B074TB5MG5/

(I'd omit installing the tray)
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
Well, my CX came today and holy cow!! It did not disappoint. Gaming is fantastic, and carrying the sound over DisplayPort through a DisplayPort-to-HDMI adapter into the receiver works like a champ. So glad I did it. This thread was very helpful in making my decision. Thank you. Now to pore over the 100 pages to find the settings people seem to like best.
 
I've got an Alienware/Dell gaming laptop, and I should probably determine whether it can drive the LG 48CX properly (4K, 120Hz, mid-range graphics quality settings) before I get too excited. I'd be playing Cyberpunk 2077 on it, and probably some Elite Dangerous and Subnautica.

Alienware m15
Intel(R) Core(TM) i7-8750H CPU @ 2.20GHz
32768MB RAM
NVIDIA GeForce RTX 2070
24310 MB
 
Ok so we finally have some comparative RTX 3090 performance metrics.

https://images.hothardware.com/cont...ontent/small_geforce-rtx-3090-performance.jpg

It's roughly 50% faster than an RTX Titan, which in turn is about 3-5% faster than a 2080 Ti, so the ballpark estimates of a 3090 being only 15-20% faster than a 3080 seem accurate.

Looks like the 3090 will be right in the range of 50-55% faster than a 2080 Ti. Now the question is how it will scale with a moderate core/memory OC, and whose cooler can handle 400W.
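Chaining those percentages together makes the ballpark explicit. A minimal sketch, using the thread's estimated multipliers (these are rumor-stage estimates, not measured benchmarks):

```python
# Rough relative-performance ballparks (2080 Ti = 1.00).
# All multipliers are the thread's estimates, not benchmarks.
perf_2080ti = 1.00
perf_titan_rtx = perf_2080ti * 1.04   # RTX Titan ~3-5% faster than 2080 Ti
perf_3090 = perf_titan_rtx * 1.50     # slide shows ~50% over the RTX Titan

# Implied uplift of the 3090 over the 2080 Ti
uplift_over_2080ti = perf_3090 / perf_2080ti - 1
print(f"3090 vs 2080 Ti: +{uplift_over_2080ti:.0%}")   # -> +56%

# If the 3080 lands ~30% above a 2080 Ti (a common launch-window estimate)
perf_3080 = 1.30
print(f"3090 vs 3080: +{perf_3090 / perf_3080 - 1:.0%}")  # -> +20%
```

So a ~50% gain over the RTX Titan does land the 3090 in the 50-55%-over-2080-Ti and 15-20%-over-3080 range being quoted.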
 
Those slides could be fake, but it is what I'm expecting. 2080 Ti to 3080 is simply not a big enough performance gain for me at ~40% max. 3090 it is.
 
It'll really depend on what the OC can get up to (probably dictated by power, unless Samsung's fab process is bad; solder/liquid metal/pencil mods can likely overcome power limits). I run my 2080 Ti at 2.1GHz solid, no problem. It's on water, never gets above 40C, never throttles, etc. Hopefully 3090s can at least do similar, or that performance delta will get cut down some. If they can do 2.3GHz+ or something, it will be truly awesome.
 
Those slides could be fake, but it is what I'm expecting. 2080 Ti to 3080 is simply not a big enough performance gain for me at ~40% max. 3090 it is.

It's directly from Nvidia's briefing today:

https://hothardware.com/reviews/nvidia-geforce-rtx-30-series-ampere-details

The 3080 FE cooler is looking pretty unimpressive to me, although it is only 2 slots. 2 dBA quieter and 3C cooler, but it is dealing with 80W more heat, so I guess that's quite an achievement.

EDIT: There are 3090 cooler slides as well, where it looks like 28 dBA @ 75C. Also, given 150W per 8-pin and an additional 75W from the PEG slot, the FE will be pretty handicapped compared to all the partner cards equipped with three 8-pins. I'm starting to reconsider getting an FE now.
 
Does anyone have any thoughts on the 3080's 10GB of memory eventually becoming a handicap at 4K resolution? Sure, today it's plenty, but how about a year from now? Just trying to justify getting a 3090 - but (to me) it's almost silly to pay over 2x the cost for a potential 15% to 20% in extra performance. This is such a tough decision to make coming from a 1080 Ti.
 
Personally I don't really think it's a big deal, since the majority of games even at 4K sit in the 6GB range. I've always believed that by the time you are handicapped by memory capacity, the card's already old/slow.
 
Just get the 3090. It will be the top dog GPU for at least 2 years, and you don't have to worry about Nvidia hosing you with:

A. a 20GB 3080
B. a 3080 Ti
C. both
 
If a game like RDR2 requires running 8K resolution to fill the 2080 Ti's 11 GB of memory, then 10 GB for 4K will be perfectly fine. Nvidia has said on Reddit that they weigh cost against what developers are actually going to use, and landed on the current amounts based on that. At the same time there are rumors of a 16 GB 3070 Ti and a 20 GB 3080, so who knows.

IMO 10 GB is plenty. Just because there are games that extensively cache stuff in VRAM does not mean they actually need that much. They just make better use of the available resources.
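On how much VRAM is strictly required versus merely cached, a back-of-envelope sketch of the raw render-target math is illustrative. The buffer formats and counts here are hypothetical examples, not any specific engine's layout:

```python
# Back-of-envelope VRAM for 4K render targets. Very simplified:
# ignores compression, mipmaps, and engine-specific buffers,
# and the formats below are illustrative assumptions.
W, H = 3840, 2160
MB = 1024 * 1024

def target_mb(bytes_per_pixel):
    """Size of one full-resolution render target in MB."""
    return W * H * bytes_per_pixel / MB

color = target_mb(8)        # e.g. RGBA16F HDR color buffer
depth = target_mb(4)        # 32-bit depth/stencil
gbuffer = 4 * target_mb(4)  # assume four 32-bit G-buffer targets

total = color + depth + gbuffer
print(f"Render targets: ~{total:.0f} MB")
```

Even with generous assumptions the render targets total a couple hundred MB; the rest of a game's multi-GB footprint is textures, geometry, and caching, which is why allocated VRAM tends to overstate what's actually needed.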

To me the 3080 is the card to buy. It's in that zone where it is faster than the already blazing fast 2080 Ti while being a lot cheaper. With the 3090 offering a modest 20-30% performance boost, it does not seem worth the extra cost to me. This is, of course, if you consider practical, cost-effective choices rather than just buying the 3090 because you can afford it and want the best.

I would not buy at launch though. There is bound to be a Cyberpunk 2077 bundle in November, and by that time there will also be benchmarks, AMD's cards, and who knows, maybe even some in-between models from Nvidia if AMD's turn out to be very good.
 
If the 3080 can run most AAA games at 100 fps average (4K max or near max settings), then I consider it a good buy to go with this TV. Otherwise, might have to pony up for the 3090.

...or wait 9 months to a year for the 3080 Ti/Super. Can't wait for benchmarks.
 
IMO the 3080 will run most AAA games in the 60-80FPS range when maxed out at 4K (no DLSS). The 3090 will add about 10FPS to that.

Of course, the smart thing to do is optimize away settings that do absolutely nothing for IQ but tank performance.

I'm basing this on the fact that my 2080 Ti was borderline 60FPS in many games and regularly dropped into the 50s, and that was with mixed settings.
 
This sounds about right to me too as a 2080 Ti owner. There will be varying cases; corridor shooters like Doom Eternal and multiplayer shooters will easily run at something like 100-120 fps or more. But for the "visuals first" AAA single-player games, expect 60-80 and expect to still turn down some settings.
 
I think he read the dimensions wrong. They're triple slot at most, and that isn't 5" "wide" in terms of mobo real estate. I think they're 5" "tall" in the way I would think of things, but that might be listed as width in physical dimensions.

What I regard as width is how thick the card is. The source I read said 5" wide, and I was like --- WTF. They used the wrong term, and I didn't figure that out. They certainly meant to say height, which is a massive relief. Still very curious whether the 3090 will fit for SLI in a large tower. Many of my racing sims take advantage of SLI.
 
1080 Ti SC hybrids were probably my last SLI build. Before that I DIY-modded AIOs with NZXT brackets onto a few 780 Ti SCs. While some games still support SLI (and others via hacks/workarounds), with the power of these new GPUs, along with the promise of DLSS frame rates giving results similar to supersampling, I decided to just pay the "Titan early tax" and get a 3090 hybrid/Neptune, rather than wait for a somewhat lower-priced Ti to double down on with SLI like I did for my last two GPU upgrades.

That, and the increased cost of this gen, plus the fact that SLI isn't being utilized in VR (even though it could have had a perfect scenario of one GPU per eye). Personally, I'd get a 4K-per-eye VR kit rather than run triple gaming monitors now if I were spending that kind of money on immersion, or wait for the next gen of VR where those kinds of resolutions should become more standard.

I hear ya. I am not into VR at all. My controls are so complex for sim racing that I always want to be able to see my rig, not be physically blind to it. I just dropped over $3k on the best F1 wheel on the market, which has an LCD screen in it. So triple monitors are where it's at for me. SLI is still a thing for most of my sims.

I just ordered two more of these bad boys. This first CX 48 has just blown me away. Time to get crazy!! The new 3090 can't come out soon enough...
 
My prediction for the 3090 is that it will sell out fast and price gouging will push it to $2,000 pretty quickly when you can find it. I hope I'm wrong, but this is my worry.
 
Well...back when I bought my Pascal Titans in 2016, they kept selling out on the Nvidia site. But they never increased their prices.

Let's hope it stays the same way this time!
 
Anyone have some high quality, high speed HDMI 2.1 cable recommendations? I need two 6-footers and a 12-footer for my triple CX 48" setup. I am too busy these days for all the hardcore research. I will PayPal $50 for the successful recommendation that gets me to 4K 120 with HDR. Which will only be possible when I bag a 3090 in a few weeks...
 
I will apply - https://www.bhphotovideo.com/c/product/966501-REG/audioquest_hdmifor02_2_0m_forest_hdmi_black.html
I am currently using one that is 5 years old, and it is as good as an HDMI cable can get.
 
I bought a cheap 10' Bifale 48 Gbps uncertified cable from Amazon. It works perfectly with my CX and the CAC-1085 at 4K 120 Hz 10-bit RGB. No mid-signal dropouts apart from the defective CAC-1085 itself.
https://www.amazon.com/gp/product/B07XJMDBS9/

I also have the 2 m Club3D cable and it's no different. There is nothing better until certified cables are released.
 
We still don't know whether the CX can do 4K 120 Hz at 4:4:4. As of now it downsamples to 4:2:2 internally. Only 1440p 120 Hz renders at 4:4:4, not even 1080p. It's either a hardware limitation or a firmware issue. I've gone back to 60 Hz because text looks bad at 120 Hz 4:2:2.

The C9 can do both 1080p and 1440p 120 Hz at 4:4:4. No one has reported back on 4K 120 Hz with the CAC-1085 adapter on the C9.
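For a sense of scale on why the chroma format matters at 4K 120 Hz, here's a quick data-rate sketch. The bits-per-pixel figures follow the standard chroma-subsampling ratios; real HDMI links add blanking and encoding overhead, so treat these as rough lower bounds, not exact link requirements:

```python
# Approximate uncompressed active-pixel data rates at 4K 120 Hz, 10-bit.
# Blanking and FRL encoding overhead are ignored (rough lower bounds).
W, H, HZ, BPC = 3840, 2160, 120, 10

bits_per_pixel = {
    "4:4:4": 3.0 * BPC,   # full chroma resolution
    "4:2:2": 2.0 * BPC,   # chroma halved horizontally
    "4:2:0": 1.5 * BPC,   # chroma halved both ways
}

pixel_rate = W * H * HZ  # pixels per second
for fmt, bpp in bits_per_pixel.items():
    gbps = pixel_rate * bpp / 1e9
    print(f"{fmt}: {gbps:.1f} Gbit/s")
# 4:4:4 ~29.9, 4:2:2 ~19.9, 4:2:0 ~14.9
```

Even 10-bit 4:4:4 is only about 30 Gbit/s of raw pixel data, under the 48 Gbit/s HDMI 2.1 FRL ceiling, which suggests bandwidth alone shouldn't force 4:2:2 on a full-rate HDMI 2.1 link.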
 
It may be too early to buy HDMI 2.1 cables; most if not all are not yet certified, right?

It's just a waiting game.
 
With the CAC-1085 adapter it behaves a bit weird on the CX. If I have HDR enabled it outputs 4K 120 Hz 10-bit 4:4:4, but if HDR is disabled it outputs 4:2:2. The problem does not seem to occur at 60 Hz, so it's hard to say whether this is a bug in the TV or the adapter. It's something I have reported to Club3D. Also, the CX needs to have pixel shift turned off at 120 Hz or it will be blurry.
 
I'm going to be pissed if this internal down-sampling still happens connected to a 3090. I may have to sell my 48CX and get a PG32UQX.
 
Yes, it would be disappointing if the BX/CX series only did 4:2:0 over an HDMI 2.1 connection, because I'm planning to get a 3090 myself. I need full RGB, and anything less than that looks horrible.
 
I'd rather play games at 4:2:0 (or 4:2:2; we'll get that at the very least) 120Hz on OLED than go back to LCD. I can't see any difference outside of rare instances of text over solid colors (which is how it's supposed to be). I just switch to 60Hz on the desktop when I need to.

But it would make no sense for this TV not to support it when it can do it fine at 1440p. I mean, it even upscales the 1440p signal to 4K fullscreen internally without losing 4:4:4.

I'm more concerned about the VRR issues, LG made it sound like those things happen on LCDs but are just less noticeable (which may be at least partially true, my previous VA panel with a g-sync module did have flickering when displaying dark scenes with a low framerate for example). I will definitely not use VRR until they fix the raised blacks (or a workaround is found). If that means no VRR until a new better display comes out well, so be it.
 
I'm going to be pissed if this internal down-sampling still happens connected to a 3090. I may have to sell my 48CX and get a PG32UQX.

I'd do the same if the PG32UQX weren't 1.5 years away. Asus announces stuff 2 years in advance.
 
I would never, ever go back to an LCD. Even if the CX has firmware problems with 4k 444 I still don't give a shit. I'll just keep doing what I'm doing now which is using a side monitor for desktop stuff.
 
"The GeForce RTX 30 series will support the following over HDMI: RGB/YUV444: 8, 10, 12bpc YUV422/420: 8, 10, 12bpc Same formats are supported on HDMI 2.1 Fixed Rate Link."

https://www.nvidia.com/en-us/geforc...color-on-30-series-cards-via-hdmi-21/2816420/
 
Read Monstieur's post. It doesn't matter what the GPUs support if the TV ends up internally downsampling from 4:4:4 to 4:2:2 like it does now when paired with the Club3D adapter.

We still don't know whether the CX can do 4K 120 Hz at 4:4:4. As of now it downsamples to 4:2:2 internally. Only 1440p 120 Hz renders at 4:4:4, not even 1080p. It's either a hardware limitation or a firmware issue. I've gone back to 60 Hz because text looks bad at 120 Hz 4:2:2.

The C9 can do both 1080p and 1440p 120 Hz at 4:4:4. No one has reported back on 4K 120 Hz with the CAC-1085 adapter on the C9.
 
I understand the point now after re-reading the above, but why would we assume it would downsample to 4:2:2 internally with an HDMI 2.1 source feeding it? It seems like it wasn't an accident that the CX has evolved into the premier gaming display to pair with the RTX 30 series, given it's the only OLED getting G-SYNC support from Nvidia. LG was also highlighted in the "8K" demo portion of the launch event last week, so it looks like a partnership. I can't imagine LG would ship a seemingly artificial downsampling limitation that would blow all of this up.
 
I hear ya. I am not into VR at all. My controls are so complex for sim racing, I always want to be able to see my rig and not be physically blind. I just dropped over 3k for the best F1 wheel on the market, which has an LCD screen in it. So triple monitors are where it's at for me. SLI is still a thing for most of my sims.

I just ordered two more of these bad boys. This first CX 48 has just blown me away. Time to get crazy!! The new 3090 can't come out soon enough...
Well, once we get into the next gen of headsets, they should have much higher resolutions per eye (and use AI upscaling for supersampling-like end results). From what I've learned, they should also start using better external cameras so that you can do quality mixed reality for things like virtual screens. That way you could get almost a reverse green-screen effect, with an overlay behind your actual racing control rig. You could make a circle of screen space around you, like a skin of virtual walls. That is really more about what they have been working on for the next generation of VR, though, and not so much what is available now.

There is also a push for rebuilding/mapping controllers into VR as well, just so you are aware, but I don't know how far along that is. You would sort of scan/capture or "paint" the controller into the VR home scene, like compositing video. At least then you would always know exactly where your controls are, even if it wouldn't show readouts on things like flight displays. The issue now is that some games show a flight stick, but in the VR world it's not where your actual flight stick is in the physical world. I think that "spray painting" (compositing) parts of reality into the scene, or vice versa (painting the screen around your desktop/controls in a mixed reality setup), is probably the way it will go eventually.

Dirt Rally 1 and 2 and other racing games are already a trip in VR, though, if you didn't have the eyes-on-controls requirement.


--------------------------------------

I'm going to be pissed if this internal down-sampling still happens connected to a 3090. I may have to sell my 48CX and get a PG32UQX.

I'd check out the 55" Vizios to see what they do with their own electronics and gaming chip, and also see how well the 55" LG C9 and E9 work on the Nvidia 3000 series. I'd rather sit back a little farther from a 55" OLED than use an LCD (especially an IPS) for media and gaming, personally.
 
I understand the point now after re-reading above but why would we think / assume it would downsample to 4:2:2 internally with an HDMI 2.1 source feeding it? Seems like it wasn't an accident that the CX has evolved into the premier gaming display to be paired with the RTX 30 series based on it being the only OLED getting GSYNC support from Nvidia. LG was also highlighted in the "8k" demo portion of the launch event last week so it looks like a partnership. I can't imagine LG would place a seemingly artificial downsampling limitation that would blow all of this up.

Because it already does it at 1080p, and currently at 4K when paired with the DP 1.4 adapter. It might still do it when connected straight to HDMI 2.1, but most likely it is a firmware bug.
 
Honestly, I had zero issues with my X27 in terms of PQ, so if they release a bigger one with 3x the zone count, I'm all for it.

In fact, in some areas the X27 is just better to me, like HDR gaming and PPI.
 
HDR is 50/50 on the X27. Sure, bright games are going to look amazing and have more punch than on the CX. But have you ever played the RE2 remake or RE7 in HDR on it? Nothing but a blooming fest.
 
Because it already does it at 1080p, and currently at 4K when paired with the DP 1.4 adapter. It might still do it when connected straight to HDMI 2.1, but most likely it is a firmware bug.
It might be an issue with the adapter doing the conversion from DP signaling to HDMI 2.1 signaling more than anything. We will know for sure when 3000 series cards get into the hands of LG CX owners.
 