LG 48CX

You are a rockstar, kasakka! I really appreciate your prompt help! After updating my Club3D adapter to the latest firmware, I am running 3840x1600@100 Hz as a test and it's working great on the 3080 -- no G-Sync of course, but buttery smooth compared to 60 Hz.

I haven't been able to test the 2080 Ti yet, but I will. So odd that HDMI 2.1 to HDMI 2.1 won't work for custom resolutions over 60 Hz, but DP 1.4 to HDMI 2.1 does work... yet G-Sync doesn't. So odd.

EDIT:

Noticed that 3840x1600@120 Hz with the Club3D adapter cuts off the edges of the screen (see attached screenshot).

Also, gamma/brightness seems to be lower and the output is forced to RGB in NVCP...

I just tested and for me 3840x1600 120 Hz will drop out of HDR on the desktop but works fine without cutting any edges or anything. In games I have been able to get it working with HDR too.
 
Unfortunately most of the Souls games are 60 fps, but at least I can lock a 60 fps minimum with everything cranked. I'd like to play Nioh 2, though I know the sequel unfortunately still has a 60 fps cap.

I think Remnant: From the Ashes is 120 Hz, though it's not fully souls-like. There's also Star Wars Jedi: Fallen Order, which isn't hardcore souls-like but has very souls-ish gameplay.
The upcoming Assassin's Creed Valhalla is supposed to put a much more challenging, stamina-meter, souls-like gameplay style into the game, which some people are already whining about lol.

I think Mortal Shell is 120 Hz capable and very souls-like, though it could use more work on making richer environments, and like you said, Sekiro can be unlocked, which is great. There is also a very promising-looking game in development now based on an avatar of the Monkey King. I've always been a fan of the Monkey King mythology and movies. I don't know what the frame rate cap will be or if it can be unlocked since it is still in development, but hopefully the new consoles with HDMI 2.1 will give some weight to higher frame rates for it and for ports and multi-platform development in general.

 

I really hope it works fully over HDMI 2.1 to HDMI 2.1 off of a 3000 series GPU eventually.
I understand that people find the Club3D adapter info valuable for previous generations of GPUs, especially since 3000 series stock is so bad currently. It can be confusing, though, having to keep filtering out or separating the Club3D info from short replies and having to dig back through the quoted messages.

From what you guys have been saying, it seems like custom ultrawide resolutions at 1:1 (letterboxed with ultra-black OLED bars) at "4K-ish" 3840x1600, 10-bit 4:4:4, 120 Hz with VRR are not possible off of a 3000 series GPU currently, so you have to resort to the Club3D adapter and its limitations, the worst of which is no VRR/G-Sync.
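(For anyone picturing the 1:1 letterbox case: a quick back-of-the-envelope check, my own arithmetic rather than anything from the posts, of how much of the CX's 2160-pixel-tall panel ends up as black bars when a 3840x1600 image is shown unscaled.)

```python
# Rough arithmetic for the 1:1 "letterboxed with OLED black bars" case described above.
panel_w, panel_h = 3840, 2160      # native LG CX resolution
img_w, img_h = 3840, 1600          # custom ultrawide resolution shown 1:1

bar_total = panel_h - img_h        # vertical pixels left unused
bar_each = bar_total // 2          # split between top and bottom bars

print(f"Black bar height: {bar_each} px top and bottom "
      f"({bar_total / panel_h:.0%} of the panel stays black)")
# -> Black bar height: 280 px top and bottom (26% of the panel stays black)
```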
 

That's correct, and it's not even custom resolutions that don't work straight from HDMI 2.1 on the 3xxx series cards; it's the custom refresh rates that seem to have the issue.

And you are correct: even though there is a lot to love about the CX 48, I am only interested in running 3840x1600 on PC. There are too many remaining caveats for me at the moment, even with the adapter, where I got 3840x1600 @ 100 Hz working.
 
This is silly. So 120 Hz ultrawide works on the adapter but not actual HDMI 2.1? Jeez. They need to update the adapter to do G-Sync, then maybe I'll pick it up.
 

I'm not interested only in that resolution, but it would be a nice feature, so I hope they implement it in firmware fixes in the long run. It's not a total deal breaker for me personally, but it would be good for some racing games with my wheel and for a few other games, both for the frame rate increase and for the larger slice of the game world you can see. With such a large monitor it would be a very useful option.


I posted a pic of the rig on page 105, Sept 21st. I don't want to take up more real estate on the board posting it again.

BTW, my left CX 48 is not lagging any more. Somehow "Game" mode got turned off. This setup is rocking for me now at 7680x1440. I can't wait to upgrade to an RTX 3090 to really get it going, and try triple 4K. My SLI Pascal Titan XPs can't perform well enough for me there.

When an HDMI monitor flashes, it can lose sync. I think it's something to do with HDMI + HDCP (copy protection) handshaking. Anyway, if you lose sync the monitor can go back to its defaults, since it treats it as if it is seeing the source for the first time again. When I occasionally had to hot-plug one Samsung HDMI TV's plug out of and back into its DisplayPort adapter due to blinking on wakeup, it would default back to the "HDMI" named input instead of "PC". I could tell immediately since it dropped RGB (which is 4:4:4). My text became tattered until I went back into the TV's OSD and set it back to PC (which turns the RGB signal back on).

Lately I've switched one of my two Samsung 60 Hz 4K VA TVs to the HDMI port instead of using two DisplayPort adapters. I also started using blank screensavers on the TVs and a space star-field wormhole type one on my gaming monitor in the middle. I hit a hotkey on my Stream Deck to minimize all windows and then another to start the screensaver. Then I'll turn the TVs off with their remote (one remote works for both since it's the same TV). Unfortunately I have to get up and walk over to my gaming monitor to turn it off. To be clear, I'm not putting my PC itself to sleep, just turning off the monitors. You could enable a lot of power saving on USB and HDDs, network devices, even fan profiles, etc. if you had to conserve power for some reason, while still keeping the screensaver running and keeping the power option for turning off monitors set to "never".
When I return, I turn on the monitors first with the screensaver still running, then move my mouse to switch out of screensaver mode after the TVs are completely "booted up". Between that and the HDMI port on one of the TVs instead of two DisplayPort adapters, I haven't had any blinking or no-signal issues for the past few days. I have been away at work a lot the last few days, however, so I'll keep you posted.
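For what it's worth, the "minimize everything, start the screensaver, blank the panels" routine can also be scripted on Windows instead of (or alongside) Stream Deck hotkeys. A rough sketch using only ctypes; SC_SCREENSAVE and SC_MONITORPOWER are standard Win32 system commands, though how displays react to the power-off message can vary by driver:

```python
import ctypes
import time

user32 = ctypes.windll.user32

HWND_BROADCAST  = 0xFFFF
WM_SYSCOMMAND   = 0x0112
SC_SCREENSAVE   = 0xF140   # start the configured screensaver
SC_MONITORPOWER = 0xF170   # lParam 2 = power the displays off

VK_LWIN, VK_M = 0x5B, 0x4D
KEYEVENTF_KEYUP = 0x0002

def minimize_all():
    # Simulate Win+M to minimize all open windows.
    user32.keybd_event(VK_LWIN, 0, 0, 0)
    user32.keybd_event(VK_M, 0, 0, 0)
    user32.keybd_event(VK_M, 0, KEYEVENTF_KEYUP, 0)
    user32.keybd_event(VK_LWIN, 0, KEYEVENTF_KEYUP, 0)

def start_screensaver():
    user32.SendMessageW(HWND_BROADCAST, WM_SYSCOMMAND, SC_SCREENSAVE, 0)

def displays_off():
    user32.SendMessageW(HWND_BROADCAST, WM_SYSCOMMAND, SC_MONITORPOWER, 2)

if __name__ == "__main__":
    minimize_all()
    time.sleep(1)
    start_screensaver()   # or displays_off() to blank the panels instead
```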

I also want to let you know that I've learned that the Gigabyte Aorus 3090 line of GPUs is going to have three HDMI 2.1 ports. I'm hoping that their Aorus Waterforce AIO cards will have 3x HDMI 2.1 ports too. I'm going to keep an eye on those as they become available, since I'll be using three HDMI TVs, even though it's not for the same usage scenario as yours. That way I wouldn't have to use any DisplayPort adapters at all. Incidentally, Gigabyte has a 3-year warranty on their GPUs as well, which is encouraging, but I don't know anything about how good their support system is. My last two GPU purchases, going back to 2014, were EVGA. That was dual GPU for SLI both times, but this time I'd rather just get a 3090, especially since NVIDIA is openly ditching SLI now and since the DLSS performance increase is a thing. Now, instead of looking for games that support SLI, I'll have to look for games with DLSS support. :rolleyes:
 
I have a few more data points to add, hopefully this testing is useful for you all. I currently have a unicorn 3080 FE (newest drivers) paired with a 48" CX that I manually updated with USB from the Korean firmware link.

Good news and bad news. Good: resized resolutions do work! Bad: it's a workaround for custom resolutions, and only 8-bit applies, not 10-bit.

Instead of clicking "customize" for custom resolutions in Nvidia control panel (which didn't work for me, just goes to black screen), choose 3840x2160 at 120hz at 8 bit. Then click on "Adjust desktop size and position." Under Scaling tab choose "No scaling" and "Perform scaling on:" GPU. Then click on the Size tab and "Enable desktop resizing" and then "Resize." I manually moved the sliders to 3840x1620 (it wouldn't allow for less than 1620).

3840x1620 appears to work just fine with a combination of 120 Hz, HDR, G-Sync and 4:4:4 chroma. However, it won't work for me in 10-bit; it needs to be 8-bit. The LG HDMI diagnostic menu reports 3840x2160p, 120 Hz, RGB, 8-bit.

2560x1440 at 120hz also only works for me at 8 bit. If I choose 10 bit, the diagnostic menu reports dropping down to 60hz. Using the resize trick, I can make 2560x1440 120hz 8 bit work as an ~32" window in the center with "no scaling." That's a great option for competitive FPS games.

There are clearly still many bugs in this firmware, but it's a huge step in the right direction. Perhaps CRU fiddling could fix much of this? I'm not an expert there. Hopefully LG engineers will continue their recent fast pace and issue more updates shortly.
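Not part of the NVCP procedure above, but if you want to sanity-check which resolution/refresh combinations the driver is actually exposing to Windows before reaching for CRU, something like this ctypes sketch around EnumDisplaySettingsW will list them (it only reports resolution and refresh rate, not chroma or bit depth):

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DEVMODEW(ctypes.Structure):
    # Display-device layout of the Win32 DEVMODEW structure (220 bytes).
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

def list_modes(device=None):
    """Yield (width, height, refresh_hz) for every mode the driver reports."""
    i = 0
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    while user32.EnumDisplaySettingsW(device, i, ctypes.byref(dm)):
        yield dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency
        i += 1

if __name__ == "__main__":
    # None = primary display; pass e.g. "\\\\.\\DISPLAY2" for another output.
    for w, h, hz in sorted(set(list_modes())):
        print(f"{w}x{h} @ {hz} Hz")
```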
 

Given that it doesn't work the "normal" way but does with GPU scaling, the problems are entirely in Nvidia drivers/NVCP, not the firmware. What GPU scaling literally does is send a "normal" 4K signal to the TV (which we know works at 4K 10-bit RGB 120 Hz on 30 series cards). I think the deal is that there have been some really kludgey things done in Nvidia drivers and/or NVCP to support things specifically for LG OLEDs over HDMI 2.1 (as the only displays currently available that support signals of this bandwidth), and people are finding broken edge cases in them. Hopefully Nvidia cleans them up for the people interested in custom resolutions.
 
No problem.
Another update on my particular setup that might be relevant to other displayport adapters on hdmi ports.

Framing in quotes for brevity:
I had a TCL S405 and a Samsung NU6900, both off of "regular" active HDMI 2.0b (4K UHD HDR 60 Hz capable) DisplayPort adapters, along with an LG VA G-Sync gaming monitor on DisplayPort to DisplayPort.

I never put my system itself to sleep, but I would use power saving on the screens themselves so that they would go into standby after a while. Sometimes when I would wake the screens from standby, one or the other would blink repeatedly before initializing to a solid signal. Rarely, they would end up showing "no signal" at all after the blinking period. Also rarely, when waking from standby, one or the other would have "no signal" right off the bat without the blinking. If I was stuck in a blinking loop or had no signal, I would hot-unplug and reconnect the HDMI cable from the end of the DisplayPort adapter of the TV with the problematic signal, and it would then initialize properly. I'm assuming this is an HDMI (+HDCP?) handshake issue of some kind due to the adapter.

Eventually my TCL's backlight died on one side, after around 1.5 years I guess, so recently I replaced it, adding a second Samsung NU6900 to the array. Last week I swapped the "old" Samsung's HDMI cable to connect directly to the HDMI port on the GPU (1080 Ti for now). Today my DisplayPort-adapter-connected Samsung was blinking on wakeup, but the HDMI-to-HDMI one was solid. The DisplayPort one blinked 12 to 15 times and then ended up solid. After these issues resolve one way or another, there is never any other problem with the displays during a session, until the screens go into standby again, where it becomes another roll of the dice. It was always one display or the other, never both, when both were on DisplayPort adapters. So far I've not seen the HDMI-to-HDMI connected Samsung blink, and I don't believe it should, as I'm pretty sure it's HDMI being finicky about handshaking/initialization.

I'm not ignoring the fact that this issue may have contributed to the shorter lifespan of the TCL S405's backlight on what was likely a slightly cheaper build/screen. I've heard that the Gigabyte Aorus 3090 cards will have three HDMI 2.1 ports on them, so I'm looking at those as my target card, even though they will probably carry a premium price.

I'll keep updating on the state of the signals, and I should eventually be able to upgrade to a 3x HDMI Aorus card and an LG 48CX in the coming months: hopefully November for the OLED, and whenever I can find stock of the 3x HDMI Aorus card at retail price.
 

Ok, all good to know!

I am still having trouble when waking my TVs from sleep since I switched to 7680x1440 spanned for SLI. Before, with all of them set to 1440p individually, I could turn on the TVs first and then wake my PC, and I was able to maintain my desktop resolution. But now, when waking in an SLI span using the same procedure, I am suddenly in a low spanned res (I forget what).

I then have to open NVCP, set it to spanned 4K, then add 7680x1440 back into the resolution options and enable it. Such a pain in the butt. But as you say, I am hesitant to not let these TVs power down. I would rather go through this hassle than burn them out by preventing their power-down.

Great news about the 3090 and 3 HDMI ports. I think this will alleviate the issues I am having. But I also need to update my Club 3D adapters. Maybe that will do something.

EDIT: I just updated the firmware on the three CX's also. Will see if that makes any difference!
 
Okay.. so I'm about to pull the trigger on a 55" LG CX for 1200 EUR.

Can someone please talk me out of it? (Coming from a 43" Philips HDR1000 VA Monitor)
 
Before I pull the trigger - 2 last things:

1) can I use the monitor without the stand AND without a wall mount? Will the panel be strong enough to not break if I place it directly on the desk and lean the backside against the wall? I use my current monitor like this but it has a panel depth of about 1.5cm.. I seriously need the extra depth as 48" would be borderline and 55" will be over the top for sure.. (60cm deep desk, leaning back on my office chair with feet on the table for about 1.2m depth currently). But it's also my main driver for doing productive work so I end up at about 70-80cm when coding..

2) could anyone try CRU and check if the G-Sync/FreeSync range can be expanded into the 30-40ish range? My 43" Momentum goes down to 35 Hz G-Sync Compatible without issues, and it will be weeks if not months before I can get my hands on a 3080 or RDNA2..
 

Please don't do #1. It's gonna fall sooner or later for sure, and the upper half of the panel is literally ~3mm thick. People have posted floor stands that are relatively inexpensive and will get the job you want done safely.
 

It won't look as sharp as the 48" LG CX model as that has a pixel density of 91.79 PPI vs 80.11 PPI on the 55" LG CX which is similar to a 27" 1080p monitor.
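For reference, those PPI figures fall straight out of the panel diagonal and the 3840x2160 grid (my own arithmetic, shown here just to make the comparison concrete):

```python
from math import hypot

def ppi(diag_inches, horiz_px=3840, vert_px=2160):
    """Pixels per inch of a 16:9 4K panel with the given diagonal."""
    return hypot(horiz_px, vert_px) / diag_inches

for size in (48, 55):
    print(f'{size}": {ppi(size):.2f} PPI')
# 48": 91.79 PPI
# 55": 80.11 PPI

print(f'27" 1080p: {hypot(1920, 1080) / 27:.2f} PPI')   # ~81.59, close to the 55" CX
```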
 
but you'll also look at it from further away...
 
Please don't do #1. It's gonna fall sooner or later for sure, and the upper half of the panel is literally ~3mm thick. People have posted floor stands that are relatively inexpensive and will get the job you want done safely.

Feared as much. I'd anchor it on the wall securely so it doesn't fall over, but I'd still need to place it on the desk.. I'm not too worried about the bottom edge as I'd wrap it with a door seal - I'm more afraid of potential tensions on the panel..

but you'll also look at it from further away...

Yeah, not necessarily lol.

That being said: 80ppi at 60-80cm doesn't scare me too much.. I've been a hardcore simracer for the past 2 decades and had been racing on triple 40" 1080p at 70cm distance before VR gained any traction.. My everyday TV setup also is a Sony HW65 1080p projector on a 2.8m screen at 3m seating distance..

I would have preferred the 48" screen, but they're scary expensive in my region compared to the 55" (1600+ EUR for the 48" and ~1200 EUR for the 55") and the former is about as available as the RTX30 unicorn cards..

The CX48 would be a perfect fit for desktop gaming (the 43" I have now is just a tad too small for my liking) - but as I said: expensive and barely available..
 

Absolutely not. It won't be in any way stable, even though the weight is mostly in the bottom portion of the screen. I would also not put it on a 60cm deep desk, because even an 80cm one is just barely good enough for my CX 48. If you want to buy this, buy a floor stand that can mount a TV. The stock stand is not going to be good for things like putting it slightly past the edge of the table.
 

Thank you, that insight is very much appreciated!

Regarding depth: I was sceptical when I got my current 43" monitor because of the 60cm desk, but I got used to it pretty fast and it's actually comfortable - even for productive work. The only problem was the height of the monitor at first but after removing the stand and putting it point blank on the desk that problem went away.

I also mostly use it in a lean-back relax position on my rocker chair and that distance is enough (~120cm) that I can't differentiate between 1440p upscaled and 2160p native.

I'm guessing the 55" will be borderline usable in my scenario but I can't justify spending 400+ EUR more on the 48"..
 

With the 55" you would want even more space between you and the table. 55" is massively larger than 43". Again a floor stand with or without wheels or a wall mount will be the best way to mount this so maybe spend some of the money saved on that.

PS. If somebody is already using a floor stand for something similar, do you have any recommendations? I kinda want to get one too so I can push mine a bit further than my monitor arm allows.
 
The super thin bezel actually extends below the bottom of the rear chassis. Without the base or a wall mount, the bezel itself will bear the load. NOT a good idea.
 
With the 55" you would want even more space between you and the table. 55" is massively larger than 43". Again a floor stand with or without wheels or a wall mount will be the best way to mount this so maybe spend some of the money saved on that.

PS. If somebody is already using a floor stand for something similar, do you have any recommendations? I kinda want to get one too so I can push mine a bit further than my monitor arm allows.

I and some other people in this thread are using a Fitueyes stand. It's decent and rather well built and can be found in both Europe and America at least. There might be better options though, especially because this one has rather limited height adjustment. Basically the lowest setting is a bit high still and you'll probably want to use only the 2 lowest screws (even though it's supposed to be 4 holes) for desktop use.
 

Yeah, I just visited a store to see that in person.. That super-thin bezel trend is freaking annoying, to say the least. What was wrong with a 1cm panel depth.. it's always been the borders that annoyed people, not the depth.. especially if you have a 5cm deep box mounted on the backside... Those super-thin panels are more fragile than they're worth imho..

PS: the recommended seating distance for 4k 16:9 material for maximum immersion and detail actually is height × 0.75. Would be 45cm for the 48" and 51cm for the 55" and I don't disagree with that tbh. I already use a 1:1 ratio on our projector setup and it's working just fine *for me*. Yeah, I'm a freak in that regard but the longer I think about it, the more sense the 55" makes for me.

PPS: yeah, those values above aren't going to be comfortable with a light-emitting screen. But with a relaxed seating position I'm at more than double those values..
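Sanity-checking the numbers quoted above with the height x 0.75 rule as stated (my own arithmetic; whether that rule is a sensible target is exactly what gets debated below):

```python
from math import hypot

def panel_height_cm(diag_inches, aspect_w=16, aspect_h=9):
    """Physical height of a 16:9 panel, from its diagonal in inches."""
    height_in = diag_inches * aspect_h / hypot(aspect_w, aspect_h)
    return height_in * 2.54

for size in (48, 55):
    h = panel_height_cm(size)
    print(f'{size}": height {h:.1f} cm -> 0.75 x height = {0.75 * h:.0f} cm')
# 48": height 59.8 cm -> 0.75 x height = 45 cm
# 55": height 68.5 cm -> 0.75 x height = 51 cm
```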
 
45cm for 48" what? That'd have you craning your neck constantly and the pixels would be huge and everything would look pretty gross.

https://stari.co/tv-monitor-viewing-distance-calculator

The ideal for most people is about 1m for this TV. You can go down to 76cm if you have less than stellar vision (I do, but less than 1m is very uncomfortable for me when gaming).
 
Really depends on who you ask:
https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship

Rtings has 55" 4k at 0.98m optimal distance at 4k (which agrees with your standpoint). I'd second that for *comfortable* viewing and a cinematic experience.

To be clear: I wouldn't want (like ever) do productive work on a 55" at that close of a distance like I mentioned above. But I do believe that it's the distance where you'd get the most immersion out of. Is it practiable: hell no. Is it doable: yeah, maybe - for some special nutjobs like me.
 
This is what I've found from using various sizes of monitors over the years...
...


With my 43" 4k screens now at around 4' away (in preparation for a 48" CX later) - I've had to increase the scaling beyond 1:1 to 125% ..


I thought this might be of interest... though it is specifically for movie watching, not for PC use (especially a multi-monitor array).

http://www.acousticfrontiers.com/2013314viewing-angles/

New cinemas built to THX specifications have a minimum viewing angle of 36 degrees from the last row of seats.

The viewing angle 'sweet spot' seems to be around 45-50 degrees, where SMPTE, THX and 20th Century Fox recommendations converge. This matches quite closely with CEDIA's 43 degree viewing angle recommendation for 2.4:1 'Cinemascope' content as per CEB-23. For reference, 43 degrees is 3x picture height using a 2.35:1 screen.

Depending on where you like to sit in a commercial theater you might have a viewing angle of anywhere from 36 to 60 degrees. Personal preference is therefore an important factor and should be a prime consideration when laying out your home theater.
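The geometry behind those figures is easy to verify: the horizontal viewing angle of a flat screen is 2*atan(width / (2*distance)). A quick check of the "43 degrees is 3x picture height on a 2.35:1 screen" claim, plus a rough number for a 48" CX at about 1m (my own arithmetic; the ~106cm panel width is derived from the 48" 16:9 diagonal):

```python
from math import atan, degrees

def viewing_angle_deg(screen_width, distance):
    """Horizontal viewing angle subtended by a flat screen (same units for both)."""
    return degrees(2 * atan(screen_width / (2 * distance)))

# "43 degrees is 3x picture height using a 2.35:1 screen":
h = 1.0                                              # arbitrary picture height
print(f"{viewing_angle_deg(2.35 * h, 3 * h):.1f}")   # ~42.8 degrees

# 48" 16:9 panel (width ~106 cm) at roughly 1 m:
print(f"{viewing_angle_deg(106, 100):.0f}")          # ~56 degrees, near the upper end of the range
```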

 
Is it me or is BFI mode Borked with this driver update?

I had a 3080 and just traded up to a 3090.....and cannot enable BFI mode to save my life.

I had ZERO issues setting BFI mode on my 2080ti.

Also, just like my 3080.....I am having issues getting gameplay smooth with the 3090.
It feels like a minor microstutter is there when I try to play BFV, Starwars Squadrons, Pub-Oh-Geez....

Grrrrrr......

I am running a brand new EVGA 1000W PSU with this.
 

Do you have G-Sync disabled in NVCP? You could disable instant game response on the TV too.
 

Yes, IGR disabled and G-Sync disabled.

I can only get BFI working at 4K60 RGB or 4K120 4:2:0 limited.

I cannot get BFI working at 4K120 full 8-bit or 10-bit RGB.

Anybody else have this issue?

Also, games just lack smoothness... they were much smoother on my 1080 Ti, and the only things I changed in my system were going from a 650W Corsair PSU to a 1000W EVGA PSU, and from the 1080 Ti to the 3080 and now the 3090.
 

Your games lack smoothness using G-Sync? This is concerning. I (supposedly) have a 3090 coming in next week myself.
 
Man, this is one long-ass thread. All you guys have me super tempted to get the 48" CX. I have decent desk space that is 32" deep, so I figure that should be fine. I should have room width-wise for it now that I have replaced my monitor speakers with the Creative Katana (54" from the edge to where my computer sits). I have placed an order for an Acer Predator CG437K for $999 (backordered a couple weeks); my plan was to get it and an HP Reverb G2 VR headset for the same money as the CX. I figured I'd sacrifice some PQ with the Acer so I could have both, but part of me still wants to get the CX. I have not noticed ghosting on my high refresh VA monitors, so I do not think that will be an issue. That said, should I cancel the Acer/HP orders and just get the CX, and maybe get the VR headset next year? The last of my gaming upgrade budget is tied up in getting a new AMD/Nvidia card to power the 4K goodness once I see what AMD has to offer in a few more weeks.
 
Games on my 2080 Ti run as smooth as can be. The 3000 series just can't stop sucking.
 