
LG 48CX

Those recommended settings on reddit say to turn off AMD FreeSync Premium if using an Nvidia GPU.

excerpt from the settings I put in quotes in my previous comment:

LG TV
=========

PC mode: On
HDMI Ultra Deep Color: On
Reduce Blue Light/Eye Comfort Mode: off
Aspect Ratio: original
Just Scan: On
Energy Saving Step: Off
AMD Freesync Premium: Off (or AMD/XSX = on)
[CX] Instant Game Response (ALLM+VRR) = on
[C1/C2] VRR & G-SYNC = on
[C1/C2] Game Optimizer (ALLM): on
[C1/C2] Game Genre: Standard
Fine Tune Dark Areas: 0 or (-3 to -5)


I don't know if it would have a detrimental effect or matter, but that write-up of recommended settings is pretty thorough and has been reviewed and honed by comments from others in the thread. The recommended settings say to turn it off when using an Nvidia GPU and on for AMD / Xbox Series X. According to the footnotes there:

"Enabling AMD FreeSync Premium will force XSX/Adrenalin to use FreeSync over HDMi instead of HDMI Forum VRR."


------------------

Named PC hdmi input
=====================

You have to go into the home settings and edit the named input ICON itself, not just the name of the HDMI input, to PC in order to get RGB/4:4:4 chroma. If the HDMI cable ever loses connection, the input can revert back out of PC mode, which will compromise text (making it look tattered on the edges) and, to a lesser degree, graphics fidelity, so keep an eye out for that.
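To illustrate why the PC icon / RGB 4:4:4 thing matters (this is just my own rough sketch of simple pair-averaged 4:2:2, not LG's actual processing): outside PC mode the chroma gets subsampled, so thin colored strokes like text end up sharing chroma with neighboring pixels, which is the fringed/tattered look.

    # Rough illustration of 4:2:2 chroma subsampling vs full 4:4:4 (PC mode).
    # Assumes a simple pair-average encode and sample-repeat decode.
    import numpy as np

    def rgb_to_ycbcr(rgb):
        # BT.709-style conversion (approximate, full range)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b
        cb = (b - y) / 1.8556
        cr = (r - y) / 1.5748
        return y, cb, cr

    def subsample_422(c):
        # Average each horizontal pair of chroma samples, then repeat them
        # back up to full width (what a basic 4:2:2 encode/decode does).
        h, w = c.shape
        return np.repeat(c.reshape(h, w // 2, 2).mean(axis=2), 2, axis=1)

    # One row of pixels: red text strokes on a white background
    row = np.array([[[1, 1, 1], [1, 0, 0], [1, 1, 1], [1, 0, 0], [1, 1, 1], [1, 1, 1]]], float)
    y, cb, cr = rgb_to_ycbcr(row)
    print(np.round(cr, 2))                  # 4:4:4: [0, 0.5, 0, 0.5, 0, 0] - sharp edges
    print(np.round(subsample_422(cr), 2))   # 4:2:2: [0.25, 0.25, 0.25, 0.25, 0, 0] - strokes
                                            # and the white pixels next to them now share
                                            # a washed-out chroma value

Luma stays full resolution either way, which is why the damage shows up as color fringing on edges rather than overall blur.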

From RTings youtube vid " ... /FtJnLPxVtjI?t=775 "

[image: vCRvR7D.png]


--------------------------

Picture Mode:
==============

[image: LG-CX_game-mode-menu_1.jpg]


As for the picture mode: Game Optimizer or HDR Game Optimizer (C1), or Game mode (CX), removes processing so that you get the least input lag -

- but to me, as long as you have interpolation and any other enhancement settings manually turned off in a mode, the other picture modes are usable in games. The most likely scenario is using most of the other modes for movies, videos, and a dim mode, so I'd only zero out one other mode's extended features and customize it as an alternate "game mode" for testing settings.
Instant Game Response should override other modes anyway, swapping to the actual game mode automatically when a game starts and back when you exit, if you have it enabled on the HDMI input (see the image below). I use HDMI eARC out, which I think is still incompatible with Instant Game Response's HDMI auto-sensing, so most of the time I keep my PC rig's 48CX HDMI input on game mode manually and leave it there. If I want to watch movies/videos fullscreen rather than in a desktop window, I switch to my Nvidia Shield HDMI input via the remote, with non-game picture modes used on that input. You can also use webOS for YouTube, Netflix, Amazon Video, Emby/Plex, etc. I think webOS is treated as a separate input, like a whole new HDMI input with its own picture mode settings.

Any of the named picture modes can be customized, so you can make them your own for whatever kind of material you want them used for. They are just a good starting point, but you can zero every single setting within any of them and start over if you wanted to. For example, I changed a lot of the Vivid settings so it's not extreme or dynamic; I use Vivid mode for anime/cel-shaded video material that I still want to have a little more pop/saturation, along with different motion interpolation settings for its low-frame-rate animation (which otherwise can cause some weird artifacts). I use APS as a dim-screen usage scenario and a few other picture modes for movies, bright room, and dark room. Note that the HDR and DV picture modes won't show up for editing unless you are running that kind of material on the TV at the time.

Some people complain that game mode is dull and lifeless (unsaturated color-wise). I just turn up the color saturation in the extended color settings of that game picture mode considerably to make it more vivid [while in the Game picture mode: Picture -> Picture Mode Settings (press the pointer on the heading) -> Color slider]. I can then adjust it further, up or down, on a per-game basis using ReShade or Nvidia Freestyle until it looks great for that particular game's style, so I keep the OSD's game mode color setting slightly high as a starting point. That is for SDR games. HDR/DV has its own scale of color volume that kicks in automatically (and is well saturated and luminous), so it's not necessary for HDR games; my color setting adjustment is ignored by them, since the TV switches to a different HDR gaming picture mode automatically when playing those games.
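For what it's worth, the Color slider and the vibrance-type filters in ReShade/Freestyle are all doing roughly the same per-pixel operation. A minimal sketch of that math (my own illustration, not the actual shader code from either tool):

    # Saturation adjustment: push each pixel away from (or toward) its own luma.
    # amount = 1.0 leaves the image alone, > 1 saturates, < 1 desaturates.
    import numpy as np

    def adjust_saturation(rgb, amount):
        luma = (rgb @ np.array([0.2126, 0.7152, 0.0722]))[..., None]  # BT.709 luma
        return np.clip(luma + (rgb - luma) * amount, 0.0, 1.0)

    pixel = np.array([[0.60, 0.45, 0.40]])   # a dull, slightly reddish color
    print(adjust_saturation(pixel, 1.3))     # same brightness, noticeably more color

Because the luma term stays put and only the color difference gets scaled, it barely changes brightness, which is why bumping the Color slider a bit is relatively benign compared to messing with brightness/contrast.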



------------------------------

From RTings review of the C1:
"
The LG C1 has a very low input lag as long as it's in 'Game Optimizer' mode. For low input lag with chroma 4:4:4, the input icon has to be changed to 'PC'.

There's a new setting for 2021 models found in the Game Optimizer menu, called Prevent Input Delay. There are two options: 'Standard' and 'Boost'. We ran several input lag tests and found that the 'Boost' setting consistently lowers the input lag by about 3 ms when the TV is running 60Hz compared to the LG CX OLED. It works by sending a 120Hz signal to refresh the screen more frequently, meaning it doesn't affect 120Hz input lag. The published results are what we measured using the 'Boost' setting. On 'Standard', we measured 13.1 ms for 1080p @ 60Hz, 13.4 ms for 1440p @ 60Hz, and 13.0 ms for 4k @ 60Hz."

<Edit by elvn>: According to HDTVtest videos you shouldn't enable the boost settings unless using 60hz games/consoles because it can actually make the lag worse for 120hz gaming. It's basically duplicating the 60fpsHz frames into the 120fpsHz range.

Rtings on CX:

"The LG CX Series has really low input lag in Game Mode, and it stays low with VRR enabled, which is great for gaming. It also stays low with a 4k resolution, which makes it a good choice for Xbox One X or PS4 Pro owners."

1080p with VRR: 5.9 ms
1440p with VRR: 6.2 ms
4k with VRR: 5.9 ms

Instant Game Response (https://www.rtings.com/tv/reviews/lg/cx-oled/settings )

[image: LG_CX_2_instant-game-response_cropped-1.jpg]




-----------------------------------------------------


HGiG / CRU
===========

"HGiG tone maps to actual max TML of panel and will most accurately display a tone mapping curve set in-game or on system level."

"For Windows 11, you can use CRU to set tone mapping curve [Edit.. CTA-861 > HDR Static Metadata], otherwise Windows uses maxTML/maxFFTML/minTML 1499/999/0 nits. Quickly toggle HDR in Windows 11 using Win+Alt+B. "

Auto HDR (Windows 11/XSX) converts applicable SDR games to HDR10. (edit by elvn: sort-of. It ends up with a % of the highlights, not full HDR range)

DTM Off tone maps to maxTML 4000nits.
DTM On tone maps on a per-frame basis. <---- (edit by elvn: regular DTM, unlike HGiG, is guesswork/interpolation, so it does this inaccurately, causing greater compression at the top end, which loses color gradations/details and can cause side effects, perhaps even banding.)

Dolby Vision for gaming enables native DV games and converts all HDR10 games to DV; set the in-game maxTML to 1000 nits.
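A rough sketch of what those targets mean in practice on an ~800-nit CX. The knee and curve numbers here are made up purely for illustration; LG's real tone mapping curves are different:

    # HGiG vs "DTM Off" on an ~800-nit panel (toy curves, not LG's actual ones).
    PANEL_PEAK = 800.0

    def hgig(scene_nits):
        # HGiG: trust the source's own tone mapping and just clip at the panel peak.
        return min(scene_nits, PANEL_PEAK)

    def dtm_off(scene_nits, knee=400.0, assumed_peak=4000.0):
        # Static tone map that assumes 4000-nit content: everything above the
        # knee gets squeezed into the remaining panel headroom.
        if scene_nits <= knee:
            return scene_nits
        return knee + (scene_nits - knee) * (PANEL_PEAK - knee) / (assumed_peak - knee)

    for nits in (300, 600, 750, 800):    # highlights from a game mapped to 800 nits
        print(nits, round(hgig(nits)), round(dtm_off(nits)))
    # HGiG shows the 600 and 750 nit highlights as-is; the 4000-nit curve lands
    # them at roughly 422 and 439 nits, only ~17 nits apart instead of 150, which
    # is the compressed / lost top-end gradation mentioned in the DTM note above.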

[image: LG_CX_2_HGiG_medium.jpg]


That reddit recommended-settings guide's readout after the CRU edit to 800 nits on Windows 11:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/

 
I just started Horizon Forbidden West and followed this guide:



But honestly I am not liking the picture quality at all using these settings. This is reminding me of just how much of a crapshoot HDR gaming can be...it shouldn't take an expert using some fancy monitor with heat maps or w/e to determine luminance values and how to get a good HDR picture. Does anyone have any advice on making the HDR on HFW look great? Even if it's not the most ACCURATE picture, I'll take something that looks better and more inaccurate vs something that looks more dull and accurate.
 
From what I understand, usually the problem is that the peak nits value is wrong, so the whole scale the game is mapping to is out of sync. Elden Ring lets you set the peak white and then adjust the brightness slider separately, so it seems a lot more flexible between my screens. I set the peak to 800 nits for my OLED and 500 for my 500-nit IPS laptop.

The peak nit value defining the scale being wrong can end up with a really bad result: at worst either dull or white-washed out, at best compressed and lost color detail at the top end. That's why monstieur kept talking about using HGiG and using CRU to set the peak yourself for your GPU and that particular monitor, so that HDR and Auto HDR have the correct HDR range mapped, e.g. 800 nits manually for the CX instead of Windows defaulting to 1400 or 4000 nits.
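A toy example (made-up game logic, just to show the mismatch) of why the reported peak matters so much:

    # A game maps its brightest highlight to whatever peak it's told the display has.
    PANEL_PEAK = 800                        # what a CX can actually show

    def shown_on_hgig(highlight_fraction, reported_peak):
        asked = highlight_fraction * reported_peak   # nits the game asks for
        return min(asked, PANEL_PEAK)                # an HGiG panel just clips the rest

    for reported in (1499, 800):            # Windows default report vs CRU-edited
        print(reported, shown_on_hgig(0.6, reported), shown_on_hgig(1.0, reported))
    # With the 1499-nit report, everything the game places between 800 and 1499
    # nits lands on the same clipped 800-nit white; with CRU set to 800 the
    # game's scale matches the panel and those highlights keep their gradations.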

For SDR gaming, I personally go into the color settings of the game picture mode and turn the color slider up until it's more saturated. Then in SDR games I use ReShade or Freestyle to adjust it from there. HDR has its own game picture mode with extended settings you could bump up too. I haven't found the need to do that for HDR games, but I wouldn't be opposed to trying it if I had a problem with a game being under-saturated. Personally I wouldn't just leap to regular DTM mode, because it seems more inaccurate than bumping up the color (saturation) slider a little.

[image: NWuFr2P.png]


------------------------------------

Here is the CRU info again from that settings thread I posted a few posts back. Use at your own risk though:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/

[image: 618274_LG-CX_CRU.HDR.ToneMap.settings_1.jpg]




And the result:

[image: 618228_DOxvZ0C.png]
 
CRU isn't going to do anything here because Horizon Forbidden West is a PS5 game lol. Unless it somehow modifies my actual TV for ps5 use?
 
Ah, I know Horizon was released on PC. I didn't realize the expansion wasn't. I don't have a PS5. You could still try bumping up the HDR Game Picture Mode's color slider a little for that game and see how it looks.

[image: 618610_NWuFr2P.png]




Worst case turn on dynamic color or DTM or something but I'd do that as a last resort.

The notes at the bottom of that guide have this for PS5 but there are overall TV OSD settings in the main chart too.

PS5


  • VIDEO OUTPUT SETTINGS
    • VRR [TBD]
    • Resolution: Auto
    • 4K Video Transfer Rate: Auto [0 (RGB) not supported at 4K120 due to 32 Gbps FRL; see the bandwidth check below the list.]
    • HDR: On When Supported
    • Adjust HDR: Adjust the tone mapping curve for HGiG. Ignore the console's instructions to make the icons "barely visible." Increase on screens 1/3 (maxFFTML) and 2/3 (maxTML) until the sunbursts clip, then stop. Decrease on screen 3/3 (minTML) to the lowest value.
    • Deep Color Output: Auto
    • RGB Range: Auto
    • Enable 120 Hz Output: Auto
  • AUDIO OUTPUT
    • Output Device: HDMI Device (TV)
    • HDMI Device Type: TV
    • Audio Format (Priority): Bitstream (Dolby/DTS) or Linear PCM
    • Number of channels: 5.1/7.1 ch [Choose for LPCM.]
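Quick sanity check on that "32 Gbps FRL" footnote under 4K Video Transfer Rate, using the standard CTA-861 4K120 timing and HDMI 2.1's 16b/18b FRL coding (my own math, not from the guide):

    # Why 4K120 RGB 10-bit doesn't fit in 32 Gbps of FRL bandwidth.
    pixel_clock = 4400 * 2250 * 120          # CTA-861 4K120 total timing, ~1.188 GHz
    bits_per_pixel = 3 * 10                  # RGB (4:4:4) at 10 bits per channel
    payload = pixel_clock * bits_per_pixel   # video data before link coding
    on_the_wire = payload * 18 / 16          # HDMI 2.1 FRL uses 16b/18b coding
    print(round(payload / 1e9, 1), round(on_the_wire / 1e9, 1))   # ~35.6, ~40.1 Gbps

So roughly 40 Gbps is needed on the wire for 4K120 RGB 10-bit, which is why the PS5's 32 Gbps limit forces it out of RGB (it falls back to YCbCr 4:2:2) at 4K120, as the bracketed note in the list says.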
 
I just started Horizon Forbidden West and followed this guide:



But honestly I am not liking the picture quality at all using these settings. This is reminding me of just how much of a crapshoot HDR gaming can be...it shouldn't take an expert using some fancy monitor with heat maps or w/e to determine luminance values and how to get a good HDR picture. Does anyone have any advice on making the HDR on HFW look great? Even if it's not the most ACCURATE picture, I'll take something that looks better and more inaccurate vs something that looks more dull and accurate.

I honestly just left the game's menus at default settings and simply turn "favor performance" mode on because 60 fps > 30 fps. To me the game looks great like that on my LG C9 in the living room.

Excellent game too. I have put off playing Elden Ring on PC because I've been having such a good time with Forbidden West and can wait for ER to get patched more.
 
I just started Horizon Forbidden West and followed this guide:



But honestly I am not liking the picture quality at all using these settings. This is reminding me of just how much of a crapshoot HDR gaming can be...it shouldn't take an expert using some fancy monitor with heat maps or w/e to determine luminance values and how to get a good HDR picture. Does anyone have any advice on making the HDR on HFW look great? Even if it's not the most ACCURATE picture, I'll take something that looks better and more inaccurate vs something that looks more dull and accurate.

I’ve been playing with Vincent’s settings since he released that video and to my eyes it looks good. I think part of the problem is HFW is so overblown at default settings that turning them down to Vincent’s settings looks muted, in comparison. I suggest playing with his settings for a while and inching it up from there, per what looks good to your eyes.
 
I’ve been playing with Vincent’s settings since he released that video and to my eyes it looks good. I think part of the problem is HFW is so overblown at default settings that turning them down to Vincent’s settings looks muted, in comparison. I suggest playing with his settings for a while and inching it up from there, per what looks good to your eyes.

I did some more digging around forums and youtube and found this video:



Both of the alternatives to using HGiG with -8/9 highlights and -2 brightness look better to me. It may not be the most accurate picture but I really don't give 2 hoots about accuracy in games, I just want the best looking picture to my eyes even if it's inaccurate (I always use reshade/color vibrance on PC games which already makes the colors inaccurate). I'm sticking with the DTM On option for now.
 
As bummed as I am about this still not being fixed, at least Nvidia put this on their "Open Issues" list for their latest drivers (512.15):
Club 3D CAC-1085 dongle limited to maximum resolution of 4K at 60Hz. [3542678]
Guess I'll need to leave my work partition stuck at 511.23 for now.
 
As bummed as I am about this still not being fixed, at least Nvidia put this on their "Open Issues" list for their latest drivers (512.15):
Club 3D CAC-1085 dongle limited to maximum resolution of 4K at 60Hz. [3542678]
Guess I'll need to leave my work partition stuck at 511.23 for now.
They've broken it again?!
 
5xx.xx series drivers are working fine with the Uptab DP to HDMI 2.1 adapter: 4K @ 120Hz, full RGB/4:4:4 chroma, 10-bit or forced 12-bit color, vanilla HDR, Auto HDR in Win11 and MPC-BE, etc. The only thing still missing is VRR, and I haven't missed it in a while.
 
5xx.xx series drivers are working fine with the Uptab DP to HDMI 2.1 adapter: 4K @ 120Hz, full RGB/4:4:4 chroma, 10-bit or forced 12-bit color, vanilla HDR, Auto HDR in Win11 and MPC-BE, etc. The only thing still missing is VRR, and I haven't missed it in a while.

I would love to have vrr. My highly OC'd 2080Ti can only do so much.
 
So therefore I do not have VRR.

You didn't say you would love to have VRR at 120Hz HDR, just that you would love to have VRR in general ;)

EDIT: Ah, I replied without reading your quoted post first, my bad. Anyway, with GPU availability improving a ton recently I'm sure you'll be able to get a 30 series card soon enough. 40 series cards will probably be easier to obtain too, I hope.
 
Hopefully Nvidia gimps crypto mining with the 4xxx series enough to make the cards unattractive to miners and creates a proper ordering queue to allow people to pickup these cards at MSRP and before the decade is out.

Based on the latest promo leaks, a low-end 4xxx should be able to rival 3080-level 4K performance, while a mid-tier 4xxx will easily outperform a 3090... here's hoping the TDP doesn't reach nuclear power plant levels.
Maybe it's time for a PCI-E spec revision or interface replacement.
 
Hopefully Nvidia gimps crypto mining with the 4xxx series enough to make the cards unattractive to miners and creates a proper ordering queue to allow people to pickup these cards at MSRP and before the decade is out.

Based on the latest promo leaks, a low-end 4xxx should be able to rival 3080-level 4K performance, while a mid-tier 4xxx will easily outperform a 3090... here's hoping the TDP doesn't reach nuclear power plant levels.
Maybe it's time for a PCI-E spec revision or interface replacement.

A PCI-E spec revision isn't going to change power requirements, if that's what you're insinuating.
The only thing that will change power requirements is finding new semiconductor materials and transistor designs to reduce hysteresis and leakage, or completely changing how we build processors altogether. Maybe move to light/photons?
It's all math, and the math requires lots of transistors to do it quickly at the scale needed for graphics work.

Anyway, yes, I hope to see some cheaper GPUs here soon. It's been two years since I even attempted to buy one.
 
Hopefully Nvidia gimps crypto mining with the 4xxx series enough to make the cards unattractive to miners and creates a proper ordering queue to allow people to pickup these cards at MSRP and before the decade is out.

Based on the latest promo leaks, a low-end 4xxx should be able to rival 3080-level 4K performance, while a mid-tier 4xxx will easily outperform a 3090... here's hoping the TDP doesn't reach nuclear power plant levels.
Maybe it's time for a PCI-E spec revision or interface replacement.

Wishing for a company to gimp a product's capabilities is insanity. It's never the right solution.
 
Yeah, I think it would be better if they had a 1-per-table kind of thing for the first 4-6 months after release, with a well-vetted registry of people + registered PC/phone/OS, GPUs sold direct from the manufacturer at MSRP, with an NDA-type contract (or leased/contracted from the manufacturer initially) so you couldn't resell it for 6-8 months, or some similar solution. But I think they went the other way and did bulk orders to mining operations, resellers to bulk mining operations, plus bots and scalpers. If people couldn't buy a car because others were running warehouses of them up on blocks, since that was the only viable way to generate mariocoins and WoW gold 24/7, I think the government might have to get involved at some point. But these aren't cars or furnaces or even air conditioners (which can all be life-essential), or houses... they're GPUs. That's not really in the scope of this thread either.
 
Wishing for a company to gimp a product's capabilities is insanity. It's never the right solution.

They've done it with the CGI/dev versions of their GPUs, the Quadros. They've also kept CUDA and G-Sync in-house software-wise so you couldn't run them on AMD. They also didn't allow some software/driver features or capabilities of their newer-gen GPUs to work on previous GPU generations, even when it wasn't a hardware limitation, if I'm remembering correctly. Windows OS generations do similar things.
 
They've done it with the CGI/dev versions of their GPUs, the Quadros. They've also kept CUDA and G-Sync in-house software-wise so you couldn't run them on AMD. They also didn't allow some software/driver features or capabilities of their newer-gen GPUs to work on previous GPU generations, even when it wasn't a hardware limitation, if I'm remembering correctly. Windows OS generations do similar things.

It's all bad. Pointing out terrible behavior in other instances doesn't make this the right way to go.
 
A PCI-E spec revision isn't going to change power requirements, if that's what you're insinuating.
The only thing that will change power requirements is finding new semiconductor materials and transistor designs to reduce hysteresis and leakage, or completely changing how we build processors altogether. Maybe move to light/photons?
It's all math, and the math requires lots of transistors to do it quickly at the scale needed for graphics work.

Anyway, yes, I hope to see some cheaper GPUs here soon. It's been two years since I even attempted to buy one.
Optical computing has the major downside of needing to generate Photons, which is *very* power intensive.
 
They've done it with the CGI/dev versions of their GPUs, the Quadros. They've also kept CUDA and G-Sync in-house software-wise so you couldn't run them on AMD. They also didn't allow some software/driver features or capabilities of their newer-gen GPUs to work on previous GPU generations, even when it wasn't a hardware limitation, if I'm remembering correctly. Windows OS generations do similar things.
In the case of CUDA (or any technology built on it, like PhysX), there's nothing preventing AMD from creating its own implementation by reverse-engineering the API. Or they could just pay NVIDIA a fee to license it.

As for Windows, it actually bends over backwards to try and maintain compatibility. Pretty much everything not a DOS executable or built on Win16 still runs, since all the necessary API calls still function. I've still got a very early Win95 game that runs fine on either Glide (via wrapper), OpenGL 1.1, or Direct3D 5; all modes work as expected. Generally, the only things that break rely on the compositor to act a certain way (Diablo is an excellent example), and those can generally be fixed with a few tweaks.
 
AMD does have its own CUDA-like thing, I think, but I believe the devs of creator suites focus more of their time/money on CUDA's benefits.

They have OpenCL, but they're also starting another open-source project:

AMD launches GPUFORT, an open-source attempt against NVIDIA's CUDA
https://www.notebookcheck.net/AMD-l...e-attempt-against-NVIDIA-s-CUDA.571133.0.html

From a 2018 reddit post about openCL
On an NVIDIA box I can download and install the CUDA SDK and be up and running with built-in Visual Studio integration in minutes. Host and device code can be in the same file. PTX instructions can be inlined or accessed via intrinsics. Not to mention excellent documentation and sample codes.
On the AMD side we have OpenCL, which is lacking in so many areas. The optimizer is poor. Device code cannot exist with host code. There is no access to AMD-specific features except through extensions.
Or ROCm, which is only available on Linux and has been a nightmare to get working and the documentation is HORRIBLE.
NVIDIA is winning in GPGPU because they make development easy while giving full power to developers.
Where is AMD's one-click install SDK that unleashes the full potential of their hardware?


You could pay a fee to license mining functions too, I guess, if you want to go the route of paying for functionality, but I don't think that would solve the GPU availability and pricing problem.

As for Windows (and Nvidia), I wasn't talking about backwards compatibility. Completely the opposite: forwards.
There have been things that wouldn't work in the previous iteration of Windows while the next version supported them, and the same goes for Nvidia GPUs, even when some features weren't necessarily a hardware limitation for Nvidia or a major software hurdle for Windows.
 
Does anyone happen to know what the proper setting is on the brightness slider for HDR in Death Stranding Director's Cut for the CX? It's literally the only value you can adjust in HDR, and it doesn't even specify what the nit values are; it's just a simple brightness slider that goes from 0-100. Man, I really hate HDR gaming sometimes, because without a fancy monitor to display heatmaps and nit values we have no idea what the correct settings are.
 
Idk. The Elden Ring one has a slider for the max brightness of the display and another, separate, HDR brightness slider. It also has a saturation slider. Between all of those I'm able to get very good results on a CX, a C1, or my laptop's ~500-nit peak (non-FALD, non-OLED) screen. The laptop's quasi-HDR in HDR mode looks better if I tone it down a little, to avoid washing out areas of the screen too much, since it can't do side-by-side contrast like FALD or OLED. The highlight brightness and saturation/luminance (color volume) gains are still very appreciable compared to SDR though. On the OLED I set Elden Ring's peak slider to around 800 and the laptop to around 500, then further adjust the game's HDR brightness and HDR saturation settings separately.

Unfortunately not all HDR games offer the same amount of controls or settings. I don't think a single adjustment slider would be adequate. If you use CRU to set your screen's peak in windows that would take care of one missing slider though.

I did find this though in case it might help you:

Death Stranding - Adjusting HDR Settings

(idk if that list is accurate but that's what the vid claims)

[image: IZ6f2Xj.jpg]


. . . . . . . . . . .

HDR and Brightness: Death Stranding


Those are PS5, and there is definitely some disagreement in the comments between HGiG's lack of brightness vs DTM, but also about DTM being inaccurate and losing detail due to compression.

That's why it would probably have been better had the devs included HDR peak screen brightness, HDR brightness, and HDR saturation settings in the game, so you could adjust them to your liking in HGiG instead of having an either/or situation between HGiG or DTM at wherever they decided to set them.

. . . . . . . .

CRU.exe seems like the best way to set the peak brightness for any particular HDR monitor to start with, so that Windows won't default to 4000 nits or whatever. You could experiment with changing your actual OLED screen's OSD settings as a workaround though: you can adjust a different named HDR mode (in low-latency mode, with all of the extended TV features turned off) in HGiG mode.

Set up one extra named HDR mode to adjust for problematic games. Adjust that HDR mode's OSD settings to the same as the HDR picture mode's OSD settings to start with. Then adjust the other mode's HDR brightness and Color settings (along with any extended sub-menus) in the OSD.

I pasted this pic again but you could try it on a different HDR picture mode than GAME to use as a workspace/testing ground in order to tweak your settings. Be sure to turn off all of the interpolation and other extended TV settings and anything dynamic on that mode. If you find that other mode has a little more input lag in game even with every extra tv processing feature turned off, you could write down those adjusted mode settings and then adjust your regular Game HDR Picture Mode settings to those you've decided on.

Write down your regular Game HDR picture mode settings you'd been using before messing with them if you've changed them from the defaults previously and don't want to lose them. This allows you to go back to the other named HDR mode(s) to play around with adjustments without losing what the HDR game mode is set to. Usually you don't want to go too crazy but I definitely upped the Color setting in the regular game mode (the SDR one for SDR games that don't support autoHDR). Most of the HDR games I play look great so I haven't had to mess with anything outside of the game's own settings. I try to prioritize well-implemented HDR games so I'm very happy that Elden Ring has enough adjustments to make it look great. Jedi Fallen Order, Nioh2, Fenyx, Pathfinder, and AC-Odyssey all look great also.

[image: 618610_NWuFr2P.png]
 
Boom, the 42" C2 is out now. At a higher MSRP than I grabbed a 65" C1 for last week! I'm still happy with my 48CX, but when these drop to $800-900 in a year I'll probably grab one.
Looking forward to the 42" C2 hitting $700 to $800 myself. When you can buy a 55" C1 for $1k, I expect nothing close to that for the 42 (https://slickdeals.net/f/15709150-5...rd-1097-free-s-h-at-buydig-less-w-sd-cashback). These deals are pretty consistent at end-of-model-year time for these OLED TVs. Their initial ask for the 42" C2 is just too high for what it is.
 
So I'm thinking about going all out and just getting a C1 or another OLED TV in the 55-inch range. It would replace a plasma I picked up cheap last year. Question: does anyone here have experience with OLED TVs and their BFI, and can compare with a plasma TV? Even if it's the same I would want it. Also, is there a site out there that actually rates TVs' BFI implementations? RTINGS states whether or not the TV has it, and if it's single or double strobe. For OLED this is perfect because there's no cross-talk. For LCD this is different because cross-talk can come into play. Blur Busters is out there, sure, but they seem to only be interested in PC monitors and not TVs.

Also, I have old tech and can't really push 4K unless running older titles. I assume that 1080p is integer scaled on these displays? I plan on plugging in a Switch, Wii U, and an older PC, probably running 1080p most of the time, hopefully at 120Hz for the PC titles that support it.
 
Looking forward to the 42" C2 hitting $700 to $800 myself. When you can buy a 55" C1 for $1k, I expect nothing close to that for the 42 (https://slickdeals.net/f/15709150-5...rd-1097-free-s-h-at-buydig-less-w-sd-cashback). These deals are pretty consistent at end-of-model-year time for these OLED TVs. Their initial ask for the 42" C2 is just too high for what it is.

The problem is it's not that much cheaper to manufacture the 42 than the 55. The only difference is the actual panel, it's like why a 1TB Barracuda and 2TB Barracuda only differ in price by $1.50.
 
So I'm thinking about going all out and just getting a C1 or another OLED TV in the 55-inch range. It would replace a plasma I picked up cheap last year. Question: does anyone here have experience with OLED TVs and their BFI, and can compare with a plasma TV? Even if it's the same I would want it. Also, is there a site out there that actually rates TVs' BFI implementations? RTINGS states whether or not the TV has it, and if it's single or double strobe. For OLED this is perfect because there's no cross-talk. For LCD this is different because cross-talk can come into play. Blur Busters is out there, sure, but they seem to only be interested in PC monitors and not TVs.

Also, I have old tech and can't really push 4K unless running older titles. I assume that 1080p is integer scaled on these displays? I plan on plugging in a Switch, Wii U, and an older PC, probably running 1080p most of the time, hopefully at 120Hz for the PC titles that support it.

I haven't had a plasma since my 2011 Panasonic, but from my memory, plasma is better at 60Hz because OLED is just too flickery at that low of a refresh rate with BFI enabled. But the CX/C1 can do BFI at 120Hz, and that's when it destroys plasma in motion clarity, because now you have the combination of no sample-and-hold blur + 120Hz. 1080p is not integer scaled on this TV; you have to set your Nvidia control panel/AMD Radeon settings to GPU scaling and then enable integer scaling if you want that.
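A tiny illustration of why GPU-side integer scaling keeps 1080p sharp on these 4K panels (just a sketch of nearest-neighbour duplication, nothing TV-specific):

    # 3840/1920 and 2160/1080 are both exactly 2, so every 1080p pixel can become
    # a clean 2x2 block instead of being interpolated by the TV's scaler.
    import numpy as np

    print(3840 / 1920, 2160 / 1080)          # 2.0 2.0 -> integer scaling is exact

    def integer_scale(frame, factor=2):
        # Nearest-neighbour duplication: repeat each pixel 'factor' times in x and y.
        return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

    tile = np.array([[0, 255], [255, 0]])    # a tiny checkerboard "frame"
    print(integer_scale(tile))               # hard edges preserved, no blur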
 
The problem is it's not that much cheaper to manufacture the 42 than the 55. The only difference is the actual panel, it's like why a 1TB Barracuda and 2TB Barracuda only differ in price by $1.50.
Oh, I know.... But it can't be $400 more ;).
 
So I'm thinking about going all out and just getting a C1 or another OLED TV in the 55-inch range. It would replace a plasma I picked up cheap last year. Question: does anyone here have experience with OLED TVs and their BFI, and can compare with a plasma TV? Even if it's the same I would want it. Also, is there a site out there that actually rates TVs' BFI implementations? RTINGS states whether or not the TV has it, and if it's single or double strobe. For OLED this is perfect because there's no cross-talk. For LCD this is different because cross-talk can come into play. Blur Busters is out there, sure, but they seem to only be interested in PC monitors and not TVs.

Also, I have old tech and can't really push 4K unless running older titles. I assume that 1080p is integer scaled on these displays? I plan on plugging in a Switch, Wii U, and an older PC, probably running 1080p most of the time, hopefully at 120Hz for the PC titles that support it.
BFI on my CX 48" is really only useful for SDR content. HDR loses all its highlights. Even for SDR content it is only just bright enough. The main reason I don't use BFI is that it's inconvenient to toggle on/off on this model as the setting is buried pretty deep. I felt it gave some motion clarity advantages vs without using it but at least for me it's not a particularly important feature. If I was playing a lot of multiplayer shooters I'd probably use it more but most games I play these days support HDR.
 
Thanks for the replies. My understanding is that at 60Hz, the C1 isn't as clear as the CX. It has 8ms of MPRT, which is around what plasma is, so it shouldn't flicker as much as the CX and should be brighter (at the cost of clarity). The C1 and CX should be identical at 120Hz though.
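Rough persistence math behind those numbers (actual BFI duty cycles vary by model and setting, so treat these as ballpark):

    # Motion blur on a sample-and-hold display scales with how long each frame
    # stays lit, so MPRT is roughly frame_time * fraction_of_time_the_pixel_is_on.
    def mprt_ms(refresh_hz, duty_cycle):
        return 1000 / refresh_hz * duty_cycle

    print(round(mprt_ms(60, 1.0), 1))    # ~16.7 ms: 60 Hz with no BFI
    print(round(mprt_ms(60, 0.5), 1))    # ~8.3 ms: 60 Hz BFI, roughly the "8 ms,
                                         #   about like plasma" figure above
    print(round(mprt_ms(120, 0.5), 1))   # ~4.2 ms: 120 Hz BFI, why it looks so
                                         #   much clearer than 60 Hz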
 
Wait for prices to stabilize this summer and it'll be marginally cheaper than its larger C2 siblings. I doubt it gets down to $700, would be nice though.
He's talking about stock clearances; the 48" C1 just dropped below $1000 for the first time:

https://www.amazon.com/LG-OLED48C1P...id=1647571106&sprefix=c1+,aps,113&sr=8-3&th=1

You need to have a noticeable price differential when clearing stock (or most folks will just buy the much larger panel for slightly more money!) But this also means that any stock clearance of the 42" model will drop to around $700 to $800!
 