LG 48CX

As far as I know, unfortunately there is no Windows app that you can hotkey - only the phone app and its visual touchscreen interface.

It would be great if there were a way to integrate the mic button and a few other functions, like input switching, into a hotkey in Windows or on a Stream Deck - for example, if there were an LG app you could connect to via IP over the LAN, the way security cams can be reached, or something...

Maybe Windows 11's Android app compatibility could do something with the LG phone app, but I haven't messed with that at all yet. Even if it could run the LG app, that app wouldn't have hotkeys like I prefer, and once you switched inputs that desktop app would be lost on the PC input unless you had a multi-monitor setup. Since I have a three-screen setup at my desk I might experiment with the Windows 11 Android app functionality someday, I guess.

As far as I know there is no LG control app available directly for Windows (especially one with hotkeys) that you can link directly to the TV, but that would be useful to me.


=====================================================


The Amazon Appstore version that might work with Windows 11's Android functionality is linked here.
*Amazon Appstore only and USA only for now, unless you mess with sideloading, I believe.

https://www.amazon.com/LG-SmartThinQ-Complete-Smart-Appliance/dp/B079R91LN6/


https://www.howtogeek.com/764014/how-to-install-android-apps-on-windows-11/

https://www.windowscentral.com/how-get-started-android-apps-windows-11

So that theoretically will give you the LG ThinQ smartphone interface on your Windows desktop, and you could select inputs and options from there with your mouse pointer. If you want to swap inputs, you'd need to place the app's window on a side screen (using multiple screens) rather than leaving it on the LG TV, for obvious reasons.

............

It can also work with the Google Assistant and Alexa stand-alone voice assistants, but it can be a pain to get them set up and synced with your accounts.

(quotes I pulled from that Amazon app link's reviews)
so, after a couple days of troubleshooting and tweaking, I finally got the TV to pair and operate with Alexa. everything from adjusting the volume, to selecting different apps on the TV, to shutting it off. after they make the feature for turning it on, I will give it 5 stars. what you need to do, is make sure that your TV is registered with the email address you have linked to your Amazon account. then, download basic AND complete LG SmartThinq apps/skills on your Alexa app.... follow the instructions on the TV, and have your Alexa discover your TV. it may take a few tries, but it will work. When you want to navigate within apps, you have to be specific. Say "Alexa, ask LG to launch NetFlix on TV." Or whichever function you want... I'll say, it's not very user-friendly, but works none the less. I have included video and a short tutorial on voice commands.

.. You can name the TV something other than "LG", like "LG OLED", or "48CX", etc. So you could say: HeyGoogle / Alexa / Echo "ask 48CX to switch to PS5 input". That would only save you from using the remote, as the mic button on the remote does the same thing with more direct phrasing: <hold mic button down> "switch to PS5 input".

I do understand that the account linking is a bit tedious but to be honest the amazon site or google interface does it all for you. All you basically have to do is make sure you link up everything with the same exact account and use the correct phrases when asking the certain device functions or actions. I've literally not had any issues with any commands or linking the TV up within the 10 minutes going through both setups step by step. So all i have to say is "Alexa, TV on." boom its on, or Alexa / Google TV off. Now the only thing alexa has over google is the ability to turn the TV on from a standby point but thats about it, they both have the same functionality and work just the same. Now if anyone has any issues and needs help setting thier devices up I will gladly assist you in doing so
 
elvn - I just made an account to say thank you for your comment from 2 years ago about VRR bypassing dithering and creating the awful near-black to black gradient transition. I didn't know the technical name for it.

I've now successfully used ReShade in Elden Ring with a shader that applies dithering, and it literally fixed that issue! How is LG not implementing this easy "post-processing" fix already?
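
For anyone curious what a dither shader is actually doing under the hood: conceptually it adds a tiny amount of noise before the signal gets quantized, so the hard steps of a smooth near-black gradient break up into grain the eye averages out instead of visible bands. A rough Python sketch of the idea (purely illustrative - not the actual ReShade shader or the TV's processing):

```
import random

LEVELS = 32  # deliberately coarse quantization so the banding is obvious

def quantize(v):
    """Hard-quantize a 0.0-1.0 value to LEVELS steps (this is what causes banding)."""
    return round(v * (LEVELS - 1)) / (LEVELS - 1)

def quantize_dithered(v):
    """Add sub-step noise before quantizing so neighbouring samples straddle two steps."""
    step = 1.0 / (LEVELS - 1)
    noisy = min(max(v + random.uniform(-0.5, 0.5) * step, 0.0), 1.0)
    return quantize(noisy)

# A slow near-black ramp, like a dark sky gradient.
ramp = [i / 999 * 0.1 for i in range(1000)]
banded = [quantize(v) for v in ramp]
dithered = [quantize_dithered(v) for v in ramp]

def avg_window_error(out, window=16):
    """Average error vs. the ramp when the eye blurs over small neighbourhoods."""
    errs = [abs(sum(out[i:i + window]) / window - sum(ramp[i:i + window]) / window)
            for i in range(0, len(ramp) - window, window)]
    return sum(errs) / len(errs)

print("avg error, hard quantize:", avg_window_error(banded))    # big, flat bands
print("avg error, with dither:  ", avg_window_error(dithered))  # tracks the ramp much more closely
```

The windowed averages show the dithered output hugging the ramp far more closely than the hard-quantized one, which is the same effect that smooths the near-black transitions on screen.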
 

No problem. 👍 I just source info online, but I try to do my homework as much as I can. Hope it helped.

Some people run 8-bit + FRC/dithering instead of 10-bit in PC mode on the LG OLEDs because some of their content shows issues in 10-bit (when in RGB ~ 4:4:4 in PC mode). So you might want to experiment with that and see if you get any better results.
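
For reference on what 8-bit + FRC is doing: temporal dithering approximates each 10-bit shade by flickering between the two nearest 8-bit codes across frames, so the time-average lands on the in-between value. A toy Python illustration of the principle (not how any particular panel or GPU actually implements it):

```
def frc_frames(code_10bit, frames=4):
    """Approximate one 10-bit code (0-1023) as a sequence of 8-bit codes (0-255).

    Each 10-bit step is a quarter of an 8-bit step, so showing the next 8-bit code up
    on the right fraction of frames makes the average land on the 10-bit value.
    """
    low = code_10bit // 4          # nearest 8-bit code at or below the target
    extra = code_10bit % 4         # how many frames need to show the next code up
    return [min(low + 1, 255)] * extra + [low] * (frames - extra)

seq = frc_frames(514)              # 10-bit 514 sits between 8-bit 128 and 129
print(seq)                         # -> [129, 129, 128, 128]
print(sum(seq) / len(seq) * 4)     # -> 514.0 on the 10-bit scale, i.e. the intended shade
```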
 
Hmmm, I tried 8-bit RGB with HDR and VRR (instead of my 4:4:4 10-bit), and I don't see any improvement at all when checking a static image that shows the issue.
I also tried assigning PC mode to the HDMI port.

Am I missing something?
 
Try YCbCr 4:4:4 and 8-bit.
 
I use the EDID hack for 48 Gbps and 12-bit color and set the Nvidia CP to RGB, 12-bit, 120 Hz, VRR on. I'm aware the screen can't actually do 12-bit; the issue is how it processes the input from a banding standpoint.
 
Are you saying it helps with banding to do that?

Even though the dithering helped with the near-black gradient issue, I get severe banding in the sky in Elden Ring. Sometimes I'm not sure if it's just the game being like that.
 
Can't speak to Elden Ring - I have not played it. I "think" it helps with banding in general. When I ran through the various options early on it seemed good to me. Nothing is quite as good on the banding issue as running 4:2:2 on very demanding demo videos that do a lot of red-to-black or white-to-black transitions, but with these settings my attention doesn't get grabbed by jarring, distinct lines in the sky in games. And there is no way in heck I am ever going to use this set in 4:2:2. It's a computer monitor for me and it stays at one all-around setting all the time for games/desktop/movies.

The first thing to do is take a look at other games and see if they do this to you on large, flat gradient transitions in the sky, etc. If not, then it's the way that specific game is rendering the sky. Beyond that, does the game have some other type of compression enabled? Or any other post-processing effects in the video settings you can play with? Or FSR or DLSS?
 
I have a 3080 Ti coming next week and was looking forward to being able to run 4K 120 Hz 10-bit... but apparently 8-bit is still the better setting?!
 
No, really it's not - not if you are using it hooked to your PC full time. Just put it on 10-bit and see what you think. A lot of this back and forth is based on edge use cases, specific problems with specific programs, or just other settings issues causing the problem. After that it's splitting hairs, because we are tweakers at heart.

Just set it like you said: 4K, 120 Hz, 10-bit, RGB, G-SYNC. You'll probably not care about the rest of this one bit. You can try additional tweaks if you actually feel the need.

I played through Shadow of the Tomb Raider in HDR right after I got the screen, with those settings (10-bit), and it was truly glorious. I don't remember any banding issues in the sky, let's put it that way.

Just remember to use the utility to adjust ClearType to greyscale mode. It makes text look fantastic.
 
Yeah, sometimes it can come down to nitpicking, but there might be some obvious things in some content or configs. There are a lot of settings you can mess with in games, the OS, and on the TV, so you'd have to make sure everything is the same.

Some recommended settings here:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/

-------------------------------------------------------

Also, sit an appropriate distance away or the pixel structure will be aggressively blocky and aliased even with AA and text sub-sampling, and the viewing angle to the screen's extents will be poor:

60 PPD on a 42" 4k screen = 29.3"
60 PPD on a 48" 4k screen = 33.5" 64 deg viewing angle at 4k


80 PPD on a 42" 4k screen = 41.1"
80 PPD on a 48" 4k screen = 47" 48 deg viewing angle at 4k


These aren't sized suitably for up-against-the-wall, bookshelf / player-piano / upright-piano style desk setups. This kind of size demands more of a command-center setup.

---------------------------

Are you saying it helps with banding to do that?

Even though the dithering helped with the near-black gradient issue, I get severe banding in the sky in Elden Ring. Sometimes I'm not sure if it's just the game being like that.

Are you playing Elden Ring in HDR? If you are (and IMO you should be, it looks great), make sure you go into the extended HDR settings in the game and set the peak white to 700-800 nits. Or you could set it to 1000 nits and let the TV tone map it from there down to within the TV's limits. Then adjust the main HDR brightness in the game's HDR settings to taste.

Also make sure the TV settings are optimal.

If your TV is doing dynamic HDR tone mapping, it will compress the HDR scale more at the top end.
If the Elden Ring game's HDR settings are not mapped to the range of the TV, it could also compress the color range and blow out highlights.

----------------------------------------------------

- Changing the input's named ICON to PC puts it in 4:4:4/RGB. If it hadn't been on that setting before, it should have been pretty obvious in text - text appears tattered when not in RGB mode, so I would triple-check that. Pulling the HDMI cable out and plugging it back in can drop the PC/RGB named setting and make it look tattered again. It also happens in graphics, not just text, but text is more obvious. Setting the name of the HDMI input on the TV back to "PC" will put it back in RGB and remove the tattered look.

https://www.rtings.com/tv/learn/chroma-subsampling
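
If you want a feel for why anything other than 4:4:4 chews up text: in 4:2:2 the colour (chroma) channels are stored at half the horizontal resolution and stretched back out, so one-pixel-wide coloured edges - exactly what sub-pixel rendered text is made of - get smeared together, while luma-only detail survives. A tiny conceptual Python sketch (not the TV's actual pipeline):

```
# One scanline of worst-case "text" detail: alternating single-pixel stripes whose
# brightness (luma) is identical but whose colour (chroma) flips every pixel.
line = [(0.5, +1.0) if x % 2 == 0 else (0.5, -1.0) for x in range(8)]  # (luma, chroma)

def subsample_422(pixels):
    """4:2:2-style: keep luma per pixel, but share one chroma value per horizontal pair."""
    out = []
    for i in range(0, len(pixels), 2):
        pair = pixels[i:i + 2]
        shared_chroma = sum(c for _, c in pair) / len(pair)
        out.extend((luma, shared_chroma) for luma, _ in pair)
    return out

print("4:4:4:", line)                 # chroma alternates +1 / -1, edges stay sharp
print("4:2:2:", subsample_422(line))  # shared chroma averages to 0.0 - the colour detail is gone
```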

I'll also add a few other things to mess with:

..your resolution in the display settings (there can be a separate TV version of some resolutions in addition to the PC version of the same resolution if you scroll down far enough; also make sure it's on 3840x2160 4k and not 4096x)
..any scaling options that changed the display scaling from 1:1 should be turned off
..your Windows desktop/text scaling settings (e.g. if you set text too small, or the default is too small for your PPI, it will look bad).

..make sure the new GPU doesn't have sharpness settings enabled by default (I think AMD kept a sharpness slider where Nvidia dropped it at some point, and their "zero" sharpness might be 10, I would have to look it up)
..make sure your TV settings don't have any processing enabled in the detail settings (incl. sharpness), color settings, etc. accidentally somehow. Use HGiG/game mode with no processing.

..that, and a viewing distance that's too close for the PPI (giving a low PPD), will show a more aggravated pixel structure that makes text look worse, with more aggressive fringing/aliasing - but I assume you are at the same distance you were at with the Nvidia GPU previously.

A good reference for settings here that I linked in this thread previously:

(Reddit) LG OLED gaming/PC monitor recommended settings guide

Some recommended PC gaming HDR settings from that link:

LG TV
=========

PC mode: On
HDMI Ultra Deep Color: On

Reduce Blue Light/Eye Comfort Mode: off
Aspect Ratio: original
Just Scan: On
Energy Saving Step: Off
AMD Freesync Premium: Off (or AMD/XSX = on)
[CX] Instant Game Response (ALLM+VRR) = on
[C1/C2] VRR & G-SYNC = on
[C1/C2] Game Optimizer (ALLM): on
[C1/C2] Game Genre: Standard
Fine Tune Dark Areas: 0 or (-3 to -5)

---------------------------------------

Picture Mode (while already on PC ICON/named PC hdmi Input):
HDR 10 = HDR Game Optimizer (dark room) , HDR Filmmaker (bright room). Dolby Vision Game Optimizer (dark room) , Dolby Vision Cinema Home (bright room)

OLED Pixel Brightness/Light: 100

Peak Brightness: High

Contrast: 50

Black Level/Screen Brightness: 50

Video Range/Black Level:
HDR 10 = Auto, Dolby Vision = Auto or Limited/Low

Gamma (Adjust Brightness): 2.2 (ST.2084)

Auto Dynamic Contrast:
HDR 10 (dark room) = Off,
HDR 10 (bright room) = Off or Low/Med/High
Dolby Vision (dark room) = Off, Dolby Vision (bright room) = Off or Low/Med/High

HDR/Dynamic Tone Mapping:
HDR 10 (dark room) = HGiG or Off, HDR 10 (bright room) = On (* brighter but inaccurate and compressed, so personally I wouldn't)

Motion Eye Care: Off

Color Depth:
HDR 10 (dark room) = [C1] 55, [CX] 50, HDR 10 (bright room) = 50
Dolby Vision (dark room) = [C1] 55, [CX] 50, Dolby Vision (bright room) = 50

Tint = 0

Color Gamut:
HDR 10 = auto detect
Dolby Vision = Native/Wide (DCI-P3)

Color Temperature: W50/Warm2 or [cool] W25/0/50

C1/C2 Color Adjustment Upgrade: off

Adjust Sharpness: 0

Super Resolution: Off

Noise Reduction: Off

MPEG Noise Reduction: Off

Smooth Gradation: Off

Cinema Screen/Real Cinema: Off

Truemotion/OLED Motion Pro (BFI):
HDR10 =Off
Dolby Vision Dark room = off, Dolby Vision Bright room = Off or Cinematic Movement

Auto Genre Selection = Off

AI Picture Pro = Off

AI Brightness Settings/Control:
HDR 10 (dark room) = Off, HDR 10 (bright room) = On
Dolby Vision (dark room) = Off, Dolby Vision (bright room) = On (DV IQ)

Nvidia GPU settings
=================

Resolution PC: 3840x2160
Resolution: Customize
-Enable resolution not exposed by display: On
-Create Custom Resolutions: 3840x1620 120hz

Refresh Rate: 120 Hz [48 Gbps FRL]

NVIDIA Image scaling = on or off
scaling aspect ratio = no scaling
perform scaling on: display
override the scaling mode set by games: on
Output color depth : 12 bpc

(optional) GeForce Experience GFE Freestyle (Alt+F3) -> styles -> Add Filter: Deband

* apparently there is a similar shader in ReShade, but be careful of online bans in some games when using shader apps

Output color format: RGB
Output dynamic range: Full


Enable G-SYNC Compatible for: Full screen mode

(Optional) NPI - GSYNC Application Mode: Fullscreen and Windowed

Low Latency Mode: Ultra
Max Frame Rate: 118fps

Monitor Technology: G-SYNC Compatible
Power Management Mode: Prefer maximum performance
Preferred refresh rate: Highest available
Triple buffering: Off
Vertical sync: On
 
Does anyone have a breakdown of what the input type differences actually do? (i.e. 'PC', 'Game Console', etc.)

Also, does enabling the FreeSync Premium setting do anything at all if the TV is only connected to an Nvidia card? I assumed it doesn't make a difference...
 
Those recommended settings on Reddit say to turn off AMD FreeSync Premium if using Nvidia...

Excerpt from the settings I quoted in my previous comment:

LG TV
=========

PC mode: On
HDMI Ultra Deep Color: On
Reduce Blue Light/Eye Comfort Mode: off
Aspect Ratio: original
Just Scan: On
Energy Saving Step: Off
AMD Freesync Premium: Off (or AMD/XSX = on)
[CX] Instant Game Response (ALLM+VRR) = on
[C1/C2] VRR & G-SYNC = on
[C1/C2] Game Optimizer (ALLM): on
[C1/C2] Game Genre: Standard
Fine Tune Dark Areas: 0 or (-3 to -5)


I don't know if it would have a detrimental effect or matter, but that write-up of recommended settings is pretty thorough, and it has been reviewed and honed by comments from others in the thread. The recommended settings say to turn it off when using an Nvidia GPU and on for AMD / Xbox Series X. According to the footnotes there:

"Enabling AMD FreeSync Premium will force XSX/Adrenalin to use FreeSync over HDMi instead of HDMI Forum VRR."


------------------

Named PC HDMI input
=====================

You have to go into the home settings and edit the named input ICON itself to PC, not just the name of the HDMI input, in order to get RGB / 4:4:4 chroma. If the HDMI cable ever loses connection it can revert back out of PC mode, which will compromise text (making it look tattered on the edges) and will also be detrimental, to a lesser degree, to graphics fidelity - so keep an eye out for that.

From RTings youtube vid " ... /FtJnLPxVtjI?t=775 "



--------------------------

Picture Mode:
==============

[Image: LG CX game mode picture menu]


As for the picture mode, Game Optimizer or HDR Game Optimizer (C1) or Game Mode (CX) will remove processing so that you get the least input lag -

- but to me, as long as you have all of the interpolation and other enhancement settings manually turned off in a mode, the other picture modes are usable in games. The most likely scenario is using most of the other modes for movies and vids plus a dim mode - so I'd only zero out one other mode's extended features and customize it as an alternate "game mode" for testing purposes.
Instant Game Response should override the other modes and swap to the actual game mode automatically anyway when a game starts, and back when you exit, if you have it enabled on the HDMI input (see the next image below). I use HDMI eARC out, which I think is still incompatible with Instant Game Response's HDMI auto-sensing - so most of the time I keep my PC rig's 48CX PC HDMI input on game mode manually and leave it there. If I want to watch movies/vids fullscreen rather than in a desktop window I'll switch to my Nvidia Shield HDMI input via the remote, with non-game picture modes used on that input. You can also use webOS with YouTube, Netflix, Amazon Video, Emby/Plex, etc. I think webOS is treated as a separate input, like a whole new HDMI input with its own picture mode settings.

Any of the named picture modes can be customized so you can make them your own for whatever kind of material you want them used for. They are just a good starting point, but you can zero every single setting within any of them and start over if you want to. For example, I changed a lot of the Vivid settings so it's no longer extreme or dynamic. I use Vivid mode for anime/cel-shaded video material that I still want to have a little more pop/saturation, along with different motion interpolation settings for its low-frame-rate animation (which otherwise can cause some weird artifacts). I use APS as a dim-screen usage scenario, and a few other picture modes for movies, bright room, dark room. Note that the HDR and DV picture modes won't show up for editing unless you are running that material on the TV at the time.

Some people complain that game mode is dull and lifeless (under-saturated color-wise). I just turn the color saturation up considerably in the extended color settings of that game picture mode to make it more vivid. [While in the Game picture mode: Picture -> Picture Mode Settings (press the pointer on the heading) --> Color slider]. I can further adjust it up, or temper it down, on a per-game basis using ReShade or Nvidia Freestyle until it looks great for that particular game's style. So I keep the OSD's game mode color setting slightly high as a starting point. That is for SDR games. HDR/DV has its own scale of color volume that kicks in automatically (and is well saturated and luminous), so it's not necessary for HDR games - my color setting adjustment is ignored by them, since the TV switches to a different HDR gaming picture mode automatically when playing those games.



------------------------------

From RTings review of the C1:
"
The LG C1 has a very low input lag as long as it's in 'Game Optimizer' mode. For low input lag with chroma 4:4:4, the input icon has to be changed to 'PC'.

There's a new setting for 2021 models found in the Game Optimizer menu, called Prevent Input Delay. There are two options: 'Standard' and 'Boost'. We ran several input lag tests and found that the 'Boost' setting consistently lowers the input lag by about 3 ms when the TV is running 60Hz compared to the LG CX OLED. It works by sending a 120Hz signal to refresh the screen more frequently, meaning it doesn't affect 120Hz input lag. The published results are what we measured using the 'Boost' setting. On 'Standard', we measured 13.1 ms for 1080p @ 60Hz, 13.4 ms for 1440p @ 60Hz, and 13.0 ms for 4k @ 60Hz."

<Edit by elvn>: According to HDTVTest videos you shouldn't enable the Boost setting unless using 60 Hz games/consoles, because it can actually make the lag worse for 120 Hz gaming. It's basically duplicating the 60 fps frames into the 120 Hz range.

Rtings on CX:

"The LG CX Series has really low input lag in Game Mode, and it stays low with VRR enabled, which is great for gaming. It also stays low with a 4k resolution, which makes it a good choice for Xbox One X or PS4 Pro owners."

1080p with Variable Refresh Rate: 5.9 ms
1440p with VRR: 6.2 ms
4k with VRR: 5.9 ms

Instant Game Response (https://www.rtings.com/tv/reviews/lg/cx-oled/settings )

[Image: LG CX Instant Game Response setting]




-----------------------------------------------------


HGiG / CRU
===========

"HGiG tone maps to actual max TML of panel and will most accurately display a tone mapping curve set in-game or on system level."

"For Windows 11, you can use CRU to set tone mapping curve [Edit.. CTA-861 > HDR Static Metadata], otherwise Windows uses maxTML/maxFFTML/minTML 1499/999/0 nits. Quickly toggle HDR in Windows 11 using Win+Alt+B. "

Auto HDR (Windows 11/XSX) converts applicable SDR games to HDR10. (edit by elvn: sort-of. It ends up with a % of the highlights, not full HDR range)

DTM Off tone maps to maxTML 4000nits.
DTM On tone maps on a per frame basis. <---- (edit by elvn: the regular DTM, rather than HGiG, is guesswork/interpolation so it does so inaccurately, causing greater compression on the top end which loses color gradations/details and could cause side effects (perhaps even banding)).

Dolby Vision for gaming enables native DV games and converts all HDR10 games to DV set in-game maxTML to 1000nits.

[Image: LG CX HGiG setting]


That Reddit recommended-settings guide's readout after the CRU edit to 800 nits on Windows 11:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/
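
For what it's worth, the luminance fields in that CRU HDR Static Metadata block are the coded 0-255 values from CTA-861.3 rather than raw nits. As far as I understand the spec, the conversions are the ones below (double-check against CRU's own display, which may already show the decoded values) - they're how a raw value of 128 works out to the ~800 nits used in that guide:

```
import math

def max_luminance_nits(code):
    """CTA-861.3 'desired content max luminance' coding: 50 * 2^(code/32) cd/m^2."""
    return 50 * 2 ** (code / 32)

def min_luminance_nits(code, max_code):
    """CTA-861.3 min luminance coding, defined relative to the max luminance value."""
    return max_luminance_nits(max_code) * (code / 255) ** 2 / 100

def code_for_max_nits(nits):
    """Inverse of the max-luminance coding, rounded to the nearest storable byte."""
    return round(32 * math.log2(nits / 50))

print(code_for_max_nits(800))       # -> 128, the raw value corresponding to ~800 nits
print(max_luminance_nits(128))      # -> 800.0 cd/m^2 (for maxTML / maxFFTML)
print(min_luminance_nits(0, 128))   # -> 0.0 cd/m^2 (minTML of 0)
```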

 
I just started Horizon Forbidden West and followed this guide:



But honestly I am not liking the picture quality at all using these settings. This is reminding me of just how much of a crapshoot HDR gaming can be... it shouldn't take an expert using some fancy monitor with heat maps or whatever to determine luminance values and how to get a good HDR picture. Does anyone have any advice on making the HDR in HFW look great? Even if it's not the most ACCURATE picture, I'll take something that looks better but is inaccurate over something that looks duller but accurate.
 
From what I understand, usually the problem is that the peak nits value is wrong, so the whole scale the game is mapping to is out of sync. Elden Ring allows you to set the peak white and then adjust the brightness slider separately, so it seems a lot more flexible between my screens. I set the peak to 800 nits for my OLED and 500 for my 500-nit IPS laptop.

If the peak nit value defining the scale is wrong, you can end up with a really bad result: at worst either dull or washed-out white, at best compressed and lost color detail at the top end. That's why monstieur kept talking about using HGiG and using CRU to set the peak yourself for your GPU and that particular monitor, so that HDR and Auto HDR have the correct HDR range mapped - e.g. 800 nits manually for the CX instead of Windows defaulting to 1400 or 4000 nits.
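
To make that failure mode concrete: if the game or OS believes the display tops out at 4000 nits when the panel really stops around 800, everything graded above the panel's ceiling either clips to white or the whole range gets squashed down and the image goes dim. A toy sketch of the two outcomes, assuming a plain clip versus a naive linear squeeze (real tone mappers use smarter roll-off curves):

```
PANEL_PEAK = 800.0      # roughly what a CX can actually show
ASSUMED_PEAK = 4000.0   # what gets mapped to when the peak metadata is wrong

def clip_to_panel(nits):
    """Anything graded above the panel's peak clips to full white: highlight detail is lost."""
    return min(nits, PANEL_PEAK)

def squeeze_to_panel(nits):
    """Naively compress the whole assumed range into the panel range: everything dims."""
    return nits * PANEL_PEAK / ASSUMED_PEAK

for scene_nits in (100, 400, 800, 1500, 4000):
    print(f"{scene_nits:>5} nit highlight -> clip: {clip_to_panel(scene_nits):5.0f}"
          f"   squeeze: {squeeze_to_panel(scene_nits):5.0f}")
```

With the correct ~800 nit peak set (HGiG plus CRU, or the in-game slider), neither of those has to happen for content that actually fits within the panel's range.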

For SDR gaming - I personally go into the color settings of the game picture mode and turn the color slider up until it's more saturated. Then in SDR games I use ReShade or Freestyle to adjust it from there. HDR has its own game picture mode with extended settings you could bump up too. I haven't found the need to do that in HDR games, but I wouldn't be opposed to trying it if I had a problem with a game being under-saturated. Personally I wouldn't just leap to regular DTM mode, because it seems more inaccurate than bumping up the color (saturation) slider a little.



------------------------------------

Here is the CRU info again from that settings thread I posted a few back. Use at your own risk though:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/

[Image: LG CX CRU HDR tone-map settings]




And the result:

[Image: resulting readout]
 
CRU isn't going to do anything here because Horizon Forbidden West is a PS5 game lol. Unless it somehow modifies my actual TV for PS5 use?
 
Ah, I knew Horizon was released on PC - I didn't realize the expansion wasn't. I don't have a PS5. You could still try bumping up the HDR Game picture mode's color slider a little for that game and see how it looks.





Worst case, turn on dynamic color or DTM or something, but I'd do that as a last resort.

The notes at the bottom of that guide have this for PS5, but there are overall TV OSD settings in the main chart too.

PS5


  • VIDEO OUTPUT SETTINGS
    • VRR [TBD]
    • Resolution: Auto
    • 4K Video Transfer Rate: Auto [0 (RGB) not supported at 4k120 due to 32 Gbps FRL.]
    • HDR: On When Supported
    • Adjust HDR: Adjust tone mapping curve for HGiG. Ignore console's instructions to make icons "barely visible." Increase screens 1/3 (maxFFTML) and 2/3 (maxTML) until sunbursts clip then stop. Decrease screen 3/3 (minTML) to lowest value.
    • Deep Color Output: Auto
    • RGB Range: Auto
    • Enable 120 Hz Output: Auto
  • AUDIO OUTPUT
    • Output Device: HDMI Device (TV)
    • HDMI Device Type: TV
    • Audio Format (Priority): Bitstream (Dolby/DTS) or Linear PCM
    • Number of channels: 5.1/7.1 ch [Choose for LPCM.]
 

I honestly just left the game's menus at default settings and simply turned "favor performance" mode on, because 60 fps > 30 fps. To me the game looks great like that on my LG C9 in the living room.

Excellent game too. I have put off playing Elden Ring on PC because I've been having such a good time with Forbidden West and can wait for ER to get patched more.
 

I’ve been playing with Vincent’s settings since he released that video and to my eyes it looks good. I think part of the problem is HFW is so overblown at default settings that turning them down to Vincent’s settings looks muted, in comparison. I suggest playing with his settings for a while and inching it up from there, per what looks good to your eyes.
 

I did some more digging around forums and YouTube and found this video:



Both of the alternatives to using HGiG with -8/9 highlights and -2 brightness look better to me. It may not be the most accurate picture, but I really don't give two hoots about accuracy in games - I just want the best-looking picture to my eyes even if it's inaccurate (I always use ReShade/color vibrance on PC games, which already makes the colors inaccurate). I'm sticking with the DTM On option for now.
 
As bummed as I am about this still not being fixed, at least Nvidia put this on their "Open Issues" list for their latest drivers (512.15):
Club 3D CAC-1085 dongle limited to maximum resolution of 4K at 60Hz. [3542678]
Guess I'll need to leave my work partition stuck at 511.23 for now.
 
They've broken it again?!
 
5xx.xx series drivers are working fine with the Uptab DP to HDMI 2.1 adapter. 4K @ 120 Hz, full RGB/4:4:4 chroma, 10-bit or forced 12-bit color, vanilla HDR, Auto HDR in Win11 and MPC-BE, etc. The only thing still missing is VRR, and I haven't missed it in a while.
 

I would love to have VRR. My highly OC'd 2080 Ti can only do so much.
 
So therefore I do not have VRR.

You didn't say you would love to have VRR at 120Hz HDR, just that you would love to have VRR in general ;)

EDIT: Ah, I replied without reading your quoted post first, my bad. Anyway, with GPU availability improving a ton recently I'm sure you'll be able to get a 30-series card soon enough. 40-series cards will also probably be easier to obtain, I hope.
 
Hopefully Nvidia gimps crypto mining with the 4xxx series enough to make the cards unattractive to miners, and creates a proper ordering queue to allow people to pick up these cards at MSRP before the decade is out.

Based on the latest promo leaks, a low-end 4xxx should be able to rival 3080-level 4K performance, while a mid-tier 4xxx will easily outperform a 3090... here's hoping the TDP doesn't reach nuclear power plant levels.
Maybe it's time for a PCI-E spec revision or interface replacement.
 

A PCI-E spec revision isn't going to change power requirements, if that's what you're insinuating.
The only thing that will change power requirements is finding new semiconductor materials and transistor designs to reduce hysteresis and leakage, or completely changing how processors are built altogether. Maybe a move to light/photons?
It's all math, and math requires lots of transistors to do it quickly at the scale needed for graphics work.

Anyway, yes. I hope to see some cheaper GPUs here soon. It's been two years since I even attempted to buy one.
 
Hopefully Nvidia gimps crypto mining with the 4xxx series enough to make the cards unattractive to miners, and creates a proper ordering queue to allow people to pick up these cards at MSRP before the decade is out.

Based on the latest promo leaks, a low-end 4xxx should be able to rival 3080-level 4K performance, while a mid-tier 4xxx will easily outperform a 3090... here's hoping the TDP doesn't reach nuclear power plant levels.
Maybe it's time for a PCI-E spec revision or interface replacement.

Wishing for a company to gimp a product's capabilities is insanity. It's never the right solution.
 
Yeah, I think it would be better if they had a "1 per table" kind of thing for the first 4-6 months of release, with a well-vetted registry of people + registered PC/phone/OS, GPUs sold direct from the manufacturer at MSRP, with an NDA type of contract (or leased from the manufacturer initially) where you couldn't resell it for 6-8 months, or some similar solution. But I think they went the other way and did bulk orders to mining operations, resellers selling in bulk to mining operations, plus bots and scalpers. If people couldn't buy a car because others were running warehouses of them up on blocks as the only viable way to generate mariocoins and WoW gold 24/7, I think the government might have to get involved at some point. But these aren't cars or furnaces or even air conditioners - which can all be life essential - or houses... they are GPUs. That's not really in the scope of this thread either.
 
Wishing for a company to gimp a product's capabilities is insanity. It's never the right solution.

They've done it with the CGI/dev versions of the GPUs, the Quadros. They've also kept CUDA and G-SYNC in-house software-wise so you couldn't run them on AMD. They also didn't allow some software/driver features or capabilities of their newer-gen GPUs to work on previous GPU generations even when it wasn't a hardware limitation, if I'm remembering correctly. Windows OS generations do similar things.
 

It's all bad. Pointing out terrible behavior in other instances doesn't make this the right way to go.
 
A PCI-E spec revision isn't going to change power requirements, if that's what you're insinuating.
The only thing that will change power requirements is finding new semiconductor materials and transistor designs to reduce hysteresis and leakage, or completely changing how processors are built altogether. Maybe a move to light/photons?
It's all math, and math requires lots of transistors to do it quickly at the scale needed for graphics work.

Anyway, yes. I hope to see some cheaper GPUs here soon. It's been two years since I even attempted to buy one.
Optical computing has the major downside of needing to generate photons, which is *very* power intensive.
 
They've done it with the CGI/dev versions of the GPUs, the Quadros. They've also kept CUDA and G-SYNC in-house software-wise so you couldn't run them on AMD. They also didn't allow some software/driver features or capabilities of their newer-gen GPUs to work on previous GPU generations even when it wasn't a hardware limitation, if I'm remembering correctly. Windows OS generations do similar things.
In the case of CUDA (or any technology built on it, like PhysX) there's nothing preventing AMD from creating its own implementation by reverse-engineering the API. Or they could just pay NVIDIA a fee to license it.

As for Windows, it actually bends over backwards to try and maintain compatibility. Pretty much everything not a DOS executable or built on Win16 still runs, since all the necessary API calls still function. I've still got a very early Win95 game that runs fine on either Glide (via wrapper), OpenGL 1.1, or Direct3D 5; all modes work as expected. Generally, the only things that break rely on the compositor to act a certain way (Diablo is an excellent example), and those can generally be fixed with a few tweaks.
 
AMD does have its own CUDA-like thing, I think, but I believe the devs of creator suites focus more of their time/money on CUDA's benefits.

They have OpenCL, but they are starting another open-source project.

AMD launches GPUFORT, an open-source attempt against NVIDIA's CUDA
https://www.notebookcheck.net/AMD-l...e-attempt-against-NVIDIA-s-CUDA.571133.0.html

From a 2018 Reddit post about OpenCL:
On an NVIDIA box I can download and install the CUDA SDK and be up and running with built-in Visual Studio integration in minutes. Host and device code can be in the same file. PTX instructions can be inlined or accessed via intrinsics. Not to mention excellent documentation and sample codes.
On the AMD side we have OpenCL, which is lacking in so many areas. The optimizer is poor. Device code cannot exist with host code. There is no access to AMD-specific features except through extensions.
Or ROCm, which is only available on Linux and has been a nightmare to get working and the documentation is HORRIBLE.
NVIDIA is winning in GPGPU because they make development easy while giving full power to developers.
Where is AMD's one-click install SDK that unleashes the full potential of their hardware?


You could pay a fee to license mining functions too, I guess, if you want to go the route of paying for functionality, but I don't think that would solve the GPU availability and pricing problem.

As for Windows (and Nvidia), I wasn't talking about backwards compatibility - completely the opposite: forwards.
There have been things that wouldn't work in the previous iteration of Windows while the next version supported them. The same goes for Nvidia GPUs. That's even when some features weren't necessarily a hardware limitation for Nvidia or a major software hurdle for Windows.
 