Active signal resolution stays at 4k, need 1080/1440p

mike6289

I'll try to keep this as brief as possible. Hardware involved: HTPC with a 3080/1030 and a Sony A8G TV.

I browse the internet, play games, and watch TV/movies with the setup.

The problem is: in Windows Settings > System > Display, the "Advanced display settings" page shows that the "Desktop resolution" is not the same as the "Active signal resolution".
To get there, right click on the desktop, select "Display settings", then click "Advanced display settings" under the grey button labelled "Detect".

My active signal resolution basically wants to stay at 4k (the TV's native resolution), while the desktop resolution changes to whatever I set it to in the "Display settings" page (or even the nVidia control panel).

Why is this a problem? Two reasons:
First: instead of enjoying 4:4:4 chroma at 1080/1440p HDR, I'm forced to endure 4:2:2.
Second: the upscaler in my TV is better than the one in my video card, so things aren't as sharp as they should be.

The reason I get 4:2:2 is that HDMI 2.0 only has the bandwidth for up to 4:2:2 at 3840x2160 with 10/12 bit colour at 60hz. Since 4k is the forced signal resolution for my 1080p/1440p HDR settings, I get 4:2:2 at 1080p/1440p HDR. Unless I want to drop to 30hz, which I don't.
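
If you want to sanity check that bandwidth claim, here's a rough back-of-envelope in Python (just my own sketch - it assumes the standard 4400x2250 total raster for 4k60, HDMI 2.0's 600 MHz TMDS clock limit, and that 10/12 bit 4:2:2 rides the same clock as 8 bit):

[code]
# Rough HDMI 2.0 bandwidth check for 3840x2160 at 60hz (back-of-envelope only).
HDMI20_MAX_TMDS_MHZ = 600  # HDMI 2.0 tops out at a 600 MHz TMDS clock

def tmds_clock_mhz(h_total, v_total, refresh_hz, bits_per_component, chroma="4:4:4"):
    pixel_clock = h_total * v_total * refresh_hz / 1e6  # MHz
    if chroma == "4:2:2":
        return pixel_clock                       # 10/12 bit 4:2:2 uses the 8 bit clock
    return pixel_clock * bits_per_component / 8  # RGB/4:4:4 scales with bit depth

for label, bpc, chroma in [("8 bit RGB/4:4:4", 8, "4:4:4"),
                           ("10 bit RGB/4:4:4", 10, "4:4:4"),
                           ("12 bit RGB/4:4:4", 12, "4:4:4"),
                           ("12 bit 4:2:2", 12, "4:2:2")]:
    clk = tmds_clock_mhz(4400, 2250, 60, bpc, chroma)
    verdict = "fits" if clk <= HDMI20_MAX_TMDS_MHZ else "does NOT fit"
    print(f"3840x2160 60hz {label}: ~{clk:.0f} MHz -> {verdict}")
[/code]

8 bit RGB (~594 MHz) and 12 bit 4:2:2 squeak under the limit, while 10/12 bit RGB/4:4:4 don't, which is why the driver falls back to 4:2:2 for 4k60 HDR.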

In the nVidia control panel, there is "Adjust desktop size and position" with a tab called Scaling. The options are:
1. Aspect ratio
2. Full-screen
3. No scaling
Picking 3 has no effect.

There's also a "Perform scaling on" setting where you can choose "Display" or "GPU". Choosing Display also has no effect...


My main card is a 3080, but it's out of service at the moment, so I'm using a 1030.

I've tried so many things. At this point I'm thinking it must be possible to do in the registry somehow, but that's beyond my skill level.
 
When it comes to networking, the term here is "negotiation."
Hopefully that word helps you find more information.
 
Damnnnn, this is smooth! Going to make gaming great. And scrolling. And moving the mouse lol
 
I came back to this thread to update it with some answers. The solution applies to everyone. There's also specific information on the TVs previously mentioned - the Sony A8G/AG8 (OLED) and X900F/XF900 (LCD). I discuss setting custom resolutions and getting custom resolutions to stick, as well as 1080p at 240hz on OLED. The fix for getting both the desktop and signal resolution to 2560x1440 at 60hz is at the very bottom.

These TVs are quite different from one another, but they share the X1 Extreme image processing chip by Sony, which spine has running 1080p at 240hz.

To begin:
Rtings measured the input lag of my OLED TV (A8G) for 1080/1440p signals at 120hz to be 21ms.
Rtings measured the input lag of spine's LCD (X900F) for 1080/1440p signals at 120hz to be 13ms.
My eyes and brain measured the input lag of my OLED TV for a 1080p signal at 240hz to be down by approximately 5 milliseconds compared to 120hz.
So about FIFTEEN (15) milliseconds now! Wooo!
And, since the TVs share the same processor, it's reasonable to assume that the same thing would happen to the LCD.
Only EIGHT milliseconds on it at 240 hertz! Great! Wooo!
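
To put rough numbers on that (my own back-of-envelope, not a measurement): if part of the TV's lag is spent waiting on the panel refresh, halving the frame period from 120hz to 240hz should shave off somewhere around half a frame to a full frame, which lines up with the ~5ms I noticed.

[code]
# Frame periods at 120hz vs 240hz, and the most you could save per frame.
for hz in (120, 240):
    print(f"{hz} hz -> frame period {1000 / hz:.2f} ms")

print(f"One frame period saved going 120 -> 240 hz: ~{1000 / 120 - 1000 / 240:.1f} ms")
[/code]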

Problem though: why is the LCD TV's input lag lower than the OLED's in the first place? 13ms vs 21ms?
How big of a problem is this?


The X1 Extreme chip is designed to receive video over HDMI, process it, and then output it. It was in the 2016 and 2017 flagship TVs, and used in premium 4k TVs in 2018 and 2019. I looked at reviews of these TVs and compared their input lags at different resolutions and refresh rates. It became very obvious that the OLED TVs have a 7ms input lag handicap compared to the LCDs. Possible contributing factors are:
Typically, pixels in a display consist of 3 subpixels. The OLED TV panels currently available (all made by LG) have a 4th subpixel. It exists to increase the peak brightness from ~400 nits to ~800 nits, and to extend the product's life from what would otherwise be no more than 10,000 hours up to a somewhat acceptable 30,000 hours at medium brightness levels. Yes, it'd be that bad without them.

Anyway, to differentiate:
LCD subpixels are passive - think of them like a window blind: they block some, none, or all of the light (of their colour) coming from behind them (a backlight, not the sun).
OLED subpixels are active - they emit light to the degree required.

The X900F's layer of powered transistors in its LCD display is "driven" by the output of the X1 Extreme chip.

The OLED subpixels, being active, require quite a bit more power than the LCD subpixels - so the output of the X1 chip needs amplification. It first needs to be inverted though, because a brighter OLED requires more power, whereas an LCD requires less power to block less. Also, the brighter an OLED is run, the more of its total power is given off as heat. The relationship is close to linear through most of the usable range, but past a certain point the heat increases drastically. That point and above it must be avoided for extended periods.
There's also an extra subpixel which needs to be driven.

A second chip does this conversion. Let's call it the "LCD to OLED conversion chip": "LCLED".
I think LG makes the LCLED chip, and Sony is making their own for this year's coming flagship, the A90J.

LCLED does:
1. Separation of brightness from colour, so that the 4th white subpixel can be used without negatively affecting image quality
2. Inversion/mapping
3. Another stage of amplification

Anyway,

I believe that regardless of the input (whether it's analogue to digital to analogue, or digital to digital to analogue), the output of this LCLED chip is always 3840x2160 at 120hz 10 bit for LG's OLEDs, unless VRR is in use.
I have read that changing the refresh rate of an OLED has an impact on gamma, negatively affecting image quality. G-Sync and similar technologies had issues because of it. For gamma problems, 24hz to 120hz swings are apparently a big deal. So consider the average consumer, who probably prefers watching 24fps movies and 120fps motion-interpolated TV shows/documentaries: sending 24fps movies at 120fps with each frame duplicated 5 times seems to be a reasonable way to play back blu-rays - it avoids changing the actual refresh rate of the panel, and doesn't really create an issue elsewhere.
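
Just to illustrate that cadence (purely a sketch of the idea above), here's how many times each source frame gets repeated if everything is sent to the panel at a fixed 120hz:

[code]
# Repeat counts for common frame rates inside a fixed 120hz output.
PANEL_HZ = 120
for fps in (24, 30, 60, 120):
    print(f"{fps} fps content -> each frame shown {PANEL_HZ // fps}x at {PANEL_HZ} hz")
[/code]

24fps content lands on exactly 5 repeats per frame, which is the duplication mentioned above.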

Except for me. And you, if you were thinking your A1E/A8F/A8G might do 240fps at 1080p. I appreciate shaving 5ms off the input lag though :)


Here's how I got the active signal resolution to match the desktop resolution:
1. In CRU.exe, add 2560x1440 at 60hz with RB/RB2 as a default supported resolution in the main supported resolutions box (top of window).
2. In the nVidia control panel, add 3840x2160 at 67hz as a custom resolution.
3. Leave the resolution at 3840x2160 at 60hz and restart the computer.
4. In the nVidia control panel, set the resolution to 3840x2160 at 67hz.
5. Set it to 2560x1440 at 60hz.* It worked!
6. In the nVidia control panel, set it to 12 bit colour/Full range/RGB.
7. Change it to 4k60 using the right click the desktop > Display settings method.
8. Change it back to 2560x1440 at 60hz the same way. It automatically went to 12 bit colour/Full range/RGB.

Now whenever I set the resolution from anywhere - Windows or nVidia - even from 4k - I get the desktop and signal resolution matching at 2560x1440. Setting the 12 bit colour makes it stick: 4k 12 bit 60hz doesn't fit through HDMI 2.0, so it's just a way to ensure that Windows chooses to lower the signal resolution.
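
If you're curious why the 12 bit trick works, here's the same back-of-envelope bandwidth math as earlier (my own sketch; the 4400x2250 and 2720x1481 raster totals are the standard CTA/CVT-RB ones, so treat the numbers as approximate):

[code]
# Rough TMDS clock needed for 12 bit RGB at 60hz vs the HDMI 2.0 limit of 600 MHz.
def tmds_mhz(h_total, v_total, refresh_hz, bits_per_component):
    return h_total * v_total * refresh_hz * bits_per_component / 8 / 1e6

print(f"3840x2160 60hz 12 bit RGB: ~{tmds_mhz(4400, 2250, 60, 12):.0f} MHz")  # way over 600
print(f"2560x1440 60hz 12 bit RGB: ~{tmds_mhz(2720, 1481, 60, 12):.0f} MHz")  # fits easily
[/code]

Since 4k60 at 12 bit RGB can't physically go over the cable, Windows has no choice but to send the lower resolution as the actual signal.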

*For the step marked with an asterisk (step 5), I'm not sure where I set the resolution (Windows or the nV control panel); I think it was the nV control panel. If it doesn't work, but it does work when you follow everything above using Windows for that step, please let everyone know with a reply in this thread. To start over, remove all custom resolutions from the nV control panel, reset the resolutions in CRU.exe, and restart the computer.
 