So I have an Asus Tuf Gaming edition 3080 hooked up to my OLED TV, a Sony A8G.
The TV doesn't support HDMI 2.1, but fortunately for me that's not a dealbreaker: I want 120 fps gaming, and the card is really only capable of that at 1440p with the nicer graphics settings in modern games.
We know the bandwidth limitations of HDMI 2.0: it's stated as 18.0 Gbit/s total, of which 14.4 Gbit/s is available for video data (because of 8b/10b encoding).
I don't know the exact math behind the published figures, but as a sanity check: 4K60 is 8,294,400 pixels per frame (3840x2160), 60 times a second is 497,664,000 pixels per second, and at 24 bits per pixel (8-bit RGB) that's 11,943,936,000 bps, reasonably close to the 12.54 Gbit/s stated on Wikipedia for 4K60 8-bit colour. The gap is the blanking intervals, which the signal also has to carry.
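The rough check above can be sketched as a one-liner; this counts active pixels only, so it lands a few percent under the figures Wikipedia lists with blanking included:

```python
def data_rate_gbps(width, height, refresh_hz, bits_per_channel):
    # Active pixels only; RGB / 4:4:4 carries 3 channels per pixel.
    # A real TMDS signal also transmits blanking intervals on top of this.
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9

print(round(data_rate_gbps(3840, 2160, 60, 8), 2))    # → 11.94 (vs 12.54 with blanking)
print(round(data_rate_gbps(2560, 1440, 120, 10), 2))  # → 13.27 (vs 14.49 with blanking)
```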
Now, Wikipedia states that 1440p at 120 Hz with 10-bit colour (which I really want over 8-bit because of banding) requires 14.49 Gbit/s. We have 14.40. The page says you need to use 4:2:2 chroma subsampling to stay within the available bandwidth. I've tried to set that, but the colour depth isn't taking: 4:2:2 itself applies successfully, but afterward the drop-down menu for colour depth only offers 8-bit.
I really want 4:4:4 for clear text, and I found there's a way to trim the video bandwidth a bit with a timing mode called "CVT reduced blanking" (CVT-RB). After some math, I found it only takes 97.11% of the bandwidth of the default mode, bringing the rate down to 14.07 Gbit/s for 10-bit colour.
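The blanking-inclusive rate is just the pixel clock (total pixels, including blanking) times bits per pixel. A minimal sketch — the 2640x1525 totals below are an assumed reduced-blanking timing that happens to reproduce Wikipedia's 14.49 Gbit/s figure; plug in the actual default and reduced-blanking totals from your custom-resolution dialog to reproduce the 97.11% comparison:

```python
def signal_rate_gbps(h_total, v_total, refresh_hz, bits_per_channel):
    # Pixel clock (active + blanking) times bits per pixel (RGB = 3 channels).
    pixel_clock_hz = h_total * v_total * refresh_hz
    return pixel_clock_hz * bits_per_channel * 3 / 1e9

# 2560x1440 @ 120 Hz with assumed timing totals of 2640x1525, 10-bit RGB:
print(round(signal_rate_gbps(2640, 1525, 120, 10), 2))  # → 14.49
```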
I made a custom resolution of 2560x1440 @ 120 Hz with the reduced-blanking CVT timing, and it runs in 8-bit colour no problem (the normal timing works too), but when I go to change the bit depth to 10, the drop-down menu still only contains the 8-bit option. It's extremely frustrating, because even with 4:2:2 and 4:2:0, where the bandwidth requirement is far under what's available, I still can't select 10-bit. It will do 10-bit and even 12-bit at 30 Hz, but the problem is the "active signal resolution" forcing itself to 4K while the selected "desktop resolution" is 1440p.

I want the TV to scale the video signal because it does a much better job than the card. The higher-end Sonys are known for having the best upscaling: when I feed it a 1080p or 1440p signal with the "active signal resolution" actually matching the desktop resolution, the TV almost looks like a native 1080p or 1440p display. Fonts are sharp and not overly bold. It's likely noticeably better than LG's TVs, which would make my A8G a better display for high-frame-rate gaming with a 3080 than even LG's CX, as long as I can get these colour settings to take. Otherwise the TV is just a bit sharper, but with more banding.
I should add that I can't even select 10- or 12-bit colour at 1080p 120 Hz, so I have a feeling this isn't bandwidth related. The TV supports 1080p 60 Hz at both 10- and 12-bit colour 4:4:4 (SDR and HDR), but I can't get 1440p 60 Hz to do 10- or 12-bit colour because Windows forces the "active signal resolution" to 3840x2160, and HDMI 2.0 can't do 4K60 at anything above 8-bit colour.
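To back up the "this isn't bandwidth" hunch, here's a quick sketch of which mode/depth combinations fit under the 14.4 Gbit/s video budget (active pixels only, so treat near-limit results as borderline once blanking is added):

```python
HDMI20_VIDEO_BUDGET_GBPS = 14.4

def fits_hdmi20(width, height, hz, bpc, subsampling="4:4:4"):
    # Average samples per pixel: 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5.
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    # Active pixels only -- a real signal needs a few percent more for blanking.
    gbps = width * height * hz * bpc * samples / 1e9
    return gbps < HDMI20_VIDEO_BUDGET_GBPS

print(fits_hdmi20(1920, 1080, 120, 12))           # True: ~8.96 Gbit/s, well under
print(fits_hdmi20(2560, 1440, 120, 10, "4:2:2"))  # True: ~8.85 Gbit/s, well under
print(fits_hdmi20(3840, 2160, 60, 10))            # False: ~14.93 Gbit/s, over even before blanking
```

So 1080p120 at 12-bit and 1440p120 at 10-bit 4:2:2 both fit comfortably, which is exactly why the greyed-out menu looks like a driver or signal-resolution problem rather than a bandwidth one.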
So, does anyone know why those menu options aren't available and how to go about enabling them? Is this a driver issue? I really, really don't want 8-bit HDR.
And how do I make the active signal resolution in Windows 10 always match the chosen desktop resolution? On all my other systems the two have always been the same. It matters because of the poor built-in scaling the 3080 seems to have (at least compared to my TV), and because it makes 10/12-bit colour unavailable even at 1440p 60 Hz, since the signal resolution becomes 3840x2160 (beyond the HDMI 2.0 spec).