10 bit/HDR on Windows. Is it Fixed? Nvidia?

On my set (a Samsung KS8000), RGB Full 8-bit and YCbCr 4:2:2 12-bit seem to have roughly the same color space as far as I can tell. At the very least that seems to be the case with all of the color and black matching tests I can find. I usually just leave it on RGB since it defaults to that anyway. If there's a visual difference, it's beyond what my eyes can pick up, or I don't know what I'm supposed to be looking for.

Although if I set mine up as RGB, my TV will always default to the PC input no matter what. Even if I try to swap it to something else, it automatically goes back. The exception is if HDR content is playing; if it is, I can swap to whatever I feel like. If I use YCbCr, I can change settings at will, although I don't really need or want to.

What I did was make my "Game Console" input setting hold my HDR color settings. That way it also cuts down on some of the additional input lag that HDR causes. It's a handy trick, and I can now toggle over to HDR via a single button.
Very similar with my Samsung QE55Q9FN: there is no point using YCbCr, RGB is great no matter what.
I'm lucky with changing the input type: it automatically knows I am using a PC and sets it to "PC" mode, but it stays as "Blu-ray Player" once set. And I don't need to use game mode; even with HDR, the lag is always really low.
I'm not sure if RTings need to update their review, because they use game mode to get the faster lag readings; I don't notice any lag issues.
The great thing about not using game mode is that it allows the use of "HDR+" (fake HDR) on non-HDR games etc., and it looks really good!
HDR+ is greyed out in game mode.

These Sammys are all great displays/TVs.
 
That should be fine. I had to set the TV to Game Console and tweak a few settings on the TV, but now everything is perfect.

SDR content actually looks better now too, I'm so glad this thread came around because now I can enjoy HDR without having to switch settings every time.

Wait lol, what changes did you make that made SDR look better?
 


For me I don't see the difference in HDR between YCbCr and RGB, but I just set it to YCbCr so that, in case RGB compresses anything with 8bpc, I will at least be getting it in full with YCbCr 12bpc. Visually, no, I can't tell lol. Just making sure my TV is getting all the bandwidth.
 
Unless you are using YCbCr 4:4:4, the colours will leak into adjacent pixels due to the chroma compression,
i.e. 4:2:2 carries only half the chroma samples of 4:4:4 (roughly 2/3 of the total bandwidth).
This is most noticeable on a computer desktop; you won't notice it in Blu-ray movies because they are 4:2:0 anyway.

Also, YCbCr uses the limited range for the desktop, the same as RGB limited.
Instead of 0 to 255 it uses 16 to 235 to represent the whole SDR range, so only about 220 of the 256 levels per colour.
This is why I use RGB full for my desktop.

If you can't see a difference, fair enough :)
This is FYI.
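If you want to see roughly what that chroma sharing does to fine desktop detail, here is a small Python sketch (my own illustration of the idea, assuming NumPy; not how any driver actually implements it). It converts a row of RGB pixels to YCbCr, makes each horizontal pair share one chroma sample the way 4:2:2 does, and converts back, so you can see the colour bleed between adjacent pixels, e.g. on the edge of coloured text:

import numpy as np

def rgb_to_ycbcr(rgb):
    # Approximate BT.709 full-range conversion
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycc):
    y, cb, cr = ycc[..., 0], ycc[..., 1], ycc[..., 2]
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return np.stack([r, g, b], axis=-1)

def subsample_422(rgb_row):
    # Each horizontal pair of pixels shares one averaged Cb/Cr sample (4:2:2)
    ycc = rgb_to_ycbcr(rgb_row)
    for i in range(0, len(ycc) - 1, 2):
        ycc[i:i+2, 1:] = ycc[i:i+2, 1:].mean(axis=0)
    return ycbcr_to_rgb(ycc)

# One-pixel-wide red detail on black, like coloured text on a dark desktop:
row = np.array([[0, 0, 0], [1, 0, 0], [0, 0, 0], [1, 0, 0]], dtype=float)
print(np.round(np.clip(subsample_422(row), 0, 1), 2))  # red bleeds into the black pixels

Luma stays per-pixel, so brightness detail survives, but the colour of neighbouring pixels gets mixed, which is exactly the smearing you notice on desktop text.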
 

So why does HDR work then for RGB 8bpc? I thought 8bpc meant there is compression? PC is so weird. I'd rather use RGB honestly, I've just seen some people say YCbCr 4:2:2 is the way to get true HDR.
 
The only reason UHD HDR didn't work with RGB was because either the display or the driver didn't support it.
The reason being there isn't enough bandwidth on HDMI 2.0 to pass UHD HDR10 with full RGB, so the driver must convert 10-bit colour to 8-bit for this scenario. HDMI 2.1 will alleviate this problem. DP 1.4 already does, but not over a DP-to-HDMI adapter.
Our displays can accept UHD HDR with RGB over HDMI and the driver can do it, so it is supported.

10 bits per colour (i.e. HDR10) gives you 1024 gradients of each colour when using full RGB.
But YCbCr uses the limited range, giving approx 880 gradients per colour.
8 bits per colour with full RGB gives you 256 gradients per colour.

If you reduce the number of bits per colour enough, the difference between one gradient and the next can be seen.
For example, the sky is displayed ranging from light to dark blue.
Without enough shades of blue you will see the colours change as bands across the sky.
You can test this by changing from 32-bit colour to 16-bit or 8-bit.

256 shades per colour is very good for SDR, but when used for HDR you may see banding.
I saw it for a moment in a war film (can't remember which) but that's so infrequent I don't care.

If you can't see banding with HDR 10-bit, HDR 12-bit won't give any benefit.
It will not be needed until we get HDR displays with extreme brightness and there is 12-bit HDR media.

Also consider the media you are using.
HDR10 video is 10 bits per colour.
Setting HDR to use 12 bits per colour won't make any difference.
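As a quick back-of-the-envelope check of those gradient counts, here is a small Python sketch (my own arithmetic, not anything from the driver) that prints how many distinct levels each combination of bit depth and range gives:

def levels(bits, limited=False):
    # Distinct levels per colour channel; video (limited) range puts black at 16
    # and white at 235 for 8-bit, with the same proportions at higher depths.
    total = 2 ** bits
    if not limited:
        return total
    black = 16 * (total // 256)
    white = 235 * (total // 256)
    return white - black + 1

for bits in (8, 10, 12):
    print(f"{bits}-bit: full {levels(bits)}, limited {levels(bits, limited=True)}")

This prints 256 / 220 for 8-bit, 1024 / 877 for 10-bit and 4096 / 3505 for 12-bit, which is where the "approx 880" figure above comes from.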
 

Then with everything you've said, it would seem that YCbCr is still the better option to use, since it gives you more gradients per colour.

Nvidia CP only offers 8bpc with RGB, so I can't even use full RGB at 10-bit on my HDMI port. That would mean YCbCr is the closest thing I can get to it then.
 
Also, what is the difference between YCbCr 4:4:4 and 4:2:2? My TV, despite being HDR10 and DV compliant, only has HDMI 2.0 ports, and so does my 1080 Ti. So I guess no HDR10 10bpc RGB for me unless I buy a display with DP 1.4.

I'll assume it's better to use 4:2:2 with 10bpc over 4:4:4 with 8bpc for YCbCr.
 
If you change to a lower res, NVCP will let you use more bits per colour if your display has the ability.
I explained HDMI 2.0 is bandwidth limited with RGB; you won't be given options that cannot be displayed.
RGB has 2 modes, limited or full. Neither saves bandwidth (despite displaying with fewer levels!) so both are available.

Using YCbCr 4:4:4 on the desktop means you may get banding when viewing photos, game graphics etc., because only about 220 of the 256 levels per colour are used.
YCbCr 4:2:2 on the desktop will smear the colours because of the chroma compression, most noticeable on text.

With the limit we have on HDMI 2.0, there needs to be some form of compression to use 10 bits per colour with UHD.
YCbCr 4:2:2 can carry more shades, but there is colour smear across adjacent pixels because of the chroma compression.
4:4:4 is uncompressed. 4:2:2 halves the chroma samples, which frees enough bandwidth for 10-bit or 12-bit colour.
4:2:0 halves the chroma again; it is how films fit on BD discs, and why 30fps UHD Blu-ray films can be displayed over an HDMI 1.4 capable cable.
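To put rough numbers on that HDMI 2.0 limit, here is a small Python sketch (my own ballpark arithmetic; real HDMI signalling has extra overhead, so treat the figures as approximate). It compares the data rate of UHD@60 at different bit depths and subsampling modes against the ~14.4 Gbit/s of video data HDMI 2.0 can carry after its 8b/10b encoding:

# UHD@60 with CTA-861 blanking is roughly a 594 MHz pixel clock
PIXEL_CLOCK = 4400 * 2250 * 60
HDMI20_USABLE = 14.4e9  # bits per second of video data on HDMI 2.0

SAMPLES_PER_PIXEL = {"4:4:4": 3, "4:2:2": 2, "4:2:0": 1.5}

for sampling, samples in SAMPLES_PER_PIXEL.items():
    for bits in (8, 10, 12):
        rate = PIXEL_CLOCK * samples * bits
        verdict = "fits" if rate <= HDMI20_USABLE else "too much"
        print(f"{sampling} {bits}-bit: {rate / 1e9:5.1f} Gbit/s -> {verdict}")

RGB/4:4:4 at 10-bit or 12-bit comes out well over the limit, which is why the driver only offers 8bpc RGB at UHD@60, while 4:2:2 (and 4:2:0) leave room for 10/12-bit.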
 

So for my situation, given the limits, what is the best option to use?

RGB full with a limit of 8bpc (I'll assume this is effectively 4:4:4?)
YCbCr 4:2:2 with 12/10bpc?
I won't drop my resolution or refresh rate for YCbCr 4:4:4. Nvidia custom res doesn't give me an option for a lower-refresh 4K RGB mode with 10bpc.
 
The best is YCbCr 4:2:2 for HDR and RGB full for desktop.
But if you can't be bothered changing all the time, see how you get on with RGB in HDR, i.e. leave it on RGB full all the time.
If you don't see any banding then it's a win.
That's what I do.
 

So even though you get more gradients with YCbCr in HDR, RGB would be better for it? Btw, what is banding?
 
I have mine on RGB now; I think, all in all, that is the best option for not having to switch settings all the time.

The Windows desktop actually looks even better than before (when I was using SDR); it is less saturated, but after I got used to it I think it's more natural.

The other settings like 4:2:2 ruined text on the desktop. I do notice some banding in SDR games now, but it's honestly not that bad and I think overall it's better.
 
I explained both of those but I added a few bits along the way.
Read my posts not the quoted bits in your posts.
 
Exactly.
Note you can change how saturated SDR looks with a setting on the display, natural is better.
edo101 found the setting on his display.
 

I have. You only mention the concept of banding; you don't explain what it is. You say when it can occur, but not what it is. I want to know what it is so I know what to look out for.

And thank you for the detailed yet easy to understand explanations btw.
 
np :)

In post 87 I said this
If you reduce the number of bits per colour enough, the difference between one gradient and the next can be seen.
For example, the sky is displayed ranging from light to dark blue.
Without enough shades of blue you will see the colours change as bands across the sky.
You can test this by changing from 32-bit colour to 16-bit or 8-bit.
 

Ah, that's what it is. Yeah, I'd have to look for that then. So far, in the little time I've had, no difference.

I actually haven't seen a difference between RGB and YCbCr when it comes to text.
My computer is in HDR mode, even on the desktop.



For MKVs, what player and codec do you use to get HDR?
 
MPC-BE with MadVR.
MadVR is needed because it is about the only way to get HDR metadata from the video and pass it through to the display.
It's also a great renderer.
You need to find a good guide on setting it up.
 

And in order for it to display HDR in MPC, the desktop would need to be in HDR mode, yes?
 
I'm not sure with Windows 10; I use it on Windows 7, which doesn't have a Windows HDR mode.
It's easy enough to try both ways.
 

Right. Well, you have given me plenty to look at and tinker with when I have time. Thank you very much.
 
Banding is when colors that should have a smooth transition between them look segmented, where there is a noticeable jump from one shade to the next.

It is similar to what you get if you use the posterize filter in Photoshop (if you use that). See this image:

Colour_banding_example01.png


The image on the left is the result of banding, though in actual use it will not be so noticeable.
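If you want to reproduce that effect yourself, here is a small Python sketch (my own example, assuming you have NumPy and Pillow installed). It builds a smooth horizontal gradient and then quantises it to a handful of levels, which is essentially what banding is:

import numpy as np
from PIL import Image

width, height = 1024, 200
ramp = np.linspace(0.0, 1.0, width)           # smooth 0..1 gradient
smooth = np.tile(ramp, (height, 1))

levels = 16                                   # pretend we only have 4 bits per colour
banded = np.round(smooth * (levels - 1)) / (levels - 1)

def to_image(a):
    return Image.fromarray((a * 255).astype(np.uint8), mode="L")

to_image(smooth).save("gradient_smooth.png")
to_image(banded).save("gradient_banded.png")  # the visible steps are banding

The fewer levels you allow, the wider and more obvious the bands become; with 256 or 1024 levels they are usually invisible, which is the whole point of higher bit depths.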
 
Thank you!


What is dithering? Judder?
 
As Cybereality said, it's a way to fake a colour that the display cannot show natively.
Say you want an orange colour to be a bit more orangey, but the colour palette isn't large enough (not enough bits per colour) to handle more native colours.
There are 2 general ways of mixing colours (dithering) to change the shade:
1) adjacent pixels can be used to blend other colours in, but those affect the pixels around them.
2) flicker between different colours on the same pixel. The length of time each colour is displayed will vary the final colour.
The flicker needs to be very fast to not be noticed, but some people still see it.
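For the first kind (spatial dithering), here is a tiny Python sketch of the idea (a simplified 1-D error diffusion of my own, not what any GPU or display actually does): each value is snapped to the nearest level the display can show, and the rounding error is pushed onto the next pixel, so the average over an area still matches the intended shade:

import numpy as np

def dither_1d(values, levels):
    out = np.empty_like(values)
    error = 0.0
    for i, v in enumerate(values):
        target = v + error                        # add the error carried from the left
        q = round(target * (levels - 1)) / (levels - 1)
        q = min(1.0, max(0.0, q))                 # clamp to the displayable range
        out[i] = q
        error = target - q                        # carry the new rounding error
    return out

ramp = np.linspace(0.0, 1.0, 32)
coarse = np.round(ramp * 3) / 3                   # plain quantisation to 4 levels: bands
dithered = dither_1d(ramp, 4)                     # same 4 levels, error-diffused
print("banded  :", np.round(coarse, 2))
print("dithered:", np.round(dithered, 2))

The dithered row still only uses 4 levels, but they alternate so that neighbouring pixels average out close to the original shade, which is why dithering hides banding at the cost of a fine noise/flicker pattern.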
 
I have a Samsung C32HG70 with the latest firmware from Samsung, on the Windows 10 1809 (fall 2018) update.
Nvidia settings are RGB, 10-bit, running over DisplayPort to my 1080 Ti, and I did the DP 1.3/1.4 firmware update.

I can't get any games to play in HDR mode without being washed out. The display OSD says HDR is enabled.

The only thing that works in HDR is HDR YouTube videos in Chrome.

Any advice?

Edit: Other troubleshooting I tried: YCbCr 4:4:4, 4:2:2, 8-bit, and even HDMI instead of DP. None worked.
 

With my Samsung CHG70 27-inch, the desktop needed to be at the same resolution and frequency as the game being played. Plus, with Nvidia, anything other than native 144Hz would give a pseudo HDR10 using dithering. The AMD Windows desktop in HDR looks much better than Nvidia ever did, not sure why. Anyway, I put the monitor back on the Vegas due to much better FreeSync + HDR support, and it just works way better overall. I played Shadow of the Tomb Raider with the 1080 Tis in HDR - a pure awesome experience.

So in a nutshell, for Nvidia and Samsung:
  • Game and desktop at the same resolution and Hz
  • Enable 10-bit in the Nvidia drivers for the monitor - if you see dithering anywhere on the selection then mess with it more until it is fixed (it will suck otherwise)
  • For desktop use and most programs, turn off HDR - HDR videos will look good though
  • Use DisplayPort
  • You may have to turn off the monitor's FreeSync for HDR to work?
I have no idea if Nvidia supports FreeSync 2, i.e. FreeSync and HDR at the same time, since I moved the monitor back to the Vegas. The Vegas do this with BF5, FC5 and Shadow of the Tomb Raider; it just works with FreeSync and HDR without issue.
 


Thanks for the reply. I have gotten FreeSync 2 to work with the latest Nvidia drivers; only Hunt: Showdown has flickering. I'll try disabling FreeSync on the monitor to see if that helps get HDR to work. Other than that, I see dithering on occasion even with 10-bit HDR, and this is at 144Hz and native 1440p screen res. I forgot to mention that I do run a DVI-DL 1440p 60Hz monitor as a second screen.
 
Interesting, I did not try with two monitors. You might want to turn the 60Hz one off in Windows and see if that gives you better HDR ability for gaming. I was only able to get HDR10 without dithering at native resolution and refresh rate, and this was before Nvidia supported FreeSync (adaptive sync G-Sync). I may run a cable from the 1080 Tis over to the Samsung monitor and test, except it would be a longer cable, which can have its own problems. One of FreeSync 2's abilities is to run FreeSync and HDR at the same time; a lot more data goes to the monitor with HDR. It will be great if Nvidia supports adaptive sync and HDR at the same time, since only the $1600+ monitors with G-Sync do that.
 
Won't HDMI 2.1 alleviate many of these bandwidth issues? I know that's not exactly comforting right now, but I'm more willing to accept a huge format shift over proprietary tech.
 
Hello partners.
First of all, excuse my English, I'm not a native speaker.
I have an NVIDIA GTX 970 and an LG monitor with FreeSync 2 / G-Sync Compatible, and since I see here that some of you have doubts, I have decided to share what I have learned and checked for myself, in case it can help someone. Many of you will surely have more experience and knowledge, but here is my grain of sand:

- The first thing I had to do was update the NVIDIA card's firmware so that it is compatible with the newer DisplayPort and HDMI standards. (Note to NVIDIA: the upgrade tool could be a little less hidden on the download page. It would be appreciated, since a lot of people do not know about this tool, so they neither look for it nor find it, and it is an essential step.)

- The GTX 970, being a model older than the GTX 1050, cannot be used with G-Sync Compatible, although "pure" G-Sync works, because the "friends" at NVIDIA have assumed that we all want to send a 4K signal at a minimum of 60fps with 32 bits of color information and the full 0-255 range of color tones (which can prevent banding and other unwanted effects on image quality, such as the typical blacks that look grey, etc.). G-Sync Compatible requires a card (GTX 1050 onwards), cable and monitor that all support DisplayPort 1.4. With resolutions lower than 4K, even at 144Hz, the signal "fits" without problem, as you can easily calculate. This limitation, which they have locked down in the drivers exaggeratedly hard to avoid hacks, I suppose is only due to sales policy, nothing more. (Another slap on the wrist for NVIDIA.)
As for using normal FreeSync or FreeSync 2, the card has to be from AMD, no two ways about it.
My monitor supports DisplayPort 1.4 and HDMI 2.0. I have a DP 1.4 cable and an HDMI 2.1 cable.
G-Sync Compatible has been impossible for me to activate, since once again the "bright friends at the green company" do not allow a 1080p @ 144Hz @ 10bpc signal in the "Full" color range to pass (as I understand it, HDR10 does not need compression to "fit" in the bandwidth offered by DP 1.2 at that resolution/Hz/bpc; see the rough check below). So we just have to wait for NVIDIA to reconsider and remove the restriction at 1080p and above (technically there is no obstacle), which, knowing them, I really doubt they will do in an upcoming driver/fix. If they allowed it, it would be enough to go to the NVIDIA Control Panel and activate the full range, 32 bits and, most importantly, 10bpc. It is not necessary to reach 12bpc, since RGB theoretically gives you exactly the same quality as YCbCr 4:4:4 @ 12bpc, as there is no compression of any kind; and keep in mind that Windows is originally designed for RGB and not for YCbCr, so I prefer RGB / 10bpc / 32 bits / Full. I can reach 12bpc, but it really is a waste of bandwidth that may be needed for Hz or more screen resolution, for example.
As I said, I do not have an AMD card and I cannot talk about FreeSync, since the NVIDIA drivers insist on pushing us through their closed, proprietary technologies to earn more money. (On the one hand they think that is best for them, because they aren't an NGO, but they could be a little less greedy and friendlier to those of us who feed them; in the long run they would earn more customers/money, in my modest and ignorant opinion, of course.) At the risk of boring you a little more, I will tell you about my experience with HDR10.
The monitor is capable of displaying HDR10, but with NVIDIA over DP 1.2 I found it impossible to activate it in Windows because of the infamous and unnecessary NVIDIA caps, and activating it there is an essential requirement to be able to enjoy non-emulated HDR10 (beware of this, since although both look good, the difference is remarkable). Now, when connecting an HDMI 2.1 (8K) cable (a 2.0 cable should be sufficient) to the GTX 970 (which, I remind you, is compatible with the newer DP 1.2 and HDMI 2.0/2.1 standards, but only if the firmware is updated with the tool mentioned above), I was able to activate HDR in Windows without any problem (real and emulated too, according to the picture mode selected on the monitor), both on the Windows desktop and in the games that have an HDR option in their menus.
Zero problems, no setup complications, and the image looks glorious.

I hope I could shed some light, since this could be a plug & play matter, but Windows and NVIDIA each want to drag us into their own garden. Allow me to keep my personal opinion about these practices/policies of the companies to myself; if I wrote it down here, I think the phone in my hands would melt from the heat produced by my opinion on this type of issue.
Greetings to all.

P.S. I am a bit tied up, but I will drop by here in case someone needs/wants me to expand on the information, within my modest knowledge.

I would also be very grateful if someone notices that I have made a mistake in the data I have presented and corrects me, with data that is 100% verifiable and not rumours or opinions taken from some site without demonstrable grounds.

Sorry for the wall of words. ;)
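That 1080p @ 144Hz @ 10bpc claim is easy to sanity-check with rough arithmetic. Here is a small Python sketch (my own ballpark figures; real DisplayPort timings add blanking and 8b/10b overhead, which is why it compares against the ~17.28 Gbit/s payload rate of four HBR2 lanes):

# Rough check: does 1080p @ 144Hz @ 10bpc full RGB fit in DisplayPort 1.2 (HBR2)?
DP12_PAYLOAD = 17.28e9                 # bits/s usable on 4 lanes of HBR2 after 8b/10b

active_pixels = 1920 * 1080
with_blanking = active_pixels * 1.12   # rough ~12% allowance for reduced-blanking timings

rate = with_blanking * 144 * 3 * 10    # pixels/s * 3 channels * 10 bits
print(f"{rate / 1e9:.1f} Gbit/s needed vs {DP12_PAYLOAD / 1e9:.2f} Gbit/s available")

That comes out around 10 Gbit/s, so as far as raw bandwidth goes, 1080p144 with 10-bit full RGB fits comfortably within DP 1.2.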
 

I struggled to find the problem you are having among the text wall.
It would help to start with that.
 
Thank you for your interest, but the only "problem" is that I don't have an RTX 2080 Ti. ;D ;D If that can be called a problem.
As I indicated at the beginning of the wall, my only intention is to share my experience and help anyone who may need it, that's all. Cheers.
 
LONG-ASS-POST.

You could have just summarized by saying: for Maxwell 2.0 cards, your video output options are:

DisplayPort 1.2 does not support the HDR color space (although you CAN run it at 10-bit in the normal color space).

HDMI 2.0 DOES support the HDR color space.

Nvidia decided not to back-port FreeSync support to the less capable Maxwell generation (for reasons like those you encountered). See how much easier that is to read :D
 
Thanks again for the suggestion, but given the doubts and questions I have seen throughout the thread, I consciously decided to express myself and detail the points as seemed best to ME, since I believe words are free here and not everyone is at the same level; some people need more concrete details than others, even though the end result is the same. What I do not see as necessary is to dwell on this point of no real importance, since it contributes nothing to the subject that actually occupies the thread. Thank you nonetheless for your advice; I will consider it in future as far as it is worth. Best regards, unknown and esteemed friend. ;)
 