AMD vs nVidia Colors - Concrete proof of differences

Quality of Rendering, who does it better? AMD or NVIDIA?


  • Total voters
    61
For those seeking more info on this subject, just google “NVIDIA vs AMD Color Quality” and “NVIDIA vs AMD Image Quality”. Read every thread, forum, website, and comment you’re able to find.
After reading every thread and comment, there's nothing left to do but take the old trusty shotgun and shoot oneself in the...

To truly put an end to this 30-year-old doubt/argument once and for all, we’re gonna need side-by-side comparison videos and screenshots between an RTX 4090 & 7900 XTX on a modern 4K OLED monitor with the same DP/HDMI cable, same monitor, same monitor settings, same driver, same driver settings, same game, and same in-game settings. Let’s see if there are still any noticeable visual differences after all.
I'd take proper measurements using a color probe. It's possible to show, without posting pictures, that e.g. one GPU renders things differently than the other (e.g. shows banding at settings where the other GPU still looks fine), or that the gamut/gamma is the same, etc.

The only thing that cannot be proven this way is the existence or quality of any additional filters like sharpening - though fortunately no one is claiming either vendor applies such filters, so those comparisons aren't needed.
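
A minimal sketch of the kind of probe-based comparison described above, assuming you have already exported L*a*b* readings for the same test patches from both cards. The patch values below are made-up placeholders, not real measurements:

Code:
# Sketch of a probe-based comparison: same test patches measured on the same
# monitor, once with each GPU driving it. The Lab values are made-up
# placeholders, not real measurements.
import math

def delta_e76(lab1, lab2):
    # CIE76 color difference: Euclidean distance in L*a*b* space
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# (patch, Lab measured with GPU A, Lab measured with GPU B)
patches = [
    ("white",    (96.1,  0.2, -0.4), (96.0,  0.3, -0.5)),
    ("mid gray", (53.4,  0.1,  0.2), (53.5,  0.1,  0.1)),
    ("red",      (44.2, 67.9, 54.1), (44.3, 67.8, 54.0)),
]

for name, a, b in patches:
    # dE below ~1.0 is generally considered invisible to the naked eye
    print(f"{name:8s} dE76 = {delta_e76(a, b):.2f}")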
 
After reading every thread and comment, there's nothing left to do but take the old trusty shotgun and shoot oneself in the...


I'd take proper measurements using a color probe. It's possible to show, without posting pictures, that e.g. one GPU renders things differently than the other (e.g. shows banding at settings where the other GPU still looks fine), or that the gamut/gamma is the same, etc.

The only thing that cannot be proven this way is the existence or quality of any additional filters like sharpening - though fortunately no one is claiming either vendor applies such filters, so those comparisons aren't needed.
I don't have a colorimeter/calibration device nor an AMD card with me to conduct tests. Did you find any difference from testing both vendors (same settings) on your monitor?


Tech Yes City did a comparison years ago and didn't find much of a noticeable difference, except in one game.
 
For those seeking more info on this subject, just google “NVIDIA vs AMD Color Quality” and “NVIDIA vs AMD Image Quality”. Read every thread, forum, website, and comment you’re able to find.

To truly put an end to this 30-year-old doubt/argument once and for all, we’re gonna need side-by-side comparison videos and screenshots between an RTX 4090 & 7900 XTX on a modern 4K OLED monitor with the same DP/HDMI cable, same monitor, same monitor settings, same driver, same driver settings, same game, and same in-game settings. Let’s see if there are still any noticeable visual differences after all.
What about the QD-LED displays these days?



https://www.flatpanelshd.com/news.php?subaction=showfull&id=1501832557
 
After reading every thread and comment, there's nothing left to do but take the old trusty shotgun and shoot oneself in the...


I'd take proper measurements using a color probe. It's possible to show, without posting pictures, that e.g. one GPU renders things differently than the other (e.g. shows banding at settings where the other GPU still looks fine), or that the gamut/gamma is the same, etc.

The only thing that cannot be proven this way is the existence or quality of any additional filters like sharpening - though fortunately no one is claiming either vendor applies such filters, so those comparisons aren't needed.
Games can have different optimizations or code paths for different GPUs, which can introduce differences. But this is all digital, all math and logic: the calculations per pixel should be the same. If AMD's or Nvidia's compression methods (for color, for example) are not lossless, then some errors can be introduced. Mipmap and LOD selection can also differ, which changes which texture resolutions get used.

In a nutshell: very doubtful, and at most insignificant. I cannot tell a difference, even on an OLED, provided I can get gamma, color saturation, and brightness/contrast matched (those can also be slightly different out of the box). What I can tell is that Nvidia does VRR on my Sony TV and AMD does not, which is a significant difference in gaming experience.
 
Games can have different optimizations or code paths for different GPUs, which can introduce differences. But this is all digital, all math and logic: the calculations per pixel should be the same. If AMD's or Nvidia's compression methods (for color, for example) are not lossless, then some errors can be introduced. Mipmap and LOD selection can also differ, which changes which texture resolutions get used.

In a nutshell: very doubtful, and at most insignificant. I cannot tell a difference, even on an OLED, provided I can get gamma, color saturation, and brightness/contrast matched (those can also be slightly different out of the box). What I can tell is that Nvidia does VRR on my Sony TV and AMD does not, which is a significant difference in gaming experience.

That is extremely odd that Nvidia can do VRR and AMD cannot. Are you speaking of VRR only, no Gsync or Freesync on that TV?
 
That is extremely odd that Nvidia can do VRR and AMD cannot. Are you speaking of VRR only, no Gsync or Freesync on that TV?
Yes, Sony only supports VRR and not GSync or Freesync officially. I and others at the AMD forums had this same issue. Nvidia just works in this case.
 
Yes, Sony only supports VRR and not GSync or Freesync officially. I and others at the AMD forums had this same issue. Nvidia just works in this case.

I know you are right, but AMD does support VRR as well; just not on that TV, apparently.
 
Yes, Sony only supports VRR and not GSync or Freesync officially. I and others at the AMD forums had this same issue. Nvidia just works in this case.
Has Nvidia added dithering for old 8-bit + FRC displays in recent drivers? I bought an old plasma TV for testing, and I still think it will band like hell.
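
For context on what driver-level dithering buys you, here's a purely illustrative sketch (generic noise dithering, not NVIDIA's or AMD's actual algorithm) of why truncating to 8 bits without it produces visible bands:

Code:
# Illustration only - generic noise dithering, not NVIDIA's or AMD's actual
# algorithm. Quantizing a smooth ramp straight to 8 bits leaves wide flat
# bands; adding sub-LSB noise before rounding breaks those bands up.
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 1.0, 3840)                    # ideal smooth gradient
plain = np.round(ramp * 255).astype(np.uint8)         # straight 8-bit quantize
noise = rng.uniform(-0.5, 0.5, size=ramp.shape)       # +/- half an LSB
dithered = np.clip(np.round(ramp * 255 + noise), 0, 255).astype(np.uint8)

def widest_flat_run(values):
    # length of the longest run of identical adjacent values (a visible band)
    changes = np.flatnonzero(np.diff(values.astype(int)) != 0)
    edges = np.concatenate(([-1], changes, [values.size - 1]))
    return int(np.max(np.diff(edges)))

print("widest band, plain quantize:", widest_flat_run(plain))     # ~15 px steps
print("widest band, with dithering:", widest_flat_run(dithered))  # much shorter

Real drivers use spatial or temporal patterns rather than plain random noise, but the effect on banding is the same idea.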
 
Has Nvidia added dithering for old 8-bit + FRC displays in recent drivers? I bought an old plasma TV for testing, and I still think it will band like hell.
It was cool that with the nVidia Linux control panel or an X.Org config you could disable the dithering and see the real difference. Fascinating
 
Oh god, this again? I really thought this crap had died off. I remember doing a post on [H] probably a decade ago where I did measurements with my colorimeter, took screenshots and did differences, etc. For some reason AMD fans STILL insisted it was wrong and AMD looked better.

I'm not sure what it is: If it is entirely placebo/copium or if some people just configure one card wrong and decide that means the other looks better. I lean towards the placebo/copium because I remember I posted text rendering screenshots and did a difference in an image editor. They came out completely black, meaning no difference, on the text, but a couple people still swore the AMD screenshot looked better.

I don't have the energy to do something like it again, and it is clear people won't listen. I guess some fans just want to believe that their GPU has some magic pixie juice that somehow makes colors look better on a digital display, even though that is not how any of this shit works.
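
For anyone who wants to repeat the screenshot half of that test, here's a rough sketch of the difference check described above. The file names are placeholders; the captures need to be lossless and taken at identical settings:

Code:
# Rough version of the screenshot half of that test: difference two lossless
# captures of the same frame, one per card, at identical settings. File names
# are placeholders. Identical digital output -> an all-black difference image.
from PIL import Image, ImageChops

a = Image.open("capture_nvidia.png").convert("RGB")
b = Image.open("capture_amd.png").convert("RGB")

diff = ImageChops.difference(a, b)
bbox = diff.getbbox()          # None means every single pixel matched

if bbox is None:
    print("Pixel-identical captures - no difference at all.")
else:
    print("Differences inside region", bbox)
    print("Per-channel (min, max) deltas:", diff.getextrema())
    diff.save("difference.png")  # mostly black; bright pixels mark mismatches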
 
I have my Nvidia RTX 4080 plugged into an $800 4K SS-IPS 144 Hz monitor with 98% DCI-P3 coverage.

I have my AMD 7900 XTX plugged into a $300 1440p bargain-bin ASUS monitor with horizontal stuck pixel rows all over it.

Therefore AMD sucks, Nvidia rulz.
 
It was cool that with the nVidia Linux control panel or an X.Org config you could disable the dithering and see the real difference. Fascinating
As far as I remember, a toggleable 8 bpc dithering option exists exclusively in Nvidia's Linux and Quadro drivers. I was asking about the normal Game Ready driver on Windows 10/11; they may have already added it silently, like some overlooked essential features in the past.

Oh god, this again? I really thought this crap had died off. I remember doing a post on [H] probably a decade ago where I did measurements with my colorimeter, took screenshots and did differences, etc. For some reason AMD fans STILL insisted it was wrong and AMD looked better.

I'm not sure what it is: If it is entirely placebo/copium or if some people just configure one card wrong and decide that means the other looks better. I lean towards the placebo/copium because I remember I posted text rendering screenshots and did a difference in an image editor. They came out completely black, meaning no difference, on the text, but a couple people still swore the AMD screenshot looked better.

I don't have the energy to do something like it again, and it is clear people won't listen. I guess some fans just want to believe that their GPU has some magic pixie juice that somehow makes colors look better on a digital display, even though that is not how any of this shit works.
Either it's always been that way, or this reviewer and others who said otherwise probably need to re-check their eyesight.


View: https://www.youtube.com/watch?v=JPtFfTIoTuI

View: https://www.youtube.com/watch?v=FVb25eomcrI

View: https://www.youtube.com/watch?v=iRJNrSEb6oI
 
As far as I remember, a toggleable 8 bpc dithering option exists exclusively in Nvidia's Linux and Quadro drivers. I was asking about the normal Game Ready driver on Windows 10/11; they may have already added it silently, like some overlooked essential features in the past.


Either it's always been that way, or this reviewer and others who said otherwise probably need to re-check their eyesight.


View: https://www.youtube.com/watch?v=JPtFfTIoTuI

View: https://www.youtube.com/watch?v=FVb25eomcrI

View: https://www.youtube.com/watch?v=iRJNrSEb6oI

Are there any registry hacks?
 
Either it's always been that way, or this reviewer and others who said otherwise probably need to re-check their eyesight.
While the limited vs full range thing is a suspect for people who actually do see differences, I think most often they don't and they just think they do, but the rumor persists. It started back in the VGA days with the claim that AMD had better converters, and I dunno, maybe that was true. But people claimed it in the DVI days as well. As I said, I went and tested it, even though I knew the results, since I had both an nVidia card (in my desktop) and an AMD card (in my laptop), as well as a professional NEC monitor and an i1 colorimeter. I just tested with color patches, because that's what a colorimeter is designed to do, and also because a game can very well be rendered differently on different cards depending on how it is set up (some actually have multiple render paths; iD used to do that a lot). Of course the result was that there was no difference to within the accuracy of the meter, all differences being under about 0.3 dE.

Didn't matter; I had people argue that they could see something that small, that I was doing it wrong, etc. It just seems to be an article of faith to some people.
 
In my experience, there is a difference with how they both display their default color values. Of course, both have good options to tweak them to your taste. And you can also use your monitor's own settings to tweak color values, as well.

In general, at default driver color settings on the same monitor: AMD has truer whites. Which overall, looks 'better' to me, with more depth. Whereas Nvidia has more green/yellow in their whites. And having an inaccurate white makes the image look more 'flat', for me.
However, Nvidia has more subtlety in reds/purples.

As far as visual rendering goes, there absolutely are differences in some games. However, it's pretty rare that they are large, and most of them wouldn't be noticeable without pixel-perfect side-by-side comparisons.
 
Has Nvidia added dithering for old 8-bit + FRC displays in recent drivers? I bought an old plasma TV for testing, and I still think it will band like hell.
Don’t know; they used to have the setting in the past. Also, AMD has supported VRR on my Sony TV for some time now.
 
While the limited vs full range thing is a suspect for people who actually do see differences, I think most often they don't and they just think they do, but the rumor persists. It started back in the VGA days with the claim that AMD had better converters, and I dunno, maybe that was true. But people claimed it in the DVI days as well. As I said, I went and tested it, even though I knew the results, since I had both an nVidia card (in my desktop) and an AMD card (in my laptop), as well as a professional NEC monitor and an i1 colorimeter. I just tested with color patches, because that's what a colorimeter is designed to do, and also because a game can very well be rendered differently on different cards depending on how it is set up (some actually have multiple render paths; iD used to do that a lot). Of course the result was that there was no difference to within the accuracy of the meter, all differences being under about 0.3 dE.

Didn't matter; I had people argue that they could see something that small, that I was doing it wrong, etc. It just seems to be an article of faith to some people.
The defaulting-to-limited-range issue still persists to this day, I think, and it's not like limiting the range gains any extra frames; many users have been complaining about this for years, so I don't know why they haven't fixed it already. From what I've seen in Rage3D threads back in the day, most people pointed to the DAC quality of each card as the reason, which might only have been true until modern digital ports like DP and HDMI arrived. Some people said it was Nvidia's heavier use of compression and lower default graphical settings in the driver, compared to other vendors, that made it apparent during the 2D era. AnandTech also pointed out that CRTs and some old LCDs are quite sensitive to the analog VGA/DVI/S-Video output of each card, hence the different results. Later they said it was the lack of dithering that caused massive banding, washed-out gradients, and other downsides that kept the picture from looking as polished as AMD's.

My conclusion is that this traditional image quality question will never truly be settled, since Nvidia, AMD, and developers have already moved on to what they deem most important for 3D rendering, such as AI, DLSS, FSR, RT, and PT.
 
Sure, there was a time 20 years ago that ATI had superior image quality and anti-aliasing methods. Not sure that is still true.
 
I've owned every single generation of ATi/AMD and Nvidia graphics cards since the 90s except Nvidia's 4000 series. ATI/AMD has had better image quality/colors ever since Nvidia's 5800 Ultra, because Nvidia started cutting corners on image quality in that generation. Numerous times they have been busted cheating on benchmarks or sacrificing image quality to win on FPS. It's always been known that Nvidia's anisotropic filtering is inferior to ATI/AMD's. The introduction of Pascal's "lossless" delta color compression has ruined IQ even more; I know AMD uses it as well, but theirs is actually lossless like it should be. I have been playing in 4K since 2013, and I sometimes play games with 10-bit color, which looks amazing on my AMD cards and which I can't reproduce on Nvidia cards with GeForce drivers. The business about Nvidia's full vs. limited range RGB is BS; switching it doesn't correct the issue. What does get Nvidia's image quality/colors comparable to AMD is using modified Quadro drivers and NV Inspector. I have been able to reproduce this over and over again from Pascal to Ampere in 4K. I feel the differences are fairly small outside of 4K, so when people make videos, testing needs to be done at native 4K only.
 
I've owned every single generation of ATi/AMD and Nvidia graphics cards since the 90s except Nvidia's 4000 series. ATI/AMD has had better image quality/colors ever since Nvidia's 5800 Ultra, because Nvidia started cutting corners on image quality in that generation. Numerous times they have been busted cheating on benchmarks or sacrificing image quality to win on FPS. It's always been known that Nvidia's anisotropic filtering is inferior to ATI/AMD's. The introduction of Pascal's "lossless" delta color compression has ruined IQ even more; I know AMD uses it as well, but theirs is actually lossless like it should be. I have been playing in 4K since 2013, and I sometimes play games with 10-bit color, which looks amazing on my AMD cards and which I can't reproduce on Nvidia cards with GeForce drivers. The business about Nvidia's full vs. limited range RGB is BS; switching it doesn't correct the issue. What does get Nvidia's image quality/colors comparable to AMD is using modified Quadro drivers and NV Inspector. I have been able to reproduce this over and over again from Pascal to Ampere in 4K. I feel the differences are fairly small outside of 4K, so when people make videos, testing needs to be done at native 4K only.
I guess you missed that people have literally tested with a color meter and proven that there is zero difference, but here, let me help you by pointing it out.

Oh god, this again? I really thought this crap had died off. I remember doing a post on [H] probably a decade ago where I did measurements with my colorimeter, took screenshots and did differences, etc. For some reason AMD fans STILL insisted it was wrong and AMD looked better.

I'm not sure what it is: If it is entirely placebo/copium or if some people just configure one card wrong and decide that means the other looks better. I lean towards the placebo/copium because I remember I posted text rendering screenshots and did a difference in an image editor. They came out completely black, meaning no difference, on the text, but a couple people still swore the AMD screenshot looked better.

I don't have the energy to do something like it again, and it is clear people won't listen. I guess some fans just want to believe that their GPU has some magic pixie juice that somehow makes colors look better on a digital display, even though that is not how any of this shit works.
 
I've owned every single generation of ATi/AMD and Nvidia graphics cards since the 90s except Nvidia's 4000 series. ATI/AMD has had better image quality/colors ever since Nvidia's 5800 Ultra, because Nvidia started cutting corners on image quality in that generation. Numerous times they have been busted cheating on benchmarks or sacrificing image quality to win on FPS. It's always been known that Nvidia's anisotropic filtering is inferior to ATI/AMD's. The introduction of Pascal's "lossless" delta color compression has ruined IQ even more; I know AMD uses it as well, but theirs is actually lossless like it should be. I have been playing in 4K since 2013, and I sometimes play games with 10-bit color, which looks amazing on my AMD cards and which I can't reproduce on Nvidia cards with GeForce drivers. The business about Nvidia's full vs. limited range RGB is BS; switching it doesn't correct the issue. What does get Nvidia's image quality/colors comparable to AMD is using modified Quadro drivers and NV Inspector. I have been able to reproduce this over and over again from Pascal to Ampere in 4K. I feel the differences are fairly small outside of 4K, so when people make videos, testing needs to be done at native 4K only.
Back then, S3 Graphics, ATi, Matrox, and 3dfx were the front runners when it came to image quality, in contrast to nVIDIA. Historically, nVIDIA also did not dither native 6-8 bit color, only 10-bit and up. If your monitor is not native 10-bit, you will notice lower-quality color using an nVIDIA card compared to an ATi/AMD card, especially at 6-bit and pseudo 6-bit + FRC/dithering. Even changing the output range to 0-255 / Full RGB will not produce the same color fidelity as the AMD card on the same LCD monitor. This was a problem for years; back then, many people couldn't understand exactly why, with an nVIDIA card and an AMD card on the same RGB 0-255 output, the nVIDIA card produced slightly lower output quality. In actuality, it wasn't that ATi/AMD had some secret technique for better color reproduction on older 6-8 bit LCD and CRT monitors; it came down to the algorithm each vendor used, and AMD happened to use the proper one and force it on permanently, for 6-12 bit, across the board.
A lot of factors feed into this, including nVIDIA's default uncalibrated/outdated contrast and gamma values and the color output bug that forced almost all monitors to a limited 16-235 RGB range. Even when users force 0-255 RGB, there's still the problem of no 6-8 bit dithering and no gradient dithering, which causes tons of color banding. The erroneous EDID handling will also reset itself over time (it occasionally happens once you update the GeForce Game Ready driver or change the monitor/TV). The various color and image compression schemes that have been integrated into both the GPU driver and the memory architecture over the years, generation after generation, didn't help either.

There are articles about nVIDIA's progressive "lossy" memory compression here, and quality comparisons here (translated) and here (original).
In my experience, there is a difference with how they both display their default color values. Of course, both have good options to tweak them to your taste. And you can also use your monitor's own settings to tweak color values, as well.

In general, at default driver color settings on the same monitor: AMD has truer whites. Which overall, looks 'better' to me, with more depth. Whereas Nvidia has more green/yellow in their whites. And having an inaccurate white makes the image look more 'flat', for me.
However, Nvidia has more subtlety in reds/purples.

As far as visual rendering goes, there absolutely are differences in some games. However, it's pretty rare that they are large, and most of them wouldn't be noticeable without pixel-perfect side-by-side comparisons.
nVIDIA's color output will look noticeably worse if your monitor's bpc mode relies on monitor-level dithering/FRC, like 6+2, 8+2, or 10+2. That's where driver-level dithering comes into play. On screens that are native 6 bpc through native 16 bpc and don't require driver-level dithering, the difference is very minimal.
Adjust both several times and make sure everything is set equal when doing a comparison to spot a difference. During the analog S-Video and D-Sub signal days, nVIDIA was one of the worst in this regard compared to the 5-6 other GPU vendors. They were caught cheating in early-2000s 3DMark benchmarks, compressing GPU data and image output for a couple of extra frames, just enough to remain ahead of competitors and lead the chart.
 
I used to notice differences between AMD and Nvidia years ago; I switched from an Nvidia card to an AMD card around the release of Team Fortress 2 and found the AMD card more colourful. But I haven't really noticed it much in games over the last few years, despite owning AMD and Nvidia cards at different times and playing the same games.
 
To begin with, when anyone has made these reports of one card being "more colorful", were any actual objective measurements done by the user? Because "more colorful", or some other such metric, doesn't necessarily mean better. Many S23 Ultra users were complaining about its display being "less vibrant", when the reality is that it was just less saturated and more accurate, i.e. they had actually set it to the natural color scheme. This could very well also be another "tube amplifier" thing: tube amps are objectively not that great because they color the sound, but people like that coloration. I think accuracy is the general gold-standard measurement, and you can't easily judge that with the naked eye.

That is supposing there's actually any difference to begin with.

I also remember a display discussion quite a while ago where 10-bit (it was either 10 or 8) panels were liked for their colors, but the reality was that they were oversaturated, because GPUs were only really meant to output 8 (or was it 6?) bits without specialized professional software or something.
 
After I run a spyder 5 colorimeter, they all look the same to me... lol

I don't think you'll find many out-of-the-box experiences that are colored correctly for your room lighting conditions on any given monitor/GPU combination, which makes such comparisons pointless.
 
After I run a spyder 5 colorimeter, they all look the same to me... lol

I don't think you'll find many out-of-the-box experiences that are colored correctly for your room lighting conditions on any given monitor/GPU combination, which makes such comparisons pointless.
Can you post that Spyder 5 data, the monitor you used, OSD settings, AMD/NV control panel settings, something like that?

Maybe in half-and-half shots like these.

[attached half-and-half comparison photos]
 
I've owned every single generation of ATi/AMD and Nvidia graphics cards since the 90s except Nvidia's 4000 series. ATI/AMD has had better image quality/colors ever since Nvidia's 5800 Ultra, because Nvidia started cutting corners on image quality in that generation. Numerous times they have been busted cheating on benchmarks or sacrificing image quality to win on FPS. It's always been known that Nvidia's anisotropic filtering is inferior to ATI/AMD's. The introduction of Pascal's "lossless" delta color compression has ruined IQ even more; I know AMD uses it as well, but theirs is actually lossless like it should be. I have been playing in 4K since 2013, and I sometimes play games with 10-bit color, which looks amazing on my AMD cards and which I can't reproduce on Nvidia cards with GeForce drivers. The business about Nvidia's full vs. limited range RGB is BS; switching it doesn't correct the issue. What does get Nvidia's image quality/colors comparable to AMD is using modified Quadro drivers and NV Inspector. I have been able to reproduce this over and over again from Pascal to Ampere in 4K. I feel the differences are fairly small outside of 4K, so when people make videos, testing needs to be done at native 4K only.
In my experience, there is a difference with how they both display their default color values. Of course, both have good options to tweak them to your taste. And you can also use your monitor's own settings to tweak color values, as well.

In general, at default driver color settings on the same monitor: AMD has truer whites. Which overall, looks 'better' to me, with more depth. Whereas Nvidia has more green/yellow in their whites. And having an inaccurate white makes the image look more 'flat', for me.
However, Nvidia has more subtlety in reds/purples.

As far as visual rendering goes, there absolutely are differences in some games. However, it's pretty rare that they are large, and most of them wouldn't be noticeable without pixel-perfect side-by-side comparisons.
Back then, S3 Graphics, ATi, Matrox, and 3dfx were the front runners when it came to image quality, in contrast to nVIDIA. Historically, nVIDIA also did not dither native 6-8 bit color, only 10-bit and up. If your monitor is not native 10-bit, you will notice lower-quality color using an nVIDIA card compared to an ATi/AMD card, especially at 6-bit and pseudo 6-bit + FRC/dithering. Even changing the output range to 0-255 / Full RGB will not produce the same color fidelity as the AMD card on the same LCD monitor. This was a problem for years; back then, many people couldn't understand exactly why, with an nVIDIA card and an AMD card on the same RGB 0-255 output, the nVIDIA card produced slightly lower output quality. In actuality, it wasn't that ATi/AMD had some secret technique for better color reproduction on older 6-8 bit LCD and CRT monitors; it came down to the algorithm each vendor used, and AMD happened to use the proper one and force it on permanently, for 6-12 bit, across the board.
A lot of factors feed into this, including nVIDIA's default uncalibrated/outdated contrast and gamma values and the color output bug that forced almost all monitors to a limited 16-235 RGB range. Even when users force 0-255 RGB, there's still the problem of no 6-8 bit dithering and no gradient dithering, which causes tons of color banding. The erroneous EDID handling will also reset itself over time (it occasionally happens once you update the GeForce Game Ready driver or change the monitor/TV). The various color and image compression schemes that have been integrated into both the GPU driver and the memory architecture over the years, generation after generation, didn't help either.

There are articles about nVIDIA's progressive "lossy" memory compression here, and quality comparisons here (translated) and here (original).

nVIDIA's color output will look noticeably worse if your monitor's bpc mode relies on monitor-level dithering/FRC, like 6+2, 8+2, or 10+2. That's where driver-level dithering comes into play. On screens that are native 6 bpc through native 16 bpc and don't require driver-level dithering, the difference is very minimal.
Adjust both several times and make sure everything is set equal when doing a comparison to spot a difference. During the analog S-Video and D-Sub signal days, nVIDIA was one of the worst in this regard compared to the 5-6 other GPU vendors. They were caught cheating in early-2000s 3DMark benchmarks, compressing GPU data and image output for a couple of extra frames, just enough to remain ahead of competitors and lead the chart.
To begin with, when anyone has made these reports of one card being "more colorful", were any actual objective measurements done by the user? Because "more colorful", or some other such metric, doesn't necessarily mean better. Many S23 Ultra users were complaining about its display being "less vibrant", when the reality is that it was just less saturated and more accurate, i.e. they had actually set it to the natural color scheme. This could very well also be another "tube amplifier" thing: tube amps are objectively not that great because they color the sound, but people like that coloration. I think accuracy is the general gold-standard measurement, and you can't easily judge that with the naked eye.

That is supposing there's actually any difference to begin with.

I also remember a display discussion quite a while ago where 10-bit (it was either 10 or 8) panels were liked for their colors, but the reality was that they were oversaturated, because GPUs were only really meant to output 8 (or was it 6?) bits without specialized professional software or something.
Great findings and excellent insights from all of you.
So what was the actual scientific reason behind it?

1. No implementation of Temporal Dithering for 6bit, 6bit + FRC, 8bit, and 8bit + FRC

2. Aggressive usage of Delta Color Compression (DCC), Memory Compression, Bandwidth Compression, Texture Compression, and HDR Compression

3. Poor RAMDAC/Palette DAC quality and Ports quality

4. Driver defaulting to 16-235 Limited RGB Range 4:2:2 signal behaviour on DP-HDMI ports

5. The driver's inaccurate, uncalibrated default stock values for colors and contrast (a slightly grey, washed-out look)

6. All of them combined
 
To begin with, when anyone has made these reports of one card being "more colorful", were any actual objective measurements done by the user? Because "more colorful", or some other such metric, doesn't necessarily mean better. Many S23 Ultra users were complaining about its display being "less vibrant", when the reality is that it was just less saturated and more accurate, i.e. they had actually set it to the natural color scheme. This could very well also be another "tube amplifier" thing: tube amps are objectively not that great because they color the sound, but people like that coloration. I think accuracy is the general gold-standard measurement, and you can't easily judge that with the naked eye.

That is supposing there's actually any difference to begin with.

I also remember a display discussion quite a while ago where 10-bit (it was either 10 or 8) panels were liked for their colors, but the reality was that they were oversaturated, because GPUs were only really meant to output 8 (or was it 6?) bits without specialized professional software or something.
The whole thing is silly. I wouldn't be surprised if there were some slight merit to this argument 20+ years ago, but even then, I think it's all bunk. At most, the default color configuration at the driver level might fit some monitors better than others. Someone who might be concerned about this kind of thing would have the ability to fix it with a colorimeter. Having said that, I think people who own colorimeters would also find this to be silly and wouldn't waste their time on it.
 
The whole thing is silly. I wouldn't be surprised if there were some slight merit to this argument 20+ years ago, but even then, I think it's all bunk. At most, the default color configuration at the driver level might fit some monitors better than others. Someone who might be concerned about this kind of thing would have the ability to fix it with a colorimeter. Having said that, I think people who own colorimeters would also find this to be silly and wouldn't waste their time on it.
I mean... I do own one and I did waste my time on it back in the day. In the end, it didn't convince any of the kool-aid drinkers. They came up with all kinds of silly reasons why it wasn't real, or I didn't do it right, or they could tell the 0.1 dE difference that my meter had measured (which was just the limits of the meter). There's just no winning. They wanna believe they have magic sauce in their GPU and nobody is going to convince them otherwise.

As a side note on the bit-depth thing, higher bit depths don't make things more colorful, they make gradients smoother. It also doesn't work at all like Potan is talking about: 16 bits per channel is not the "native" size, that's just the amount of precision that the 1D LUTs in graphics cards are limited to. It has nothing to do with the native rendering pipeline of anything.

Basically my recommendation is to just not worry about it. By that I mean both don't worry about the color on your monitor/GPU, and don't worry about trying to convince people on this. It is one of those arguments that isn't founded in fact, it is founded in belief, and you aren't going to change that. I wasted my time; don't waste yours.
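
To make the 1D LUT point above concrete, here's a minimal sketch of that kind of table: one entry per 8-bit input code, stored at 16-bit precision, applying a made-up calibration tweak. It only reshapes the transfer curve; it can't add "more color":

Code:
# What a GPU's per-channel 1D LUT actually is: a lookup table (stored here at
# 16-bit precision) that reshapes the transfer curve. The gamma tweak is a
# made-up example of a calibration correction; a LUT can't add "more color".
import numpy as np

inputs = np.arange(256) / 255.0                         # 8-bit input code values
corrected = inputs ** (2.3 / 2.2)                       # hypothetical calibration curve
lut16 = np.round(corrected * 65535).astype(np.uint16)   # one 16-bit entry per input

print(lut16[:6])   # fine steps near black thanks to the 16-bit storage
print(lut16[250:]) # near white the correction barely changes anything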
 
Great findings and excellent insights from all of you.
So what was the actual scientific reason behind it?

1. No implementation of Temporal Dithering for 6bit, 6bit + FRC, 8bit, and 8bit + FRC

2. Aggressive usage of Delta Color Compression (DCC), Memory Compression, Bandwidth Compression, Texture Compression, and HDR Compression

3. Poor RAMDAC/Palette DAC quality and Ports quality

4. Driver defaulting to 16-235 Limited RGB Range 4:2:2 signal behaviour on DP-HDMI ports

5. The driver's inaccurate, uncalibrated default stock values for colors and contrast (a slightly grey, washed-out look)

6. All of them combined
All of them combined. The deluded fanboys will of course deny it to the best of their ability without providing any test data, screenshots, videos, or links to support their stupid shit.
Those of us who've been watching picture-quality-related stuff and owned cards from multiple vendors throughout history would recognize this kind of shit immediately. Hell, connect a GT 1030 and an R7 250X to a CRT via D-sub or DVI with the latest drivers and the differences are still there.

The fanboys often say it's some sort of conspiracy theory to make nVIDIA look bad, but then again, true tech enthusiasts who actually conducted tests and measurements were saying this kind of thing ages ago. I just don't logically see how tech testers and gamers back then would be lying about this kind of stuff when the display was right in front of their fking eyes. Also, like I've said, it wasn't just a worse result against AMD/ATi, but against S3 Graphics, Matrox, and 3dfx as well, back when people were doing real-time testing on CRTs and early 8-bit CCFL TN and IPS monitors. It wasn't just the lack of dithering; the DAC chips on nVIDIA GPUs that output the color palette and image were poorly made due to mass production and cost cutting. Plus, they lowered the graphics settings inside the ForceWare nVCP as much as possible, without the majority of these idiots noticing, to gain frames.
 
All of them combined. The deluded fanboys will of course deny it to the best of their ability without providing any test data, screenshots, videos, or links to support their stupid shit.
Those of us who've been watching picture-quality-related stuff and owned cards from multiple vendors throughout history would recognize this kind of shit immediately. Hell, connect a GT 1030 and an R7 250X to a CRT via D-sub or DVI with the latest drivers and the differences are still there.

The fanboys often say it's some sort of conspiracy theory to make nVIDIA look bad, but then again, true tech enthusiasts who actually conducted tests and measurements were saying this kind of thing ages ago. I just don't logically see how tech testers and gamers back then would be lying about this kind of stuff when the display was right in front of their fking eyes. Also, like I've said, it wasn't just a worse result against AMD/ATi, but against S3 Graphics, Matrox, and 3dfx as well, back when people were doing real-time testing on CRTs and early 8-bit CCFL TN and IPS monitors. It wasn't just the lack of dithering; the DAC chips on nVIDIA GPUs that output the color palette and image were poorly made due to mass production and cost cutting. Plus, they lowered the graphics settings inside the ForceWare nVCP as much as possible, without the majority of these idiots noticing, to gain frames.
Take it easy, lad... I DMed Bryan Bilowol from TechYESCity about all this; he confirmed that whatever difference exists nowadays isn't obvious anymore compared to the pre-GCN/Kepler era.
 
Take it easy, lad... I DMed Bryan Bilowol from TechYESCity about all this; he confirmed that whatever difference exists nowadays isn't obvious anymore compared to the pre-GCN/Kepler era.
What do you expect, given that he bleeds green?
 
All of them combined. The deluded fanboys will of course deny it to the best of their ability without providing any test data, screenshots, videos, or links to support their stupid shit.
Those of us who've been watching picture-quality-related stuff and owned cards from multiple vendors throughout history would recognize this kind of shit immediately. Hell, connect a GT 1030 and an R7 250X to a CRT via D-sub or DVI with the latest drivers and the differences are still there.

The fanboys often say it's some sort of conspiracy theory to make nVIDIA look bad, but then again, true tech enthusiasts who actually conducted tests and measurements were saying this kind of thing ages ago. I just don't logically see how tech testers and gamers back then would be lying about this kind of stuff when the display was right in front of their fking eyes. Also, like I've said, it wasn't just a worse result against AMD/ATi, but against S3 Graphics, Matrox, and 3dfx as well, back when people were doing real-time testing on CRTs and early 8-bit CCFL TN and IPS monitors. It wasn't just the lack of dithering; the DAC chips on nVIDIA GPUs that output the color palette and image were poorly made due to mass production and cost cutting. Plus, they lowered the graphics settings inside the ForceWare nVCP as much as possible, without the majority of these idiots noticing, to gain frames.
I don't think anyone is arguing that there weren't differences in the past - analog transmission will do that, after all (digital conversion on the early LCDs was handled at the screen). But we're well past that era: we're sending digital signals all the way through before the conversion to photons, and a given code-value-to-RGB (or other subpixel layout) conversion on the panel is... well, always going to look the same, barring shenanigans on the part of the panel maker.
 
People are talking about the analog days, 20 years ago. And it wasn't even AMD, it was ATI.
It's digital now. There isn't a DAC on the video card anymore unless it has VGA output.

It's the same as the Monster Cables argument, which only ever applied to analog cables.
For HDMI, just make sure you have a 2.1-compatible cable. The important spec is the bandwidth; 48 Gbps is the current high end. Bandwidth is really the only spec that matters for digital signal transmission, and if you have a subpar cable you will get specks in the picture (artifacts).
"This cable costs 10x as much, but it makes a difference!"
Analog, it can; digital, not so much.
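
As a rough sanity check on the bandwidth point, here's a quick raw pixel-rate calculation (blanking intervals and TMDS/FRL encoding overhead push the real link requirement higher):

Code:
# Raw (uncompressed) pixel data rate for a given mode. Blanking intervals and
# TMDS/FRL encoding overhead push the real link requirement higher, which is
# why 4K 120 Hz 10-bit wants an HDMI 2.1 / 48 Gbps class cable.
def raw_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(f"4K 120 Hz 10-bit RGB: ~{raw_gbps(3840, 2160, 120, 10):.1f} Gbps raw")
print(f"4K  60 Hz  8-bit RGB: ~{raw_gbps(3840, 2160, 60, 8):.1f} Gbps raw")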

All of the color settings are in the control panel. If you set these identically, you will get an identical picture between GPUs (on the same display using the same input).

Hell, I think someone did a blind comparison of this exact thing (or it might have been picture quality; it was years ago), might even have been on [H]. I remember the conclusion was that there was no difference.
Since Dec 2014, the ability to control the color depth has been built into the Nvidia control panel:

"Nvidia solutions

First solution: functionality now built into the Nvidia display driver

As of driver version 347.09, Nvidia have added a small drop-down to the Nvidia Control Panel (NCP) that will allow you to enforce the correct ‘Full Range’ signal. Simply open NCP and navigate to ‘Display – Change resolution’. You should see a drop down box labelled ‘Output dynamic range’. At time of writing this is the final drop-down box on the page, under ‘Apply the following settings’ as shown below.
[screenshot of the Nvidia Control Panel 'Change resolution' page]

Make sure this is set to ‘Full’ rather than ‘Limited’ and press ‘Apply’ to enforce the ‘Full Range RGB 0-255’ signal.
‘Limited’ is used by default on Full HD monitors connected using HDMI and some connected using DisplayPort, when running at 60Hz or other refresh rates listed as ‘Ultra HD, HD, SD’.
For resolutions or refresh rates listed as ‘PC’ the default setting will be ‘Full’.
If the monitor has an ‘HDMI Black Level’, ‘HDMI RGB PC Range’ or similar option, make sure this is set to ‘Normal’, ‘High’, ‘Full’ or ‘RGB (0~255)’ rather than ‘Low’, ‘Limited’ or ‘RGB (16~235)’.
The exact nomenclature depends on the monitor model. An ‘Automatic’ setting or similar should correctly detect the colour signal type sent out by the GPU."
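
A small sketch of what that Full vs Limited setting actually changes, using the standard 0-255 to 16-235 mapping; a mismatch between what the GPU sends and what the monitor expects is what produces the washed-out or crushed look people often attribute to the card:

Code:
# What the Full vs Limited setting changes: limited ("video") range squeezes
# 0-255 into 16-235. If one side sends limited while the other expects full,
# black is lifted to 16 and white drops to 235 - the classic washed-out look.
def full_to_limited(v):                 # 0-255  -> 16-235
    return round(16 + v * 219 / 255)

def limited_to_full(v):                 # 16-235 -> 0-255, clipping the rest
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(full_to_limited(0), full_to_limited(128), full_to_limited(255))   # 16 126 235
print(limited_to_full(16), limited_to_full(235))                        # 0 255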

This is from an old article, I don't know what these settings default to these days... checking.

My control panel:
[screenshot of my Nvidia Control Panel]

When it is on "Use default color settings" the Desktop color depth is Highest (32-bit), Output color depth 10bpc, RGB, Output dynamic range: Full.
Selecting Use NVIDIA color settings, I can bump the Output color depth to 12 bpc.

My monitor is DisplayPort connected, so it only allows me to select from the list of 'PC' resolutions which all default to Full Dynamic range and highest color depth, and 10bpc color depth.

Pretty sure the available choices will depend on the connected display.

So about the only thing you might need to do these days is set it to use NVIDIA color settings and max it out. On cheap displays or TVs, some of the higher settings may not display properly. It will default to a lower dynamic range and color depth when an HDMI-connected display is set to TV resolutions and low refresh rates (60 Hz), because if the TV cannot support them, those higher color settings will not work. Select a 'PC' resolution and it will default to the middle color depth of 10 bpc, but more importantly to the Full dynamic range.

Not sure anyone can tell any difference between 8 bpc, 10 bpc, and 12 bpc... Our eyes can see about 1 million colors. 8 bpc = 16.7 million colors, 10 bpc = about a billion colors. The higher setting might allow for a smoother gradient, but even that is probably reaching. If someone wants to test this, let us know what you find - with your eyes. A colorimeter might be able to 'see' the difference, but if my eyes can't, there's no reason to get in a fuss about it.

Edit: After doing some more research, I found out that no monitor exists that can display 12bpc.
So, all you have to do is ensure you are using a 'PC' resolution in the Nvidia control panel, and it will default to the highest and proper color settings.
It defaults to lower for 'TV' resolutions as many of them cannot display the higher settings. You can override any of those settings' default selections if your TV can.

It's working properly.
 
Not sure anyone can tell any difference between 8 bpc, 10 bpc, and 12 bpc... Our eyes can see about 1 million colors. 8 bpc = 16.7 million colors, 10 bpc = about a billion colors. The higher setting might allow for a smoother gradient, but even that is probably reaching. If someone wants to test this, let us know what you find - with your eyes. A colorimeter might be able to 'see' the difference, but if my eyes can't, there's no reason to get in a fuss about it.
So you can... but there is a huge "it depends" on that.

First we have to establish whether we are talking about SDR or HDR, because that matters. For SDR there is a slight difference between 8 and 10-bit. 8-bit isn't quite enough to produce a completely smooth gradient in the grays with a 2.2 or 2.4 gamma; you can see a little banding. Moving up to 10-bit completely eliminates any visible banding. The pro video world and medical imaging used 10-bit for a long time because of that. However it is fairly subtle, and dithering does a good job of masking it when done properly.

For HDR it is much more important. Because we not only have a wider color gamut but also much increased dynamic range, you need more steps to eliminate banding. If we'd stuck with a gamma curve, it'd need to be something like 15 bits to eliminate banding, with lots of wasted precision. Dolby did research on our preferences and visual system and came up with the ST 2084 PQ EOTF, which does a pretty good job of tracking how we actually perceive brightness, and that's what most HDR uses. For it, you need 12 bits to completely eliminate any banding, though 10-bit comes pretty close and you don't really see much in actual usage.

Near as I know there aren't any actual 12-bit panels yet, so what you get for those that support it is a 10-bit panel with 2-bits of FRC dither (or in some cases they just ignore the 2 LSB). Most HDR content and rendering is 10-bit anyhow so the extra precision doesn't help at this point.

You are correct it is all about gradients, not actually about "more colors" despite it getting marketed that way. It is about getting each step below our perceptual threshold so that it looks totally smooth.
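
A back-of-the-envelope version of that banding argument, using a simple 2.2 gamma and a rough ~1% visibility threshold (a rule of thumb, not a real contrast-sensitivity model):

Code:
# Back-of-the-envelope banding check: relative luminance jump between adjacent
# code values under a 2.2 gamma, at roughly the same ~5% gray level, for 8-bit
# vs 10-bit. The ~1% visibility figure is a rough rule of thumb only.
GAMMA = 2.2

def step_contrast(code, levels):
    # relative luminance change going from one code value to the next
    lo = (code / (levels - 1)) ** GAMMA
    hi = ((code + 1) / (levels - 1)) ** GAMMA
    return (hi - lo) / lo

print(f"8-bit step near 5% gray:  {step_contrast(66, 256) * 100:.2f}%")   # ~3%, can band
print(f"10-bit step near 5% gray: {step_contrast(264, 1024) * 100:.2f}%") # <1%, smooth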
 