AMD vs nVidia Colors - Concrete proof of differences

Quality of Rendering, who does it better? AMD or NVIDIA?


  • Total voters: 61
People are talking about the analog days... 20 years ago. And it wasn't even AMD, it was ATI.
It's digital now. There isn't a DAC on the video card anymore unless it has VGA output.

It's the same Monster Cable argument, and that only ever applied to analog cables.
For HDMI, just make sure you have a 2.1-compatible cable. The important spec is the bandwidth; 48 Gbps is the current high end, and bandwidth is really the only spec that matters for digital signal transmission. If you have a subpar cable, you will get specks in the picture: artifacts.
"This cable costs 10x as much, but it makes a difference!"
With analog, it can; with digital, not so much.
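
If you want to sanity-check the bandwidth side of that yourself, the back-of-the-envelope math is just pixels × bits × refresh rate. Here's a quick Python sketch (the example modes are my own, and it ignores blanking intervals and link-encoding overhead, so treat the figures as rough lower bounds):

```python
# Rough estimate of the raw data rate of an uncompressed video signal,
# compared against HDMI 2.1's 48 Gbps link bandwidth. Example modes are
# my own picks; real links add blanking and FRL/TMDS encoding overhead.

def required_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    bits_per_frame = width * height * bits_per_channel * channels
    return bits_per_frame * refresh_hz / 1e9

for w, h, hz, bpc in [(3840, 2160, 120, 10), (3840, 2160, 60, 8), (2560, 1440, 165, 10)]:
    print(f"{w}x{h} @ {hz}Hz, {bpc}bpc -> ~{required_gbps(w, h, hz, bpc):.1f} Gbps (link: 48 Gbps)")
```

If the mode fits within the cable's rated bandwidth, the bits arrive intact; a pricier cable can't make them arrive "more intact."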

All of the color settings are in the control panel. If you set these identically, you will get an identical picture between GPUs (on the same display using the same input).

Hell, I think someone did a blind comparison for this exact thing (or it might have been picture quality; it was years ago), and it might have even been [H]. I remember the conclusion was that there was no difference.
Since December 2014, the ability to control the output dynamic range has been built into the Nvidia control panel:

"Nvidia solutions

First solution: functionality now built into the Nvidia display driver

As of driver version 347.09, Nvidia have added a small drop-down to the Nvidia Control Panel (NCP) that will allow you to enforce the correct ‘Full Range’ signal. Simply open NCP and navigate to ‘Display – Change resolution’. You should see a drop down box labelled ‘Output dynamic range’. At time of writing this is the final drop-down box on the page, under ‘Apply the following settings’ as shown below.
[Screenshot: the 'Output dynamic range' drop-down in the Nvidia Control Panel]
Make sure this is set to ‘Full’ rather than ‘Limited’ and press ‘Apply’ to enforce the ‘Full Range RGB 0-255’ signal.
‘Limited’ is used by default on Full HD monitors connected using HDMI and some connected using DisplayPort, when running at 60Hz or other refresh rates listed as ‘Ultra HD, HD, SD’.
For resolutions or refresh rates listed as ‘PC’ the default setting will be ‘Full’.
If the monitor has an ‘HDMI Black Level’, ‘HDMI RGB PC Range’ or similar option, make sure this is set to ‘Normal’, ‘High’, ‘Full’ or ‘RGB (0~255)’ rather than ‘Low’, ‘Limited’ or ‘RGB (16~235)’.
The exact nomenclature depends on the monitor model. An ‘Automatic’ setting or similar should correctly detect the colour signal type sent out by the GPU."
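
For anyone wondering why that Full vs. Limited setting matters at all, here's a toy Python illustration of the level math (my own numbers, not from the quoted article). 'Limited' video levels put black at 16 and white at 235, while 'Full' PC levels use 0-255, so if the GPU and the display disagree about which is in use, blacks and whites get crushed or washed out:

```python
# Toy illustration of Full (0-255) vs Limited (16-235) RGB levels and why a
# mismatch between GPU and display looks wrong. Not from the quoted article.

def full_to_limited(v):
    # GPU outputting a 'Limited' signal: 0-255 squeezed into 16-235
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    # Display assuming a 'Limited' signal and expanding it back to 0-255
    return round((v - 16) * 255 / (235 - 16))

# Matched settings round-trip cleanly:
print(limited_to_full(full_to_limited(0)), limited_to_full(full_to_limited(255)))  # 0 255

# Mismatch: GPU sends Full, display assumes Limited and expands it anyway.
# Everything below 16 clips to black and everything above 235 clips to white.
print(limited_to_full(0), limited_to_full(255))  # -19 278 (clipped to 0 and 255 on screen)
```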

That quote is from an old article; I don't know what these settings default to these days... checking.

My control panel:
[Screenshot: my Nvidia Control Panel 'Change resolution' page]
When it is on "Use default color settings", the Desktop color depth is Highest (32-bit), the Output color depth is 10bpc, the Output color format is RGB, and the Output dynamic range is Full.
Selecting "Use NVIDIA color settings", I can bump the Output color depth to 12bpc.

My monitor is connected via DisplayPort, so it only allows me to select from the list of 'PC' resolutions, which all default to Full dynamic range and 10bpc color depth.

Pretty sure the available choices will depend on the connected display.

So about the only thing you might need to do these days is set it to use NVIDIA color settings and max it out. On cheap displays or TVs, some of the higher settings may not display properly. It will default to a lower dynamic range and color depth when an HDMI-connected display is set to TV resolutions and low refresh rates (60Hz), because if the TV cannot support them, those higher color settings will not work. Select a 'PC' resolution and it will default to the middle color depth of 10bpc but, more importantly, to the Full dynamic range.

Not sure anyone can tell any difference between 8bpc, 10bpc, and 12bpc... Our eyes can see about 1 million colors; 8bpc = 16.7 million colors, 10bpc = about a billion colors. The higher setting might allow for a smoother gradient, but that is probably reaching. If someone wants to test this, let us know what you find with your eyes... a colorimeter might be able to 'see' the difference, but if my eyes can't, there's no reason to get in a fuss about it.
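
For what it's worth, the arithmetic behind those numbers is just 2^(3 × bits per channel), since each of the R, G, and B channels gets that many steps. A quick sketch if anyone wants to check it:

```python
# Color counts per bit depth: 2**bpc levels per channel, cubed for RGB.
for bpc in (8, 10, 12):
    levels = 2 ** bpc
    total = levels ** 3
    print(f"{bpc}bpc: {levels} steps per channel, {total:,} total colors")
# 8bpc:  256 steps per channel,  16,777,216 total colors
# 10bpc: 1024 steps per channel, 1,073,741,824 total colors
# 12bpc: 4096 steps per channel, 68,719,476,736 total colors
```

The extra bits don't widen the range of colors, they just add finer steps between them, which is why the only place you'd plausibly notice is a smooth gradient.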
The dynamic range defaulted to Full on my LG C3 with HDMI the first time I used it.
 
To do a comparison of my own AMD versus NVIDIA cards, I would have to disconnect my kid's PC that uses the 6700, pull the 6700 out, then uninstall my 4090, install the 6700 in my PC, redo the calibration, and take pics of it in the basement, all to satisfy this ridiculous idea that somehow one company displays colors better, when a Spyder 5 makes the pictures look the same for my preferred look and lighting conditions, just to satisfy a few fanboys' egos.

That's too much of an ask.
 
That's a hell of a time sink with no real payoff. It would feel like a complete waste of time to the person doing all the work. It's only worth doing if you're being paid to do so, or I guess if you're just that OCD about it.
 
The reason I say they look the same on my setups is because I control the final output via a colorimeter. The biggest difference between my systems is the monitors/TVs themselves, not what the cards output. The colorimeter will make adjustments to match what I prefer for look and adjust settings to match; ergo, they will look the same regardless of card. To say otherwise is really just being in denial. Both NVIDIA and AMD have "default" settings which fit a range of monitors where RGBA will be RGBA within reasonable expectations, but even with so-called from-the-factory calibrations, I've still had to adjust my screens to the lighting in the rooms I'm using. The reason I set them up this way is that when I make some art for my fight sticks, I want what I see on screen to match what I print.
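
If it helps to picture what the colorimeter is doing: conceptually it measures the display's actual response and builds a per-channel correction curve (a 1D LUT) that maps it back to a common target. Here's a toy Python sketch of that idea; it is not the actual Spyder workflow, and the gamma numbers are made up:

```python
# Toy sketch of colorimeter-style calibration: build a per-channel 1D LUT that
# corrects a display's native response toward a common target curve. Two different
# GPU/display chains calibrated to the same target end up looking alike.
# The 'measured' response and gamma values here are stand-ins, not real data.

TARGET_GAMMA = 2.2

def measured_response(level, native_gamma):
    """Stand-in for what a colorimeter would read off the screen (0..1)."""
    return (level / 255) ** native_gamma

def build_lut(native_gamma, target_gamma=TARGET_GAMMA):
    """Per-channel 1D LUT mapping each input level to the corrected drive level."""
    lut = []
    for level in range(256):
        target = (level / 255) ** target_gamma
        corrected = min(range(256),
                        key=lambda v: abs(measured_response(v, native_gamma) - target))
        lut.append(corrected)
    return lut

# Two hypothetical displays with different native responses hit the same target:
lut_a = build_lut(native_gamma=2.4)
lut_b = build_lut(native_gamma=1.9)
for level in (32, 128, 224):
    out_a = measured_response(lut_a[level], 2.4)
    out_b = measured_response(lut_b[level], 1.9)
    print(f"input {level:3d}: display A -> {out_a:.3f}, display B -> {out_b:.3f}")
```

Once both chains are corrected to the same target, which GPU fed the signal stops mattering for the final color.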
