AMD vs nVidia Colors - Concrete proof of differences

Quality of Rendering, who does it better? AMD or NVIDIA?


  • Total voters: 61
So that doesn’t count. RAMDAC was back when we were sending analog signals out; we’re not. Now we’re theoretically sending 100% digital; it either arrives intact or it doesn’t (moderate simplification). Back then, yes - ATi had a better RAMDAC (as did Matrox) and did a better job converting to analog, much like a better modern DAC for headphones would.

It’s 2022. That point is 22+ years old now. We’re talking old as shit, unless you’re a CRT crazy.
 
like i said
 
So that doesn’t count. RAMDAC was back when we were sending analog signals out; we’re not. Now we’re theoretically sending 100% digital; it either arrives intact or it doesn’t (moderate simplification). Back then, yes - ATi had a better RAMDAC (as did Matrox) and did a better job converting to analog, much like a better modern DAC for headphones would.

It’s 2022. That point is 22+ years old now. We’re talking old as shit, unless you’re a CRT crazy.
even after the move to all-digital he seems to be indicating that in motion there's a perceptible difference

"|I still felt like AMD card produced sharper/clearer colors when watching videos though."
 
key words "felt like", doesn't mean it is.
perhaps you need to give up on this one....
there's an obvious subjective nature indicated by the poll and so far nobody has asked to allow for multiple selections. each vendor implementation is proprietary and is its own interpretation of the standards, so it's reasonable to conclude there will be a difference in how the rendering happens. the architectural designs are radically different, with many variables related to timing and frequencies being maintained.

to me it's similar to the previous analogy of some individuals being sensitive to frame rates below 60. i have a friend who easily sees the flickering and it gives him a headache, even though i can't tell between 24-30-60, but at 120+ i can tell it's significantly smoother

 
maybe a title edit then if you insist on persisting. subjective isn't proof

"two year old thread i dug up about who has better colour, which do you prefer?"
i am not fixated on the colours, but on the overall feeling of the rendering over time / motion in general
 
Colour data is encoded digitally by the HDMI/Displayport standards. These digital signals are unambiguous and error corrected. The only way that AMD/Nvidia/Intel/3dfx/Matrox would have different colour reproduction would be for them to purposefully produce different colours at the OS level. Which is entirely possible, but also 100% fixable through changing settings to user preferences through their respective desktop composition settings.

So the real concern is that AMD/Nvidia/Intel's DEFAULT colour settings are different.
 
Colour data is encoded digitally by the HDMI/Displayport standards. These digital signals are unambiguous and error corrected. The only way that AMD/Nvidia/Intel/3dfx/Matrox would have different colour reproduction would be for them to purposefully produce different colours at the OS level. Which is entirely possible, but also 100% fixable through changing settings to user preferences through their respective desktop composition settings.
I feel you are getting your timeline a bit mixed up; by the time we had HDMI ports on video cards, 3dfx had not been a mainstream video card maker for a very long time.

The PS3 was in 2008, the first video card with an HDMI port was apparently one with a Radeon X1600 GPU in 2006, and the 7600GT was 2007, so that's around that time.

3dfx was bought in 2001 and closed in 2002; their last effort was around 2000, and those would have been VGA cards, not even DVI yet, which had just launched and was brand new in 1999.

Back in those 3dfx/Matrox days a lot of analogue signal was coming out of the video card, and it was different for the monitor than for our television; it was something reviews would bring up, e.g. whether the S-Video port was made with this DAC or that high-quality one.
 
I feel you are getting your timeline a bit mixed up; by the time we had HDMI ports on video cards, 3dfx had not been a mainstream video card maker for a very long time.

The PS3 was in 2008, the first video card with an HDMI port was apparently one with a Radeon X1600 GPU in 2006, and the 7600GT was 2007, so that's around that time.

3dfx was bought in 2001 and closed in 2002; their last effort was around 2000, and those would have been VGA cards, not even DVI yet, which had just launched and was brand new in 1999.

Back in those 3dfx/Matrox days a lot of analogue signal was coming out of the video card, and it was different for the monitor than for our television; it was something reviews would bring up, e.g. whether the S-Video port was made with this DAC or that high-quality one.
I was adding them as a sarcastic joke but Poe's law and all
 
I read on here long ago that AMD was known for setting their default settings for more vibrant colors and if you so decided you could change settings on Nvidia cards to match it. It is like TVs in stores being in "demo mode" vs the same model set for accurate colors. You can achieve either look by altering settings.
I thought it was the reverse. Nvidia pushing their "vibrant" settings back in the day and the impression stuck.
 
My eyes definitely saw differences back when the last Nvidia card I owned was a 980Ti. Even a Vega 56, R9 290 Pro and my Furies all looked better on the exact same monitor. And no, it was not the monitor, and no amount of tweaking and pushing was going to change a thing. Nowadays, I do not know, since I no longer own any Nvidia cards, but it was not a placebo and it was not just my mind playing tricks on me. :)

No, I used to have customers like you, who swore blind that AMD's colours were better. That the Nvidia card they switched over to looked worse. And they said the exact same thing as you. No amount of tweaking would make the display on the Nvidia card look the same. You would try to talk them through solutions over the phone, but, no, they wouldn't take your word for it; what they were seeing was "real".

So I would go out and it was always one of three things. They had the RGB setup wrong on Nvidia. There was a bug sometimes with Nvidia cards where they defaulted to Limited instead of Full range when connecting over DisplayPort/HDMI.
If that didn't fix the problem, or they still thought that the AMD card looked better, then I would just slightly increase the vibrancy in the display settings until it matched the AMD card's saturation.

So, on your 980ti, you either didn't change the RGB range to Full. Or you didn't do any tweaking in the display settings. Because it takes less than a minute to make them look the exact same.

I would bet all my money, that if we put a 980ti and Vega 64 connected to your monitor and did a blind test, you would not be able to tell the difference.
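
Since the Limited-vs-Full range mixup comes up repeatedly in this thread, here is a minimal Python sketch of why that particular mismatch looks "washed out". The numbers are just the standard 16-235 video-range math, not anything from either vendor's driver:

```python
# Minimal sketch: "Full" RGB uses 8-bit codes 0-255, "Limited" (video) range
# uses 16-235. If the GPU outputs Limited but the display interprets it as
# Full, black becomes code 16 (dark grey) and white becomes 235 (dim white),
# which is exactly the washed-out look described above.

def full_to_limited(code: int) -> int:
    """Compress a full-range code (0-255) into limited range (16-235)."""
    return round(16 + code * 219 / 255)

def limited_to_full(code: int) -> int:
    """Expand a limited-range code (16-235) back to full range (0-255)."""
    return round((code - 16) * 255 / 219)

# Matched pipeline: GPU compresses, display expands, values are restored.
assert limited_to_full(full_to_limited(0)) == 0
assert limited_to_full(full_to_limited(255)) == 255

# Mismatch: GPU outputs Limited, display treats it as Full (no expansion).
print(full_to_limited(0))    # 16  -> black is shown as dark grey
print(full_to_limited(255))  # 235 -> white is shown as dim white
```

Either side of the link can be the one set wrong; the fix is simply making the GPU's output range match what the display expects.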
 
There's something really funny to me about conducting a blind test in this situation.

Also there's no way that there isn't some difference among hardware vendors. As much as I like having an illusion of choice, we straight up wouldn't have the little choice that exists if everything were accomplished in the same manner.
 
To start, you need to define what each color is: Pantone, raw info, RGB, sRGB; what is the standard? Is everything tuned to meet those expectations?
 
No, I used to have customers like you, who swore blind that AMD's colours were better. That the Nvidia card they switched over to looked worse. And they said the exact same thing as you. No amount of tweaking would make the display on the Nvidia card look the same. You would try to talk them through solutions over the phone, but, no, they wouldn't take your word for it; what they were seeing was "real".

So I would go out and it was always one of three things. They had the RGB setup wrong on Nvidia. There was a bug sometimes with Nvidia cards where they defaulted to Limited instead of Full range when connecting over DisplayPort/HDMI.
If that didn't fix the problem, or they still thought that the AMD card looked better, then I would just slightly increase the vibrancy in the display settings until it matched the AMD card's saturation.

So, on your 980ti, you either didn't change the RGB range to Full. Or you didn't do any tweaking in the display settings. Because it takes less than a minute to make them look the exact same.

I would bet all my money, that if we put a 980ti and Vega 64 connected to your monitor and did a blind test, you would not be able to tell the difference.

Yes, there were definite differences and my eyes were telling me the truth. I spent 6 months with that 980Ti and no amount of tweaking or pushing changed a single solitary thing. The colors were washed out and nothing changed that and no, it was not the cable, the monitor or my settings, it was the NVidia card itself and / or their drivers at that time. The AMD cards I ran on that very same monitor looked and worked extremely well so, agree with me or not, no skin off my back.

Edit: Oh, and I am an IT professional so I knew what I was doing and how to follow instructions as people posted them. The facts are simply the facts, nothing more and nothing less. Also, I was using DisplayPort on that 28 inch 4K Samsung monitor in 2015 and into 2016. Everything was set up as it should have been and it made no difference, so yes, NVidia colors were not as good and were washed out.
 
Maybe your eyes see the same color differently. Schrödinger's paint.

From my personal experience, no; for whatever reason, that EVGA 980Ti over DisplayPort was washed out on the 28 inch 4K Samsung monitor and the AMD cards I used were not. 6 months of that was enough for me to not bother going back to Nvidia, for my own personal use.
 
Sometimes you can't believe your eyes. That's what my mom used to say.

Sometimes, sure, but with me sitting in front of the computer, yep, my eyes were 100% believable. I think the biggest challenge is that many people, over the years, really needed to get rid of the "Nvidia can do no wrong" mantra, since they clearly have done many things wrong. Also, the fact that others never experienced what I had experienced does not make them right nor make me wrong; it is not an either/or in computer hardware.
 
You never know, you could have a floater or slightly detached retina.
 
These days there's so much BS on the screen in games that I'm not sure you could tell the difference even if there was one, versus when this was a thing in the '90s/'00s. But Matrox especially had better rendering, then ATI, then Nvidia, IME. Nvidia used to cheat more to get higher FPS back then too, lol. Like others said, this was with CRTs though; I feel like your monitor makes a bigger difference now, so just buy whatever you want, adjust your settings, and get a calibrator if you care.
 
even after the move to all-digital he seems to be indicating that in motion there's a perceptible difference

"|I still felt like AMD card produced sharper/clearer colors when watching videos though."
video != raster "in-motion" (if you're implying game motion looks different - which I would doubt very much because the GPU is literally working on whatever data the game engine provides it. I suppose it's possible that either AMD or Nvidia truncate data types or take shortcuts in ALU units but that would be very obvious).

However, there could very well be very noticeable differences in video rendering quality though because this will depend on the hardware manufacturer's implementation - AMD's VCE and VCN vs Nvidia's NVDEC. It's been a while since I worked on codec streaming hardware but standards compliance doesn't prevent shortcuts and differences in implementations.
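
To illustrate where decode paths can legitimately diverge: the YCbCr-to-RGB matrix is fixed by the standard (BT.709 in this sketch), but chroma upsampling from 4:2:0, rounding and intermediate precision are implementation choices. A rough, illustrative Python sketch, not AMD's or Nvidia's actual decoder:

```python
# Illustrative BT.709 limited-range YCbCr -> 8-bit RGB conversion.
# The coefficients come from the standard; where two decoders can differ is
# in chroma upsampling (4:2:0 -> 4:4:4), rounding and working precision.

def ycbcr709_to_rgb(y: int, cb: int, cr: int) -> tuple[int, int, int]:
    yf = (y - 16) / 219.0          # normalise luma (16-235 -> 0.0-1.0)
    cbf = (cb - 128) / 224.0       # normalise chroma (16-240 -> -0.5-+0.5)
    crf = (cr - 128) / 224.0
    r = yf + 1.5748 * crf
    g = yf - 0.1873 * cbf - 0.4681 * crf
    b = yf + 1.8556 * cbf
    clamp = lambda v: max(0, min(255, round(v * 255)))
    return clamp(r), clamp(g), clamp(b)

print(ycbcr709_to_rgb(180, 128, 128))  # a neutral grey, (191, 191, 191)
```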
 
Nope, AMD/ATI default colours have always been slightly more saturated than Nvidia's default. And our eyes are naturally drawn to more saturated colours; that's why some people think that AMD/ATI looks better.

Nah.. I really don't think so. You would have to go back pre-digital, to RAMDACs, to see those sorts of differences. Once we went digital that completely vanished. And testing with a colorimeter would likely show that, if a person went and tested vintage hardware.

People often love that nVidia has the Digital Vibrance setting though. A feature ATi eventually implemented as an on/off toggle (I can't remember the name ATi used in CCC for this). It wasn't as nice though, as you didn't have a slider to control its intensity. AMD later brought it back with "Vivid Color" as a "Display Color Enhancement" profile in the current drivers.
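
For context, "Digital Vibrance" is essentially a saturation boost. A hypothetical Python sketch of one common way such a slider can be implemented (this is not Nvidia's or AMD's actual algorithm, just an illustration of the idea):

```python
# Hypothetical "vibrance"-style control: push each channel away from the
# pixel's luma by a strength factor. Neutral greys are left alone, everything
# else becomes more saturated as the strength goes above 1.0.

def boost_saturation(r: int, g: int, b: int, strength: float = 1.2) -> tuple[int, int, int]:
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b   # BT.709 luma weights
    adjust = lambda c: max(0, min(255, round(luma + (c - luma) * strength)))
    return adjust(r), adjust(g), adjust(b)

print(boost_saturation(200, 120, 80))   # -> (213, 117, 69), more saturated
print(boost_saturation(128, 128, 128))  # -> (128, 128, 128), grey unchanged
```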

video != raster "in-motion" (if you're implying game motion looks different - which I would doubt very much because the GPU is literally working on whatever data the game engine provides it. I suppose it's possible that either AMD or Nvidia truncate data types or take shortcuts in ALU units but that would be very obvious).

However, there could very well be very noticeable differences in video rendering quality though because this will depend on the hardware manufacturer's implementation - AMD's VCE and VCN vs Nvidia's NVDEC. It's been a while since I worked on codec streaming hardware but standards compliance doesn't prevent shortcuts and differences in implementations.

Yea it mentions "video". That could have been differences in h/w decoding maybe.
 
AMD has the ability to correct gamut.
It is enabled by enabling Custom Color and then disabling Color Temperature Control in Global Displays.

[screenshot: amd_color_corr.png]


The way it works is that the monitor, upon connecting to the GPU, sends identification data called EDID, which contains information about preferred video modes, FreeSync range, which color bit depths and standards are supported, etc., and among this information is the monitor's native gamut. AMD cards can use this gamut information to automatically convert the native gamut to the sRGB colorspace so that all sRGB graphics sent to the monitor look correct.

This is very useful for monitors without sRGB emulation or when it's realized poorly. It can also be useful for monitors where the sRGB emulation mode has some restrictions, e.g. some monitors:
- lock backlight brightness
- lock the overdrive strength setting
- lock the gamma setting
- have more input lag in sRGB emulation mode than in native gamut mode or a special game mode

This gamut emulation is not perfect because EDID has limited numeric precision, but manufacturers almost always put correct information there and it doesn't change over time, so where it matters for normal content consumption such a feature can be very useful to have.
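
As a rough illustration of the math behind that gamut correction: EDID reports the panel's red, green and blue primaries as chromaticity coordinates, and from those a 3x3 matrix can be built that maps linear-light sRGB content into the panel's native gamut. A Python/NumPy sketch with made-up example primaries, not AMD's actual driver code; EDID stores these coordinates at only 10-bit precision, which is the imprecision mentioned above:

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build the RGB -> XYZ matrix for a gamut defined by its primaries.

    primaries: [(xr, yr), (xg, yg), (xb, yb)] chromaticities, white: (xw, yw).
    """
    cols = [[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]
    m = np.array(cols).T              # columns are unscaled R, G, B in XYZ
    xw, yw = white
    w = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    scale = np.linalg.solve(m, w)     # scale columns so R=G=B=1 hits the white point
    return m * scale

D65 = (0.3127, 0.3290)
srgb = rgb_to_xyz_matrix([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)], D65)
panel = rgb_to_xyz_matrix([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)], D65)  # example EDID-reported primaries

# Linear-light sRGB signal -> linear-light native-panel signal, so sRGB content
# is not over-saturated by a wide-gamut panel.
srgb_to_panel = np.linalg.inv(panel) @ srgb
print(np.round(srgb_to_panel, 4))
```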

Other differences between Radeon and GeForce cards
1. Nvidia does not always dither the output image
For some obscure reasons relating to how Windows is constructed and what it requires GPUs to do, this mostly affects situations where the GPU LUT is changed, e.g. when changing gamma in games or "loading" color profiles. Not having dithering can either lead to severe banding artifacts on an 8 bpc connection to the monitor or slight color inaccuracies at 10 bpc or higher. Radeon always dithers; Nvidia only does so when using the Limited color range setting. (A small sketch of the banding effect follows after these points.)

2. Nvidia has the ability to force the GPU LUT to stay unchanged entirely
Presumably a fix for the banding issue in games which do touch the GPU LUT.
IMHO a good option to have, but even better would be always using dithering (which, BTW, has zero negative consequences when gamma is not changed, i.e. the LUT is linear), and, as for a 'reference' mode, automatically loading the ICC profile selected in Windows and then locking applications from being able to change the GPU LUT. While they were at it they could implement gamut correction based on either EDID or an ICC file. If they did that then Nvidia would have the perfect GPU for color-critical tasks. So far neither Nvidia nor AMD has the perfect GPU there, but AMD is much closer.
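
Here is the minimal sketch of the banding effect referenced in point 1 above: re-quantising to 8 bits after a gamma change collapses several neighbouring input codes onto the same output code, while a small dither applied before quantisation spreads the error out instead. Illustrative Python only; real hardware uses ordered or temporal dither patterns rather than random noise:

```python
import random

gamma = 1.2  # an example gamma tweak, as applied through the GPU LUT

def apply_lut(code: int, dither: bool = False) -> int:
    """Apply the gamma adjustment and re-quantise to 8 bits."""
    v = (code / 255.0) ** gamma * 255.0
    if dither:
        v += random.uniform(-0.5, 0.5)   # crude dither before quantisation
    return max(0, min(255, round(v)))

ramp = list(range(40, 56))                         # a smooth dark-grey gradient
print([apply_lut(c) for c in ramp])                # repeated codes -> visible bands
print([apply_lut(c, dither=True) for c in ramp])   # error is spread out instead
```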

Otherwise colors in games (read: how games are rendered) should be the same in all cases because, unlike in the past, today's GPUs have to adhere to standards to get certification. In other words, rendering in DirectX has to exactly match Microsoft's reference renderer, and it does.
At least in theory. Default settings in both AMD and Nvidia are slightly off Microsoft's target but can be configured to be exactly the same by disabling the special optimizations in the respective control panels.
 
Nope, AMD/ATI default colours have always been slightly more saturated than Nvidia's default. And our eyes are naturally drawn to more saturated colours; that's why some people think that AMD/ATI looks better.
Maybe it was because I came from a Matrox G400 and bumped up to an Nvidia GeForce 2 MX400. After the swap I was confused and disappointed by the overly vibrant colours. It was a quick un-tick in the driver suite and all was good.
 
Maybe it was because I came from a Matrox G400 and bumped up to an Nvidia GeForce 2 MX400. After the swap I was confused and disappointed by the overly vibrant colours. It was a quick un-tick in the driver suite and all was good.

Yeah, you are right. Back in the day Nvidia had the digital vibrance (basically saturation) turned up a lot by default. They changed this default later to a much more neutral setting.
 
Nah.. I really don't think so. You would have to go back pre-digital, to RAMDACs, to see those sorts of differences. Once we went digital that completely vanished. And testing with a colorimeter would likely show that, if a person went and tested vintage hardware.

It still happened even after we went digital. Sometimes the AMD driver defaults to YCbCr 4:4:4, though not as much recently. Sometimes it varies depending on the monitor.

I will agree with you though, that nowadays, for the most part, there is no difference.

Funny you should mention colorimeters. I would have to resort to using one for those especially difficult customers who would swear that there was a difference, even when there wasn't any.
 
Yes, there were definite differences and my eyes were telling me the truth. I spent 6 months with that 980Ti and no amount of tweaking or pushing changed a single solitary thing. The colors were washed out and nothing changed that and no, it was not the cable, the monitor or my settings, it was the NVidia card itself and / or their drivers at that time. The AMD cards I ran on that very same monitor looked and worked extremely well so, agree with me or not, no skin off my back.

Edit: Oh, and I am an IT professional so I knew what I was doing and how to follow instructions as people posted them. The facts are simply the facts, nothing more and nothing less. Also, I was using DisplayPort on that 28 inch 4K Samsung monitor in 2015 and into 2016. Everything was set up as it should have been and it made no difference, so yes, NVidia colors were not as good and were washed out.

Saying that you are an IT professional and that you couldn't get an Nvidia card and an AMD card to look the same on the one monitor is an oxymoron.

Back in the analog days this might have been true. Now in the digital age, GPUs and monitors have to meet certain requirements to get certification, Directx rendering etc.

Is it possible that you noticed a difference when you changed over? Yes, very possible as I said in my previous post.

However, your statement that "no amount of tweaking or pushing changed a single thing" is complete horse manure. They are both digital, both have to conform to the same standards. Both can be adjusted to use the same colours in the same colour space. And you, a supposed IT professional, are trying to tell me that you couldn't get the Nvidia card to output the same colours as the AMD card. 5 minutes with any of the online calibration tools would have sorted this, that's if you checked to make sure that the Nvidia card was outputting full range RGB instead of limited. And by the sounds of your posts, you didn't.
 
Saying that you are an IT professional and that you couldn't get an Nvidia card and an AMD card to look the same on the one monitor is an oxymoron.

Back in the analog days this might have been true. Now in the digital age, GPUs and monitors have to meet certain requirements to get certification, Directx rendering etc.

Is it possible that you noticed a difference when you changed over? Yes, very possible as I said in my previous post.

However, your statement that "no amount of tweaking or pushing changed a single thing" is complete horse manure. They are both digital, both have to conform to the same standards. Both can be adjusted to use the same colours in the same colour space. And you, a supposed IT professional, are trying to tell me that you couldn't get the Nvidia card to output the same colours as the AMD card. 5 minutes with any of the online calibration tools would have sorted this, that's if you checked to make sure that the Nvidia card was outputting full range RGB instead of limited. And by the sounds of your posts, you didn't.

No, it is not an oxymoron; it is simply a fact that was borne out in reality, nothing more and nothing less. The fact was, the Nvidia card, in this case the EVGA 980Ti, looked like washed out garbage on that monitor and none of the AMD cards did.

Once again, no amount of tweaking and pushing was going to change that fact, nothing more and nothing less. The fact that I am an IT professional indicates that I had no issue following directions and I fully understand that I do not know all things at all times. However, I did know the fact of what was occurring on my setup, and it was not going to change, even after 6 months of trying to do otherwise.
 
No, it is not an oxymoron; it is simply a fact that was borne out in reality, nothing more and nothing less. The fact was, the Nvidia card, in this case the EVGA 980Ti, looked like washed out garbage on that monitor and none of the AMD cards did.

Once again, no amount of tweaking and pushing was going to change that fact, nothing more and nothing less. The fact that I am an IT professional indicates that I had no issue following directions and I fully understand that I do not know all things at all times. However, I did know the fact of what was occurring on my setup, and it was not going to change, even after 6 months of trying to do otherwise.
Every time you say that no amount of tweaking changed anything, it further emphasises that you didn't know what the hell you were doing. You see, what you say is actually impossible. You do realise that? If you really are an IT professional then you would understand how foolish your comments are. All GPUs can be tweaked to output exactly the way you want. You can set the brightness, contrast and gamma for each individual colour as well as all the colours combined. You can change the vibrance and hue, and you can change the colour space and range used. And you are trying to convince me that you couldn't get them to look the same? LOL. Seriously dude? It's not magic. AMD doesn't have some secret tech that makes their cards output in different colours than Nvidia. They all use the same standard; they have to. Any colour that an AMD card can output, an Nvidia card can output. Don't you get that?

This myth that one GPU manufacturer has better colours/picture quality has been perpetuated by people who are either clueless or have an agenda to defend their favourite brand. It has been debunked so many times. Can they look slightly different, even on the same monitor, if you switch between them? Yes they can, due to not reading the monitor's EDID correctly, applying the wrong colour space, etc. But the chief cause of washed-out-looking colours on GPUs is a mismatch in ranges between the display and the GPU: either the display is on full range and the GPU is limited, or vice versa.

You didn't change the output range from limited to full in the Nvidia control panel. Your problem was as simple as that.
 
Every time you say that no amount of tweaking changed anything, it further emphasises that you didn't know what the hell you were doing.

Nah, it just simply means you are not willing to deal with what is right in front of you. Here be the facts, you are not willing to accept those facts, it does not change the facts and things remain as they were. (Were because this was a bit over 6 years ago.) Also, claiming something is impossible in the IT space is indicative of someone putting their head in the sand and showing that they actually are unwilling to learn any further. Simply put, there is no such thing as impossible so.......

Also, you seem to not realize that the 980Ti could only output in 8 bit color, while the AMD cards were able to output in 10 bit color. However, I do not believe the washed out appearance on the 980Ti was affected by that, and once again, nothing changed that, so I moved on.
 
Nah, it just simply means you are not willing to deal with what is right in front of you. Here be the facts, you are not willing to accept those facts, it does not change the facts and things remain as they were. (Were because this was a bit over 6 years ago.)

What are you talking about? Show me any technical fact that you have stated. Show me one actual technical truth. Show me one piece of science in your posts that backs up your idiotic comments.

You really should go learn something. I mean, it's all out there. Have you never heard of calibration? Do you know anything about colour gamuts? And you plainly and obviously don't know jack about the effects of mismatched output ranges.

You trying to suggest that two cards could not be made to look the same on the same monitor is absolute nonsense, and your reply about facts just shows how ignorant you are.
 
What are you talking about? Show me any technical fact that you have stated. Show me one actual technical truth. Show me one piece of science in your posts that backs up your idiotic comments.

You really should go learn something. I mean, it's all out there. Have you never heard of calibration? Do you know anything about colour gamuts? And you plainly and obviously don't know jack about the effects of mismatched output ranges.

You trying to suggest that two cards could not be made to look the same on the same monitor is absolute nonsense, and your reply about facts just shows how ignorant you are.

I am not suggesting anything, I am simply stating a fact of what actually occurred. I spent 6 months with that 980Ti card and the same monitor and nothing changed how it actually displayed on that screen.
 
I am not suggesting anything, I am simply stating a fact of what actually occurred. I spent 6 months with that 980Ti card and the same monitor and nothing changed how it actually displayed on that screen.
Hmm 🤔 🧐
 
This thread is why we have reviews instead of just listening to what people feel.
Because people are biased and that blinds them to reality.
I had to register, because I remembered reading this thread here on this very forum:
https://hardforum.com/threads/a-real-test-of-nvidia-vs-amd-2d-image-quality.1694755/

An objective test, with hard data that invalidates any user poll.
But it does show which users are affected by bias and to which side their bias leans.

But that is the only thing you can garner from this thread/poll.
Simple human bias.
 