AMD vs nVidia Colors - Concrete proof of differences

Quality of Rendering, who does it better? AMD or NVIDIA?


  • Total voters: 61

erek

What's your opinion on the qualitative rendering differences between AMD and NVIDIA (beyond even frame times)?

[Attached images: AMD vs NVIDIA comparison screenshots from the linked Reddit post]


https://www.reddit.com/r/Amd/comments/n8uwqk/amd_vs_nvidia_colors_concrete_proof_of_differences/
 
I can make two photos look different by taking them from slightly different angles without even having to change the GPU in between. Seriously, do people really reach for a camera before the Print Screen key?
 
I can make two photos look different by taking them from slightly different angles without even having to change the GPU in between. Seriously, do people really reach for a camera before the Print Screen key?
I use the Print Screen key/button, but the conversation also extends beyond stills to the perceptible differences in rendering in motion, beyond what frame-time measurements capture.
 
I have proof, here's a poll, come on...
this:
I can make two photos look different by taking them from slightly different angles without even having to change the GPU in between. Seriously, do people really reach for a camera before the Print Screen key?
Moving to the other side of my couch completely changes the look on my setup...
 
Haha, that isn't "concrete" proof, those are just shitty pictures. I seriously doubt they render colors differently, and Nvidia sacrificing color accuracy for fps? lol. Show me some measured data (colorimeter, spectrometer, etc.)... This is about as bad as audiophile cable nonsense.
 
Not clicking the link. However, in my own personal experience over the years, AMD has had far better color appearance than Nvidia on the same monitor across different graphics cards. If you do not agree with me, I do not care; I know what my eyes tell me, and that is all that matters to me.
 
I am not sure how it would not show up if there were a difference in the render buffer and you compared the raw image colors bit by bit. It is not hard to believe that there could be differences from driver to driver and so on between Intel, Nvidia, and AMD, but I would imagine it would be easy to prove, even just with a screenshot, for someone with zero tech skills.
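Something like this is what I have in mind (a minimal sketch, assuming Pillow is installed and you are on Windows or macOS where ImageGrab works): hash the raw pixel buffer so the same still frame captured on two cards can be compared byte for byte.

```python
# Minimal sketch: grab a screenshot and hash the raw RGB pixel buffer, so the
# same paused frame or static test image captured on two GPUs can be compared
# bit for bit. Assumes Pillow is installed (ImageGrab works on Windows/macOS).
import hashlib

from PIL import ImageGrab

def screenshot_fingerprint() -> str:
    """Capture the screen and return a SHA-256 of the raw RGB pixel data."""
    frame = ImageGrab.grab().convert("RGB")  # what the OS composited, before the monitor/cable
    return hashlib.sha256(frame.tobytes()).hexdigest()

if __name__ == "__main__":
    # Run this while the same static test image is displayed on each system;
    # identical hashes mean identical pixel values, byte for byte.
    print(screenshot_fingerprint())
```

If the hashes match, the cards produced the same pixels and any remaining difference is the monitor or your eyes; if they don't, a pixel diff would show where.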
 
I am not sure how it would not show up if there were a difference in the render buffer and you compared the raw image colors bit by bit. It is not hard to believe that there could be differences from driver to driver and so on between Intel, Nvidia, and AMD, but I would imagine it would be easy to prove, even just with a screenshot, for someone with zero tech skills.
A screenshot might not capture any differences, at a guess. Whatever is shown on your particular monitor is output however your video card/driver/Windows/monitor shows it (as in, it would likely look the very same as if you were the one who did the screen capture, even if the original was on a completely different setup).

At a guess, there are no real differences other than what AMD or Nvidia set for the default color values (like Nvidia's below). If there are, and a tech site wanted to prove it: side by side, otherwise identical PCs using the same monitor (so monitor variance is removed), the only difference being an Nvidia GPU versus an AMD GPU, both using default values. Take some video/photographs using the same cameras at the same distances/angles and then report on it. Anything less is just clickbait to me.
 

Attachments

  • Capture.JPG (119.2 KB)
My vote is "I don't know." Subjectively, in the past I would have said AMD had better colors; today, owning both Nvidia and AMD, I have no clue. Which means it makes zero difference in my case now.
 
My vote is "I don't know." Subjectively, in the past I would have said AMD had better colors; today, owning both Nvidia and AMD, I have no clue. Which means it makes zero difference in my case now.
Hmm
 
Every time I install a new video card I get the placebo feeling that image quality somehow improved, until after a few minutes I come back to my senses. There was genuinely a time when AMD had better colors and rendering, I think around the 9700 or X800 XT era.

However, I have not felt the same since the Nvidia 7800/8800 series cards.

If you see a difference, then you are probably in the same placebo state. If there were actual differences, you would see them in the many head-to-head IQ comparisons/reviews that are posted routinely these days, and the YouTube journalists wouldn't shut up about it.
 
There were the days of 16- vs 24- vs 32-bit colour, at least until the 9700 Pro / 5900 Ultra (if I understood correctly, ATI aimed to optimize 24-bit performance, while Nvidia went for 32-bit too quickly with 16-bit as the alternative; gaming ended up being too slow at 32-bit, so it often ran partly in 16-bit on Nvidia cards vs 24-bit on Radeon).
 
I always calibrate and profile my monitor with a colorimeter when I change graphics cards, so I guess it doesn't affect me.

So far, beyond anecdotal evidence, I haven't seen any definitive testing that provides proof or evidence of these differences. There are also so many variables at play here.
 
I run an AMD card on my home entertainment PC and an Nvidia card on my main gaming rig. I use a Spyder color calibrator. They both look the same to me after calibration, despite being on different screens. I wouldn't trust a monitor or GPU out of the box to be set up perfectly. Just not possible.
 
SpyderX Pro. Check FB Marketplace or local ads for a used one before buying new. A lot of people use it once and then never bother again.
 
SpyderX Pro. Check FB Marketplace or local ads for a used one before buying new. A lot of people use it once and then never bother again.
I took your advice and am looking for a used one. Does it make that big of a difference? Sorry to sound dumb, I mostly game, but color is very important to me.
 
The information that goes through your DisplayPort cable is DIGITAL.

If you still use a VGA port, there might be something to talk about, except those issues were about eight generations back and were fixed long, long ago.
You will see differences between different LCDs.
In a blind test with the same image, on the same display, with the same color settings in Windows, on different GPUs, you aren't going to see a difference.
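For what it's worth, here is a rough sketch of what the screenshot version of such a blind test could look like in Python. It assumes you have already saved the same frame captured on each card, and the filenames are made up; it only tests whether the captured pixels look different when shown on one and the same system, which is the point of keeping everything else constant.

```python
# Rough sketch of a blind A/B viewing test over saved captures of the same
# frame from each card. The filenames below are hypothetical placeholders.
import random

from PIL import Image

IMAGES = {"A": "frame_amd.png", "B": "frame_nvidia.png"}  # hypothetical files

def blind_trial() -> bool:
    """Show both images in random order; return True if the guess was correct."""
    order = list(IMAGES.items())
    random.shuffle(order)  # the viewer doesn't know which capture is which
    for shown_label, (_, path) in zip(("1", "2"), order):
        Image.open(path).show(title=f"Image {shown_label}")
    guess = input("Which image came from the AMD card, 1 or 2? ").strip()
    actual = "1" if order[0][0] == "A" else "2"
    return guess == actual

if __name__ == "__main__":
    results = [blind_trial() for _ in range(10)]
    print(f"Correct {sum(results)}/10 times; about 5/10 is what chance alone gives.")
```

If the guesses hover around chance, the "better colors" are coming from somewhere other than the pixels.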
 
I thought this was all lost to the old analog days when Matrox reigned supreme. My assumption, maybe wrongly, is that the HDMI/DisplayPort standards left little wiggle room for variance. There are so many settings to tweak a GPU's output over time that I think it is pretty illogical to compare and judge a newly slapped-in card on all defaults.
Anyway, as an owner of many cards from many brands, I have never noticed a difference, though to be honest I have never compared side by side. So many options to tweak, so why care?
PS: Am I missing a bigger point?
 
Every time I install a new video card I get the placebo feeling that image quality somehow improved, until after a few minutes I come back to my senses. There was genuinely a time when AMD had better colors and rendering, I think around the 9700 or X800 XT era.

However, I have not felt the same since the Nvidia 7800/8800 series cards.

If you see a difference, then you are probably in the same placebo state. If there were actual differences, you would see them in the many head-to-head IQ comparisons/reviews that are posted routinely these days, and the YouTube journalists wouldn't shut up about it.
Came here to say just this.
This was indeed a thing years ago with NVIDIA and ATI (circa 2006 and earlier) where one generation and iteration looked better than the other, and tech magazines and sites actually did show physical evidence and technical proof of this.

But after 2007 this hasn't been a thing for either NVIDIA or AMD, and not after 2011 with Intel.
 
I thought this was all lost to the old analog days when Matrox reigned supreme. My assumption, maybe wrongly, is that the HDMI/DisplayPort standards left little wiggle room for variance. There are so many settings to tweak a GPU's output over time that I think it is pretty illogical to compare and judge a newly slapped-in card on all defaults.
Anyway, as an owner of many cards from many brands, I have never noticed a difference, though to be honest I have never compared side by side. So many options to tweak, so why care?
PS: Am I missing a bigger point?

My eyes definitely saw differences back when the last Nvidia card I owned was a 980 Ti. Even a Vega 56, an R9 290 Pro, and my Furies all looked better on the exact same monitor. And no, it was not the monitor, and no amount of tweaking and pushing was going to change a thing. Nowadays I do not know, since I no longer own any Nvidia cards, but it was not a placebo and it was not just my mind playing tricks on me. :)
 
None of my applications require this kind of color accuracy anymore. I didn't think it even mattered unless you are working in print or other physical media.
 
None of my applications require this kind of color accuracy anymore. I didn't think it even mattered unless you are working in print or other physical media.
I'm not questioning just color quality, but the overall rendering over time, in terms of perception and the feeling of quality.
 
Back in the day, there were significant differences. Today, dunno. I know they can still render two identical OpenGL commands completely differently, but that has nothing to do with color accuracy.
 
Back in the day, when reviews would take color accuracy into account, I was gaming in software mode and integrated graphics weren't a thing in the household.

By the time I built my first PC, DVI and HDMI were the mid- to high-end display solutions. That coincides with my first job, where I often heard my coworkers tearing their hair out trying to color-match corporate logos on our shitty, decades-old computers. I could barely tell the difference back then, so I know it could still be a concern.

I think the only scenario where an average person would notice anything peculiar is going to be a streaming situation.
And maybe with however the hell frame interpolation works.
 
The video card, the monitor, the printer, the operating system, the programs: all can only reproduce a certain number of colors.
You would need to calibrate them (drivers); then, if you're taking pictures with a camera, it too has its own processing (drivers), and your eyes are not the same as everyone else's.
 
I can make two photos look different by taking them from slightly different angles without even having to change the GPU in between. Seriously, do people really reach for a camera before the Print Screen key?
Given that the screenshot would (potentially) be rendered differently on your system depending on your card/screen/etc., I'm not even sure HOW you'd capture that. Same issue with a picture, unless you're staring at an ultra-high-end art print.
 
Given that the screenshot would (potentially) be rendered differently on your system depending on your card/screen/etc., I'm not even sure HOW you'd capture that. Same issue with a picture, unless you're staring at an ultra-high-end art print.
Some people are sensitive to 60 Hz screen flickering. I know one gamer friend who says this phenomenon even gives them a headache if the refresh rate isn't sufficiently high. I believe there is a rendering quality difference and would like to have it verified and tested in an objective manner. Maybe frame time is a start, but idk 🤷
 
Given that the screenshot would (potentially) be rendered differently on your system depending on your card/screen/etc., I'm not even sure HOW you'd capture that. Same issue with a picture, unless you're staring at an ultra-high-end art print.
You can just diff the two screenshots. If you think the difference might be due to colour settings in the driver that aren't applied to screenshots, then it's not a real rendering difference, as it can be removed by simply setting the driver to neutral values (which is already the default when I checked the settings on my system with an AMD card). If you want to capture that difference anyway, then an HDMI capture card will see exactly what is sent to the monitor. As far as I know, no one has ever demonstrated a real measurable difference like that, despite a lot of claims over the years.
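A minimal sketch of that diff, assuming numpy and Pillow are installed and you have two captures of the same frame saved (the filenames are placeholders):

```python
# Hedged sketch of "just diff the two screenshots": load two captures of the
# same frame and report how many pixels differ and by how much.
import numpy as np
from PIL import Image

def diff_report(path_a: str, path_b: str) -> None:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("Screenshots must be the same resolution to compare")
    delta = np.abs(a - b)  # per-channel absolute difference
    differing = int(np.count_nonzero(delta.any(axis=-1)))
    print(f"Differing pixels: {differing} of {a.shape[0] * a.shape[1]}")
    print(f"Largest per-channel difference: {int(delta.max())} / 255")

if __name__ == "__main__":
    diff_report("screenshot_amd.png", "screenshot_nvidia.png")  # placeholder names
```

Zero differing pixels means the two cards produced identical output for that frame; anything the driver's colour sliders add on top of that wouldn't show up here anyway, for the reason above.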
 
Looks like the default brightness is too high on the Nvidia sample. Other than that, I doubt there is any difference.

Maybe that's why you don't stick with the default GeForce Experience settings?
 
Reminds me of that hotly debated washed-out image of a dress that went viral in 2015. It's all about perception.

But this is why I stopped giving an F about what others think in all areas of life. I know for a fact that what my eyes are seeing is real, despite others in the past saying I had to adjust this or tweak that, and nothing changed or improved. Now, I do support the RTings site in many things, because at least they are mostly objective in their reviews.
 
Back in the day, there were significant differences. Today, dunno. I know they can still render two identical OpenGL commands completely differently, but that has nothing to do with color accuracy.
This is what I thought people were talking about: rendering in a buffer and making an exact bit-by-bit comparison to prove it, as the talk that it was done for performance gains implied it was not some color grading affair.

And that back in the day it was exactly that, a literally different render.
 
I read on here long ago that AMD was known for setting its defaults for more vibrant colors, and if you so decided, you could change settings on Nvidia cards to match. It is like TVs in stores being in "demo mode" vs the same model set for accurate colors. You can achieve either look by altering settings.
 
You can just diff the two screenshots. If you think the difference might be due to colour settings in the driver that aren't applied to screenshots, then it's not a real rendering difference, as it can be removed by simply setting the driver to neutral values (which is already the default when I checked the settings on my system with an AMD card). If you want to capture that difference anyway, then an HDMI capture card will see exactly what is sent to the monitor. As far as I know, no one has ever demonstrated a real measurable difference like that, despite a lot of claims over the years.
I'm talking beyond stills here, about the temporal aspects and the feel of the rendering.
 