AMD vs nVidia Colors - Concrete proof of differences

Quality of Rendering, who does it better? AMD or NVIDIA?


Total voters: 33

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
8,848
Any opinions on the qualitative rendering differences between AMD and NVIDIA (beyond frame times, even)?

[Attached images: 1670449825476.png, 1670449841709.png]


https://www.reddit.com/r/Amd/comments/n8uwqk/amd_vs_nvidia_colors_concrete_proof_of_differences/
 

Mr Evil

Limp Gawd
Joined
Jul 11, 2015
Messages
193
I can make two photos look different by taking them from slightly different angles without even having to change the GPU in between. Seriously, do people really reach for a camera before the Print Screen key?
 

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
8,848
I can make two photos look different by taking them from slightly different angles without even having to change the GPU in between. Seriously, do people really reach for a camera before the Print Screen key?
I use the Print Screen key/button, but the conversation also extends beyond stills to the perceptible differences in the rendering in motion, beyond the measurement of frame times.
 

pendragon1

Extremely [H]
Joined
Oct 7, 2000
Messages
44,866
i have proof, here's a poll, come on...
this:
I can make two photos look different by taking them from slightly different angles without even having to change the GPU in between. Seriously, do people really reach for a camera before the Print Screen key?
moving to the other side of my couch completely changes the look on my setup...
 

|Tch0rT|

[H]ard|Gawd
Joined
Jan 11, 2007
Messages
1,309
Haha, that isn't "concrete" proof; those are just shitty pictures. I seriously doubt they render colors differently, and Nvidia sacrificing color accuracy for fps? lol. Show me some measured data (colorimeter, spectrometer, etc.)... This is about as bad as audiophile cable nonsense.
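For what it's worth, if someone did put a colorimeter on both setups, turning the readings into a number is trivial. A minimal sketch in Python, assuming CIELAB readings of the same test patch are already in hand (the Lab values below are made-up placeholders, not real measurements):

```python
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two CIELAB colors (CIE76 delta E).
    Rule of thumb: below ~1 is imperceptible, 2-3 is just noticeable."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical readings of the same mid-gray patch on the same monitor,
# once driven by the AMD card and once by the Nvidia card.
amd_patch = (53.4, 0.2, -0.5)     # placeholder L*, a*, b*
nvidia_patch = (53.1, 0.3, -0.4)  # placeholder L*, a*, b*

print(f"delta E: {delta_e_cie76(amd_patch, nvidia_patch):.2f}")
```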
 

ManofGod

[H]F Junkie
Joined
Oct 4, 2007
Messages
12,719
Not clicking the link. However, in my own personal experience over the years, AMD has had far better color than Nvidia on the same monitor across different graphics cards. If you do not agree with me, I do not care; I know what my eyes tell me, and that is all that matters to me.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,635
I am not sure how it would not show up if there were a difference in the render buffer and you compared the raw image color bit by bit. It is not hard to believe that there could be differences from driver to driver and so on across Intel, Nvidia, and AMD, but I would imagine it would be easy to prove, even just with a screenshot, for someone with zero tech skills.
 

arestavo

[H]ard|Gawd
Joined
Mar 25, 2013
Messages
1,843
I am not sure how it would not show up if there were a difference in the render buffer and you compared the raw image color bit by bit. It is not hard to believe that there could be differences from driver to driver and so on across Intel, Nvidia, and AMD, but I would imagine it would be easy to prove, even just with a screenshot, for someone with zero tech skills.
A screenshot might not capture any differences, at a guess. Whatever is in the screenshot would be displayed through your own video card/driver/Windows/monitor chain, so it would likely look exactly the same as if you had taken the capture yourself, even if the original came from a completely different setup.

At a guess, there are no real differences other than what AMD or Nvidia set for the default color values (like Nvidia's below). If there are, and a tech site wanted to prove it: side-by-side identical PCs using the same monitor (removing monitor variance), with the only difference being an Nvidia GPU versus an AMD GPU, both at default values. Take some video/photographs with the same camera at the same distances and angles, then report on it. Anything less is just clickbait to me.
 

Attachments

  • Capture.JPG (119.2 KB)

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
7,072
My vote is: "I don't know". Subjectively, in the past I would have said AMD had better colors; today, owning both Nvidia and AMD, I have no clue. Which means it makes zero difference in my case now.
 

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
8,848
My vote is: "I don't know". Subjectively, in the past I would have said AMD had better colors; today, owning both Nvidia and AMD, I have no clue. Which means it makes zero difference in my case now.
Hmm
 

KickAssCop

Supreme [H]ardness
Joined
Mar 19, 2003
Messages
7,426
Every time I install a new video card I get the placebo feeling that IQ somehow improved, until after a few minutes I come back to my senses. There was genuinely a time when AMD had better colors and rendering; I think around the 9700 or X800 XT era.

However, I have not felt the same since the Nvidia 7800/8800 series cards.

If you see a difference, you are probably in the same placebo state. If there were actual differences, they would show up in the many head-to-head IQ comparisons/reviews that are posted routinely these days, and the YouTube journalists would never shut up about it.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,635
There were the days of 16- vs 24- vs 32-bit precision, at least up until the 9700 Pro / 5900 Ultra (if I understood correctly, ATI aimed to optimize 24-bit performance, while Nvidia went for 32-bit too quickly with 16-bit as the alternative; gaming ended up being too slow at 32-bit, so it often ran partly in 16-bit on Nvidia cards versus 24-bit on Radeon).
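As a toy illustration of why that precision gap could matter, here is a sketch in NumPy (FP24 is not a NumPy type, so FP16 vs FP32 has to stand in for the idea, and real shaders obviously don't run this exact loop). It accumulates many small lighting-style contributions and shows the half-precision running total stalling once each contribution is smaller than half the gap between representable values:

```python
import numpy as np

# Accumulate 8000 small contributions of 0.0005 each; the exact sum is 4.0.
def accumulate(dtype, n=8000, term=0.0005):
    total = dtype(0)
    t = dtype(term)
    for _ in range(n):
        total = dtype(total + t)  # each add is rounded at this precision
    return float(total)

print("float32:", accumulate(np.float32))  # stays close to 4.0
print("float16:", accumulate(np.float16))  # stalls around 2.0
```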
 

Captain Newmackwa

Limp Gawd
Joined
Mar 20, 2017
Messages
263
I always calibrate and profile my monitor with a colorimeter when I change graphics cards so I guess it doesn't affect me.

So far, beyond anecdotal evidence, I haven't seen any definitive testing that provides proof or evidence of these differences. There are also so many variables at play here.
 

Slade

2[H]4U
Joined
Jun 9, 2004
Messages
2,927
I run an AMD card on my home entertainment PC and an Nvidia card on my main gaming rig. I use a Spyder color calibrator. They both look the same to me after calibration, despite being on different screens. I wouldn't trust a monitor or GPU out of the box to be set up perfectly. Just not possible.
 

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
8,848

Slade

2[H]4U
Joined
Jun 9, 2004
Messages
2,927
Spyder x pro. Check fb marketplace or local ads for a used one before buying new. A lot of people use it once then never bother again.
 

Gulkor

[H]ard|Gawd
Joined
Jan 26, 2007
Messages
1,411
Spyder x pro. Check fb marketplace or local ads for a used one before buying new. A lot of people use it once then never bother again.
I took your advice and am looking for a used one. Does it make that big of a difference? Sorry if that sounds dumb; I mostly game, but color is very important to me.
 

GoodBoy

2[H]4U
Joined
Nov 29, 2004
Messages
2,456
The information that goes through your DisplayPort cable is DIGITAL.

If you still use a VGA port there might be something to talk about, except those issues were about eight generations back and got fixed long, long ago.
You will see differences between different LCDs.
In a blind test with the same image, on the same display, with the same color settings in Windows, on different GPUs, you aren't going to see a difference.
 

funkydmunky

2[H]4U
Joined
Aug 28, 2008
Messages
3,366
I thought this was all left behind in the old analog days when Matrox reigned supreme. My understanding, maybe wrongly, is that the HDMI/DisplayPort standards leave little wiggle room for variance. There are so many settings that shape a GPU's output over time that I think it is pretty illogical to compare and judge a freshly installed card on all defaults.
Anyway, as an owner of many cards from both brands, I have never noticed a difference, though to be honest I have never compared side by side. So many options to tweak, so why care?
PS: Am I missing a bigger point?
 

Red Falcon

[H]F Junkie
Joined
May 7, 2007
Messages
11,900
Every time I install a new video card I get the placebo feeling that IQ somehow improved, until after a few minutes I come back to my senses. There was genuinely a time when AMD had better colors and rendering; I think around the 9700 or X800 XT era.

However, I have not felt the same since the Nvidia 7800/8800 series cards.

If you see a difference, you are probably in the same placebo state. If there were actual differences, they would show up in the many head-to-head IQ comparisons/reviews that are posted routinely these days, and the YouTube journalists would never shut up about it.
Came here to say just this.
This was indeed a thing years ago with NVIDIA and ATI (circa 2006 and earlier) where one generation and iteration looked better than the other, and tech magazines and sites actually did show physical evidence and technical proof of this.

But after 2007 this hasn't been a thing for either NVIDIA or AMD, and not after 2011 with Intel.
 

ManofGod

[H]F Junkie
Joined
Oct 4, 2007
Messages
12,719
I thought this was all left behind in the old analog days when Matrox reigned supreme. My understanding, maybe wrongly, is that the HDMI/DisplayPort standards leave little wiggle room for variance. There are so many settings that shape a GPU's output over time that I think it is pretty illogical to compare and judge a freshly installed card on all defaults.
Anyway, as an owner of many cards from both brands, I have never noticed a difference, though to be honest I have never compared side by side. So many options to tweak, so why care?
PS: Am I missing a bigger point?

My eyes definitely saw differences back when the last Nvidia card I owned was a 980 Ti. Even a Vega 56, an R9 290 Pro, and my Furies all looked better on the exact same monitor. And no, it was not the monitor, and no amount of tweaking and pushing was going to change a thing. Nowadays I do not know, since I no longer own any Nvidia cards, but it was not a placebo and it was not just my mind playing tricks on me. :)
 

Tactlesss

Limp Gawd
Joined
Sep 2, 2017
Messages
208
None of my applications require this kind of color accuracy anymore. I didn't think it even mattered unless you are working in print or other physical media.
 

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
8,848
None of my applications require this kind of color accuracy anymore. I didn't think it even mattered unless you are working in print or other physical media.
I’m not questioning just color quality but the overall rendering over time in terms of perception and feeling of quality.
 

Nobu

[H]F Junkie
Joined
Jun 7, 2007
Messages
8,967
Back in the day, there were significant differences. Today, dunno. I know they can still render two identical opengl commands completely differently, but that has nothing to do with color accuracy.
 

Tactlesss

Limp Gawd
Joined
Sep 2, 2017
Messages
208
Back in the day when reviews would take color accuracy into account I was gaming in software mode and integrated graphics weren't a thing in the household.

By the time that I built my first PC, DVI and HDMI were the mid to high end display solutions. Which coincides with my first job where I often heard my coworkers tearing their hair out trying to color match corporate logos with our shitty, decades old computers. I could barely tell the difference back then so I know that it could still be a concern.

I think that the only scenario where an average person would notice anything peculiar is going to be in a streaming situation.
And maybe with however the hell frame interpolation works.
 

Fresch

Weaksauce
Joined
Mar 14, 2018
Messages
78
The video card, the monitor, the printer, the operating system, the programs: each can only reproduce a limited range of colors.
You would need to calibrate them (and their drivers); then, if you're taking pictures with a camera, it has its own processing too, and your eyes are not the same as everyone else's.
 

lopoetve

Extremely [H]
Joined
Oct 11, 2001
Messages
33,589
I can make two photos look different by taking them from slightly different angles without even having to change the GPU in between. Seriously, do people really reach for a camera before the Print Screen key?
Given that the screenshot would (potentially) be rendered differently on your system depending on your card/screen/etc., I'm not even sure HOW you'd capture that. Same issue with a picture, unless you're staring at an ultra-high-end art print.
 

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
8,848
Given that the screenshot would (potentially) be rendered differently on your system depending on your card/screen/etc., I'm not even sure HOW you'd capture that. Same issue with a picture, unless you're staring at an ultra-high-end art print.
Some people are sensitive to 60 Hz screen flickering. I know one gamer friend who says this phenomenon even gives them a headache if the refresh rate isn't sufficiently high. I believe there is a rendering quality difference and would like to have it verified and tested in an objective manner. Maybe frame time is a start, but idk 🤷
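If frame times are the start, at least the raw data is easy to get at and summarize. A rough sketch, assuming a plain text file with one frame time in milliseconds per line (the filename and format are placeholders; capture tools like PresentMon or CapFrameX export richer CSVs you would convert from):

```python
import numpy as np

# Placeholder input: one frame time in milliseconds per line.
frame_times_ms = np.loadtxt("frametimes.txt")

avg_fps = 1000.0 / frame_times_ms.mean()
p99 = np.percentile(frame_times_ms, 99)  # 99th-percentile frame time
low_1pct_fps = 1000.0 / p99              # a common approximation of "1% lows"

print(f"average FPS: {avg_fps:.1f}")
print(f"99th percentile frame time: {p99:.2f} ms")
print(f"1% low FPS (approx): {low_1pct_fps:.1f}")
```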
 

Mr Evil

Limp Gawd
Joined
Jul 11, 2015
Messages
193
Given that the screenshot would (potentially) be rendered differently on your system depending on your card/screen/etc., I'm not even sure HOW you'd capture that. Same issue with a picture, unless you're staring at an ultra-high-end art print.
You can just diff the two screenshots. If you think the difference might be due to colour settings in the driver that aren't applied to screenshots, then it's not a real rendering difference, as it can be removed by simply setting the driver to neutral values (which is already the default when I checked the settings on my system with an AMD card). If you want to capture that difference anyway, then an HDMI capture card will see exactly what is sent to the monitor. As far as I know, no one has ever demonstrated a real measurable difference like that, despite a lot of claims over the years.
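For what it's worth, the diff itself is only a few lines. A minimal sketch using Pillow and NumPy (the filenames are placeholders for lossless PNG screenshots of the same frame, captured at identical resolution and settings):

```python
import numpy as np
from PIL import Image

# Placeholder filenames: lossless screenshots of the same frame on each card.
a = np.asarray(Image.open("shot_amd.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("shot_nvidia.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)
print("pixels that differ:", int(np.count_nonzero(diff.any(axis=-1))))
print("max channel difference:", int(diff.max()))
print("mean channel difference:", float(diff.mean()))

# Amplified difference map, so any mismatch is obvious at a glance.
Image.fromarray(np.clip(diff * 16, 0, 255).astype(np.uint8)).save("diff_x16.png")
```

If that comes out all zeros, and an HDMI capture of the output does too, there is not much room left for a real rendering difference.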
 

kirbyrj

Fully [H]
Joined
Feb 1, 2005
Messages
30,332
Looks like the default brightness is too high on the Nvidia sample. Other than that, I doubt there is any difference.

Maybe that's why you don't stick with the default GeForce Experience settings?
 

ManofGod

[H]F Junkie
Joined
Oct 4, 2007
Messages
12,719
Reminds me of that hotly debated washed out image of a dress that went viral in 2015. It's all about perception.

But this is why I stopped giving an F about what others think in all areas of life. I know for a fact that what my eyes are seeing is real, despite others in the past saying I had to adjust this or tweak that, and nothing changed or improved. Now, I do support the RTings site in many things because at least they are mostly objective in their reviews.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,635
Back in the day, there were significant differences. Today, dunno. I know they can still render two identical opengl commands completely differently, but that has nothing to do with color accuracy.
This is what I thought people were talking about: rendering into a buffer and making an exact bit-by-bit comparison to prove it, since the talk of it being done for a performance gain implied it was not some color-grading affair.

And back in the day it was exactly that: a literally different render.
 

rinaldo00

[H]ard|Gawd
Joined
Mar 9, 2005
Messages
2,015
I read on here long ago that AMD was known for setting its defaults to more vibrant colors, and that if you wanted, you could change settings on Nvidia cards to match. It is like TVs in stores being in "demo mode" versus the same model set for accurate colors. You can achieve either look by altering settings.
 

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
8,848
You can just diff the two screenshots. If you think the difference might be due to colour settings in the driver that aren't applied to screenshots, then it's not a real rendering difference, as it can be removed by simply setting the driver to neutral values (which is already the default when I checked the settings on my system with an AMD card). If you want to capture that difference anyway, then an HDMI capture card will see exactly what is sent to the monitor. As far as I know, no one has ever demonstrated a real measurable difference like that, despite a lot of claims over the years.
I'm talking beyond stills here, about the temporal quality and feel of the rendering.
 