A Real Test of nVidia vs AMD 2D Image Quality

Sycraft

Since Brent passed on the article I bring you:

Sycraft's Image Quality Showdown

Something I’ve seen claimed many times on the Internet is that there is a visible difference in color and font quality between AMD and nVidia cards. People claim this even on new monitors, new cards, with 100% digital connections. This is something that seems impossible to me, so I decided to put it to a real objective test in this article.

TL;DR Version: There is no difference in image quality between nVidia and AMD.


Full Version

Overview of Claims

The claim we are evaluating is that there is a visible difference in 2D quality between AMD and nVidia cards. Numerous times on forums people have claimed that they see a difference in the colors, the fonts, or both. They are sure they've seen this difference when they've gotten a new card or looked at a different system. Generally the claim is that AMD has "richer" colors and better-looking fonts. The claim is that this is universal: it is a function of the company, not of a specific card, vendor, or monitor.

Additionally, the claim is that this happens on modern systems with all-digital connections. An LCD monitor connected to a video card with a DVI or DisplayPort cable is claimed to show the difference. So it isn't a difference in conversion to analog; it is an actual difference in the cards, in the way they handle colors.


Technical Details: Why it Shouldn’t be the Case

The problem is that when you learn a bit about how graphics actually work on computers, the claim seems impossible. There is no logical way it could be correct. The reason is that a digital signal remains the same, no matter how many times it is retransmitted or changed in form, unless something deliberately changes it.

In the case of color on a computer, you first have to understand how computers represent color. All colors are represented using what is called a tristimulus value, meaning it is made up of a red, green, and blue component. This is because our eyes perceive those three colors and use that information to give our brains the color detail we see.

Being digital devices, that means each of those three colors is stored as a number; in the case of desktop graphics, an 8-bit value from 0-255. You may have encountered this before in programs like Photoshop, which have three sliders, one for each color, demarcated in 256 steps, or in HTML code where you specify colors as #XXYYZZ: each pair of characters is a color value in hexadecimal (FF in hex is equal to 255 in decimal).
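
As a quick aside (a minimal sketch of my own, not part of the test itself), the decimal and hex notations describe the same 8-bit values; the hypothetical helpers below just convert between the two:

```python
def hex_to_rgb(hex_color: str) -> tuple:
    """Parse an HTML-style "#RRGGBB" string into an (R, G, B) triplet of 0-255 values."""
    hex_color = hex_color.lstrip("#")
    return tuple(int(hex_color[i:i + 2], 16) for i in range(0, 6, 2))


def rgb_to_hex(r: int, g: int, b: int) -> str:
    """Format an (R, G, B) triplet of 0-255 values as an HTML-style hex string."""
    return f"#{r:02X}{g:02X}{b:02X}"


print(hex_to_rgb("#FF8040"))     # (255, 128, 64)
print(rgb_to_hex(255, 128, 64))  # #FF8040
```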

When the computer wants a given color displayed, it sends that tristimulus value to the video card. The video card then decides what to do with it based on its lookup table. By default, the lookup table doesn't do anything; it is a straight line, specifying that the output should be the same as the input. It can be changed by the user in the control panel, or by a program such as a monitor calibration program. So by default, the value the OS hands the video card is the value the video card sends out over the DVI cable.
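
To make the "straight line" idea concrete, here is a minimal sketch of an identity lookup table; the names and structure are purely illustrative, not how any actual driver implements its LUT:

```python
# Sketch of a per-channel video card lookup table (LUT). By default it is an
# identity mapping: every 0-255 input level comes out unchanged, so the value
# the OS hands the card is the value sent over the digital link. A calibration
# program would replace this table with its own correction curve.

identity_lut = list(range(256))  # output[i] == i for every input level i

def apply_lut(rgb, lut=identity_lut):
    """Run each channel of an (R, G, B) triplet through the same LUT."""
    return tuple(lut[channel] for channel in rgb)

assert apply_lut((250, 190, 90)) == (250, 190, 90)  # identity: output == input
```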

What this all means is that the monitor should be receiving the same digital data from either kind of card, and thus the image should be the same. Thus it would seem the claim isn’t possible.


The Impartial Observer

For color, we need an impartial observer. Humans are notoriously bad with color perception, and lots of things can change what we perceive a color to be. Fortunately, I happen to have one on hand.

For this test we'll be using the i1Display Pro, aka i1Display 3, aka EODIS3. This is a modern, quality colorimeter that can read color levels with greater precision than the human eye. It is used for calibrating displays; this particular one is what I use to calibrate my display.

i1.jpg

It will be used to take measurements of various color patches and report what is actually on screen. I'll give the same patch set to both the AMD card and the nVidia card and see if there is any difference in measured color. This gives us good, objective results.


Test Setup

For these tests I'll be using my NEC MultiSync 2690WUXi monitor. It is a professional IPS monitor, designed with color accuracy in mind. It has internal lookup tables, so correction can be applied inside the monitor itself, independent of the device feeding it the signal. Measurements will be taken with the aforementioned i1Display Pro. The software I'll be using is NEC's SpectraView II. It is the monitor calibration software for this screen, but it can also generate color patches and report the results of their measurement.

In the nVidia corner I have a GeForce GTX 680 in my desktop, and in the AMD corner I have a Radeon 5850M in my laptop. The difference in power in these cards isn’t relevant to this test since we aren’t testing performance, or even 3D, just color and fonts.

The first step on both systems is to make sure that all onboard color modifications are turned off. We know that the cards can change the color if they want; the question is whether they are different despite identical settings. That means having brightness, contrast, gamma, digital vibrance, hue, color temperature, and all that set to no modification.

nvidia.png
amd.png

Then I'll use Monitor Calibration Wizard just to make sure the system lookup tables are unaltered: a straight line.

mcw.png

That done, it is time to take some measurements. The monitor is warmed up and calibrated, the room is dark, and the sensor is good to go.


Color Test

For this test I'll use a variety of color patches: one of each bright primary and secondary color, as well as three pastels and two neutral colors. Colors are measured on the standard CIE chart and expressed in terms of xy for the color coordinates and Y for luminance (brightness).

chart-1.png


Looking at the charts, you can already see that most colors are dead on with regard to the x and y values, and very close on the luminance (Y) value. In fact, what you see here is as perfect a match as this setup can measure. The meter is not a perfectly accurate device to begin with, and it is also slightly affected by temperature. There are also small variations across a screen, and I don't place the sensor on precisely the same spot every time I take a reading. A difference of ±0.001 xy is normal, in fact below the rated error for this device, and it is completely unnoticeable.

To quantify whether a difference is noticeable, the difference between two measurements can be expressed by a value called delta E. This is a calculation of the perceived difference between two colors to the human eye. A value under 4 is generally unnoticeable to most observers for most colors. A value of 2 or under is considered acceptable for print. A value of 1 is considered the smallest difference a human can perceive for any color, which is why the scale is anchored there.
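
For reference, the oldest and simplest formula, CIE76, is just the straight-line distance between two colors in L*a*b* space. The sketch below uses made-up readings, and calibration software may well use a newer formula such as CIE94 or CIEDE2000, but the interpretation of the number is similar:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 delta E: Euclidean distance between two (L*, a*, b*) colors."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Two hypothetical readings of the same patch from two different cards:
reading_a = (53.2, 80.1, 67.2)
reading_b = (53.3, 80.0, 67.3)
print(round(delta_e_76(reading_a, reading_b), 2))  # 0.17, far below the ~1.0 just-noticeable level
```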

chart-2.png


As we can see, the highest delta E for the readings is 0.34, and most are under 0.2: well below a perceptible level and right around the margin of error of the colorimeter. I am quite sure we'd have an even closer match with better equipment and precise placement on the monitor. The very, very small differences we are seeing are a result of imperfect measurement equipment, not any actual difference.

So as we can see, colors coming from AMD and nVidia cards are dead identical to within the limits of this test, and to within the limits of human perception. This is what we’d expect given the completely digital nature of all the connections: A given input value will be the same output value, unless something changes it along the way.

If you think you see a difference with your new card, you may want to check your color settings. To be sure, video cards can change color output, when you ask them to. However with everything turned off, they should look completely the same.



Font Test

For fonts, there isn't the luxury of a device that can just measure the way they look and give us an objective verdict. However, we can get a good side-by-side comparison using our eyes and a camera. What I've done here is take some extreme close-up pictures of my monitor displaying the same text, with the same font (Calibri), in the same program (Word), from both the GTX 680 and the 5850M. Thus we can get a pretty good look at the actual pixel structure of the text and see if any differences are noticeable. I've attached the full resolution shots so you can have a look at whatever you like, but let's compare a couple of letters. The slight difference in angle is due to me holding the camera.

b.png

The thing to notice isn’t just the shape of the letter, but the sub pixel anti-aliasing. Were there to be a difference, that would be where you’d see it. However as we can see it is the same in both cases. Sub pixel for sub pixel, the “b” from the 12 point font is the same on either card.

c.png

With the 16 point “c” the result is the same; the anti-aliasing is identical on the subpixel level.

There just isn’t any difference to be seen with the characters in the font. Both cards render them the same in every detail.

A possible reason you can see a difference between some systems is that Microsoft offers a ClearType tuner. It lets you change how ClearType (the subpixel anti-aliasing) is applied to fonts, to suit your liking. In these pictures it is at the default, but if you have run the tuner, text could look slightly different on your system. When comparing fonts for small details, you need to make sure it is set the same in both cases.


Conclusion

The results were as we’d expect: There is no difference in image quality based on the brand of card you buy. The days of 2D image quality being a function of particular graphics cards are long gone for most people. When CRTs were popular, it was something that mattered as the quality of the RAMDAC controlled the quality of the analog VGA output. Thus different cards, even with the same chipset, could potentially produce different 2D image quality.

However LCDs and DVI have changed all that. The signal is fully digital from the OS, to the card, to the monitor right up until the individual sub pixels are finally lit up. This removes any quality differences from card outputs, since they are outputting a completely accurate digital signal. The only time they change anything is when instructed to by a change to their LUTs from the user, OS, application, and so on.

What that all means is the conclusion is pretty clear: There are many good reasons to buy team green or team red, but 2D image quality is not one of them. They are precisely the same.
 
You tested color accuracy. Good for you.

Did anyone make this an nvidia vs ATI thing? You seem to be invested in defending nvidia's honor here. Anyway, I made a thread earlier but it wasn't about nvidia vs. ATI. I've definitely seen differences in IQ, and what happens is some factory overvolted cards seem to "warm" the monitor image up quicker than other brands; I've seen this a lot with EVGA and MSI Lightning cards.

Color accuracy though. Good for you. This definitely doesn't tell the full story. It is interesting though how you interpreted it to be an AMD vs nvidia thing.
 
Excellent read! Finally a purely objective comparison to show the truth.
 
thank you for this write up, glad to know there isn't any difference!
 
You tested color accuracy. Good for you.

Did anyone make this an nvidia vs ATI thing? You seem to be invested in defending nvidia's honor here. Anyway, I made a thread earlier but it wasn't about nvidia vs. ATI. I've definitely seen differences in IQ, and what happens is some factory overvolted cards seem to "warm" the monitor image up quicker than other brands; I've seen this a lot with EVGA and MSI Lightning cards.

Color accuracy though. Good for you. This definitely doesn't tell the full story. It is interesting though how you interpreted it to be an AMD vs nvidia thing.

His analysis isn't based solely on your thread. Your thread sparked memories of a decade-old debate about color quality, even if that wasn't your intention. The OP did a good job of putting this debate to rest.

As for your thread, I have no idea how an overclock on a video card can give the same effect as a monitor warming up... maybe you should use the same methodology as the author and apply it to your situation.
 
You tested color accuracy. Good for you.

Did anyone make this an nvidia vs ATI thing? You seem to be invested in defending nvidia's honor here. Anyway, I made a thread earlier but it wasn't about nvidia vs. ATI. I've definitely seen differences in IQ, and what happens is some factory overvolted cards seem to "warm" the monitor image up quicker than other brands; I've seen this a lot with EVGA and MSI Lightning cards.

Color accuracy though. Good for you. This definitely doesn't tell the full story. It is interesting though how you interpreted it to be an AMD vs nvidia thing.

Your thread wasn't the cause, just the latest in a series of posts dating back, well, hell, at least to 1999 that I can remember this stuff happening. I've seen people say that there is a 2D image difference, mostly in terms of "richer" colors and better fonts on AMD (in basically all the cases I've seen, it was AMD that was claimed to have the better image).

Now this confused me, since I'd never seen it, and I've used more than a few of each; what's more, based on what I knew it shouldn't be possible. However, the people claiming there was a difference could be suffering from confirmation bias, and so could I. So I decided to put it to an actual objective test. These are the results.

If it angers you, sorry, but it is what it is. I wrote up my methodology and results, you can reproduce the test to attempt to falsify it, as is done in science. If there's a flaw in the methodology, you can point it out. Also if you feel something else needs testing, you can devise and carry out your own test.

This isn't a matter of defending anything. It is to analyze the hypothesis that AMD and nVidia have different color quality or font quality. Answer is no, according to my data.
 
I used a less scientific method (my eyes) to test 2D IQ between my old 5870 and new 680. And I agree, no difference.
 
It might be worth doing this with a few random images and video stills in a few points on the monitor, just to add to the results. All I see here is the letters 'b' and 'C' are rendered identically between the two GPUs. Though I personally doubt there will be any difference, two samples is hardly conclusive. I've personally seen no difference between various nV and AMD mobile GPUs, the GTX 275, HD 5870, GTX 570, and now GTX 680.
 
You tested color accuracy. Good for you.

Did anyone make this an nvidia vs ATI thing? You seem to be invested in defending nvidia's honor here. Anyway, I made a thread earlier but it wasn't about nvidia vs. ATI. I've definitely seen differences in IQ, and what happens is some factory overvolted cards seem to "warm" the monitor image up quicker than other brands; I've seen this a lot with EVGA and MSI Lightning cards.

Color accuracy though. Good for you. This definitely doesn't tell the full story. It is interesting though how you interpreted it to be an AMD vs nvidia thing.

You're showing your age. And your age is young. This has nothing to do with you; claims of Nvidia fudging their 2D desktop quality are a thing that started around the 6800 series, maybe even before then.
 
It might be worth doing this with a few random images and video stills in a few points on the monitor, just to add to the results. All I see here is the letters 'b' and 'C' are rendered identically between the two GPUs. Though I personally doubt there will be any difference, two samples is hardly conclusive. I've personally seen no difference between various nV and AMD mobile GPUs, the GTX 275, HD 5870, GTX 570, and now GTX 680.

He also used multiple color patches and tested their accuracy. See the charts?

Sycraft, good tests!
 
It might be worth doing this with a few random images and video stills in a few points on the monitor, just to add to the results. All I see here is the letters 'b' and 'C' are rendered identically between the two GPUs. Though I personally doubt there will be any difference, two samples is hardly conclusive. I've personally seen no difference between various nV and AMD mobile GPUs, the GTX 275, HD 5870, GTX 570, and now GTX 680.

There's a link to the full rez pictures (14MB total). You can see the phrase "The quick brown fox" shot on the two different cards. I chose a letter from each size (did it in two sizes). There is spherical distortion near the edges though, because this was a regular camera lens, and it was pressed on the screen.

The reason I used fonts is that fonts were the subject of the claims I've seen. For colours, like what you'd see in images, the patch test with the colorimeter is a better method because it is more sensitive than our eyes, and can also give an objective quantification.

As for video, you well could see a difference, because most video software uses the card's onboard processing to accelerate decoding, and different cards can have different ideas about how to do things. I could well expect a difference there.

Though if you think either of these are worth doing, be my guest :). Remember the scientific method isn't an exclusive club, anyone can play.
 
He also used multiple color patches and tested their accuracy. See the charts?

Sycraft, good tests!

True, I failed to notice those initially. But those might not reveal aliasing or other tricks that would be present IF image quality is sacrificed somewhere. All those charts do is prove that neither side outputs "off" colors.

There's a link to the full rez pictures (14MB total). You can see the phrase "The quick brown fox" shot on the two different cards. I chose a letter from each size (did it in two sizes). There is spherical distortion near the edges though, because this was a regular camera lens, and it was pressed on the screen.

Noted. That being said in the OP you only presented two data points that are not conclusive in and of themselves. Your post is more of an abstract, I suppose.

The reason I used fonts is that fonts were the subject of the claims I've seen. For colours, like what you'd see in images, the patch test with the colorimeter is a better method because it is more sensitive than our eyes, and can also give an objective quantification.

Also noted. You've got fonts and colors, which is certainly a lot. But 2D images or other GUI objects might be affected. It's unclear where IQ is alleged to differ between the two cards though, so it's not entirely clear what tests are needed. I'd definitely be interested in seeing data posted that claims IQ differs AND posts reproducible test cases.

As for video, you well could see a difference, because most video software uses the card's onboard processing to accelerate decoding, and different cards can have different ideas about how to do things. I could well expect a difference there.

This I did not think of.

Though if you think either of these are worth doing, be my guest :). Remember the scientific method isn't an exclusive club, anyone can play.

Worth doing, probably. But do I have time, not really. Since you've already volunteered, I volunteer you again. :p

While I'm fairly confident that IQ really does not differ between the two in 2D, I'm merely trying to provide useful feedback so keep that in mind. ;)
 
Noted. That being said in the OP you only presented two data points that are not conclusive in and of themselves. Your post is more of an abstract, I suppose.

Not really; every sub pixel in the character is a data point. Remember, the claim we are testing is whether the font rendering is any different. It can't be better or worse if it is equal. So that's what to look at: sub pixel for sub pixel, is it equal? The answer is yes. The same ones are on, off, dim, and dim by the same amount. It isn't just the two cards trying to do the same thing and coming up with an answer that is close, it is them outputting the same thing (which you'd expect since Windows renders the fonts, not the video card).
 
:rolleyes: I love "limits of human perception" part. Sorry, all people aren't the same. I'll keep throwing out there the tremendously fine example of *some* people having been able to see rainbows on color-wheel tvs, while the vast majority of people can't.

Putting in an Nvidia card is like smearing vaseline on the screen. I won't go back.
 
It isn't just the two cards trying to do the same thing and coming up with an answer that is close, it is them outputting the same thing (which you'd expect since Windows renders the fonts, not the video card).

I was under the impression from what I've read repeatedly about cleartype that the video card itself IS responsible for the fonts, including alpha and rgb blending, since DX9 and 10

http://en.wikipedia.org/wiki/ClearType#ClearType_in_WPF
 
:rolleyes: I love "limits of human perception" part. Sorry, all people aren't the same. I'll keep throwing out there the tremendously fine example of *some* people having been able to see rainbows on color-wheel tvs, while the vast majority of people can't.

Putting in an Nvidia card is like smearing vaseline on the screen. I won't go back.

Maybe you can provide data to back up your claim, as well as a useful test that objectively measures the "rainbowiness" of outputs that you claim to observe with nVidia cards? The original results may have a few flaws in their testing methodology, but they provide far more data to back up the results than one claim that vaseline was involved in the production of nVidia chips.
 
Maybe you can provide data to back up your claim, as well as a useful test that objectively measures the "rainbowiness" of outputs that you claim to observe with nVidia cards?

You should actually read my post again, because apparently you missed something.
 
You should actually read my post again, because apparently you missed something.

I read your post again. I don't see what he missed. I'd like to see the vaseline effect documented as well, since I'm not gifted enough to see it.
 
As for your thread, I have no idea how an overclock on a video card can give the same effect than a monitor warming up... maybe you should use the same methodology than the author and apply it to your situation.

It's called the placebo effect and people like Xoleras are especially susceptible.

Back in the days of analog VGA monitors and RAMDACs there truly was a difference, but (as mentioned by the OP) in today's purely digital world, it's all the same.


You're showing your age. And your age is young. This has nothing to do with you; claims of Nvidia fudging their 2D desktop quality are a thing that started around the 6800 series, maybe even before then.

This battle existed in the GeForce 1 days, hell, even in the TNT days.


I read your post again. I don't see what he missed. I'd like to see the vaseline effect documented as well, since I'm not gifted enough to see it.

There is no "vaseline" effect. Some people just refuse to understand the scientific method, and believe that proven facts are wrong.

About the limits of human perception: if you look at the differences in the values, they were WAY smaller than any of the limits he mentioned, and small enough to be within the error rate of the sensor, which is agreed to be better than human eyes (otherwise there would be no point in having said sensor...)
 
I read your post again. I don't see what he missed. I'd like to see the vaseline effect documented as well, since I'm not gifted enough to see it.

Well, that is the crux of it, isn't it? Some people see it, some don't. So you think those of us who do are just randomly making it up?
 
I guess I'm a little confused on how you did the color tests. You mentioned the monitor is calibrated - but calibrated to what? Or did you just mean that you took out the calibration and reset the monitor to default? I have the 24" NEC monitor which is calibrated to my 680, so I would expect near perfect results, but if I hook it up to my wife's 8800GT it won't be calibrated for that card and should show a difference. Once I calibrate it to that card, then it would be perfect there too. Maybe the difference is that I'm not using the SpectraView software (I use Basiccolor)?
 
Something else I probably should have noted about the measurements on the xyY scale is that you are seeing measurements accurate to at or below the quantization of 8-bit signals. A single bit change will generate a change of 0.001 x and y or more.

As an example I tested 250,190,90 (a pale orange, just a random triad I chose) vs 251,190,90. The difference was +0.001x -0.001y and 0.2Y.

So control to a level below 0.001 isn't something a card could achieve over an 8-bit DVI link. Even if the monitor were 10-bit, and not very many are, and even if the card would do 10-bit output (for some reason only the Quadros/FireGLs will), it couldn't do it over single-link DVI since that is 8 bits per channel.
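
To put a rough number on that quantization step, here is a small sketch that converts an 8-bit RGB triplet to xyY assuming plain sRGB primaries and a D65 white point. A calibrated wide-gamut monitor like the 2690WUXi won't land on these exact numbers, but it shows the order of magnitude of a one-bit change:

```python
# Rough sketch: how much does a one-bit change in an 8-bit RGB value move the
# xyY result? Assumes plain sRGB primaries and a D65 white point, which is not
# exactly what a calibrated wide-gamut monitor reproduces, but it shows the
# order of magnitude of a single quantization step.

def srgb_to_xyY(r8, g8, b8):
    def linearize(c8):
        # Undo the sRGB transfer curve for one 8-bit channel.
        c = c8 / 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = (linearize(c) for c in (r8, g8, b8))
    # sRGB-to-XYZ matrix (D65 white).
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    total = X + Y + Z
    return X / total, Y / total, Y  # chromaticity x, y and relative luminance Y

x1, y1, Y1 = srgb_to_xyY(250, 190, 90)
x2, y2, Y2 = srgb_to_xyY(251, 190, 90)
print(f"dx={x2 - x1:+.4f}  dy={y2 - y1:+.4f}  dY={Y2 - Y1:+.4f}")
# One LSB on a single channel moves the chromaticity by on the order of a
# thousandth, about the size of the differences in the measurements above.
```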

Also Q1DM6, I do find it a little strange that you'd claim to have colour perception above and beyond the human norm, yet elect to use a U2412M. It is the best budget IPS panel no doubt, but it is only 6-bit eIPS, with no internal calibration. If you truly are so colour sensitive, you would easily see a far greater difference moving to a better monitor. Something with a real 8 or 10-bit panel, internal 12 or 14-bit LUTs so you can precisely dial in accurate colour.

A stock U2412M is accurate to about 3.2 dE average, 6.3 dE max in TFTcentral's test. Dial in and calibrate something like a P241W and they got it to 0.3 dE average, 0.7 dE max. That's using a more expensive i1Pro spectrophotometer though; you probably won't get it that accurate with an i1Display Pro (though you'll get better tracking of dark values).
 
I guess I'm a little confused on how you did the color tests. You mentioned the monitor is calibrated - but calibrated to what? Or did you just mean that you took out the calibration and reset the monitor to default? I have the 24" NEC monitor which is calibrated to my 680, so I would expect near perfect results, but if I hook it up to my wife's 8800GT it won't be calibrated for that card and should show a difference. Once I calibrate it to that card, then it would be perfect there too. Maybe the difference is that I'm not using the SpectraView software (I use Basiccolor)?

Internally calibrated. The 90 series and PA series monitors have a set of internal lookup tables you can use to make sure their output is as near to correct as it can be. This means matching the colours accurately in general, and to the gamma, white point, and so on that you select. In my case: 120 nit brightness, 6500 K white point, L* gamma curve.

They calibrate on a per input basis, so the particular input (I used the same input for both systems) has the calibration set on it. The system then has no correction set on it, it uses a perfectly linear LUT, which I used Monitor Calibration Wizard to make sure no programs had messed with (for example games will change video card LUTs to do their gamma).

That my monitor was calibrated really wasn't all that relevant, I suppose; the question was whether there was a difference between the systems when they had everything set to default.
 
Also Q1DM6, I do find it a little strange that you'd claim to have colour perception above and beyond the human norm, yet elect to use a U2412M. It is the best budget IPS panel no doubt, but it is only 6-bit eIPS, with no internal calibration. If you truly are so colour sensitive, you would easily see a far greater difference moving to a better monitor. Something with a real 8 or 10-bit panel, internal 12 or 14-bit LUTs so you can precisely dial in accurate colour.

Really, is that what you're going to attack, what monitor I can afford? We should return to this instead:
http://hardforum.com/showpost.php?p=1038753188&postcount=15

Not to mention the fact that my whole example was based on the fact that certain people could view the exact same DLP tv as others and see rainbows. The only difference being the observer, not the source.
There are people with perfect pitch. I'm not one of them, but I don't deny they exist. You probably would though ;)
 
I guess I'm a little confused on how you did the color tests. You mentioned the monitor is calibrated - but calibrated to what? Or did you just mean that you took out the calibration and reset the monitor to default? I have the 24" NEC monitor which is calibrated to my 680, so I would expect near perfect results, but if I hook it up to my wife's 8800GT it won't be calibrated for that card and should show a difference. Once I calibrate it to that card, then it would be perfect there too. Maybe the difference is that I'm not using the SpectraView software (I use Basiccolor)?

My monitor was calibrated to my old 5870, and when I put my 670 in, loaded the profile, and then did a calibration test, the results were no different, meaning both cards are rendering colours exactly the same.
 
Really, is that what you're going to attack, what monitor I can afford? We should return to this instead:
http://hardforum.com/showpost.php?p=1038753188&postcount=15

Not to mention the fact that my whole example was based on the fact that certain people could view the exact same DLP tv as others and see rainbows. The only difference being the observer, not the source.
There are people with perfect pitch. I'm not one of them, but I don't deny they exist. You probably would though ;)

Since when is it an attack? I am simply confused. You take issue with my statements on the limits of human perception. I think my statements are pretty well informed, I've done more than a bit of research on the subject, but I'll grant the possibility that you have above normal perception. My confusion, then, is why, if you have such good perception and apparently care about it a great deal, you would not look at the thing most likely to be problematic in your signal chain: your monitor.

Basically you are taking issue with my statement that dE values of less than one aren't perceptible, yet you have a monitor that lacks that kind of accuracy. So how would you evaluate such a thing on your system? That is my confusion.

If colour isn't your issue then perhaps you should clarify what is. My statement on perception was related to that: to explain to people what dE results are and how to interpret them, since they aren't something most people have heard of unless they are into colour theory. It was also for people who aren't familiar with experimental results and would think everything should be flat zeros, not realizing you don't get perfect measurements in the real world.

I'm not out to attack anyone, just to test my observation that there is no difference between nVidia and AMD cards in this regard. My test indicates I am correct, which is what I expected, but I wanted to test it anyhow. Though I own both AMD and nVidia systems (and will continue to, my new laptop is incoming with a 7970M in it), I'd never seen a difference I could notice. Of course I was willing to entertain the idea that I simply couldn't, so I enlisted a more objective test device for colour, and used a camera to enhance detail for fonts.

If you feel attacked because someone disagrees with you, well that is an issue with you, not with me.
 
I've never tried to pin down exactly what the difference is from a technical perspective, as I lack that ability, but here's the rub: people like you seem to insist that I and others like me who notice an absolute difference immediately are somehow either lying, mistaken, or, as another poster noted, apparently falling for a "placebo effect".

I don't have any agenda, I don't owe any allegiance to a video card company, I simply noticed a difference in quality that jumped out at me that no amount of cleartype tuning could fix.

I have a 430gt in the closet. I had a 560ti I gave to my brother. I had retried both of these after moving to amd to be sure I wasn't imagining things. All other things being equal, I see a marked difference. I wish I could pull the technical reason right out of my ass and give it to you, but I can't.

I do know one thing, I'm not the only person who does notice it.
 
Placebo maybe, incorrect configuration maybe. I lean toward placebo/expectations influencing perception if you can't pin down precisely what looks different. Not why, just what. When I see or hear or whatever a difference between things it is generally easy for me to say what I perceive that difference to be.

But incorrect configurations can cause real, visible, problems which would always be the first thing I'd look for if I did see a difference where there shouldn't be one.

Like when I got my LCD TV, my laptop didn't look right on it. Edges were cut off. So I went into the AMD control panel and found out that because it was HDMI, there was an overscan control. Well, I played with that to try and get it right, but I couldn't get it dead on; it always seemed to be a bit blurred and cut off a bit, or a bit blurred and not filling the screen. More research turned up that I had to change the mode on my screen: "16:9" wasn't the right mode despite that being the TV's ratio, it was "Screen Fit". To me that sounds like "stretch the image to fit the screen," but to Samsung it means "disable overscanning."

That set right, I then had to turn off overscan control in the AMD driver, and then I had the clear, full, uncut image I expected. That wasn't the end of things I had to check, though: HDMI can do chroma subsampling, so I needed to make sure that was off and that I was on a port on the TV that did full 4:4:4 (some only do 4:2:2 for some reason) and so on.

The initial problem wasn't because AMD does a bad job, or Samsung does a bad job, it was because HDMI introduces settings I hadn't messed with before, and they were incorrect.

When I see something different, where there ought not be one, I want to know why. I try and find out the reason. I didn't wish to jump to the conclusion that Samsung wasn't good for laptops or that AMD couldn't handle TVs. I wanted to find out what was going on before making a decision. As I said, turned out I had messed up settings.

In this case it was the other way around: People were saying there was something obvious I couldn't see, so I wanted to see if it was just me, if I could measure a difference I couldn't see, or if I could see a difference by greatly enhancing my vision (by looking directly at the sub pixels up close). The answer is no, I can't find a difference, so I have more confidence in my perception.

Thus I'll keep buying whichever kind of card suits my fancy, without worry. Had I found a difference, I would have set off to find out why there was one, if it truly was related to the cards, or if I'd messed something up along the line.

I figured I'd share since others might be interested. I also suspected some might take it as an attack, but as I said that's really not my issue as it was not my goal.
 
Internally calibrated. The 90 series and PA series monitors have a set of internal lookup tables you can use to make sure their output is as near to correct as it should be. This means matching the colours accurate in general, and to the gamma, white point, and so on you select. In my case 120nit brightness, 6500k whitepoint, L* gamma curve.

They calibrate on a per input basis, so the particular input (I used the same input for both systems) has the calibration set on it. The system then has no correction set on it, it uses a perfectly linear LUT, which I used Monitor Calibration Wizard to make sure no programs had messed with (for example games will change video card LUTs to do their gamma).

That my monitor was calibrated really wasn't all that relevant I suppose, the question was if there was a difference between the systems, when they had everything set to default.

Gotcha. I don't use the internal LUT on mine because I had some issues with the software, so that's why it was unfamiliar to me.
 
I lean toward placebo/expectations influencing perception if you can't pin down precisely what looks different.

Oh, I've stated many times precisely what looks different. The sheer clarity of essentially everything on screen, and especially text is my primary concern. I noted the 'vaseline screen' look of Nvidia and I wasn't exaggerating. People will of course endlessly tell me that it's something in cleartype I've overlooked, or it's just placebo and I'm fooling myself, but there are other people just like me who notice it immediately.

There are plenty of people on this very forum who will sit and tell you that LCD televisions are perfectly acceptable for watching TV, when they are in fact utter crap. I can't help that they are less discerning, but the fact remains they are. A lot of people never even notice things like reduced motion resolution, horrible black/gray levels, flashlighting, etc. These kinds of things simply reinforce to me that a great number of people lack something in the way they process the world visually.
 
That's why we use instrumentation to measure in an objective way what any screen is putting out. Who cares what one person or another *thinks* they see? That's not reliable. Welcome to science 101.
 
Oh, I've stated many times precisely what looks different. The sheer clarity of essentially everything on screen, and especially text is my primary concern.

Sycraft has pictures of text that looks exactly the same between Nvidia and AMD. You claim to see a vaseline effect on text when using Nvidia cards, but have no pictures. He has data; you have opinion. You can't win this without data. Your argument is worthless without pictures.
 
Gotcha. I don't use the internal LUT on mine because I had some issues with the software, so that's why it was unfamiliar to me.

You might think about getting SpectraView. In my experience it is dead simple: set up the profile you like (they have a number of presets if you can't decide), put the puck on the screen, and hit go. 5-10 minutes later, you have a calibrated display. It handles everything.

Oh, I've stated many times precisely what looks different. The sheer clarity of essentially everything on screen, and especially text is my primary concern. I noted the 'vaseline screen' look of Nvidia and I wasn't exaggerating. People will of course endlessly tell me that it's something in cleartype I've overlooked, or it's just placebo and I'm fooling myself, but there are other people just like me who notice it immediately.

Ok well that says there is probably a setup issue. Maybe you ought to investigate it.

First off, I'd make sure you aren't doing anything different. Don't use HDMI on one card and DVI on the other. In theory, no difference; in practice, well, maybe there is. Also make sure you have a fully digital connection: use a DVI-D cable (there are three kinds, DVI-I, DVI-A, and DVI-D; D is what you want) from the card to the monitor, or use DP (if they both support it). Check all settings, make sure you don't have the card fiddling with anything, that you aren't applying FSAA globally or something (nVidia cards at least can and will apply it in things like Firefox, with odd results).

If all that doesn't do it, grab a camera and get an up-close look at something where you notice the difference. Find out what is actually happening differently. Also see what kinds of things it occurs on: just text, or would fine black lines in an image editor give the same effect? Look at the macro photos, see what the actual difference on the monitor panel is.

Then maybe do some research, call support, see if they can suggest why that might be the case. There may be an answer out there.

It just seems rather silly and not very [H]ard to assume there's a problem and write off a company without a little sniffing around, particularly when others don't seem to have the problem.

I'm not knocking AMD or trying to prop up nVidia, I'm saying I think it is bad if you make your choices based on bad information. If you choose a card because it performs better for what you do, or costs less, or uses less power, or has better features, or is in stock when you want it, or whatever, those are great reasons. However, if you choose one because of a fixable problem, that isn't a good reason.
 
Oh, I've stated many times precisely what looks different. The sheer clarity of essentially everything on screen, and especially text is my primary concern. I noted the 'vaseline screen' look of Nvidia and I wasn't exaggerating. People will of course endlessly tell me that it's something in cleartype I've overlooked, or it's just placebo and I'm fooling myself, but there are other people just like me who notice it immediately.

If you can see it you can measure it and show others what you are seeing. That's all there is to it. You can't fault anyone for not believing you without any data.
 
To the OP,

I appreciate your post and your work. I've used nVidia and AMD cards a good deal myself. I've used reference models and custom versions as well. I've not noticed any differences.

Currently, I've a GTX 580 in one computer that is connected to a calibrated p-IPS display and an HD 7950 in another that is connected to a calibrated p-IPS display. Both are Asus DirectCU II cards.

What has always annoyed me is vertical gamma shift on TN panels. I can't tolerate them. They look broken to me. I've never been annoyed by the output of an AMD or nVidia card to an LCD.

Cards I've owned-

Complete list for the AMD (ATi) side;

Rage 128 Pro
9700 Pro
9800 Pro
X800 Pro (returned it)
X800 XL
AIW X1800 XL
X1900 XT Crossfire
X1950 Pro
HD 3850
HD 4850 1Gb
HD 4870 X2 (returned that)
HD 4890 (tried Crossfire with it and returned the second card)
HD 5870 Crossfire (Sold off to bitcoin guys)
HD 7950 (Run it at up to 1150MHz)

On the nVidia side I've owned;

GeForce 3 Ti 500
GeForce 4 Ti 4600
FX 5700 Ultra
6800 GT
7800 GT SLi
7950 GX2 (sold that)
8800 GTX (one of the most impressive cards I've owned and for the longest stretch)
8800 GT
Mixed GTX 260 Core 216 Tri-SLi (I put two Core 216 cards with the GTX 260 192SP card I bought at release)
GTX 470 SLi
GTX 580 (Run it at up to 980MHz)

If you're wondering why I sometimes have cards close in power around the same time, it's because I've owned up to three computers. I currently have two.
 
When I upgraded from a GeForce 4 MX 440 to a Radeon 9600XT, there was a definite change in desktop color vibrancy. This was nearly a decade ago. Since then I have swapped brands several times and while I have looked for the change, it wasn't there.

And going from those first two cards, I SHOULD notice a change :) I also think those were in VGA hookups too.
 
All I will add to this silly thread is that I've always seen a difference in my desktop image quality when going from Nvidia to AMD or vice versa with my preference being AMD. Don't know what is different technically, but I've seen it EVERY time I've made the switch between the two. AMD's fonts do seem to be a bit sharper and have better contrast, colors do seem to be more accurate as well. Since this is a 2D image quality thread, this only relates to 2D image quality. Nvidia is doing ok on the 3D side.
 
Sycraft,

I came across as a jerk earlier in this thread and I apologize for that. Anyway, the differences I have seen weren't really related to color accuracy, more like minute focus details: some cards tend to deliver a warmer image to the screen somehow, and as you know a warmer image appears to be less sharp. I don't know how to explain it, but it's there. I have more than one PC, so it is quite easy to switch the same screen back and forth to see the differences. This also isn't specific to ATI vs nvidia; I wasn't aware that was a "thing". I've noticed it on certain brands of cards.

Anyway, I'm not out to prove anything, so if people think they're the same, that's cool with me. It's just something I've noticed on my end, which is a small annoyance since I generally run high resolution IPS/PLS screens. I'm not out to prove anything to anyone, but it'll be in the back of my mind when I make future purchases. Like I said, EVGA Superclocked cards tend to have this issue; I also saw it with MSI Lightning 580s. I'm using a Galaxy 4GB GTX 680 OC card now that is absolutely perfect, and the Sapphire 7970s in one of my rigs are also perfect. So on my end it is not an nvidia vs ATI thing, just certain brands.
 
If you can see it you can measure it and show others what you are seeing. That's all there is to it. You can't fault anyone for not believing you without any data.

How exactly can I, a person who doesn't have professional equipment or knowledge, measure or distinguish it? Can you perform a litany of scientific tests on random subject matter unrelated to your own life? The ignorance displayed in this thread is amazing. The OP didn't even know ClearType is hardware accelerated; who the hell knows how good his data is, how his tests might be flawed, or what exactly he might be missing.
 
Ok well that says there is probably a setup issue. Maybe you ought to investigate it.

No, let's not go through setup 101, don't let your arrogance overwhelm you. You think I sat here and didn't ask myself these questions? Didn't use the same cables? Didn't switch to others? I have DVI, HDMI, and DP in combination with these cards, and none of them make a difference. For the record, I'm now on mini DP to DP on a 7770, which looks exactly the same as DP to DP on the 6670 from another manufacturer.

Yeah, I wish I had the resources of Intel to be able to run a whole slew of tests for the incredibly bitter crew here. But I don't. Until then, please sit and enjoy your substandard IQ thinking it's wonderful.
 