A Real Test of nVidia vs AMD 2D Image Quality

Oh, I've stated many times precisely what looks different. The sheer clarity of essentially everything on screen, and especially text, is my primary concern. I noted the 'vaseline screen' look of Nvidia and I wasn't exaggerating. People will of course endlessly tell me that it's something in ClearType I've overlooked, or that it's just placebo and I'm fooling myself, but there are other people just like me who notice it immediately.

There are plenty of people on this very forum who will sit and tell you that LCD televisions are perfectly acceptable for watching TV, when they are in fact utter crap. I can't help that they are less discerning, but the fact remains that they are. A lot of people never even notice things like reduced motion resolution, horrible black/gray levels, flashlighting, etc. These kinds of things simply reinforce to me that a great number of people lack something in the way they process the world visually.

I think what you might be seeing is the shittiness of VGA. I recommend going DVI for sure (and use DVI-D, which doesn't carry the analog signal path that the card/LCD might otherwise default to).

We use HP PCs at work, with built-in AMD graphics and HP LCDs. Most of the building's 800 or so users run dual displays. The HP PCs come with one DVI port and one VGA port (more recently, the DVI was replaced with DisplayPort on last year's desktop model), so we use both for the LCDs on each user's desk. The VGA-connected LCDs have weird issues; auto-adjust helps a bit, but it's still there: in focus on most of the display, but with vertical sections that look just a tad out of focus. Annoys the shit out of me. The DVI-connected side looks perfect.

You are definitely right that there can be differences between cards, maybe subtle to some people. But in my experience it only shows up on analog (VGA) connected display devices.

As for the original poster's mention of the age-old argument: I think it began as fact in the early '90s, when differences in DACs, etc. between video cards (very simple non-3D devices back then) were quite real. I remember some ViewSonic VGA cards, ISA slot, with 1 MB of RAM, that cost $200 or more, priced that way mainly for the quality of the output. This was back in the day when Monster Cable actually could make a difference.

Today you can eliminate all those issues as the OP did: use the digital connections.

OP, you might try your original tests again, this time purposely using the VGA ports.

The problem with analog connections affects both brands, as I can see even today on the PCs I support at work. It's possible there are differences in this between brands of cards. In this day and age, it seems more likely that any such difference would come down to the board manufacturer (ASUS, Gigabyte, etc.) alongside or instead of the GPU maker. Some more testing might reveal it!
 
I think what you might be seeing is the shittiness of VGA. I recommend going DVI for sure.

:rolleyes:
I'm far from an idiot. I can't remember the last time I had anything connected via VGA, it's been so many years.

Sycraft:
Regarding "just taking a screenshot and comparing" to my knowledge you need specialized hardware/software to capture directly out of the buffer, you can't just print screen and paste away. I'd have to look for the technical reasons why. It was something I was aware of as I've seen it mentioned in the video game journalism world before. Someone in another one of these IQ threads posted a technical explanation, I'll see if I can find it
 
You are comparing a static image to a static image, when in reality you are always seeing a constantly changing image. Perhaps some of the images during the 60 Hz refresh cycle are slightly different, causing undesired effects.
Actually, in the case of an LCD, you're generally not seeing a changing image, at least in normal desktop usage. Redraws only occur in areas where they're required at the time. Additionally, the ability to present one frame correctly generally indicates that the device will be able to present subsequent frames correctly, and that it will have presented preceding frames correctly. Any anomalies would be errors, and should be fairly apparent.

That's not to say that it can be ruled out as an impossibility, but I think the likelihood is quite low.

Would different vendors render ClearType differently when there are different colors behind the text?
To my knowledge GPU vendors have no means to tune ClearType. ClearType rendering is a GPU-accelerated process, but the actual implementation is Microsoft's, sitting atop Microsoft's existing hardware drawing API stacks. In order for vendors to tune ClearType, they'd need to perform hacks in the driver or have some sort of driver fault that exposes conformance problems with the Windows APIs, and DirectX conformance is something GPU vendors take pretty seriously these days (since it results in fewer problems for end users).
 
Regarding "just taking a screenshot and comparing" to my knowledge you need specialized hardware/software to capture directly out of the buffer, you can't just print screen and paste away.
Where did you ascertain such knowledge? It is not correct.

A Direct3D frame buffer is simply a raw 'stretch' of color values associated with onscreen pixels in a particular format, usually D3DFMT_X8R8G8B8. Grabbing its contents with an API call is trivial.
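A minimal sketch of the sort of call I mean, assuming a Direct3D 9 device that was created at the desktop resolution and the legacy D3DX helper library for the file save (error handling omitted, so treat it as illustration only):

    // Copy the D3D9 front buffer into a system-memory surface and save it.
    // Assumes 'device' is an existing IDirect3DDevice9* and width/height
    // match the current desktop mode. Link against d3d9.lib and d3dx9.lib.
    #include <d3d9.h>
    #include <d3dx9.h>

    void CaptureFrontBuffer(IDirect3DDevice9* device, UINT width, UINT height)
    {
        IDirect3DSurface9* surface = nullptr;

        // GetFrontBufferData requires an A8R8G8B8 surface in system memory.
        device->CreateOffscreenPlainSurface(width, height, D3DFMT_A8R8G8B8,
                                            D3DPOOL_SYSTEMMEM, &surface, nullptr);

        // Copies the final front buffer contents for swap chain 0.
        device->GetFrontBufferData(0, surface);

        // D3DX helper writes the pixel data straight to a BMP on disk.
        D3DXSaveSurfaceToFileA("capture.bmp", D3DXIFF_BMP, surface, nullptr, nullptr);

        surface->Release();
    }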
 
:rolleyes:
I'm far from an idiot. I can't remember the last time I had anything connected via VGA, it's been so many years.

Sycraft:
Regarding "just taking a screenshot and comparing" to my knowledge you need specialized hardware/software to capture directly out of the buffer, you can't just print screen and paste away. I'd have to look for the technical reasons why. It was something I was aware of as I've seen it mentioned in the video game journalism world before. Someone in another one of these IQ threads posted a technical explanation, I'll see if I can find it

Yup, you can find an image of the specialized hardware required to do so right here
 
Sure, it has to do with the pixel alignment, and because of that ClearType doesn't work in portrait. The thing is, most of these people will run portrait and not even notice.

Not that I ever thought this issue was about color accuracy or Nvidia vs. ATI, but in your case overzealous people are out in droves to pounce on you. Remember, Nvidia is perfect, and if you say anything contrary to that, you're going to be reamed. This is a no-win situation: you will be called an idiot and crazy for saying something they perceive as a threat to their honor... the term "nvidiots" was coined back in the TNT2 days for a reason. I love Nvidia hardware to death, but nvidiots are, and always have been, annoying as hell. Let it go.

Besides which, who the hell cares? If you see a difference, purchase accordingly. I don't see why you care what anyone else thinks; I SURE don't. I will not buy EVGA SC cards anymore because their 2D IQ was out of focus, but I am VERY pleased with my Galaxy 680 4 GB card. It displays none of the problems that the EVGA SCs did. I will buy another one (4 GB Galaxy 680) when one shows up in stock, unless I can find a 690 in stock anywhere.
 
I've never tried to pin down exactly what the difference is from a technical perspective, as I lack that ability, but here's the rub: people like you seem to insist that I and others like me, who notice an absolute difference immediately, are somehow either lying, mistaken, or, as another poster noted, apparently falling for a "placebo effect".

I don't have any agenda, and I owe no allegiance to any video card company. I simply noticed a difference in quality that jumped out at me, one that no amount of ClearType tuning could fix.

I have a GT 430 in the closet. I had a 560 Ti that I gave to my brother. I retried both of these after moving to AMD to be sure I wasn't imagining things. All other things being equal, I see a marked difference. I wish I could pull the technical reason right out of my ass and give it to you, but I can't.

I do know one thing: I'm not the only person who notices it.

Anyone using an argument that begins with "people like you" is not interested in a debate.

Leave the ad hominems out and try again.
 
Anyone using an argument that begins with "people like you" is not interested in a debate.

Leave the ad hominems out and try again.

Really? Because you can only be told you have some form of mental illness from the armchair psychiatrists so many times before you'll voice similar sentiment.
 
As I said before, there's no correlation between the placebo effect and mental illness. At least none that I'm aware of.
 
Really? Because you can only be told you have some form of mental illness from the armchair psychiatrists so many times before you'll voice similar sentiment.

Mental illness? No, having issues with perception isn't an illness; it is the human condition. That is how and why optical illusions work: your brain misperceives reality. The checker shadow illusion is one of my favourites that is easy to see on a computer (and verify that it is an illusion using an image editor). Confirmation bias is another part of that and is, again, part of being human. That's why there has been so much research on it. If it were something only a few people had, it wouldn't be so interesting; it is the fact that EVERYONE suffers from their beliefs colouring their perceptions that makes it such an area of study.

None of this has been a personal attack on you. If you've taken it as such, that is your issue.
 
What that all means is the conclusion is pretty clear: There are many good reasons to buy team green or team red, but 2D image quality is not one of them. They are precisely the same.

Excellent thread, and exactly what I have been saying in several threads.

Placebo is a powerful thing.
 
To everyone dismissing the OP's results I have only one thing to say:

[image: fyOQx.jpg]

This whole community should embrace the concept of objective and quantitative analysis.

You don't need tons of fancy equipment to collect useful data. Screenshots and camera pics are easy to create and can be quite useful. If you want to get more serious you can see if there are any ISF certified techs in your area (check the AVS forums) and hire one to come do some measurements. The last time I had one come around it was $100 and his equipment was fairly high end.
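If you want to compare two such screenshots yourself, even a toy program will do; here's a minimal sketch, assuming both captures were exported as raw 24-bit RGB dumps of identical dimensions (the file names are just placeholders):

    // Count differing pixels between two raw 24-bit RGB screenshot dumps.
    // Any lossless export of the same desktop works; names are placeholders.
    #include <cstdint>
    #include <fstream>
    #include <iostream>
    #include <iterator>
    #include <vector>

    static std::vector<uint8_t> ReadAll(const char* path)
    {
        std::ifstream f(path, std::ios::binary);
        return std::vector<uint8_t>(std::istreambuf_iterator<char>(f),
                                    std::istreambuf_iterator<char>());
    }

    int main()
    {
        std::vector<uint8_t> a = ReadAll("card_a.raw");
        std::vector<uint8_t> b = ReadAll("card_b.raw");
        if (a.empty() || a.size() != b.size()) {
            std::cerr << "Dumps are missing or differ in size.\n";
            return 1;
        }

        std::size_t diffPixels = 0;
        for (std::size_t i = 0; i + 2 < a.size(); i += 3) {   // 3 bytes per pixel
            if (a[i] != b[i] || a[i + 1] != b[i + 1] || a[i + 2] != b[i + 2])
                ++diffPixels;
        }

        std::cout << diffPixels << " of " << a.size() / 3 << " pixels differ.\n";
        return 0;
    }

If the two cards really send identical data, the count should be zero for any static desktop content.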
 
Last edited:
I would love to see some of these posters claiming to see a difference submit to a blind visual test of identical systems, save the video card, and see how accurate they are in trying to identify the vaseline. My prediction: they will be correct about 50% of the time. An earlier poster compared this to audiophiles and videophiles, and he was exactly right. People don't understand the power of bias and preconception and how it can change your perception (which to you seems real enough). That's why there are blinded tests in the first place; this phenomenon is well known!
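For anyone who wants to actually run that test, scoring it is simple; here's a rough sketch of the binomial math, assuming pure guessing at 50% (the trial counts are made up for illustration, not from any real test):

    // Probability of getting at least k of n blind trials right by pure
    // guessing (p = 0.5). Illustrative numbers only.
    #include <cmath>
    #include <iostream>

    double PAtLeastKByChance(int n, int k)
    {
        double total = 0.0;
        for (int i = k; i <= n; ++i) {
            // C(n, i) via lgamma to avoid overflow for larger n.
            double logC = std::lgamma(n + 1.0) - std::lgamma(i + 1.0)
                          - std::lgamma(n - i + 1.0);
            total += std::exp(logC + n * std::log(0.5));
        }
        return total;
    }

    int main()
    {
        // e.g. 14 correct out of 16 trials would be hard to explain by luck.
        std::cout << "P(>=14 of 16 by guessing) = "
                  << PAtLeastKByChance(16, 14) << "\n";   // roughly 0.002
        return 0;
    }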
 
As far as the "overclocking affects my image quality argument," get Afterburner running and change the core clock. Without a 3-D application running you can varry it quite a bit. The card won't throttle down imidiately because it doesn't detect hard 3-D workload, so you'll have time to see and differences as a function of clock speed. Hell, do it with VRAM speed to.
 
I'm not out to win you over or convert you; hell, if you don't see what I see, I PITY you. Never mistake your place.

Just a reminder to those of us that lack his super-human perception... Vaseline is cheap and can easily be applied to any monitor. Don't fret my friends.
 
Okay, so if you turn everything off, they are the same. But that is not how most people end up seeing it. They see it according to the default install of the drivers. I would love to see this article done with the default install of each set of drivers. Bet there are differences, which is why people claim what they do.
 
Someone should do something similar for DX/GL.

Although given the state of both AMD's and nVidia's GL implementations, I'd be surprised if you got the same result even on two cards from the same manufacturer and family.
 
Forgive me if this has been asked already, but doesn't your choice of display monitor invalidate your results based on this information?

For these tests I’ll be using my NEC MultiSync 2690WUXi monitor. It is a professional IPS monitor, designed with color accuracy in mind. It actually has internal lookup tables, so it can have correction applied to it internally, independent of the device feeding it the signal.

Am I reading that wrong, or would your display apply its lookup-table correction regardless of the device feeding it? Meaning, wouldn't you need to use a monitor that doesn't do this internal correction to accurately measure the card's actual display output?
 
This thread is an excellent example of how people misinterpret 'data' and draw conclusions beyond what the evidence supports. The first experiment shows that color reproduction in the two experimental systems cannot be distinguished. The second experiment shows that, when photographed, fonts appear to be indistinguishable in the two systems.

The conclusion drawn was that the two systems produce identical 2D images; this is a reach. Another poster mentions that he/she can see a visible difference between the two systems, and everyone in here goes batshit saying that the data proves him wrong. Not only that, the armchair psychologists in here determine that the placebo effect is the cause of the described phenomenon.

There is a HUGE logical gap in what people are doing here, and you hear this all the time when people talk about science. The whole 'vaseline screen' phenomenon (if it exists) is not understood. Whatever causes the difference could be something the two experiments don't even consider, or something that is not detected by the OP's methods.

The data shows what the data shows, nothing more. The OP's data is probably 100% correct, but it could also be true that some people can see a difference between the two systems. If you had a blind test where you could evaluate someone like Q1DM6's ability to predict which system is which, then you could dismiss or confirm its existence, but these data simply don't address that particular question.

If I surveyed people and found that 5% of them say they can tell the difference between an AMD and an Nvidia display, that would be data too! And it would contradict the conclusion argued by many in this thread, though not the OP's data itself.

As I mentioned before, what I find so disturbing about this is how people blindly misapply data and twist it to prove that their argument is correct. People are quick to take these data and turn them into evidence that Nvidia cards never had the problems that many AMD supporters claim they have. The data supports this claim, but it falls well short of proving it. There is no evidence for or against the placebo effect in this situation; it is simply being brought up to pad an argument and dismiss Q1DM6's opinion by (mis)using science. Just because a psychological phenomenon could explain something doesn't mean that it does, and it certainly doesn't make your argument more 'scientific'.

I don't mean to dismiss the OP's experiments; what was done seems really useful and well executed. I am not saying that Q1DM6 is right or wrong. But in the responses to his post I see patterns similar to the way people talk about 'science' these days: 'science' is used as a club by people who want to support one side of an argument while dismissing the other side without proper evaluation. It makes me sad. :(
 
Well, considering I have asked people very close to AMD about this in the past, and their reply was that the only difference between the two vendors is a slight difference in colour saturation and the default negative LOD bias for 3D, I tend to believe Q1DM6's "golden eyes" are in fact nothing more than placebo.

The colour results may not be accurate in this testing, as Brent pointed out, but the fonts seem to match pixel-perfect, indicating that clarity is also the same. I was always under the impression AMD used slightly more vibrant colour saturation by default, which might cause someone to think the image has more clarity because it "pops" a little more, but it was nothing that couldn't be achieved on NV hardware.
 
Forgive me if this has been asked already, but doesn't your choice of display monitor invalidate your results based on this information?

Am I reading that wrong, or would your display apply its lookup-table correction regardless of the device feeding it? Meaning, wouldn't you need to use a monitor that doesn't do this internal correction to accurately measure the card's actual display output?

No, all the lookup tables do is translate a given input to a given output. They are three tables of numbers, with 8-bit inputs and 12-bit outputs. So it says "when you get a value of X for red, drive the subpixel with a value of Y."

Most monitors have them; it is how you adjust colour, contrast, and so on. The NECs are just able to be calibrated with their own tools. Also, the NECs are higher bit depth, so they don't cause banding or colour loss.

However, they are consistent in their operation on a given input: they operate on it the same way no matter what is feeding the signal, unless you change them.
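To make that concrete, here is a tiny illustrative sketch of one such table, one 8-bit-in / 12-bit-out channel; the gamma curve used to fill it is only an example, not the NEC's actual calibration data:

    // One per-channel monitor LUT: 8-bit input index, 12-bit output value.
    // The gamma fill is an example; a calibrated monitor loads measured values.
    #include <array>
    #include <cmath>
    #include <cstdint>
    #include <iostream>

    int main()
    {
        std::array<uint16_t, 256> redLUT{};
        const double gamma = 2.2;   // example curve only

        for (int in = 0; in < 256; ++in) {
            double normalized = in / 255.0;
            redLUT[in] = static_cast<uint16_t>(
                std::round(std::pow(normalized, gamma) * 4095.0));   // 12-bit
        }

        // "When you get a value of X for red, drive the subpixel with Y":
        int x = 128;
        std::cout << "input " << x << " -> output " << redLUT[x] << " / 4095\n";
        return 0;
    }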

The question isn't what the monitor is doing; the question is whether, given the same input in software from the two different cards, that input is passed along to the monitor unchanged. The best way to test that would be to record the DVI signal, but I don't have anything that can do DVI or HDMI in at 4:4:4; the best I have does 4:2:2, which means it subsamples colour, making it worthless for this test.

Next best thing I could do was to measure the actual colour output. If the same input on both cards generates the same output on the display, then it is quite a reasonable assumption that it was the same signal being sent.

If anyone wants to buy me a Blackmagic DeckLink HD Extreme or an AJA Io XT, plus a stupidly fast SSD, I'll be happy to use them to do a direct signal capture :).
 
Someone should do something similar for DX/GL.

Although given the state of both AMD's and nVidia's GL implementations, I'd be surprised if you got the same result even on two cards from the same manufacturer and family.

You mean test 3D images on the cards? The thing is, there you may expect some differences because of how the cards choose to process the information. A simple example is anisotropic filtering. These days both brands are quite good, but not that long ago it was an area of contention and ongoing improvement. New cards would come out with better methods for generating more uniform aniso filtering. There were real, visible in-game differences, and there are utilities to test it.

Also, it would be a matter of which specific chip you chose. The 680 might well look better than a 4870, but not better than a 7970, for example.
 
Look at the letters with the red, green, and blue backgrounds. Shake your head sideways as fast as you can while looking at the screen. Those colors turn white!
 
Since ClearType and the orientation of the monitor has come up in this thread, it's worth mentioning that it looks like Windows 8 will drop ClearType entirely, for two reasons. First, tablets may be oriented in any direction. Second, higher-resolution displays may end up making sub-pixel rendering unnecessary; when you quadruple your resolution, you are showing better text without ClearType than the original resolution does with ClearType.

http://www.istartedsomething.com/20120303/cleartype-takes-a-back-seat-for-windows-8-metro/
 
Since ClearType and the orientation of the monitor has come up in this thread, it's worth mentioning that it looks like Windows 8 will drop ClearType entirely, for two reasons. First, tablets may be oriented in any direction. Second, higher-resolution displays may end up making sub-pixel rendering unnecessary; when you quadruple your resolution, you are showing better text without ClearType than the original resolution does with ClearType.

http://www.istartedsomething.com/20120303/cleartype-takes-a-back-seat-for-windows-8-metro/

High-rez displays are ultimately the right answer. If we start getting 300 ppi displays on our desks, they'll be far beyond our eyes' ability to resolve detail unless you squash your nose against the display (and maybe even then). The problem is cost: transistors cost money, and every subpixel needs one or two (depending on the kind of display).

It is still going to be some time before we start to see really high-rez displays on the desktop. Not only is there the cost issue, there's also the interconnect issue. Only with DP 1.2 and HDMI 1.4 do we now have interconnects that can handle some really high-rez displays, and even they have lowish limits when you're talking really high-rez computer monitors: DP 1.2 can do 3840x2160 at 30-bit @ 60 Hz, but that's it. If you want 120 Hz, or an even higher rez, you need a better interconnect.

Now, that rez sounds high, and it is high compared to what we have now (double 1080p in both directions), but on a 24" monitor you are still only talking about 188 ppi. Bigger monitors are lower, of course. To push near 300 ppi you'd have to go to 5760x3240.
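For reference, a quick back-of-the-envelope PPI calculation, assuming flat 16:9 panels; small differences from the figures above come down to rounding and the exact panel diagonal:

    // Back-of-the-envelope PPI for the resolutions discussed above.
    #include <cmath>
    #include <iostream>

    double Ppi(double w, double h, double diagonalInches)
    {
        return std::sqrt(w * w + h * h) / diagonalInches;
    }

    int main()
    {
        std::cout << "3840x2160 on 24\": " << Ppi(3840, 2160, 24.0) << " ppi\n";
        std::cout << "3840x2160 on 27\": " << Ppi(3840, 2160, 27.0) << " ppi\n";
        std::cout << "5760x3240 on 24\": " << Ppi(5760, 3240, 24.0) << " ppi\n";
        return 0;
    }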
 
Actually, I'm not entirely convinced the interconnect is going to be a big issue; just add more pins when you can't increase the data rates further. The difference between single-link and dual-link DVI? Six pins (+ and - for red, green, and blue). If you drop the analog pins from the DVI specification and shrink the pins slightly, I'm sure quad-link DVI would be possible. Same with DP: just double the number of links. It might be more expensive, sure, but not a huge problem from a technical standpoint.
 
It gets a lot more complex, and thus expensive, when you want to start doing parallel data paths. I'm not saying it is impossible; it is done. However, we are talking about something for widespread availability, and that means it needs to be economical. Remember, you can have a 4K monitor right now... they are just $50k.

Also, even then, a new standard would still be needed. DP has the number of links it has. A new version could be made with 2x, but it would have to be standardized and implemented before it could be used.

Just saying there are still some lurking issues. The biggie is just cost of making a high rez panel, but there are others. I've no doubt they'll be overcome and we'll see monitors that have pixels too small to discern, but it is still a ways off.
 
I have seen the difference between video cards, but it's not brand specific. Usually one card model is great, and the next generation has an issue. Usually it comes down to the type of monitor. My Dell 2001 monitor shows the differences immediately, whereas my I-INC iH282 does not show any differences, nor does my Samsung P2370.

Seems to me the video cards are pretty much equal when used with washed-out-looking monitors made for playing games with fast response times. That kind of makes sense, as those who play games usually have fast TN monitors that are incapable of 'crisp' image detail anyway. Get a professional monitor and the difference can be astounding.
 
I used to notice image quality differences on my old 21 inch CRT running on VGA. DVI pretty much killed that concern for me.
 
I would love to see some of these posters claiming to see a difference submit to a blind visual test of identical systems, save the video card, and see how accurate they are in trying to identify the vaseline. My prediction: they will be correct about 50% of the time. An earlier poster compared this to audiophiles and videophiles, and he was exactly right. People don't understand the power of bias and preconception and how it can change your perception (which to you seems real enough). That's why there are blinded tests in the first place; this phenomenon is well known!

+1

Going back about 18 years, in the analog days, there were differences, but I agree with the OP that it is mostly gone these days (you can still see it a bit on VGA, with the OEM being the larger piece of the puzzle, not the GPU manufacturer).

However certain of themselves some posters in this thread seem to be, I think they would not be able to tell the difference in a blind taste test over a digital connection.
 
Sycraft's Image Quality Showdown


What that all means is the conclusion is pretty clear: There are many good reasons to buy team green or team red, but 2D image quality is not one of them. They are precisely the same.

Thank you for slaying this myth (well, except for those who believe personal delusions instead of evidence).
 
And you trust everything you read on the internet even when you don't know a person's credentials or agenda, correct? :eek:

Credentials don't matter when the data and the testing method are presented so that the test can be repeated by others.
 