A Real Test of nVidia vs AMD 2D Image Quality

This debate is much like the vinyl-versus-CD debate from the early 1980s. Most everyone who grew up listening to tube amps and vinyl records knew what analog recordings tried to convey. They had a signature sound and warmth that people loved. Then along came CDs, and to this very day people are still trying to tell us old timers that they sound good, while we know they sound like sh*t. But CDs had the portability that ran poor old records out of town and into extinction.

We had great-looking but monstrous monitors that displayed any resolution with the same clarity at high refresh rates. A cheap monitor that I purchased for my niece ran at 85Hz for less than $100; the girl had seizures if she used any lower refresh rate. Nobody from the '80s to the early 2000s really knew what input lag was. Everything was grand. Some monitors looked better than others, but that was all about what you were willing to spend to get that perfect picture.

Now we have LCDs that look like sh*t, replaying the CD story from years ago. You can't change resolutions without things looking crappier than before. They have so much input lag that you have to pray some random test site reviews one before you buy, and to top it off, the more expensive and better-looking ones have the most input lag. WTF??? Even the best of them look like crap compared to a nice CRT. But they are small, and that's what we like as consumers.

So in a digital world, how are our analog eyes going to survive this onslaught of smaller, cheaper, inferior appliances? Those of us who really despise LCD TVs, LCD monitors, CDs, MP3s, and the oxymoron that is "lossless" when it comes from an inferior digital source: what are we supposed to do? Seems all we can do is inspire someone to invent a smaller analog version of it all, or accept the crap that is currently coming from the manufacturers.

TN and IPS look like pure crap. No amount of money can make them warm or vibrant or anything else. Just do like me and buy something and pretend that you don't know that the monitors made 10 years ago are superior. It's the only way to keep your sanity when forced to use bad tools.

@Xoleras I notice the same thing on my 7950 OC. The faster I go, the worse the picture looks until I can't stand to look at it. I think that is why AMD doesn't sell 1200 binned chips.

@Sycraft: Nice work. I'm not going to try to pick holes in it, as it's commendable that you even attempted this. I can notice differences between AMD's and Nvidia's fonts, AA, and single-card microstutter. Maybe something is physically wrong with the cards I've used, but it is what it is.

@Q1DM6: I feel for you, as I know what you mean. 20/20 vision in one person isn't the same as in another. For example, I don't like AMD's AA, especially MLAA, but I think AMD cards display images without AA much better than Nvidia's do. How can I convey that? I don't know. All I can tell you is that in BF3 my K/D ratio tripled on an AMD card without AA compared to my old Nvidia SLI setup. Is that placebo or "scientific facts"? Who knows, who cares; it's what works for me. Keep using what works best for you.

Back to the example of my niece. If she stares at her Apple monitor for over an hour she starts to get headaches, which lead to her seeing spots and possibly having a seizure. If she comes over and stares at my 120Hz monitor she can surf the web for five hours without problems and leaves happy. She does photo work for school, so she can't use my cheap TN. Very sad situation, just like how you can't stand to look at Nvidia images.
 
No, let's not go through setup 101; don't let your arrogance overwhelm you. You think I sat here and didn't ask myself these questions? Didn't use the same cables? Didn't switch to others? I have DVI, HDMI, and DP in combination with these cards, and none of them makes a difference. For the record, I'm now on mini-DP to DP on a 7770, which looks exactly the same as DP to DP on the 6670 from another manufacturer.

Yeah, I wish I had the resources of Intel to be able to run a whole slew of tests for the incredibly bitter crew here. But I don't. Until then, please sit and enjoy your substandard IQ thinking it's wonderful.

Dude, you realize you are getting all emotional and upset over this. It's not a big deal, man; you don't need to take it personally. Sycraft's posts have consisted purely of logic and experimental data. He followed the exact scientific method: you have a question (is there a difference?), then you make a guess/hypothesis (no), then you test your hypothesis to see if you are correct or not. It doesn't MATTER if you are wrong or right, nobody cares; the whole point of this thread was just to find out, using an objective method (which means it is fact and not opinion), whether there really is a difference. In these two categories, color output and font rendering, the cards both produced an identical image. That's not his opinion, or mine; that's simply what the data shows.

I used to have a room-mate who would get all upset if I proved something using an objective scientific method but it didn't agree with how he thought things worked. He always thought I was just trying to be right, when in reality I DIDN'T CARE what the answer was; I just wanted to know it FOR SURE.
 
I get headaches when looking at Apple screens for too long too, but it's because of the glare caused by the glass overlay.
 
The OP didn't even know ClearType is hardware accelerated; who the hell knows how good his data is, how his tests might be flawed, or what exactly he might be missing.

The OP listed what he used and how he collected the data and presented it to us. He drew some conclusions which I find to be quite reasonable based on the available data.

Whereas all you have is a bunch of anecdotes?

I have a bridge I'd like to sell you...
 
Good job OP, just a question.
OP said:
The first step on both systems is to make sure that all onboard color modifications are turned off. We know that the cards can change the color if they want; the question is whether they are different despite the settings. That means having brightness, contrast, gamma, digital vibrance, hue, temperature, and all that set to no modification.

Does this mean this test was done at ATI/Nvidia default settings, or did you alter both cards' settings to something else?
 
Well, that is the crux of it, isn't it? Some people see it, some don't. So you think those of us who do are just randomly making it up?
No, the placebo effect is not "making it up". It's what happens when a bias is allowed to dominate your perception such that you become convinced of things that are not there. Your eyes are not the only component of your visual perception: your brain must process the data transmitted by your eyes, which can yield a scenario where your brain influences what you believe your eyes are receiving.

You believe that AMD image quality is superior. Your brain believes this is true, due to an internal bias, and propagates that belief as a function of your perception. In simple terms, a placebo effect is the perception of what is not. The only way to nullify the possibility of the placebo effect is to perform objective testing, usually in a double-blind manner, comparing one thing to another that is perceived to be different. As a colorimeter demonstrates no perceptual bias, it's a means to objectively test for differences between images.

To dispute the claim that AMD and NVIDIA 2D image quality is identical, you need to come forth with a way to objectively test your claim. You keep repeating "I see a difference", but that is not sufficient.
 
With regard to fonts and ClearType: if you feel there is a difference, and that difference is because of the rendering the card is doing, then you don't actually have to go to any great lengths to get evidence of it: a screenshot will do the trick. If the information being rendered in the card's framebuffer is different, it'll show up in a screenshot. When you take a shot of ClearType'd text and zoom in, you can see all the anti-aliasing. You can't see the direct effect on subpixels (because the capture is at pixel level), but it is clearly visible all the same.

So if you can capture the difference in a screenshot, then you know there is a difference in what the cards are trying to do. If you can't, that means the same data is being sent out.
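If anyone wants to check that for themselves rather than eyeballing it, here's a minimal sketch of the zoom-and-inspect idea in Python, assuming Pillow is installed; the filename and crop coordinates are just placeholders for wherever the ClearType text sits in your screenshot.

Code:
# Assumes Pillow is installed; "text_shot.png" and the crop box are placeholders.
from PIL import Image

shot = Image.open("text_shot.png").convert("RGB")

# Crop a small region around some text and blow it up with nearest-neighbour
# resampling so each captured pixel stays a crisp square you can inspect.
region = shot.crop((100, 100, 140, 120))   # (left, top, right, bottom)
zoomed = region.resize((region.width * 8, region.height * 8), Image.NEAREST)
zoomed.save("text_zoomed.png")

# ClearType's subpixel AA shows up as coloured fringes, i.e. pixels where the
# red, green, and blue values are not all equal.
for x in range(region.width):
    r, g, b = region.getpixel((x, 10))
    if not (r == g == b):
        print(f"pixel ({x}, 10): R={r} G={g} B={b}  <- AA fringe")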
 
Nice job OP. I came into this thread expecting it to be another troll thread but was incredibly surprised to find an actual test.
 
First of all, I've said this numerous times, but again: in my case it has nothing to do with Nvidia vs ATI and nothing to do with color accuracy. It has everything to do with how "in focus" the image is and how warm it gets. As far as placebo goes (wonderfield), you can just stop with the arrogance of suggesting that people making these assertions are crazy. Do you seriously think anyone spends $1,084 (a pair of EVGA 680s in my case) on hardware just to prove that there are problems? Give me a break. Anyway, my factory-overvolted EVGAs made the image go warm faster on my IPS panel. I don't know why, but they did, and it resulted in a slightly less focused image. I replaced the cards with a 4GB Galaxy GTX 680 that does not exhibit the issue. Point is, I think it depends on the particular brand and SKU.

Again: nothing to do with color accuracy and nothing to do with Nvidia vs ATI. As far as seeing what you want to see, it's not too hard to see that those who jumped on this thread are here to defend Nvidia's honor. Too bad I don't think it has anything to do with Nvidia vs ATI, and more to do with certain brands and factory-overvolted cards. I'm not really out to prove that I'm right, either. This is just something that will gauge my future buying decisions in terms of preferred brands, nothing more, nothing less. If others find they see no differences in 2D image quality, good for them. And if you don't believe a word of this, I don't care at all... I buy based on what *I* like ;)
 
Good job OP, just a question.

Does this mean this test was done at ATI/Nvidia default settings, or did you alter both cards' settings to something else?

I made sure both cards were set so that they weren't mucking about with the colour. I believe that is the default, but I can't say for sure on the AMD card, as it was a long time ago that I set it up and it's possible I changed something then (for the test I went and verified everything was set to make no changes; I didn't have to alter any settings, as that is how it already was).

If I remember, I'll see what the defaults are when I get my new laptop. However, as I said, I think they do default to not making any changes (it would make sense). I do remember one period when AMD drivers had a bug where they made minor changes to the whitepoint by default. I installed new drivers and things looked wrong, a bit too reddish. I opened the control panel, found one of their settings was off, set it back, and everything was good.
 
When I went from two Radeon 6970s to one GTX 680, I didn't notice any difference whatsoever, so I'm not surprised by this at all.
 
First of all, I've said this numerous times, but again: in my case it has nothing to do with Nvidia vs ATI and nothing to do with color accuracy. It has everything to do with how "in focus" the image is and how warm it gets.
Define "warm". Ordinarily, "warmth" refers to colors which tend toward the red side. A colorimeter would have no trouble detecting when an image is "warm".

As for placebo effect, it doesn't mean you're crazy. It means your brain is functioning as intended. Hallucinations are an entirely different thing which would be indicative of a mental disorder.
 
Just an FYI: if you have FXAA forced or enhanced in the Nvidia control panel, then depending on the program you will get the filter applied to fonts. I've seen it on some customer units.
 
Well, that is the crux of it, isn't it? Some people see it, some don't. So you think those of us who do are just randomly making it up?

Yup, I'm one of those people who notice microstutter; I was annoyed as crap by the Gamebryo stutter that most people don't notice. I'm sensitive to input lag. I've switched between Nvidia and AMD a few times in the past few years and never seen a difference at the desktop or with text, and there is no reason why you should. That doesn't even make sense.

I truly believe that these claims are just an example of the placebo effect.
 
No, let's not go through setup 101; don't let your arrogance overwhelm you. You think I sat here and didn't ask myself these questions? Didn't use the same cables? Didn't switch to others? I have DVI, HDMI, and DP in combination with these cards, and none of them makes a difference. For the record, I'm now on mini-DP to DP on a 7770, which looks exactly the same as DP to DP on the 6670 from another manufacturer.

Yeah, I wish I had the resources of Intel to be able to run a whole slew of tests for the incredibly bitter crew here. But I don't. Until then, please sit and enjoy your substandard IQ thinking it's wonderful.

The OP put together a well-documented test backed by scientific findings and all you continue to do is attack it with no proof whatsoever. You've brought nothing to this thread other than baseless statements. It's just pointless to make statements like "enjoy your substandard IQ thinking it's wonderful" if you have nothing to back it up with. If that's your experience, fine, but don't attack his research with no proof. Everyone is wired differently and color can be subjective. You obviously aren't willing to look at it in an objective light, so time to shut up and move on.
 
Yup, I'm one of those people who notice microstutter; I was annoyed as crap by the Gamebryo stutter that most people don't notice. I'm sensitive to input lag. I've switched between Nvidia and AMD a few times in the past few years and never seen a difference at the desktop or with text, and there is no reason why you should. That doesn't even make sense.

I truly believe that these claims are just an example of the placebo effect.

"I'll give an example of something most other people don't notice but I do, then entirely dismiss your own experience".

Done with this thread, moving on. I'm happy for those of you who are visually impaired.
 
I used a green magic marker to color my GPU's DVI connector. It did wonders for the warmth and fidelity of the image. But once I sat the box on some river rocks selected by a Feng Shui master, it got even better.
 
This site is filled with people who turn a monitor running Windows to portrait mode and don't even see the degradation in quality. :rolleyes:
 
Nice write-up, OP. Ignore the self-centered snobs in the thread. I don't see anything wrong with the data or conclusions you've presented. I would have been surprised if there was a difference in this day and age, tbh. But then again, I'd guess 25% or more of this forum thinks that things look better when they're getting 120fps on their FRAPS counter than 90fps on a 60Hz monitor.
 
This site is filled with people who turn a monitor running Windows to portrait mode and don't even see the degradation in quality. :rolleyes:

Lol, explain your logic behind that one, that is... if there is any logic...
 
Excellent post, Sycraft. I am one of the users who noticed the difference when switching from an Nvidia GTX 280 FTW to an AMD HD 5970 in Windows Vista, using DVI-D on my old HP LP2475w. Now I've upgraded to Windows 7 and an HP ZR30w, and I use DisplayPort. I might give this another shot with Nvidia's latest and greatest GPU in the future. There shouldn't be that much of a difference (if any) going from DVI-D to DisplayPort, I assume.
 
I used a green magic marker to color my GPU's DVI connector. It did wonders for the warmth and fidelity of the image. But once I sat the box on some river rocks selected by a Feng Shui master, it got even better.
Right on. If you really want your display to 'pop', though, surround the DVI cable with used toilet paper rolls and elevate it from the surface using any harmonically-balanced, non-magnetic mounting apparatus. You'll see major gains in sharpness, warmth, clarity, danceability, diode warble neutrality, swing and baseball bat.
 
This site is filled with people who turn a monitor running Windows to portrait mode and don't even see the degradation in quality. :rolleyes:

You realize the quality degradation you refer to has nothing to do with Windows or the video card, and is entirely dependent on the monitor? Sure, I've had some cheap monitors that were unbearable when rotated sideways, and that has everything to do with the pixel alignment and viewing angles the panel technology was designed to support. It has nothing to do with the OS! ROFL...
 
So, another little test: a couple of screenshots from nVidia and AMD hardware. As I said, if there were any deliberate difference in font rendering, it would show up in screenshots. If the cards were purposely doing the subpixel AA differently, it would show up in their framebuffers and hence in a screenshot.

So here's MS Word on my nVidia GeForce GTX 560 Ti at work:

[attached screenshot: nvidia.png]


And here is MS Word on my co-worker's AMD Radeon HD 5750:

[attached screenshot: amd.png]


We can of course see deliberate differences, like the fact that he has a different colour scheme for his controls and uses rulers and I don't. However, when you zoom in on the fonts, you see they are pixel-for-pixel identical. In fact, we can test it really easily: if we overlay the fonts on top of each other and then do a mathematical difference, any areas that are the same will become black, and any areas that are different will show up in white. So here's that:

[attached screenshot: difference.png]


I shuffled the position of one of the layers around so the fonts overlay right on top of each other and as you can see, they cancel out to black. They are 100% the same.

You can try it yourself in Paint.NET. Take the nVidia and AMD PNGs and load them into one file as layers. Set the top layer to "difference" blending, then line up the text. It will vanish (you can also make the top one semi-transparent, line up the text that way, switch to difference mode, and then make it opaque and watch it vanish).

Also these weren't systems I'd specially configured or anything. I had nothing to do with my co-worker's system, and my work system isn't all calibrated n' such like my one at home.
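If you'd rather script the check than fiddle with layers in Paint.NET, here's a minimal Python sketch of the same difference test, assuming Pillow is installed and that the two PNGs have already been cropped/aligned so the same text sits at the same coordinates (the equivalent of lining the layers up by hand):

Code:
# Minimal sketch of the difference test using Pillow. Assumes nvidia.png and
# amd.png are the same size and already aligned so the text overlaps exactly.
from PIL import Image, ImageChops

a = Image.open("nvidia.png").convert("RGB")
b = Image.open("amd.png").convert("RGB")

# Per-pixel absolute difference, the same thing as the "difference" layer
# blend in Paint.NET. Identical pixels come out black (0, 0, 0).
diff = ImageChops.difference(a, b)
diff.save("difference.png")

# getbbox() returns None when every pixel of the difference image is black,
# i.e. the two renders are pixel-for-pixel identical.
if diff.getbbox() is None:
    print("Identical: both cards rendered the same pixels.")
else:
    print("Differences found within bounding box:", diff.getbbox())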
 
Clearly, all this is converging to is an evidence-based, fact-supported opinion vs. just plain opinion.

People can believe all that they want. Whether they use facts, evidence, or just their own perception, they will, undoubtedly, never stray far from others who want to find solidarity, forming groups to align their beliefs and convictions.

Nice tests, regardless. I have no preference for IQ in Windows, as that is not where I spend 90% of my waking life. I like the IQ my own eyes perceive and, hopefully, they can stay healthy enough that they don't degrade into oblivion. But I will be the first to say that I notice a difference between using MadVR as a video renderer versus EVR-CP; because it runs as a Direct3D program, it lets me use 32x AA on the NVIDIA card in my HTPC, which also helps make it look that much better.

And then, I game on my PC with CrossFire. Funny how that works, not being loyal to a certain manufacturer. Although, with my CrossFire experiences, I'll be thinking twice about AMD again unless something drastically different is done.

But anyway, the tests here are cool, the opinions are very entertaining, but the arguments are just boring and, quite honestly, immature.
 
Back in the RAMDAC days with analog output, there were good and bad cards, good and better cards, etc., even with the same GPU from different vendors. There was definitely a difference, and AMD was not always the winner.

I blessed the day when LCDs made digital output king and I could stop getting screwed as much by component selection in the card and monitor. However, there IS a difference; it's just not in hardware, it's in default driver settings. AMD makes things a little more saturated and contrasty, which looks nice and can make reading more pleasant, until you do some serious reading on the monitor for extended periods, when it can lead to greater eye fatigue. Nvidia seems to aim for a different setting fairly frequently, and it can be good or not so good. Neither brand is horrible by any means.

Both brands look pretty much identical if you calibrate your monitor.

Although I basically agree with the OP's test, I do have to say the images of the fonts seem to show differing intensities in the pixels dimmed for anti-aliasing the fonts, and that would affect how the fonts look. I'm not going to say he proved himself wrong, but being familiar with cameras and monitors, the fact that he shot handheld rather than using a tripod makes the photo test pretty much worthless for comparisons. If he let the camera set the exposure, it is also worthless. Adobe's color picker agrees with me that there's a slight difference in the photos.

The framebuffer approach with difference is more sound.
 
So, another little test: a couple of screenshots from nVidia and AMD hardware. As I said, if there were any deliberate difference in font rendering, it would show up in screenshots. If the cards were purposely doing the subpixel AA differently, it would show up in their framebuffers and hence in a screenshot.

So here's MS Word on my nVidia GeForce GTX 560 Ti at work:

[attached screenshot: nvidia.png]


And here is MS Word on my co-worker's AMD Radeon HD 5750:

[attached screenshot: amd.png]


We can of course see deliberate differences, like the fact that he has a different colour scheme for his controls and uses rulers and I don't. However, when you zoom in on the fonts, you see they are pixel-for-pixel identical. In fact, we can test it really easily: if we overlay the fonts on top of each other and then do a mathematical difference, any areas that are the same will become black, and any areas that are different will show up in white. So here's that:

[attached screenshot: difference.png]


I shuffled the position of one of the layers around so the fonts overlay right on top of each other and as you can see, they cancel out to black. They are 100% the same.

You can try it yourself in Paint.NET. Take the nVidia and AMD PNGs and load them into one file as layers. Set the top layer to "difference" blending, then line up the text. It will vanish (you can also make the top one semi-transparent, line up the text that way, switch to difference mode, and then make it opaque and watch it vanish).

Also these weren't systems I'd specially configured or anything. I had nothing to do with my co-worker's system, and my work system isn't all calibrated n' such like my one at home.

Honestly, I see minute differences in the text, but I would chalk it up to the equivalent of an optical illusion rather than a difference in how the graphics cards generate the respective images.
 
So, does switching a monitor from 60Hz to 120Hz make a difference on your desktop? Mine surely does to me, but I'm told on these boards that it's placebo. It's to the point where low-refresh-rate monitors look drab and gloomy to me. In the last pic Sycraft posted I see what he means that the two images are the "same", but if I stare at the AMD pic for too long I get a headache. If I just stare at the text in the box it's not as bad, but it's still not preferable to me. Maybe his grey is triggering my sleep instinct. LOL!

Post more and let us pick: prefer, dislike, or same. Don't label the pics. Then tally how many think the Nvidia fonts are great and how many prefer the AMD ones. That would be interesting. :)
 
I'm honestly horrified at the people in this thread still screaming to high heaven that there is a difference, yet providing no data to back up their claims.

If you can't provide objective data to back up your claim, then you have no box to stand on.
 
So, does switching a monitor from 60Hz to 120Hz make a difference on your desktop? Mine surely does to me, but I'm told on these boards that it's placebo. It's to the point where low-refresh-rate monitors look drab and gloomy to me. In the last pic Sycraft posted I see what he means that the two images are the "same", but if I stare at the AMD pic for too long I get a headache. If I just stare at the text in the box it's not as bad, but it's still not preferable to me. Maybe his grey is triggering my sleep instinct. LOL!

Post more and let us pick: prefer, dislike, or same. Don't label the pics. Then tally how many think the Nvidia fonts are great and how many prefer the AMD ones. That would be interesting. :)

No real point. I mean, people might prefer one image over the other because of the different Office backgrounds or the like, but that has nothing to do with the fonts. The real test is the difference test: if a mathematical difference operation results in zero (which means a black image), then you know you have identical pixels. That is just how it works.

I posted the originals so people could try it themselves, or analyze them zoomed in (when looking for small differences, zoom is what you'd want), not because I thought there'd be no preference. As I said, my co-worker likes to fiddle with the skin of Office; mine is stock.
 
I have never noticed a difference between the two vendors. With that said, I can think of a few more theoretical causes of differences that might be perceived between the two vendors. Here are some of them:
  • The transfer from the framebuffer to the DVI output somehow flips some of the bits and causes data loss
  • You are comparing static image to static image when in reality you are always seeing a constantly changing image. Perhaps some of the images during the 60Hz refresh cycle are slightly different, causing undesired effects
  • Would different vendors render ClearType differently when there are different colors behind the text?

Of course, testing these hypotheses is much harder than the other ones. You would need to capture the DVI signal and compare it for a set period of time to check the differences.
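For the second hypothesis there's at least a rough software-level check you can do, though it only samples the framebuffer and not the actual DVI/DP signal, so it can't catch link-level problems. A sketch, assuming Python with Pillow's ImageGrab (Windows/macOS):

Code:
# Rough sketch: grab the screen repeatedly for about a second and flag any
# capture that differs from the first one. Only tests the framebuffer, not
# the signal going over the cable.
import time
from PIL import ImageGrab, ImageChops

baseline = ImageGrab.grab().convert("RGB")

for i in range(60):
    time.sleep(1 / 60)
    frame = ImageGrab.grab().convert("RGB")
    if ImageChops.difference(baseline, frame).getbbox() is not None:
        print(f"capture {i}: pixels changed since the first grab")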

I also have no idea how a "warm" image affects its sharpness. I thought "warm" was more of a color difference. Also, to test the warm image, please take a macro shot of a warm picture and one of a non-warm picture to compare. Don't let your camera auto-adjust, as that would make the pictures different.
 
My 2c

My earliest PC had an S3 ViRGE (2MB, 2D-only, analog), and when I moved to a Matrox Millennium II it was clear that the 2D quality was much better, but this was in 1999!

My first 3D card was a GeForce 2 MX made by PowerColor. Back then it was completely up to the manufacturer of the card what RAMDAC to install (for nVidia cards), and PowerColor used really shitty 250MHz RAMDACs; as a result, 2D quality sucked.

On the flip side, ATI had their own RAMDAC silicon, so when PowerColor started manufacturing ATI cards, they had to use ATI's 400MHz RAMDAC. Between the GeForce 2 Pro and the Radeon 8500, the ATI card had vastly superior 2D quality.

So yes, once upon a time it was possible to get an nVidia card with shitty 2D quality while it was impossible to get an ATI card with crap 2D quality, and I believe this is where the biases originated.
 
This thread reminds me of debates between Young Earth Creationists and evolutionary biologists.
 
You can measure the difference there.

Yes, I think what people need to understand is that there are two issues when talking about differences in how something looks/sounds/etc.

First off, there is the question of whether there is any difference at all. If you can't measure a difference to a statistically significant degree (there are always trivial differences due to measurement error when taking actual measurements of the world), then there is really no more to talk about: there is no difference to be perceived. There's no point in arguing over whether someone may have better perception, because it doesn't matter; there isn't a difference.

No question, you can measure the difference between 60Hz and 120Hz monitors.

Then, if there is a measurable difference, there's the question of whether humans can perceive it. We can measure things we can't perceive (at least not without the help of our tools); that's part of the reason to do so. So when something can be measured, you then have to test to see whether it makes a difference humans can notice. One area like that where there's debate is audio sample rates. 44.1kHz and 48kHz are the normal sample rates used for sound. However, some formats, like DVD-Audio, can go much higher, like 96kHz or 192kHz. Some claim there is no audible difference, others claim there is. However, neither side claims there's not a measurable difference; it is easy to measure. The question is whether humans can perceive it (the best double-blind studies I've seen indicate that we can't).

In terms of 120Hz, I would guess people can indeed perceive the difference without much trouble, though I've never tested it. However, I wouldn't then extrapolate that to mean any higher frame rate is perceptible. At some point you reach the limit: 60Hz to 120Hz could be easy to perceive, but 10,000 to 20,000fps could be impossible to perceive. Where the limit is would be a matter for a different kind of test.
 
You realize the quality degradation you refer to has nothing to do with Windows or the video card, and is entirely dependent on the monitor? Sure, I've had some cheap monitors that were unbearable when rotated sideways, and that has everything to do with the pixel alignment and viewing angles the panel technology was designed to support. It has nothing to do with the OS! ROFL...

Sure, it has to do with the pixel alignment, and because of it ClearType doesn't work in portrait. The thing is, most of these people will run portrait and not even notice.
 