A Real Test of nVidia vs AMD 2D Image Quality

Credentials don't matter when the data and the testing method are presented so that the test can be repeated by others.

Sure OK, because everyone here has access to the exact same equipment that was used and it will be repeated by at least two or three others here just to make sure there are no inconsistencies.

Just being realistic...because I would hope common sense would tell you that no one here is going to repeat the tests and purchase any equipment needed to duplicate it. Agree?
 
This was a good read. I remember back in the analog days you could sometimes see a difference between brands or even individual cards of the same brand (ATi always looked more washed out to me).

I have not really noticed any difference in the last 10 years or so once I started using digital connections.
 
Sure OK, because everyone here has access to the exact same equipment that was used and it will be repeated by at least two or three others here just to make sure there are no inconsistencies.

Just being realistic...because I would hope common sense would tell you that no one here is going to repeat the tests and purchase any equipment needed to duplicate it. Agree?

You don't need the same equipment to do a verification test. You just need AMD/NV cards, any monitor, and a calibration device to measure objectively with.

There really should be no controversy or surprise. Once we enter the digital domain, #3366cc (51,102,204) is going to be the same no matter which card it's coming out of.

The crazy part is that this test was needed at all. The doubly crazy part is people still disagreeing with the obvious even after solid test results were presented to erase what should at best be lingering questions.
 
Everything you see or read on the internet that you did not witness with your own eyes should be taken with a grain of salt. If you haven't figured that out yet, then good luck with the internet. In this case, if you were not actually there to witness the test parameters, to confirm nothing was tampered with, adjusted, or improperly set up, and you have not repeated the tests yourself with the same equipment in your own environment, then you have already made the decision to simply trust what someone else is telling you as fact, and I'm just not naive enough to do that.

Please note, this is not taking a shot at the OP; I'm just saying I don't always believe everything I read on the internet, no matter how factual it may appear to be, because things are very easy to manipulate, even screenshots, data, etc., and without being there to verify it personally, I just can't easily accept it as fact.
 
Just being realistic...because I would hope common sense would tell you that no one here is going to repeat the tests and purchase any equipment needed to duplicate it. Agree?

Agreed. There's got to be at least one other knucklehead around here who can try this test too, though. :p
 
Meh, I couldn't care less what that test says; it's not showing something. After running video cards from the GF2 MX200 to the GF GT520 and from the Radeon 8500 to the HD6450, many times there has been a noticeable difference in 2D quality even after adjusting settings. This has been my experience from building PCs for friends and family for a few dollars. Maybe nvidia just skimps on their low end cards?
 
Meh, I couldn't care less what that test says; it's not showing something. After running video cards from the GF2 MX200 to the GF GT520 and from the Radeon 8500 to the HD6450, many times there has been a noticeable difference in 2D quality even after adjusting settings. This has been my experience from building PCs for friends and family for a few dollars. Maybe nvidia just skimps on their low end cards?

The difference is likely in default driver settings and/or monitor calibration between the two cards (as stated in this thread). Also Q1DM6 should be banned for a while because of his posts here.
 
People are on crack, heroin, marijuana, Quaaludes, LSD, 24-packs of beer, and liquor.

People see different things ya know ;)

Never believe the hype they try to sell you! Investigate.
 
Everything you see or read on the internet that you did not witness with your own eyes should be taken with a grain of salt. If you haven't figured that out yet, then good luck with the internet. In this case, if you were not actually there to witness the test parameters, to confirm nothing was tampered with, adjusted, or improperly set up, and you have not repeated the tests yourself with the same equipment in your own environment, then you have already made the decision to simply trust what someone else is telling you as fact, and I'm just not naive enough to do that.

Please note, this is not taking a shot at the OP; I'm just saying I don't always believe everything I read on the internet, no matter how factual it may appear to be, because things are very easy to manipulate, even screenshots, data, etc., and without being there to verify it personally, I just can't easily accept it as fact.

This is true, you can't believe everything on the internet. However, you can't just decide something is wrong because you feel like it or because it disagrees with your opinion. That's not how it works.

Are you of the opinion that the OP's results are fabricated or otherwise false?



Meh, I couldn't care less what that test says; it's not showing something. After running video cards from the GF2 MX200 to the GF GT520 and from the Radeon 8500 to the HD6450, many times there has been a noticeable difference in 2D quality even after adjusting settings. This has been my experience from building PCs for friends and family for a few dollars. Maybe nvidia just skimps on their low end cards?


Yes, it is showing something. It shows the output of the two video devices used in the test is IDENTICAL. It is well known that back in the days of RAMDACs and analog signals there were differences, but now in the age of all-digital interconnects that is not the case. These days if you still think you see a difference, it is placebo and/or incorrect expectations coloring your vision, i.e., rose-colored glasses. It is a normal thing, so don't let that fact upset you.
 
Back in the day, when I was comparing Matrox/ATI/NVIDIA/S3/3dfx, I only noticed a font/color difference in the BIOS startup screen (and other places where the generic system font was used), and differences depending on whether or not some manner of digital vibrance was enabled.

It's good to see that there's no significant difference between the two biggest gaming card vendors when it comes to color accuracy. This is something like Mythbusters for nerds, right? GET A MUSTACHE! :D
 
My agenda wasn't anything other than to present the results.

The impetus behind the test was to see if there was something to this claimed difference. I'd never seen a difference and I've used a LOT of different cards, since I do computer support for a living. I can find labs full of systems with AMD, nVidia, and Intel GPUs in my building. Also I've used both in my desktop, and happen to have both at home right now. Never noticed a difference. Then, as I said, I know a bit about how the signal transfer actually works and I couldn't see how they should look different. Plus I have something of a high end display setup, so I would think if there was a difference, surely it would show on my system.

However I'd seen the claim more than once. So I was open to the possibility I just couldn't see it. I'm aware of limits to perception, I don't pretend I'm perfect. So I figured I'd just test it out. My i1 isn't perfect either, but it is more sensitive than a human and more importantly can give objective results. I set up a test to see what I'd get, and this is what I got.
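
If anyone wants a feel for how the number side of a test like this can be checked, here's a rough Python sketch using made-up Lab readings (not my actual measurements) and plain CIE76 delta-E, which is just one common way to quantify a color difference:

```python
import math

def delta_e76(lab1, lab2):
    """Euclidean distance in CIELAB space (CIE76 delta-E)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical colorimeter readings of the same test patch from each card (L*, a*, b*).
nvidia_patch = (53.2, 8.1, -32.4)
amd_patch = (53.2, 8.1, -32.4)

dE = delta_e76(nvidia_patch, amd_patch)
print(f"delta-E: {dE:.2f}")  # values under ~1.0 are generally below what the eye can pick out
```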
 
This is true, you can't believe everything on the internet. However, you can't just decide something is wrong because you feel like it or because it disagrees with your opinion. That's not how it works.

Are you of the opinion that the OP's results are fabricated or otherwise false?

I said nothing about the OP being wrong or right; I simply stated I don't believe everything I read on the internet no matter how factual it may "appear" when presented. Nothing more. I was just replying to one person's post claiming the data presented was factual and that there was no arguing otherwise unless someone else could post facts showing as much. My point was that data, screenshots, and just about any other info can be altered, skewed, and even presented in a manner to appear legit.
 
My agenda wasn't anything other than to present the results.

The impetus behind the test was to see if there was something to this claimed difference. I'd never seen a difference and I've used a LOT of different cards, since I do computer support for a living. I can find labs full of systems with AMD, nVidia, and Intel GPUs in my building. Also I've used both in my desktop, and happen to have both at home right now. Never noticed a difference. Then, as I said, I know a bit about how the signal transfer actually works and I couldn't see how they should look different. Plus I have something of a high end display setup, so I would think if there was a difference, surely it would show on my system.

However I'd seen the claim more than once. So I was open to the possibility I just couldn't see it. I'm aware of limits to perception, I don't pretend I'm perfect. So I figured I'd just test it out. My i1 isn't perfect either, but it is more sensitive than a human and more importantly can give objective results. I set up a test to see what I'd get, and this is what I got.

Many people have seen a difference and I'm not inclined to call them liars either. I believe sight, just like hearing, can be a somewhat subjective thing based on how the brain interprets what it sees.
 
Many people have seen a difference and I'm not inclined to call them liars either.

Lying implies intent to deceive. So I'm not sure why you would call them liars.

However, it might be appropriate to call people stubborn when they cling hopelessly to their position even though no data exists to bolster it and devices far more capable than the human eye exist to capture any such difference.

I believe sight, just like hearing, can be a somewhat subjective thing based on how the brain interprets what it sees.

I have a $50,000 pair of glasses I want to sell you. It will make the NVIDIA Vaseline problem go away.
 
I said nothing of the OP being wrong or right, I simply stated I don't believe everything I read on the internet...

So you have no opinion on the matter? You just like casting FUD on his testing for no apparent reason? :rolleyes:

Anyone with a decent understanding of color standards and of working in the digital domain would understand that, as he points out, any specific color code is a digital representation that goes digitally straight into the monitor, and so should be identical on each card.
#3366cc(51,102,204) NV = #3366cc(51,102,204) AMD.

There are standards for this, and any company that couldn't match the standards shouldn't be doing graphics cards. It isn't exactly challenging to keep 24-bit digital values accurate.

So it is absolutely the expected case that the digital output representing the same image will be identical from both AMD and NV products.
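
To make that concrete, here's a trivial sketch (generic Python, nothing vendor-specific) showing that a hex code decomposes into the same 8-bit triple no matter what decodes it:

```python
# A hex color code is just three 8-bit values; any card sending it over a
# digital link is sending exactly the same bits.
def hex_to_rgb(code):
    code = code.lstrip("#")
    return tuple(int(code[i:i + 2], 16) for i in (0, 2, 4))

print(hex_to_rgb("#3366cc"))  # (51, 102, 204), whether the card is NV or AMD
```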

Sycraft is just posting a bit of additional confirmation that the facts around the standards and digital transmission are working as expected.

It isn't like he posted something that should be controversial. He posted something that lines up with the established understanding of how these technologies and standards work. His is just one more piece of confirming information for what should be well understood behavior.

It is like he dropped an apple and reported that it fell to the ground, confirming our understanding of the workings of gravity. Yet some of you argue, we can't trust his testing. His info could be altered. Perhaps the first 5 apples fell up and he only took a picture of #6 on the floor, not the 5 resting on his ceiling.

The burden of proof really falls on to the gravity deniers and in this case the digital signal deniers.

Sycraft has been posting reliable information for years in the display forum I frequent, and his methodology here is sound, repeatable, and builds on established facts about how these systems work.
 
I believe sight, just like hearing, can be a somewhat subjective thing based on how the brain interprets what it sees.

Absolutely! This is 100% true, which is why he used a calibration tester to objectively quantify the results. That way there is no subjective opinion, with its possible bias, to color the results.
 
You'll have to kill me to get me to leave my Trinitrons for any type of LCD. So 2D quality (RAMDAC I suppose) still matters to users like myself.

I look forward to the day when a monitor is fully capable of beating out CRTs. Perhaps some type of OLED will do? When will that day come? Another 5 years? 10 years?
 
You'll have to kill me to get me to leave my Trinitrons for any type of LCD. So 2D quality (RAMDAC I suppose) still matters to users like myself.

I look forward to the day when a monitor is fully capable of beating out CRTs. Perhaps some type of OLED will do? When will that day come? Another 5 years? 10 years?

Beat a CRT at what? Comparing an IPS panel to any trinitron CRT is a bit lopsided because the IPS will look much, much better. Perhaps the CRT is better for gaming, but with LCD/IPS panels you get larger sizes without having a 50 pound heater sitting on your desk.
 
Beat a CRT at what? Comparing an IPS panel to any trinitron CRT is a bit lopsided because the IPS will look much, much better. Perhaps the CRT is better for gaming, but with LCD/IPS panels you get larger sizes without having a 50 pound heater sitting on your desk.

You're carrying your monitors around all the time? I don't think weight matters too much if it's just sitting on your desk, nor does it have an effect on image quality. IPS will not look "much, much better"; CRTs have better image quality and were capable of high resolutions.

The Sony FW900 was a 24" display with a 2304 x 1440 resolution, which is fair sized.
 
You're carrying your monitors around all the time? I don't think weight matters too much if it's just sitting on your desk, nor does it have an effect on image quality. IPS will not look "much, much better"; CRTs have better image quality and were capable of high resolutions.

The Sony FW900 was a 24" display with a 2304 x 1440 resolution, which is fair sized.

I used trinitron CRTs for years. Back when CRTs were relevant, trinitrons were great. But now? The heat output is considerable and the image will lose sharpness as the CRT "heats" up over time. And there's no way to get around those stupid trinitron damper wires that are VERY annoying. CRTs also tend to suffer from convergence problems that IPS panels and even TN panel LCDs do not suffer from. Color is good on trinitron but in every other respect IPS panels are in a different league.
 
You're carrying your monitors around all the time? I don't think weight matters too much if it's just sitting on your desk, nor does it have an effect on image quality. IPS will not look "much, much better"; CRTs have better image quality and were capable of high resolutions.

The Sony FW900 was a 24" display with a 2304 x 1440 resolution, which is fair sized.

Depends on what you value in image quality. CRTs usually have higher contrast ratios (though the pro ones were often lower than people think due to circuitry to help prevent burn-in), faster response/higher refresh support, are better at low brightnesses, and have variable pixel sizing. LCDs have no geometry, convergence, or focus issues ever, can have a wider colour gamut if you wish, are better at higher brightnesses (much higher if needed), and are available in larger sizes.

LCDs are not in all ways better than CRTs, but neither are CRTs in all ways better than LCDs. The geometry, convergence, and focus are major issues. I spent many an hour fiddling with my LaCie to keep it set up well, and it never had particularly good focus.

There are many reasons to like LCDs better. There are reasons to like CRTs too.
 
I used trinitron CRTs for years. Back when CRTs were relevant, trinitrons were great. But now? The heat output is considerable and the image will lose sharpness as the CRT "heats" up over time. And there's no way to get around those stupid trinitron damper wires that are VERY annoying. CRTs also tend to suffer from convergence problems that IPS panels and even TN panel LCDs do not suffer from. Color is good on trinitron but in every other respect IPS panels are in a different league.

That's one thing I didn't like about the CRTs I had: the slow fade. Everything got darker and darker and the brightness kept being cranked more and more. They became useless for editing photos; even on full brightness, when I looked at the same image on an LCD panel you could see stuff you couldn't on the CRT. :(
 
LCDs do fade over time as well, they just can do much higher brightnesses. Most pro CRTs were designed to operate at around 80-90 nits; any brighter and they'd start to bleed. They'd cap out at 150-200 nits absolute max, sometimes less. LCDs are much brighter. Most can handle 300-400 nits no problem. Hence they've got more range to deal with. Also they don't have problems being bright; they can function over their whole range without bleed.
 
You tested color accuracy. Good for you.

Did anyone make this an nvidia vs ATI thing? You seem to be invested in defending nvidia's honor here. Anyway, I made a thread earlier but it wasn't about nvidia vs. ATI. I've definitely seen differences in IQ, and what happens is some factory-overvolted cards seem to "warm" the monitor image up quicker than other brands; I've seen this a lot with EVGA and MSI Lightning cards.

Color accuracy though. Good for you. This definitely doesn't tell the full story. It is interesting though how you interpreted it to be an AMD vs nvidia thing.

I didn't take it as such. Perhaps you are looking for a "vs"? Maybe it was just a good info thread on how 2d image quality is great across the board. Hmmmm ;)
 
You're showing your age. And your age is young. This has nothing to do with you; claims of Nvidia fudging their 2D desktop quality are a thing that started around the 6800 series, maybe even before then.

Try original TNT days and before... Nvidia was always inferior to ATi in this regard when it came to analog (try running a TNT @ 1600x1200 and you will immediately see the issue). ATi simply used better filters, and their chips had better 2D quality even when the filters were bypassed.

Those days are long gone btw......

Question is, do both cards render the same 2D image in digital? For all practical purposes, yes they do. Do they render the same 3D images? No, there are subtle differences between the two. Can they both produce outstanding image quality? Definitely. Do they trade IQ for performance? Yup, nothing new here.

LCDs do fade over time as well, they just can do much higher brightnesses. Most pro CRTs were designed to operate at around 80-90 nits; any brighter and they'd start to bleed. They'd cap out at 150-200 nits absolute max, sometimes less. LCDs are much brighter. Most can handle 300-400 nits no problem. Hence they've got more range to deal with. Also they don't have problems being bright; they can function over their whole range without bleed.


Correction: LCDs do not fade over time; their backlights do, and backlights can be replaced.

LCDs are inferior to CRTs when it comes to:

color reproduction
viewing angles
refresh rate
image reproduction at various resolutions below its native res

CRTs are inferior to LCDs in:

geometry
performance in brightly lit areas
 
LCDs are inferior to CRTs when it comes to:

color reproduction

Not even close. The reason the sRGB standard is "72% NTSC" is that that's about all you can easily get tubes to reproduce. As the "72%" bit implies, it is not 100% of what the original NTSC spec called for. Plenty of CRTs would miss even the sRGB spec (the blue phosphor was the one I most often saw fall a bit short). I am aware of a couple of pro CRTs that could come close to 95%ish, but they were like $4000+ (NEC made one).

LCDs? Easy, you can get them with huge gamuts if you like. Also high end LCDs have 3D LUTs in them so you can precisely correct the colour on the unit itself, and you can map any gamut space at or under the native gamut you wish. CRTs can't hold a candle to that. Even cheap LCDs can do it these days. You can get 95% laptop screens if you want. Also a hardware calibrated LCD has fantastic colour detail, particularly if you give it an L* gamma curve (and non-power curves are no problem for LCDs since the LUT handles it). My screen has full shadow detail with no blown out highlights, I love it.

I liked the colour on my LaCie Electron22BlueIV, but it didn't compare to my NEC 2690 (which is what I still use).
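
For anyone curious what an L* target actually looks like next to a plain power-law gamma, here's a quick sketch of the standard CIE lightness math; the numbers are generic targets, not pulled from any particular monitor or calibration package:

```python
# Target relative luminance for an L* curve vs. a plain 2.2 power curve,
# the kind of table a hardware LUT calibration is built from.
def lstar_to_luminance(signal):
    """Map a normalized input (0..1) to relative luminance so perceived lightness rises linearly."""
    L = 100.0 * signal
    return ((L + 16) / 116) ** 3 if L > 8 else L / 903.3

def power_to_luminance(signal, gamma=2.2):
    return signal ** gamma

for v in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"input {v:.2f}  L*: {lstar_to_luminance(v):.4f}  gamma 2.2: {power_to_luminance(v):.4f}")
```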

refresh rate

That kinda depends. I am aware of CRTs that can pull refresh rates above any LCD, I've seen 200Hz ones, but at high resolutions? No, LCDs win that battle these days. You can get 120Hz 1920x1080 LCDs and I don't know of any CRTs that are stable at that refresh at that kind of rez. Also you had to be careful about pushing CRT refresh. Push it too high and though the CRT could display it, image quality would suffer (hence lower recommended refresh rates).

image reproduction at various resolutions below its native res

Depends on the LCD, and depends on what you are after. While CRTs adjust their scanning to change pixel size, LCDs do digital scaling. Which is better? Depends on what you want. I am a huge fan of NEC's scaler, it gives an extremely nice scaled up image, often nicer than a CRT I feel. Very smooth and clean. Then when you start talking low resolutions, CRTs fall over. You start getting scan lines because the scan rate of the low rez is so low. No problem on a good LCD, it upsamples things cleanly.


Also in the "LCDs win on" category you need to add size, weight, and power. For size it is both not being so bulky, and being available larger. A 24" LCD is a mid sized one, not tiny, but not the biggest by far. 30" are easy to get, and they do come bigger for 4k monitors. A 24" CRT was as big as they ever got for professional stuff that I'm aware of.

Weight and power are obvious, with weight being one of the limiting factors on bigger CRTs.

Really, LCDs do have a number of advantages over CRTs. It isn't the lopsided situation some people like to pretend, where people only liked them because they aren't as bulky. That's a part of it to be sure, but once LCDs developed, there were a number of image advantages. Not coincidentally, that is when they started getting more popular, and when they started displacing CRTs in the pro arena (which was the end for them). Time was, if you did serious graphic design work you had a CRT; LCDs weren't good for it. Now you'd be silly not to have a good IPS (or maybe PVA) LCD for it, you'll get far better results.
 
^ lol okay at the LCD being better at color reproduction... might want to pass the weed over here so I can smoke some of that.

LCDs have discrete steps in their color palette, thus making it impossible for them to reproduce the true color spectrum. No way around this... CRTs are analog and thus can reproduce any color, as you can infinitely control each gun's output.

Go watch an HD CRT and an LCD side by side and then tell me which one is better... CRT black levels are so far superior, it's not even funny. Not to mention no artifacting... they handle motion like a dream...

Digital scaling sucks ass and will NEVER come close to the CRT in that respect. Just because you "feel" something does not make it fact.

Ever notice color banding on an LCD? That's due to its limitations when it comes to reproducing colors.
 
LCDs have discrete steps in their color palette, thus making it impossible for them to reproduce the true color spectrum. No way around this... CRTs are analog and thus can reproduce any color, as you can infinitely control each gun's output.
Any 'analog' device only offers so much granularity. It's easy to say that analog = infinite resolution, but practical constraints prohibit that from being a genuine reality.

Take the vinyl record, for instance. It's analog, and can therefore, in theory, offer infinite resolution. In practice, though, because an atom of polyvinyl chloride is not infinitely small, a record cannot offer infinite resolution. Various measurements have pegged it as having less resolution than an ordinary Redbook audio CD.

Besides, a 10-bit panel can correctly display billions of colors and incredibly smooth gradations. I'm not sure what you suspect would be 'missing' from a 10-bit LCD.
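
For reference, the raw counts behind "billions" are just arithmetic (nothing panel-specific):

```python
# Distinct colors for a given per-channel bit depth (three channels: R, G, B).
for bits in (8, 10):
    print(f"{bits}-bit per channel: {(2 ** bits) ** 3:,} colors")
# 8-bit per channel: 16,777,216 colors
# 10-bit per channel: 1,073,741,824 colors
```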
 
LCDs do fade over time as well, they just can do much higher brightnesses. Most pro CRTs were designed to operate at around 80-90 nits; any brighter and they'd start to bleed. They'd cap out at 150-200 nits absolute max, sometimes less. LCDs are much brighter. Most can handle 300-400 nits no problem. Hence they've got more range to deal with. Also they don't have problems being bright; they can function over their whole range without bleed.

I have noticed a slow decline in some of the really old LCDs I use (though the slight flickering on grey screens is worse!). Probably time I upgraded...

Also in the "LCDs win on" category you need to add size, weight, and power. For size it is both not being so bulky, and being available larger. A 24" LCD is a mid sized one, not tiny, but not the biggest by far. 30" are easy to get, and they do come bigger for 4k monitors. A 24" CRT was as big as they ever got for professional stuff that I'm aware of.

Also need flatness (most CRTs are curved, which makes certain activities troublesome, particularly things where you need to judge angles) and connectors (digital [mostly] vs analog).
All analog cables have some level of noise and are more likely to face interference.
We're also moving on to 4K/UHD (possibly by next year for UHD :D), which would be troublesome on a pure analog system, where the highest resolution/framerate analog connector does something like 3840×2400@33. Going higher would probably get bulkier and bulkier.
 
If you don't like the way text appears in Windows, you should definitely experiment with its ClearType Text Tuner. If that isn't set or tuned to your liking, it could be seriously annoying.
 
LCDs are inferior to CRTs when it comes to:

color reproduction (debatable but yes for the most part)
viewing angles
refresh rate (LCDs don't flicker even at low refresh rates, and there are 120Hz LCDs)
response time (more important IMO than refresh rate)
input lag
image reproduction at various resolutions below its native res.


CRTs are inferior to LCDs in:

geometry
performance in brightly lit areas
focus
convergence
size
portability
digital inputs (DVI-A is basically VGA)
aspect ratio
price
 
LCDs are inferior to CRTs when it comes to:

color reproduction (debatable but yes for the most part)
viewing angles
refresh rate (LCDs don't flicker even at low refresh rates, and there are 120Hz LCDs)
response time (more important IMO than refresh rate)
input lag
image reproduction at various resolutions below its native res.


CRTs are inferior to LCDs in:

geometry
performance in brightly lit areas
focus
convergence
size
portability
digital inputs (DVI-A is basically VGA)
aspect ratio
price

But you can pick up some good CRTs for free or cheap these days. I have a ViewSonic CRT on my guest computer that does 1792 x 1344 at a 75 Hz refresh rate, and it was free.
 
Enough with the thread-jacking. This is not a CRT vs LCD thread. There are plenty of those already.
 
Just got a GTX680 and love it, but there is a difference in the font aliasing. Yes, I have done the ClearType tuning, but the text still looks a little fuzzier on the 680 with my HP2711x DVI connection. Not sure why AMD/ATI does a better job with text aliasing, but they do. I would not give up the 680 because of the slightly worse text quality, but I sure wish NVIDIA would look into this.
 
The ClearType setting I have on the GTX680 is the best of the available options during tuning. The ClearType setting I had on my Radeon 5850 was the best of the available options during tuning. The 5850 text looked better. Just to make sure I was not imagining things, I swapped cards, stuck the 5850 back in, and voila, the text quality instantly looked better. Now the 680 kicks the 5850's azz in BF3, but I do still need to do work in 2D, so it's frustrating that the text quality is slightly off on the 680.
 
According to MSDN website,[5] Microsoft acknowledges that "[t]ext that is rendered with ClearType can also appear significantly different when viewed by individuals with varying levels of color sensitivity. Some individuals can detect slight differences in color better than others."
This opinion is shared[6] by the font designer Thomas Phinney, program manager for fonts and core technologies at Adobe Systems:[7] "There is also considerable variation between individuals in their sensitivity to color fringing. Some people just notice it and are bothered by it a lot more than others."

http://en.wikipedia.org/wiki/ClearType (references on page)

So, at default color profiles, as most normal people are going to see it, there is the possibility of seeing fringing on one card or another. Not everyone has or would use professional calibration equipment.
 