A Real Test of nVidia vs AMD 2D Image Quality

Ok, this is a bit of an older thread and is going in a theoretical direction, but since there seem to be a lot of people here who have a clue about monitors, ClearType, etc., I'll ask here.

My new secondary monitor is a 24" Dell IPS. I got it to use in Portrait mode for reading and such. Problem is that everything looks nice and clear on it when it's oriented in Landscape mode (even if the monitor itself is physically mounted in Portrait orientation). But as soon as I use software (nVidia control panel or Windows 7 monitor settings) to rotate the image to Portrait mode, the image quality goes way down. It's not just the text - everything becomes less well defined, and there seems to be a faint yellow discoloration. It's apparent around black letters on a white background and in other areas.

I have so far tried connecting this monitor to the integrated graphics port (HD 4000 through my motherboard via dual-link DVI) and directly to my GTX 670's second DVI port - same thing. Tried turning off ClearType - that makes it worse, or at best doesn't help. Tried turning off software scaling and such in the nVidia control panel - no noticeable improvement.

I can't figure it out... The monitor is on a desk-clamp mount, sitting in portrait mode permanently. So all I need to do is click through some menus to flip the image back and forth between landscape and portrait and watch its quality deteriorate and go back to normal. Is there anything I can do?
 
IIRC this has something to do with the way the pixels are shaped on some IPS screens, chevron if I recall right. They just don't tend to look quite right when using default portrait mode settings, I'm not sure if it's fixable.
 
My thinking exactly. The subpixel geometry just does not work well with 90-degree rotated rendering.
 
You're showing your age. And your age is young. This has nothing to do with you; claims of Nvidia fudging their 2D desktop quality started around the 6800 series, maybe even before then.

2D image quality claims between AMD and NVIDIA are as old as the Radeon and GeForce names. And such debates are even older than that in general; they pretty much existed from the early days of EGA / CGA and VGA. I recall debates on such matters with the Voodoo series and NVIDIA Riva TNT cards. The Radeon vs. GeForce argument specifically dates back to the late 1990s / early 2000s. The debate probably reached its peak around the Radeon 9x00 series and GeForce FX days. At least that's when I remember the most heated debate for those brands. And back when CRTs were the thing we all pretty much used, there was a difference: RAMDACs were different, and image quality even varied by brand sometimes, even with the same chipset.

These days the cards all practically use the same reference designs and are only made in a handful of factories owned by one or two companies. Plus as the OP said it's all digital now. Off or on. 0 or 1. That's pretty much all there is to it.
 
My thinking exactly. The subpixel geometry just does not work well with 90-degree rotated rendering.

No, it works fine when the image is oriented in Landscape mode and the monitor is rotated 90 degrees to portrait mode, so it's not the viewing angle or anything like that. It's purely a software thing. When the monitor is stationary in physical portrait mode, swapping the image between Portrait and Landscape degrades and restores the quality.
 
I'm not talking about the viewing angle so much as pixel geometry and the orientation of text and images relative to the monitor's display. If the RGB subpixels are laid out a certain way, software rotation of the display by 90 degrees could very well produce some ugly results.

Here's a good example of various layouts that have been used in products.
http://en.wikipedia.org/wiki/File:Pixel_geometry_01_Pengo.jpg

ClearType (or nVidia software, or AMD software, whatever) may be making some assumptions about the subpixel layout that are (1) true when displaying in landscape mode but (2) false when rotating the virtual display (not the monitor) into portrait mode. Another example:
http://en.wikipedia.org/wiki/File:Subpixel-rendering-RGB.png

If you were to rotate the word 'sample' by 90 degrees, then the RGB subpixels would no longer be aligned vertically with respect to the word's orientation (which I'm sure the software assumes). That's likely why software rotation produces such poor results.
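Here's a rough sketch of what I mean, in Python. This is a made-up renderer and panel model, not ClearType's actual algorithm - just the "horizontal R,G,B stripes along the text baseline" assumption boiled down:

```python
# Minimal sketch (hypothetical, not ClearType's real code) of subpixel AA
# that assumes the panel's subpixels run R,G,B left-to-right along the text.
import numpy as np

def render_stroke_with_rgb_aa(width_px=5):
    """Render a thin vertical stroke with subpixel AA. The renderer assumes
    the physical subpixel order along the text's horizontal axis is R,G,B."""
    row = np.zeros((width_px, 3))
    row[2] = [1, 1, 1]    # stroke body: all three subpixels of pixel 2 lit
    row[1, 2] = 0.5       # left edge sample lands on pixel 1's B stripe
    row[3, 0] = 0.5       # right edge sample lands on pixel 3's R stripe
    return row            # per-pixel RGB values the card will scan out

def physical_subpixels(row_rgb, panel_order="RGB"):
    """The sequence of stripes the panel actually lights, left to right."""
    order = {"RGB": [0, 1, 2], "BGR": [2, 1, 0]}[panel_order]
    return np.concatenate([px[order] for px in row_rgb])

row = render_stroke_with_rgb_aa()
print(physical_subpixels(row))
# In landscape, the half-lit B and R samples fall on real B and R stripes
# right next to the stroke, so the edge averages out to a neutral gray.
# After a 90-degree *software* rotation the edge runs across pixel rows,
# where the panel has no R,G,B alternation at all, so those single-channel
# samples show up as colored fringes (e.g. a yellowish cast where R+G are
# lit but B is not) - which is what the OP is describing.
```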

If you were talking about the monitor looking worse when physically rotating it into portrait orientation then we'd be talking about TN v. IPS all over again. But that is not what we are talking about and I definitely got that.
 
So here's MS Word on my nVidia GeForce GTX 560 Ti at work:

[attached screenshot: nvidia.png]


And here is MS Word on my co-worker's AMD Radeon HD 5750:

[attached screenshot: amd.png]

On my calibrated 2490, the AMD text looks better, sharper, and the contrast looks like it's higher... it's not placebo.
 
LCDs are inferior to CRT when it comes to:

color reproduction
viewing angles
refresh rate
image reproduction at various resolutions below its native res

CRTs are inferior to LCD in:

geometry
performance in brightly lit areas

lol, just lol.
 
On my calibrated 2490, the AMD text looks better, sharper, and the contrast looks like it's higher... it's not placebo.
You can't rule out placebo unless you've performed an A/B or ABX test under double-blind conditions, and done enough trials to establish a statistically significant result. In this case, though, you can just as easily take two photos of the same display under identical conditions and then algorithmically quantify their differences, if any.

If you haven't done either, all you're effectively saying is "I believe it's not placebo", which is kind of a useless statement, as placebo itself is a form of belief.
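If anyone actually wants to run an ABX session, the "statistically significant" part is just a one-sided binomial test. A quick sketch, pure standard library:

```python
# How many correct ABX picks it takes before "I can see a difference"
# stops being explainable as coin-flipping.
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Probability of getting >= `correct` right out of `trials` by pure
    guessing (50% per trial). A small p-value means the viewer is probably
    perceiving a real difference rather than guessing."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(abx_p_value(7, 10))   # ~0.17  -> 7/10 is entirely plausible by chance
print(abx_p_value(14, 16))  # ~0.002 -> very hard to explain as guessing
```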
 
On my calibrated 2490, the AMD text looks better, sharper, and the contrast looks like it's higher... it's not placebo.

Lay those two images over each other and do a difference in GIMP, then boost the contrast on the result... they are identical. I would post the result, but it's just a black rectangle. (edit: ah, someone already did that.)
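If you'd rather not eyeball a black rectangle, the same check is a few lines of Python (assuming Pillow and NumPy are installed, and that both screenshots are the same resolution; the file names below match the attachments above):

```python
# Pixel-wise difference of the two posted screenshots, scripted so the
# answer is a number instead of an eyeball call.
from PIL import Image
import numpy as np

a = np.asarray(Image.open("nvidia.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("amd.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)                     # int16 avoids uint8 wrap-around
print("max per-channel difference: ", diff.max())
print("mean per-channel difference:", diff.mean())
print("pixels that differ at all:  ", np.count_nonzero(diff.any(axis=-1)))
# Two identical captures give max 0 / mean 0.0 / 0 differing pixels -
# i.e. the "black rectangle" described above.
```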
 
Why did we bump a year old thread?

Because even today people are still realizing when they make the switch from Nvidia to AMD that their desktop just looks better. It's hard for people to admit until they've done it. It's also a major reason why I used AMD video cards for the last 4 years, until this past year when I finally switched to Nvidia for the 3D performance and drivers. It's a trade-off, but I still swear by my eyes that AMD just looks better in 2D applications.
 
Was brought up in another thread when someone made the same empty claim about 2D image IQ.

FYI, it's not an empty claim just because you may disagree with it. Is it really any different than you telling me what speakers sound best to my ears?
 
Because even today people are still realizing when they make the switch from Nvidia to AMD that their desktop just looks better. It's hard for people to admit until they've done it. It's also a major reason why I used AMD video cards for the last 4 years, until this past year when I finally switched to Nvidia for the 3D performance and drivers. It's a trade-off, but I still swear by my eyes that AMD just looks better in 2D applications.

I have gone back and forth between vendors for years, no visible difference. Dell U3011.
 
Because even today people are still realizing when they make the switch from Nvidia to AMD that their desktop just looks better.
They may be 'realizing' that, but the question is whether that realization is actual or is merely perceived. There's no evidence at all that I'm aware of to suggest that there are any actual differences, and all who claim differences — dramatic or otherwise — have certainly presented none.
 
FYI, it's not an empty claim just because you may disagree with it. Is it really any different than you telling me what speakers sound best to my ears?

Yes, yes it is. Raw digital display signal decoding is handled display-side; there is zero difference between what is sent by any vendor. Back in the days when RAMDACs and VGA were the primary methods of signal output, there was totally a measurable difference depending on the quality of DAC used by the vendor, which is of course where all of this started.

A speaker is subjective because each one is quantifiably different when measured for things like frequency response, so personal taste can be accounted for.

Just as the OP demonstrated with actual proper test information, there is no difference. Any difference you think is there is the result of a subconscious bias and nothing more.
 
Just as the OP demonstrated with actual proper test information, there is no difference. Any difference you think is there is the result of a subconscious bias and nothing more.

This is absolutely true.

Also, as one of the last great holdouts on a CRT, I can say this about analog image quality between card makers:

Crisp analog picture quality is a solved problem. Gone are the days when individual vendors jealously guarded their secret juju from competitors - everyone can get the same analog video quality thanks to shared information on the interwebs, so the quality of your filters is only limited by the desired board cost.

Any video card over $100 will have top-notch analog quality. Since I purchased my 6600 GT (and later an HD 4850 and a GTX 460), I have enjoyed arguably BETTER analog quality than even my vaunted G400 MAX did! Even integrated video has gotten better, forced by the VAST array of devices that will only accept VGA, and the doubling of *usable* screen resolutions with the arrival of the LCD (early integrated graphics looked EMBARRASSING on 1600x1200 panels!)

Given the almost universally high quality of the VGA interface, it astounds me to hear people claiming to no end that the DIGITAL outputs from cards vary. This is complete placebo effect in action - I have several digital monitors at work, all driven by different cards (Intel, Nvidia, AMD), and NOT ONE is noticeably different.
 
They may be 'realizing' that, but the question is whether that realization is actual or is merely perceived. There's no evidence at all that I'm aware of to suggest that there are any actual differences, and all who claim differences — dramatic or otherwise — have certainly presented none.

What evidence do you or anyone else have that Paradigm speakers sound better to my ears than Klipsch? Just making a point.
 
Yes, yes it is. Raw digital display signal decoding is handled display-side; there is zero difference between what is sent by any vendor. Back in the days when RAMDACs and VGA were the primary methods of signal output, there was totally a measurable difference depending on the quality of DAC used by the vendor, which is of course where all of this started.

A speaker is subjective because each one is quantifiably different when measured for things like frequency response, so personal taste can be accounted for.

Just as the OP demonstrated with actual proper test information, there is no difference. Any difference you think is there is the result of a subconscious bias and nothing more.

Wrong, you can have two speakers with very similar measured characteristics that still sound different to your ears. My point is that measurements are great but don't completely define an object when it comes to human sense perception. Tell me what foods taste best to me too while we're at it.
 
Wrong, you can have two speakers with very similar measured characteristics that still sound different to your ears. My point is that measurements are great but don't completely define an object when it comes to human sense perception. Tell me what foods taste best to me too while we're at it.

Similar, but not identical. This is my point.

Again, when the properties of the two images are identical in every conceivable way (and this has been demonstrated) then there is no way in which one can still look better than the other. There is no accounting for personal taste here.
 
Given the almost universally high quality of the VGA interface, it astounds me to hear people claiming to no end that the DIGITAL outputs from cards vary. This is complete placebo effect in action - I have several digital monitors at work, all driven by different cards (Intel, Nvidia, AMD), and NOT ONE is noticeably different.

I agree
 
What evidence do you or anyone else have that Paradigm speakers sound better to my ears than Klipsch? Just making a point.
As alluded to before, the output of a given Paradigm speaker is likely to be measurably different from the output of a Klipsch speaker (Klipsch's horns will pretty much guarantee measurable differences, even between very similarly designed speakers). There are measurable differences that may, but do not always, translate to perceptible differences.

If, however, you have two speakers, each measuring identically in all ways we know how to measure transducers, do you expect either to sound better than the other? If you find that one does, what do you believe is influencing your perceptions? If there are no mechanical differences between the two speakers, is it sensible to conclude that your perceptions are biased to one of the speakers?

Your misunderstanding is thinking that what you perceive as a difference is indicative of an actual difference. That isn't necessarily the case, and you can never correctly assume that's the case. You have to measure.
 