2D image quality - does anyone pay attention anymore?

xoleras

2[H]4U
Joined
Oct 11, 2011
Messages
3,551
Okay, so I've been interchanging 7970s and 680s here and there, and am I crazy, or do some AIB makers have poor 2D quality? The thing I notice is this: with certain BRANDS of cards (not 680-specific), the image will warm up really fast on my monitor and leave a very, very slight pinkish tinge, like when the monitor gets very hot. I'm using a Samsung S27A850D. Yet using this same monitor with Sapphire 7970 boards, this doesn't happen - whites are white, and the monitor image doesn't get excessively "warm" with that very slight pink tinge.

I stress: it's slight. It's not something most people can spot, but you can tell the monitor has warmed up when it does this. When it is "cool," everything is in sharper focus; this happens on all screens.

Does anyone else notice this? The worst offenders for me have ALWAYS been factory-overvolted cards. The MSI Lightnings showed this, and the EVGA SC does as well. Really annoying, because some brands have this and some don't.
 
In b4 massive debate/trolling over AMD vs nVidia image quality...

:p

Personally I haven't noticed any difference other than the default settings on AMD possibly having more vibrant colors. Bump up the Digital Vibrance on nVidia by a notch or so and you get the same thing.
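For reference, a vibrance-style control is essentially a saturation boost. Here's a minimal Python/NumPy sketch (the Rec. 709 luma weights are standard, but the approach is illustrative only - it is not nVidia's actual Digital Vibrance algorithm, which reportedly boosts less-saturated pixels more than already-saturated ones):

```python
import numpy as np

def boost_vibrance(rgb, amount=0.15):
    """Push each pixel away from its own gray value.

    rgb    : float array in [0, 1], shape (..., 3)
    amount : 0 = no change; positive values increase saturation.
    """
    # Rec. 709 luma weights give a per-pixel gray reference.
    gray = rgb @ np.array([0.2126, 0.7152, 0.0722])
    # Moving away from gray raises saturation without shifting hue.
    out = gray[..., None] + (rgb - gray[..., None]) * (1.0 + amount)
    return np.clip(out, 0.0, 1.0)

# A muted orange becomes noticeably more saturated:
print(boost_vibrance(np.array([0.6, 0.5, 0.4]), amount=0.3))
```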
 

It's not an AMD vs. nVidia thing; it's brand-specific, and I see it on a lot of EVGA boards in particular. The monitor image just gets super warm fast; with some boards the image stays cool, so you don't get the slight tinge and slightly out-of-focus look. It's annoying because there is definitely a difference, and Digital Vibrance doesn't help.
 
Yeah, I know what you were saying; there had just been an influx of AMD vs. nVidia IQ stuff going around recently. ;)

I haven't noticed any major difference between brands, but I don't buy/resell a ton of video cards, either, so I couldn't say for sure.

For instance, though, I had an EVGA 570 as my primary in SLI with an MSI, later resold the EVGA, and ran the MSI alone for a while, and I saw no difference between the two in IQ.
 
Bought a 570, saw the 2D image quality, and was not impressed. Took it back and got a 6950. Maybe it's just me, but I could tell a difference. The 6950 was just simply better - better focus on fonts.
 
I've never seen that, personally.

I always chalk up those claims to placebo effect.
 
It's not a placebo effect. Some cards warm the monitor image a lot faster than others. When the monitor image gets hot, it tinges the picture and makes everything very slightly less focused.

I knew someone would chime in to say that we're confused and crazy. Anyway, it's easy to see when you stare at the same screen for hours to do work, looking at small fonts all day. The difference is perceptible when you stare at the same monitor for hours on end.
 
Switching from AMD to NVIDIA, the fonts seemed a bit different - almost like ClearType got turned off or something. Not sure if that makes sense.
 
I don't think he's talking about the antialiasing on the fonts. It's the fact that the colors seem a bit off (9300K turning into 6500K, where it gets reddish, aka warm), and this alone affects the readability of the fonts. I go back and forth between ATI and nVidia a lot and have noticed ATI's 2D quality to be superior, but I ignore it and can't tell a huge difference between the two, because I play WoW and ArmA all the time and they look fine to me. If I were a pixel junkie, this would bother me, but I've grown out of it and ignore it naturally. You're not crazy. But I can't confirm the difference between factory-overclocked cards and stock ones.
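To put rough numbers on the 9300K-vs-6500K point: lower correlated color temperature means relatively more red in the white point. Here's a Python sketch using Tanner Helland's widely circulated curve fit of the blackbody locus (the constants are from that approximation, accurate to a few percent; this is not a colorimetric standard):

```python
import math

def cct_to_rgb(kelvin):
    """Approximate 8-bit RGB white point for a color temperature.

    Constants from Tanner Helland's blackbody curve fit; valid
    roughly for 1000K-40000K and accurate to a few percent.
    """
    t = kelvin / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

print(cct_to_rgb(6500))  # ~(255, 254, 250): near neutral, slightly warm
print(cct_to_rgb(9300))  # ~(207, 221, 255): visibly bluish white
```

The red channel dominates as the temperature drops, which is exactly the "warm tinge" being described.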
 
Wow, xoleras! I thought I was the only person who noticed this. All my Nvidia cards have been EVGA, and I usually purchase their "FTW" edition. I did notice this immediately after I got my HD 5970. I always thought Internet Explorer had this pinkish tinge on the borders, which I assumed was the color theme of Windows Vista. Until I made the switch from Nvidia to AMD, I never knew the borders were in fact gray. I wonder if this is a monitor issue or a video card issue, because I do use my monitor's INF (from the HP website) for its color profile in Windows. I'm not 100% sure if Nvidia has added a color temperature setting to their control panel yet; I haven't touched an Nvidia product since my GTX 280 FTW.
 
Monitor quality also has some bearing on observing these kinds of variations. On a high-quality IPS that's color corrected and adjusted, someone is more likely to see the differences than on a non-adjusted, low-quality TN panel. When I went from Nvidia to AMD (a 260 to a 5870), I too saw better text, but mostly better colors in general with ATI. Now I am using an AMD 7970.
 
One of my computers has an Asus DirectCU II HD 7950; it's connected to a calibrated p-IPS LCD via HDMI.

The other computer has an Asus DirectCU II GTX 580; it's connected to a calibrated p-IPS LCD via DP.

I've not noticed a difference between them when simply viewing my desktop, browsing the internet, or looking at photos, vids and movies.
 
Yeah, I get that he's talking about the temp, but I'm just trying to say that the only difference I've noticed is the font aliasing and I've never noticed the temp being off. I guess someone like Blacklash would notice since he's got two different GPU setups.
 
If those are 10-bit panels, then the DP-connected one can get 10 bits/channel of information, while the one hooked up over HDMI will get 8 bits/channel. In any case, with both calibrated, they should be about the same quality.
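As a quick illustration of what that bit-depth difference means, here's a minimal Python sketch quantizing the same gray ramp to 8 and 10 bits per channel (pure arithmetic; no real display chain is modeled):

```python
import numpy as np

# A smooth gray ramp from 0.0 to 1.0, one sample per pixel column at 1920 wide.
ramp = np.linspace(0.0, 1.0, 1920)

def quantize(signal, bits):
    """Round a [0, 1] signal to 2**bits discrete levels."""
    levels = (1 << bits) - 1
    return np.round(signal * levels) / levels

print(np.unique(quantize(ramp, 8)).size)   # 256 distinct gray levels
print(np.unique(quantize(ramp, 10)).size)  # 1024 distinct gray levels
```

Four times as many steps per channel means much finer gradients, though the desktop itself is rendered at 8 bits unless the whole chain (application, OS, driver, link, and panel) supports 10.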

I don't know how a more modern Nvidia card compares to AMD today; the last Nvidia card I owned was a 260.
 
I have never noticed a difference in image quality between NVIDIA and AMD cards on digital connections.
 
This thread again.

-.-

Last time this came up, I said, "I should do some tests and write an article to put this to rest." This time, I did. We'll see if HardOCP wants to publish it; if not, I'll stick it on the forums.
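For anyone who wants to test this quantitatively rather than by eye: measure the white point of the same monitor on each card with a colorimeter, then compare the correlated color temperatures. Here's a sketch using McCamy's standard CCT approximation (the second xy reading below is a made-up placeholder, not a real measurement):

```python
def mccamy_cct(x, y):
    """McCamy's approximation: CIE 1931 xy chromaticity -> CCT in kelvin."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# Hypothetical white-point readings from the same panel on two cards:
print(round(mccamy_cct(0.3127, 0.3290)))  # ~6505 K: the D65 white point
print(round(mccamy_cct(0.3180, 0.3320)))  # ~6206 K: a measurably warmer white
```

A consistent CCT drop on one card would confirm the tinge; identical readings would point at the monitor instead.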
 
Well, when I moved from the 4870X2 to the HD 6950, I noticed a big change in image quality. But I haven't noticed any change since moving to the Nvidia GTX 680.

And to everyone here: I don't think the OP is making a distinction between AMD and Nvidia, but between the different cards that the AIBs put out.

To the OP: I'm not sure how many people would be in a position to check this out. Most people stick to the same brand name when buying two cards.
 
I felt that the image quality on my XFX card was worse than on my Sapphire cards. Also, coming from an 8800 GTS and an 8800 GTX to ATI cards, I felt the quality was just different - subjectively, it was better. I haven't had a chance to compare since then.
 
I went from a 470 SC to a 5870, and personally I don't notice a huge difference; maybe I just didn't look hard enough on the 470.

I also had an old Dell E207WFP and recently upgraded to a 120Hz 23", so I'm sure I would have noticed if I had kept the 470 on this new monitor.
 
I've always wondered about the effects of OC'ed cards on IQ.

It seems that there are no quantitative findings that would indicate there is actually a definitive effect on IQ. Stability - that's another discussion.
 
That could be it; I've noticed that factory-overvolted cards tend to warm the screen image up a lot faster, which contributes to the problem.
 
Exactly. I went from Nvidia to AMD and was shocked that the desktop was so much more vibrant and clear. Text in particular is great on AMD. I posted a thread and had tons of people tell me I was an idiot, etc. But some see it, others don't. It's not an "adjust Digital Vibrance" or ClearType settings problem either. Nvidia just sucks.

For the record, my Sapphire 6670 and my XFX 7770 Black OC look identical to me.
 