Image Quality for X1950XTX vs 7950GX2 article

Bo_Fox

Here's a wonderful XbitLabs article comparing the X1950XTX against the 7950GX2 at "equal" image quality settings and also at highest quality image settings.

http://www.xbitlabs.com/articles/video/display/quality_vs_quantity.html

This is one of the best video comparison articles I have seen in a long time!

The reason I put "equal" in quotes is that most hardware review sites still benchmark Nvidia cards at the default "Quality" image setting, which is sub-par compared to ATI's image quality. ATI's cards do not exhibit the texture shimmering or noise that is very distracting and noticeable (almost as bad as screen tearing!) on Nvidia cards at default settings.

So, for high-end card comparisons, it is only logical that we start testing Nvidia cards at the "High Quality" image setting. The performance penalty can be as great as 30-40% for Nvidia in some games, such as Battlefield 2 and UT2004, though not as bad in others. It actually causes the 7950GX2 to score lower than the X1950XTX in 3DMark06!

Nvidia has gotten away with unfair comparisons to ATI's cards for a couple of years, and it is time we brought this to light and put an end to it. Now that the G80 and R600 are due to be released shortly, it probably will not continue, since there is tremendous pressure on Nvidia to implement high-quality anisotropic filtering once again, just like ATI did with the X1xxx series. The ugly thing is that Nvidia has largely gotten away with it while only a few people noticed this unfair advantage from optimizations over the past year or so. Consequently, many of us suffered texture shimmering/noise.

After changing my Nvidia settings to "High Quality" in order to get rid of the horrible shimmering, I noticed that with my 7900GTX I had to lower the setting from TR Supersampling AA down to TR Multisampling AA in many of the newer games at 1920x1200, while on my X1900XTX nearly all of those games were smoothly playable with Quality Adaptive AA, without having to lower it to Performance AAA.

If you want to see how bad the texture shimmering is at default settings on Nvidia cards compared to ATI, here's a download link for a Battlefield 2 video clip:

EDIT: http://www.filefactory.com/file/5aeee2/

(After clicking the above link, scroll down to where you see "download for free with filefactory basic", then click on it.)

or you could check out other comparison video clips from Xbitlabs:
http://www.xbitlabs.com/articles/video/display/ati-x1950xtx_8.html

Enjoy!
 
I can't get the video to download from the filefactory website link.
 
biggles said:
I can't get the video to download from the filefactory website link.

All right, here's a link from Xbitlabs... I tried linking directly to the file itself, but I guess that didn't work for you.

http://www.filefactory.com/file/5aeee2/

Scroll down to where you see "download for free with filefactory basic", then click on it.
 
I got error messages when trying to download new codecs for Windows Media Player. I can't view the file, but I can see it is only 30 seconds long.
 
Thanks, Bo, for bringing up the article. Seems like solid information.
 
For those of us who have actually used both of these cards for extended periods of time (not just the f@nboys who pimp the GX2 in all of their posts even though they own a 7800GS), this is nothing new. It's nice to see somebody besides [H] do a real-world review instead of just seeing which card gets more FPS and calling it a day.
 
biggles said:
I got error messages when trying to download new codecs for Windows Media Player. I can't view the file, but I can see it is only 30 seconds long.

You probably need DivX, if I'm not mistaken. There's a free player at www.divx.com, btw.
 
If shimmering in a particular game is bad, you only have to disable one optimization: Anisotropic Sample Optimization. There is no need to run High Quality; there is really no point in disabling the other, non-intrusive optimizations as well.
If Xbit had taken 10 minutes to try disabling the optimizations one by one, they would have discovered which one causes the shimmering, and it isn't Trilinear like they say. Here are 3 BF2 videos, uncompressed, taken with Fraps (you will need Fraps installed to view them). Why Xbit is using compressed videos as their examples is beyond me; even uncompressed, the quality isn't near what you see in-game.

1a.BF2 Q.avi (79.10 MB)
Download Link: http://www.filesend.net/download.php?f=1aa4e42fbb898826638349be6c29bf24

2a.BF2 Q.ASO OFF.avi (75.89 MB)
Download Link: http://www.filesend.net/download.php?f=19ccc81ca2c3845382b74f4b0e63a28a

3a.BF2 HQ.avi (73.37 MB)
Download Link: http://www.filesend.net/download.php?f=52f1ecb0b6fc19cf287a6735d2b4b6a5
 
dagon11985 said:
I'm surprised people still bring up shimmering... it's laughable, really.

Why is it laughable? Nvidia takes a huge performance hit once you get rid of the shimmering and bring the IQ up to ATI's level...
 
As an owner of a 7950GX2, I can tell you that you do not take a 40% hit from enabling High Quality... what a bunch of crap.
 
dagon11985 said:
I'm surprised people still bring up shimmering... it's laughable, really.

No, it's not funny when most hardware review sites still benchmark high-end Nvidia cards at Nvidia's recommended default setting ("Quality" instead of "High Quality"), which exhibits heavy texture shimmering.

It's about time that a popular American site finally made a good article on this whole issue.
 
Faction said:
As an owner of a 7950GX2, I can tell you that you do not take a 40% hit from enabling High Quality... what a bunch of crap.

Let the numbers speak for themselves (nearly a 50% performance hit!):

[IMG]http://www.xbitlabs.com/images/video/quality_quantity/diagrams/titan_4x_corrected.gif[/IMG]

Found at: http://www.xbitlabs.com/articles/video/display/quality_vs_quantity_14.html

I benchmarked my own 7900GTX on some of the UT2004 levels a while ago and found that enabling the High Quality setting does indeed bring up to a 40% performance penalty while getting rid of the ugly mipmap shimmering that is so distracting.
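
If you want to sanity-check these percentages yourself, the arithmetic is simple. Here's a minimal sketch (the FPS numbers below are made up for illustration, not measured results):

[code]
# Percentage performance hit from forcing "High Quality" instead of "Quality".
# The FPS figures are hypothetical placeholders, not measured results.
def perf_hit(quality_fps: float, high_quality_fps: float) -> float:
    """Return the percentage drop going from Quality to High Quality."""
    return (1.0 - high_quality_fps / quality_fps) * 100.0

q_fps, hq_fps = 80.0, 48.0  # e.g., 80 FPS at Quality, 48 FPS at High Quality
print(f"Performance hit: {perf_hit(q_fps, hq_fps):.1f}%")  # -> 40.0%
[/code]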


Also, 3DMark06 shows a 30% performance hit:

[IMG]http://www.xbitlabs.com/images/video/quality_quantity/diagrams/6dm_total.gif[/IMG]

Found at: http://www.xbitlabs.com/articles/video/display/quality_vs_quantity_18.html

ATI's default "Typical" settings are equal to Nvidia's "High Quality" settings. (Note that ATI's typical setting is really Quality AF.)
 
So hey, what about the image quality difference with one crucial feature...something that drove me to dump my own X1900XT for a 7950GX2? Namely, aspect ratio scaling. ATI's image quality sucks hardcore when you can't even get 1600x1200 to display properly in 4:3...
 
Daggah said:
So hey, what about the image quality difference with one crucial feature...something that drove me to dump my own X1900XT for a 7950GX2? Namely, aspect ratio scaling. ATI's image quality sucks hardcore when you can't even get 1600x1200 to display properly in 4:3...

That has nothing to do with IQ... Isn't 1600x1200 a 4:3 aspect ratio?
 
Will somebody post an image of "shimmering"? Even though I think I know what you are talking about, I want to make sure. I have only really used Nvidia, and I would like to see if the grass is greener on the ATI side.
 
Marvelous said:
That has nothing to do with IQ... Isn't 1600x1200 a 4:3 aspect ratio?

It has everything to do with IQ. On a widescreen display, ATI has no way of forcing 4:3. nVidia does. The result is that on ATI video cards, it's impossible to get proper aspect ratio scaling, which looks shit ugly...hence, poor image quality.
 
Daggah said:
It has everything to do with IQ. On a widescreen display, ATI has no way of forcing 4:3. nVidia does. The result is that on ATI video cards, it's impossible to get proper aspect ratio scaling, which looks shit ugly...hence, poor image quality.

LOL.. It has nothing to do with Image Quality... And 1600x1200 is a 4:3 aspect ratio...

Now if you're talking about widescreen, that would be 1680x1050... ATI has an option to change the aspect ratio... And why would you game at 1600x1200 on a 1680x1050 monitor? :confused:

You should change your games to match your monitor's aspect ratio...

http://www.widescreengamingforum.com/
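
If the ratio math is confusing, here's a tiny sketch that reduces any resolution to its simplest aspect ratio (plain arithmetic, nothing vendor-specific):

[code]
# Reduce a resolution to its simplest aspect ratio using the GCD.
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

for w, h in [(1600, 1200), (1680, 1050), (1920, 1200), (1280, 1024)]:
    print(f"{w}x{h} -> {aspect_ratio(w, h)}")
# 1600x1200 -> 4:3, 1680x1050 -> 8:5 (i.e. 16:10),
# 1920x1200 -> 8:5, 1280x1024 -> 5:4
[/code]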
 
ATI's CCC does not scale 4:3 properly; it will always stretch the image.

It's been broken for a long time now: "centered timings" does not scale 4:3 properly.

Here's a post I made a while back:

2 weeks ago: Nvidia 6800GT on a NEC 20WMGX2. Half-Life 2, set to 4:3 @ 1280x1024. Checked the option in the Nvidia driver to center the display. Worked like a charm: letterboxed, not stretched.

5 days ago: ATI X1900XTX (now on Cat 6.5) on the same NEC LCD. Half-Life 2, set to 4:3 @ 1280x1024. Checked the option in CCC for centered timings. Image is still stretched.
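
For reference, here's the math that proper aspect-ratio scaling should be doing. This is a minimal sketch of the concept, not ATI's or Nvidia's actual driver code:

[code]
# Largest aspect-correct rectangle for a source mode on a panel, centered
# with black bars on either side. A sketch of the concept only.
def aspect_scale(src_w: int, src_h: int, panel_w: int, panel_h: int):
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    x_off, y_off = (panel_w - out_w) // 2, (panel_h - out_h) // 2
    return out_w, out_h, x_off, y_off

# 1280x1024 (5:4) on a 1920x1200 (16:10) panel:
print(aspect_scale(1280, 1024, 1920, 1200))  # (1500, 1200, 210, 0)
[/code]

In other words, I should be getting a centered 1500x1200 image with 210-pixel bars on each side; what I actually get is 1280x1024 stretched to the full 1920x1200.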
 
squishy said:
ATI's CCC does not scale 4:3 properly; it will always stretch the image.

It's been broken for a long time now: "centered timings" does not scale 4:3 properly.

Either way, you should change your GAMES to play properly... They're easily manipulated, and that site has instructions for almost all games...
 
squishy said:
It's really not the point.

But it is... It has nothing to do with image quality... If you wanted to play games at 1600x1200, then you should have bought a 1600x1200 monitor... And why would you want black bars in your games? I would rather use the whole screen...
 
Am I the only one who notices that the chain-link fence, the one in line with the field of view, disappears on ATi cards in the HL2 screenshots?
 
I agree, ATI image scaling sucks. But I have a question,

WHY would you buy a widescreen monitor, a graphics card that can handle widescreen resolutions, and play games that have native widescreen support AT 4:3?

Help me understand :p
 
chris.c said:
I agree, ATI image scaling sucks. But I have a question,

WHY would you buy a widescreen monitor, a graphics card that can handle widescreen resolutions, and play games that have native widescreen support AT 4:3?

Help me understand :p

That's what I'm saying.. :confused:

It works with other widescreen resolutions, but it doesn't work with a different aspect ratio... Eventually it's going to be added to their drivers... If enough people whine, I'm sure we'll get our wish... But I don't think people are too worried about it, since most games support widescreen or can be changed to widescreen...
 
Not all games give you the option, especially older games. HL2 does, however, so I'm not terribly sure why he would want to center-scale that one. He is correct, though: ATi's drivers stretch the image regardless of what setting you're using. If you have a Dell display, you can use the monitor's own aspect scaling to get it back under control. If not, you're stuck.

purgatory said:
If shimmering in a particular game is bad, you only have to disable one optimization: Anisotropic Sample Optimization. There is no need to run High Quality; there is really no point in disabling the other, non-intrusive optimizations as well.
I believe he's correct here. However, nobody has challenged him yet. Why? Why is it "fair" to disable any and all optimizations on nVidia cards while letting ATi perform whatever optimizations it desires, even if one switch seems to fix the majority of the shimmering issues? I'm all for a level playing field here, but this doesn't appear to be one; it appears to be (intentionally or not) the crippling of a particular brand with one seemingly massive, uneducated click.
 
chris.c said:
I agree, ATI image scaling sucks. But I have a question,

WHY would you buy a widescreen monitor, a graphics card that can handle widescreen resolutions, and play games that have native widescreen support AT 4:3?

Help me understand :p

There are still games (even newer ones) that don't properly support widescreen resolutions. For example, Need for Speed: Most Wanted doesn't... in fact, EA games in particular still don't. And yes, shame on EA for that, but it doesn't change the fact that non-widescreen games are out there. My monitor's native res is 1920x1200, by the way... the corresponding 4:3 resolution would be 1600x1200.

Either way, it's still a feature that ATI is sorely lacking in, especially when so many widescreen displays themselves don't include the ability to select proper aspect ratio scaling.
 
chris.c said:
I agree, ATI image scaling sucks. But I have a question,

WHY would you buy a widescreen monitor, a graphics card that can handle widescreen resolutions, and play games that have native widescreen support AT 4:3?

Help me understand :p

Uh, it's really not hard to understand at all. Most new games support WS, which is why I bought one (love it). But some older games I like to play don't support WS and don't even have a hack for it (via the WS gaming forum); and when there is a hack, half the time it just stretches the image, which looks like poo.

I do think this is off-topic WRT IQ though. It's just something Nvidia does correctly and ATI does not.
 
Daggah said:
There are still games (even newer ones) that don't properly support widescreen resolutions. My monitor's native res is 1920x1200, by the way... the corresponding 4:3 resolution would be 1600x1200.

Either way, it's still a feature that ATI is sorely lacking in, especially when so many widescreen displays themselves don't include the ability to select proper aspect ratio scaling.

What games? I haven't run into a single one that doesn't support widescreen or can't be manipulated into playing in widescreen...
 
Marvelous said:
What games? I haven't run into a single one that doesn't support widescreen or can't be manipulated into playing in widescreen...

Battlefield 2 and NFS:MW are two popular examples... and do note that, from my perspective, if a widescreen solution requires a hack that ends up cutting off part of the image, as happens in many games, I do not consider it viable. As a matter of fact, I'm personally not fond of any solution that requires more than editing a configuration file (or, ideally, just selecting the widescreen resolution in-game).
 
Marvelous said:
What games? I haven't run into a single one that doesn't support widescreen or can't be manipulated into playing in widescreen...

By "properly support" I think he means games that handle widescreen as Hor+ (a wider horizontal FOV) only, excluding Vert- (a cropped vertical FOV) and plain stretching.
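
For anyone unfamiliar with those terms, the difference is just perspective-projection math. Here's a quick sketch with a made-up 60-degree base vertical FOV (Hor+ keeps the vertical FOV fixed and widens the horizontal one):

[code]
# Hor+ in numbers: horizontal FOV derived from a fixed vertical FOV.
# Standard perspective-projection math; the 60-degree base is just an example.
from math import atan, degrees, radians, tan

def hfov(vfov_deg: float, aspect: float) -> float:
    return degrees(2 * atan(tan(radians(vfov_deg) / 2) * aspect))

base_vfov = 60.0  # hypothetical vertical FOV shared by both aspect ratios
print(f"4:3   -> {hfov(base_vfov, 4 / 3):.1f} deg horizontal")   # ~75.2
print(f"16:10 -> {hfov(base_vfov, 16 / 10):.1f} deg horizontal") # ~85.5
[/code]

A Vert- game does the opposite: it keeps the roughly 75-degree horizontal FOV and shrinks the vertical one, so on a widescreen monitor you actually see less of the world, not more.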
 
Marvelous said:
What games? I haven't run into a single one that doesn't support widescreen or can't be manipulated into playing in widescreen...
The list is massive: the Thief series, Steam-less Half-Life, the Fallout series, other Black Isle games, Diablo (in fact, almost every top-down strategy game or RPG), and older Need for Speed titles (such as NFS4 and Porsche Unleashed).
 
squishy said:
Uh, it's really not hard to understand at all. Most new games support WS, which is why I bought one (love it). But some older games I like to play don't support WS and don't even have a hack for it (via the WS gaming forum); and when there is a hack, half the time it just stretches the image, which looks like poo.

I do think this is off-topic WRT IQ though. It's just something Nvidia does correctly and ATI does not.

Welcome to the 21st century... Damn, I miss those old DOS games...
 
phide said:
The list is massive: the Thief series, Steam-less Half-Life, the Fallout series, other Black Isle games, Diablo (in fact, almost every top-down strategy game or RPG), and older Need for Speed titles (such as NFS4 and Porsche Unleashed).

Who the hell plays 10-year-old games? Those games are as useless to me as DOS games were when I had Windows XP...
 
I play old games. There's nothing wrong with older titles, especially some of the old isometric RPGs: good clean fun there, and ATi drivers seem to support them well.

Thief doesn't look too bad with 6xAA/16x HQ AF, if I do say so myself.
 
Daggah said:
Battlefield 2 and NFS:MW are two popular examples... and do note that, from my perspective, if a widescreen solution requires a hack that ends up cutting off part of the image, as happens in many games, I do not consider it viable. As a matter of fact, I'm personally not fond of any solution that requires more than editing a configuration file (or, ideally, just selecting the widescreen resolution in-game).

I have no problem playing Battlefield 2 in widescreen... A tiny bit of the top and bottom is cut off, yes... But I would rather play like that than have black bars on the sides of the screen cutting my total screen real estate by 1/3 or 1/4 or whatever...

It's really only a few old games that don't support widescreen, and you're most likely never going to touch them... Even then, you can still play them in 4:3... I don't know what the deal is. If that sways you away from a video card, then we should all still be playing DOS games...

This topic was about ATI and Nvidia image quality... Stay on topic... But most of the time it turns into ATI vs Nvidia.. :D
 