Image quality ATi vs nvidia

v6maro

OK, first off, I know this can get heated, but let's not do that. I simply want to know which card outputs the BEST visual image, and why/how it does.

(driver-wise)
What settings do you use for ati to show the best image?
What settings do you use for nvidia to show the best image?

Does anyone have any side-by-side comparisons to actually show the difference?

Please don't turn this into a pissing match of ATi vs. Nvidia. I like them both, I'm not a !!!!!!, I just want to see the actual differences. I don't care if either card takes a performance hit for whatever setting; I don't care if it looks the best and only runs at 5 FPS. I just want to see the difference, thanks :)
 
ATi's X1k cards provide the best IQ as far as gaming cards go. Set 16x AF, enable High Quality AF and switch Catalyst AI off. I owned a 7800GTX before switching to an X1900XT, so this is not the biased opinion of a !!!!!!. The extent of the difference depends on the game/environment.
 
rincewind said:
ATi's X1k cards provide the best IQ as far as gaming cards go. Set 16x AF, enable High Quality AF and switch Catalyst AI off. I owned a 7800GTX before switching to an X1900XT, so this is not the biased opinion of a !!!!!!. The extent of the difference depends on the game/environment.

Could you expand on games such as FEAR and CS: Source? I'm very curious... I've never heard it properly from an unbiased person...

thnx :)
 
I own a 7900GT, but I have seen the XTX in action; my friend owns one. Playing games like FEAR, HL2, CoD, and Battlefield, ATI definitely has better image quality. It's not overwhelming, but the difference is there. I feel I made the wrong decision. :(
 
ATI has ONLY one edge: no shimmering.

We can throw in AA+HDR and HQAF, but HDR is rarely used and HQAF is not a default setting.
 
Vette5885 said:
I noticed a drop in image quality when I went from my 9800 Pro to my 6800 GT

I'm kind of like you. I had an X600 Pro at one point and then went to a 6800 vanilla. The picture the X600 produced just seemed sharper than the one the 6800 produced. Nothing against the 6800; it just seems like the picture is a little "washed out." It's nothing major, but it can still be noticed.
 
chinesepiratefood said:
ATI has ONLY one edge: no shimmering.

We can throw in AA+HDR and HQAF, but HDR is rarely used and HQAF is not a default setting.

Isn't this a bit of backward thinking? It would be foolish not to use HQAF since it's only a 1-2% performance hit. And may I ask, what does it matter whether HQAF is a default setting or not?
 
Mayhs said:
I've never heard it properly from an unbiased person...

ATI’s HQ AF really helps with image quality in an outdoor game like BF2 when compared to NVIDIA’s way of doing filtering. There are a lot of angles to the ground textures in BF2 and having the filtering performed in ATI’s superior way can indeed provide a noticeably better visual quality and overall better gaming experience. You can find forum arguments all over the Net about Red Vs. Green when it comes to texture filtering, but this is one situation where ATI’s technology clearly leads in the real-world gaming experience battle.

http://enthusiast.hardocp.com/article.html?art=MTAwMSwxMCwsaGVudGh1c2lhc3Q=
 
chinesepiratefood said:
ATI has ONLY one edge: no shimmering.

We can throw in AA+HDR and HQAF, but HDR is rarely used and HQAF is not a default setting.

Continue making yourself appear incapable of being objective by making excuses. Then go on and acknowledge, only to quickly discount, the competitor's advantage. This is an example of being "that guy." So freaking old. Nv's IQ advantage is? Non-existent.

HDR is one of the biggest things to hit in 2006. No one wants to go without AA either, especially not on their $300-$500 graphics card. What a different tune was sung when Nv was the best HDR show in town. But I digress.

Where this loses any semblance of credibility is with the HQ AF not being default. What's Nv's default again? ATi's HQ AF is angle-independent AF. This is something Nv cards are incapable of doing at all. Nv optimizations can never be completely shut off. There's no technical reason ATi with HQ AF off might not still compete well with Nv's High Quality. Myself and many others feel it does, at least in motion. But let's not leave this an arguable point here: ATi with angle-independent AF off does not immediately tank down to Nv's "Quality" setting, period. Not maybe, not sorta; it doesn't.

In a perfect world, optimizations go unseen and unnoticeable. At your precious defaults, I could fire up BF2 and tell the difference between a GF7 and an X1k series card from... I dunno... maybe 7 or 8 feet. It's "just" AF though, right? :rolleyes: This is irrefutable, dude. An Nv guy using defaults as a crutch is using crutches made of matchsticks. They simply chose to be less aggressive in this regard. The option to finally get down to completely pure AF is icing. For IQ whores it will come down to Nv's High Quality vs. HQ AF anyway, and Quality leaves a bit too much shimmering for these people. Do not pass go. Do not collect $200.
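For anyone wondering what "angle-independent" actually buys you, here's a toy sketch of the idea (the curve and the numbers are made up purely for illustration; this is not ATi's or Nv's actual hardware logic):

```python
import math

def effective_aniso(surface_angle_deg, requested=16, angle_independent=False):
    """Toy model of how much anisotropic filtering actually gets applied."""
    if angle_independent:
        # HQ AF style: the full requested degree at every surface angle.
        return requested
    # Angle-dependent style: the full degree only near the preferred angles
    # (0, 45, 90 degrees), cut back in between -- which is why a wall tilted
    # at an odd angle shows the blurry bands in the screenshots.
    penalty = abs(math.sin(math.radians(4 * surface_angle_deg)))
    return max(2, round(requested / (1 + 7 * penalty)))

for angle in (0, 22.5, 45, 67.5, 90):
    print(angle, effective_aniso(angle), effective_aniso(angle, angle_independent=True))
```

The exact sweet spots and how hard the degree gets cut differ per chip; the point is just that a tilted wall like the ones in the BF2/HL2 shots lands in the cut-back zone on angle-dependent hardware.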

See, your post isn't the blatant flag-waving material you could spot after "Spot the Overzealous Fans 101." We'll call it Spot the Overzealous Fans 201. You can't acknowledge your favorite company's competitor's largest advantage. What's up with that? Nv cards run with less power, dole out less heat, are more efficient in SLI, have arguably (won't hear me argue, though) better drivers, and usually get more raw FPS. But a bunch of people weigh the pros and cons and then pick their choice, only afterwards to put down every con they bought into. Maybe this will be easier on any Nvidiots that might read this: hell, put down another advantage or two and there is no reason to go ATi at all. Nv guys bought the shimmering, quieter, and cooler-running card. You lost IQ doing it. Deal with it.

I went from a 7800GTX 256 to an X1900XT, so I know this all too well. There are times I wish I'd waited for the 7900, for plenty of reasons; IQ isn't one of them. It's painfully obvious who has had both ATi and Nv within this generation when these arguments come up. Easier than spotting shimmies in BF2. ;)
 
Sure, ATI quality may be better (looking at screenshots; I never owned an ATI card), but for me, a 7800GT was leaps and bounds better than anything else in its price range (now the X1800XT is $300... not so a few months back). I'd rather play at a higher resolution than have slightly better IQ, so I went with Nvidia.
 
dchrsf said:
Are you kidding me?

Load these 2 separate images in different tabs in Firefox, zoomed somewhere around the center or near the wall, and switch back and forth between the tabs. Without even looking at the file names you will know which one is ATI.

http://www.bit-tech.net/content_images/asus_bfg_msi_geforce_7900_gtx_roundup/1920_ati_hqafb.png
http://www.bit-tech.net/content_images/asus_bfg_msi_geforce_7900_gtx_roundup/1920_nv_hqb.png

Yep, I can see it. Like I said, the ATI one is sharper while the Nvidia one is a little "washed out." I'm saying this in the most unbiased way I can. Hell, I'm thinking of getting a 7900GT in the next week or so!!!
 
This is one of the best examples of what angle-independent HQAF can do in favour of ATi:

1128280140ABTiXJphEC_8_12_l.jpg


Follow the tilted wall...
 
chinesepiratefood said:
ATI has ONLY one edge: no shimmering.

We can throw in AA+HDR and HQAF, but HDR is rarely used and HQAF is not a default setting.

That simple edge, resulting from angle-independent filtering, allows ATi to set the LOD bias to a value that increases sharpness in all areas, even to the point that AF over 2x isn't needed for things to look detailed and sharp.

That's a pretty big edge.
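To put the LOD bias point in concrete terms, here's the standard mipmap-selection math (the bias values are illustrative only; I'm not claiming these are either driver's actual defaults):

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, num_mips=12):
    """lambda = log2(texel footprint) + bias, clamped to the valid mip range."""
    lam = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(lam, 0.0), num_mips - 1)

# A more negative bias picks a higher-resolution (sharper) mip for the same
# pixel footprint; push it too far and textures start to shimmer in motion.
print(mip_level(8.0, lod_bias=0.0))   # 3.0
print(mip_level(8.0, lod_bias=-0.5))  # 2.5 (sharper)
print(mip_level(8.0, lod_bias=+0.5))  # 3.5 (blurrier)
```

That trade-off between sharpness and shimmer is exactly the edge being described above: a sharper effective bias that still holds up in motion.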
 
They are good features; mine is a purely unbiased opinion, as I own multiple ATI and NVIDIA cards.

But what I am referring to is that if they are set to the same settings, the only true difference is the annoying shimmering on NVIDIA cards. If you guys have to flame over something said about your blessed video card company, go ahead, I guess...
 
My brother is a video card expert. He says he will be getting a motherboard that can accept 2 video cards. These 2 video cards will be hooked up to one monitor to double the graphics speed. Where can I find something like this? :confused:
 
luke101 said:
My brother is a video card expert. He says he will be getting a motherboard that can accept 2 video cards. These 2 video cards will be hooked up to one monitor to double the graphics speed. Where can I find something like this? :confused:

In your own thread ;)
 
chinesepiratefood said:
They are good features; mine is a purely unbiased opinion, as I own multiple ATI and NVIDIA cards.

But what I am referring to is that if they are set to the same settings, the only true difference is the annoying shimmering on NVIDIA cards. If you guys have to flame over something said about your blessed video card company, go ahead, I guess...

Well, I apologize if I read it wrong. You just put ONLY in big bold text, then went on to mention HDR only to quickly discount it. The defaults thing sounded like a reach when, as we know, Nv's defaults are actually pretty damn aggressive. Meh... sorry then. It's just that normally that's what's going on when people mention advantages and, before even finishing the sentence, it's no big deal or whatever. That's why I bring up the guys weighing pros and cons and then coming back and valiantly defending their choice at all costs. It's like they know what the deal is, but rather than "Option A had 8 pros and 3 cons for me," it's "Option A had 8 pros for me and 3 meaningless drivel bullshit drawbacks!" :p

I dunno. Guess I should apologize for misreading the text, but it's certainly confusing. Now, the shimmering and stuff is annoying. However, your first post totally reads like it's meant to do nothing other than acknowledge and quickly strike down the IQ differences: it's ONLY AF, HDR is barely used, and for HQ AF you have to... check a box. Like I ranted on, that last one is surely the most confusing of all. Unless combo boxes are that much better, Quality in the Nv panel is, well, lacking.
 
chinesepiratefood said:
They are good features; mine is a purely unbiased opinion, as I own multiple ATI and NVIDIA cards.

But what I am referring to is that if they are set to the same settings, the only true difference is the annoying shimmering on NVIDIA cards. If you guys have to flame over something said about your blessed video card company, go ahead, I guess...

Wrong. Set bone stock against bone stock, Nvidia has a more aggressive LOD bias. This reduces detail no matter the settings, stock or not.

But, in all honesty, why anyone would run an ATi card at anything less than HQ AF is beyond me, and I believe that is even the default setting. (I have made direct comparisons from the 6800GT to the X800XT and the X1900XTX in my computer, and I can tell you, the ATi is better at any setting.)
 
luke101 said:
I don't understand!

Your brother is right: you throw two cards in and they can split the load. For Nvidia this is called SLI. Many, many cards are SLI capable, from low-mid range to high end. You will want a motherboard with the Nvidia SLI chipset. The cards need to be very similar, same chip and features; not necessarily the same brand, but you don't mix a 6800GT with a 7800GT, for example. ATi's version is Crossfire, and you'll need a Crossfire board. They stick to the higher end for dual-card solutions. One of the cards will need to be a 'master' card, though. The X1900 Crossfire Edition (the master card) comes in at X1900XT speeds, so an XTX may downclock itself; an XT would be the right fit.

That's it in a nutshell. I guess the "your own thread" comment was meant as "please don't derail the thread." Understandable. I just wanted to give you some nuggets so you can find everything you need. People just don't want this to turn into a Q&A session about something else. As we're geeks, we're trying to talk about deeper geek things. :p
 
Apple740 said:
This is one of the best examples of what angle-independent HQAF can do in favour of ATi:

1128280140ABTiXJphEC_8_12_l.jpg


Follow the tilted wall...

That basically puts this thread to rest... if you can't see the IQ difference you must be BLIND!
 
Yes, nVidia has always been biased more toward performance than image quality. Thanks to ATI, they aren't as bad as they would be without competition. I have the 7800 GTs because stupid ATI STILL REFUSES to support stereo 3D glasses, which makes nVidia's drivers way superior anyway... And I was happy with the 9700 Pro's IQ; I'd say the 7800 is quite similar, actually. No HDR+AA though, which is annoying.
 
Xeero said:

Huh? Did you even read the article?

F.E.A.R.:
The Sapphire Radeon X1900XTX was as fast as the reference GeForce 7900 GTX, but the filtering quality provided some improvements in image quality.

DoD:
The Radeon X1900XTX was playable at similar settings with the addition of the high quality anisotropic filtering setting. This did help to improve the image quality somewhat, as there were some areas that did appear to lack anisotropic filtering on NVIDIA's GeForce 7900 GTX.

From BF2:
The Radeon X1900XTX was able to play the game at the same resolution as the two GeForce 7900 GTX frequencies that we've tested here. We found that Battlefield 2 was playable at 1920x1200 4xQAAA 8xHQ AF with all details set to their maximum. High quality anisotropic filtering made a huge difference in this title. Even with high quality driver settings, there were still some quite large differences in texture filtering quality - this was most noticeable on gravel or grass-textured areas.

AOE3:
Of course, the Radeon X1900XT and Radeon X1900XTX both support HDR and antialiasing, meaning that we were able to apply antialiasing without much of a performance hit. We were able to play the game at 1920x1200 with 2xAA 16xHQ AF and maximum in-game details, including HDR bloom. The Radeon X1900XTX was able to maintain a smooth frame rate with performance adaptive antialiasing applied too. Sapphire's X1900XTX gave a relatively unparalleled gaming experience in Age of Empires III in all honesty - the image quality was damn good.

Conclusion:
If you've got a GeForce 6-series or GeForce 7-series product and you're happy with the image quality that it delivers with either the quality or high quality driver settings, you'll be perfectly happy with the quality delivered by the GeForce 7900 GTX. However, there are some cases where the Radeon X1900-series - with high quality anisotropic filtering enabled - delivers noticeable image quality benefits.


The GeForce 7900 GTX trades blows with the Radeon X1900XTX in our selection of popular titles, but doesn't deliver a knock out blow. In that respect, neither does ATI. NVIDIA has succeeded on a number of fronts, but we feel that they've missed out on at least one important feature in their move to 90 nanometres at the high end. We feel that NVIDIA's hardware suffers from having lower realistically playable image quality - high quality anisotropic filtering is a given on the Radeon X1900-series cards, meaning that they are in a different league on the image quality front.

If you're looking for the best-looking pixels in the business, you should be using an ATI Radeon X1900XT or Radeon X1900XTX.

That's not even bringing up the pictures, which they say are more noticeable when playing the game.

So after reading the article, how did you come to the conclusion that "image quality appears to be identical."?
 
I just think it's amazing I can use HQ AF without taking any hit in performance (that I can feel, anyway) :D

EDIT: Btw guys, I don't get Catalyst AI yet. Isn't it supposed to make games faster? Yet when I turn it off, games run much better. Is it flawed?
 
Does AF have any use in a game like AOE3, 'cos there aren't those straight lines or whatever?
 
Just for some reinforcement, here are some pics I took, full size with 6x ADAA: one with standard 16x AF and one with 16x HQAF (same spot as in the [H] article, but I figured it would do some good to show it with AA).


Standard AF. For the record, I get about 54-55 FPS at this scene with these settings (lots of background tasks, so it's normally higher, but I just wanted to compare visual quality, not FPS).


16x HQAF: I get about 52-53 FPS. Not much of a hit at all for a much cleaner image.
 
Makeshift HQAF for NVIDIA cards is 8xS AA. Yes, I know it's unplayable at high resolutions, but I just found this when I was looking at some AA quality comparisons.

(Note: these are meant to compare AF on the left wall.)

4xAA



8xS AA
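Rough intuition on why a supersampled AA mode can stand in as makeshift AF (generic texturing math; I'm only assuming 8xS includes a supersampled component, exact sample layout aside):

```python
import math

def lod_shift(supersample_factor_per_axis):
    """Rendering at s times the resolution along an axis shrinks the per-pixel
    texel footprint by s, shifting the selected mip level by -log2(s)."""
    return -math.log2(supersample_factor_per_axis)

print(lod_shift(2))  # -1.0: one full mip level sharper along that axis
```

The catch is that you pay that cost over the whole frame, which is why it's unplayable at high resolutions, whereas AF spends the extra samples only on surfaces at oblique angles.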

 
fromage said:
It's not really that big of a difference.
It's a big difference that becomes even more noticeable during gameplay. I am underwhelmed with my 6600GT (which MSI replaced a fried 9800 Pro with under warranty), and the shimmering drives me CRAZY.
 
Another vote for ATI, even though I currently own a 7800GT. I won't end up buying Nvidia again; the difference is just too noticeable. ATI's stuff is just noticeably sharper in some way. I came from a 9800 Pro.
 
chinesepiratefood said:
Makeshift HQAF for NVIDIA cards is 8xS AA. Yes, I know it's unplayable at high resolutions, but I just found this when I was looking at some AA quality comparisons.

(Note: these are meant to compare AF on the left wall.)

4xAA



8xS AA


8xSSAA helps in the tilted part at the end vs 4xAA, but it makes the whole wall a bit less sharp too. It blurs the wall textures.
 
chinesepiratefood said:
Makeshift HQAF for NVIDIA cards is 8xS AA. Yes, I know it's unplayable at high resolutions, but I just found this when I was looking at some AA quality comparisons.

(Note: these are meant to compare AF on the left wall.)

4xAA



8xS AA


Call me crazy, but I can notice a difference in quality, sadly. Oh well, this still isn't stopping me from buying a 7600GT!!!
 
Mrwang said:
I own a 7900GT, but I have seen the XTX in action; my friend owns one. Playing games like FEAR, HL2, CoD, and Battlefield, ATI definitely has better image quality. It's not overwhelming, but the difference is there. I feel I made the wrong decision. :(

But isn't the XTX prohibitively expensive compared to the 7900GT? That has to count for something.
 