x1800xt or 7900gt?

Rabid Badger

2[H]4U
Joined
Mar 9, 2006
Messages
2,093
Ok, I've got $300 to spend on a new video card, and it's gonna be either an x1800xt or a 7900gt. I can't find a direct comparison of the performance of these two anywhere. What do you guys think? :confused:
 
x1800xt 512mb was my choice and I don't regret it.

HDR + AA = good times.

The X1800XT wins in most benchmarks anyway.

Also, if you do a search I think there are a couple threads about this same topic. You'll find some more opinions much faster doing that.
 
ryansebiz said:
7900gt

nvidia > ati

Uninformed fanb0y posts are not appreciated.

If you want to back your point up with some sort of fact or legitimate reason, that's fine; otherwise you're just wasting everyone's time since you won't convince anyone of anything.
 
wow, easy fellas.

the proof was already provided above my post; didn't think i needed to link again.

just an opinion boys, settle down.
 
Now that the x1800 is about the same price, I would guess the image quality would be slightly better at around the same speed. I'm an image quality man, so as long as the game is playable at around 35-60fps, I'm happy. I would get the x1800.
 
x1800xt 512....

it's as fast as the 7900gt but has better IQ and can do HDR + AA
 
Rabid Badger said:
Ok, I've got $300 to spend on a new video card, and it's gonna be either an x1800xt or a 7900gt. I can't find a direct comparison of the performance of these two anywhere. What do you guys think? :confused:

everything is always a trade-off. both perform very well, and which is best may be more determined by what's most important to you.

if heat, power draw, noise (and a generally more efficient design) are important, the 7900GT is the way to go.

if you prefer better texture filtering, or the ability to do both HDR and multisampled AA, then the XT is the way to go.

some other concerns right now are price and availability. most of the GTs have "spotty" availability, and some etailers are charging a premium due to this. many of the GTs are "overclocked" versions, and are selling for $330 on up. for that you can get a 512mb XT (the 256mb version is around $290).

performance-wise they are pretty close, tho the XT seems to edge out the GT (even beating the 7900GTX in a couple of games). ati's architecture seems to take less of a performance "hit" when running higher image quality settings (see the quick sketch at the end of this post), and due to this edges out nv in the image quality dept. this, of course, comes at the cost of increased power consumption and noise.

both cards offer fantastic performance in the $300 range, and which is best for you kind of depends on what features are most important, and of course on the price/availability of the GT, as overall value certainly hinges on the street prices available to you.

i went the x1800xt 512mb route, and am happy with it. i'm sure others have gone the GT route and are happy as well. in the end, they are comparable parts, and "which is better" for the most part comes down to "different strokes for different folks". imo either card is a solid choice.
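
just to put some rough numbers on what "taking less of a hit" means, here's a quick back-of-the-envelope sketch (python, with completely made-up fps figures for illustration only; they're not from any review or either card's actual results):

[code]
# hypothetical example: comparing the relative performance "hit" from
# enabling 4xAA/16xAF on two cards. the fps numbers below are invented
# purely to show the arithmetic, not taken from any benchmark.

def perf_hit(fps_plain, fps_quality):
    """percentage drop going from plain settings to high-IQ settings."""
    return (fps_plain - fps_quality) / fps_plain * 100

# (no AA/AF fps, 4xAA/16xAF fps) -- made-up numbers
cards = {
    "x1800xt (hypothetical)": (80.0, 60.0),
    "7900gt (hypothetical)": (85.0, 55.0),
}

for name, (plain, quality) in cards.items():
    print(f"{name}: {perf_hit(plain, quality):.1f}% drop with AA/AF enabled")
[/code]

with those made-up numbers the xt drops ~25% and the gt ~35%, which is the kind of gap people mean when they say one architecture "takes less of a hit". plug in fps from whichever reviews/games you actually care about.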
 
the only reason to get a 7900gt over an x1800xt would be if noise was a huge issue for you. Otherwise, just look at the links above.
 
Depends. If not overclocking, the X1800XT is the better choice, obviously. And it also has better IQ.

However, if you're into overclocking/modding, then the 7900GT is the better choice imo. You can get 7900GTX-level performance with just a little bit of volt modding.
 
prtzlboy said:
the only reason to get a 7900gt over an x1800xt would be if noise was a huge issue for you. Otherwise, just look at the links above.

From what I've seen the little cooler on the 7900GT is loud and whiny.
 
I went for the x1800XT (I fell in love with HQ AF). It's a loooooud sucker though, but I'm a quiet freak nowadays, VF900 on the way too :p
 
nobi125 said:
From what I've seen the little cooler on the 7900GT is loud and whiny.


Honestly I haven't heard either in person. Plus, it's totally subjective really :/ I'd just buy what performs best in the games you play for the price you can afford.
 
I would pony up the extra cash and go for the X1900XT. The X1800XT is no longer produced. It was basically ATI's red-headed stepchild.
 
For the love of god, quit saying the X1800 has better IQ. If you turn off the optimizations in the Nvidia control panel, like many review sites do, the IQ is identical. I've heard countless testimonies of Nvidia having better IQ, and of ATI having better IQ. They are both too similar to even form an opinion off of it. That is, unless you like to take a screenshot every 2 minutes, open it up in Photoshop, and increase the size by 10x. Then you might notice a little something...

As some people have said, both are solid cards. You're not gonna be able to squeeze a lot of solid performance from HDR+AA in Oblivion and newer games with an X1800, so I would take that feature lightly. With that being said, I would just do a Google search for "7900 gt review" and/or "x1800xt review", check out which games you play, and see which card is the higher performer.

However, after owning my fair share of both, I would have to agree that Nvidia's drivers are what take the cake. Solid interface, no MS .NET Framework, application profiles, digital vibrance, and 8XS AA for those older games.
 
entre nous said:
For the love of god, quit saying the X1800 has better IQ. If you turn off the optimizations in the Nvidia control panel, like many review sites do, the IQ is identical. I've heard countless testimonies of Nvidia having better IQ, and of ATI having better IQ. They are both too similar to even form an opinion off of it. That is, unless you like to take a screenshot every 2 minutes, open it up in Photoshop, and increase the size by 10x. Then you might notice a little something...

As some people have said, both are solid cards. You're not gonna be able to squeeze a lot of solid performance from HDR+AA in Oblivion and newer games with an X1800, so I would take that feature lightly. With that being said, I would just do a Google search for "7900 gt review" and/or "x1800xt review", check out which games you play, and see which card is the higher performer.

However, after owning my fair share of both, I would have to agree that Nvidia's drivers are what take the cake. Solid interface, no MS .NET Framework, application profiles, digital vibrance, and 8XS AA for those older games.


Yet those reviews show that the x1800 takes a lower hit on performance when you up the eye candy, and you're forgetting HQ AF too. You can easily skip CCC by using Ati tray tools.

Edit: A highly overclocked 7900GT would seem way better, but that's $30 more than a 512MB x1800XT. :rolleyes:
 
entre nous said:
For the love of god, quit saying the X1800 has better IQ. If you turn off the optimizations in the Nvidia control panel, like many review sites do, the IQ is identical. I've heard countless testimonies of Nvidia having better IQ, and of ATI having better IQ. They are both too similar to even form an opinion off of it. That is, unless you like to take a screenshot every 2 minutes, open it up in Photoshop, and increase the size by 10x. Then you might notice a little something...

As some people have said, both are solid cards. You're not gonna be able to squeeze a lot of solid performance from HDR+AA in Oblivion and newer games with an X1800, so I would take that feature lightly. With that being said, I would just do a Google search for "7900 gt review" and/or "x1800xt review", check out which games you play, and see which card is the higher performer.

However, after owning my fair share of both, I would have to agree that Nvidia's drivers are what take the cake. Solid interface, no MS .NET Framework, application profiles, digital vibrance, and 8XS AA for those older games.

Check out what people mean by better IQ before trying to trash what they say. ATI cards tend to have less of a performance hit when enabling AA and AF, and support higher levels of it. That's what is meant by better image quality. Would you not argue that AA looks better than no AA, or that AF looks better than no AF?
 
nobi125 said:
Check out what people mean by better IQ before trying to trash what they say. ATI cards tend to have less of a performance hit when enabling AA and AF, and support higher levels of it. That's what is meant by better image quality. Would you not argue that AA looks better than no AA, or that AF looks better than no AF?

Having better performance when enabling AA & AF does not mean better image quality. Image quality would be comparing the ground with 16X AF enabled on an Nvidia and ATI, and seeing which is sharper. Image quality would be the texture shimmering BF2 gets when Nvidia's optimizations are on. http://www.pcper.com/article.php?aid=40&type=expert&pid=4

The ATI has better AA & AF performance, yes, but that doesn't mean their IQ is better. Say the ATI card gets 40 FPS with both enabled yet the Nvidia gets 30 FPS. Both are completely playable, yet since the ATI has better performance their image quality is higher? I don't think so.
 
entre nous said:
Having better performance when enabling AA & AF does not mean better image quality. Image quality would be comparing the ground with 16X AF enabled on an Nvidia and ATI, and seeing which is sharper. Image quality would be the texture shimmering BF2 gets when Nvidia's optimizations are on. http://www.pcper.com/article.php?aid=40&type=expert&pid=4

The ATI has better AA & AF performance, yes, but that doesn't mean their IQ is better. Say the ATI card gets 40 FPS with both enabled yet the Nvidia gets 30 FPS. Both are completely playable, yet since the ATI has better performance their image quality is higher? I don't think so.

...but you're forgetting HQ AF. So at least that point stands.
 
The_Dark_Abyss said:
Depends. If not overclocking, the X1800XT is the better choice, obviously. And it also has better IQ.

However, if you're into overclocking/modding, then the 7900GT is the better choice imo. You can get 7900GTX-level performance with just a little bit of volt modding.

not sure i'd agree with that. i've yet to see an XT that hasn't overclocked to 700mhz/1.8ghz (that's what i'm running at and i haven't even tried any higher).

prtzlboy said:
Honestly I haven't heard either in person. Plus, it's totally subjective really :/ I'd just buy what performs best in the games you play for the price you can afford.

heh.. i don't think it's really subjective unless you include deaf people :p

there's a definite audio presence there all the time letting you know the 1800xt is working, and this is compounded by the fact the fan changes speed/tone often.

entre nous said:
For the love of god, quit saying the X1800 has better IQ. If you turn off the optimizations in the Nvidia control panel, like many review sites do, the IQ is identical. I've heard countless testimonies of Nvidia having better IQ, and of ATI having better IQ. They are both too similar to even form an opinion off of it. That is, unless you like to take a screenshot every 2 minutes, open it up in Photoshop, and increase the size by 10x. Then you might notice a little something...

this is not at all true. NV cannot match the texture filtering offered by the x1k series of cards. there's a good argument to be made that NV's Transparency AA mode is a bit better than ATI's Adaptive AA, and their 8xssaa is better than the 6xmsaa offered by ati (the problem is that outside of older titles, g70 doesn't have the horsepower to run 8xssaa).

while the differences in AA modes may take a photoshop comparison to spot, the differences in AF are easily noticeable.

entre nous said:
As some people have said, both are solid cards. You're not gonna be able to squeeze a lot of solid performance from HDR+AA in Oblivion and newer games with an X1800, so I would take that feature lightly

there is no reason to take it "lightly". it performs fine, and looks great. the only reason anyone would say "take it lightly" is if they just can't do it and they want to make themselves feel better :p

entre nous said:
However, after owning my fair share of both, I would have to agree that Nvidia's drivers are what take the cake. Solid interface, no MS .NET Framework, application profiles, digital vibrance, and 8XS AA for those older games.

dot net framework is not a big deal, and frankly if you're dead set against it, you don't have to use it. it just changes the control panel interface (ati tray tools is a cool addition).

digital vibrance may or may not be a useful feature. i don't like it, don't use it, and never have. i suppose that could be subjective. ATI, however, does offer a similar feature in Avivo.

profiles are available in ATI's drivers as well. while i prefer the way NV's drivers implement it (tho that's subjective; i've just been using NV's method for almost 2yrs), ATI certainly does not "lack it" as you claim.

added:

entre nous said:
Having better performance when enabling AA & AF does not mean better image quality. Image quality would be comparing the ground with 16X AF enabled on an Nvidia and ATI, and seeing which is sharper. Image quality would be the texture shimmering BF2 gets when Nvidia's optimizations are on. http://www.pcper.com/article.php?aid=40&type=expert&pid=4

first of all, you're ignoring HQAF, which does offer an easily noticeable improvement in texture filtering (a mode nv cannot match).

secondly, while the implementation is limited, the ability to have BOTH HDR and MSAA offers another IQ mode which is "better", and cannot be matched by NV unless HDR is implemented via the shaders (as valve has shown).

entre nous said:
The ATI has better AA & AF performance, yes, but that doesn't mean their IQ is better. Say the ATI card gets 40 FPS with both enabled yet the Nvidia gets 30 FPS. Both are completely playable, yet since the ATI has better performance their image quality is higher? I don't think so.

it certainly makes a difference when ATI gets 30 FPS and NV gets 20 FPS, as is the case with oblivion.
 
entre nous said:
Having better performance when enabling AA & AF does not mean better image quality. Image quality would be comparing the ground with 16X AF enabled on an Nvidia and ATI, and seeing which is sharper. Image quality would be the texture shimmering BF2 gets when Nvidia's optimizations are on. http://www.pcper.com/article.php?aid=40&type=expert&pid=4

The ATI has better AA & AF performance, yes, but that doesn't mean their IQ is better. Say the ATI card gets 40 FPS with both enabled yet the Nvidia gets 30 FPS. Both are completely playable, yet since the ATI has better performance their image quality is higher? I don't think so.

You're not making any sense, man. Better performance means you can turn the quality settings up higher, which in turn gives better image quality. Simple as that.

Are you trying to say that 16x HQAF playable on ATI does not give better IQ than 8xAF on NV? That's ridiculous.

People don't claim ATI IQ is better because we're getting residuals; it's from lots of subjective viewing, which in the end is all that matters.
 
Trying to say ATI has "better image quality" just because they can run 5-15 FPS better with 4XAA & 16XAF in Oblivion, BF2 and Fear doesn't make sense. So out of those three games, ATI has better image quality because it can perform better. So in Quake 4, Doom 3, Far Cry HDR, HL2: Lost Coast, etc., since the 7900 can perform better with 4XAA & 16XAF, its image quality is better? Even though 16XAF may actually look sharper on one card or the other? That doesn't make any sense.

To put it simply, comparing image quality would be comparing ATI's and Nvidia's 4X AA, to make it a fair comparison, and noticing that ATI's AA is a tad bit sharper than Nvidia's. Thus ATI's image quality would be better than Nvidia's. Not saying this is the case, but I've seen scenarios where it's true.

You can't just say, "Oh, ATI's image quality is better because they can do HDR+AA." That is a feature set. Sure, it may look better than any Nvidia shot, but that is not a fair comparison of image quality when the filtering methods differ, and I doubt any review would say "ATI has superior image quality because it has the ability to do HDR+AA." Well, no shit. But what if its 16XAF is blurry compared to Nvidia's 16XAF (as an example)?

Same goes for ATI's HQAF and Nvidia's 8xS AA. Of course comparing image quality per feature is gonna give you massive differences. I think it's only fair to compare image quality per filtering method and judge which is better. I know where everyone is coming from, but comparing different features doesn't make for a fair comparison...

So, since image quality is based on performance per feature, the X1900XTX performs better in Oblivion, Fear, and BF2. The 7900 GTX performs better in all the rest. Thus, the 7900 GTX has better image quality. Per everyone's reasoning.
 
entre nous said:
Trying to say ATI has "better image quality" just because they can run 5-15 FPS better with 4XAA & 16XAF in Oblivion, BF2 and Fear doesn't make sense. So out of those three games, ATI has better image quality because it can perform better.

That's not what people are saying. Your post could've stopped right there because you're on the wrong track.

People are saying ATI IQ is better because: 1) ATI's filtering method is better, 2) ATI can do AA+HDR, and 3) ATI performs better, allowing you to turn the quality settings up higher.
 
entre nous said:
You can't just say, "Oh, ATI's image quality is better because they can do HDR+AA." That is a feature set. Sure, it may look better than any Nvidia shot, but that is not a fair comparison of image quality when the filtering methods differ, and I doubt any review would say "ATI has superior image quality because it has the ability to do HDR+AA." Well, no shit. But what if its 16XAF is blurry compared to Nvidia's 16XAF (as an example)?

Same goes for ATI's HQAF and Nvidia's 8xS AA. Of course comparing image quality per feature is gonna give you massive differences. I think it's only fair to compare image quality per filtering method and judge which is better. I know where everyone is coming from, but comparing different features doesn't make for a fair comparison...

So, since image quality is based on performance per feature, the X1900XTX performs better in Oblivion, Fear, and BF2. The 7900 GTX performs better in all the rest. Thus, the 7900 GTX has better image quality. Per everyone's reasoning.

Features definitely count. If one company has a feature that another company does not even provide, then it makes sense to say that it has higher quality through diversity of features. No one said anything about one company's 4xAA being better than another company's 4xAA.
 
entre nous said:
Same goes for ATI's HQAF and Nvidia's 8xS AA. Of course comparing image quality per feature is gonna give you massive differences. I think it's only fair to compare image quality per filtering method and judge which is better. I know where everyone is coming from, but comparing different features doesn't make for a fair comparison...

It's not SUPPOSED to be fair. It's about what features and performance you can get for your money, bottom line.

Your posts are not helping the OP here.
 
So please, tell me why every single review out there, when they talk about "image quality", compares apples to apples to see which filtering method is better. They compare the same filter method so it is fair. Now, why do they do that? I'm just saying all these people who say ATI has better image quality are wrong. You'll want to compare features for what you guys are talking about.

http://www.firingsquad.com/hardware/imagequality2/page3.asp
http://www.pcper.com/article.php?aid=40&type=expert&pid=3
http://www.driverheaven.net/articles/driverIQ/
http://www.xbitlabs.com/articles/video/display/doom-3-tests_3.html

are just a few of the many review sites that compare image quality. They compare it filtering method to filtering method. I'm just trying to say to the OP that these people who say ATI has better image quality are not using the term right. ATI does not have better image quality. ATI has HDR+AA, HQAF. Nvidia has 8XSAA. Reviewers don't even test image quality anymore because the two are identical.
 
Well, if you compare shots at certain angles in games, Nvidia's aniso or trilinear filtering doesn't look as good as ATI's.

Play with both cards and you will know ATI has better image quality.
 
entre nous said:
ATI does not have better image quality. ATI has HDR+AA, HQAF. Nvidia has 8XSAA.

So essentially you're just arguing semantics.

Yes, ATI has HDR+AA and HQAF, which give better overall image quality.
 
entre nous said:
Having better performance when enabling AA & AF does not mean better image quality. Image quality would be comparing the ground with 16X AF enabled on an Nvidia and ATI, and seeing which is sharper. Image quality would be the texture shimmering BF2 gets when Nvidia's optimizations are on. http://www.pcper.com/article.php?aid=40&type=expert&pid=4

The ATI has better AA & AF performance, yes, but that doesn't mean their IQ is better. Say the ATI card gets 40 FPS with both enabled yet the Nvidia gets 30 FPS. Both are completely playable, yet since the ATI has better performance their image quality is higher? I don't think so.

Not having AA or AF on because it doesn't run well enough = less image quality.

Not to mention that it has higher levels of AF.
 
Thanks for the input, guys. Especially CaiNaM and Stereophile. I think I'm going to go with the x1800xt; it seems to be the better all-around performer, and since I play my games with hi-fi earphones, a little whining at high temps won't bother me. And the new ATI 3200 motherboard chipset seems to be pretty awesome. Of course, if I get that, I'll eventually have to get a second x1800xt, just because I can. No wonder I'm always broke. :rolleyes:
 