7900GTX vs. X1900XTX -- a true trade-off

Which card?

  • 7900GTX: 90 votes (36.4%)
  • X1900XTX: 157 votes (63.6%)
  • Total voters: 247
R1ckCa1n said:
Granted, the 7800/7900 series cards are great, but what do they buy you right now?
1. Doom 3? Come on, this game is all but dead. And how far off are the two cards in modern OGL games? Not nearly the gap it used to be.
2. Multi-monitor support has always worked for me. The only posts you see are from the ones having problems (very few from a quick search), not from the ones who have no problems.
3. CCC isn't the best, but it works as needed. Most gamers don't notice CCC's shortcomings while gaming.
4. What gamer truly cares about Linux?
5. Your sound references are funny considering how many people dismissed the 5900's sound levels :eek: Yes, it is loud, but nothing that can't be fixed very cheaply. A quick search finds most X1900 users saying the card is not as loud as advertised by the NV marketing machine.
6. I might be stepping out on a limb, but Crossfire with a 3200 chipset is very easy and works great, and funnily enough Kyle found this to be true in his last Crossfire evaluation. I have yet to see many posts about Crossfire problems on the forums, so where is this data on how it is inferior? Odds are there are more posts on this forum about SLI issues than Crossfire issues.



HQAF should be a must for all gamers. Why sink $500 to $1,000 into a 7900 when you are getting a card that delivers inferior IQ? I, myself, was floored by the obvious difference when I bought my 7800GTX. This goes along with NV's marketing program that dismisses IQ for speed.

Now, considering all games are becoming shader intensive, the 7800/7900 series will continue to suffer. Why buy a new card today when it can't even exploit TODAY'S features?

To each his own, I guess.

1. Who said anything about Doom 3? I was just pointing out OpenGL as a weak point in ATi drivers. It's not a horrible weakness, it's just lame that they refuse to rewrite their OpenGL driver.
2. Multi-monitor *works* on ATi cards, but nView has about a million features Hydravision doesn't have. I use multi-monitors at home and at work, and nView just makes Hydravision look like someone's side project.
3. Plenty of people complain about the CCC all the time, as I'm sure you've noticed. It's not a show-stopper, just another weak point in ATi's drivers.
4. It's not about gamers, it's about driver quality. ATi has great D3D drivers...well...nVidia has great D3D drivers AND OpenGL drivers AND Linux drivers. What I'm saying here is, again, that nVidia's driver team simply does a better, more robust job.
5. I owned a 5900, and it wasn't that loud; I think you're thinking of the 5800 Ultra. And having sat next to both a 5900 and an X1900XTX, I can tell you unequivocally that the X1900XTX was definitely a hell of a lot louder and more annoying. Either way, what's the point in dredging up 4-year-old cards?
6. It doesn't scale as well. This was in over a dozen reviews: X1900XTX Crossfire doesn't scale as well as SLI. And you have to buy a master card, and if you have an X1900GT, the X1900 master card will disable some pipelines in order to run in Crossfire = lame. And there are more posts about SLI issues than Crossfire issues because there are approximately 2000 times as many people running SLI as Crossfire...just look at the Steam survey.

If HQAF came for free, then yeah, it should be a must...but at the cost of the noise and heat? Clearly a lot of people just don't care that much.

And don't bother bashing nVidia for dismissing quality for speed; ATi invented the art of over-optimization...funny how quick you are to bring up old nVidia cards, but I don't see you mentioning anything from ATi's not-so-glorious past.

To answer your last question, "Why buy a new card today when it can't even exploit TODAY'S features?", the exact thing I said before...because the majority of the population doesn't know what HQAF and HDR+AA are, much less care about using them. But everyone knows what "loud" and "hot" mean. Once again, this logic is not necessarily apparent to the rabid ATi !!!!!!s of the world, but it sure makes sense to somebody because the 7900s are selling like hotcakes.

The X1900XTX is a great card, the best in fact: http://www.gpureview.com/superlatives.php

But people have different priorities, and a lot would rather have a 7900GTX...
 
If I were in his shoes, I would buy neither. Instead, I would buy the 7900GT (the eVGA brand) and overclock it. I've never bought the top-end video card of its generation, just the 2nd-fastest when the specs are the same except for the core and memory clocks.

For instance, I bought a 6800GT back in July of 2004 and I immediately overclocked it beyond 6800 Ultra levels by modifying the BIOS (voltages, clock rates) and flashing it. It still runs well and I've never had any problems despite the higher voltage feed and clock rates. Of course, there is no guarantee one would get a high overclock out of the 7900GT, but with DX10 and Vista looming on the horizon, I think it's a good idea to buy the 7900GT and overclock it because it's cheaper and runs almost as well even at stock speeds.
 
^eMpTy^ said:
And don't bother bashing nVidia for dismissing quality for speed; ATi invented the art of over-optimization...funny how quick you are to bring up old nVidia cards, but I don't see you mentioning anything from ATi's not-so-glorious past.

Both companies have made mistakes, but this topic is about the X1900XTX and 7900GTX - not what has happened in the past.

^eMpTy^ said:
To answer your last question, "Why buy a new card today when it can't even exploit TODAY'S features?", the exact thing I said before...because the majority of the population doesn't know what HQAF and HDR+AA are, much less care about using them.

The same could be said of Linux.....

However, ATi can fix the issue in software (unlikely as that may be), whereas the 7900GTX will never do HDR+AA using the common method because it's a hardware limitation.

This ATi vs nV banter is irrelevant - we're supposed to be comparing cards, and on that front the poll speaks for itself...
 
5150Joker said:
The OGL performance isn't nearly as "lackluster" as you make it out to be. In the OGL games that actually matter, the XTX stays pegged at 60+ fps at high res with 4xAA/16xAF (e.g. Quake 4 at 1680x1050 4xAA/16xAF is what I use).
Try looking at more OpenGL games like Riddick or Pacific Fighters. Or 3D OpenGL applications like Maya.
If the CCC isn't someone's cup of tea, there are very good alternatives like ATi Tray Tools, even though there's nothing wrong with CCC itself.
Enough people have a problem with it to make it an issue all over most forums.
Linux doesn't mean jack for 99% of gamers out there.
Since when are you 99% of the gamers, and who elected you to represent us?

And as for Crossfire being inferior to SLi, if that's true, why does SLi get creamed at high resolutions + >8xAA? Obviously if someone is going to spend cash on a dual card solution, they're going to want to push the AA as high as possible and only Crossfire will give them that option because SLi has lackluster performance.
The majority of the benchmarks from reputable websites state otherwise.
Face it, nV lost this battle. Maybe they'll do a better job with G80, but I have my doubts.
A recent Tech Report article stated that the NV cards were outselling the ATI cards by 4 to 1. A recent Steam survey showed SLI had nearly 100% of the dual card market.

If that's losing the battle, then it's winning the war.
 
PRIME1 said:
Try looking at more OpenGL games like Riddick or Pacific Fighters. Or 3D OpenGL applications like Maya.

Riddick was rubbish, and also happened to be a TWIMTBP title if I remember correctly.

Pacific Fighters uses a specific nV extension, which just highlights why OpenGL is a pain in the ass. The ARB are very slow to adopt things into the standard feature set, forcing developers to write separate codepaths based on which vendor-specific extensions are available. Bring on DX10, I say.
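Just to make that concrete, here's a rough sketch (plain C; the extension names are purely illustrative, not what Pacific Fighters actually checks) of the kind of per-vendor branching OpenGL developers end up writing while they wait for the ARB:

```c
/* Rough sketch only: probe the GL_EXTENSIONS string and branch to a
 * vendor-specific codepath. Assumes an OpenGL context is already current
 * (on Windows, include <windows.h> before <GL/gl.h>). The extension names
 * below are just examples of real NV/ATI extensions, not what any
 * particular game actually uses. */
#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

/* Naive substring check against the extension string; a production
 * implementation would match whole, space-delimited tokens. */
static int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

void pick_render_path(void)
{
    if (has_extension("GL_NV_vertex_program3")) {
        printf("Using an NVIDIA-specific codepath\n");
    } else if (has_extension("GL_ATI_texture_float")) {
        printf("Using an ATI-specific codepath\n");
    } else {
        printf("Falling back to the ARB-only codepath\n");
    }
}
```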

I have to agree with you on the 'professional app' front. However, I'd imagine that most Maya users have a workstation-class card. No way I'd buy a gaming card (or any ATi card) for this type of rig.


So yes, nV do have much better OpenGL support. But OpenGL, in its current state, is leagues behind D3D.
 
rincewind said:
So yes, nV do have much better OpenGL support. But OpenGL, in its current state, is leagues behind D3D.
Have you seen the screenshots for Quake Wars?
 
rincewind said:
Both companies have made mistakes, but this topic is about the X1900XTX and 7900GTX - not what has happened in the past.

The same could be said of Linux.....

However, ATi can fix the issue in software (unlikely as that may be), whereas the 7900GTX will never do HDR+AA using the common method because it's a hardware limitation.

This ATi vs nV banter is irrelevant - we're supposed to be comparing cards, and on that front the poll speaks for itself...

If you'll read the rest of my post you'll see I actually criticized him for trying to bring up the past...I agree, the past has no bearing on the current topic.

And about Linux, I was only trying to give examples where nVidia's drivers are better; I'm not saying that makes the GTX the better card.

About ATi fixing things in software, that was exactly my point...it's been years and they just don't fix things...that's my whole point about nvidia having better drivers...which is kinda a side-topic here...
 
I would (and did) go with a 7900GTX. I consulted Jason (MickeyMouse) before I made my decision (he's one of the top OC'ers in the world, he builds almost all of the LN2/DI containers used to set world records) and he told me that from his extensive experience with both cards, the 7900 is superior. It's hard to argue with someone who is so knowledgeable so I decided to buy his 7900GTX from him, the one that he used to set the current 3DMark06 Single Card World Record :D Needless to say, I'm satisfied with how it performs.
 
PRIME1 said:
Have you seen the screenshots for Quake Wars?
No, but how is that relevant to D3D being a better API? I'm basing this on programming experience, not screenshots. Quake Wars uses OpenGL because the engine it is based upon does. It hasn't been written from scratch.

HiJon89 said:
I would (and did) go with a 7900GTX. I consulted Jason (MickeyMouse) before I made my decision (he's one of the top OC'ers in the world, he builds almost all of the LN2/DI containers used to set world records) and he told me that from his extensive experience with both cards, the 7900 is superior. It's hard to argue with someone who is so knowledgeable so I decided to buy his 7900GTX from him, the one that he used to set the current 3DMark06 Single Card World Record :D Needless to say, I'm satisfied with how it performs.

Hope you enjoy playing 3DMark then. For him, superior probably means 'fastest in 3DMark 06' - to a gamer, that doesn't hold true. I don't even have 3DMark 06 - I know my games run tip-top, so a score in a synthetic benchmark is irrelevant.

^eMpTy^ said:
If you'll read the rest of my post you'll see I actually criticized him for trying to bring up the past...I agree, the past has no bearing on the current topic.

About ATi fixing things in software, that was exactly my point...it's been years and they just don't fix things...that's my whole point about nvidia having better drivers...which is kinda a side-topic here...

Sorry, my bad. I was actually going to post that the Linux support issue highlights the single-mindedness of ATi's driver team, but reconsidered as it's a bit OT.

^eMpTy^ said:

Well of course more people have a 7XXX card. That's bloody obvious, and I don't think anyone needs a poll to highlight that.
Sadly, it has no relevance at all to X1900XTX vs 7900GTX.
 
ATi is playing the percentages game - at the end of the day they have their target market and that's what they've decided to focus on. This is pretty common business practice. If you are a Windows user who wants to play games with maximum visual quality, then they have the best product out there - if not, then yes, you should probably steer clear.

For me, the X1900 cards are the best. But then I make up part of ATi's target market. If I was a 3D designer I'd probably have a Quadro (now that 3DLabs have gone). If I was a well-to-do 3D designer then I'd probably have an E&S instead.
 
rincewind said:
Well of course more people have a 7XXX card. That's bloody obvious, and I don't think anyone needs a poll to highlight that.
Sadly, it has no relevance at all to X1900XTX vs 7900GTX.

nm...I thought it was funny that the polls seemed to contradict one another...then I realized the 7 series encompasses the 7800, which has been out for almost a year...my bad
 
oldmanwinter said:
Not to pick on you specifically, but I can't believe people are still perpetuating that myth.

The drivers might be on the same level, but the ATI control panel sucks IMO.
 
Volkum said:
The drivers might be on the same level, but the ATI control panel sucks IMO.

I never disputed that. I definitely prefer the faster, .NET-less nVidia user CP, but I'm not going to stick with what I perceive to be an inferior product solely because of that. CCC is slow and bloated (and ugly) but considering that I just move the slider to optimal quality and forget it, I can deal with having to use the CCC once in a while. Once ATI Tray Tools updates to allow overclocking properly with X1900 cards, I'll probably just uninstall the CCC altogether anyway.
 
Another HUGE advantage that ATI has that no one has mentioned is in overclocking. You can now adjust voltages through software. No more BIOS flash when upping the voltage on the GPU or even the RAM.
PS, I bought an X1900XT, flashed it to an XTX, and used an overclocking utility to clock it to 749/1710 on STOCK cooling (3800 RPM). No artifacts or instabilities in GRAW or Oblivion.....that's what I call overclocking.
 
Everybody always mentions heat and noise.

Neither one of these affects performance or image quality. If you try to overclock it too hard it will artifact, but that will happen on any card you try to overclock too hard.

I was deciding between a 7900GT and an X1800XT a few weeks ago, and I chose the X1800XT. Benchmarks were close between the two cards, so it came down mostly to image quality. The extra 256MB of RAM also helped sway my vote.

In most reviews I have read, the X1900XTX comes out on top. I myself don't care what brand I have; I'm not loyal. Whatever has the best performance, image quality, and price is what will attract me. A lot of people posting seem to be biased, and I don't know why. Your money only goes to its best use if you check out all the options with an open mind before you spend.
 
oldmanwinter said:
I never disputed that. I definitely prefer the faster, .NET-less nVidia user CP, but I'm not going to stick with what I perceive to be an inferior product solely because of that. CCC is slow and bloated (and ugly) but considering that I just move the slider to optimal quality and forget it, I can deal with having to use the CCC once in a while. Once ATI Tray Tools updates to allow overclocking properly with X1900 cards, I'll probably just uninstall the CCC altogether anyway.

Yep. I went with 7900GTs since my SLI board cost me ~$55 and a Crossfire board would have cost me ~$200, so it was definitely cheaper to go SLI than Crossfire, and with a simple voltmod they become 7900GTXs. Plus I hate the idea of having to have a master card (that's stupid...why not use two of the same card like SLI?), especially when thinking of reselling at a later point.
 
^eMpTy^ said:
The X1900XTX is a great card

But people have different priorities, and a lot would rather have a 7900GTX...

Well ^eMpTy^, we are in agreement. :D

ATI users want fast without sacrificing IQ.
NV users want fast with no regard for IQ.
 
Man...this versus stuff is stupid...

Just get what you like...you gotta be like me!!! See, when I was born 21 years ago I promised myself that I would be a GreenMan...or nVidia fan man...meaning NO MATTER what, I am gonna get nVidia...and that's why I want to name my kid Nvidia...of course I am still waiting for the go-ahead from the Mrs.

Dude, you gotta take a side and just stick with it...otherwise you can go on and argue for years about which one is better...it's not only video cards...it's everything else in life...Brunette or blonde? Does NOT matter...as long as it is good, you will enjoy it.

If you cannot make a simple decision to buy a video card for your computer, I don't know what the hell you're gonna do when it comes time for getting a "wife"...good luck.
 
OK, let's make a quick conclusion:

nVidia: marginally better performance in non-shader-intensive games.
ATi: marginally better IQ options and IQ.

Then here is the disclaimer:

Disclaimer! Marginal differences between the 2 cards vary from person to person. What is noticeable to one person, may not be noticeable to another person.

In the end YOU must decide what is best for YOU.
 
Why does everybody say that they have seen benchmarks that have the X1900XTX beating the 7900GTX? In all of the benchies I have seen, the 7900GTX beats the X1900XTX in most games by like 3 to 5 fps. Someone please post a review where the X1900XTX beats it in most games. I am not being a smartass, I actually haven't seen any reviews of the X1900XTX beating it in MOST games.
 
To the OP:

Why are you still reading this silly bullshit? Go out and get that 7900GT. Or sit around and worry about HDQ and AAF and how much more brilliant one guy is for buying one brand or another, and which card shows the chain-link fence in HL2 better. Buy the GT and take your girl out for pizza with the change. It's Friday night, after all.
 
boomheadshot45 said:
Why does everybody say that they have seen benchmarks that have the X1900XTX beating the 7900GTX? In all of the benchies I have seen, the 7900GTX beats the X1900XTX in most games by like 3 to 5 fps. Someone please post a review where the X1900XTX beats it in most games. I am not being a smartass, I actually haven't seen any reviews of the X1900XTX beating it in MOST games.
*pats head*

It's OK, you're not the only one who's realistic. It's just that some people are still living in the short one-month span when the X1900 was top dog. Realistically, there is no difference between the X1900XTX and 7900GTX besides the mentioned driver CP (where nVidia obviously wins). The ATI also does higher quality AF, but of course it costs performance. ATI also boasts HDR+AA in the current HDR implementation; the fact of the matter is that the only reason is that HDR in its current implementation is not very nVidia friendly, and also that there is one (uno) game on the market that uses HDR+AA.
 
InorganicMatter said:
there is one (uno) game on the market that uses HDR+AA.

Current games, and games arriving in the (near) future (2006), which support true FP HDR:

Bet on Soldier
Bioshock
Brothers in Arms 3
Crysis
Duke Nukem Forever
Elvion
Far Cry (with patch 1.3 installed)
Far Cry Instincts: Predator
Gears of War
Huxley
Juiced (PC version only)
Kameo: Elements of Power
Lord of the Rings: Battle for Middle Earth
Lineage II (with Chronicle 4 update)
Project Gotham Racing 3
Project Offset
Perfect Dark Zero
Serious Sam II
Starship Troopers
Stranglehold
The Elder Scrolls IV: Oblivion
TimeShift
Tom Clancy's Ghost Recon: Advanced Warfighter
Tom Clancy's Splinter Cell: Chaos Theory (PC version only)
Tom Clancy's Splinter Cell: Double Agent (PC and Xbox 360 versions only)
Tomb Raider: Legend (PC and Xbox 360 versions only)
Unreal Tournament 2007
Vanguard: Saga of Heroes

http://en.wikipedia.org/wiki/High_dynamic_range_rendering
 
InorganicMatter said:
Dude...WTF? Half those titles you mentioned are console games.

You mean "console AND PC"?

AFAIK the console-only ones are PGR3, PDZ, Kameo, and Far Cry Instincts: Predator.
 
InorganicMatter said:
ATI also boasts HDR+AA in the current HDR implementation; the fact of the matter is that the only reason is that HDR in its current implementation is not very nVidia friendly, and also that there is one (uno) game on the market that uses HDR+AA.

You mean the same type of HDR that nVidia has been saying is the only real way to do HDR ever since they launched the 6xxx series of cards?
 
UPDATE ON THE SITUATION!

Seems he finally came to his senses and realized that dropping $500 on a dying breed of DX9 cards was stupid. He pulled the trigger on a 7600GT "XXX" and is gonna save the rest for a DX10 card in the fall.

You may now continue with the red vs. green war. Just let me make my way to the exit :p
 
Bona Fide said:
UPDATE ON THE SITUATION!

Seems he finally came to his senses and realized that dropping $500 on a dying breed of DX9 cards was stupid. He pulled the trigger on a 7600GT "XXX" and is gonna save the rest for a DX10 card in the fall.

You may now continue with the red vs. green war. Just let me make my way to the exit :p

lol nice one bona :p
 
+1 for going nVidia
+1 for an awesome midrange card

I've got the 6800GS (comparable to the 7600GT), and it runs everything great at medium-high.
 
I just bought an X1900XTX, and I have a 7900GTX in my sister's comp, and I find the X1900XTX to be way better than the 7900GTX. It's not as much louder as people say, and as for the heat issue, it's all a matter of having good case airflow to keep it cool and quiet. My card never goes above 60C even while gaming, and I keep all the settings on high at 1280x1024x32 with every bloody thing on high or maximum.
 
The chances of him keeping that money saved until DX10 cards, and DX10 itself, come out are slim to none. Leaning heavily towards none.

InorganicMatter said:
+1 for going nVidia
+1 for an awesome midrange card

I've got the 6800GS (comparable to the 7600GT), and it runs everything great at medium-high.

What is this, some kind of game? Please, there are no points in this. As you can see by the results, most people think the X1900 is a better card.
 
fallguy said:
The chances of him keeping that money saved until DX10 cards, and DX10 itself, come out are slim to none. Leaning heavily towards none.

No, he's actually pretty good with his money. Of course, if something unpredictable comes up between now and fall, that money may be gone. But freak accidents aside, he won't really need to spend the money. His system is pretty beefed up otherwise.

Pentium 4 631 @ 4.5GHz
1GB DDR2-800 RAM
300GB HDD
XFX 7600GT XXX

Not the BEST system out there, but it's definitely more than enough. Chances are he'll have enough money by fall to double his RAM as well as get a top-end DX10 card.
 
I doubt it, but that's his prerogative. He's seriously being held back by the budget card, and will have to play with lower settings until he gets a new card. Buying a top-end card for around $400 would give him much better frames, and it's not like he wouldn't be able to sell it later. Or even $100 more for a 7900GT, or around $50 more for an X1800XT, would net him much better frames. But as I said, everyone has a choice of their own. I just couldn't muddle through games on a budget card. :eek:

If he insisted on buying a budget card, 2GB of RAM would be a good choice for the leftover money. 1GB is hardly enough for me anymore... many games stutter with just 1GB nowadays.

But this thread appears to be over, hope he enjoys his card.
 