The 1900XTX is currently better than the 7800GTX512... will that hold true down the road?

BurntToast

(My basis on the 1900XTX being better than the 7800GTX is based on an average. Not on any one game like Quake 4 for example, where the 7800GTX512 is better)

Back in the day when I purchased my 9800Pro it kicked the living shit out of the current Nvidia 5950 series of the time.

But now, when I see my 9800Pro running head to head against a 5950 Ultra in Quake 4, FEAR and BF2 (current-gen games), it isn't so great any more. But it sure did enjoy the Quake 3, BF1942 and Unreal 2004 generation.

So... fast-forward 2-3 years from now, the lifespan I normally keep a card. Will the 1900XT or 1900XTX continue to hold its ground? Or do you expect the 7800GTX, with its better driver support, compatibility and paid in-game support ("The Way It's Meant to be Played") for next-gen games, to be the better long-term buy?
 
I don't know, I would think the X1900XT would hold up better due to more pixel shaders and future games becoming a lot more pixel-shader intensive.
 
Yeah that is one thing I took into consideration.

But there is always the fact that Nvidia has more money, and they tend to brand games more than ATI does. So in theory Nvidia could sponsor games so that the games themselves better support and take advantage of Nvidia hardware.
 
I'd have to say that the overall performance difference will be reasonably close. The X1900 is the faster card now, and with its better shader performance it may handle some games better where shaders are the limiting factor. Drivers these days won't really open up huge performance leads unless something was poorly implemented in the first place.

One thing to keep in mind is that we will see the release of DirectX 10 within the year. I doubt it will become a requirement for games for some time to come, but it will make everyone's next video upgrade come sooner rather than later.

In any case, all you can ever buy is the card with the best performance, features and price that is available; right now that is the X1900XT(X).
 
MAYBE IN 2-3 years Dice will have finally gotten BF2 to not lag too.....

The ATI card is more future-proof: the extra shaders give it longer legs, and even where it's slower horsepower-wise, that's something drivers can help through optimization.
 
Point being, everyone said the 9800 was better suited for future titles... seems in some cases that's simply not the case. Honestly I think it's odd how the 9800 went from being top dog in its generation to being matched by an FX card. lol
 
dagon11985 said:
Point being, everyone said the 9800 was better suited for future titles... seems in some cases that's simply not the case. Honestly I think it's odd how the 9800 went from being top dog in its generation to being matched by an FX card. lol
The 9800 was not even "top dog" for many games in its generation; the 5950 bested it in several benches.

It was arguably the better card, but the problem is that you can't really predict the way developers will code future games.

I know it's popular to say "FEAR is the way of future games," but it's not necessarily true, and I haven't heard of any licenses of its engine so far (whereas there are like 5 Doom 3 licenses I've heard of).

Anyway, it's probably safe to say neither of these cards is what you'll want to have in three years, so it's a moot point.
 
The 9800 Pro still comes out ahead of a 5900 in most of today's games; it even comes out ahead in OpenGL games like Doom 3 and Chronicles of Riddick under some settings (and in HL2 the 5900 series defaults to PS 1.1), granted they are very close.

http://www.tomshardware.com/2005/07/05/vga_charts_vii/page9.html

But I agree the 5900 wasn't a bad card; I actually bought two of them. Although people knocked it at the time, I liked things like its angle-independent AF, great compatibility with old games, and supersampling modes. I actually liked the AF quality on the 5900, though people knocked it at the time for brilinear (now all the cards have brilinear modes); shimmering was a non-issue, and like I said, the AF was angle-independent. nVidia took a major step backwards with the 6800 and 7800 in terms of AF quality, not necessarily because they were angle-dependent, but because of the shimmering issues.
 
chinesepiratefood said:
Actually, back when the 5950U came out it beat the 9800 Pro in its fair share of benches

Back during the 5950 and 9800 Pro days, it was DirectX performance vs. OpenGL, with Nvidia doing better in OpenGL (still is, mostly).
 
chinesepiratefood said:
Actually, back when the 5950U came out it beat the 9800 Pro in its fair share of benches

It's a shame that the rest of the FX series was garbage.

Why isn't anyone comparing the 9800XT to the 5950 Ultra since that was its real competition?
 
9200/9600/9800 Series > 5200/5600/5900 Series HANDS DOWN!!!!!!!!!
 
NickS said:
9200/9600/9800 Series > 5200/5600/5900 Series HANDS DOWN!!!!!!!!!

Really?

http://www.anandtech.com/showdoc.aspx?i=1821&p=24
5900U beating the 9800 Pro in UT2003 at 16X12 4X/8X; note that is with angle-independent AF for the 5900U, angle-dependent for the 9800 Pro. A D3D game.

http://www.anandtech.com/showdoc.aspx?i=1821&p=26
5900U demolishing a 9800 Pro at Jedi Knight 2, 16X12 4X/8X: 167 vs. 114 fps!

http://www.firingsquad.com/hardware/leadtek_winfast_a350_ultra_tdh_review/page5.asp
5900U wins 12X10, loses 16X12 by 3fps at Nascar2003

http://www.firingsquad.com/hardware/leadtek_winfast_a350_ultra_tdh_review/page6.asp
Loses IL2 at 16X12 by less than 1fps

http://www.firingsquad.com/hardware/leadtek_winfast_a350_ultra_tdh_review/page7.asp
84 vs 89 fps at Quake 3

http://www.firingsquad.com/hardware/leadtek_winfast_a350_ultra_tdh_review/page9.asp

Man, I can't believe I'm going through the whole FX vs 9800Pro argument again! :rolleyes:

In any case, you can see there is no "hands down" about the 9800Pro vs 5900U. Their performance on the games of the time was pretty identical, they basically swapped small victories across most games. The only difference you probably would have noticed is the slightly sharper AF that nVidia had then, ATI has now.
 
Rollo said:
Yes really.

Come on, the 5900 was like the worst card ever. Who cares if it's supposedly aged well, which I doubt.

Let's move on from that card... Nvidia certainly has. Starting with the 6800 they improved greatly.

And let's not forget the 5900 resorted to junky IQ settings in some cases to even compete (covered in the FS review linked).

And to the original post, the X1900 should get much better with time. Even now look at the benches, new games like BF2 and FEAR are where it excels the most. I would think that whereas right now the cards are within 20% of each other, in future games the X1900 will easily be 50% faster than the 7800 512.

Remember internally it features 48 pixel shading pipelines.
 
Rollo has just stated absolute crap. In DX8 the FXs were very competitive with the ATI 9 series. As soon as you went to DX9, however, that picture changed. This is fact, and can be found very easily by Googling comparison reviews from the time.

Please explain to everyone, Rollo, how in HL2 the 9600/9700/9800 cards all default to DX9, whereas ALL FX series cards default to 8.1... The performance just wasn't there.

Also take into account that MS actually labelled HL2 its official DX9 technology game, a little after release.

Rollo is probably going to hit back with some more FUD about MS withholding DX9 specs from nV, or that MS modified DX9 without telling nV (both have been said in various places before). The fact is that nV deviated from the DX9 spec with its FX line, at the same time as ATI scored some great technology and put it to use from 9700Pro on.

I have no problem with you announcing your associations and pushing for nV, Rollo; it's when you spread FUD and attack competitors' products (often without any legitimate reason) that I get mad.

The value of these forums is in the quality of information supplied, and the pooling of tech talent to help others. Don't try and mislead people, you will only ever end up being shot down in flames.

Manic out.
 
Sharky974 said:
Yes really.

Come on, the 5900 was like the worst card ever. Who cares if it's supposedly aged well, which I doubt.
Well, I posted links to many benchmarks showing the cards equal in performance for the games of the time, I see nothing like that from you to back up your assertion?

Let's move on from that card... Nvidia certainly has. Starting with the 6800 they improved greatly.
Starting? I guess I must be forgetting how my TNT2 owned my Rage 32, my GF1 owned my MAXX, my GF2Pro owned my VIVO, and my GF4 owned my 8500? ;)

And let's not forget the 5900 resorted to junky IQ settings in some cases to even compete (covered in the FS review linked).
Could you link and quote that please? I must have missed it, but check this out:
http://www.anandtech.com/showdoc.aspx?i=1821&p=14
http://www.anandtech.com/showdoc.aspx?i=1821&p=15
"Both ATI and NVIDIA's quality modes are virtually identical"
Anandtech doesn't seem to agree with you?

And to the original post, the X1900 should get much better with time. Even now look at the benches, new games like BF2 and FEAR are where it excels the most.
http://www.xbitlabs.com/articles/video/display/radeon-x1900xtx_31.html
"Age of Empires 3 uses SM3.0 special effects, supports HDR and seems to be just the application for the Radeon X1000 to show its best in, but we've got nothing like that in practice"

I would think that whereas right now the cards are within 20% of each other, in future games the X1900 will easily be 50% faster than the 7800 512.
Of course, this is just your guess, so we should probably give it the same weight as any other guess with no basis in fact?


Remember internally it features 48 pixel shading pipelines.
 
The 9700 was probably the biggest leap in GPU technology I can think of in the last 5 years. ATI really took it to nVidia with the 9000 series in June 2002. nVidia in my opinion didn't recover until the 6800 almost two years later.

I think Rollo is literally correct about the 5900, but really, at the time most enthusiasts were ATI heads, because as a group PC enthusiasts are a VERY discriminating bunch. Either we want the best value or the best product, period.

As a group, very few of us are '!!!!!!s'. We buy based on the knowledge and experience of others almost without fail, which makes us a pretty smart consumer group, I think.

However, buying GPUs is somewhat more complex these days with the advent of SLI and CrossFire. To a large extent, those of us putting high-end multiple GPUs in our machines not only can't switch GPU vendors for the foreseeable future, but probably don't really need to for the most part.

Yeah, I would have liked to have played F.E.A.R. with an X1900, but there wasn't much that X1900 CF was going to give me over 7800 512 GTX SLI, not enough to ditch 6-week-old video cards and a motherboard.

I think as time goes on, it will become harder and harder to say without question what hardware is the best. A single X1900? Better than a single 7800 512 GTX. X1900 CrossFire? Well, I'm not convinced it's better, but that is primarily due to CrossFire's immaturity more than anything.

But a statement like "X1900 owns 7800 GTX 512!" is a little too simplistic unless you're talking about only one GPU.

Times are becoming more complex!
 
This is really constructive. Informative, too.
Buy the best card you can get now. If it's not holding up in 2 or 3 years, I'll bet you will have saved up another 5 or 600 to buy what will then be the best card you can get. It's not like you're buying a new car or a house.
There won't be a hell of a lot of difference between the 2 top-end cards, just like there isn't now. This is not a competition or anything.
Right.
 
Sharky974 said:
Yes really.
And to the original post, the X1900 should get much better with time. Even now look at the benches, new games like BF2 and FEAR are where it excels the most. I would think that whereas right now the cards are within 20% of each other, in future games the X1900 will easily be 50% faster than the 7800 512.

Remember internally it features 48 pixel shading pipelines.


It is true, but I think by the time the shader op advantage becomes that large, we will have moved on to different cards.
 
An advantage of buying high-end hardware is resale value. It is actually quite cost-effective to buy the best and sell once a year to upgrade again. Your card remains quite powerful and thus holds a lot of its value, in turn reducing your outlay upon upgrade.

Selling each year and dropping an extra couple hundred is now fairly equivalent to keeping a card for 3 years then topping out again.
 
heatlesssun said:
But a statement like "X1900 owns 7800 GTX 512!" is a little too simplistic unless you're talking about only one GPU.

Times are becoming more complex!

I'd say it's a little too simplistic even for a single card; the 512 GTX wins many benches in many games.
 
Rollo said:
I'd say it's a little too simplistic even for a single card; the 512 GTX wins many benches in many games.

That is true, but I think my point is that overall, the X1900 is better than the 7800 512 GTX. I know that I would rather have one X1900 than one 7800 512 GTX. The simple reason is that when you look at the numbers, the X1900 is better at shader intensive games, and that's the majority of new titles. The 7800 512 GTX has more pixel pushing power and does well when shaders aren't as big of a concern.

And to me it looks like when the 7800 512 GTX wins, the margin isn't quite as big as when the X1900 wins. Plus the X1900 does AA better overall.

And I think most enthusiasts agree with this assessment, which is my other point. The enthusiast community gets a pretty good sense of what's best in the marketplace, and I've never seen a majority of the community get it wrong.

Of course, it would be silly to switch from a 7800 512 GTX to an X1900 unless you got the X1900 for free or were able to recoup the cost by selling the 7800.

The general feeling among most enthusiasts is that a single X1900 is better than a single 7800 512 GTX, and I concur.
 
It is true, but I think by the time the shader op advantage becomes that large, we will have moved on to different cards.

True, but we've got a relatively long run going now. We won't see new cards for like six months. Then again, major PC game releases don't happen that often anymore either, it seems.

ATI's conference call is out. They also hint that G71 is just a die shrink and that they have a PE edition (R590) in the works. So I guess we'll see at least one more new release at the top. But these are minor.
 
NickS said:
9200/9600/9800 Series > 5200/5600/5900 Series HANDS DOWN!!!!!!!!!

Try no. Like, no.

I'm all for competition. AMD and Intel: it's great; if Intel had a monopoly, prices would suck. It's the same with Nvidia and ATI. In fact they sometimes release cards in direct competition with each other, e.g. 6600 vs X700, 6800 vs X800/850, 7800 vs X1800. So saying "ATI's series are better than Nvidia's series" is just stupid.

OK, and to the starter of this thread: the X1900 is better, and will be for a long time. ATI compensated for the losses between the X1800 and 7800 when they released the X1900. And yes, the X1800 and the 7800 are counterparts, that's it. The X1900 goes unchallenged because Nvidia has been sidetracked by the 7800GS (which really should have been called the 7300GS) and of course the graphics for the PS3.

But if you haven't heard, the 7900GTX, codenamed G71, will be out somewhere around March 15. I can assure you, with a 700MHz clock, three times the pixel processors, and way more pixel pipelines, it will put the X1900 in its place.

But it's still a bad time to buy: DirectX 10 is coming out pretty soon, Windows is switching from XP to Vista, and all current graphics cards are still unable to use HDCP for DVDs (read about HDCP at http://www.firingsquad.com/hardware/ati_nvidia_hdcp_support/ ). And of course, if you're a gamer you're an AMD fan, and if you're an AMD fan you're waiting for Socket AM2 to come out, along with the supposedly ground-breaking FX-62.

Yeah, and to set the record straight: SLI is and always will be better, in practice, than CrossFire, because when SLI was patented, any form of internal bridge between two graphics cards was considered a breach of copyright. So ATI had to go external, bumping up the cost and losing a chunk of performance. Although the CrossFire software is nicer, since it auto-detects the game and you don't have to wait a week for a multi-GPU driver to use split-screen; I'll admit I like how much more simplistic Nvidia's interface usually is. But I think AFR can be turned on without the game supporting SLI.

Conclusion? Wait. Just wait. But hey, I'm a hypocrite: I'm upgrading from this old AGP rig to PCI-E as soon as the money comes in.
 
The 7900 has been out for less than a month. Nvidia supposedly has some kind of update on the G70 coming up. I don't think we will see anything major for about a year.
 
vanilla_guerilla said:
The 7900 has been out for less than a month. Nvidia supposedly has some kind of update on the G70 coming up. I don't think we will see anything major for about a year.

!!!!! REALLY!!!!!
 
MrWizard6600 said:
But if you haven't heard, the 7900GTX, codenamed G71, will be out somewhere around March 15. I can assure you, with a 700MHz clock, three times the pixel processors, and way more pixel pipelines, it will put the X1900 in its place.

Ridiculous bollocks! The G70 architecture has its ALUs coupled to the TMUs. I have my doubts this is going to be a 72-pipe card...
 
vanilla_guerilla said:
The 7900 has been out for less than a month. Nvidia supposedly has some kind of update on the G70 coming up. I don't think we will see anything major for about a year.

What? The only things nVidia has had out for "less than a month" from the 7 series are the 7300GS/GO and the 7600GO
 
BurntToast said:
(My basis on the 1900XTX being better than the 7800GTX is based on an average. Not on any one game like Quake 4 for example, where the 7800GTX512 is better)

Back in the day when I purchased my 9800Pro it kicked the living shit out of the current Nvidia 5950 series of the time.

But now, when I see my 9800Pro running head to head against a 5950 Ultra in Quake 4, FEAR and BF2 (current-gen games), it isn't so great any more. But it sure did enjoy the Quake 3, BF1942 and Unreal 2004 generation.

So... fast-forward 2-3 years from now, the lifespan I normally keep a card. Will the 1900XT or 1900XTX continue to hold its ground? Or do you expect the 7800GTX, with its better driver support, compatibility and paid in-game support ("The Way It's Meant to be Played") for next-gen games, to be the better long-term buy?

ATI doesn't really optimize their drivers much past the present generation. If it's a hot new title they might optimize for it, but if it's like 4 months after release I doubt it. In some cases, even keeping up-to-date drivers and playing the same game, you will lose performance. I am sure this happens on Nvidia cards as well to some extent. My GeForce4 Ti4600 cards lasted until SM3.0 came around and BF2 was popular, and then the card just wouldn't play that bloated, poorly done game. I have an old ATI AIW card that won't play anything current or even kinda recent well at all, hehe, but it's an AIW; I give it kudos for trying.
 
Rollo said:
I know it's popular to say "FEAR is the way of future games," but it's not necessarily true, and I haven't heard of any licenses of its engine so far.
I hope you're right. FEAR's engine should be thrown out and forgotten.
 
OK, Rollo is full of crap. I just sold an FX5900XT a couple months ago (overclocked almost to Ultra speeds) and it was a POS in all new games (required 800x600 res with med-to-low settings). It didn't even run DX9 in HL2! My bro's Radeon 9600 Pro ran HL2 better with DX9 and ran about the same in FEAR and Quake 4. I am sure the 9800 Pro/XT would surpass the FX5900.
 
MrWizard6600 said:
Try no. Like, no.

I'm all for competition. AMD and Intel: it's great; if Intel had a monopoly, prices would suck. It's the same with Nvidia and ATI. In fact they sometimes release cards in direct competition with each other, e.g. 6600 vs X700, 6800 vs X800/850, 7800 vs X1800. So saying "ATI's series are better than Nvidia's series" is just stupid.

OK, and to the starter of this thread: the X1900 is better, and will be for a long time. ATI compensated for the losses between the X1800 and 7800 when they released the X1900. And yes, the X1800 and the 7800 are counterparts, that's it. The X1900 goes unchallenged because Nvidia has been sidetracked by the 7800GS (which really should have been called the 7300GS) and of course the graphics for the PS3.

But if you haven't heard, the 7900GTX, codenamed G71, will be out somewhere around March 15. I can assure you, with a 700MHz clock, three times the pixel processors, and way more pixel pipelines, it will put the X1900 in its place.

But it's still a bad time to buy: DirectX 10 is coming out pretty soon, Windows is switching from XP to Vista, and all current graphics cards are still unable to use HDCP for DVDs (read about HDCP at http://www.firingsquad.com/hardware/ati_nvidia_hdcp_support/ ). And of course, if you're a gamer you're an AMD fan, and if you're an AMD fan you're waiting for Socket AM2 to come out, along with the supposedly ground-breaking FX-62.

Yeah, and to set the record straight: SLI is and always will be better, in practice, than CrossFire, because when SLI was patented, any form of internal bridge between two graphics cards was considered a breach of copyright. So ATI had to go external, bumping up the cost and losing a chunk of performance. Although the CrossFire software is nicer, since it auto-detects the game and you don't have to wait a week for a multi-GPU driver to use split-screen; I'll admit I like how much more simplistic Nvidia's interface usually is. But I think AFR can be turned on without the game supporting SLI.

Conclusion? Wait. Just wait. But hey, I'm a hypocrite: I'm upgrading from this old AGP rig to PCI-E as soon as the money comes in.


How can someone be so wrong on so many things in one post?
 
Look at Quake Wars. That game has some SERIOUS shader implementation. The 1900XTX should always win the IQ contest, but not necessarily the FPS contest. Depends on what you want, I guess. As always.
 
ivzk said:
How can someone be so wrong on so many things in one post?

Exactly. What kind of crap is this?

because when SLI was patented, any form of internal bridge between two graphics cards was considered a breach of copyright.

This is the biggest bunch of crap I have ever seen. nVidia most certainly can patent how their chip makes communications possible between two video cards, but they most certainly cannot patent the fact that it is part of the motherboard chipset.
 
If you guys feel the need to correct what you feel is misinformation, please do so, but do it in a constructive manner. Personal attacks and such will not be tolerated and will get your posts deleted.
 