Which is better: 1900XT or 7800GTX 512mb?

LordBritish


If price and availability were not a factor, which one would you get?
 
Yeah, and it's cheaper and easier to find too. In some games it's even, but in FEAR Nvidia is taking a beating.

But Nvidia is coming out with something new soon.
 
As a single card, yes, the X1900 is a little better in most cases, sometimes a lot better.

Where the question becomes far less clear cut is dual GPU configs.

Two 512 GTXs will always be better than one X1900. And if you already have an SLI mobo, you'd have to consider the cost and hassle of another mobo for CrossFire. Of course, if you're building or buying a system from scratch, then this doesn't matter. Two X1900s are better than two 512 GTXs.

The second thing to remember with dual GPUs, unless you're going to run both a CrossFire platform and an SLI one, is that you are also picking a platform, not just a GPU. So say you went CrossFire, and in two months' time the 7900 comes out and is better. One 7900 vs. one X1900 is faster, but two X1900s are faster than one 7900 (a likely scenario). So do you dump CrossFire and go SLI?

Probably not, because then two months later the X2000 comes along and the cycle repeats itself.

So, if you're never going to run more than one card, sure, you can go and pick up the fastest card at any point in time and be happy.

But with dual GPUs, unless you have both platforms (and a VERY large computer budget), I just don't see any cost effectiveness, or for that matter enough of a performance delta in most back-to-back release cycles from the vendors, to warrant migrating back and forth, unless one of them makes a major blunder.

Dual high-end GPU setups should last a while and be so fast anyway that you really wouldn't need to worry about the latest and greatest for a year.
 
I'm just hoping that nvidia and ati eventually pull their heads out and make sli/crossfire motherboards compatible with both, then the decision will be a lot easier.
 
bobzdar said:
I'm just hoping that nvidia and ati eventually pull their heads out and make sli/crossfire motherboards compatible with both, then the decision will be a lot easier.

That would be nice, but there is obviously a GREAT lack of motivation for such technology from both sides.

I like the X1900, and had even thought of putting off building my sig rig, which is only about six weeks old, to wait for these.

At the end of the day, it wasn't only GPU performance, but the platform. CrossFire is still too new for me. I waited a year before I dealt with SLI.
 
As soon as ATi catches up in the dual-card department, I'm sure there will be motherboards that support both ATi and nVidia dual-card solutions, but at the moment nVidia would like to stay on top of that rather than jumping onto boards that support both.
 
heatlesssun said:
As a single card, yes, the X1900 is a little better in most cases, sometimes a lot better.

Where the question becomes far less clear cut is dual GPU configs.

Two 512 GTXs will always be better than one X1900. And if you already have an SLI mobo, you'd have to consider the cost and hassle of another mobo for CrossFire. Of course, if you're building or buying a system from scratch, then this doesn't matter. Two X1900s are better than two 512 GTXs.

Well, with the price and availability of 7800 GTX 512s, it would still be cheaper to get dual X1900s and a CrossFire board than to sport 2x 512 GTXs. In some games there's simply no competition (like FEAR); Nvidia just gets hammered, and if all the 7900 is is 33% more pipes and 33% more core clock speed, Nvidia still won't take FEAR from the X1900. Also, FEAR should be a large indication of games to come. So, it seems like the 7900 is gonna crush the X1900 in older games (as if cards aren't fast enough in older games?) and the X1900 will be a bit faster in newer games (like FEAR, UT2k7, Oblivion, etc.). That's how I see it.
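A quick sanity check on those rumored numbers: the pipe and clock increases compound multiplicatively. A minimal sketch (Python; the 33% figures are the rumor from the post above, not confirmed 7900 specs):

```python
# "33% more pipes and 33% more core clock" compound multiplicatively
# for theoretical fill/shader rate. Both factors are the rumor quoted
# above, not confirmed specs.
pipes_scale = 1.33
clock_scale = 1.33
speedup = pipes_scale * clock_scale
print(f"~{(speedup - 1) * 100:.0f}% more theoretical throughput")
# Roughly +77% on paper. Real-game gains would be smaller (memory
# bandwidth and CPU limits cap the scaling), so a FEAR-sized deficit
# could well survive a spec bump like that.
```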
 
sabrewolf732 said:
Well, with the price and availability of 7800 GTX 512s, it would still be cheaper to get dual X1900s and a CrossFire board than to sport 2x 512 GTXs. In some games there's simply no competition (like FEAR); Nvidia just gets hammered, and if all the 7900 is is 33% more pipes and 33% more core clock speed, Nvidia still won't take FEAR from the X1900. Also, FEAR should be a large indication of games to come. So, it seems like the 7900 is gonna crush the X1900 in older games (as if cards aren't fast enough in older games?) and the X1900 will be a bit faster in newer games (like FEAR, UT2k7, Oblivion, etc.). That's how I see it.
It's hard to tell if F.E.A.R. is representative of all the games coming down the road; we need a few more next-generation games to come out to see if this will be the case. There are some games where ATI's architecture does really well; F.E.A.R. is one of them, and X3: The Reunion, I would say, is another.

If current 7800 GTX 512 SLI is a performance indicator though, 7900 GTX SLI should beat the X1900 XT Crossfire even in F.E.A.R.
 
coldpower27 said:
It's hard to tell if F.E.A.R. is representative of all the games coming down the road; we need a few more next-generation games to come out to see if this will be the case. There are some games where ATI's architecture does really well; F.E.A.R. is one of them, and X3: The Reunion, I would say, is another.

If current 7800 GTX 512 SLI is a performance indicator though, 7900 GTX SLI should beat the X1900 XT Crossfire even in F.E.A.R.

SLI? Blah to SLI and CrossFire. Single cards are the win! This is how I see SLI and CrossFire: the engineers are being lazy, so they're like, "hey guys, I don't feel like working, let's just throw two cards together to make up for us being lazy" :D :p ;)
 
Since I don't care much for excess, I would never want a dual-GPU config, so the choice is easily the X1900, a much superior card by itself.
 
I agree; I would think it would take a couple of months for them to show up in mass quantities.
 
I'm just hoping that nvidia and ati eventually pull their heads out and make sli/crossfire motherboards compatible with both, then the decision will be a lot easier.

Rumors are flying that Intel doesn't like the cross-chipset platform either, so they are creating a hybrid of both, so we can access either CrossFire or SLI on the same board. It will be awesome if this isn't just a rumor.

It's hard to tell if F.E.A.R. is representative of all the games coming down the road; we need a few more next-generation games to come out to see if this will be the case. There are some games where ATI's architecture does really well; F.E.A.R. is one of them, and X3: The Reunion, I would say, is another.

If current 7800 GTX 512 SLI is a performance indicator though, 7900 GTX SLI should beat the X1900 XT Crossfire even in F.E.A.R.

I fail to see your comparison in FEAR between the GTX 512, X1900 and the 7900. I still think the X1900 will be on top, but not by much, maybe one up on the AA levels.

However, FEAR should be one of the clear representatives of what's to come. UT2k7 is going to feature the same thing FEAR does, and that's shading each pixel, and we will probably be seeing a lot of games coming out using the UE3. Shading is becoming huge; it is the next step in absolute eye candy.

And with real-time soft shadows coming in the CryEngine 2 game, I can guarantee having a lot more shader power will benefit there.
 
I'd go with the 7800GTX (or 7900 when it's out).
The 1900XT still lacks some features that the 7800 has, which puts it at a disadvantage with rendering techniques such as HDR. There were also some limitations with vertex texturing on the 1900XT, I believe.
I don't care too much about the speed in current games. Both cards are extremely fast, and there isn't a game that can slow them down yet. I just think the 7800 is better prepared for the future.
 
If you want to argue dual-GPU configs, even an X1800XT is much faster than a GTX 512MB with Super AA, since SLI antialiasing was just something to try to piss on ATi's parade. Lucky for ATi, their X1800/X1900 compositing chip does superb Super AA, so there's simply no comparison between the two.
With X1900s the difference is even bigger.
http://www.firingsquad.com/hardware/ati_radeon_x1900_crossfire_performance/page12.asp and the next two pages for Super AA / SLI AA comparisons.
 
Scali said:
I'd go with the 7800GTX (or 7900 when it's out).
The 1900XT still lacks some features that the 7800 has, which puts it at a disadvantage with rendering techniques such as HDR. There were also some limitations with vertex texturing on the 1900XT, I believe.
I don't care too much about the speed in current games. Both cards are extremely fast, and there isn't a game that can slow them down yet. I just think the 7800 is better prepared for the future.

The X1900XT lacks nothing you mentioned from the 7800GTX; actually, it does more: HDR+AA. The X1900 series also features Fetch4, which is missing on the X1800 series.
Here: http://www.hardocp.com/article.html?art=OTUz

The x1900 is the better card.
 
Shadow27 said:
The X1900XT lacks nothing you mentioned from the 7800GTX; actually, it does more: HDR+AA. The X1900 series also features Fetch4, which is missing on the X1800 series.
Here: http://www.hardocp.com/article.html?art=OTUz

The x1900 is the better card.

I'm not convinced. Their 'HDR+AA' is simply multisampling on FP render targets. Granted, this is a thing that the 7800 lacks. But I was talking about FP16 blending and vertex texturing, which afaik are not supported by the X1000 series, the X1900 included.
Lack of FP16 blending is why the X1800 is slaughtered in 3DMark06. The X1900 is faster, but afaik it's just because it has more raw processing power, not because it supports FP16. But if you have proof of vertex texture support and FP16 (or better) blending support on the X1900, I'd like to see it. I haven't found it anywhere.
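For what FP16 blending actually buys you, here is a minimal sketch (Python, standard library only) of the underlying precision argument: HDR intermediate values above 1.0 survive a round-trip through an FP16 channel but get clamped by a traditional 8-bit fixed-point framebuffer.

```python
import struct

def to_fp16_and_back(x):
    # Round-trip a value through IEEE 754 half precision, i.e. what one
    # channel of an FP16 render target stores.
    return struct.unpack('<e', struct.pack('<e', x))[0]

def to_8bit_and_back(x):
    # Round-trip through an 8-bit fixed-point channel: clamp to [0, 1]
    # and quantize to 256 levels (a traditional LDR framebuffer).
    clamped = min(max(x, 0.0), 1.0)
    return round(clamped * 255) / 255

# An over-bright HDR luminance value, e.g. from additively blended lights.
hdr = 4.5
print(to_fp16_and_back(hdr))   # 4.5 : headroom preserved
print(to_8bit_and_back(hdr))   # 1.0 : detail lost to clamping

# Blending only helps if the intermediate sums keep that headroom too,
# which is why hardware FP16 *blending* matters, not just FP16 storage.
print(to_fp16_and_back(2.25 + 2.25))   # 4.5
```

The same argument applies per color channel; tone mapping happens only at the very end of the pipeline, so clamping mid-pipeline throws the HDR information away for good.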
 
In general the X1900 looks to be the better card for the future, which may hurt its reputation now. The X1900 is a large step up from the X1800, but that won't be seen given where the software is at the moment, is what I am saying. If FEAR is any indication of the future, the X1900 may be the longest-living card we have ever seen. Of course, there are no guarantees that software will move in the direction of FEAR, and as a result the 7900 may crush it if Nvidia chooses to stay the traditional route until needed, like with SM3.0: the 6800s had it, but it didn't do much for them. Of course, unlike on the X1900, SM3.0 was just an added feature; with the X1900, performance is affected. It is faster, but not leaps and bounds. Of course, its price and availability are enough of a reason to buy it over a 7800 GTX 512.
 
I have both cards (ATI X1900XTX and BFG 7800 GTX 512 OC), and they are more similar than different in current DX9 games. The 7800 shows a little more horsepower at times with the extra pipes, but in HDR-based games like FEAR, the ATI really does do better (with 2x the shader capacity). I also have to give the nod to ATI on IQ.

My take on the cards is this. If you want to go dual card, go 7800 GTX SLI. If you want the best single card, go with the XTX.
 
The GTX 512MB isn't really an option because it's simply not available! Besides, the X1900XT is the better card. It's faster, less expensive and available. :)
 
Scali said:
Lack of FP16 blending is why the X1800 is slaughtered in 3DMark06.

Hmmm, first I have heard of this... curious, do you have any data to support this?


But if you have proof of vertex texture support and FP16 (or better) blending support on X1900, I'd like to see it. I haven't found it anywhere.

I think you're right....

However, I think the 1900XT is better prepared, as it looks like having more shading power will be more helpful than having FP16 (or better) blending... but just a guess :)
 
Jbirney said:
Hmmm first I have heard of this... curious do you have any data to support this???

Not officially...

Jbirney said:
However, I think the 1900XT is better prepared, as it looks like having more shading power will be more helpful than having FP16 (or better) blending... but just a guess :)

I'm not sure. More shader power isn't really a substitute for FP16 blending, let alone vertex texturing. So that's why I'm saying the 7800 is the better card, assuming of course I'm right about these things. If the X1900 does have all the features, that changes things, of course.
 
Codename: R580
Process technology: 90 nm
Over 400 million transistors (the G70 contains 302 million)
FC package (flip-chip, flipped chip without a metal cap)
256 bit memory interface
Up to 1 GB of GDDR-3 memory
PCI Express 16x
48 pixel processors
16 texture units
Calculating, blending, and writing up to 16 full (color, depth, stencil buffer) pixels per clock
8 vertex processors
FP32 processing throughout the pipeline (vertices and pixels)
SM 3.0 support (Shaders 3.0) including dynamic branching in pixel and vertex processors.
Attention: there is no vertex texture fetch.
Effective branching
Support for FP16 format: full support for output into a frame buffer in FP16 format (including any blending operations and even MSAA). FP16 texture compression, including 3Dc+.
Attention: no support for hardware filtering during FP16 texture sampling.

New RGBA (10:10:10:2) integer data type in a frame buffer for higher quality rendering without FP16.
New high-quality algorithm for anisotropic filtering (the user is given a choice between faster or higher-quality anisotropy options), improved trilinear filtering
Support for "double-sided" stencil buffer
MRT (Multiple Render Targets — rendering into several buffers)
Memory controller with a 512-bit internal ring bus, two 256-bit contradirectional rings, (4 memory channels, programmable arbitration).
Efficient caching and a new more effective HyperZ implementation

from: http://www.digit-life.com/articles2/video/r580-part1.html

hope that answers your question
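That spec list's 48 pixel processors against 16 texture units is exactly the 3:1 ALU:TEX ratio discussed earlier in the thread. A rough back-of-envelope comparison (Python); the clock speeds and the G70 unit counts are the commonly quoted retail figures, not taken from the list above, so treat the numbers as assumptions:

```python
# Rough peak-throughput comparison of the shader-heavy R580 vs the G70.
# Clock speeds and G70 unit counts are the commonly quoted retail figures
# (X1900 XTX at 650 MHz, 7800 GTX 512 at 550 MHz); approximate only.

def peak_shader_ops(shader_units, core_mhz):
    # Very rough peak: units * clock. Real throughput depends on the
    # instruction mix, co-issue and scheduler efficiency.
    return shader_units * core_mhz * 1e6

gpus = {
    "X1900 XTX (R580)":   {"alus": 48, "tmus": 16, "mhz": 650},
    "7800 GTX 512 (G70)": {"alus": 24, "tmus": 24, "mhz": 550},
}

for name, g in gpus.items():
    ratio = g["alus"] / g["tmus"]
    peak = peak_shader_ops(g["alus"], g["mhz"])
    print(f"{name}: ALU:TEX = {ratio:.0f}:1, ~{peak / 1e9:.1f}G shader ops/s")
```

On paper the R580 has more than twice the raw shader rate, which only pays off in games whose shaders are ALU-bound rather than texture-bound; that is the whole bet behind the 3:1 design.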
 
The Fetch4 feature is their workaround for it; if a developer ever wants to add it, they can, and honestly we will probably never see these features in full bloom in any game.
 
Trimlock said:
The Fetch4 feature is their workaround for it; if a developer ever wants to add it, they can, and honestly we will probably never see these features in full bloom in any game.

Fetch4 is their workaround for shadow maps; I don't see how you can do texture filtering with it. And texture filtering has been a standard feature since the early Voodoo days; all games will use it for FP textures when available, and it can make quite a difference in image quality (you know, blocky textures, like back in the software-rendering days...).
And of course you can't do vertex texturing with it either.
 
Scali said:
Not officially...

OK, just was wondering, as I have heard rumblings that the reasons some of the ATI cards were slow are due to different things... but no one is sharing any real data, sooo :) :)


Scali said:
I'm not sure. More shader power isn't really a substitute for FP16 blending, let alone vertex texturing.

Oh, and I am not saying it is, just that from reading Tim S, John C and other developers' recent postings, they are all saying that you will see games' ALU-to-texture ratio increasing in favor of the ALUs. Now, none of them are going to say what that ratio is, so maybe ATI's 3:1 is a good choice, maybe it's not. But we have games out today (FEAR/CS:CT) that really have high ALU:TEX ratios...



Scali said:
And it also says that it still doesn't support vertex texture fetch.

There is a documented workaround that ATI says is faster for both their cards and NV, if you care....
 
Jbirney said:
OK, just was wondering, as I have heard rumblings that the reasons some of the ATI cards were slow are due to different things... but no one is sharing any real data, sooo :) :)

Nope, that's because only Futuremark and probably ATi know the real answer. ATi isn't going to reveal its own weaknesses, and Futuremark can't reveal any weaknesses either, because they have to remain neutral towards all IHVs.

Jbirney said:
Oh, and I am not saying it is, just that from reading Tim S, John C and other developers' recent postings, they are all saying that you will see games' ALU-to-texture ratio increasing in favor of the ALUs. Now, none of them are going to say what that ratio is, so maybe ATI's 3:1 is a good choice, maybe it's not. But we have games out today (FEAR/CS:CT) that really have high ALU:TEX ratios...

Yes, but what does that have to do with FP16 blending or texture filtering?

Jbirney said:
There is a documented workaround that ATI says is faster for both their cards and NV, if you care....

That's not a workaround, there can't be a workaround. If you can't read textures inside a vertex shader, you can't read textures inside a vertex shader.
Of course, ATi spin doctors could take some technique that *could* be implemented with vertex texturing, but also with render-to-texture, and then make it so that the vast number of pixel pipelines outperform the modest number of vertex units, and then try to make people believe that any technique can be implemented without vertex texturing, and that it will always be faster with pixel shaders. Problem is, a lot of people actually buy that sort of crap... But we already covered that part elsewhere.
 
This is a good question that I may have some input on soon.

I ordered a GTX 512, waited forever, and cancelled it when the X1900 XTX came out. I bought an XTX at 6:00 AM on the 24th.... My friend happened to get lucky and actually got the BFG 512 MB card back last November; what sucks is he ordered only three days before I did, and he actually got his.

Now he has decided to buy two Dell 30" LCD screens... the only catch is the GTX does not support their native res and the XTX does. I am considering doing a card swap for a while to help him out, since the XTX does 2560x1600 on each dual-link DVI connector.

BUT: After I played a little Quake4 online with max settings... I don't know... it was damn sweet. I can't see the GTX being any better, and the XTX still has a major performance increase coming in the next release of the Catalyst drivers...

I think I'll convince him to ebay his BFG and buy an XTX.
 
The X1900 series is a better card. The card is brand new, has great performance and great visual quality, and it's available. Even if going dual GPU, you can get a CrossFire board and slap two X1900s together, and I'm sure it will be fast as hell.

I would get the X1900 in both cases, single and dual. The 7800, although fast enough for current games, is still older technology.
 
BBA said:
BUT: After I played a little Quake4 online with max settings... I don't know... it was damn sweet. I can't see the GTX being any better, and the XTX still has a major performance increase coming in the next release of the Catalyst drivers...
That is because it isn't better. ;) I wouldn't trade my X1800 or X1900 for any 7800 line card on the market.

BBA said:
I think I'll convince him to ebay his BFG and buy an XTX.
That is a good idea!
 
I think they caught up; hell, they were in the lead last round in terms of price and performance in my eyes.
The thing that killed them was their {S}oft Launch.
We all know that if the X1800 had been out at the same time as the 7800 launch, it would have been a race too close to call. Nonetheless, Nvidia won three quarters of last round, which really does make them the winner.
 
Endurancevm said:
The X1900 series is a better card.

Even though it lacks some rather important features?

Endurancevm said:
I would get the X1900 in both cases, single and dual. The 7800, although fast enough for current games, is still older technology.

I'd say the 7800 is newer technology, because of the features it already supports and the X1900 still lacks.
 
All I have to say is I'm glad to see ATI back in the game. It's like seeing Jeff Gordon come from behind in the middle of a race and come out first... not that ATI is "first". I feel both cards are great, and with all this waiting I'm not surprised at all that this new ATI card has a few extras over the 7800, more or less. I think both cards have their great qualities, and I really feel you can't compare these cards; I think the X1900XTX should be compared with Nvidia's next card, seeing how the X1900XT was the 512's match-up and the 256 was the X1800's (in my mind). Right now I'm going for price over performance... I'm waiting for UT2k7 benches; I want the best card at that time for that game...
 
ati=hdr+aa

Something I definitely need, and it was one of the major deciding factors, along with availability and a cheaper price, in getting an X1900XT. Got the X1900XT a few days ago and clocked it to XTX speeds on air. Games look beautiful at 1920x1200 with 4xAA and 16xAF with HDR+AA at the same time, and it runs like butter!
 
Scali said:
Yes, but what does that have to do with FP16 blending or texture filtering?


Think he is getting at FP16 blending being done through a shader on ATi cards ;)
 