ATI's x800xt pe leads the 6800Ultra OC by a wide margin in new Far Cry benchmarks, 2nd August.

Blad3

Gawd
Joined
Dec 15, 2003
Messages
537
http://www.firingsquad.com/hardware/leadtek_winfast_a400_ultra_tdh/ - posted 2nd of August.

Now check out the 1600x1200, 4xAA ones - assuming full detail, which is all I care about:

There's one where the x800xt is 19fps(!) ahead of the 6800Ultra - which I believe is the OCed one: http://www.firingsquad.com/hardware/leadtek_winfast_a400_ultra_tdh/page15.asp

Generally, at this level, the x800xt pe appears to be leading by at least 12fps in almost all the high-res, high-detail benchmarks - at least as soon as 4xAA and 8xAF is turned on.

What's going on? I thought Nvidia's 6800Ultra was level with, if not ahead of, the x800xt in DX9 now? IMO this would mean the current "Half Life 2 is 30% faster on x800xt cards compared with Nvidia's latest" claim may actually be true to some extent...

Thoughts? Counter-benchmarks? [Note: I originally looked at this site to see how much better Nvidia was doing in Far Cry now.] If Ati can gain FPS in Doom3, like they say they will...could we be seeing a take-over???

I'm going mad from all this. :confused: :D

EDIT: 5th August.

I posted this information further on in the thread:

Blad3 said:
I double-checked the latest x800xt vs 6800ultra benchmarks in Far Cry, most/all of them using SM 3.0, and here are the results.

1600x1200 4xAA 8xAF Results: * I originally went to these sites expecting to see Nvidia's GeForce 6800 Ultra winning.

http://www.anandtech.com/video/showdoc.aspx?i=2113&p=7
ATi XTPE stock wins by 9fps vs OCed 6800 Ultra "Extreme".
=====
http://techreport.com/etc/2004q3/farcry/index.x?pg=4
ATi by 9fps. Equal the other times.
=====
www.firingsquad.com's latest (see original post)
ATi by 7-19(!) fps
=====
www.Firingsquad.com's original/first:
ATi by 7-13fps.
=====
http://www.pcper.com/article.php?aid=55&type=expert&pid=6
Uses older ATi Cat 4.6, yet the new SM3.0 support for Nvidia: Nvidia wins one and ATi wins one, by 3-4fps each.
=====
http://www.xbitlabs.com/articles/video/display/graphics-cards-2004_17.html
ATI by 8fps.
=====
http://graphics.tomshardware.com/graphic/20040723/his-07.html
ATi 6fps (1.1)
=====
http://www.extremetech.com/article2/0,1558,1620524,00.asp
Ati by 5-10fps
=====
http://www.ixbt-labs.com/articles2/gffx/fc12.html
ATi by 3-8fps
=====
* I didn't show some other sites because they didn't seem to show 4xAA/8xAF results or they didn't directly compare the 6800U to the x800xt.

Summary:

Far Cry - Both cards can play at 1600x1200 4xAA, 8xAF, full detail.
Doom 3 - the 6800Ultra wins by about 7-15fps generally; with it you can play at 1600x1200 Ultra Quality, maybe 2-4xAA. From user reports you can play on the same settings with the x800xt at acceptable fps - Ultra Quality 1600x1200, maybe 2-4xAA - even on a relatively "low-end" machine, i.e. an A64 3000+ with 1Gb.
Half Life 2 - We'll see. To reiterate - the card that wins will be the card I buy.
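The per-site tallies above can be rolled up into one rough number. A minimal sketch, assuming single figures are taken as-is, ranges are reduced to their midpoints, and the split pcper result is treated as a wash - all my own simplifications, not anything the reviews themselves compute:

```python
# Reported X800 XT PE fps leads over the 6800 Ultra at 1600x1200
# 4xAA/8xAF, as (low, high) ranges taken from the list above.
# pcper showed one win each by ~3-4fps, treated here as a wash (0).
leads = {
    "anandtech":       (9, 9),
    "techreport":      (9, 9),
    "firingsquad_new": (7, 19),
    "firingsquad_old": (7, 13),
    "pcper":           (0, 0),
    "xbitlabs":        (8, 8),
    "tomshardware":    (6, 6),
    "extremetech":     (5, 10),
    "ixbt":            (3, 8),
}
midpoints = [(lo + hi) / 2 for lo, hi in leads.values()]
avg_lead = sum(midpoints) / len(midpoints)
print(f"average X800 XT PE lead: {avg_lead:.1f} fps")  # -> 7.6 fps
```

By that crude averaging the lead works out to roughly 7-8fps across the nine sites, in line with the mostly single-digit gaps the individual reviews report.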
 
WTF? Can anyone comment on that? I was just about to get a 6800GT...
 
Well, looking at the results, all cards produce acceptable framerates for a decent playing experience (30fps minimum). With this in mind I would not base my buying decision on this game alone.
 
Yeah, I was just reading about that somewhere else.

What I find odd is the pretty darn big jump from SM2.0 to SM2.0B... It is interesting. If I am understanding correctly, the path used before was SM2.0, right?

Man, I kinda find this stuff a little alarming. It's like each company is trying to lure game companies to code for their hardware. More than that, though, it seems like each one is going a different way with different types of tech - more so than ever that I can recall (I am tired though). Meaning each card might provide a totally different experience in almost ANY game (depending on how the game company codes it).

Am I close? If so that is really lame

Almost makes you want to buy a console. Too bad I hate consoles and was PC gaming way back. Too long to be cured.
 
InkSpot said:
Well, looking at the results, all cards produce acceptable framerates for a decent playing experience (30fps minimum). With this in mind I would not base my buying decision on this game alone.


LMFAO, what, like all you lot have decided on getting a 6800GT based on D3???? Gimme a break....the hypocrisy is unbelievable on here.....I think when the HL2 official benchies come out, there are gonna be a lot of red-faced 'jumpers' around here......we all knew the XT PE was better in FarCry and always will be.....same with HL2 and just about every other game out there.....like I've said before and will a million times again....the XT PE is the best card for the majority of games out there.......and you know what *SHOCK HORROR* it even runs Doom 3 shit hot as well *GASP*

Yeah, Nvidia have a good product, but the sheer horsepower of the XT PE is hard to beat......personally, even when the OGL rewrite is out, I think the 6800 series will still be better for OGL games....but for me and many others...I play more DX games.....and yeah, 10fps is hard to notice at the sort of fps we are looking at, but I'd rather have the card that has the fps gap in most of the games I play......oh yeah, and did I mention that Doom 3 runs shit hot on the XT PE? lol
 
I'm just waiting for burningrave101 to jump in this thread. Things might get interesting.
 
The excuse you will hear from 6800 owners is, "wait until the drivers get good" lol.
 
Wich0 said:
I'm just waiting for burningrave101 to jump in this thread. Things might get interesting.

ROFL - best comment possible for this thread, I give you that!
 
MFZ said:
The excuse you will hear from 6800 owners is, "wait until the drivers get good" lol.

Isn't the 61.45 nvidia driver being used an old driver?
 
Most people don't run at 1600x1200 with AA and AF. In the future, these cards will not run many games at those settings playably, so don't worry about getting a 6800 GT or Ultra.
 
MFZ said:
The excuse you will hear from 6800 owners is, "wait until the drivers get good" lol.

"But SM3.0 wasn't properly implemented!", "Wait until the 1.3 patch is released!" and by extension "But Nvidia has HDR and I'd take the superior image any day!", "But ATI has brilinear "optimizations"!", "Nvidia is getting brilinear soon and then you can compare! For now, add at least 25% more fps onto the score to get an idea of where they'll be at.", "Far Cry is only one game, and all future games will use the Doom 3 engine anyway!", "Nvidia still has the superior DirectX 9 support and so all future games will run better on Nvidia hardware!", and "I've got sand in my coot!" would all do as well.

With that out of the way, maybe a certain 3 or 4 individuals don't need to post and this thread can continue without turning into an Nvidia circle jerk. Then again, maybe that's asking too much.
 
ati passes nvidia, nvidia passes ati, ati passes nvidia, nvidia passes ati, ad nauseam
amd passes intel, intel passes amd, amd passes intel, intel passes amd, ad nauseam

all fan boys of any product should just stick their head in their own butt and be quiet

with video cards it's even worse because you get different performance on each game title, and you've got mfg's bribing the game designers to get better performance specifically with their game/card...it's getting old

I just buy what has the best $$/perf ratio at the time I make my purchase, and try to make an overall assessment of the card's abilities, not whether or not it's faster in one game because the game was optimized for it.

PS, I'm not a fanboy of nvidia; my last card was a 9800 pro, which I was fairly happy with
 
Blad3 said:
http://www.firingsquad.com/hardware/leadtek_winfast_a400_ultra_tdh/ - posted 2nd of August.

Now check out the 1600x1200, 4xAA ones - assuming full detail, which is all I care about:

There's one where the x800xt is 19fps(!) ahead of the 6800Ultra - which I believe is the OCed one: http://www.firingsquad.com/hardware/leadtek_winfast_a400_ultra_tdh/page15.asp

Generally, at this level, the x800xt pe appears to be leading by at least 12fps in almost all the high-res, high-detail benchmarks - at least as soon as 4xAA and 8xAF is turned on.

What's going on? I thought Nvidia's 6800Ultra was level with, if not ahead of, the x800xt in DX9 now? IMO this would mean the current "Half Life 2 is 30% faster on x800xt cards compared with Nvidia's latest" claim may actually be true to some extent...

Thoughts? Counter-benchmarks? [Note: I originally looked at this site to see how much better Nvidia was doing in Far Cry now.] If Ati can gain FPS in Doom3, like they say they will...could we be seeing a take-over???

I'm going mad from all this. :confused: :D

Did nVidia wrong you in a past life or something? This is the second anti-nVidia thread you've started today...you're like the unofficial ATI spokesperson or something. Get a real job, for crying out loud
 
PadanFain, where do they show the difference between sm2 and sm2b? I looked through this article and I could not find it. I'm curious about the difference.

Did the 61.45 drivers have support for sm3?

Thanks!
 
DemonDiablo said:
Did nVidia wrong you in a past life or something? This is the second anti-nVidia thread you've started today...you're like the unofficial ATI spokesperson or something. Get a real job, for crying out loud

It's no worse than the constant drivel about ATI and Doom 3......there is a bigger difference in FarCry than in Doom 3. All the Nvidiots have had their shout; now let the ATI side have their fun......I CANNOT WAIT.......till the HL2 benchies are out.......that'll be 2-1 to the reds then ;) Well actually *counts all the other games the XT PE is faster at......hmm, might be here for a while* ;)
 
Both cards are excellent. So one is 20% faster than the other, even? It's not going to extend the card's life by that much more.

I originally ordered 2 x800xt-pe's from Gateway. We all know how that went. After giving up and seeing the 6800GT was still a decent card, I grabbed 2 and haven't looked back.

Before that I was the owner of two 9500's soft-hacked to 9700. I've also at one time or another owned 3-4 ti4200's, a voodoo1, voodoo2's in SLI config, 2 voodoo3's, a TNT1, and a geforce2 GTS 64MB, as well as a 5900XT (which is being replaced by XFX with a 6800LE). Ain't no fanboy here; whatever is available at my price range, I grab.

Rob
 
Tigerblade said:
It's no worse than the constant drivel about ATI and Doom 3......there is a bigger difference in FarCry than in Doom 3. All the Nvidiots have had their shout; now let the ATI side have their fun......I CANNOT WAIT.......till the HL2 benchies are out.......that'll be 2-1 to the reds then ;) Well actually *counts all the other games the XT PE is faster at......hmm, might be here for a while* ;)


Wow.... just wow. I don't know why people care so much. I bought my 6800GT because of the price and performance. I can't afford an X800 Pro or the like, and I got a good deal on my BFG 6800GT OC. Honestly I don't care who is winning. I bought my card for the value.

You guys who slam each other about which card is faster act like it really matters who you side with. It's not a presidential debate. It's not like ATI will offer tax breaks or something, or Nvidia will end the war. Lord, people. :rolleyes:
 
Quite honestly it's very strange that anybody would even care about minute differences such as this.

This is like the competition between the 5950 Ultra and the 9800 XT... just when everybody figured it all out, the X800 series and 6800 series came out... by the time everybody has an agreement on this bugger (i.e. when HL2 comes out) we'll be only a number of months away from the XI800's and the 7800 line... and those will blow everything we have out of the water (don't tell me that the speed jump made this time has "never been seen before", because it happens about once every 12 months or so).

My honest opinion? Many people seem to be saying that buying a graphics card just for one game (Doom 3) is silly. But if that one game is the only game that really needs a powerful graphics card (i.e. you're running everything else at 1600x1200 4xAA/8xAF at 60+ fps) then that's the game to cater to.

I think Half Life 2 will be pwned by both ATI and NVidia's latest cards (seeing that it was meant to play on the 9800 XT's and even the 9600 XT's very well) and the resulting spew of games based on the Source engine and the Doom 3 engine will be keeping us afloat until the Unreal 3 engine hits home.

So what if your X800 Pro pwns Half Life 2... so will the 6800 GT's... but the 6800 GT's will be eating up Doom 3... and all the resulting spins off the engine... and the X800 Pro won't be.

Even if the Doom 3 engine is never ever used again, X800 people will still never be able to play it as well as the 6800 folk.

Then again, once the next upgrade season comes 'round, we'll all be playing it at max everything for sub $300
 
Robstar said:
Both cards are excellent cards. So one is 20% faster than another, even? It's not going to extend the cards life by that much more.

I originally ordered 2 x800xt-pe's from Gateway. We all know how that went. After giving up and seeing the 6800GT was still a decent card, I grabbed 2 and haven't looked back.

Before that I was the owner of two 9500's soft-hacked to 9700. I've also at one time or another owned 3-4 ti4200's, a voodoo1, voodoo2's in SLI config, 2 voodoo3's, a TNT1, and a geforce2 GTS 64MB, as well as a 5900XT (which is being replaced by XFX with a 6800LE). Ain't no fanboy here; whatever is available at my price range, I grab.

Rob

I completely agree mate.....ATI's problem this round is availability.....IF they had brought them out on time they would have been in a strong position....

My argument is the hypocrisy and pure BS.......the XT PE IS the fastest card in the MAJORITY, not ALL, of games, but this fact seems to be lost on a lot of fanboys....
 
Wich0 said:
I'm just waiting for burningrave101 to jump in this thread. Things might get interesting.


And tranCendenZ too, of course


I can't believe neither of those two has jumped on an ati-bashing/nvidia-praising thread within the first 30 seconds
 
Actually my biggest worry is that once the GeForce 9800 series comes out, I won't be able to find any info on it, because all this Radeon 9800 stuff will come up... probably less of a problem, but it was really hard to find any info on the 6800 non-Ultras for a while, since searches kept pulling up GT's and Ultras.
 
I don't find this surprising at all. The x800 is a great card any way you slice it. We'll only continue to see driver improvements from ATI, especially with the outcome of the Doom 3 findings regarding OGL. All in all, good. Thanks for the links......
 
Emret said:
And tranCendenZ too, of course


I can't believe neither of those two has jumped on an ati-bashing/nvidia-praising thread within the first 30 seconds

or fallguy ;)

I suppose the three of them must be sleeping or something :confused:
 
Tigerblade said:
My argument is the hypocrisy and pure BS.......the XT PE IS the fastest card in the MAJORITY, not ALL, of games, but this fact seems to be lost on a lot of fanboys....

I agree with this completely. I'm not biased one way or the other; in fact, I'm hoping to get an internship from nVidia next summer (Electrical Engineering at the University of Michigan). However, it seems to me that a lot of people are looking too deep into this issue, and not stepping back to see the bigger picture.

Instead of counting frames, let's just look at the physical cards themselves: the X800XT is a single-slot solution requiring only one molex connection, while the 6800 Ultra needs two slots and two power leads. While I'm absolutely sure we all knew this already, what do you think would happen if ATI threw in another molex and an extra-large dual-slot cooling solution? They could run up their clock speeds and stomp on nVidia, most likely. Sorry nVidia, I love ya, but it's true. The reason ATI doesn't do this is because they don't have to.

I think that most of us are too caught up in a dozen frames per second of deviation, when in real-world performance this makes no significant impact. While I'm sure many of you will argue the previous statement, just think: a year from now, we won't even be talking about these cards anymore. I love having the highest-of-high-end components just as much as the next guy here, so I understand the need for a few frames and the subsequent bragging rights. I guess all that I'm saying is that ATI seems to be laying out a strong foundation for the future, and right now nVidia is playing catch-up with their large cooling and power needs. However, this fight will continue to go back and forth, so there is nothing to worry about; neither company is going anywhere for a long time. Hopefully I can help those guys out on the west coast "turn the tides of war" next summer :) .
 
If you want to spend $400, get a GT.

If you want to spend $500, get an XT PE.

Simple.
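That price/performance rule of thumb a couple of posters mention can be made concrete. A minimal sketch - the $400/$500 prices come from the post above, but the fps figures are hypothetical placeholders, not numbers from any review in this thread:

```python
# Hypothetical $$/perf comparison. Prices are the ones quoted in the
# thread; the fps values are made-up placeholders - substitute averages
# from whichever benchmark suite you trust.
cards = {
    "6800 GT":    {"price": 400, "fps": 52},
    "X800 XT PE": {"price": 500, "fps": 60},
}
for name, c in cards.items():
    value = c["fps"] / c["price"] * 100  # frames per second per $100
    print(f"{name}: {value:.1f} fps per $100")
```

With those placeholder numbers the GT comes out slightly ahead per dollar even while losing outright - exactly the trade-off the "best $$/perf at purchase time" approach weighs.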
 
Hmm, interesting thread. Do you know if ATI is running the SM 2.0b path and Nvidia is running the SM 3.0 path? Patch 1.2 has bugs for ATi where it doesn't draw 50% of the textures. Think about patch 1.2 and ATi cards and rendering problems - this benchmark is BS.
 
JayD said:
I agree with this completely. I'm not biased one way or the other; in fact, I'm hoping to get an internship from nVidia next summer (Electrical Engineering at the University of Michigan). However, it seems to me that a lot of people are looking too deep into this issue, and not stepping back to see the bigger picture. Instead of counting frames, let's just look at the physical cards themselves: the X800XT is a single-slot solution requiring only one molex connection, while the 6800 Ultra needs two slots and two power leads. While I'm absolutely sure we all knew this already, what do you think would happen if ATI threw in another molex and an extra-large dual-slot cooling solution? They could run up their clock speeds and stomp on nVidia, most likely. Sorry nVidia, I love ya, but it's true. The reason ATI doesn't do this is because they don't have to. I think that most of us are too caught up in a dozen frames per second of deviation, when in real-world performance this makes no significant impact. While I'm sure many of you will argue the previous statement, just think: a year from now, we won't even be talking about these cards anymore. I love having the highest-of-high-end components just as much as the next guy here, so I understand the need for a few frames and the subsequent bragging rights. I guess all that I'm saying is that ATI seems to be laying out a strong foundation for the future, and right now nVidia is playing catch-up with their large cooling and power needs. However, this fight will continue to go back and forth, so there is nothing to worry about; neither company is going anywhere for a long time. Hopefully I can help those guys out on the west coast "turn the tides of war" next summer :) .

Tell me this: .11 micron is the next step up for these cards, right? Is there low-k for .11? No there isn't; ATI has to go to .09 micron to get low-k, which probably won't happen for at least 2 product lines. Hmm, heat problems? More power usage?
 
rancor said:
Tell me this: .11 micron is the next step up for these cards, right? Is there low-k for .11? No there isn't; ATI has to go to .09 micron to get low-k, which probably won't happen for at least 2 product lines. Hmm, heat problems? More power usage?

I think what he is saying is that the XT PE is faster than the Ultra NOW without the extra molex and cooling solutions.....bang them on and then what happens? The gap would widen....
 
UMMM, they are using driver version 61.45 on the nvidia cards - I thought 61.77 or higher was required for SM3.0?

Also, they are only using DX9.0b - again, don't you need 9.0c for SM3.0?

Yet they claim to be using the SM3.0 path in FarCry....


SHENS
 
Tigerblade said:
I think what he is saying is that the XT PE is faster than the Ultra NOW without the extra molex and cooling solutions.....bang them on and then what happens? The gap would widen....

He was saying ATi's tech is better and that with the next versions of the cards ATi will be way ahead, which is not going to happen, because they won't be able to use low-k in their next chips unless they go to .09 micron, and they won't be able to clock as high without low-k.

Also, the level tested in these benchmarks is one ATi has always had a lead in, so again the benchmarks are skewed versus the other levels, like Volcano and a few others.

61.45 were the first SM3.0 drivers released to the public, but 61.77's SM3.0 is faster.
 
JayD said:
I agree with this completely. I'm not biased one way or the other; in fact, I'm hoping to get an internship from nVidia next summer (Electrical Engineering at the University of Michigan). However, it seems to me that a lot of people are looking too deep into this issue, and not stepping back to see the bigger picture. Instead of counting frames, let's just look at the physical cards themselves: the X800XT is a single-slot solution requiring only one molex connection, while the 6800 Ultra needs two slots and two power leads. While I'm absolutely sure we all knew this already, what do you think would happen if ATI threw in another molex and an extra-large dual-slot cooling solution? They could run up their clock speeds and stomp on nVidia, most likely. Sorry nVidia, I love ya, but it's true. The reason ATI doesn't do this is because they don't have to. I think that most of us are too caught up in a dozen frames per second of deviation, when in real-world performance this makes no significant impact. While I'm sure many of you will argue the previous statement, just think: a year from now, we won't even be talking about these cards anymore. I love having the highest-of-high-end components just as much as the next guy here, so I understand the need for a few frames and the subsequent bragging rights. I guess all that I'm saying is that ATI seems to be laying out a strong foundation for the future, and right now nVidia is playing catch-up with their large cooling and power needs. However, this fight will continue to go back and forth, so there is nothing to worry about; neither company is going anywhere for a long time. Hopefully I can help those guys out on the west coast "turn the tides of war" next summer :) .

The 6800 GT can easily be overclocked to match/exceed the stock Ultra, and that's a single-slot, single-molex card, so nVidia obviously didn't have to increase the size and power requirements of the Ultra.

One can easily say that it's a wonder the 6800 series keeps up even with a significant core speed disadvantage. "Imagine if the XT PE were dual slot, dual molex?" Well, one should also consider the level of performance that could be offered by a 520MHz Ultra refresh on low k... I think the 6800 platform has at least as much "in reserve" as the X800 does.
 
onetwo said:
One can easily say that it's a wonder the 6800 series keeps up even with a significant core speed disadvantage. "Imagine if the XT PE were dual slot, dual molex?" Well, one should also consider the level of performance that could be offered by a 520MHz Ultra refresh on low k... I think the 6800 platform has at least as much "in reserve" as the X800 does.
That's a pretty stupid thing to say, though. If people realized clock speed is pretty damn meaningless overall, we'd stop getting these penis-size contests over clock speeds. Last I checked, an Athlon 64 runs at a slower clock speed than a P4, but in some things it's faster. And in some things the P4 is faster. Wait, this sounds really familiar.
 
well, at the end of that review they seemed to lean more towards the 6800 being better, but I dunno

I personally don't care about the performance in Far Cry - I don't have it and I don't plan on getting it. I'm mostly concerned about how the 6800 will do in games like Doom 3 and Half Life 2 with ATI's new driver releases, but you never know - nvidia may make a newer driver that puts them back ahead
 
emorphien said:
That's a pretty stupid thing to say, though. If people realized clock speed is pretty damn meaningless overall, we'd stop getting these penis-size contests over clock speeds. Last I checked, an Athlon 64 runs at a slower clock speed than a P4, but in some things it's faster. And in some things the P4 is faster. Wait, this sounds really familiar.
You thought you'd improve his 'stupid' post with one of your own?....
He wasn't trying to compare ati to nvidia at the same clock speeds... yes, we all know by now that's useless.
If I interpret correctly, he was trying to say that by increasing the clock speed on a card you of course get improved performance compared to the same card at a lower speed.
Why not read and assimilate the information before you start flaming about penises (peni?) and such. :rolleyes:
 
Vermicious Knid said:
You thought you'd improve his 'stupid' post with one of your own?....
He wasn't trying to compare ati to nvidia at the same clock speeds... yes, we all know by now that's useless.
If I interpret correctly, he was trying to say that by increasing the clock speed on a card you of course get improved performance compared to the same card at a lower speed.
Why not read and assimilate the information before you start flaming about penises (peni?) and such. :rolleyes:
Wow. You didn't pay attention, did you? I didn't say he said that; I was saying why it's a pointless argument to make. And then you go even further to prove you're just trying to flame me personally rather than a concept, because you admit you're only *interpreting* him. Well, there you go: assuming I was trying to attack him, what justification do you have for your direct flame on me, other than possibly interpreting what he said differently?
 
emorphien said:
That's a pretty stupid thing to say, though. If people realized clock speed is pretty damn meaningless overall, we'd stop getting these penis-size contests over clock speeds. Last I checked, an Athlon 64 runs at a slower clock speed than a P4, but in some things it's faster. And in some things the P4 is faster. Wait, this sounds really familiar.

???

The point of my post is that the X800 also has its own major benefits that haven't yet been tapped by the competition, therefore one cannot conclusively state that one platform is superior to the other.

He said, "Wow, the XT PE keeps up and sometimes beats the 6800U even though it's only a single slot, single molex solution. Imagine if the card had the better cooling and increased power supply of the 6800..."

One can easily turn this on its head and say, "Wow, the 6800U keeps up and sometimes beats the XT PE even though it's manufactured on a standard .13 micron process. Imagine if the card were on low-k and could be physically clocked higher..."
 
You said, and I quote:
emorphien said:
That's a pretty stupid thing to say though.

You then went on to rant about clock speeds and penis size being unimportant, citing the P4 vs A64 example. Of course, posting that right after a post about the clock speeds of an Nvidia card, on a video card forum, draws the correlation to ati/nvidia clock speeds.
Arguing an argument that's not being argued - now THAT'S pointless.

If that was not the point of the post, then what was it?
Perhaps all the synapses in my brain are not firing correctly and I'm INTERPRETING everything wrong. That's one of the hazards of forums: you have to interpret what people are really trying to say, as many choose their words poorly, have a poor grasp of the English language, or just don't care.
 
wizzackr said:
WTF? Can anyone comment on that? I was just about to get a 6800GT...

Look at it this way: if you are buying a new video card for one game, Doom3, get the GT. If you want a card that is going to be consistently as fast or faster in all games, go with ATI.
 