FiringSquad CS: S Benches

Status
Not open for further replies.
burningrave101 said:
IMO though, the 6800U is still the superior card because it's based on new technology instead of old and has more room to grow with driver updates. It's also superior to the XT in OpenGL and Linux, and using the latest beta drivers it's winning in the majority of D3D games at 1600x1200 w/ 4xAA + 16xAF. The 6800U also almost always wins when AA/AF isn't enabled, no matter what the resolution.

What majority of D3D games is it winning? Last I checked it was still LOSING in the majority of popular D3D games, especially in games like BF:V, BF1942, JO, and UT2004. In CS:S it's ahead by a few fps atm according to FS's results, but that can and likely will change. Not to mention the delta between the cards was never big in CS:S to begin with, so I don't know why everyone is so surprised that the 6800U is edging out the XT PE in this port.

The 6800U also has a lot of cool driver features the ATI cards don't have, such as digital vibrance

I was a fan of this back in the Ti4600 days when I had my old 17" monitor. So I had fond memories of it until I bought a 6800 GT for my 21" and realized it's overrated crap.

application profiles, coolbits, a plethora of legacy AA modes, 8xS which is superior to the AA modes ATI supports, better full trilinear filtering support, a little better AF quality from what I've gathered, fewer game bugs, and new beta drivers which are constantly being leaked.

I'll agree about the supersampling modes; those are definitely nice for legacy games, and it's a shame ATi doesn't support them on the PC (I think they do have them for Macs). As for filtering quality, the shimmering issue is a big problem with the nVidia 6800 cards and it has not been fixed. The optimized AF doesn't match ATi's, and now the filtering with opts off suffers from the same problem.

And let's not forget support for SM 3.0 and USII technology.

By the time SM3 is even remotely useful these cards will be too dated/slow to matter. We're all enthusiasts here that tend to upgrade frequently; most of us will have moved on to newer gen cards. This feature may be OK for guys that upgrade every 2 yrs and want full compatibility, so I'll agree that much, but for enthusiasts it's a moot point.

So IMHO the cards are pretty much a draw at this point; both have their ups and downs. There is no clear winner at the top end, but I will say that nVidia did win the mainstream market because it clearly has a better priced lineup. Specifically the 6800 GT: it's priced well and available in most stores. ATi has released the X800XT, but it's too late and still priced too high to be an effective competitor. nVidia also wins with the 6600GT in the lower segment. So overall the round goes to nVidia for availability and price competitiveness in the mid and low segments. I'm sure ATi has learned their lesson and the next gen will be here very soon.
 
MFZ said:
What majority of D3D games is it winning? Last I checked it was still LOSING in the majority of popular D3D games, especially in games like BF:V, BF1942, JO, and UT2004. In CS:S it's ahead by a few fps atm according to FS's results, but that can and likely will change. Not to mention the delta between the cards was never big in CS:S to begin with, so I don't know why everyone is so surprised that the 6800U is edging out the XT PE in this port.

Maybe you could link me to a few of these reviews you've been viewing? I haven't kept up on the latest reviews for a couple of months, so I probably missed a few of the good ones.

http://www.firingsquad.com/hardware/gigabyte_geforce_6800_ultra/page7.asp

There is one using the 65.76 drivers.

http://www.firingsquad.com/hardware/leadtek_winfast_a400_ultra_tdh/page8.asp

There is one using the 61.71 drivers.

Far Cry is one of the only D3D games where I've actually seen a benefit to the X800XT PE over the 6800U, and that's only if they're testing with the 1.2 patch with the 2.0b and 3.0 paths enabled. And we all know the Far Cry 1.2 patch was pulled and may or may not have the 2.0b path when it's re-released.

MFZ said:
By the time SM3 is even remotely useful these cards will be too dated/slow to matter. We're all enthusiasts here that tend to upgrade frequently; most of us will have moved on to newer gen cards. This feature may be OK for guys that upgrade every 2 yrs and want full compatibility, so I'll agree that much, but for enthusiasts it's a moot point.

Over a dozen games have announced support for SM 3.0 already. And this is just the beginning. SM 4.0 is a LONG way off. HL2 will even have SM 3.0 support.

I'm a hardware enthusiast myself, but I'm also in college and can't afford to buy $400 video cards every 6 months just to stay on top of things.
 
I can see the reason for reviews, and I can see the reason for everyone wanting "their card of choice to win". I can't see the reason for spending $500 on a video card that has an absolutely finite lifespan in terms of gaming.

I just spent $145 on a 9800SE, upgraded from a GF4MX400. I game all the time. I guess I can sacrifice 32-bit color for 16-bit color, and I can sacrifice 1600x1200 in favor of 1024x768. I can get the desired level of FPS with what I have; it just requires compromise in a sense.
 
stiltner said:
I can see the reason for reviews, and I can see the reason for everyone wanting "their card of choice to win". I can't see the reason for spending $500 on a video card that has an absolutely finite lifespan in terms of gaming.

I just spent $145 on a 9800SE, upgraded from a GF4MX400. I game all the time. I guess I can sacrifice 32-bit color for 16-bit color, and I can sacrifice 1600x1200 in favor of 1024x768. I can get the desired level of FPS with what I have; it just requires compromise in a sense.

You should have thrown in a few extra dollars for a 9800 Pro. There's compromise and then there's just plain sense. For another 30 or 40 dollars, not only would you have had no need to compromise performance, you would have been set with a good card with a decent lifespan.
 
MFZ said:
I'll agree about the supersampling modes; those are definitely nice for legacy games, and it's a shame ATi doesn't support them on the PC (I think they do have them for Macs).

Not legacy only. I am currently playing Rome: Total War and PGA Tour 2005 with 8xSS, and they look very sweet. That's at 1280x1024, and there is an occasional dip, but it's very rare.
 
I play quite a few games with 8xS enabled. Elder Scrolls III: Morrowind is one of them. Some games are just too intensive to run it, like Far Cry, but in a lot of others you can and still get acceptable frame rates.
 
burningrave101 said:
Maybe you could link me to a few of these reviews you've been viewing? I havn't kept up on the latest reviews for a couple of months much so i probably missed a few of the good ones.

With the exception of UT2004 being pretty much dead even (it looks CPU-limited to me until 1600x1200) and the PE losing marginally in Flight Simulator:

VGA Charts IV

Guru3D X800 Pro Review

-for this one I'm specifically referring to D3D games like Halo and SC. Keep in mind this is the plain X800XT and not the PE, though; in some graphs it's just the XT vs the GT and in others it's the XT vs the GT/U.

Finally a DH review (save the biased comments, seems like a legit review to me): PNY 6800 Review

-Has a nice diverse set of games, which is a nice departure from the standard stuff. There's Colin McRae 5, Tribes: Vengeance and Ground Control 2.

If I had the energy I'd try to dig up more reviews, but there aren't very many recent ones, so I guess we'll have to wait a while till they show up. I'm betting there will be 1-2 new ones when Cat A.I. is released to everyone. I guess it really comes down to OGL if you want to declare a real winner. If you consider it really important to your gaming then I'd def go nV, but if you're primarily a DX gamer like me, there's no real reason to switch over, especially with the shimmering problems still going on with the 6800 cards. To be honest, if I hadn't gotten this PE at MSRP from BB, I would likely have bought a 6800 GT and just WC'd that... shrug.
 
pahncrd said:
Not legacy only. I am currently playing Rome Total War and PGA Tour 2005 with 8xSS, and they look very sweet. That is at 1280x1024 res, and there is an occasional dip, but that's very rare.


I heard RTW runs like crap on most people's systems b/c of CPU limitations. How fast is your system?
 
/me looks at title
/me reads thread
/me wonders what title has to do with thread :confused:
 
MFZ said:
Guru3D X800 Pro Review

-for this one I'm specifically referring to D3D games like Halo and SC. Keep in mind this is the plain X800XT and not the PE, though; in some graphs it's just the XT vs the GT and in others it's the XT vs the GT/U.

The stock 6800GT is beating the OC'd X800Pro in Halo...

Finally a DH review (save the biased comments, seems like a legit review to me): PNY 6800 Review

Believe me when I say, NOTHING from Driver Heaven is "legit". The only reason I even go to Driver Heaven is to download new updates to Driver Cleaner lol.

MFZ said:
If I had the energy I'd try to dig up more reviews but there aren't very many recent ones so I guess we will have to wait awhile till they show up. I'm betting there will be 1-2 new ones when cat a.i. is released to everyone. I guess it really comes down to OGL if you want to declare a real winner. If you consider it real important to your gaming then I'd def. go nV but if you're primarily a dx gamer like me, there's no real reason to switch over especially with the shimmering problems still going on with the 6800 cards. To be honest, if I hadn't gotten this PE at MSRP from BB, I would likely have bought a 6800 GT and just WC'd that..shrug.

You have to realize that only about 10 of the most popular D3D games ever even get benchmarked, while there are thousands, so saying one card is better for D3D isn't very accurate. The games used in benchmarks are heavily optimized for by both nVidia and ATI. They both want the highest benchmarks because benchmarks sell cards.

Also, a lot of the reviews have been using the 61.77 drivers off nVidia's site, and the new beta drivers are a hell of a lot better.

You also can't just compare which card is better at 1600x1200 w/ 4xAA + 16xAF all the time, especially if you never even play at that resolution on your monitor lol. It's best to pick the card that's the fastest overall across resolutions.
 
MFZ said:
I heard RTW runs like crap on most peoples systems b/c of CPU limitations. How fast is your system?


A64 939 3500+ at 2.5GHz (250 FSB x 10)
6800GT at 406/1100
 
Uh oh ATI.............. :eek:

Of course CS:S isn't HL2, but it's close enough for comparison.
I predicted a looooong time ago, and more recently, that HL2 won't be "all that" only on ATI cards. Seems that might come true, but of course we won't know until next year when HL2 is actually released.

Overall, unless HL2 surprises and runs much better on ATI cards rather than "on par" or "even" with nVidia cards, ATI loses on a few counts:
- nVidia will improve performance through upgraded drivers after the game's release, substantially more than ATI can.
- Consumers that got a voucher with their 9600/9800 purchase with the promise that their new card would be "all that", will have much less trust for ATI.
- More consumers that have been holding out on a purchase will switch to nVidia cards once the game releases, based on the above points.
 
Badger_sly said:
- Consumers that got a voucher with their 9600/9800 purchase with the promise that their new card would be "all that", will have much less trust for ATI.


That's a bunch of crap. It's still going to play the game very well; it's just not going to be at 1600x1200 with 4xAA.
People got a great video card and get the game free when it comes out.

And the way things are looking, the game seems to be coming out soon.
The game at least is done; there are some legal problems now, but those could go in any direction.
 
M1ster_R0gers said:
I think Gabe and ATI need to refund some $499 9800XT purchasers some money, or give them a free upgrade to an X800 PE.

Read the box/voucher fine print; I'm 99% certain there's something that disclaims any liability and says the release date is not definite... 'Sides, I'm sure those people played a dozen other games while waiting, and the Radeons were better than nVidia's offering at the time anyway. So now they're not, big whoop. You people are taking this way too far; ATI and NV are not presidential candidates!

Moloch said:
ATi, however, doesn't deserve the bashing; nVidia is a bigger company, and it's a wonder ATi can even begin to compete

LOL, now I've heard it all... :rolleyes:

rancor said:
Valve was the first company to put their game up for auction for bundling. That isn't right. Then they released numbers that wouldn't have even made a difference since the game wasn't even ready. It's almost like Valve had a vendetta against nV.

It's called a marketing deal; no one forces you to swallow their statements whole, but I'm sure you know that. It's not like nVidia doesn't do it to a lesser extent; HL2 is just a very prominent and prolonged example.
 
burningrave101 said:
8xS which is superior to the AA modes ATI supports, better full trilinear filtering support, a little better AF quality from what I've gathered, fewer game bugs, and new beta drivers which are constantly being leaked.

Superior?
Having talked to people that owned both, they say 6X is nearly as good... and guess what?
You can use it in today's games, not old games that don't need much fillrate.
SM3 is really just a marketing tool; show me a game that will use it for anything other than a few extra fps.
Digital vibrance is a feature for people who buy TVs based on saturation.
No thank you, I prefer natural-looking colors.
 
Moloch said:
Superior?
SM3 is really just a marketing tool; show me a game that will use it for anything other than a few extra fps.
Digital vibrance is a feature for people who buy TVs based on saturation.
No thank you, I prefer natural-looking colors.


You might want to check out Painkiller's new demo that was just released ;)
 
Impulse said:
It's called a marketing deal; no one forces you to swallow their statements whole, but I'm sure you know that. It's not like nVidia doesn't do it to a lesser extent; HL2 is just a very prominent and prolonged example.


nV doesn't pay for things like that. Also, ATi might have paid Crytek the sum of $500k to get the SM 2.0b path into their engine.

nV might help with the marketing of a game, but that's it; they don't hand over cash.
 
Moloch said:
Superior?
Having talked to people that owned both, they say 6X is nearly as good... and guess what?
You can use it in today's games, not old games that don't need much fillrate.
SM3 is really just a marketing tool; show me a game that will use it for anything other than a few extra fps.
Digital vibrance is a feature for people who buy TVs based on saturation.
No thank you, I prefer natural-looking colors.

If you've talked to someone that thinks 6xAA is nearly as good as 8xS, then they haven't been testing it with games that were truly aliased lol. 6xAA is better for compatibility in today's games, yes, but it's definitely not as good as 8xS.

I've found 8xS to be perfectly playable in quite a few D3D and OpenGL titles, especially ones that are more CPU-intensive.

Elder Scrolls III: Morrowind is a D3D game and is relatively new (it was released in 2002), and it still has some of the best graphics of any RPG out there. I have no problem running 8xS in it because it's more of a CPU-intensive game.

And I don't think you know anything at all about SM 3.0. Want to know why? Because you've only seen it implemented in ONE game, and the patch that did that was pulled lol. The way it was used in the 1.2 patch for Far Cry was also done in a manner that could easily have been done with SM 2.0b. It didn't take advantage of the speed and precision SM 3.0 can have over SM 2.0. Take the time to go look up the specs of SM 3.0 vs SM 2.0b.

http://graphics.tomshardware.com/graphic/20040414/geforce_6800-08.html

BTW, a lot of people like the digital vibrance feature in the ForceWare drivers, so the simple fact that you don't doesn't mean it's a worthless feature lol.
 
rancor said:
Digital vibrance is also on ATi cards, so where's the difference?

The digital vibrance on nVidia cards looks better than what you're able to adjust it to on ATI cards. I know several ATI users that have tried to manually adjust it to emulate the look of digital vibrance, but it never quite looks the same.
 
Moloch said:
Superior?
Having talked to people that owned both, they say 6X is nearly as good... and guess what?
You can use it in today's games, not old games that don't need much fillrate.


I would say that 6x on my AIW 9800 Pro does look as good as 8xSS for edges, but for textures and general WOW factor, there is no contest.

As for your comment that you can't use it in today's games, I have to disagree.

As stated before, I run 8xSS in most new games without a hitch. Keep in mind I mostly run my games at 1280x1024.

Rome: Total War looks great
Dawn of War looks great
Morrowind looks great
Uru: Ages Beyond Myst looks great
PGA Tour 2005 looks great
 
Netrat33 said:
......it's still going to play the game very well; it's just not going to be at 1600x1200 with 4xAA.
People got a great video card and get the game free when it comes out.
Sure the cards will play the game, but please explain how most of the people that bought 9600/9800 cards last summer for the purpose of HL2 aren't at least a bit pissed that they won't be playing the game on the latest/fastest video card.

And the way things are looking, the game seems to be coming out soon.
The game at least is done; there are some legal problems now, but those could go in any direction.
Same old song and dance as the last year. You won't see it until '05.
 
Moloch said:
Superior?
Having talked to people that owned both, they say 6X is nearly as good... and guess what?
You can use it in today's games, not old games that don't need much fillrate.
SM3 is really just a marketing tool; show me a game that will use it for anything other than a few extra fps.
Digital vibrance is a feature for people who buy TVs based on saturation.
No thank you, I prefer natural-looking colors.

8x, though slow, is vastly superior to 6x since it can smooth the edges in textures, not just the edges of polygons... case in point, CoD...
 
burningrave101 said:
You can see in this review how much performance is lost between the X800 Pro and 6800GT using different AA and AF methods while running the Source engine.

http://www.elitebastards.com/page.php?pageid=6336&head=1&comments=1

8xS takes a massive hit in comparison to 6xAA, but because it's super-sampling it does more than just clean up simple jagged edges like 6xAA does.


If you really want to have fun with legacy software, try 16xSS.
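Since the thread keeps coming back to why supersampling modes like 8xS smooth texture detail while edge-only multisampling like 6xAA can't, here's a rough Python sketch of the basic idea. The "scene" is a made-up hard-edged circle standing in for any aliased shape (it's purely illustrative, not anything from the actual drivers): supersampling shades several sub-pixel positions per pixel and averages them, so anything with a hard edge, including detail inside a texture, gets filtered.

```python
def shade(x, y):
    """Toy 'scene': 1.0 inside a circle of radius 0.4, 0.0 outside (a hard edge)."""
    return 1.0 if (x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.16 else 0.0

def render(width, height, samples_per_axis):
    """Render with an ordered grid of samples_per_axis^2 shading samples per pixel."""
    n = samples_per_axis
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(n):
                for sx in range(n):
                    # Sub-pixel sample position mapped into [0, 1) scene space.
                    x = (px + (sx + 0.5) / n) / width
                    y = (py + (sy + 0.5) / n) / height
                    total += shade(x, y)
            # Averaging the samples is the downfilter step: edge pixels
            # end up with in-between values instead of hard 0/1 steps.
            row.append(total / (n * n))
        image.append(row)
    return image

aliased = render(8, 8, 1)    # 1 sample/pixel: every value is exactly 0.0 or 1.0
smoothed = render(8, 8, 4)   # 16 samples/pixel: edge pixels get fractional coverage
```

Because every sample runs the full shading function, this also shows why the fps hit is so brutal: 16 samples per pixel is 16x the shading work, which is why 8xS is only comfortable in CPU-limited or older titles.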
 
Cool, never really played around with digital vibrance on ATi cards much, a little on nV's cards, but I thought it had the same capabilities.
 
Badger_sly said:
Sure the cards will play the game, but please explain how most of those people that bought 9600/9800 cards last summer, for the purpose of HL2, are not at least a bit pissed that they are not playing the game on the latest/fastest video card.


Same old song and dance, as the last year. You won't see it until '05.

Nvidia had the same things on their cards:

"id recommends this card for Doom 3" stickers before it came out. The FX line was the same thing; the only difference is at least with ATi you got the game free with it. Look at it like a 50 dollar coupon. Whereas if you bought an FX line card, you're just screwed (in regard to having a top-of-the-line card) because you didn't get Doom 3 with it.
It took a long time for HL2 to come out, but it's still going to play really nicely on 9800 cards.

Unfortunately (um... I just typed that) technology moved forward by the time HL2 came out, so the 9800 series isn't the 'premium' card for it anymore, while the X800s and 6800s are.
 
Netrat33 said:
Nvidia had the same things on their cards:

"id recommends this card for Doom 3" stickers before it came out. The FX line was the same thing; the only difference is at least with ATi you got the game free with it. Look at it like a 50 dollar coupon. Whereas if you bought an FX line card, you're just screwed (in regard to having a top-of-the-line card) because you didn't get Doom 3 with it.
It took a long time for HL2 to come out, but it's still going to play really nicely on 9800 cards.

Except..........

id didn't set a release date of September 2003, and they didn't heavily promote a given video card. nVidia not offering a packaged deal of its card plus a coupon for D3 was its smartest move, unlike ATI betting on Valve's disciplined development (or lack thereof).
 
Badger_sly said:
Except..........

id didn't set a release date of September 2003, and they didn't heavily promote a given video card. nVidia not offering a packaged deal of its card plus a coupon for D3 was its smartest move, unlike ATI betting on Valve's disciplined development (or lack thereof).

I could be wrong (very wrong)... but I thought the package deal wasn't available then.

But either way, your logic still doesn't make sense. The game is STILL going to play far better on the 9800 series cards than on the FX line, and THAT is what they were promoting at the time. And you got the game for free for buying the card. Everyone was buying, or considering buying, an ATi card at the time anyway; how's a little more incentive wrong?

Heck, games on the Unreal 3 engine say they run better on nVidia cards... how do they know a year out?! Or whenever it finally comes out... or is it the Unreal 4 engine... whatever the newest engine being worked on is! ;)
 
Netrat33 said:
I could be wrong (very wrong)... but I thought the package deal wasn't available then.

But either way, your logic still doesn't make sense. The game is STILL going to play far better on the 9800 series cards than on the FX line, and THAT is what they were promoting at the time. And you got the game for free for buying the card. Everyone was buying, or considering buying, an ATi card at the time anyway; how's a little more incentive wrong?

Heck, games on the Unreal 3 engine say they run better on nVidia cards... how do they know a year out?! Or whenever it finally comes out... or is it the Unreal 4 engine... whatever the newest engine being worked on is! ;)

Yes, HL2 will play quite well on a 9800 or 5900 series card. However, you're missing the point of the here and now. People bought those ATI cards in summer 2003 thinking they would be top of the line for HL2, which was promoted as coming out that September. Now it's a year later, the game still isn't out, their ATI cards won't be the fastest, and if they want the fastest card for HL2 they'll have to spend more money. Also, how many of those people might have waited to buy a video card had they known the game wouldn't be out for another year to a year and a half?

Sneaky marketing or however you want to put it, they got duped by the package deal.
 
Badger_sly said:
Yes, HL2 will play quite well on a 9800 or 5900 series card. However, you're missing the point of the here and now. People bought those ATI cards in summer 2003 thinking they would be top of the line for HL2, which was promoted as coming out that September. Now it's a year later, the game still isn't out, their ATI cards won't be the fastest, and if they want the fastest card for HL2 they'll have to spend more money. Also, how many of those people might have waited to buy a video card had they known the game wouldn't be out for another year to a year and a half?

Sneaky marketing or however you want to put it, they got duped by the package deal.


Uh... wouldn't they have to spend more money anyway to have the top-of-the-line card for HL2? It's not like those people got nothing out of those video cards. They were king-of-the-heap cards for playing the games of the time.
 
Well, there was no false advertising. The 9800XT WAS the top of the line card for HL2 when they said it was. HL2 just wasn't released to the public, that's all :eek:
 
Score means nothing; image quality is where it's at. Welcome to 2004!

We (ATI owners) don't care about you getting a few shimmering fps more... :rolleyes:
 
I personally just ordered an X800XT PE, but I'm still pretty impartial. That is, I don't hold any loyalties to either; I just want the better card/deal. But man, nVidia sure knows how to pull drivers out of their butts, that's for sure. I'm still happy with the performance of my card, but I hope ATI can start making some miracle drivers like nVidia's. I'm proud of the tech advancements of both companies. I still remember when HL first came out in '97, running on a Pentium MMX 233 with a 6MB Diamond card, hoping for 15fps max. So to see that we're getting 80+fps before the games EVEN COME OUT is a pretty nice testament to both companies...

And it's cool that the results AREN'T as radically different as in D3, which is so NV-sided it's absurd. Almost as if it wasn't just optimized for NV but built to run badly on ATi... but I know that's not true. I'm just glad that both cards can run it fine and it isn't a "which card is better for which game" scenario.
 
Oh yeah, and does anyone else find it fishy that the X800XT AGP beats the XT PE in more than one test, when the PE's only difference is higher clocks!?!?
 
Not really; it looks like ATi's drivers are more CPU-dependent than nV's now.
 
I just ran the stress test on my X800XT PE at 1600x1200 with 6X and 16X and I get 86.89 FPS, which seems fine to me, everything maxed out. I was expecting the graphics to be much better.
 
Nembot said:
Score means nothing; image quality is where it's at. Welcome to 2004!

We (ATI owners) don't care about you getting a few shimmering fps more... :rolleyes:

hahahaha...that's the most retarded thing I've ever heard...you're about as informed as a brick in a paper bag...
 