GeForce FX and DX8 in HL2 at FiringSquad

For a game like CS:S I'll probably turn off everything I possibly can. That's just how I play MP games.
 
Blah blah blah, who cares what the FX and 9x00 series do. I want to know how the X800, 6800, and 6600s do. (You can test them with the various paths too if you want to see how they work.)

FX 5x00 & 9x00 are old news. yawn :cool:

(I can be smug now that I've had my 6800GT for a whole six days! lol)
 
^eMpTy^ said:
What's interesting is that it doesn't look all that bad... but you can't even force it to do 9.0 for some reason... so now they don't know how to benchmark the cards... no apples-to-apples comparison is possible for last gen...

Not on the FX line......
 
CrimandEvil said:
That was at the time of the Shader Day faux pas; things have changed since then.

It can't change. You didn't read that clearly. I believe they said they spent six extra months on the FX. They can't take that time back.

mr.fishie said:
I read that too; wasn't that an interview with Gabe at Shader Day?

Yes, I think so. I didn't bother trying to find a link, but I think it was.
 
fallguy said:
It can't change. You didn't read that clearly. I believe they said they spent six extra months on the FX. They can't take that time back.

Yes, I think so. I didn't bother trying to find a link, but I think it was.

Gabe, and the rest of Valve, have sold their souls to ATI.

Remember the chatter about how AA wasn't going to work on GeForce FX and how it couldn't be fixed? That was the line from Valve. Of course, a month later they came out and found a "workaround".

Remember the line that performance on HL2 would be crap on GeForce FX? How they spent 6 months optimizing and how it was still crap? Well, the linked CS:Source benchmarks don't indicate that the FX is being dominated at all. Sure, the 9800XT is faster, but it's not the huge gap that Valve made us believe it would be (note that this is comparing DX8.1).

Valve is whining that they spent 6 months optimizing for GeForce FX. Remember what Carmack said about Doom3? That he took out the NV30-specific path because the generic path was just as fast with NVIDIA's newer drivers. Valve didn't have to spend "6 months" optimizing for GeForce FX. NVIDIA optimized for them.

Now, I think that NV30 was the darkest point in NVIDIA's short history. Just like the RAGE FURY MAXX, NV30 failed to deliver the image quality and performance of the competition.

But you know what? After ATI failed with RAGE, they went on to produce a GPU that competed with NVIDIA's products - not by working harder but by working smarter. Radeon incorporated new technologies that made it perform better at a lower clock than GeForce.

GeForce 6 is NVIDIA's Radeon. ATI's R300 was a good GPU, and it has been rehashed into R350, R360, and now R420. It's still going strong, but it's showing its age. GeForce 6 is smarter, more powerful, and more advanced than R420. Anyone who doesn't believe that need only look at the core and memory clocks of ATI's and NVIDIA's parts.
 
Hey, ATI paid $6 million for HL2. I'm sure if NV did that Valve would have found "some way" to improve performance for the FX. :rolleyes:

And yes, things have changed, in case you haven't noticed. Carmack had to write a workaround for the FX but later removed it since the drivers got better over time. :eek: :rolleyes:
 
bsoft said:
Gabe, and the rest of Valve, have sold their souls to ATI.

Remember the chatter about how AA wasn't going to work on GeForce FX and how it couldn't be fixed? That was the line from Valve. Of course, a month later they came out and found a "workaround".

Remember the line that performance on HL2 would be crap on GeForce FX? How they spent 6 months optimizing and how it was still crap? Well, the linked CS:Source benchmarks don't indicate that the FX is being dominated at all. Sure, the 9800XT is faster, but it's not the huge gap that Valve made us believe it would be (note that this is comparing DX8.1).

That's your opinion. But you don't think Crytek and others have done the same for NV?

Yes, they jumped the gun on that, and made themselves look a little foolish.

First off, the VST doesn't represent HL2 gameplay. Secondly, the 9800XT does run it much faster, so I'm not sure what you're talking about. Much faster, while doing more work.

CrimandEvil said:
Hey, ATI paid $6 million for HL2. I'm sure if NV did that Valve would have found "some way" to improve performance for the FX. :rolleyes:

And yes, things have changed, in case you haven't noticed. Carmack had to write a workaround for the FX but later removed it since the drivers got better over time. :eek: :rolleyes:

ATi paid the money to have it bundled, not to give them some magical performance increase. Other titles, such as Far Cry, show the FX's poor performance in PS 2.0 games.

What I said didn't change. I said they spent six extra months on the FX. You said things have changed. You didn't read my post right, because obviously they can't take back that time. It's pretty obvious other aspects of the game have changed, but I wasn't talking about that.

Good job with the eye-rolling, though. Par for the course with you.
 
fallguy said:
First off, the VST doesn't represent HL2 gameplay. Secondly, the 9800XT does run it much faster, so I'm not sure what you're talking about. Much faster, while doing more work.

Way to post a link where the 9800XT is being obliterated by the 5900+ FX series in full eye candy mode. Granted, they're probably running in DX9, but I doubt that would be a huge performance difference in this case.

In any case, you really hit home your point about the 9800XT running faster... than the FX 5700 and down, maybe, but those cards were certainly never the 9800XT's competition.
 
fallguy said:
First off, the VST doesn't represent HL2 gameplay. Secondly, the 9800XT does run it much faster, so I'm not sure what you're talking about. Much faster, while doing more work.

Umm, every benchmark on that page shows the FX 5950 beating the 9800XT!?

1024x768
5950U = 92.3 fps
9800XT = 62.8 fps

1280x1024
5950U = 58.5 fps
9800XT = 44.5 fps

1600x1200
5950U = 40.8 fps
9800XT = 34.6 fps

Granted, the X800 XT is winning the high-end benchmarks over the 6800U. But the 6800GT is beating the X800 Pro. And consider that 6800GTs are readily available at $400 and overclock to 6800 Ultra speeds, versus a $700-1000 card that can't be found because it's clocked so high they can't make many of them!
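
To put those three results in perspective, here is the lead they imply at each resolution. This is just a quick arithmetic sketch over the fps figures quoted above, nothing taken from the review itself:

```cpp
// Quick arithmetic on the quoted VST numbers: how big is the
// 5950U's lead over the 9800XT at each resolution?
#include <cstdio>

int main() {
    struct Run { const char* res; double fx5950u, r9800xt; };
    const Run runs[] = {
        {"1024x768",  92.3, 62.8},
        {"1280x1024", 58.5, 44.5},
        {"1600x1200", 40.8, 34.6},
    };
    for (const Run& r : runs)
        std::printf("%s: 5950U leads by %.0f%%\n",
                    r.res, (r.fx5950u / r.r9800xt - 1.0) * 100.0);
    // Prints roughly 47%, 31%, and 18%: the gap narrows as the
    // resolution (and fillrate load) goes up.
    return 0;
}
```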
 
The 5950U is winning simply because it's running DX8.1 and not DX9 like ATi's cards. :rolleyes:
Probably something else ATi bought, lol. :D
Anyone else notice that the VST is nothing more than a synthetic benchmark? ;)
I would wait for the game to finally (if ever) ship before I say one side wins. :rolleyes:
 
fallguy said:
........
Secondly, the 9800XT does run it much faster, so I'm not sure what you're talking about. Much faster, while doing more work.
LOL, even the 5900XT is beating the mighty 9800XT in those tests! And, a quote from the review:
"In terms of performance the GeForce FX-series still deliver excellent numbers that are higher compared to the competing RADEON 9500-, 9600- and 9800-based graphics cards.
"

ATi paid the money to have it bundled, not to give them some magical performance increase.
ATI paid Valve for the bundling but, more importantly, for the Shader Day scam. If ATI paid for performance over nVidia in HL2, well, they sure deserve a refund.

thrawn42 said:
Way to post a link where the 9800XT is being obliterated by the 5900+ FX series in full eye candy mode. .......
It was one of his more entertaining posts.
 
SnakEyez187 said:
What an amazing phenomenon, everyone thinks they know everything
..............
But you all overlooked his mistake and just attacked him, good job
.............
Yeah, we know, FiringSquad must be the review that is wrong and/or biased

Violinnopity.gif
 
Badger_sly said:

You don't have any colorful comments to add? You sure had no lack of those when someone mistakenly posted the wrong review and you pounced on him. Now that we have reviews supporting both biases, silence from the peanut gallery.
 
The guy is known for ranting and bitching about nVidia, so when he posts a link showing the opposite of what he rants about, yes, he's going to get some shit.

BTW, the chill pills are in the medicine cabinet, middle shelf.......... :)
 
NVIDIA's 5x00 line is quite a bit slower than ATi's 9x00 line with DX9 stuff; it's been documented all over the place.

The 5200 and original 5600 are exceptionally slow in DX9. Even simple DX9 games are considered completely unplayable on the 5200, so it's no surprise that it would fall back to a DX8 or 8.1 mode for a shader-intensive game.

In that respect, the Radeon 9200, which cannot do DX9 at all, might not have been such a bad idea for the low end... ATi didn't even try to include DX9 in it, knowing that its performance would be pretty sad.
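
That kind of fallback is conceptually simple. Below is a minimal sketch of capability-based path selection, assuming a made-up caps struct, a hypothetical slowPs20 flag, and invented thresholds; it illustrates the idea, not Valve's actual code:

```cpp
// Hypothetical sketch of how an engine might pick a render path
// from GPU capabilities. Everything here (struct, flag, thresholds)
// is invented for illustration; it is not Source engine code.
#include <cstdio>

struct GpuCaps {
    int psMajor;   // highest pixel shader version supported, e.g. 2.0
    int psMinor;
    bool slowPs20; // flagged for parts with weak PS 2.0 throughput
};

// Pick the highest DX level the hardware can run at playable speed.
int ChooseDxLevel(const GpuCaps& caps) {
    if (caps.psMajor >= 2 && !caps.slowPs20)
        return 90; // full DX9 path (PS 2.0)
    if (caps.psMajor > 1 || (caps.psMajor == 1 && caps.psMinor >= 4))
        return 81; // DX8.1 path (PS 1.4)
    if (caps.psMajor >= 1)
        return 80; // DX8 path (PS 1.1)
    return 70;     // fixed-function fallback
}

int main() {
    // A GeForce FX 5200-class part: PS 2.0 capable on paper, but
    // flagged as too slow to default to the DX9 path.
    const GpuCaps fx5200{2, 0, true};
    std::printf("dxlevel %d\n", ChooseDxLevel(fx5200)); // "dxlevel 81"
    return 0;
}
```

The point of the slowPs20 flag is that a naive version check alone would put the 5200 on the DX9 path, since it is PS 2.0 capable on paper; a performance flag is what would drop it to 8.1.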
 