FiringSquad CS: S Benches

Status
Not open for further replies.
rancor said:
Yes they did, LOL, just before the VST was released; that's where the 30% came from. 40% was last year's figure ;)


Well, they're still ahead in the VST from what I know, and I've yet to hear Valve say the 40% number that was always thrown around is now 30%. CS: Source, in terms of graphics and effects, won't match up to Half-Life 2; CS has always been a game that was graphically decent at best but made up for it with good, simplistic gameplay. CS: Source has received a graphical overhaul because it badly needed one, but I don't think the small amount of effects they've tossed into CS will match up to what's in HL2.

Either way, I don't really give a shit what comes out on top. To me HL2 has always been about pimping physics more than anything else; every video demo, manipulator, manipulator, manipulator, zzzzzzz. I'll probably play it for a few hours, then remove it from my Steam list.
 
The cards both perform on par with each other. I just think this is more of an "NVIDIA doesn't suck at CS:S" thread than an "ATI sucks at CS:S" thread.

I think the real issue here is you have NVIDIA destroying ATI in OpenGL/Doom3 and performing on par with ATI in DX/HL2.
 
thrawn42 said:
Yes, yes they have. In fact Doug Lombardi stated on IGN not too long ago that the X800 series was the preferred card among HL2 testers because of somewhere around a 30% performance improvement over the Geforce 6 line.

Your response doesn't really make sense as a reply to Chris_B's statement.

The 9800 vs. the FX series hasn't really been benched yet.
As for the 30% performance improvement statement... well, that still has yet to be seen.
Although this is a glimpse of what might happen with HL2, this still isn't HL2.
 
The important thing to remember is that when you shoot open the computers in cs_office you are greeted with a 9800 Pro. ;)
 
Ah...to be young and stupid again. :rolleyes:

If it plays well on your system, with whatever card you have, at levels that are acceptable to you...then what the hell difference does it make??

I can't believe how some of you bicker and squabble over a meager 2%. I will never understand this. :confused:
 
Netrat33 said:
Your response doesn't really make sense as a reply to Chris_B's statement.

The 9800 vs. the FX series hasn't really been benched yet.
As for the 30% performance improvement statement... well, that still has yet to be seen.
Although this is a glimpse of what might happen with HL2, this still isn't HL2.


That's not going to happen either. In the VST, nV was behind by a max of 15% in every benchmark other than the ones done by DH. That 15% was with the 61.77 drivers; with the newer ones it should be closed down to almost nothing.
 
jacuzz1 said:
It is for this player............66.81 runs flawlessly
I get wireframe errors (you can see quad outlines, white polygon borders) in some places with a 6800U and 66.81, but I don't know if this is an nV driver error or a Source engine error.
 
Netrat33 said:
Your response doesn't really make sense as a reply to Chris_B's statement.

The 9800 vs. the FX series hasn't really been benched yet.
As for the 30% performance improvement statement... well, that still has yet to be seen.
Although this is a glimpse of what might happen with HL2, this still isn't HL2.

What exactly doesn't make sense? He asked if anyone had commented about this generation of cards since last year's comments from Valve about 9800 performance in HL2. I pointed out that Valve themselves have done so within the past month or so.
The 9800 vs. FX series has, I'm sure, been benched by Valve, but that is not what we are talking about here. This whole thread is about X800 vs. 6800 performance in CS:S and its possible meanings.
 
Chris_B said:
Well, they're still ahead in the VST from what I know, and I've yet to hear Valve say the 40% number that was always thrown around is now 30%. CS: Source, in terms of graphics and effects, won't match up to Half-Life 2; CS has always been a game that was graphically decent at best but made up for it with good, simplistic gameplay. CS: Source has received a graphical overhaul because it badly needed one, but I don't think the small amount of effects they've tossed into CS will match up to what's in HL2.

Either way, I don't really give a shit what comes out on top. To me HL2 has always been about pimping physics more than anything else; every video demo, manipulator, manipulator, manipulator, zzzzzzz. I'll probably play it for a few hours, then remove it from my Steam list.
so ati wins another benchmark, and nvidia keeps coming back with actual gameplay, go nvidia
 
OriginalReaper said:
I get wireframe errors (you can see quad outlines, white polygon borders) in some places with a 6800U and 66.81, but I don't know if this is an nV driver error or a Source engine error.


This what you mean?

de_cbble0001.jpg


Looks like a mapping error; where two different textures meet, there seems to be a very slight gap between them that shows as white.
 
I don't really know what the purpose of this thread is. The cards are so close, there is no edge to either one that would give any noticeable advantage in gameplay. Everyone already knew CS was going to play fast anyway. It's just not that taxing of a game, even with the HL2 enhancements.

I don't know why some people get so worked up over one car going 100 miles per hour and another going 104 miles per hour. What the hell is the difference?

Years ago, speed was a good argument for a video card. Today that's not really the case unless one just runs away from the other. That's not happening now.
 
A lefty ! ^^^

Chris_B, I can't really see what you're talking about in the above screenshot, though; maybe if it weren't compressed it'd be easier to see (the blowup is obvious, but we don't play games blown up ;) ). I would like to see what it looks like normally.
 
de_cbble0000.jpg



Basically a small white flickering line; it doesn't look like much in the pic, but in game it's pretty noticeable.
 
Ah that's much better! OK I see it. That is a pretty nasty glitch. Does it only happen in that location of the map or in various places?
 
Yiffy said:
Ah that's much better! OK I see it. That is a pretty nasty glitch. Does it only happen in that location of the map or in various places?


Seems to be pretty random; sometimes it occurs where the edge of water meets a wall, other times it's where two different textures join on the ground or the ceiling. I tried a couple of different drivers and different settings, but to me it just seems to be a mapping error. You could see similar stuff to this in MOHAA with textures not tightly joined.
 
psikoticsilver said:
I really think the title of the news post should be changed. To me, "pwn" means something more than a 2-6 frames-per-second lead--PWND is like Monster Kill in Unreal Tournament. I didn't expect this from [H]. Actually, this is the LAST thing I expect from [H].
Sarcasm, Learn it, Know it, Live it.
 
I get white and black wireframes with the 66.72. It's pretty bad on some maps in certain locations. Kinda sucks, cuz it is faster...
 
Chris_B said:
Seems to be pretty random; sometimes it occurs where the edge of water meets a wall, other times it's where two different textures join on the ground or the ceiling. I tried a couple of different drivers and different settings, but to me it just seems to be a mapping error. You could see similar stuff to this in MOHAA with textures not tightly joined.

well it doesn't happen with 61.xx's...
 
rancor said:
Yes they did LOL, just before the VST was released, thats where the 30% came from, 40% was last year's things ;)

Keep in mind that comment was made before the new drivers were out, so you have to view it in the context in which it was said. Doug said that before we had all of the new drivers, so of course anything new will be out of context. The first VST test we saw showed about a 17 FPS lead for the ATI XT over the 6800U. Now, that's not 30%, but it's still pretty good. And as for the 40% stuff... I take it you never saw the VST/CS when the FX was forced onto the DX9 path, did you? That 40% is pretty close (again, before new drivers).
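The percentage claims being argued above are easy to sanity-check with quick arithmetic. A minimal sketch; the FPS numbers below are hypothetical, chosen only to illustrate how a 17 FPS gap relates (or doesn't) to a 30% lead, since the actual benchmark scores aren't quoted in this thread:

```python
def pct_lead(fps_a: float, fps_b: float) -> float:
    """Relative lead of card A over card B, as a percentage of B's score."""
    return (fps_a - fps_b) / fps_b * 100.0

# Hypothetical example: if the 6800U scored 85 FPS and the X800 XT 102 FPS,
# the 17 FPS gap is a 20% lead over the 6800U's score, not 30%.
print(round(pct_lead(102, 85)))   # → 20

# The "one car at 100 mph, another at 104" comparison earlier in the
# thread works out the same way: a 4% difference.
print(pct_lead(104, 100))         # → 4.0
```

The same function shows why the denominator matters: a 17 FPS gap is 30% only if the slower card's baseline is around 57 FPS.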


burningrave101 said:
They got PWNED because ATI should have at least a 5-10% lead in CS:S because of the engine it's built on and the fact that ATI has spent millions of dollars marketing their cards with Half-Life 2. Not to mention all the FUD ATI and Valve have spread about ATI hardware running the HL2 engine better than anyone else.

What do you think people would have said if DOOM 3 had released and ATI had a 2% lead on nVidia? lol

It still remains to be seen who's faster. You have to admit that not running AI on the ATI cards while having the NV opts on is not apples to apples. A smart person would hold off till we have more data.

rancor said:
Well this is with AI off, I guess ATi can't really even come close without shader optimizations in this game.

AI will mainly help with the loss at AF/AA settings. If you look, you can see ATI loses more when AA/AF is applied compared to NV.
 
Cornelius, we know some people care more than others about benchmarks, but especially in 3DMark a spread of 10 points could happen simply because the temperature in your room dropped a couple of degrees between runs, and it really gives very little indication of game performance. But since that isn't the purpose of this thread, we'll just leave it at the fact that some care and some (or most) do not.

What I find disappointing about this is that even after all this time, ATI has not been able to come close to matching NVidia for driver optimizations and improvements. A new driver set comes out almost monthly from nV, fixing bugs, etc., while major ATI updates are few and far between (even from third-party drivers). What exactly is ATI spending all their time on?
 
Valve, ATI, and VU can all kiss my ass. I bought my 9800XT last year, pretty much the day it hit the shelves, in anticipation of HL2, and now my near-$500 card is getting dwarfed by today's lineup.

Yes, I am the ass for jumping the gun, but I wouldn't have upgraded otherwise. It just angers me a bit.
 
Uh, ATI does release a driver every month; every Catalyst revision includes bug fixes, etc.
 
j0k3r said:
Valve, ATI, and VU can all kiss my ass. I bought my 9800XT last year, pretty much the day it hit the shelves, in anticipation of HL2, and now my near-$500 card is getting dwarfed by today's lineup.

Yes, I am the ass for jumping the gun, but I wouldn't have upgraded otherwise. It just angers me a bit.

LOL! I fell for the same shit with my 9700 Pro. I bought it a couple of years ago because I needed a new card and Doom 3 and HL2 were "just around the corner." So I bought the best, thinking that in a few months I would be tearing it up big time. Well, I spent almost two years playing SOF2 on it instead of buying a cheaper GF4 Ti4200, which would have worked just fine, and then spending the big bucks on this generation. Now that the games are here, my 9700 Pro is a little on the slow side.

Lesson Learned: Don't buy video cards based on future release schedules even if it is id or Valve.
 
Jbirney said:
Keep in mind that comment was made before the new drivers were out, so you have to view it in the context in which it was said. Doug said that before we had all of the new drivers, so of course anything new will be out of context. The first VST test we saw showed about a 17 FPS lead for the ATI XT over the 6800U. Now, that's not 30%, but it's still pretty good. And as for the 40% stuff... I take it you never saw the VST/CS when the FX was forced onto the DX9 path, did you? That 40% is pretty close (again, before new drivers).




It still remains to be seen who's faster. You have to admit that not running AI on the ATI cards while having the NV opts on is not apples to apples. A smart person would hold off till we have more data.



AI will mainly help with the loss at AF/AA settings. If you look, you can see ATI loses more when AA/AF is applied compared to NV.


JB, yes, they do lose more without AI, and do you know why? What's drawing the pixels to the screen? The pixel shaders.....

So let's take an example. Does AI do shader replacement? Yes. So what is happening; why is it decreasing the loss with AA and AF? Well, if the pixel shaders are the ones drawing to the screen, the shader replacements are letting the pixel shaders do less work. Simply put, but you get the idea.

BS, without AI it is apples to apples, because ATi's AA and AF opts are on without the CCC!

And the only thing that CCC is good for is application detection and shader replacement. Everything else is included in the regular Catalyst drivers (well, other than being able to turn off the AA and AF opts).

You are the first person to say to use shader replacement with ATi cards and compare that to other graphics cards. You are feeling red.
 
Rancor, I would applaud you... but... I have no clue what you just wrote.

For the sake of being a smartass, let me introduce you to some friends.

- . (the period, used to roughly end an idea)

- , (the comma, used to separate brief thoughts)

- ABCDEFGHIJKLMNOPQRSTUVWXYZ (These help tell where a period is)

I understand some people in this community speak somewhat broken english, but all you have to do is punctuate and you'll sound more credible! :)

With that being said, I'll contribute to the discussion -- I'm glad I got an nVIDIA 6800GT, not only for past OpenGL games, but now for future games. And I had a 9800 pro before...
 
rancor said:
JB, yes, they do lose more without AI, and do you know why? What's drawing the pixels to the screen? The pixel shaders.....

You are the first person to say to use shader replacement with ATi cards and compare that to other graphics cards. You are feeling red.

As I said AI is not just shader replacement:

http://www.extremetech.com/article2/0,1558,1649802,00.asp
Catalyst AI turns on some general texture filtering optimizations that rely on algorithms to examine textures and adjust features like texture filtering and mip-map bias on the fly. There are two settings: Low, the default setting, only performs some very basic and minor optimizations that won't have a huge impact on performance. The High setting uses a more advanced texture analysis algorithm, potentially providing even better performance.

Thus FS may not even have had the lowest level of AI on, and the only area where you see a drop is with AA/AF, where the ATI cards fall farther behind. Replacing pixel shaders has nothing to do with the current CS scores. It's all about the AA/AF/tri opts here.
 
I haven't seen or heard of any problems with ATI's shader replacement; if you can show me where they lower image quality, do so.
ATI cannot force FP16 like nVidia can for big speed gains; there isn't really much they can do to lower IQ without it being obvious.
Also, AI off vs. opts... yeah, weren't you (nVidia fans) the ones bitching when reviews would compare the X800 to the 6800 with the 6800's opts off?
But now it's fine and dandy that nVidia is in the lead :rolleyes:
 
Jbirney said:
As I said AI is not just shader replacement:

http://www.extremetech.com/article2/0,1558,1649802,00.asp


Thus FS may not even had the lowest level of AI on and since the only area that you see a drop is with AA/AF were the ATI cards fall father behind. Replacing Pixel shaders has nothing to do with the current CS scores. Its all about the AA/AF/Tri ops here.


That doesn't refute what I just stated; that link and quote just say their filtering algorithms are being changed to better suit the engine. nV is doing this also, and it can be turned off: in the nV control panel there is an option to turn off mip-map optimizations. Without CCC and AI, ATi is doing this by default too. This is noticeable, as performance drops varied depending on the applications being tested with the 4.9 and lower drivers with AA and AF.
 
rancor said:
That doesn't refute what I just stated

Yes, you made up some theory about shader replacement in CS and said that AI does it. Come on, fess up, I got you here :)

nV is doing this also and it can be turned off. In the nV control panel there is an option to turn of mip map optimizations. With out CCC and AI ATi is doing this by default too. This is noticable as perfromance drops varied depending on the applicaitons being tested with the 4.9 and lower drivers with AA and AF.

That's the point I was trying to make. There is another setting for AI that's not being turned on. Who knows how much faster it will be and what the IQ will be like? Thus you're testing with all of the opts on with the NV cards, but not yet all the opts on with the ATI cards....
 
Jbirney said:
Yes, you made up some theory about shader replacement in CS and said that AI does it. Come on, fess up, I got you here :)



That's the point I was trying to make. There is another setting for AI that's not being turned on. Who knows how much faster it will be and what the IQ will be like? Thus you're testing with all of the opts on with the NV cards, but not yet all the opts on with the ATI cards....


?

You are really confused, aren't you...... Xbit Labs tested with AI on? Yes they did, LOL; man, can't you do any correlations at all or something?


Since the 4.9s still have their AA and AF opts on, the Xbit Labs performance boost was only from the AI, and the numbers are much closer. Wow, interesting. Exactly what I stated: there is shader replacement going on ..........
 
burningrave101 said:
They got PWNED because ATI should have at least a 5-10% lead in CS:S because of the engine it's built on and the fact that ATI has spent millions of dollars marketing their cards with Half-Life 2. Not to mention all the FUD ATI and Valve have spread about ATI hardware running the HL2 engine better than anyone else.

What do you think people would have said if DOOM 3 had released and ATI had a 2% lead on nVidia? lol

Valve isn't as 'in bed' with ATI as you think. They've actually spent more time with nVidia optimizing for HL2 than with ATI, believe it or not.

And I'm surprised FiringSquad didn't use the ATI beta driver in these tests while it did use an unofficial nVidia release.
 
This is great news and all, but it's more like a pre-preview if anything.
Now please, don't tell everyone to wait, because it will take 6 months before the average Joe can buy it. I'm sure the game sites will get it next month and jump up and down, but look at how long it took for the current high end cards to become readily available in the retail channel. Not that I'm ragging, but please, don't tell us to wait. Anyone buying a 6800GT or a 6800 Ultra will be happy for the games out right now.
 
Both companies perform great so far. However, ATi is using their older chipset, just beefed up, while nVidia is using a new chipset, and with better drivers will come better performance. I would guess this will be a regular thing with the new 6800 series cards.
 
doh-nut said:
Valve isn't as 'in bed' with ATI as you think. They've actually spent more time with nVidia optimizing for HL2 than with ATI, believe it or not.

And I'm surprised FiringSquad didn't use the ATI beta driver in these tests while it did use an unofficial nVidia release.


That was for the 5XXX series though, wasn't it?
 
My little 6800NU seemed to make a pretty good showing. Considering I play most of my games at 1280x1024 I think I got a pretty good card.
 
rancor said:
?

You are really confused, aren't you...... Xbit Labs tested with AI on? Yes they did, LOL; man, can't you do any correlations at all or something?


Since the 4.9s still have their AA and AF opts on, the Xbit Labs performance boost was only from the AI, and the numbers are much closer. Wow, interesting. Exactly what I stated: there is shader replacement going on ..........
In Unreal Tournament 2003 and 2004, application detection kicks in highly tuned texture-filtering specific to that engine and those games. Something similar happens in Half-Life 2 and the mods that use its executable, such as Counter-Strike: Source and the somewhat synthetic Video Stress Test.
Prove shader replacement is going on, and if it is, show a difference in IQ between the 6800 and the X800.
I'm waiting (I have a feeling I'll be waiting for a LOOOONG time).
 
Moloch said:
Prove shader replacement is going on, and if it is, show a difference in IQ between the 6800 and the X800.
I'm waiting(I have a feeling I'll be waiting for a LOOOONG time)


Turn off AI in CS: Source with the new CCC and you will see there is a performance hit, lol.

I never stated there was an IQ difference, did I? Don't assume things I never stated....
 
Y'know, most of the small differences in D3D game performance between ATI's and NVidia's offerings don't amount to an anthill... NVidia's still got a much better-performing OpenGL driver and superior availability, last time I checked; CS or HL2 numbers don't change either of those things. And before anyone starts hollering that OpenGL is D3-only, some people actually play other OpenGL games like CoH and KOTOR, y'know...

I'm just happy ATI is doing better at keeping up with NVidia than 3dfx or anyone else did, keeps prices down, innovation going, and driver releases a'plenty. :cool:

P.S. Get over your semantics, people. If you don't agree with a qualitative word (pwn) in a news blurb or review, that's fine, but don't make a crusade out of it, jeez. One would think you actually have money in ATI/NVidia stock, judging by how passionate you are about them... :p

P.P.S. At the time Valve made their performance claims, they appeared to be true, and CS did run better on ATI hardware then... I'm not sure why it's so funny that it's changed; it's just good driver development. Carmack changed his stance on which card would play D3 better about a dozen times over its development ('course the game was in development for about five generations of cards, so that explains some of it :D ).
 