FiringSquad CS:S Benches

Moloch said:
Vsync just syncs the fps to the refresh rate. If you enable triple buffering without vsync, your fps can go above your refresh rate.
I'm not some kid who likes ati and doesn't know how to test things, dude..

Triple buffering doesn't do anything without v-sync enabled lol.
 
burningrave101 said:
Triple buffering doesn't do anything without v-sync enabled lol.
"if you enable tripple buffering without vsync, your fps can go above your refresh rate."
Does it sound like I am stateing it is?
jesus christ people :rolleyes:
 
Moloch said:
"if you enable tripple buffering without vsync, your fps can go above your refresh rate."
Does it sound like I am stateing it is?
jesus christ people :rolleyes:

It sounds like you're stating that if you enable triple buffering then your fps can go above your refresh rate. lol

But in actuality it doesn't do anything when you enable triple buffering unless you're using v-sync. Triple buffering just gives you a third buffer to render into instead of having just two. And none of this comes into play unless v-sync is enabled.
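To picture what that third buffer buys you, here's a minimal sketch of a page-flipping swap chain in Python (the buffer names and the two-frames-per-blank scenario are invented for illustration; real drivers do this internally). With two buffers, once the GPU finishes a frame it has nowhere to draw until v-sync flips; a third buffer lets it keep working, which is exactly why the option only matters when v-sync is on:

```python
from collections import deque

class SwapChain:
    """Toy page-flipping swap chain: one front buffer plus back buffers."""
    def __init__(self, n_buffers):
        bufs = deque(f"buf{i}" for i in range(n_buffers))
        self.front = bufs.popleft()   # buffer currently being scanned out
        self.free = bufs              # back buffers the GPU may draw into
        self.pending = deque()        # finished frames waiting for the flip

    def render_frame(self):
        """GPU draws into a free back buffer; returns False if it must stall."""
        if not self.free:
            return False
        self.pending.append(self.free.popleft())
        return True

    def flip(self):
        """At the vertical blank, promote the oldest finished frame."""
        if self.pending:
            self.free.append(self.front)
            self.front = self.pending.popleft()

double, triple = SwapChain(2), SwapChain(3)
# Between two v-sync flips, try to finish two frames:
print(double.render_frame(), double.render_frame())   # True False -> GPU stalls
print(triple.render_frame(), triple.render_frame())   # True True  -> GPU stays busy
```

Without v-sync the flip happens the moment a frame is finished, so the free list never runs dry and the extra buffer changes nothing, which is the point being made above.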
 
Moloch said:
50fps, 33.3fps, 25fps and so on with a 100Hz refresh..
You seem to be confused here :p


My bad, mine goes from 60 to 30, so hell, there ya go, there is nothing in the middle.
So what's your complaint about no triple buffering, and why would that cause the frame rates not to go in these increments?
 
burningrave101 said:
It sounds like you're stating that if you enable triple buffering then your fps can go above your refresh rate. lol

But in actuality it doesn't do anything when you enable triple buffering unless you're using v-sync. Triple buffering just gives you a third buffer to render into instead of having just two. And none of this comes into play unless v-sync is enabled.


Exactly. Triple buffering alone might improve VRAM-to-GPU access times, since more materials can be given an address. But these cards go so fast it's almost not needed anymore.

When I stated I could increase the performance of my engine 40%, quadruple buffering was one of the components to do it ;), but it costs a lot, and you don't want to start page flipping trying to do this.
 
Moloch said:
Did it happen here or at rage3d??
There's a bad few in every crowd, but the bad bunch is much bigger on the nvidia side.
That's one reason I don't want to get a 6800, because then I'd goto nvidia specific sites and I would be discusted by the lack of intelligence.


Obviously you haven't spent enough time at Rage. :p

BTW, it's disgusted. Learn how to spell before you make comments about others' intelligence. Anybody else you want to anonymously slam?
 
burningrave101 said:
It sounds like you're stating that if you enable triple buffering then your fps can go above your refresh rate. lol

But in actuality it doesn't do anything when you enable triple buffering unless you're using v-sync. Triple buffering just gives you a third buffer to render into instead of having just two. And none of this comes into play unless v-sync is enabled.
Tell me something I don't know
 
poppachocks said:
Obviously you haven't spent enough time at Rage. :p

BTW, it's disgusted. Learn how to spell before you make comments about others' intelligence. Anybody else you want to anonymously slam?
Spelling is not directly related to intelligence; some extremely smart people can be horrible spellers.
It just so happens that on message boards, idiots think typos and spelling errors must mean the person is a complete moron.
 
Moloch said:
I spent most of my time at rage3d, until I said something I should not have..


Well, there ya go, probably something pro-nV, I'm guessing? They really don't have much up there. A couple of developers are there; they only post up demos for everyone to be wowed at.
 
rancor said:
Well, there ya go, probably something pro-nV, I'm guessing? They really don't have much up there. A couple of developers are there; they only post up demos for everyone to be wowed at.
No you fucking idiot, it was something I said in offtopic. BTW, there is, or was, a subforum for nvidia cards..
 
Moloch said:
No you fucking idiot, it was something I said in offtopic. BTW, there is, or was, a subforum for nvidia cards..


Please, they don't even allow anyone to post up an x800 vs 6800 thread. My god, why not? If the people that were there could do anything but flame, they wouldn't need that rule.....

And stop calling people names, because everything you said in the last few posts was absolutely false!
 
Ok, so I have an X800 XT PE, and by happenstance I have noticed that a majority of people in this thread own NVIDIA cards. However, I am not here to play the mine-is-better-than-yours game. So are you saying that because I invested in the X800 XT PE I have wasted my money and should go throw it in the trash? I have used both companies and I like them both; in this particular situation, when I built the twins, I decided that ATi was the way to go. But according to a lot of people in this thread I am wrong. All I want to know is whether it was a sound decision to get the ATi card, because I keep seeing conflicting benchmarks on BOTH sides, so I can't really ascertain the difference. Is the ATi X800 XT PE a piece of shit? Or is it a great card?
 
iToast said:
Ok, so I have an X800 XT PE, and by happenstance I have noticed that a majority of people in this thread own NVIDIA cards. However, I am not here to play the mine-is-better-than-yours game. So are you saying that because I invested in the X800 XT PE I have wasted my money and should go throw it in the trash? I have used both companies and I like them both; in this particular situation, when I built the twins, I decided that ATi was the way to go. But according to a lot of people in this thread I am wrong. All I want to know is whether it was a sound decision to get the ATi card, because I keep seeing conflicting benchmarks on BOTH sides, so I can't really ascertain the difference. Is the ATi X800 XT PE a piece of shit? Or is it a great card?


It's still a good card; no one is saying it's crap..... If you are thinking that's what we have stated, read it again. I don't think anyone at any time said that the x800's suck.
 
Moloch said:
I spent most of my time at rage3d, until I said something I should not have..

Then you know there is little or no objectivity on that site (except for the few brave souls that dare to post something that shows ATI in a less than stellar light). There's plenty of the "12 year old mentality" that you mentioned there.

There was a thread there about a month or so ago about someone who had an issue with an XT card catching on fire (or something like that). The majority of replies were something like "you must have screwed something up." One enlightened person actually said "too bad it didn't burn your house down." That's a mature and intelligent crowd for sure.

BTW: this guy also posted his experience at NVNews and, pretty much without exception, received replies like "sorry to hear that" or "hope ATI will make it right with you". One of the mods from Rage actually posted in that thread with an apology for some of the Rage users.

Maybe you need to actually visit places like NVnews before you trash the users there.
 
rancor said:
Please, they don't even allow anyone to post up an x800 vs 6800 thread. My god, why not? If the people that were there could do anything but flame, they wouldn't need that rule.....

And stop calling people names, because everything you said in the last few posts was absolutely false!
Why not? Look at this thread.. it just becomes mindless bashing.
What did I do that was false?
 
poppachocks said:
Then you know there is little or no objectivity on that site (except for the few brave souls that dare to post something that shows ATI in a less than stellar light). There's plenty of the "12 year old mentality" that you mentioned there.

There was a thread there about a month or so ago about someone who had an issue with an XT card catching on fire (or something like that). The majority of replies were something like "you must have screwed something up." One enlightened person actually said "too bad it didn't burn your house down." That's a mature and intelligent crowd for sure.

BTW: this guy also posted his experience at NVNews and, pretty much without exception, received replies like "sorry to hear that" or "hope ATI will make it right with you". One of the mods from Rage actually posted in that thread with an apology for some of the Rage users.

Maybe you need to actually visit places like NVnews before you trash the users there.
Did most of the users have a decent post count?
A lot of people have the same "you must have a problem, mine works fine" mentality; it's not just limited to ati, or nvidia for that matter.
I had an issue with firefox where I got a shitload of spyware. I thought that was kind of odd, since it's supposed to be good with regards to spyware, and only about 2 of the 50+ people who replied went to the site I told them about, and they got all kinds of spyware.
 
iToast said:
Ok, so I have an X800 XT PE, and by happenstance I have noticed that a majority of people in this thread own NVIDIA cards. However, I am not here to play the mine-is-better-than-yours game. So are you saying that because I invested in the X800 XT PE I have wasted my money and should go throw it in the trash? I have used both companies and I like them both; in this particular situation, when I built the twins, I decided that ATi was the way to go. But according to a lot of people in this thread I am wrong. All I want to know is whether it was a sound decision to get the ATi card, because I keep seeing conflicting benchmarks on BOTH sides, so I can't really ascertain the difference. Is the ATi X800 XT PE a piece of shit? Or is it a great card?

This thread has gotten way off topic, since it was originally about CS: Source benchmarks. HL2 was one game where ATI was supposed to reign supreme, and now it's looking like it may be more of a draw (performance-wise).

If you are happy with your card, don't worry about what others say or think.
 
Moloch said:
Why not? Look at this thread.. it just becomes mindless bashing.
What did I do that was false?


triple buffering........

Read up on it.......
 
I think it would be in everyone's best interest if we all just talked about the video cards here and stopped exchanging the opinions we have about each other and about other sites and forums.
 
Moloch said:
Reading comprehension.
Work on it. ;)

LOL whatever, triple buffering has nothing to do with frame rate locking....

Look at it this way: buffers are used for storage before things are sent to the GPU, so why would a third buffer create a frame rate lock?
 
rancor said:
It's still a good card; no one is saying it's crap..... If you are thinking that's what we have stated, read it again. I don't think anyone at any time said that the x800's suck.

I am not saying that someone said it as a statement of fact; I am simply gathering that implication from the posts I have read. So what I'm trying to ask is: is this a case of anything-you-can-do-I-can-do-better between the 6800 and X800 series cards? Because the implication I am gathering here is that the X800 series cannot compete with or outperform the 6800 in ANY arena. Is this correct? I was always under the impression that it was kind of an apples-and-oranges standpoint: NVIDIA was very, very good at what it does in OpenGL, and likewise ATi in DirectX. I may be wrong, and if I am that's fine.

If you want my opinion on it, which since I am a n00b probably doesn't matter, I believe that if ATi continues to create better driver sets for their cards we might see the tables turn. Now, when I say that I am not saying that the two will invert, but I am saying that we might just see ATi pull ahead, and then of course NVIDIA will have to develop a better driver to outperform ATi, and so on and so forth. This process is what makes the free enterprise market so awesome, but I digress.

NVIDIA has developed really great cards and continues to do so (except for the infamous hairdryer FX), and I am glad to see that a company is planning for the future of gaming. However, in defense of ATi, I believe that many of their performance-based issues have to do with their drivers. We all remember how difficult it was to load drivers back when the Rage 128 was out, and we as gamers and consumers went and got our GeForce or our Voodoo cards and had at it with that. All I think is that ATi has a chance to really propel this hardware further than they have thus far taken it. Yes, it is an older architecture, but I think that with proper development of software we have some really good things to look forward to for the X800 series of cards. And at any rate, when the margin of difference is 2% (a VERY negligible amount), I no longer think that my decision to use this card was a bad one.

Thanks for reading my rant. I feel relieved :D as you can tell I have had a rough day haha.
 
Moloch said:
Did most of the users have a decent post count?
A lot of people have the same "you must have a problem, mine works fine" mentality; it's not just limited to ati, or nvidia for that matter.

Exactly. There are plenty of those kinds of people to be found everywhere you look. But you were the one who claimed that the majority of them can be found on Nvidia forums. I took exception to that because it's simply not true.

Anyways like Burning said....can we get this thread back on topic?
 
iToast said:
I am not saying that someone said it as a statement of fact; I am simply gathering that implication from the posts I have read. So what I'm trying to ask is: is this a case of anything-you-can-do-I-can-do-better between the 6800 and X800 series cards? Because the implication I am gathering here is that the X800 series cannot compete with or outperform the 6800 in ANY arena. Is this correct? I was always under the impression that it was kind of an apples-and-oranges standpoint: NVIDIA was very, very good at what it does in OpenGL, and likewise ATi in DirectX. I may be wrong, and if I am that's fine.

If you want my opinion on it, which since I am a n00b probably doesn't matter, I believe that if ATi continues to create better driver sets for their cards we might see the tables turn. Now, when I say that I am not saying that the two will invert, but I am saying that we might just see ATi pull ahead, and then of course NVIDIA will have to develop a better driver to outperform ATi, and so on and so forth. This process is what makes the free enterprise market so awesome, but I digress.

NVIDIA has developed really great cards and continues to do so (except for the infamous hairdryer FX), and I am glad to see that a company is planning for the future of gaming. However, in defense of ATi, I believe that many of their performance-based issues have to do with their drivers. We all remember how difficult it was to load drivers back when the Rage 128 was out, and we as gamers and consumers went and got our GeForce or our Voodoo cards and had at it with that. All I think is that ATi has a chance to really propel this hardware further than they have thus far taken it. Yes, it is an older architecture, but I think that with proper development of software we have some really good things to look forward to for the X800 series of cards. And at any rate, when the margin of difference is 2% (a VERY negligible amount), I no longer think that my decision to use this card was a bad one.

Thanks for reading my rant. I feel relieved :D as you can tell I have had a rough day haha.

LOL I hear ya on the rough day :)

Well, ATi has done a lot to their drivers and they are very good; there is only so much you can optimize.

ATi knows what to do: next round they will improve their OGL performance, just like nV did with their DX performance. nV still lacks in vertex shader performance, and that's probably their major upgrade for the next line. And I stated a while back, maybe around a month ago, that this fight will get closer next line. Which is all good; it pushes the wheel forward for better technology. ATi has more to chew on, though, due to the fact they didn't revise their technology; nV has laid the groundwork with the GF 6 line as the basis of their GF 7's, or whatever they are called.

The x800's are good performers and lead or are equal to nV's offerings in DX. Unfortunately, the huge lead they should have in certain programs is not happening, and that doesn't look good for ATi's sales. A lot of people have bought ATi cards for HL2, and if it turns out that nV is just as close or ties them like in other DX games, well, that's like taking everyone that bought ATi cards for HL2 and throwing mud in their face. (Sounds a bit harsh; I don't know any other way of saying it.)

This isn't fully ATi's fault, either; I think Valve should be blamed for this.

If it wasn't for Valve releasing their numbers for HL2, the GF FX might have had better sales, at least for a bit.

Valve was the first company to put their game up for auction for bundling. That isn't right. Then they released numbers that wouldn't have even made a difference, since the game wasn't even ready. It's almost like Valve had a vendetta against nV?
 
rancor said:
LOL what ever, Triple buffering has nothing to do with frame rate locking....


Look at it this way, Buffers are used for storage before things are sent to the GPU, so why would a third buffer create a frame rate lock?
You or someone said something stupid about triple buffering, so I corrected whoever.
If you only have double buffering with vsync, when the frame rate drops below the refresh, it will be a fraction (usually half) of the refresh.
I personally use an 85Hz refresh with my freebie monitor, so that would be going to 42.5fps.. not bad for the games I play (RPGs), but for an FPS that could be bad.
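For anyone who wants the arithmetic behind those numbers, here's a minimal sketch of the classic model in Python (the 12 ms frame time is invented; it just stands for a frame slightly slower than one 85Hz interval):

```python
import math

def vsync_fps(render_ms, refresh_hz, triple_buffered):
    """Toy model of v-sync frame pacing."""
    interval_ms = 1000.0 / refresh_hz
    if not triple_buffered:
        # Double buffering: the GPU stalls until the next vertical blank
        # after each frame, so the frame cost rounds UP to a whole number
        # of refresh intervals -> refresh, refresh/2, refresh/3, ...
        cost_ms = math.ceil(render_ms / interval_ms) * interval_ms
        return 1000.0 / cost_ms
    # Triple buffering: the spare buffer keeps the GPU busy, so you get
    # the card's real rate, just capped at the refresh rate.
    return min(1000.0 / render_ms, refresh_hz)

# A 12 ms frame, just over one interval at 85Hz (~11.8 ms):
print(vsync_fps(12.0, 85, triple_buffered=False))   # 42.5  (half the refresh)
print(vsync_fps(12.0, 85, triple_buffered=True))    # ~83.3 (no hard drop)
# The 100Hz ladder mentioned earlier, with 15 ms and 22 ms frames:
print(vsync_fps(15.0, 100, False), vsync_fps(22.0, 100, False))  # 50.0 and 33.3
```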
 
rancor said:
Valve was the first company to put their game up for auction for bundling. That isn't right. Then they released numbers that wouldn't have even made a difference, since the game wasn't even ready. It's almost like Valve had a vendetta against nV?
I agree with you on this one. Maybe Valve knew HL2 would get held up in court and were looking for some quick cash, sorta like selling HL2/CS:Source on Steam now just for the quick buck.
 
Moloch said:
You or someone said something stupid about triple buffering, so I corrected whoever.
If you only have double buffering with vsync, when the frame rate drops below the refresh, it will be a fraction (usually half) of the refresh.
I personally use an 85Hz refresh with my freebie monitor, so that would be going to 42.5fps.. not bad for the games I play (RPGs), but for an FPS that could be bad.


There is one use for triple buffering I forgot about: ATi cards can multiply their AA levels with triple buffering, so 2x will look like 4x. This is temporal AA.

This only works with Vsync and triple buffering.

If you disable vsync for, let's say, Far Cry, what's the max fps you will get? If you do that for Quake, yeah, tearing might become a problem, but then again it goes fast enough and vsync will help ya out there.

Why would you need triple buffering for the latest games? Vsync doesn't help either, because you probably won't go past your monitor's refresh rate anyhow if you are using the highest settings.
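As a rough sketch of the temporal AA idea above (the sample positions here are invented; the drivers pick their own jitter patterns), the trick is just to alternate the 2x sample pattern every frame, so at a steady v-synced frame rate the eye averages two consecutive frames into something close to a 4x pattern:

```python
# Two mirrored 2x sample patterns: one for even frames, one for odd.
PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]
PATTERN_B = [(0.75, 0.25), (0.25, 0.75)]

def samples_for(frame_index):
    """Alternate the AA sample pattern per frame -- the core of temporal AA."""
    return PATTERN_A if frame_index % 2 == 0 else PATTERN_B

# Averaged over any two consecutive frames, four distinct sample
# positions get covered -- hence "2x will look like 4x":
print(sorted(set(samples_for(0) + samples_for(1))))
```

That averaging only holds if a new frame really lands every refresh, which is why it needs v-sync (and a steady frame rate) to look right.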
 
I love how people are quick to jump to conclusions.
Just remember, during the cs:s beta, ati cards blew nvidia out of the water in the tests.
ATi has been working hard to improve D3 performance, which it did quite well, and to fix the memory problem. I don't think it spent much time optimising CS:S, which we will hopefully see in 4.10.

I too am having doubts about ATi right now. Being an X800xt owner I feel kinda.... well, not kinda, I'm f*&^ing pissed off. I bought the x800xt specifically to play HL2 the best way possible; I just hope drivers fix this :mad:
 
OMG this thread is now 10 pages! Rancor, given the amt. of time you spend on here, I bet you could have completed that game you're working on by now. :\
 
MFZ said:
OMG this thread is now 10 pages! Rancor, given the amt. of time you spend on here, I bet you could have completed that game you're working on by now. :\


LOL, actually the engine core is done; I'm waiting on bug fixing, if there is any :cool:

The game isn't my job ;)
 
rancor said:
There is one use for triple buffering I forgot about: ATi cards can multiply their AA levels with triple buffering, so 2x will look like 4x. This is temporal AA.

This only works with Vsync and triple buffering.

If you disable vsync for, let's say, Far Cry, what's the max fps you will get? If you do that for Quake, yeah, tearing might become a problem, but then again it goes fast enough and vsync will help ya out there.

Why would you need triple buffering for the latest games? Vsync doesn't help either, because you probably won't go past your monitor's refresh rate anyhow if you are using the highest settings.
You haven't told me anything I didn't already know.. yet again :rolleyes:
rancor said:
Moloch, if ya need triple buffering enabled, it can be

http://downloads.guru3d.com/download.php?det=362

This will do it for ya :)

It's not that the cards can't do it; it's just not a driver option anymore.
I know that..
 
Moloch said:
You haven't told me anything I didn't already know.. yet again :rolleyes:

I know that..


Why did you state that it wasn't even capable?

nVidia drivers don't allow you to set triple buffering for games globally, so you are reliant on the game itself. This means setting my example, Richard Burns Rally, to vsync via the drivers and enabling triple buffering in the game's own settings would get me vsync with very little performance hit, I hope.
And I heard it flat out doesn't work in OGL.


You stated you have heard that it doesn't work in OGL. DX doesn't matter; it has to be hard coded anyways.
 
rancor said:
Why did you state that it wasn't even capable?

You stated you have heard that it doesn't work in OGL. DX doesn't matter; it has to be hard coded anyways.
That wasn't me.. I don't own an nvidia card, it's a guy on beyond3d.
 
Seriously, drop the drama, you two.

Here's one thing to bring this back on topic:

I'd like to see image quality comparisons and benchmarks of OpenGL rendering vs. Direct3D rendering in CS: Source.

From what I've been able to tell... there are no noticeable differences. Oh, there is one: Direct3D locks up on me :(

FPS seems about the same... image quality seems about the same... come on, who's brave enough to compare? :D
 
Moloch said:
No you fucking idiot
Name-calling makes you look bad and won't help your argument; I recommend you save that sort of language for everyday conversation outside of the forums. :)
 
iToast said:
Is the ATi X800 XT PE a piece of shit ? Or is it a great card?
It's a great card. Ok, OpenGL and Linux performance isn't exactly mind-blowing, but I'd say it's the fastest D3D card in the majority of current games. Also, with the right cooling you can reach pretty mental overclocks too. Availability is its only major problem. I wouldn't be unhappy if I owned an XT-PE; if someone tells you it sucks, then they don't know what they're talking about.
 
coz said:
It's a great card. Ok, OpenGL and Linux performance isn't exactly mind-blowing, but I'd say it's the fastest D3D card in the majority of current games. Also, with the right cooling you can reach pretty mental overclocks too. Availability is its only major problem. I wouldn't be unhappy if I owned an XT-PE; if someone tells you it sucks, then they don't know what they're talking about.

An X800XT PE is definitely a top-of-the-line card. It performs extremely well in D3D games, and the ATI drivers have a few interesting options, I will admit. 6xAA, triple buffering, 3Dc, and temporal AA are all nice features that can be used effectively.

IMO though, the 6800U is still the superior card because it is based on new technology instead of old and has more room to grow with driver updates. It's also superior to the XT in OpenGL and Linux, and with the latest beta drivers it's winning in the majority of D3D games at 1600x1200 w/ 4xAA + 16xAF. The 6800U also almost always wins when AA/AF isn't enabled, no matter what the resolution is.

The 6800U has already stolen the title in the CS:S benchmarks, which was supposed to be the one engine that ATI would completely dominate. The 6800U also has a lot of cool driver features the ATI cards don't have, such as digital vibrance, application profiles, coolbits, a plethora of legacy AA modes, 8xS (which is superior to the AA modes ATI supports), better full trilinear filtering support, a little better AF quality from what I've gathered, fewer game bugs, and new beta drivers which are constantly being leaked.

And let's not forget support for SM 3.0 and USII Technology.
 