Half Life 2 Benchmarks @ HardOCP.com

Some people just don't get it. :(

This reminds me of the time Terry left R3D, but this has been the fast-track version! :eek:

The nice man comes to talk to you and possibly give you info that you can't get anywhere else, and ya all feel it requires you to bash/troll him.

Well ain't that nice, bash a man who can't fight back since he has a professional reputation and job to maintain and drive away a possibly huge asset to your forum just to show how cool/tough/macho you are.

Big men, nice job. :rolleyes:
 
digitalwanderer said:
Some people just don't get it. :(

This reminds me of the time Terry left R3D, but this has been the fast-track version! :eek:

The nice man comes to talk to you and possibly give you info that you can't get anywhere else, and ya all feel it requires you to bash/troll him.

Well ain't that nice, bash a man who can't fight back since he has a professional reputation and job to maintain and drive away a possibly huge asset to your forum just to show how cool/tough/macho you are.

Big men, nice job. :rolleyes:

I did not bash Terry and never have. I apologize if my post is a little in your face... but I just feel that if ATi's drivers were stable to begin with, they would not need SmartGART or VPU Recover. I find SmartGART and VPU Recover useless and don't like them.

Bottom line is ATi and game developers gotta work together (like Valve and ATi did with HL2). I don't see why the OpenGL and D3D APIs aren't followed to a T by both nVidia and ATi.


:eek:

EDIT:
I use the Cat 3.10s and so far all my games work great.
The only game running slow for me is Vampire: The Masquerade - Bloodlines.
 
bizzy420 said:
moloch sm3 is A LOT better?
i guess it helped a lot in farcry huh... sm2 on ati's still smoking nvidia's sm3
Geometry instancing isn't an SM3 feature (nvidia gets a decent boost in some situations from it); Microsoft just tied it to SM3 for some reason. Also, both ati and nvidia got gains from the SM3/SM2.0b patch because the original shaders were replaced with more efficient ones, or so they say.
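For anyone curious what "tied it to SM3" actually means: in D3D9 the instancing calls are just stream-frequency settings, but the runtime only exposes them on devices reporting vs_3_0 caps (ATI later surfaced it on its SM2.0 parts through a driver FOURCC check, if memory serves). A rough C++ sketch, assuming an already-created IDirect3DDevice9* device, a mesh vertex/index buffer pair, a per-instance vertex buffer, and a two-stream vertex declaration already bound (all the variable names here are made up for illustration):

    // Requires <d3d9.h>; assumes 'device', the buffers and the vertex declaration already exist.
    // How D3D9 gates the feature: check for a vs_3_0-capable device.
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);
    bool canInstance = caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);

    // Stream 0: the shared mesh geometry, repeated numInstances times.
    device->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);
    device->SetStreamSource(0, meshVB, 0, sizeof(MeshVertex));

    // Stream 1: per-instance data (e.g. a world transform), stepped once per instance.
    device->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);
    device->SetStreamSource(1, instanceVB, 0, sizeof(InstanceData));

    // One draw call submits every instance.
    device->SetIndices(meshIB);
    device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, numMeshVerts, 0, numMeshTris);

    // Restore the default (non-instanced) stream frequencies afterwards.
    device->SetStreamSourceFreq(0, 1);
    device->SetStreamSourceFreq(1, 1);

Nothing in those calls needs SM3-class shader hardware, which is the point; the vs_3_0 requirement is a caps-level decision in the API rather than something baked into the hardware.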
 
digitalwanderer said:
Well ain't that nice, bash a man who can't fight back since he has a professional reputation and job to maintain and drive away a possibly huge asset to your forum just to show how cool/tough/macho you are.

Big men, nice job. :rolleyes:

Sorry, but being a "huge asset" doesn't include spreading PR fluff about your company's card beating the competition, then linking to benchmarks that used the timedemos your company created (or isolating a single benchmark instead of looking at the big picture), because custom timedemos and normal gameplay don't show your company's card to be any faster than the competition.

If he wants to stay and help, that's one thing. If he wants to preach about ATI being faster in order to influence opinion in the forum, because the game where ATI was supposed to be so much faster isn't significantly faster on ATI cards, that's another.

The reality is that Half Life 2 runs super fast with everything maxed, max res, aa/af, etc. even on the $400 cards (x800pro/6800GT), never mind the top end. And at the $400 point, it's a tie in this game. Which goes back to the 6800GT being the best value: it's much faster in Doom3 engine games (where the speed is actually needed, as even a 6800U can't run the game maxed), just as fast in HL2, can display OpenEXR HDR in FarCry unlike the x800, and has SM3.0, FP32, etc.
 
tranCendenZ said:
The reality is that Half Life 2 runs super fast with everything maxed, max res, aa/af, etc. even on the $400 cards (x800pro/6800GT), never mind the top end. And at the $400 point, it's a tie in this game.

Is it? Aside from the [H] benches I've seen it looks like a runaway for the x800Pro. Mind linking me to the benches you're referencing? (not being sarcastic, actually curious) :)
 
6800GTOwned said:
That anandtech comparison is very interesting. The Xbit labs benches had me kinda worried (haven't got hl2 yet myself, but they made it look pretty rough on a GT at 1280x1024 w/AA & AF). Thx :)

xbit labs used the old 66.93 drivers instead of the HL2 67.02 drivers that HardOCP, Anandtech, and Tommti used; that's why their results are so skewed.
 
NEVERLIFT said:
Sorry to burst your bubble Terry, but everyone I know that has an ATi card, me included, turns off VPU Recover (it's buggy and not stable). And we hate dumbgart (we use our BIOS and tend to know better what settings should be used... and there are a lot of posts on the Rage3D forums about how bad SmartGART is) :(
And PowerStrip, Rage3D Tweak and RivaTuner overclock video cards and unlock a lot more stuff ;)
PS: A lot of my friends are still using the Cat 3.10s because the newer drivers are not stable for them (this is with 9800pros).

I don't know anyone that has had problems with the CATALYST 4.x series drivers, and I know around 20 people with 9600 series cards and higher. None of us has experienced bugs with VPU Recover. Maybe you and your friends would experience more stability with the most recent drivers if you didn't mess about with your BIOS so much?
 
6800GTOwned said:
That anandtech comparison is very interesting. The Xbit labs benches had me kinda worried (haven't got hl2 yet myself, but they made it look pretty rough on a GT at 1280x1024 w/AA & AF). Thx :)

Anand also doesn't use 1600x1200.

I find it kind of ironic that certain people are complaining about hardware sites using "ATi timedemos", yet didn't have a problem with the same situation in Farcry. NV made the timedemos that are included in a patch.
 
jon67 said:
I don't know anyone that has had problems with the CATALYST 4.x series drivers, and I know around 20 people with 9600 series cards and higher. None of us has experienced bugs with VPU Recover. Maybe you and your friends would experience more stability with the most recent drivers if you didn't mess about with your BIOS so much?

Had a whole host of problems here with a 9600 & Cat 4.x that's now running in a secondary rig... Granted it's a slightly older rig, but in doing the research (after I bought it, sadly) there was a long-assed trail of complaints regarding compatibility of the card with slightly older hardware. I ended up fixing it amazingly quickly after the huge amount of reading I did (a combination of a BIOS update and a Fastwrites tweak solved the constant freezing), but it was mildly annoying regardless.

That being said, I don't see what this has to do with the thread at all. By and large the driver support of both companies seems stable in this day and age. Both have their problems, but neither has drastic flaws; past history is exactly that.
 
Kyle and Brent,

I've got a BFG 6800 OC (NU). I'd love to see an in-depth [H] style comparison of the real-world effects of using the RivaTuner 16X6 softmod, versus stock 6800 and 6600GT in particular, in HL2, Doom3, etc. etc. In addition to giving me a feel for the potential advantage, I think it could shed some instructive light on architectural benefits & design philosophies from a theoretical perspective.

I've seen posts with limited info, but the "pro treatment" of the subject would be a lot of fun.
 
Commander Suzdal said:
Kyle and Brent,

I've got a BFG 6800 OC (NU). I'd love to see an in-depth [H] style comparison of the real-world effects of using the RivaTuner 16X6 softmod, versus stock 6800 and 6600GT in particular, in HL2, Doom3, etc. etc. In addition to giving me a feel for the potential advantage, I think it could shed some instructive light on architectural benefits & design philosophies from a theoretical perspective.

I've seen posts with limited info, but the "pro treatment" of the subject would be a lot of fun.

That'd be real nice, but success of the procedure is real hit-and-miss, even across the same brand/version of cards. It's one of the few things that might make a 6800nu worth the price over a 6600GT, I guess, though.

I'm planning to try it myself once I test the max OC of the card with the stock pipes and make sure the install of my new system is otherwise stable regarding all other components. Will most surely post results. Won't have much to compare it to though (other than a Radeon 9600), but you can use the numbers as a baseline.
 
fallguy said:
Anand also doesnt use 1600x1200.

I find it kind of ironic, that certain people are complaining about hardware sites using "ATi timedemos", yet didnt have a problem with the same situation in Farcry. NV made the timedemos that are included in a patch.
Just remember, shader replacements are bad unless nvidia is using them; anything nvidia does is right, they never make mistakes.
If ati does shader replacements that don't affect IQ, it's bad; if ati supplies timedemos that are more demanding and highlight their hardware's speed, it's bad :)
 
I'm waiting for some more in-depth analysis of HL2 performance, frankly... Everyone's saying ATI's timedemos just stress the hardware more. Well, that's certainly plausible, but what exactly is in them that stresses the hardware more (other than general shader use)? How do we know the drivers themselves aren't optimized for these specific timedemos? No one's even pointed out the lowest FPS of any card in the reviews, just average fps scores... Not good enough.

Has anyone tried recording comparable demos to ATI's and then running them on all the hardware? I'm not trying to debunk the fact that the ATI timedemos may certainly reveal some shortcomings here and there or may be more taxing, I'd just like an explanation rather than taking it at face value.
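For anyone who wants to try: as far as I know the Source console makes this straightforward (the demo name below is just a placeholder):

    record mydemo      // start recording your own gameplay to mydemo.dem
    stop               // stop recording
    timedemo mydemo    // replay it as fast as the hardware allows and report an average fps

If I remember right, timedemo also prints an "fps variability" figure, but nothing like a true minimum-fps graph, so you'd still need something else to catch the worst-case dips.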
 
Moloch said:
Just remember, shader replacements are bad unless nvidia is using them; anything nvidia does is right, they never make mistakes.
If ati does shader replacements that don't affect IQ, it's bad; if ati supplies timedemos that are more demanding and highlight their hardware's speed, it's bad :)

Actually, it's quite the other way around... shader replacements were "evil" until they became a "feature" when ATi introduced "AI"... just like dual-slot cooling is horrible till the x850... and PS3.0 is unnecessary till r520... and SLI is a horrible idea till AMR... and bridge chips are bad till ATI uses them next year on the x700s... did I miss anything?
 
jon67 said:
I don't know anyone that has had problems with the CATALYST 4,x series drivers, and I know around 20 people with 9600 series cards and higher. None of us has experienced bugs with VPU recover. Maybe you and your friends would experience more stability with the most recent drivers if you didn't mess about with your BIOS so much?

Try using the "System Cache" setting with an ATi card and watch what happens
:p

Right-click My Computer, then Properties, then Advanced, then Performance settings, and check the System cache setting in there.
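For reference, that radio button just flips a single registry value, so far as I know (shown here for anyone who wants to check or undo it by hand):

    HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management
        LargeSystemCache (DWORD): 1 = "System cache", 0 = "Programs" (the default)

It biases physical memory toward the file cache instead of application working sets, and it has a reputation for not playing nicely with some AGP video drivers, which is presumably the "watch what happens" part.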
 
Actually, it's quite the other way around... shader replacements were "evil" until they became a "feature" when ATi introduced "AI"...

Shader replacements are evil as long as you can't disable them. ;)
 
NEVERLIFT said:
Try using the "System Cache" setting with an ATi card and watch what happens
:p

Right-click My Computer, then Properties, then Advanced, then Performance settings, and check the System cache setting in there.

OK, done that.

Restarted Windows and verified that System cache was still checked. I have run HL2 with a slight GPU OC and MS Office programs, and am now using the internet without any problems. What kind of errors/bugs should I expect? And why would I check System cache in the first place?

btw I'm using the 4.12b drivers
 
Shuttle XPC SB75G2
P4 3.2C
1gb Kingston HyperX PC4000
BFG 6800GT OC @ 439 / 1170
Kingwin AWC-1 Water Cooling Kit
3DMark05 Score 5442

[Images: XPC_WATER2.jpg, XPC_WATER3.jpg]
 
dworley, why post your PC pics in here? This is an HL2 benchmark thread, not a post-your-PC thread.
 
Apple740 said:
Shader replacements are evil as long as you can't disable them. ;)

I feel the same way about anisotropic filtering optimizations (which, despite what ATi might say, still can't be disabled)...
 
^eMpTy^ said:
I feel the same way about anisotropic filtering optimizations (which, despite what ATi might say, still can't be disabled)...
You talking about angle dependency? Or what?
The doom3 shader replacements ati does actually improve the quality, so I like 'em :)
 
...one thing that has hardly ever been discussed...
do the two cards (x800 and 6800) show THE SAME images?

I mean, apart from benchmarks, do the two cards give out exactly the same output images at the same res/settings?

Any answer/pic comparison would be highly appreciated.
 
boodi said:
...one thing that has hardly ever been discussed...
do the two cards (x800 and 6800) show THE SAME images?

I mean, apart from benchmarks, do the two cards give out exactly the same output images at the same res/settings?

Any answer/pic comparison would be highly appreciated.

X-bit labs has a bunch of comparison images... All the reviews I've seen point towards IQ being virtually the same 'cept for some slight fog difference on distant landscape (ATI's is lighter, NVidia's is denser).
 
spyderz said:
Mr Bennett, just wanted to let you know how much I like your website, but the best thing of all is how you respect the opinions given here in the forums and how you always try to keep things calmed down in here. There have been a few that I have also been disappointed with; due to their position I guess I expected a different attitude. I can take being called a fanatic (this always happens when you have an answer to their answer, or you have an answer they can't explain right or explain to us simple folk in simple terms), but one thing I can't stand is this ( :rolleyes: ) emoticon; in a face-to-face situation (rolling eyes) would open a can of whoop-ass. Other than that I love the article, thanks.

*update* yea, where I come from, this shit is true
*update* it's monday and I'm at work with a hangover. Had my wife type this up for me last night. Beer can be an evil thing, but damn does it taste good. Disregard previous notes, they were dictated by a hilarious drunk.
 
boodi said:
could you guys give me any reason to choose an ati x800 xt over a 6800 gt ?

An X800XT is going to cost you as much as $100 more than the 6800GT, so you can't really compare the two. In terms of price/performance the 6800GT is an easy winner.
 
boodi said:
could you guys give me any reason to choose an ati x800 xt over a 6800 gt ?

because the xt is a better card... fyi... those cards are not compared because they do not match in performance... especially in this thread... the xt overpowers the Ultra... if you were trying to compare the 6800gt to an Ati card in this thread... it would be the x800 pro...
 
mohammedtaha said:
because the xt is a better card... fyi... those cards are not compared because they do not match in performance... especially in this thread... the xt overpowers the Ultra... if you were trying to compare the 6800gt to an Ati card in this thread... it would be the x800 pro...

The XT overpowers the Ultra?

Yea, in certain games the XT is a little faster, but the 6800U is faster in a lot of DX9 games as well, while being faster in any game built on the OpenGL API by a fair margin and dominating in Linux performance.

The way I see it, the 6800U and X800XT are tied for performance in DX9 while nVidia holds the OpenGL and Linux crown easily.

That's 2.5 to 0.5, and yet the X800XT overpowers the Ultra?

Which is better also depends on what resolution and settings you play at. Even if the X800XT is faster in a game at 1600x1200, that doesn't mean it's faster at 1280x1024.

I'm sorry, but the 6800GT should be an easy choice over the X800XT simply because of the price difference. The X800XT is faster than a stock 6800GT, but only marginally in DX9.
 
burningrave101 said:
The way I see it, the 6800U and X800XT are tied for performance in DX9.

This is a typical example of misunderstandings arising from CPU limited video card benchmarking tests.
 
mohammedtaha said:

Maybe you should get informed.

jon67 said:
This is a typical example of misunderstandings arising from CPU limited video card benchmarking tests.

I wasn't talking about Half-Life 2 alone. I was talking about the entire DirectX collection of games out there, which comprises hundreds of titles. Half-Life 2 does NOT equal DX9, lol. It's ONE game and ONE engine, and the performance shown in it will only hold true in other games based on the same exact engine.

nVidia also hasn't had the time to optimize for HL2 like ATI had. You can be sure there will be a performance gain on the NV40 side once nVidia has time to get out some more driver revisions.

And you ATI fans are talking about the CPU-limited benchmarking tests in HL2, but the people that benchmarked the game were simply playing the game. They didn't go hunting for small instances where performance is going to be more GPU intensive. Which card is faster in those scenes doesn't mean shit, because the only performance that matters is the performance you actually see when playing the ENTIRE game. And since the Source engine is fairly CPU-limited, it's not going to matter whether you have a 6800 or an X800, because you're going to get the same top-of-the-line performance out of each.
 
Dear boodi, I have explained my point of view in a previous post in this thread (and taken flak for it as well). Basically, CPU-limited tests will not bring out the max performance of the best video cards, thereby masking the differences between them.
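To put made-up numbers on it: if the CPU side of the engine tops out at around 90 fps in a scene, a card that could render that scene at 150 fps and one that could manage 100 fps will both report roughly 90 fps, and the benchmark calls it a tie. The gap only shows up once the resolution/AA/AF is raised far enough for the GPU to become the bottleneck again.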
 
burningrave101 said:
Maybe you should get informed.



I wasn't talking about Half-Life 2 alone. I was talking about the entire DirectX collection of games out there, which comprises hundreds of titles. Half-Life 2 does NOT equal DX9, lol. It's ONE game and ONE engine, and the performance shown in it will only hold true in other games based on the same exact engine.

nVidia also hasn't had the time to optimize for HL2 like ATI had. You can be sure there will be a performance gain on the NV40 side once nVidia has time to get out some more driver revisions.

You're right, and this might seem obvious to you and me. Also, according to Anand, nvidia only had two days with the game and ATI had a week, and this can make a big difference. But to people with ati hardware this doesn't matter; ati's performance advantage is "due to better hardware". Yet back when doom3 was out it wasn't nvidia's hardware or better opengl support; in their eyes it was because ID gave nvidia more time with the game and/or the game was optimized for nvidia hardware. The big problem this time around is, like I stated before, they were expecting a whoop-ass à la doom3, they didn't get it, and now they're left picking at whatever little thing they can find. Kudos to nvidia, who only had two days with the game and their performance is good; can't wait to see what they have up their sleeves now that they have the game in their hands.
 
jon67 said:
Dear boodi, I have explained my point of view in a previous post in this thread (and taken flak for it as well). Basically, CPU-limited tests will not bring out the max performance of the best video cards, thereby masking the differences between them.

The only performance that matters is what you see when actually playing the game. Nobody but ATI enthusiasts gives a rat's ass about the performance ATI can show in their special graphically intensive benchmark demos that last for a few seconds. If that same performance doesn't show when someone runs through an entire level in HL2 like HardOCP did, then those results are worthless.

You play the GAME, not the BENCHMARK!
 
burningrave101 said:
Maybe you should get informed.



I wasn't talking about Half-Life 2 alone. I was talking about the entire DirectX collection of games out there, which comprises hundreds of titles. Half-Life 2 does NOT equal DX9, lol. It's ONE game and ONE engine, and the performance shown in it will only hold true in other games based on the same exact engine.

Maybe you should read the title before posting.
 