Half Life 2 Benchmarks @ HardOCP.com

Apple740 said:
Firingsquad is showing some benches where it looks like nVidia has a problem with water shaders.

[benchmark graphs: canals1280aaf.gif and canals1600aaf.gif]



http://www.firingsquad.com/hardware/half-life_2_performance/default.asp
Did they use the same drivers as [H]?
 
Why the hell would you have a 2-page report and devote 1/6th of it to AA modes that no one uses? Why would you even bother comparing the supersampling modes on the 6800s to the 6x mode on the XTPE??? That makes ZERO sense.

How about comparing the 6x mode to nVidia's 4x to see how much of a tangible difference in image quality it makes, since we all know the supersampling modes on nVidia hardware are COMPLETELY DIFFERENT? At least give a shot of what supersampling can do vs multisampling... this blind comparison with no explanation is very confusing, especially given the brevity of the preliminary benches...

I give it an hour before people are in here talking about how the XTPE is 100% faster than the 6800U based on those graphs... this hasn't been in any other [H] vid card review that I've seen, so why start now?
 
^eMpTy^ said:
Why the hell would you have a 2-page report and devote 1/6th of it to AA modes that no one uses? Why would you even bother comparing the supersampling modes on the 6800s to the 6x mode on the XTPE??? That makes ZERO sense.

How about comparing the 6x mode to nVidia's 4x to see how much of a tangible difference in image quality it makes, since we all know the supersampling modes on nVidia hardware are COMPLETELY DIFFERENT? At least give a shot of what supersampling can do vs multisampling... this blind comparison with no explanation is very confusing, especially given the brevity of the preliminary benches...

I give it an hour before people are in here talking about how the XTPE is 100% faster than the 6800U based on those graphs... this hasn't been in any other [H] vid card review that I've seen, so why start now?
Calm down dude.
 
FingerSlut said:
Anyone try this game with 2 gigs of RAM? I hear the review systems had 2 gigs. I've tried with 1 gig and 1.5 gigs, and the stuttering did go down, just not totally. I'm just out of RAM slots :(

heh, read the benchmarks, n/m
Yeah, the Norton defragger did help a lot with the stuttering I had before. I used to use Executive Software's Diskeeper defragger, but it didn't do as good a job defragmenting Half-Life 2 as the Norton software. It cut the stuttering by a significant margin.
 
Moloch said:
Didn't read the whole thread, but unless Brent turned on +r_fastzreject 1, the benchmarks are not good enough for me. The nVidia cards have this enabled by default and the ATI ones don't. In CPU-limited levels it could be slower, but the gains from the tweak can be 40+ fps.

What are you saying here...? Is this a command for the HL2 console? Will this one little command improve the performance of any ATI card that isn't CPU-limited? Including my 9800 Pro?
 
^eMpTy^ said:
Why the hell would you have a 2-page report and devote 1/6th of it to AA modes that no one uses? Why would you even bother comparing the supersampling modes on the 6800s to the 6x mode on the XTPE??? That makes ZERO sense.

How about comparing the 6x mode to nVidia's 4x to see how much of a tangible difference in image quality it makes, since we all know the supersampling modes on nVidia hardware are COMPLETELY DIFFERENT? At least give a shot of what supersampling can do vs multisampling... this blind comparison with no explanation is very confusing, especially given the brevity of the preliminary benches...

I give it an hour before people are in here talking about how the XTPE is 100% faster than the 6800U based on those graphs... this hasn't been in any other [H] vid card review that I've seen, so why start now?

Why is what people are saying on some forum so important to you? Go outside and get some fresh air or something.
 
Moloch said:
But it's supposed to be head to head.
I agree if you're going for an out-of-box experience, but I think it would be worthwhile to see what happens when you enable that command, since users are reporting such large gains.
It seems kinda silly not to do it, if you knew about the command, just to satisfy curiosity.

Actually, I didn't know about the command until someone posted about it in this thread this morning.
 
Mr Mean said:
Hey Brent, did you try using the driver-forced AA/aniso and disabling the in-game AA? Are the results the same?

Didn't make any comparisons.

I used the in-game AA/AF settings.

I always use the in-game settings unless there are none; then I fall back to CP settings.
 
Moloch said:
Didn't read the whole thread, but unless Brent turned on +r_fastzreject 1, the benchmarks are not good enough for me. The nVidia cards have this enabled by default and the ATI ones don't. In CPU-limited levels it could be slower, but the gains from the tweak can be 40+ fps.
Why don't you ask Catalyst Maker about this? Seems like a waste not to make good use of his expertise while he's here.
 
Apple740 said:
Firingsquad is showing some benches where it looks like nVidia has a problem with water shaders.
I already mentioned this earlier in the thread. The 66.93 drivers that ATi and FS used seem to struggle on the Canals level, while the 67.02 drivers that [H] and Anand used seem much better. Not sure why, but 67.02 is supposed to have better shader performance, and the Canals level seems pretty shader-intensive.
 
coz said:
Why don't you ask Catalyst Maker about this? Seems like a waste not to make good use of his expertise while he's here.
http://www.beyond3d.com/forum/viewtopic.php?t=18170
Everything is answered in there.
The decision to have it disabled was made back in the 9800/FX era. In parts of the game that are not CPU-limited there are gains to be had, but in CPU-limited situations it can slow performance down a bit.
Btw, I post at beyond3d, so it's not like seeing him is a big wow anymore :)
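
For anyone who wants to measure it themselves, here's a minimal sketch of an A/B run (the demo name is just a placeholder, and I'm assuming the cvar can be toggled from the console the same way the +r_fastzreject 1 launch option sets it):

r_fastzreject 0
timedemo yourdemo
r_fastzreject 1
timedemo yourdemo

Compare the average fps between the two runs; per the Beyond3D thread, the gain should show up in the GPU-limited sections, and there may be a small dip in the CPU-limited ones.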
 
spyderz said:
Anybody care to explain how to launch the [H] demos?

Put them in your half life 2/hl2 directory.

Unzipped, of course.

Bring down the console, and if you just want to play the demos at regular speed, type: playdemo demoname

demoname = the name of the demo

If you want to run a timedemo, then type: timedemo demoname
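
So, for example, if one of the demos extracted as hardocp1.dem (name hypothetical), you would bring down the console and type either:

playdemo hardocp1
timedemo hardocp1

The first just plays it back at regular speed; the second runs the benchmark and reports the average FPS when it finishes.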
 
Brent_Justice said:
Put them in your half life 2/hl2 directory.

Unzipped, of course.

Bring down the console, and if you just want to play the demos at regular speed, type: playdemo demoname

demoname = the name of the demo

If you want to run a timedemo, then type: timedemo demoname

thanks
 
spyderz said:
Anybody care to explain how to launch the [H] demos?
https://mywebspace.wisc.edu/phora/web/HL2_demos/

d* - ATI
hard* - [H]
at* - anandtech

How to Bench
You need to add -console to the HL2 shortcut, if you have one. If you run it from the Steam menu, select HL2 in the menu -> right-click on the game -> Properties -> Launch Options -> put "-console" in there.

Start the game.
The console is pretty advanced and autocompletes for you. If you want FPS shown, type: cl_showfps 1
It will autocomplete halfway through, and you can use the arrow keys to toggle through commands.
Then run timedemo filename. Note: the filename will also autocomplete, provided you have put the .dem files in "C:\Program Files\Steam\SteamApps\EMAIL\half-life 2\hl2".
You just need to enter the first character and it will autocomplete.

If you use the timedemoquit command, HL2 quits after each demo, which is useless. It's better to run them all together with the timedemo command.

In either case, all the results are logged (appended, so don't worry about old ones getting deleted) to the following file: "C:\Program Files\Steam\SteamApps\EMAIL\half-life 2\hl2\sourcebench.csv"

It's an Excel-readable CSV file; you can just copy and paste from there.
Also, these demos contain spoilers, so watch out.
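
To recap the whole flow as one quick sketch (the demo name is a placeholder; autocomplete will fill in the real one, and adjust the path for your own account folder):

1. Add -console to the launch options and start the game.
2. In the console: cl_showfps 1
3. Then: timedemo hard01
4. Open sourcebench.csv in Excel and copy out the appended results.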
 
So I guess HL2 is a tie between nVidia and ATI, huh!!!! That's nice to know!! The 6800 is faster on this map but the X800 is faster on that map!!! I love it!!! Now I don't feel bad about getting an Ultra for HL2, even knowing the X800 runs faster in places :) Well, thanks for the benchmarks, HardOCP, and I am off to play some HL2!! woot :) !!!!! :p
 
Ahhhh! nvm, hehe, the ATI Radeon X800 XT gets ahead by a bit in other benchmarks I've looked at :) especially with AA and AF on. Hehe, oh well, it's no biggy, time to go play HL2!!!
 
Nice write-up... the sad part is most people would never know the difference between the hardware while they were playing unless you told 'em what's in the box. Right now you've got the nVidia camp cheering because the 6800 is not getting bitch-slapped around like many said it would be, and you've got the ATi camp trying to find commands to help out a card that is doing a damn fine job all by its lonesome. Looking in from the outside, it seems they are grasping at straws trying to make their e-peens bigger instead of just sitting down and playing the damn game. Now I'm gonna go off and play some more HL2... for some reason I keep playing the airboat level... tons of fun, especially Carmageddon'ing those Combine jackasses that try to rope down into the canal :D
 
Hi everyone, I need some help with running the HL2 HardOCP timedemos. I've downloaded and extracted the demos to my HL2 folder and enabled the developer console, but I'm not sure what to type after... timedemo. I've tried a few different commands but I can't get the demo to run. So if someone could give me (a noob) some instructions, I would appreciate it.

thanks
 
macatak said:
Hi everyone, I need some help with running the HL2 HardOCP timedemos. I've downloaded and extracted the demos to my HL2 folder and enabled the developer console, but I'm not sure what to type after... timedemo. I've tried a few different commands but I can't get the demo to run. So if someone could give me (a noob) some instructions, I would appreciate it.

thanks

Type the name of the demo.

If you put it in the correct folder, then after you type timedemo, hit h and a drop-down list of all the demos starting with an h should come up. Since ours start with an h, it will list the ones you have there; use your arrow keys to select one and hit enter.
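
For example (demo name hypothetical), the session in the console looks like:

timedemo h
  -> the drop-down suggests hardocp1.dem and any other hard* demos you extracted

Arrow down to the one you want, hit enter, and the run starts.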
 
I didn't even come close to reading all the replies in this thread, but of the ones I read there were a few talking about how Doom 3 is optimized for nVidia cards. But hell, when hasn't an nVidia card won an OpenGL benchmark? It was one of those givens to begin with...
 
Brent_Justice said:
It would be nice if the timedemo result HL2 gives you also shows the Min FPS. Right now all it shows is the AVG FPS.

When we do gameplay evaluation using HL2 we will of course note the Min FPS in our graphs using FRAPS as usual. IMO the Min FPS is more important than average FPS.



I am waiting on that... sometimes a "slower" card (lower PEAK speeds) can have a HIGHER minimum framerate... the 6800GT vs. the X800XT in Far Cry, for example. The ATI part has a minimum framerate at 1600x1200 that's almost double the 6800GT's at 1280x1024... read the review, it's right there...
 
Moloch said:
Didn't read the whole thread, but unless Brent turned on +r_fastzreject 1, the benchmarks are not good enough for me. The nVidia cards have this enabled by default and the ATI ones don't. In CPU-limited levels it could be slower, but the gains from the tweak can be 40+ fps.

And whose fault is it that it's disabled by default on ATI cards? It's sure as hell not Brent's fault, and it's not his responsibility to use console commands to try and make the benchmarks seem more "fair".

digitalwanderer said:
True, but that don't really mean the cards were exactly "apples to apples" compared. ;)

Next comes the image quality thing, I just got done comparing in-game AA/AF to forced thru control panel/radlinker and the forced one is just drop-dead gorgeous to me....and I'm getting amazing framerates at 6xAAt2 16xAF! :D

I LOVE THIS GAME!!!!

Next up, I gotta somehow tear myself away from playing on me X800 to see how it looks on a GT....but I probably won't be able to until I finish the game. :LOL:

Good benchies, sorry about me earlier confusion about v-sync....I was still pre-coffee/my head in HL2 and I thought the graphs all topped out at 100. :oops:

(BTW-HI TERRY!!!! :D )

CP AA/AF is only there for games that don't have the option of enabling it in-game. CP AA/AF doesn't apply it the same way the game developer intended, and it's usually more optimized to try and boost fps.

Control Panel-enabled AF on ATI hardware is more optimized, especially for DX9. Optimized here means things like brilinear filtering. You'll get a little better performance with a little lower IQ.

And the benchmarks were very much "apples to apples" because it was default to default and out of the box to out of the box. If you start changing settings around to boost performance on one card just because it doesn't have something enabled by default, then it's no longer apples to apples, because the cards aren't running at their default settings.

And like Brent said, the majority of NV40 and X800 users don't have a clue about some console command from the Beyond3D forums that boosts performance on the ATI cards, so why show results the majority of users will never see without that command?
 
How's it apples to apples if it's enabled on one and not on the other? Sounds like laziness to me...
 
Gavinni said:
How's it apples to apples if it's enabled on one and not on the other? Sounds like laziness to me...

It's apples to apples because the cards are both running at stock. It is NOT Brent's responsibility to enable features that are DISABLED by the card's manufacturer to try and make the benchmarks seem more fair.

And if it's so detrimental to the ATI cards' performance, why did ATI not choose to have it enabled? ATI had a copy of HL2 for a whole week while nVidia only had it for 2 days. If anyone is lazy, it is none other than ATI.
 
Gavinni said:
How's it apples to apples if it's enabled on one and not on the other? Sounds like laziness to me...
Jesus-Hopscotching-Christ, it's enabled by default on NV cards BUT not by ATi. It's ATi/NV's fault, go cry to them about it. It's not Brent's or any other reviewer's job to enable it on both, since that's not the experience the end user will get "out of the box."
 
CrimandEvil said:
Jesus-Hopscotching-Christ, it's enabled by default on NV cards BUT not by ATi. It's ATi/NV's fault, go cry to them about it. It's not Brent's or any other reviewer's job to enable it on both, since that's not the experience the end user will get "out of the box."

It's actually not ATi/NV's fault, it's Valve's, but w/e...
 
Gavinni said:
It's actually not ATi/NV's fault, it's Valve's, but w/e...

Ya know, it just seems like it's been one thing after another this year that people have screamed needs to be enabled on the X800's to make the performance "fair" against the NV40's lol. DOOM 3 and the Humus hack, Far Cry and the 2.0b path, and now HL2 and this console command are just some of the more well-known ones.

If it's not enabled by default, then it shouldn't be permissible in a review. To me it just leans toward skewed benchmarks.
 
burningrave101 said:
CP AA/AF is only there for games that don't have the option of enabling it in-game. CP AA/AF doesn't apply it the same way the game developer intended, and it's usually more optimized to try and boost fps.

Control Panel-enabled AF on ATI hardware is more optimized, especially for DX9. Optimized here means things like brilinear filtering. You'll get a little better performance with a little lower IQ.
True to an extent; I forced it thru RadLinker, though, using the "force all levels" AF setting to improve the visuals even though it's a bit of a performance hit.

And the AA thru the control panel looks better for two reasons, I think:

1. It applies AA to all the layers rather than just those the developers intend (better visuals, but usually a bit of a performance hit)

2. Temporal AA

I'm not trying to argue, just explaining my settings better.

I have no clue how I'll set it up on the 6800; I'll have to do a bit of research, since I'm a bit rusty on all things nVidia right now. ;)
 
burningrave101 said:
Ya know, it just seems like it's been one thing after another this year that people have screamed needs to be enabled on the X800's to make the performance "fair" against the NV40's lol.

Well, they have to do something, don't they? ;) Honestly, I'm rather tired of seeing the same old excuses again and again too. Isn't it enough that both NV and ATI have great cards?
 
burningrave101 said:
Ya know, it just seems like it's been one thing after another this year that people have screamed needs to be enabled on the X800's to make the performance "fair" against the NV40's lol.

And nobody ever needs something to be applied to an nVidia-based card? Even people like you scream for newer drivers, different AA settings, non-apples-to-apples scenarios, SM3.0, etc., to be enabled so that nVidia gets shown in a more positive light. How is that any different?

Why does it seem like you and some other users are trying to "sell" one company over another? Why must you make posts just to put doubt in people's minds over their own decision on what to spend their own money on? Is this all just a big game with extremely loud and overzealous cheerleaders?
 
burningrave101 said:
And whose fault is it that it's disabled by default on ATI cards? It's sure as hell not Brent's fault, and it's not his responsibility to use console commands to try and make the benchmarks seem more "fair".

CP AA/AF is only there for games that don't have the option of enabling it in-game. CP AA/AF doesn't apply it the same way the game developer intended, and it's usually more optimized to try and boost fps.

Control Panel-enabled AF on ATI hardware is more optimized, especially for DX9. Optimized here means things like brilinear filtering. You'll get a little better performance with a little lower IQ.

And the benchmarks were very much "apples to apples" because it was default to default and out of the box to out of the box. If you start changing settings around to boost performance on one card just because it doesn't have something enabled by default, then it's no longer apples to apples, because the cards aren't running at their default settings.

And like Brent said, the majority of NV40 and X800 users don't have a clue about some console command from the Beyond3D forums that boosts performance on the ATI cards, so why show results the majority of users will never see without that command?

I never said it was his fault. I said that if he had known about it (which he didn't), he should have enabled it to see how much of a gain there is. It's like enabling SM3.0 for nVidia cards in Far Cry to see what kind of gains are to be had.
 
SnakEyez187 said:
Why does it seem like you and some other users are trying to "sell" one company over another? Why must you make posts just to put doubt in people's minds over their own decision on what to spend their own money on?

Because maybe the person who wants to spend $400-$500+ would like to spend their money on the "best" video card, and with the help of people like me and others who argue about which is better, they can get all the facts laid out in front of them instead of just going off some half-baked review they read, or what some friend of theirs said who had no clue what the hell they were talking about.

If everyone did as you suggest, then the video card section would be dead except for those whining about technical issues with the card they already purchased.

What the hell else would we bicker about in here anyway, lol.

I mean, if you don't like people debating about which is better, you don't exactly have to come into this section of the board. Even I get tired of it sometimes and spend my time in other sections of the board.
 
burningrave101 said:
Because maybe the person who wants to spend $400-$500+ would like to spend their money on the "best" video card, and with the help of people like me and others who argue about which is better, they can get all the facts laid out in front of them instead of just going off some half-baked review they read, or what some friend of theirs said who had no clue what the hell they were talking about.
That's just the thing... I think I know a bit about what I'm talking about, having played with both, and there just ISN'T a "best card" right now; it comes down to a matter of preference.

I still prefer ATi's AA over nVidia's, and temporal AA and ATi's regular monthly driver releases, along with my personal favoritism towards ATi (oh hell yeah I'm biased, I just try not to let that influence my opinions too much or get in the way of the facts as they come), decided this round for me... but if someone is a Linux user or prefers nVidia for some other reason, I really can't say they're wrong for it.

Personally I prefer it this way: everybody is a winner. :)
 