Half Life 2 Benchmarks @ HardOCP.com

Status
Not open for further replies.
It appears that the 6800 loves the Athlon 64 FX CPU. I really wonder what ATi did with those benchmarks showing the X800XT in the lead by 60%...
 
CATALYST MAKER said:
Whats up Kyle, Brent.
For any folks who don't know, this guy is Terry Makedon and he works for a company called ATi. Kudos to Terry and ATi for getting guys on the forums and allowing us poor geeks some direct contact with them. Now, if only NVIDIA would do the same!! :D

Hey Terry, why are the ATi HL2 benchmarks showing a lead for X800XT-PE over 6800U of up to 55% (1600x1200 4xAA 8xAF) when HardOCP and Anand show it's more like 15-22%. Seems like a large discrepancy to me. Also, are your benchmark results the ones from Driverheaven or did they get theirs from you?

Cheers. :)
 
Kern said:
Brent! Jesus Christ! I could drive that buggy better with my mouse duct taped to ma pen-15! ;) j/k

hehe, it gave me a chance though to use the gravity gun, and in the water it caused splashes, which was a good effect to get in the demo

i love that gravity gun
 
CATALYST MAKER said:
I did. And I still do. But I guess some would call me biased :)

I won't get into it in here though. Just wanted to pop in and say I love this game.

At home I am using a 9800XT and at work I just popped in a X800 PCI-E board.

I wonder if we could get a poll as to what settings people are playing this game with? I'm curious how much everyone is cranking it.

1600x1200, everything on, 4XAA and 8XAF... if the ATi cards are really 40% faster, then that's OK because the game runs really well at those settings for me... 40% over that, you can keep it.
 
coz said:
For any folks who don't know, this guy is Terry Makedon and he works for a company called ATi. Kudos to Terry and ATi for getting guys on the forums and allowing us poor geeks some direct contact with them. Now, if only NVIDIA would do the same!! :D

Hey Terry, why are the ATi HL2 benchmarks showing a lead for X800XT-PE over 6800U of up to 55% (1600x1200 4xAA 8xAF) when HardOCP and Anand show it's more like 15-22%. Seems like a large discrepancy to me. Also, are your benchmark results the ones from Driverheaven or did they get theirs from you?

Cheers. :)

Sup cuz ....
or is it coz...

Thanks for the welcome. I can explain the difference in results and what you are seeing from different places. I don't know if I should though. The !!!!!!s of the other team will just dismiss it as being biased and not believe me anyways.

But sure, if I see others with interest in my explanation of the differences, let me know and I will give it a shot.

I really just came in here to say that this game is awesome.
 
Does [H] plan to release expanded benchmarks, something like an "Official Half Life 2 [H]ardware Guide"? I'd be interested in seeing how the game performs on the ATI cards that were originally supposed to ship with HL2 included a year ago, but instead came with vouchers.
 
I ran [H]'s timedemos. At 1600x1200, 4xAA/8xAF, every game option turned up to highest, including 5.1 sound, and driver quality set to highest. I'm curious Brent, did you use highest driver quality, or the default one notch under?

My specs: [email protected], 2x512 HyperX PC4000 @ 1:1 CAS3 4-4-8, BFG 6800GT @ 430/1.16. Anyhow, I got 83.93 on Canals and 68.95 on Coast. Slower than what you guys got in each compared to the Ultra. I guess your CPU and 2GB of RAM with lower timings make a pretty big difference. I got about 100MHz on you, but you have an FX, whereas I don't. :( Unless our in-game settings and/or driver settings are different, the CPU makes a pretty big difference.

Edit: screenshot attached (HardHL2.jpg)
 
These were the Settings I used under the Advanced Video Options:

Model Detail = High
Texture Detail = High
Water Detail = Reflect World
Shadow Detail = High
Anti-Aliasing Mode = (None, 2X, 4X)
Filtering Mode = Anisotropic 8X
Shader Detail = High
Wait for Vertical Sync = Disabled
 
Last Jan when HL2 was supposed to be released with the "flagship 9800XT", it obviously made all parties involved think again, cause the results sucked! Where are those FPS ratings, ATi? eh?eh?eh?eh?eh? And maybe that added to the long wait?? And to now hear my $620 GT is not at the top, bah, who cares, it rocks! Let the expensive X800 have its glory. All options are on here at 1280 and it kicks butt. I game and talk with tooooOOoOOonnes of people and few play at 1600x1200. Nice, real bench'z? I think not...

my GT rocks!
 
G'ßöö said:
Last Jan when HL2 was supposed to be released with the "flagship 9800XT", it obviously made all parties involved think again, cause the results sucked! Where are those FPS ratings, ATi? eh?eh?eh?eh?eh? And maybe that added to the long wait?? And to now hear my $620 GT is not at the top, bah, who cares, it rocks! Let the expensive X800 have its glory. All options are on here at 1280 and it kicks butt. I game and talk with tooooOOoOOonnes of people and few play at 1600x1200. Nice, real bench'z? I think not...

my GT rocks!

congratulations man.
 
Catalyst maker! VERY cool that you look at what people are posting and gather feedback.
Feels like ATi listens. And as always, ignore immature people ;)
In the format of Brent:

Model Detail = High
Texture Detail = High
Water Detail = Reflect Everything
Shadow Detail = High
Anti-Aliasing Mode = (4X)
Filtering Mode = Anisotropic 16X
Shader Detail = High
Wait for Vertical Sync = Disabled

1600x1200
 
I know you wanted to get the article out quickly; was that the reason only two levels were tested?
 
CATALYST MAKER said:
Sup cuz ....
or is it coz...

Thanks for the welcome. I can explain the difference in results and what you are seeing from different places. I don't know if I should though. The !!!!!!s of the other team will just dismiss it as being biased and not believe me anyways.

But sure, if I see others with interest in my explanation of the differences, let me know and I will give it a shot.

I really just came in here to say that this game is awesome.

yes, I would like to know why the official benchmarks have ATI so far in the lead. Please explain it to us. I (for one) won't flame you. :D
 
CATALYST MAKER said:
I can explain the difference in results and what you are seeing from different places. I don't know if I should though. The !!!!!!s of the other team will just dismiss it as being biased and not believe me anyways.
Screw the bloody phanboyz!! I'd love to hear what you have to say on the matter and most of the other guys here would too. I'm sure Kyle and Brent will frag any FB's that get too rowdy so go right ahead.....please. :)
 
SocketA said:
I know you wanted to get the article out quickly; was that the reason only two levels were tested?

have you seen how many maps there are in HL2? no way we can timedemo ALL of them... well, we could, but then the article wouldn't have been posted within 1 day of the game being released ;)
 
I just want to know something here.

Why is it that all the ATi fans out there are complaining that [H] used an overclocked 6800 and compared it to a stock X800?

Do these people not know these cards come OC'ed out of the box?
If ATi did the same thing there would be no bitching, but when NVIDIA does anything, it's looked down on?
WTF is that shit?
Can someone explain, in their opinion, why this is?
 
Awesome work, as always, guys. :D

I'm wondering if you'll be doing an expanded HL2 benchmark later on including older cards (9600 series/ 5700 series on up) as well as some different hardware configs (A64, P4 and Athlon XP)?
 
SocketA said:
I know you wanted to get the article out quickly; was that the reason only two levels were tested?
You just answered your own question. Yes, we wanted to get the numbers out quickly, so we evaluated what levels we thought would be good for benchmarking and executed. That said, Anandtech used 5 demos that were very short in length (demo files are 2.5 to 3.5MB in size) and we used 2 demos that show the entire level from load to load (our demos are approx 10.5MB each).

If you think our demos do not represent gameplay, I would highly suggest you discount our thoughts about them when making a purchasing decision. :)
 
[RIP]Zeus said:
I just want to know something here.

Why is it that all the ATi fans out there are complaining that [H] used an overclocked 6800 and compared it to a stock X800?

Do these people not know these cards come OC'ed out of the box?
If ATi did the same thing there would be no bitching, but when NVIDIA does anything, it's looked down on?
WTF is that shit?
Can someone explain, in their opinion, why this is?

I wouldn't call a few people "all the ATi fans out there"... most people are actually enjoying the game instead of worrying about who has the better card.
 
fallguy said:
I ran [H]'s timedemos. At 1600x1200, 4xAA/8xAF, every game option turned up to highest, including 5.1 sound, and driver quality set to highest. I'm curious Brent, did you use highest driver quality, or the default one notch under?

My specs: [email protected], 2x512 HyperX PC4000 @ 1:1 CAS3 4-4-8, BFG 6800GT @ 430/1.16. Anyhow, I got 83.93 on Canals and 68.95 on Coast. Slower than what you guys got in each compared to the Ultra. I guess your CPU and 2GB of RAM with lower timings make a pretty big difference. I got about 100MHz on you, but you have an FX, whereas I don't. :( Unless our in-game settings and/or driver settings are different, the CPU makes a pretty big difference.

The FX has a dual channel memory controller, the RAM timings were much tighter, possibly a faster motherboard, and the driver settings were left at default except for v-sync, I would assume. They would run the visual quality at a higher level than default because default is the same quality the ATI cards are running at.

Half-Life 2 looks to be fairly CPU limited, so CPU and RAM speed will have a greater impact on performance.

It will be impossible for anyone to get exactly the same fps results while running the recorded demos. Windows XP isn't even a static OS, so performance could change by a few fps every time you reboot.
 
Anyone try this game with 2 gigs of RAM? I hear the review systems had 2 gigs. I've tried with 1 gig and 1.5 gigs, and the stuttering did go down, just not totally. I'm just out of RAM slots :(

heh, read the benchmarks, n/m
 
If you go back through the article now, you'll see I have put Maximum Setting comparison graphs at the bottom of pages 2 and 3.

So you can see how they stack up with 6XAA/16XAF, 4xSAA/16XAF, 6xSAA/16XAF and 8xSAA/16XAF.
 
It's almost like some of these guys wanted NVIDIA to do poorly. I just don't get it. Competition is good for us. If ATi did happen to dominate this game, it would keep the prices up and I would still have to watch guys come to a LAN running a GF2 MX.

Celebrate how awesome a game we have that runs exceptionally well on most hardware. I believe this is the best game I have ever played. Both ATi and NVIDIA run it great, go play it FFS. Your gaming experience isn't going to be any better or worse because the Joneses are getting +/- 5 fps in various levels. Grow up and go play some games. :D
 
Brent_Justice said:
If you go back through the article now, you'll see I have put Maximum Setting comparison graphs at the bottom of pages 2 and 3.

So you can see how they stack up with 6XAA/16XAF, 4xSAA/16XAF, 6xSAA/16XAF and 8xSAA/16XAF.

Wow! so is that 40%?
Does that have reflect all for water too?
 
coz said:
Screw the bloody phanboyz!! I'd love to hear what you have to say on the matter and most of the other guys here would too. I'm sure Kyle and Brent will frag any FB's that get too rowdy so go right ahead.....please. :)

Sure. I can give it a shot.

The main reason why there are such differences between the Hard benchmarks and DH benchmarks is because of the timedemos used.

In summary, [H] created timedemos that were indicative of average gameplay. In other words, [H] used a full level as a timedemo. Showing the average framerate of a full level will cause all cards in the same category to have near identical performance. There will be areas in that level that are not graphically complex at all, and when taking an average, those areas will benefit the inferior product.

Example: below are 2 hypothetical products benchmarked for 10 seconds. Each sample point is the FPS at that second.

90,90,90,90,90,90,90,90,90,90 = average is 90 FPS
60,60,90,90,90,90,90,90,90,90 = average is 84 FPS

Take a large enough sample (i.e. extend the 90s for another 50 seconds) and you will see that the FPS averages will trend toward the same number.

The other alternative to the above method is to benchmark only complex graphic situations. In the above example, that would be a 2-second timedemo.

90,90 = average of 90 FPS
60,60 = average of 60 FPS

The difference between the two hypothetical products is much more evident in this situation.
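Terry's arithmetic above can be reproduced in a few lines. To be clear, these are his hypothetical per-second samples, not measured data:

```python
# Terry's hypothetical per-second FPS samples (not real measurements).
fast_card = [90] * 10
slow_card = [60, 60] + [90] * 8  # two rough seconds, then identical

def avg_fps(samples):
    return sum(samples) / len(samples)

print(avg_fps(fast_card))  # 90.0
print(avg_fps(slow_card))  # 84.0

# Pad both runs with 50 more easy seconds at 90 FPS: the averages converge.
print(avg_fps(fast_card + [90] * 50))  # 90.0
print(avg_fps(slow_card + [90] * 50))  # 89.0

# A 2-second demo covering only the complex section keeps the gap visible.
print(avg_fps([90, 90]), avg_fps([60, 60]))  # 90.0 60.0
```

The point being: the longer the demo spends in easy scenes, the more a short slow stretch gets diluted out of the average.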

We have four timedemos created that are graphically complex. They were used by DH, and 2 of them were even used by TheTechLounge.com.

I am prepared to put these demos on an FTP site for anyone who wants them. Interested?
These demos will show complex situations and will show that one hypothetical product is at times 90% faster than the second hypothetical product.

One such case will be using the flashlight. Record a timedemo of your own if you like, using the flashlight a lot. You will see what I am talking about.

:)


And that concludes my lesson for the day. Cheers guys.

Edit: and as soon as I posted that, Brent commented right below mine with a good way of putting what I am trying to say. Average is one thing, but for some, Min is more important.
 
It would be nice if the timedemo result HL2 gives you also shows the Min FPS. Right now all it shows is the AVG FPS.

When we do gameplay evaluation using HL2 we will of course note the Min FPS in our graphs using FRAPS as usual. IMO the Min FPS is more important than average FPS.
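Since the built-in timedemo only reports the average, the Min FPS has to be derived from per-frame data, e.g. a frametime log from a tool like FRAPS. A rough sketch of the math, with made-up frame times for illustration:

```python
# Hypothetical per-frame render times in milliseconds, the kind of data
# a frametime logger such as FRAPS can capture during a demo run.
frame_times_ms = [11.1, 11.1, 16.7, 33.3, 11.1, 11.1]

# Average FPS: total frames divided by total elapsed time.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Min FPS: the instantaneous rate of the single slowest frame.
min_fps = 1000.0 / max(frame_times_ms)

print(f"avg: {avg_fps:.1f} FPS, min: {min_fps:.1f} FPS")
```

A single 33ms frame drags the minimum down to ~30 FPS even though the average stays above 60, which is exactly why the min matters for how the game feels.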
 
BTW if anyone missed it on the previous page, I updated pages 2 and 3 of the article with two new graphs showing performance at the maximum AA/AF settings on the X800XT-PE and 6800 Ultra OC. You'll see that 6XAA/16XAF is very playable on the X800XT-PE on average in our timedemos.
 
burningrave101 said:
They used the 66.93 ForceWare instead of the new 67.02 released for HL2 as well.
Interesting. It seems that Firing Squab's (hehe) custom Canals timedemo shows the same sort of advantage (at 1600x1200 4xAA 8xAF) for the X800XT-PE over the 6800U as ATi's numbers suggest (up to 66%). Both sets of benchmarks (ATi and FS) use the 66.93 driver, whereas HardOCP and Anand used 67.02 and the difference was at most 17% in their results. It looks like the 67.02 driver improves performance dramatically on the Canals map - maybe that stuff about 67.02 improving shader performance is true?

Can we see some 66.93 vs 67.02 comparisons on the Canals timedemo please Brent? ;)
 
Netrat33 said:
Wow! so is that 40%?
Does that have reflect all for water too?

Those supersampling modes do a much better job of providing better IQ than normal AA does, so you have to take that into consideration when you're comparing 4xS, 6xS, and 8xS to 6xAA. If there wasn't such a performance hit when using supersampling compared to multisampling, there would be no question which video card provided better image quality.
 
Brent_Justice said:
These were the Settings I used under the Advanced Video Options:

Model Detail = High
Texture Detail = High
Water Detail = Reflect World
Shadow Detail = High
Anti-Aliasing Mode = (None, 2X, 4X)
Filtering Mode = Anisotropic 8X
Shader Detail = High
Wait for Vertical Sync = Disabled

Same as I did, thanks. What were the driver quality settings?
 
CATALYST MAKER said:
Sure. I can give it a shot.

The main reason why there are such differences between the Hard benchmarks and DH benchmarks is because of the timedemos used.

In summary, [H] created timedemos that were indicative of average gameplay. In other words, [H] used a full level as a timedemo. Showing the average framerate of a full level will cause all cards in the same category to have near identical performance. There will be areas in that level that are not graphically complex at all, and when taking an average, those areas will benefit the inferior product.

Example: below are 2 hypothetical products benchmarked for 10 seconds. Each sample point is the FPS at that second.

90,90,90,90,90,90,90,90,90,90 = average is 90 FPS
60,60,90,90,90,90,90,90,90,90 = average is 84 FPS

Take a large enough sample (i.e. extend the 90s for another 50 seconds) and you will see that the FPS averages will trend toward the same number.

The other alternative to the above method is to benchmark only complex graphic situations. In the above example, that would be a 2-second timedemo.

90,90 = average of 90 FPS
60,60 = average of 60 FPS

The difference between the two hypothetical products is much more evident in this situation.

We have four timedemos created that are graphically complex. They were used by DH and 2 of them were even used by TheTechLounge.com.

I am prepared to put these demos on an FTP site for anyone who wants them. Interested?
These demos will show complex situations and will show that one hypothetical product is at times 90% faster than the second hypothetical product.

One such case will be using the flashlight. Record a timedemo of your own if you like, using the flashlight a lot. You will see what I am talking about.

:)


And that concludes my lesson for the day. Cheers guys.

Edit: and as soon as I posted that, Brent commented right below mine with a good way of putting what I am trying to say. Average is one thing, but for some, Min is more important.

Min is important, but I don't think a 2-second benchmark would do any good. Besides, all 2-second effects would have to be benched just to make it fair, as some cards are more efficient at rendering some effects than others. Also, there's some hitching going on in this game when certain effects are happening, and that will hinder the result obtained. I say the game runs well, but there are a few bugs that need to be dealt with before a two-second blast effect is taken into consideration.
 
spyderz said:
Min is important, but I don't think a 2-second benchmark would do any good. Besides, all 2-second effects would have to be benched just to make it fair, as some cards are more efficient at rendering some effects than others. Also, there's some hitching going on in this game when certain effects are happening, and that will hinder the result obtained. I say the game runs well, but there are a few bugs that need to be dealt with before a two-second blast effect is taken into consideration.

as you point out the problem with really short timedemos like that is that you may be benchmarking only one type of shader effect

maybe its a water shader for example, and maybe that water shader does render faster on one card

but

that doesn't take into account other aspects of the game, HL2 is not just one water shader

so you have to bench a long timedemo encompassing lots of gameplay to get a good feel for how the game plays between cards

that's why we did very long timedemos, to try and represent what end users really experience when they play this game, and see who comes out on top overall

i still wish the timedemo mode output the Min FPS though, if it had I would have put that in the graphs, or a table, cause i think it's important
 
CATALYST MAKER said:
And that concludes my lesson for the day. Cheers guys.
Very well explained - even I understood that. So ATi's numbers show a sort of graphical worst-case scenario with far fewer CPU-limited frames than the longer timedemos? That would certainly show better results for the X800XT-PE, with more shader 'horsepower', right?

Also, you used a different driver (66.93) than HardOCP (67.02), which may have had slightly worse shader performance, which in turn has a big effect on the shader-heavy Canals timedemo score? I'm not blaming anyone for using 66.93, because 67.02 only appeared very recently.

I think I prefer HardOCP's longer timedemo approach because it better reflects real gaming situations in HL2. However, I also see merit in ATi's shorter, graphically-intensive timedemos too.

Cheers Terry. :)
 
That said, from what gameplay we got in on launch day, we can tell you that Half Life 2 is far from a "video card hog" and will likely be as forgiving to legacy hardware as DOOM 3 was, if not more so. One thing we feel compelled to mention is that Half Life 2 is not as forgiving as DOOM 3 in the resolution department. Where DOOM 3 looked decent at 640x480, we found that is not the case with Half Life 2.

Well I can run Half-Life 2 at 1024x768 with all textures set to high and I get great performance, generally around 130fps when there is no action but big explosions will cause a stutter. I can't say the same for Doom 3. I have to play at 800x600 with medium settings and the game is still laggy as hell. My specs are in my sig.
 