Half Life 2 Benchmarks @ HardOCP.com

Looks like good stuff all round to me. I haven't got HL2 yet and hope to download it this evening. The thought of playing a game with graphics as good as the screenshots look at 1600x1200 4xaa 8xaf at 80-odd fps average on my 6800 Ultra OC is pretty heartening.

I look forward to the more detailed review I'm sure is coming - I'd like to see what the minimum frame rates are like as I've heard a few vague tales about slight jerkiness running at top resolutions on 6800 series hardware in the more open areas of the game.
 
I would like to see this test with a 9800 Pro 256. Can we also see a test with an Athlon 3200+? I haven't made the jump to 64-bit yet, so until an OS supports it I don't see the point.
 
mulpsmebeauty said:
Looks like good stuff all round to me. I haven't got HL2 yet and hope to download it this evening. The thought of playing a game with graphics as good as the screenshots look at 1600x1200 4xaa 8xaf at 80-odd fps average on my 6800 Ultra OC is pretty heartening.

I look forward to the more detailed review I'm sure is coming - I'd like to see what the minimum frame rates are like as I've heard a few vague tales about slight jerkiness running at top resolutions on 6800 series hardware in the more open areas of the game.
I have a water-cooled BFG 6800 Ultra OC card in my system, and I get occasional hitches with the 66.81 driver. I am about to install the 67.02 and try it out; I understand it has some HL2 optimizations in it. If that does not fix it, I suspect NVIDIA will have it fixed soon. My understanding is that NVIDIA did not have access to the code drops that ATI did, so that would obviously leave them a little behind on getting everything working correctly.
 
coz said:
Awwww, but where's the fun in that? ;)

Many thanks to Brent for his hard work getting this article done in such a short space of time. Nice to see some honest HL2 benchmarks instead of the PR rubbish we've been fed by ATi and DriverHeaven.

Thanks guys, now it's time to quit worrying about who is faster and just PLAY!! :)
Thanks from all of us. It has been a long day.

I have to admit DriverHeaven's "benchmarks" left me a bit disappointed in our community. We wanted to be first with our benchmarks, but certainly not to the point of simply regurgitating PR drivel. :(

Now, off to play with the 67.02 drivers and see if they fix anything...
 
Great benchmark guys. As expected, the PR benchmarks don't really seem to be true. Looks like the 6800GT and X800 Pro are pretty close once again. As far as DH goes...well the charts are pretty.
 
Excellent work as usual. I've been playing at 1280x1024, 4xAA, 8xAF, High Quality on my 6800GT all day and can tell you it is smooth. Oh so very sweetly smooth. Frankly, I'm amazed at the quality this card is pushing at these frame rates. Kudos to Valve on their engine; it works very well.

Also, the 67.02 drivers fix the very, VERY annoying shimmering issue on all 6800GT cards with AF enabled. They're worth picking up just for that.
 
obs said:
Great benchmark guys. As expected, the PR benchmarks don't really seem to be true. Looks like the 6800GT and X800 Pro are pretty close once again. As far as DH goes...well the charts are pretty.

Well, I'm glad the benchmarks showing up now are showing the true FPS on the NV chips. I'm going to wait a few more weeks to buy HL2; I can see the few bugs plaguing the NV chips being gone by then. Either way, my computer won't be ready till then. Great review.
 
Were the Nvidia cards overclocked, and if so, why? Let's see stock versus stock here.
 
Whoa, what a review!

Thanks for the time investment. Really shocking to me; I thought the ATi cards would have at least a slight lead. Seems like they are pretty even, as they are in most games. I sure am wondering what NVIDIA can do now that they have the software.
 
Dithium said:
Were the Nvidia cards overclocked, and if so, why? Let's see stock versus stock here.

The test used the BFG 6800GT OC and Ultra OC - both have a moderate overclock on the core applied by the manufacturer. I wouldn't have thought it would have a huge impact on the overall results - maybe 2-5 fps at the extreme highs and lows of the fps range.

I don't think it's unusual for Ultras, at least, to come with a 425MHz core instead of the reference 400MHz, but I could be wrong.
 
I'd have to agree that since the NVIDIA cards are at the manufacturer's rated speeds, the benches are valid.

Anyway, performance looks good for both ATi X800 and nVidia 6800 owners. Those owning either should be happy with their performance in both HL2 and D3 (and just about everything else for that matter).

I don't have HL2, so I am wondering: are there any notable differences in IQ?
 
Thanks to the [H] for getting this out so soon! Looks like no real losers this round! It also goes to show that NV and ATI both have good drivers out there. Maybe this will stop some of the rampant fan-boi-ism schism out there... :D
 
From what I can tell, HL2 seems to be very CPU dependent, and it likes LOTS of RAM.
I'd like to see some CPU scaling tests like what Anandtech did with Doom 3. I get hitching too, but I only have 512MB of RAM, so it looks like it's time to pick up a 2nd stick.
 
Even if the ATi cards did score way higher, I can't say that I care...I'm running 1600x1200 with 4xAA and 8xAF, and the game is just breathtaking...my 6800GT is soooooooooooooo worth its weight in gold right now...
 
Dithium said:
Were the Nvidia cards overclocked, and if so, why? Let's see stock versus stock here.
We used retail versions of BFGTech video cards that come at their own clock settings. They are the best-selling video cards at retail in North America, so that is what we go with. Yes, they are OCed from the factory, but still at their own retail clocks.
 
Elios said:
From what I can tell, HL2 seems to be very CPU dependent, and it likes LOTS of RAM.
I'd like to see some CPU scaling tests like what Anandtech did with Doom 3. I get hitching too, but I only have 512MB of RAM, so it looks like it's time to pick up a 2nd stick.

Make sure your AGP Aperture Size is set to 256 if it's not already.
 
After a reboot of my system, still using the 66.81 drivers, a bunch of my sound and stuttering issues cleared up. I did not see Steam download an update, as the icon did not prompt me that it was doing anything.


So if you are having stuttering issues with NV cards, you might try a system reboot. Still have not tried the 67.02 drivers.
 
^eMpTy^ said:
I think it's only fair...especially since you also used the XTPE, which isn't even a real card anymore since it was replaced by the vanilla XT...
Yeah, certainly availability has diluted that model's impact. Remember the "6800 Ultra Extreme" that Tom and Anand used for benchmarks? :D Interestingly enough, NVIDIA, to this day, has never mentioned that card to us. Guess they knew we would call BS on BS. More "aggressive" video card marketing...:rolleyes:
 
RAutrey said:
Make sure your AGP Aperture Size is set to 256 if it's not already.

Yup, did that, and set my RAS delay to 10. That cleared up a lot of it, but there's still some stuttering when it loads a level or has to swap, so it's the memory, i.e. I need more, lol. Other than that, I think the physics and AI are choking my CPU, which is why I would like to see some CPU scaling tests :cool:
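
One thing I'm going to try next is the -heapsize launch option people keep suggesting for the stuttering. As I understand it, it hands the engine a bigger memory heap; the value is in KB, and the number below is just one commonly suggested for a box like mine, not gospel:

// Steam > right-click Half-Life 2 > Properties > Launch Options:
-heapsize 256000 -console

// then in the in-game console, to watch the frame rate while playing:
cl_showfps 1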
 
I was waiting for an unbiased benchmark and you guys delivered, thanks. I was looking at the benchmark from ATI and thought something was wrong. By the way, DriverHeaven has the same benchmark, with the same slides and all, but claims it was done by them, and they swear this is Doom 3 in reverse due to ATI's superior shaders. I just thought it was kind of funny. I'm guessing they won't like your benchmark.
 
I'm curious why you didn't use 16x AF? I've been playing the game at 4xAA with 16xAF and reflect all.

With Doom 3 everything was maxed out as much as possible, and with HL2 it didn't appear to push ITS limits.
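
For reference, these are the settings I'm forcing from the console. The cvar names are stock Source ones as far as I know, and the two water cvars are my guess at what the "reflect all" menu option maps to:

// in the in-game console (or autoexec.cfg):
mat_antialias 4                  // 4x MSAA
mat_forceaniso 16                // 16x anisotropic filtering
r_waterforceexpensive 1          // expensive water, i.e. real reflections
r_waterforcereflectentities 1    // reflect entities too ("reflect all")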
 
Thanks guys, I figured this was the case. Valve releasing that VST benchmark early instead of a gameplay benchmark demo, plus the fact that the retail game ships with no timedemos, made it clear Valve & ATI didn't want people to benchmark the actual game, because it would expose the truth: there is no clear-cut winner for HL2 :)

Kudos on being the first site to release reliable benchmarks.
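
(That said, the demo tools themselves still seem to work in the retail build, so anyone can roll their own benchmark from the console; something like this, if memory serves:)

// in the in-game console, during play:
record mytest      // start recording a demo (writes mytest.dem)
stop               // stop recording
timedemo mytest    // replay as fast as possible and report average fps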
 
spyderz said:
I was waiting for an unbiased benchmark and you guys delivered, thanks. I was looking at the benchmark from ATI and thought something was wrong. By the way, DriverHeaven has the same benchmark, with the same slides and all, but claims it was done by them, and they swear this is Doom 3 in reverse due to ATI's superior shaders. I just thought it was kind of funny. I'm guessing they won't like your benchmark.

while I'm sure you can probably get a little more speed out of the XTPE than the Ultra in HL2 in certain levels...it isn't needed nearly as badly as it was in Doom3...DH, being extremely ATi biased, was just A. trying to get benchmarks out first to get some attention and B. still trying to nurse their bruised ego back to health from Doom3...it's actually quite sad...the guy actually openly claimed right in his article that he had had HL2 for a WEEK prior to launch and had been "testing it on various hardware"...if DH had any respect whatsoever, it's absolutely gone now...
 
spyderz said:
I was waiting for an unbiased benchmark and you guys delivered, thanks. I was looking at the benchmark from ATI and thought something was wrong. By the way, DriverHeaven has the same benchmark, with the same slides and all, but claims it was done by them, and they swear this is Doom 3 in reverse due to ATI's superior shaders. I just thought it was kind of funny. I'm guessing they won't like your benchmark.

Good old unbiased journalism. They are going to look like a right bunch of boobs for believing and embellishing something that ATI funded to such a high degree.

Look - someone left a window open and your credibility has just fallen out of it.
 
Netrat33 said:
I'm curious why you didn't use 16x AF? I've been playing the game at 4xAA with 16xAF and reflect all.

With Doom 3 everything was maxed out as much as possible, and with HL2 it didn't appear to push ITS limits.
We had less than 24 hours to get this done, so we had to make some calls on apples-to-apples settings, and that is what we felt was best.
 
[H] crew, once again, you reach the summit of providing definitive, unbiased, and concise information when we need it most. Thank you.

Now wait... Remember back when Doom 3 came out, nVidia's performance was suspiciously higher than ATi's, and the next day ATi put out a Cat 4.9 beta? That 4.9 beta made quite a (positive) difference on my friend's 9800...
Seeing how nVidia sided with id, and ATi sided with Valve... perhaps we can expect a quick driver release from nVidia that will bring optimizations to HL2? If it's anything like the performance jump of those 4.9 betas, this could mean good, good things.

Now I just gotta wait for it to arrive (legally) in Russia so I can support Valve for the most amazing assortment of 1s and 0s ever to be run by a CPU to date.
Gotta say though, I was quite flabbergasted 2 months ago when I tried the jacked copy... Can't wait.
 
tranCendenZ said:
Thanks guys, I figured this was the case. Valve releasing that VST benchmark early instead of a gameplay benchmark demo, plus the fact that the retail game ships with no timedemos, made it clear Valve & ATI didn't want people to benchmark the actual game, because it would expose the truth: there is no clear-cut winner for HL2 :)

Kudos on being the first site to release reliable benchmarks.

I don't wanna get all anti-ATi in here or anything...but honestly...why is "ATi" stamped all over the HL2 box...all over the CDs...they're coming out with "ATi" levels...I mean...seriously...this is a bit ridiculous...

But yeah, thanks for the benchies...I'm guessing HL2 will line up much like FarCry did...the higher core clock of the XT will bring it some victories with pure shader performance...but the 6800s will be nipping at its heels the whole way...

Thanks again Valve for lying to us for 6 months about HL2 performance so you can sell more ATi cards...
 
When has marketing ever been truly representative of the end product in the graphics department? Tons of salt must be applied.
 
^eMpTy^ said:
I don't wanna get all anti-ATi in here or anything...but honestly...why is "ATi" stamped all over the HL2 box...all over the CDs...they're coming out with "ATi" levels...I mean...seriously...this is a bit ridiculous...

But yeah, thanks for the benchies...I'm guessing HL2 will line up much like FarCry did...the higher core clock of the XT will bring it some victories with pure shader performance...but the 6800s will be nipping at its heels the whole way...

Thanks again Valve for lying to us for 6 months about HL2 performance so you can sell more ATi cards...

Because they are the sponsor. That's why. Much like NVIDIA sponsors EVERY other game out there.
 
^eMpTy^ said:
while I'm sure you can probably get a little more speed out of the XTPE than the Ultra in HL2 in certain levels...it isn't needed nearly as badly as it was in Doom3...DH, being extremely ATi biased, was just A. trying to get benchmarks out first to get some attention and B. still trying to nurse their bruised ego back to health from Doom3...it's actually quite sad...the guy actually openly claimed right in his article that he had had HL2 for a WEEK prior to launch and had been "testing it on various hardware"...if DH had any respect whatsoever, it's absolutely gone now...

lol, agreed. When I saw his claim that he had been playing with HL2 and that the benchmark he posted was his own, I was like, "Man, doesn't he know ATI released that same benchmark + graphs hours ago?" :rolleyes: What a liar!
 
^eMpTy^ said:
Thanks again Valve for lying to us for 6 months about HL2 performance so you can sell more ATi cards...
Heh, remember, the promotional cards for HL2 were the 9600XT and 9800XT...
They were the ones that came with the coupons and such. Now it would be interesting to pit the 9800XT against nVidia's equivalent of that generation, the FX5950.
That would really show us how much Valve and ATi worked together, and whether there would be the same pwnage as when nVidia and id pwned the optimization for Doom 3.
 
Call me a £whore£, but I'd happily tell you that an X800-series is 40% quicker than a 6800-series for $6 million.
 
^eMpTy^ said:
I don't wanna get all anti-ATi in here or anything...but honestly...why is "ATi" stamped all over the HL2 box...all over the CDs...they're coming out with "ATi" levels...I mean...seriously...this is a bit ridiculous...

But yeah, thanks for the benchies...I'm guessing HL2 will line up much like FarCry did...the higher core clock of the XT will bring it some victories with pure shader performance...but the 6800s will be nipping at its heels the whole way...

Thanks again Valve for lying to us for 6 months about HL2 performance so you can sell more ATi cards...
Well, I think we all need to realize that anything that is NOT 3rd party will very likely be influenced by aggressive marketing from some company somewhere. Of course, being 3rd party doesn't guarantee its validity either. I am interested to see some other results today using non-ATI-built demos with the latest drivers. Hehe, it is always scary going first. :)
 
mulpsmebeauty said:
Call me a £whore£, but I'd happily tell you that an X800-series is 40% quicker than a 6800-series for $6 million.
Yeah, 'cos Ruby was hotter than that prissy mermaid.
But still, I'm really happy that the ATi/nVidia brawl finally settled down, with this benchmark showing that these cards are not in separate ballparks. God bless [H].
 
These HL2 benchmarks are great Kyle lol.

DriverHeaven is the biggest pile of shit of a site I've ever looked at benchmarks on. I should have known those results were nowhere near accurate.

It looks like nVidia continues to hold strong in DX9 gaming while maintaining control of OpenGL and Linux performance :).

I can't believe I was considering buying an X800 Pro VIVO last night to go with my XFX 6800GT, lol.
 
Finally, a real-world benchmark with a list of the top-end cards to support my arguments. Those who tried to bash me yesterday should look at it, then check back on my replies (I predicted the difference would be less than 5%, but I was surprised that even at max settings the FPS was impressive, so it is pointless to argue over which card is best). I knew Valve released a biased set of results just because ATI is supporting them.
 
While I don't like apples-to-apples, props again to the [H].

I found a typo!

"b...ut in this constant world of “very aggressive” vide card marketing, we felt..."
 