Half Life 2 Benchmarks @ HardOCP.com

Gavinni said:
How's it apples to apples if it's enabled on one and not on the other? Sounds like laziness to me...
It's not laziness to not know about an obscure setting that only the lunatics who bloody live on B3D's boards would happen to have caught on to; it wasn't exactly posted on the main page.

Besides, I'm surprised/impressed that Brent & Kyle took the time off from playing to benchmark like this...I still haven't brought myself to that point yet, every spare minute I got I'm playing. :D

It's way too soon in this to call fouls or make accusations, 'specially against Brent, who I personally feel works a bit harder than most to stay on top of these kinds of things.

The more reviews/benchies that come out, the more we'll all learn, the betterer we'll all get at it.

Patience padawan, patience.
 
digitalwanderer said:
It's not laziness to not know about an obscure setting that only the lunatics who bloody live on B3D's boards would happen to have caught on to; it wasn't exactly posted on the main page.

Besides, I'm surprised/impressed that Brent & Kyle took the time off from playing to benchmark like this...I still haven't brought myself to that point yet, every spare minute I got I'm playing. :D

It's way too soon in this to call fouls or make accusations, 'specially against Brent, who I personally feel works a bit harder than most to stay on top of these kinds of things.

The more reviews/benchies that come out, the more we'll all learn, the betterer we'll all get at it.

Patience padawan, patience.

DW! What's up! This is a great game, Valve really scored big on this one, game of the year for sure! Hope the wife and kids are all well.
 
SnakEyez187 said:
Why does it seem like you and some other users are trying to "sell" one company over another? Why must you make posts just to put doubt in people's minds over their own decision on what to spend their own money on? Is this all just a big game with extremely loud and overzealous cheerleaders?

Look at his and others' posting history over the last few months and it's clear some have a sort of agenda :)


Brent, completely understand why you did not test with +r_fastzreject set to 1 on the ATI cards. However, since it's supposed to be turned on soon, and it may impact ATI (both positively/negatively), when you get time can you try it? You should realize that the readers here and on your forums are much more adept at tweaking things, this is something that 99% of the people reading could do, and if it does boost/hurt FPS then it could be interesting to see....
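For anyone who wants to poke at it themselves in the meantime, it should just be the usual cvar routine (rough sketch only — the -console switch and exact launch-line syntax are from memory, so double-check them, and I haven't verified what the X800s actually gain):

+r_fastzreject 1 on the launch line (e.g. hl2.exe -console +r_fastzreject 1) to force it on from startup
r_fastzreject 1 typed in the developer console to flip it on mid-session, r_fastzreject 0 to turn it back off
r_fastzreject -1 to hand the choice back to whatever the engine detects for your hardware

Then run the same timedemo with it on and off and compare the numbers.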
 
With all this +r_fastzreject stuff going on... let me propose a poll... in fact, I may even START a poll...

How many people with ATI cards... no, fuck it... how many people PERIOD have heard of this command BEFORE it was stated here? It IS apples to apples because the VAST consumer audience has no clue about it. They'll buy HL2, install it, and play it. Same with video cards. They look at the graphs, think they don't have to do anything, and they'll buy it (poor wording, but I meant reviews such as [H]).

I'll start a poll up real quick. Edit when I get it.
 
burningrave101 said:
Ya know, it just seems like it's been one thing after another this year that people have screamed needs to be enabled on the X800's to make the performance "fair" against the NV40's lol. DOOM 3 and the Humus HACK, Far Cry and the 2.0b path, and now HL2 and this console command are just some of the more well known.

If it's not enabled by default then it shouldn't be permissible in a review. To me it just leans toward skewed benchmarks.

Well I don't even bother with the +r_fastrejctwhatever because the game is blazing fast and looks incredible.

But come on
First: Doom 3 and the Humus hack... it was like 2 frames. No one was saying "Bench it now!" They basically said there might be hope to play Doom 3 faster. Now with driver updates it plays very well (although it wasn't bad in the first place).

Second: Far Cry and the 2.0b path... uh... well, at the time nvidia users had to use the console too, to activate SM3. Later everyone found out about SM2.0b and the scenario was back to benching SM3 vs. 2.0b as before... if that made sense... either way, NOW it's on by default since it's official and not a beta.

HL2: Original statement. Uh... if a person has a problem without +r being enabled when the game is running an average of 85 fps with 6xAA and 16xAF... that person has a REAL problem... so just ignore them :)
 
Jbirney said:
Brent, completely understand why you did not test with +r_fastzreject set to 1 on the ATI cards. However, since it's supposed to be turned on soon, and it may impact ATI (both positively/negatively), when you get time can you try it? You should realize that the readers here and on your forums are much more adept at tweaking things, this is something that 99% of the people reading could do, and if it does boost/hurt FPS then it could be interesting to see....

Why don't the people interested just try it themselves and see lol.

I don't see what the point would be in Brent including it in the review. It's NOT enabled by default, and HardOCP makes it a point to benchmark hardware "out-of-the-box" instead of changing settings that the majority of users have no idea even exist or what they do.

There are probably some tweaks out there to make the nVidia cards run a little faster in HL2, but you don't see any of us demanding they be introduced into the review.
 
What's the point of this benchmarking when the NVIDIA products are tested OVERCLOCKED.. but the "matching" video card from ati is left stock?
 
Laforge said:
What's the point of this benchmarking when the NVIDIA products are tested OVERCLOCKED.. but the "matching" video card from ati is left stock?


We won't even get into that... read the thread.


As promised, I put up a poll for all this bologna... hehe, bologna... go vote! And be honest! Poll about +r_fastzreject

:)
 
Laforge said:
What's the point of this benchmarking when the NVIDIA products are tested OVERCLOCKED.. but the "matching" video card from ati is left stock?

Because they are SOLD that way! lol

God, has this question not already been asked about 100x in this thread or what?

And the BFG cards are only overclocked a few MHz, definitely not enough to make any kind of huge difference.
 
burningrave101 said:
Because they are SOLD that way! lol

God, has this question not already been asked about 100x in this thread or what?

And the BFG cards are only overclocked a few MHz, definitely not enough to make any kind of huge difference.


Right..
and if I sell an ATI card out of my store with a 40% overclock on the core, does that make for a fair comparison?

Just because BFG overclocks their card from THEIR factory (i.e. higher than what's on the nvidia actual reference board) doesn't make it not a "benchmark cheating" card.

I understand that the BFG card was overclocked. What my point was is that if the BFG card is overclocked, say, 7% above the "official" nvidia spec for the core, then the fair comparison is an ati card at 7% over its official spec. (the 7% is an arbitrary # of course)

If I wrote a comparison of an ati card that's 40% overclocked versus the BFG card that's 7% overclocked and the ati card did 30% better... wouldn't that be "unfair"?

I think it's an error in methodology, and the funny thing is, without overclocking "direct from factory" the ati card seems within what.. 4% ?

I mean.. a water block cooled, $600 video card versus a stock settings, stock fan/hs cooled $500 card and that nvidia can barely beat it?

Seems a bit telling to me.

I'm betting that if the bfg card was not OVERCLOCKED it'd actually SHOW a difference
 
Point is with OCing... you can buy the card that way, it's the #1 selling card with that nVidia chip. If Brent OCed his card, it's not giving you the "stock" settings... yes, BFG cards are OCed, but the x800XT and x800XT PE are the same core... PE is overclocked. Hmm.
 
Laforge said:
Right..
and if I sell an ATI card out of my store with a 40% overclock on the core, does that make for a fair comparison?

Just because BFG overclocks their card from THEIR factory (i.e. higher than what's on the nvidia actual reference board) doesn't make it not a "benchmark cheating" card.

I understand that the BFG card was overclocked. What my point was is that if the BFG card is overclocked, say, 7% above the "official" nvidia spec for the core, then the fair comparison is an ati card at 7% over its official spec. (the 7% is an arbitrary # of course)

If I wrote a comparison of an ati card that's 40% overclocked versus the BFG card that's 7% overclocked and the ati card did 30% better... wouldn't that be "unfair"?

I think it's an error in methodology, and the funny thing is, without overclocking "direct from factory" the ati card seems within what.. 4% ?

I mean.. a water block cooled, $600 video card versus a stock settings, stock fan/hs cooled $500 card and that nvidia can barely beat it?

Seems a bit telling to me.

I'm betting that if the bfg card was not OVERCLOCKED it'd actually SHOW a difference
I agree! I say let's downclock the ATI X800 XT PE to 6800 Ultra speeds for a true apples-to-apples comparison! :p

I am sorry, but if a retailer is selling an overclocked model at retail, it's fair game. Nobody was complaining when EVGA was selling the 5900 Ultra above spec. Find an ATI dealer selling overclocked ATI cards at retail and notify Kyle about it and he will include it in his future benchmarks. Kyle and Brent are using the top-of-the-line 6800 Ultra at retail (still $499); there is nothing wrong with what they're doing. Only the ATI fanatics are getting mad that their prized Half-Life 2 isn't dominating like it should on their hardware.
 
Don't feel like cruising through the whole thread, so this is from the console:

"r_fastzreject" = "1" ( def. "0" )
- Activate/deactivates a fast z-setting algorithm to take advantage of hardware with fast z reject. Use -1 to default to hardware settings


Mine was set to 1 and I have a 6800NU.
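If anyone wants to check what theirs defaulted to, just pull down the console and type the cvar name by itself with no value; if I remember the console right, it echoes back the current setting and the default in the same format, e.g.:

] r_fastzreject
"r_fastzreject" = "1" ( def. "0" )

(the ] is just the console prompt, and the 1 obviously depends on your card)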
 
burningrave101 said:
I don't see what the point would be in Brent including it in the review. It's NOT enabled by default, and HardOCP makes it a point to benchmark hardware "out-of-the-box" instead of changing settings that the majority of users have no idea even exist or what they do.

I did not ask Brent to add it to his review. Just run a test or two and see if the claims live up to it. And since it's going to be enabled in the future... why not? Are you sure Brent will come back to this review and update it? It was a simple request to see if it lived up to the said hype. And if it does add more FPS for ATI users, is that now a bad thing?


I am sorry, but if a retailer is selling an overclocked model at retail, it's fair game. Nobody was complaining when EVGA was selling the 5900 Ultra above spec. Find an ATI dealer selling overclocked ATI cards at retail and notify Kyle about it and he will include it in his future benchmarks. Kyle and Brent are using the top-of-the-line 6800 Ultra at retail (still $499); there is nothing wrong with what they're doing.

I don't have a problem with the card they are using. However, just realize not every 6800 sold is OCed, nor can every one be OCed to the same level (most can do higher). Thus the results you see are not of the stock 6800 but the OCed ones. So long as most people realize that (which most do, except the die-hard !!!!!!s), then no problemo...
 
burningrave101 said:
Ya know, it just seems like it's been one thing after another this year that people have screamed needs to be enabled on the X800's to make the performance "fair" against the NV40's lol. DOOM 3 and the Humus HACK, Far Cry and the 2.0b path, and now HL2 and this console command are just some of the more well known.

If it's not enabled by default then it shouldn't be permissible in a review. To me it just leans toward skewed benchmarks.

right, this from the same person who bitched NON STOP about nvidia optimizations not being enabled in reviews from certain websites...this is potentially the most hypocritical thing I have ever read on this board...

*edit* yep, I just checked, the most hypocritical statement on the whole board...;)
 
Gavinni said:
It may be the #1 selling card, but it vs. all of the other GTs combined does not make a majority.

How long does it take anyone with half a brain to OC an nVidia card? My dad does it on his 5200 thanks to Coolbits. BFG is representative of both the best solution out there (like the ATI PE) and of what a user is more than likely to do to the card anyways. I get monster OCs on my HTPC's 5700 Ultra and on my gaming rig's 9800 Pro.

If you REALLY want to get technical about it, how about looking at apples to apples in availability? I can go to Comp-freaking-USA and land the top-end BFG part. I have to go to BFE to get the top-end PE ATI part, if at all, and at a higher price. That higher price only seems to be justified if I play HL2 at 16xAA, assuming the rest of my setup could handle that.

BTW, props to CatMaker for dropping in and explaining the detail end of testing these kinds of things. Us testers don't get many props, but trust me, a high-level tester in any operation has his fingers in ALL the pies, so he would know his stuff.

KYLE! I WANT MY CPU SCALING! (hee hee, he must be sick of that comment by now). Seriously, how does it scale on a platform that is 32-bit and does not have a built-in memory controller a la Intel and Athlon XP? The RAM latency issue would be magnified to a high level on those platforms, so it would be a great test to see whose northbridge can manage the data latency better (925i vs. nForce2 400 Ultra rev2, for example... that would be a GREAT comparison).

Overall guys, still kicking ass and I love it.

WAR TUX!
 
Excellent benchmark, Kyle. That only shows how desperate nVidia really is. Their super-high-tech new NV40, even overclocked, can't beat 3-year-old ATi tech :D LOL. GREAT work :D :D :D
 
inotocracy said:
Don't feel like cruising through the whole thread, so this is from the console:

"r_fastzreject" = "1" ( def. "0" )
- Activate/deactivates a fast z-setting algorithm to take advantage of hardware with fast z reject. Use -1 to default to hardware settings


Mine was set to 1 and I have a 6800NU.
That is clearly evident, since I said it's enabled for nvidia cards.
I have no problem with Brent doing an out-of-the-box experience, but I'm sure a lot of users will want to know how it will perform the way it eventually will, likely once an upcoming patch enables it.
 
Moloch said:
That is clearly evident, since I said it's enabled for nvidia cards.
I have no problem with Brent doing an out-of-the-box experience, but I'm sure a lot of users will want to know how it will perform the way it eventually will, likely once an upcoming patch enables it.

Why would Valve disable the fastzreject on ATI cards unless it had graphical issues or some other problem?
 
I just flipped through Anand's article; it was pretty interesting with regards to the cards they tested... The 6800GT performed pretty much identically to the X800 Pro (due to the lower # of pipes on the Pro, obviously), and more interestingly the X700, 6600GT, and 6800 vanilla were all neck and neck...

I bought a 6800 vanilla recently and while I certainly don't regret it (the card does perform a touch better than a 6600GT in other situations) I'm interested in how OC'ing both would affect results... Seeing as the 6600GT has substantially faster memory at stock clocks and OCs much higher percentage-wise, from what I've seen anyway.

Definitely a great time for video hardware though; can't remember the last time there were this many great choices at so many different price points and from competing chipset makers.
 
tranCendenZ said:
Why would Valve disable the fastzreject on ATI cards unless it had graphical issues or some other problem?

Supposedly it did, but with older hardware mostly? 'Least that's what I think someone around here insinuated earlier, I wouldn't know really. Looking forward to more in-depth looks at all of this as the reviewers manage to catch a few ZZzzz's. :D
 
Interesting read. I have read so many reviews saying the 6800 was somewhere around 40% slower than the X800 in HL2. Well, I have to thank HardOCP for uncovering the reality, once again.

Also, as a note, the Ultra OC from BFG should have the "OC" dropped. There are more than a few nVidia vendors shipping their cards with 425 as the stock clock, not 400. So, the BFG card is, in its current form, just an average 6800 Ultra. What would everyone have said had they used a Gainward, XFX or eVGA UEE that runs at 450/1200 out of the box for this test?
 
tranCendenZ said:
Why would Valve disable the fastzreject on ATI cards unless it had graphical issues or some other problem?
read the thread dedicated to the issue...
 
I'm confused why Anand's benchmarks have two 6800 GTs on them, which are not equal
 
Rasha said:
I'm confused why Anand's benchmarks have two 6800 GTs on them, which are not equal

They had a 6800 GT and a 6600 GT, which are definitely not equal.
 
-=bladerunner=- said:
Excellent benchmark, Kyle. That only shows how desperate nVidia really is. Their super-high-tech new NV40, even overclocked, can't beat 3-year-old ATi tech :D LOL. GREAT work :D :D :D


Kinda sad considering the X800 needs to have its clock speeds ramped way up just to stay a couple frames ahead of the NV40... now clock that NV40 as high as the X800, or downclock the X800 to the speed of an Ultra, and see who gets whose ass handed to 'em.
 
Both the clock speed and the age of the technology are kinda irrelevant; they are what they are... And both are great cards to play HL2 on, game on!
 
Ya, what the man said. At the end of the day it's all about performance, price, and IQ. Fuck the non-essentials. Seriously, did anyone here buy their 6800GT simply because they like the color green and think all things red should go fuck themselves? If so, please remove yourself from the gene pool (or join Rage, same difference :D). Thank you.
 
Have just surfaced after hours of play and read the review. All I can say is the 6800 Ultra is giving me the graphics and the speed that I put this machine together to deliver and I am loving it!!! Far Cry, Doom III, CODUUO and now this fantastic game. Can't really ask for a better return on the system investment!!!!!! Using the 67.02 drivers thanks to a thread on hardocp!
 
Visable-assassin said:
Kinda sad considering the X800 needs to have its clock speeds ramped way up just to stay a couple frames ahead of the NV40... now clock that NV40 as high as the X800, or downclock the X800 to the speed of an Ultra, and see who gets whose ass handed to 'em.
What are you talking about, the X800 needing high clock speeds? The X800 Pro does extremely well against the 6800GT OC, and the GT is the one with the higher fillrate, about 6000 vs. 5700 Mpixels/s for the X800 Pro, with its memory clocked at 1GHz vs. 900MHz for the Pro.
Suddenly it seems like you have it all backwards. The way I see it, the X800 Pro has 12 pipelines that work harder than the 16-pipe GT: fewer pipelines, but doing more work through a faster clock speed. And the X800 XT is faster even without the fastzreject feature, which is enabled for nvidia cards. So until Valve enables the feature by default, or a site decides to benchmark with that command enabled, the benchmarks are not really apples to apples, and it isn't anyone's fault, except possibly Valve's for not enabling it, since they made the decision not to use it with ATI cards back in the R300 era.
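Back-of-the-envelope, since pixel fillrate is just pipelines times core clock (rough math using the commonly quoted clocks, so treat the exact figures as approximate):

6800 GT OC: 16 pipes x ~370MHz core = ~5900 Mpixels/s
X800 Pro: 12 pipes x 475MHz core = ~5700 Mpixels/s

The Pro gives up pipes but claws most of it back on clock speed, which is why the two land so close together.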
Anyway...
 
Confused as to why "reflect world" instead of "reflect all" was used. Try as I might, I can't think of one logical explanation for this.
 
Visable-assassin said:
Kinda sad considering the X800 needs to have its clock speeds ramped way up just to stay a couple frames ahead of the NV40... now clock that NV40 as high as the X800, or downclock the X800 to the speed of an Ultra, and see who gets whose ass handed to 'em.

meh, who cares which card "gets its ass handed to it".... in this day and age an ass-handing usually only amounts to 5-10 fps in real world gaming.... and since both cards have been shown to deliver over 60fps comfortably at high res with AA/AF, does it even matter at this point?
 
hovz said:
Confused as to why "reflect world" instead of "reflect all" was used. Try as I might, I can't think of one logical explanation for this.
I agree 100%.

They're benchmarking graphics card performance in Half-Life 2. Why would they not push the graphics cards the hardest they could, with MAXIMUM visual quality settings?
 
Apple740 said:
To people who are interested in the difference between Ati's 4xAA and 6xAA@1280x1024:
The only real difference I see is on the overhead power line; anything else is unnoticeable to me.
 
Visable-assassin said:
Kinda sad considering the X800 needs to have its clock speeds ramped way up just to stay a couple frames ahead of the NV40... now clock that NV40 as high as the X800, or downclock the X800 to the speed of an Ultra, and see who gets whose ass handed to 'em.

Why should this make any difference? Who cares what clock speed it needs.
 
Good grief, some of the ATi owners in here are whinging so bad you'd think it was NVIDIA who had the fastest HL2 card!! It's all waaa-waaa-waaa no +r_fastzreject, waaa-waaa-waaa overclocked NV cards, waaa-waaa-waaa no 16x aniso, waaa-waaa-waaa no reflect all......

Guys, you've got the faster card, why the rivers of tears?? How much faster than NV do you want to be before your little eyes will dry? Ok, ATi is 8000% faster than NV!! There, now don't you feel all warm inside? :)
 
coz said:
Good grief, some of the ATi owners in here are whinging so bad you'd think it was NVIDIA who had the fastest HL2 card!! It's all waaa-waaa-waaa no +r_fastzreject, waaa-waaa-waaa overclocked NV cards, waaa-waaa-waaa no 16x aniso, waaa-waaa-waaa no reflect all......

Guys, you've got the faster card, why the rivers of tears?? How much faster than NV do you want to be before your little eyes will dry? Ok, ATi is 8000% faster than NV!! There, now don't you feel all warm inside? :)

We are not crying, just asking to see if it lives up to what others have said. I guess getting a few extra FPS is only a good thing if it's an NV card?
 
creedAMD said:
DW! What's up! This is a great game, Valve really scored big on this one, game of the year for sure! Hope the wife and kids are all well.
Hey Creed! Thanks and I'm happy to report all is very well on the family homefront for me. My wife is doing goodly and so are both the kids, and we got the black labs back again. (Well, one now...one passed on the day after I got her, it pretty much sucked. :( )

I agree about the game, truly an outstanding gaming experience! :cool:

I just finished late-late last night and am thinking about screwing around a bit benching it now. :)

(Sorry for the OT and lateness of the reply, I've been spacey lately. ;) )
 