Half Life 2 Benchmarks @ HardOCP.com

Status
Not open for further replies.
Jonsey said:
Want to give me a list of the stores you have heard of? So does $698 count? How about $697? $696?

I'm not really serious of course. I just can't help taking a thread off course when someone makes a dogmatic statement. I realize the X800 XT PE is very hard to find at reasonable prices, I won't argue that with you. You don't need to give me the dreaded "roll eyes." :)
:rolleyes: LOL :D
If you didn't want to argue, you shouldn't have started.
 
Jonsey said:
Want to give me a list of the stores you have heard of? So does $698 count? How about $697? $696?

I'm not really serious of course. I just can't help taking a thread off course when someone makes a dogmatic statement. I realize the X800 XT PE is very hard to find at reasonable prices, I won't argue that with you. You don't need to give me the dreaded "roll eyes." :)

:rolleyes:

;)
 
^eMpTy^ said:
In HL2 only...and by the same token, the performance difference between the Ultra and Ultra OC is small as well...so the question remains...why don't you just let it go?
Only?
How about any heavy DX9 game?
Far Cry, anyone?
The problem is there aren't enough DX9 games, and even Far Cry, from what I've heard, mostly uses SM1.x shaders.
 
You know, if HL2 is supposed to be an 'ATI' win, why the hell is their side throwing so much shit? I remember the D3 days; there was no shit-slinging from our side [until the Humus tweak hit a week later and traded IQ for FPS]...hells no son, we were too busy gloating and generally enjoying the 'HammerDown' mood [H] had.

Ah the good ol' days. ;)

My advice to the red team is to enjoy this day instead of picking Brent and Kyle apart, it's not like you'll have many more like them. :D

Now if the lot of you will excuse me, I'm going to go play HL2 at 1600x1200 4xAA 16xAF Reflect All with perfectly playable frames.
 
Moloch said:
I can't believe you think that using the 2 card makers (that I know of) shipping overclocked stock GPUs is better than using reference clock speeds
Actually it's BFG, Gainward, Asus and XFX who offer higher (than NV 6800U ref) stock clocks on their 6800 Ultras.

In other news, the X800XT-PE has been renamed from 'Press-Edition' to 'Phantom-Edition' (in case anyone wonders what 'PE' stands for!!). :)
 
Interesting thread, I must say....

I guess there is no way to speak logically to some people that are ruled by blind fanaticism.

I really didn't want to get into it here, but I am surprised that people still don't realize what being limited by the CPU means. Bringing up graphs of CPU-bound benchmark runs backs up what I said earlier. In CPU-bound cases both companies are neck and neck.

Even more strange is that a few people still don't realize what graphically intensive scenes mean. In cases where the scenes are very complex, both the X800PE and X800 Pro dominate.

Edit/Update: Here is a sample of what I'm talking about: http://www.pcper.com/article.php?aid=93&type=expert&pid=7
"The X800 Pro continues to dominate on this level when the water effects are so obviously in play. "

Of course the ultra high end is a foregone conclusion, so I won't provide links.

Guys all I am saying is that in complex scenes the X800 series of cards wins.
Now I am back to "testing"
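For anyone still fuzzy on the CPU-limited point above, here's a minimal toy model (made-up numbers, not real benchmark data): the frame rate you actually see is roughly the slower of what the CPU and the GPU can each deliver, so once the CPU is the bottleneck, every card looks "neck and neck".

```python
# Toy model of CPU-bound vs GPU-bound frame rates (illustrative numbers only).
def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The observed frame rate is capped by whichever stage is the bottleneck."""
    return min(cpu_fps, gpu_fps)

cpu_limit = 90.0  # hypothetical: how many frames/sec the CPU can prepare

# Two hypothetical cards with very different raw GPU throughput:
for card, gpu_limit in [("card_A", 140.0), ("card_B", 180.0)]:
    low_res = observed_fps(cpu_limit, gpu_limit)        # CPU-bound: both show ~90
    heavy = observed_fps(cpu_limit, gpu_limit / 3.0)    # GPU-bound: cards separate
    print(f"{card}: CPU-bound {low_res:.0f} fps, GPU-bound {heavy:.0f} fps")
```

At low settings both hypothetical cards report the same 90 fps (the CPU limit); only when the scene is heavy enough to push the GPU below that limit do the cards actually separate.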
 
CATALYST MAKER said:
Interesting thread, I must say....

I guess there is no way to speak logically to some people that are ruled by blind fanaticism.

I really didn't want to get into it here, but I am surprised that people still don't realize what being limited by the CPU means. Bringing up graphs of CPU-bound benchmark runs backs up what I said earlier. In CPU-bound cases both companies are neck and neck.

Even more strange is that a few people still don't realize what graphically intensive scenes mean. In cases where the scenes are very complex, both the X800PE and X800 Pro dominate.

Given the highly publicized relationship between Valve and ATi, is it safe to assume that the game is already pretty well optimized for ATi hardware? And that speed bumps in HL2 will probably be minor?

And by the same token, given the complete lack of relationship between Valve and nvidia...do you think it's safe to assume that nvidia will easily be able to optimize for HL2 and get some serious speed boosts?
 
Ah, you have no clue how absolutely tempting it is for me to go off on a total rant about the X800 pro over the 6800 GT right now. In the interests of not sounding too !!!!!!ish I'll restrain myself, but ATi is still better in the AA dept than nVidia.

Besides, how come no one ever brings up the X800 pro VIVO modded? I'm sooo bloody in love with mine, it's the 9500 pro of this generation and what I think to be the best bargain around.

The GT is nice, but I can't see saying it's better than the X800 pro definitively...not by a long shot.

(BTW-Terry, your patience and tolerance still amazes me! :shock: ;) )
 
digitalwanderer said:
Ah, you have no clue how absolutely tempting it is for me to go off on a total rant about the X800 pro over the 6800 GT right now. In the interests of not sounding too !!!!!!ish I'll restrain myself, but ATi is still better in the AA dept than nVidia.

Besides, how come no one ever brings up the X800 pro VIVO modded? I'm sooo bloody in love with mine, it's the 9500 pro of this generation and what I think to be the best bargain around.

The GT is nice, but I can't see saying it's better than the X800 pro definitively...not by a long shot.

(BTW-Terry, your patience and tolerance still amazes me! :shock: ;) )

Well I figure if a huge ati fan will concede that the GT is "nice" that's the same as a normal human being saying "yeah the GT is the better card"...which, btw, if you haven't noticed, is exactly what every other person on the face of the planet has said in every review I've read and in every thread on this forum...but keep fighting the good fight dw...:)
 
digitalwanderer said:
Ah, you have no clue how absolutely tempting it is for me to go off on a total rant about the X800 pro over the 6800 GT right now. In the interests of not sounding too !!!!!!ish I'll restrain myself, but ATi is still better in the AA dept than nVidia.

Besides, how come no one ever brings up the X800 pro VIVO modded? I'm sooo bloody in love with mine, it's the 9500 pro of this generation and what I think to be the best bargain around.

The GT is nice, but I can't see saying it's better than the X800 pro definitively...not by a long shot.

(BTW-Terry, your patience and tolerance still amazes me! :shock: ;) )

I really like my x800 pro vivo -> XT PE too. It was just a feel good moment when I flashed it to an XT PE and realized it would work.
 
digitalwanderer said:
(BTW-Terry, your patience and tolerance still amazes me! :shock: ;) )

Digi buddy, I just like interacting with the community. It's one of the main reasons we have been able to work miracles with CATALYST since we introduced it. For me, the main way of finding out what people want from our drivers is by talking to you all.

I merely consolidate what people want and try to deliver it as soon as possible. (However, it does have to make some business sense, so if a feature or support is not there, you can assume it's because there isn't big enough demand.) For example, people were concerned about the size of our all-in-one download. Go take a look at the latest beta driver posting; you will see we took steps to bring it down quite a bit.

On a side note, we do have a really cool CCC roadmap coming; I was just working on a presentation for a meeting in Germany next week.

So keep letting me know what you guys want, I really do value the input.
 
spyderz said:
so to be fair, why don't we just underclock the X800s to the same clock speeds as the NVIDIA cards?

Because then the card would not be running at the MANUFACTURER'S SPECIFICATIONS

and by manufacturer, I mean ATI, not sapphire or visiontek

and NVIDIA not BFG or eVGA

The nvidia based card is overclocked. The ATI card is not.

People benchmark P4 3200s that run at 3.2 GHz next to AMD 3200+'s, which run at significantly lower clock speeds, because of AMD's claim that a 3200+ is equal to or better in performance than the Pentium 4.

NVIDIA claims that its STOCK SPEED core is better than the ATI stock speed card.

But I don't see anything backing this up, because I have not seen an NVIDIA card running at "reference" speed. Only overclocked speed.
 
:rolleyes: Dude, give it up and get it through your thick skull. BFG is the manufacturer; NV designed the card. THAT IS THE STOCK SPEED FROM THE FACTORY. :rolleyes:
 
CATALYST MAKER said:
Digi buddy, I just like interacting with the community. It's one of the main reasons we have been able to work miracles with CATALYST since we introduced it. For me, the main way of finding out what people want from our drivers is by talking to you all.

I merely consolidate what people want and try to deliver it as soon as possible. (However, it does have to make some business sense, so if a feature or support is not there, you can assume it's because there isn't big enough demand.) For example, people were concerned about the size of our all-in-one download. Go take a look at the latest beta driver posting; you will see we took steps to bring it down quite a bit.

On a side note, we do have a really cool CCC roadmap coming; I was just working on a presentation for a meeting in Germany next week.

So keep letting me know what you guys want, I really do value the input.
You're preaching to the choir with me, I really almost did buy a 6800GT that night but went and hunted down an X800 VIVO just because of how great you've been with the drivers and support lately....I just couldn't lose that!

I'm glad I did too, absolutely no regrets. (Well, mainly because my wife is still speaking to me after finding out about me spending $450 US on a card. ;) )

I'm still not into the CCC thing, but I'm following its development and can't wait to see some of these surprises I've heard rumored....sounds like it's going to be compelling enough to make the switch soon.

And as for "everyone on the web saying the 6800 GT is better than the X800 pro", I ain't put my head-to-head comparison online yet. ;)
 
CrimandEvil said:
:rolleyes: Dude, give it up and get it through your thick skull. BFG is the manufacturer; NV designed the card. THAT IS THE STOCK SPEED FROM THE FACTORY. :rolleyes:


So... what you are trying to tell me is that BFG is printing NVIDIA chips at a processor fab?

Or are they taking a box of NVIDIA chips shipped to them and sticking them on some video boards?

BFG (as well as Sapphire) is as much a MANUFACTURER as Dell MANUFACTURES Pentium 4 processors.

And..

the reason why I'm 'balking' is because I want to see a REAL benchmark.

Brent claims not to have overclocked the card by installing Coolbits.. however.. it has been said that the cards are OVERCLOCKED at the assembly plant.

So.. again.. that point is moot.


Again.. If I, theoretically, had an ASSEMBLY plant that sticks the processors on boards, and then overclocks them 40%, would that prove the power of ATI or the power of OVERCLOCKING?

Would my increased speed cards mean that ATI is better, or that my specific BRAND of video cards which "happen to use ati chips" is better?



Let's put it this way:

You build two computer systems.


All parts are identical except for the CPUs.

One CPU is a P4 3.0 GHz actually running at 3.0 GHz.

One CPU is a P4 2.4 being run at 3.8 GHz.

The 2.4 GHz/3.8 GHz system completely blows away the P4 3.0 system.

Does that mean that a P4 3.0 processor is a piece of crap? I think not.

I do not know the ACTUAL numbers.. (I'm sure I could find out)

but..

If the NVIDIA CHIP is specced by NVIDIA to run at, say, 500/1000 and is being run at 540/1080,

and the ATI CHIP is specced to run at 540/1080 and is actually RUN at 540/1080 and comes out neck and neck [as is shown] with the NVIDIA CHIP that's been overclocked, that shows ME that the ATI chip is superior.

It DOES NOT MATTER what the real stock settings are supposed to be.. I don't care if the NVIDIA CHIP is supposed to run at 200/400 and the ATI at 10000/20000..

NVIDIA is claiming that their STOCK chip is faster than ATI's STOCK chip.

The BFG card does not USE STOCK clocks; the ATI card (say, Sapphire's) DOES.


So..
either run both cards at ACTUAL stock

or ..
find out how much the overclock is on the NVIDIA-based card, and overclock the ATI card by that same percentage.


Back to the P4/Athlon comparison: when both processors are run at STOCK speeds (3200+ versus P4 3200) they do sort of match up, because of the different ways they function.

When the AMD is run at the same REAL clock speed, it outperforms the P4.
But that's ONE product overclocked and another NOT overclocked.
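For anyone who wants to see what that "same percentage" proposal actually works out to, here's the arithmetic as a quick sketch. The clocks below are the commonly quoted numbers (400 MHz reference vs 425 MHz for a factory-OC Ultra, 520 MHz for the XT PE); treat them as assumptions, not gospel.

```python
# Quick arithmetic for the "overclock ATI by the same percentage" proposal.
# Clock numbers are assumptions (commonly quoted figures; check your card).
nv_reference_core = 400   # 6800 Ultra reference core clock, MHz
nv_shipping_core  = 425   # a factory-"OC" 6800 Ultra as shipped, MHz
ati_stock_core    = 520   # X800 XT PE stock core clock, MHz

overclock_pct = (nv_shipping_core - nv_reference_core) / nv_reference_core
matched_ati_core = ati_stock_core * (1 + overclock_pct)

print(f"NV factory overclock: {overclock_pct:.1%}")                   # ~6.2%
print(f"ATI core at the same overclock: {matched_ati_core:.0f} MHz")  # ~552 MHz
```

So the proposal amounts to benchmarking the XT PE at roughly 552 MHz, or dropping the OC Ultra back to 400 MHz; either way the comparison is clock-normalized.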
 
Laforge said:
So... what you are trying to tell me is that BFG is printing NVIDIA chips at a processor fab?

Or are they taking a box of NVIDIA chips shipped to them and sticking them on some video boards?

BFG (as well as Sapphire) is as much a MANUFACTURER as Dell MANUFACTURES Pentium 4 processors.

Clock speeds can be controlled by the BIOS, which is set by the manufacturer...i.e....BFG...it's really not that hard of a concept...if you wanna bitch about that, you should also be bitching about the inclusion of the XTPE, which doesn't exist in retail...
 
^eMpTy^ said:
Clock speeds can be controlled by the BIOS, which is set by the manufacturer...i.e....BFG...it's really not that hard of a concept...if you wanna bitch about that, you should also be bitching about the inclusion of the XTPE, which doesn't exist in retail...
The XT vanilla still wins, so what does it matter?
 
^eMpTy^ said:
if you wanna bitch about that, you should also be bitching about the inclusion of the XTPE, which doesn't exist in retail...
There was a lot of hot air in his post; I guess he's just disheartened that Gabe's claim of a 40% performance advantage for ATI cards turned out to be a pile of horse sh**.. LOL
 
mulpsmebeauty said:
Looks like good stuff all round to me. I haven't got HL2 yet and hope to download it this evening. The thought of playing a game with graphics as good as the screenshots look at 1600x1200 4xaa 8xaf at 80-odd fps average on my 6800 Ultra OC is pretty heartening.

I look forward to the more detailed review I'm sure is coming - I'd like to see what the minimum frame rates are like as I've heard a few vague tales about slight jerkiness running at top resolutions on 6800 series hardware in the more open areas of the game.

I hate ppl like you that don't buy an actual game, but instead download it. The game is worth the money. :(
Anyways, with my current rig HL2 runs well.. but after running the CS stress test I'm only gettin 25 fps..which is really...bad. Might have to invest in a newer graphics card since my ATi 9600 256 is letting me down.
 
^eMpTy^ said:
it doesn't matter...that's kinda the point...
Well, the 6800GT and 6800U are still slower, the U more so.
The X800 and GT are still about the same, but at maximum IQ the X800 is faster, as always.
 
Spike23 said:
I hate ppl like you that don't buy an actual game, but instead download it.
Erm, how do you know he's not downloading it from Steam? :confused:
 
Spike23 said:
I hate ppl like you that don't buy an actual game, but instead download it. The game is worth the money. :(
Anyways, with my current rig HL2 runs well.. but after running the CS stress test I'm only gettin 25 fps..which is really...bad. Might have to invest in a newer graphics card since my ATi 9600 256 is letting me down.


Before you accuse him of something...
You do realize he could mean "download it on steam".. right?

That's how *I* got mine.. I DOWNLOADed it.. fully legitimately, too.. No crime here.


Meh.. the poster above me said the same thing.. :)
 
Visable-assassin said:
Kinda sad considering the x800 needs to have its clock speeds ramped way up just to stay a couple frames ahead of the NV40...now clock that NV40 as high as the x800, or downclock the x800 to the speed of an Ultra, and see who gets whose ass handed to 'em.

Downclock a video card? LoL, true [H]. :D Would you downclock an Athlon 64 4000+ just because Intel is not able to deliver something faster than a P4 3.8? Complete nonsense.
 
Moloch said:
The XT vanilla still wins, so what does it matter?
Exactly, it doesn't matter - even with higher stock clocks the 6800U is still slower than the XT in HL2. Complaining about unfair clocks on the NV part when it's slower in HL2 than the XT seems utterly pointless. This is like someone winning an election and then demanding a recount!! :D
 
pahncrd said:
My point was that I am enjoying features unavailable on Nvidia. I would probably switch if that were not the case.
I certainly don't feel any mouse lag, and it never dips below 35 fps or so in extreme firefights, though there is an occasional 'chug' where there's a lot of physics.
I will go through some action parts and take a few screens.

So enjoy your checkbox features and broken video processor, something only nVidia can offer :D :D
 
I haven't seen anyone reply about the supposed poor clock-for-clock performance of the X800 (rough numbers in the sketch below).
The X800 Pro is doing fairly well considering it's going against a card with a bit more fillrate and a memory clock speed of 1 GHz vs 900 MHz. If the X800 Pro had poor clock-for-clock performance, it would be lucky to win any benchmark, ever, since it only has 12 pipes.
coz said:
Exactly, it doesn't matter - even with higher stock clocks the 6800U is still slower than the XT in HL2. Complaining about unfair clocks on the NV part when it's slower in HL2 than the XT seems utterly pointless. This is like someone winning an election and then demanding a recount!! :D
But someone might say the 6800 series is on par with the X800 series, and the water on the 6XXX series seems to be a lot slower, and the min framerates are what's killing the 6XXX series.
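To put rough numbers on the clock-for-clock point above: theoretical pixel fillrate is roughly pipelines x core clock, and with the commonly quoted retail clocks (treat these as assumptions) the two cards come out nearly tied despite the pipe-count gap.

```python
# Rough theoretical pixel fillrate: pipelines x core clock (MHz) -> Mpix/s.
# Clocks are the commonly quoted retail figures (assumptions).
cards = {
    "X800 Pro (12 pipes @ 475 MHz)": 12 * 475,  # 5700 Mpix/s
    "6800 GT  (16 pipes @ 350 MHz)": 16 * 350,  # 5600 Mpix/s
}
for name, mpix in cards.items():
    print(f"{name}: ~{mpix} Mpix/s")
# The X800 Pro makes up its missing 4 pipelines almost entirely on clock
# speed, which is why "clock-for-clock" comparisons between these two
# architectures are so easy to argue about.
```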
 
Since this thread is 20 pages long, I skimmed through it in a bleak attempt to catch up. What I have found with the reviews so far is that the Nvidia camp is only slightly behind the ATI camp in most benches unless water effects are in the scene. Sometimes though we see one like this http://www.xbitlabs.com/articles/video/display/half-life.html in which the high-end ATI cards trounce Nvidia's, sometimes doubling them. I think Nvidia needs to get another driver set out soon because there is no reason the ATI cards should separate by that much. 20% is fine, but at 50%-100% something is wrong.
 
^eMpTy^ said:
lol...this has been explained a hundred times already...the BFG cards are the most popular ones...more people have them than any other card...

Irrelevant. Clock speeds are defined by the IHVs, and the fact that someone sells overclocked cards is just a cheap excuse.
 
-=bladerunner=- said:
Irrelevant. Clock speeds are defined by the IHVs, and the fact that someone sells overclocked cards is just a cheap excuse.

Clock speeds are RECOMMENDED by nVidia and ATI, but the vendors are free to clock them at any speed they want, and as long as that card falls into the same price point as the other cards, it's just as permissible in a review as any other video card. :rolleyes:

-=bladerunner=- said:
So enjoy your checkbox features and broken video processor, something only nVidia can offer :D :D

Enjoy your bug-ridden drivers, complete lack of Linux support, poor OpenGL performance, extensive brilinear filtering tricks, and worthless features like Temporal AA, Overdrive, and 3Dc lol.

BTW, the X800's don't even have a video processor on-chip, so the fact that the 6800's is currently broken doesn't mean much :).
 
burningrave101 said:
Clock speeds are RECOMMENDED by nVidia and ATI, but the vendors are free to clock them at any speed they want, and as long as that card falls into the same price point as the other cards, it's just as permissible in a review as any other video card. :rolleyes:



Enjoy your bug-ridden drivers, complete lack of Linux support, poor OpenGL performance, extensive brilinear filtering tricks, and worthless features like Temporal AA, Overdrive, and 3Dc lol.

BTW, the X800's don't even have a video processor on-chip, so the fact that the 6800's is currently broken doesn't mean much :).
Erm, shimmy shimmy shimmy till the break of dawn much? (Filtering tricks?) And we can turn them off.
Bug-ridden?
No video processor?
...“VIDEOSHADER HD”, which includes both a dedicated and a Pixel Shader-accelerated video engine.
TAA is only useless if you insist on using some insane refresh rate; as long as the avg framerate stays above the refresh (85 fps for an 85 Hz refresh) it will work great, and btw, even after dips it will turn back on when the framerate gets back up to the refresh rate.
3Dc improves performance (see Far Cry) and is higher quality :p
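For what it's worth, the on/off behaviour being argued about here is easy to sketch. This is a rough model of the temporal AA gating as described above, not ATI's actual driver code: temporal AA alternates the sample pattern between frames, and the alternation shows up as shimmer once the frame rate falls behind the refresh rate, so it falls back to standard AA below that threshold.

```python
# Rough sketch of the temporal-AA gating described above -- NOT ATI's actual
# driver logic. Temporal AA alternates the AA sample pattern between frames;
# if fps falls below the refresh rate the alternation becomes visible as
# shimmer, so the driver falls back to plain AA until fps recovers.
def use_temporal_aa(current_fps: float, refresh_hz: float) -> bool:
    """Enable temporal AA only while fps keeps up with the refresh rate."""
    return current_fps >= refresh_hz

refresh = 85.0
for fps in (100.0, 85.0, 60.0, 90.0):
    mode = "temporal AA (alternating patterns)" if use_temporal_aa(fps, refresh) else "standard AA"
    print(f"{fps:>5.0f} fps @ {refresh:.0f} Hz -> {mode}")
# Note the last case: it re-enables once fps climbs back above the refresh,
# matching the "turns back on after dips" behaviour described above.
```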
 
mappy said:
Since this thread is 20 pages long, I skimmed through it in a bleak attempt to catch up. What I have found with the reviews so far is that the Nvidia camp is only slightly behind the ATI camp in most benches unless water effects are in the scene. Sometimes though we see one like this http://www.xbitlabs.com/articles/video/display/half-life.html in which the high-end ATI cards trounce Nvidia's, sometimes doubling them. I think Nvidia needs to get another driver set out soon because there is no reason the ATI cards should separate by that much. 20% is fine, but at 50%-100% something is wrong.

yes.... the x800pro is walking all over my 6800ultra 85% of the time.... and this isn't even the x800XT.. hehehe i guess HL2 really is made for ATI's.. maybe i should've gotten an x800xt... damn ps 3.0 :p o welps.... time to enjoy Black Mesa ....
 
Moloch said:
Erm, shimmy shimmy shimmy till the break of dawn much? (Filtering tricks?) And we can turn them off.
Bug-ridden?
TAA is only useless if you insist on using some insane refresh rate; as long as the avg framerate stays above the refresh (85 fps for an 85 Hz refresh) it will work great, and btw, even after dips it will turn back on when the framerate gets back up to the refresh rate.
3Dc improves performance (see Far Cry) and is higher quality :p

You cannot completely disable all the filtering tricks in the ATI drivers.

Bug-ridden? YES, BUG-RIDDEN! Check a few major gaming forums. ATI drivers have a far greater number of game bugs than nVidia's ever have.

TAA is useless. You must maintain a high FPS count in order for it to even work, and if your FPS is that high you might as well use REAL multisampling AA, because TAA is only emulated AA and it is not as good as the real thing. With TAA, each AA sample pattern is only applied every other frame.

3Dc has no real-world performance or IQ gain over DXT5, and 3Dc will be very lucky to get any kind of real support from game developers. DXT5 has been around for a long time now; ATI tried to push it as well, and hardly anyone ever used it. DXT5 was used in DOOM 3.
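Background for anyone just following this 3Dc vs DXT5 tangent: both approaches store only two components of a unit-length normal and rebuild the third in the pixel shader. Here's the idea as a generic sketch (not any particular game's shader code):

```python
# Generic sketch of two-channel normal-map reconstruction, the trick behind
# both 3Dc and the DXT5 "swizzle" method: store (x, y), then rebuild
# z = sqrt(1 - x^2 - y^2) from the unit-length constraint.
# Not any shipping shader -- just the math.
import math

def reconstruct_normal(x: float, y: float) -> tuple:
    """Rebuild a unit normal from its two stored components."""
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)

print(reconstruct_normal(0.6, 0.0))  # -> (0.6, 0.0, 0.8)
```

The formats differ mainly in where the two channels live and how they're compressed, which is what the quality argument here is about.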
 
Enjoy your bug-ridden drivers, complete lack of Linux support, poor OpenGL performance, extensive brilinear filtering tricks, and worthless features like Temporal AA, Overdrive, and 3Dc lol.



I don't know what card you are talking about.. I see a Linux driver for both ATI and NVIDIA products.

Maybe you got confused and think he's talking about a SiS product or something?
 
Laforge said:
I don't know what card you are talking about.. I see a Linux driver for both ATI and NVIDIA products.

Maybe you got confused and think he's talking about a SiS product or something?

If you had ever run Linux on an ATI video card then you would know exactly what I'm talking about lol. The performance is horrid. A 5950U could easily outpace an X800 XT PE in Linux. The FX 5200 beats a 9800 Pro.

ATI might as well not even support Linux, because if you go and talk to some Linux gurus they will tell you which video card you want to have if you're into Linux.
 
burningrave101 said:
You cannot completely disable all the filtering tricks in the ATI drivers.

Bug-ridden? YES, BUG-RIDDEN! Check a few major gaming forums. ATI drivers have a far greater number of game bugs than nVidia's ever have.

TAA is useless. You must maintain a high FPS count in order for it to even work, and if your FPS is that high you might as well use REAL multisampling AA, because TAA is only emulated AA and it is not as good as the real thing. With TAA, each AA sample pattern is only applied every other frame.

3Dc has no real-world performance or IQ gain over DXT5, and 3Dc will be very lucky to get any kind of real support from game developers. DXT5 has been around for a long time now; ATI tried to push it as well, and hardly anyone ever used it. DXT5 was used in DOOM 3.
Yes you can; where did you hear you couldn't?
Bugs? Perhaps I should cut and paste the list that nvnews must have, like rage3d has?
Having a detailed bug list does not mean ATI has more bugs; it means they have more DOCUMENTED bugs.
I guess every game on nvidia hardware should be listed as a bug, since they have to hack the LOD to get rid of shimmering, not to mention stuttering.
Why would you want to use "real" AA when you have 8x effective AA with the performance of 4xAA?
 
Moloch said:
Yes you can; where did you hear you couldn't?
Bugs? Perhaps I should cut and paste the list that nvnews must have, like rage3d has?
Having a detailed bug list does not mean ATI has more bugs; it means they have more DOCUMENTED bugs.
I guess every game on nvidia hardware should be listed as a bug, since they have to hack the LOD to get rid of shimmering, not to mention stuttering.
Why would you want to use "real" AA when you have 8x effective AA with the performance of 4xAA?

You cannot disable ALL of the optimizations in the ATI drivers through their new AI option. You're able to disable what ATI wants you to disable. It disables all the extra AI stuff, but the brilinear filtering is still in full effect, and it only stops cheating when a colored mip map is detected.

I said go look at some game forums and see which video card has more issues.

With 4xAA TAA you get the "effect" of around 6xAA, not 8xAA. And if your FPS drops then you get a bad shimmering effect. And like I said already, TAA only applies each sample pattern every other frame. It IS NOT as good as the real thing.
 
I hate to interrupt a good fight and all, but has anyone installed HL2 on more than one of their PCs? Am I allowed to do that with me official Steam version or is that a big no-no?

I figured it's an ok thing if I just play it on one PC at once, but I'm getting to the point after reading all these posts that I'm REALLY wanting to check out how it plays on different rigs/configurations and if'n anyone has done this I'd really appreciate hearing about it. :)
 
digitalwanderer said:
I hate to interrupt a good fight and all, but has anyone installed HL2 on more than one of their PCs? Am I allowed to do that with me official Steam version or is that a big no-no?

I figured it's an ok thing if I just play it on one PC at once, but I'm getting to the point after reading all these posts that I'm REALLY wanting to check out how it plays on different rigs/configurations and if'n anyone has done this I'd really appreciate hearing about it. :)

I've got it on 2 computers; can only have one instance of it goin at a time though, 'cause it connects through Steam.
 