If you think Half-Life 2 will avenge ATi...

^eMpTy^ said:
What does the difference in APIs or the specifics have to do with anything? At least one source can show Nvidia beating ATi in Half-Life 2. In my opinion that dispels the rumor that ATi will significantly outperform Nvidia in HL2. And that's all I'm trying to do: dispel a rumor.

You are not "dispelling" anything... you are adding to the rumor. There is no significant evidence to suggest that ATI won't own nVidia in Half-Life... likewise there isn't anything to suggest they will. Why is it some people think they need to "educate" anyone here to begin with? There are far more capable people here to do that...


They run [H]ardOCP.

^eMpTy^ said:
But when have you ever seen a beta of a game not be indicative of final performance to the degree it would have to be for HL2 to counterbalance ATi's poor performance in Doom 3?

You kind of just supported the argument against you... there was nothing indicative in prior beta benches to suggest ATI would perform so poorly in the final release... but it happened.
 
Why is it that this forum has more mindless flaming and pointless arguments than General Mayhem? If my memory hasn't failed me, I seem to remember a time when [H] was a place where nerds could help nerds, not another "The Pit"... This thread makes me sad inside... :(
 
Where's all this going? Knife fights at LANs over who's ATi and who's NVidia? It's crazy. There are better things to argue about. I don't even know what people are arguing about. They're numbers. You can't argue numbers. So NVidia's card gets 4 more fps. Big deal. And I guess if you have a P4, then you just went ahead and pissed all your money away, didn't you?

I happen to prefer ATi because I've always had a good experience. I've had 2 NVidia cards on different occasions show up DOA, so I just lean that way. I'd probably try out NVidia again this time around, but I already have a MAZE4 I plan on reusing, so I'm gonna wait for the X800.

BTW, I'm tired of the unreleased-game benches. I'm sure some people care, but all I want is Lock On: Modern Air Combat and Flight Simulator 2004. Can someone go ahead and give me scores for those?
 
swoop56 said:
BTW, I'm tired of the unreleased-game benches. I'm sure some people care, but all I want is Lock On: Modern Air Combat and Flight Simulator 2004. Can someone go ahead and give me scores for those?

I can't speak for MS FS 2004, but as far as LOMAC is concerned, you'll need a Cray and a GeForce 10 video card w/ 1 gig of RAM to run it with all graphic features enabled at high resolution. ;)


I haven't tried it lately, but it could bring my 3.2GHz P4 and 9800XT to a slideshow, and that still wasn't maxed out.
 
"The 325Mhz GT beating the 520Mhz XTPE is significant, you can't deny that"

Oh come on, the clock speeds by themselves are meaningless across architectures; look at the 9x00 series vs the NV3x series. The result itself is significant, though, I agree. And I personally doubt the difference between the X800 and 6800 will be 40%.

For more HL2 benchies, check out the www.XBITLABS.com 6800/X800 game performance preview, "Highly Anticipated DX9.0 Game 1". Interesting stuff...
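On the clock speed point, here's a back-of-the-envelope sketch (the clocks and pipe counts below are my own assumption of the shipping specs, so treat it as illustrative, not gospel). Raw fill rate is just core clock times pixel pipelines, and on paper it actually favors the XT PE; what it doesn't capture is how much shader work each pipe does per clock, which is where the NV40 and R420 architectures really differ:

```c
/* Back-of-the-envelope fill rate: core clock (MHz) x pixel pipelines.
   The specs below are assumptions about the shipping parts, for illustration. */
#include <stdio.h>

static double fillrate_mpix(double clock_mhz, int pipes)
{
    return clock_mhz * pipes; /* Mpixels/s, assuming one pixel per pipe per clock */
}

int main(void)
{
    printf("6800 GT   (~350 MHz x 16 pipes): %5.0f Mpix/s\n", fillrate_mpix(350.0, 16));
    printf("X800 XTPE (~520 MHz x 16 pipes): %5.0f Mpix/s\n", fillrate_mpix(520.0, 16));
    /* The XT PE wins on paper, yet shader-heavy games hinge on per-clock
       shader throughput and driver efficiency, which this number ignores.
       That's why a lower-clocked GT keeping up isn't shocking by itself. */
    return 0;
}
```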
 
Blackwind said:
You are not "dispelling" anything... you are adding to the rumor. There is no significant evidence to suggest that ATI won't own nVidia in Half-Life... likewise there isn't anything to suggest they will. Why is it some people think they need to "educate" anyone here to begin with? There are far more capable people here to do that...

They run [H]ardOCP.

You kind of just supported the argument against you... there was nothing indicative in prior beta benches to suggest ATI would perform so poorly in the final release... but it happened.

OK, here's the concept one more time: people seem to think NV cards are shit in HL2 because of benchmarks posted by Valve using this same code on an NV3x.

Here we can see that the 6800 keeps up quite nicely in HL2.

So from this, the assumption becomes that both cards will be very competitive in this game, like in other games, and it won't be the "Doom 3 revenge" blowout that some people believe it will be...

Also, yes, there absolutely was evidence in the beta of Doom 3 posted on this very website that ATi would lose badly in Doom 3. Their performance has actually gone UP since then. If you can assume anything about beta-to-final performance changes, it's that performance generally goes up, which it did.

And if you actually read the thread and try to understand what I'm saying instead of dissecting every word I type looking for inconsistencies that aren't there, you would know this by now.
 
Already being an owner of a GT, all these benchies did was make me wanna buy an A64.
A 10-12 fps difference between a P4 3.2 and an A64 3200+??? :eek:

And before peeps start spamming "but you'll never notice the difference": yes, when you're talking about the difference between 50 and 60 fps, there is a noticeable difference in smoothness. Especially considering this will be an uber l33t twitchy FPS game like the original, every fps will count.
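To put actual numbers on the 50 vs 60 fps thing, it's simple frame-time arithmetic (nothing vendor-specific, just a quick sketch):

```c
/* Frame time in milliseconds for a given frame rate: 1000 / fps.
   Quick sketch of why a 10 fps gap around 50-60 fps is noticeable. */
#include <stdio.h>

int main(void)
{
    const double rates[] = { 50.0, 60.0 };
    for (int i = 0; i < 2; i++)
        printf("%2.0f fps = %.1f ms per frame\n", rates[i], 1000.0 / rates[i]);
    /* 50 fps = 20.0 ms, 60 fps = ~16.7 ms: every frame arrives ~3.3 ms
       sooner, and there's more headroom before dips turn into visible
       stutter in a twitchy shooter. */
    return 0;
}
```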
 
Wow, it's really, REALLY sad how some of y'all are arguing over some alpha (and according to Valve, technically not even an alpha yet) benchmarks. For the life of me, I don't know why some people hate Nvidia and why some hate ATI. Y'all sound like the biggest bunch of losers on the planet. There's more important shit to debate on this forum than which card scored higher in a pre-alpha build of a pirated game. It's not like the Doom 3 benchmark, which was conducted by id; this is some damn website benchmarking a pre-alpha!!!! What people need to realize is that if ATI gets its ass whooped, all it will do is drive their prices down to better compete with Nvidia's cards and make them work harder on the next generation of their GPUs. And if Nvidia gets its ass whooped, it'll just make them compete even more, and again drive prices down. Competition is the greatest thing for consumers (whether you're ATI or Nvidia).

It's just really disgusting; someone needs to come up with a harsher name than "fanboy" for the Nvidia/ATI loyalists.
 
This thread is just begging to be pulverized.

Anyways, here's my $0.02:

First off, to nail my colours to the mast: I'm driving a 6800 Ultra, which I'm very happy with, and I'm coming from a Radeon 9700 Pro, which I was very happy with.
I have no loyalty at all to either Nvidia or ATI (I mean, it's not like they're football teams); I pretty much just flipped a coin this time round (more accurately, Nvidia was selling when I was buying).

The HL2 and D3 story isn't finished yet, neither is the ATi vs Nvidia struggle.

My prediction: swings and roundabouts, a draw.
I don't think either side has squeezed everything they're going to out of their drivers (possibly more so in Nvidia's case); I think ATI will pull a better OpenGL driver out and eat into Nvidia's Doom 3 lead.
There will be patches for D3, HL2, and Far Cry for months to come that will change performance in different areas.
I think when it all shakes out there isn't going to be a massive real-world difference between these cards anywhere.
When you're dealing with different architectures you're always going to have strengths and weaknesses in different areas, but so far I have yet to see anything that convinces me that my decision was particularly right or wrong.

I'm happy with the performance I get in Far Cry; I reckon I'll be happy with the performance I get in D3 and HL2, and I reckon the same would be true if I'd gone with ATI.
 
I love posts like this.

It’s understandable the nV gang is riding pretty high since the release of the latest nV gear. After all, the nVidia guys have been getting their arses handed to them by the ATI guys for a long time now and even longer in terms of image quality.

Let the nV guys have a little daylight for now. Once the final software is released and the drivers are updated, we’ll all know the truth.
 
ICOM said:
I love posts like this.

It’s understandable the nV gang is riding pretty high since the release of the latest nV gear. After all, the nVidia guys have been getting their arses handed to them by the ATI guys for a long time now and even longer in terms of image quality.

Let the nV guys have a little daylight for now. Once the final software is released and the drivers are updated, we’ll all know the truth.


And where do you get this info from? I like it when people say "then" we will know the truth, because it means you absolutely don't know why the nV architecture is better than ATi's now.

BTW, is ATi going to 0.11 micron for their next high-end cards, or the R500? Answer this, then ask yourself why I asked.
 
First off, I love how the "benchmarks" used an overclocked 6800 Ultra and didn't bother to overclock the X800.

Secondly, who the fuck cares?
If you're enough of a douche to worry about 1-4 fps, you should get a life ;)

Just get whichever one you want... there will always be something better coming out within the next 6 months anyway.
 
It wasn't a "beta" :rolleyes:. As with Doom 3, it was a build of the game put together for the E3 show. In the build that got released on the net, a lot of the textures didn't work; on the Strider map the Citadel was missing, characters were missing, buildings were missing, etc. All this has an obvious effect on performance. If people are actually dumb enough to take numbers from a "benchmark" of pure crap, then let them. I think most others would prefer HardOCP or some other site to get hold of a finished product and run those benchmarks.
 
rancor said:
And where do you get this info from? I like it when people say "then" we will know the truth, because it means you absolutely don't know why the nV architecture is better than ATi's now.

BTW, is ATi going to 0.11 micron for their next high-end cards, or the R500? Answer this, then ask yourself why I asked.

Huh :confused:

What lines are you reading between?

I only suggested we wait until the software is released before anybody goes around beating their chest. That's all... :rolleyes:
 
So Empty, what we're saying here is, "Why would anyone buy an X800 XT when they can get a 6800U?"

I STILL want the X800 XT (IceQ II); however, the deciding factors will be 1) whether the X800 XT is better in HL2, and apparently it isn't... 2) the graphics: will Doom 3 and especially Half-Life 2 LOOK better, even if just slightly, on the X800 XT?

However, now the 6800U is more tempting than ever... I'm starting to be convinced...


edit: "first off i love how the "benchmarks" used an oced 6800 ultra and didnt bother to oc the x800"

:O OMG, then maybe the X800 XT IS better in HL2... we will have to see...
 
ICOM said:
Huh :confused:

What lines are you reading between?

I only suggested we wait until the software is released before anybody goes around beating their chest. That's all... :rolleyes:


Answer my questions then; I doubt you can. In fact, even if you do, you will really be in a corner :D
 
"If god had wanted us to be safe, he wouldn't have put us on a tectonically unstable ball of molten rock flying through space at 66-and-a-half thousand miles per hour with only a thin layer of gas to protect us. "

lol at that quote, nice.
 
^eMpTy^ said:
Isn't it obvious?

I'm just out to educate the masses.

Now THAT is THE funniest thing I've heard in a long time... out to educate the masses with a stolen leak!

Keep your GT ;) I'm sure the difference in performance won't be as bad as you're worried about, what with this continuous drivel about Nvidia beating ATI in Half-Life, lol.
 
rancor said:
Answer my questions then; I doubt you can. In fact, even if you do, you will really be in a corner :D

Rancor! Don’t make me fart (spoken with French accent) in your general direction! :p
 
Hehe, well, this has been coming a long time for ATi; they went to a fab process they shouldn't have moved to so soon.
 
MartinX,

I applaud you. One of the most sensible posts I've read on here in quite some time. People are truly getting hung up on 3-10 fps. Both cards have pros and cons; deal with it.

- D.
 
IMAGE QUALITY :)

Both are about as fast; which will look better is the question... :)
 
Just keep in mind that if 3Dc support is added to Doom 3 in a patch, then there should be a 10 to 20 percent FPS improvement on X800s at "high" settings. Basically, it should be the speed of medium settings with the visual quality of high settings.
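For anyone wondering what 3Dc actually does: it's a two-channel compression format aimed at normal maps. It stores only the X and Y of each tangent-space normal and leaves the shader to rebuild Z, since a unit normal satisfies x^2 + y^2 + z^2 = 1. Here's a minimal sketch of the reconstruction in plain C (illustrative only; the function name is mine, and this is not id's or ATI's actual code):

```c
/* Sketch: rebuilding a tangent-space normal from the two channels that a
   3Dc-style format stores. Plain C for illustration, not engine code. */
#include <math.h>
#include <stdio.h>

/* stored_x/stored_y are the sampled channels in [0,1]; n receives the normal. */
static void reconstruct_normal(float stored_x, float stored_y, float n[3])
{
    float x = stored_x * 2.0f - 1.0f;      /* expand [0,1] -> [-1,1] */
    float y = stored_y * 2.0f - 1.0f;
    float zz = 1.0f - x * x - y * y;       /* z^2 = 1 - x^2 - y^2 for a unit vector */
    n[0] = x;
    n[1] = y;
    n[2] = zz > 0.0f ? sqrtf(zz) : 0.0f;   /* tangent-space normals face outward, z >= 0 */
}

int main(void)
{
    float n[3];
    reconstruct_normal(0.75f, 0.5f, n);    /* e.g. a bump tilted toward +X */
    printf("normal = (%.3f, %.3f, %.3f)\n", n[0], n[1], n[2]);
    return 0;
}
```

The trade-off: two dedicated channels compress normals with less blockiness than squeezing them into DXT color formats, but the shader pays a few extra instructions for the reconstruction, which is part of why the real-world speedup is debatable.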
 
Dude I don't take ANYTHING personally, I'm just out to educate the masses.

^empty^, you have a daily post count average of around 45, with many being in the video card forum on 6800 threads. Many times, you post two or three times in a row before anyone else responds. No one spends that much time on a topic if they haven't taken it personally. I think you care more about 6800 GTs than just about anything. :)
 
arentol said:
Just keep in mind that if 3Dc support is added to Doom 3 in a patch, then there should be a 10 to 20 percent FPS improvement on X800s at "high" settings. Basically, it should be the speed of medium settings with the visual quality of high settings.

10-20%? No, it's more like 2%, as explained early in the Doom 3 sticky. It helps marginally in speed and quality.
 
rancor said:
Hehe, well, this has been coming a long time for ATi; they went to a fab process they shouldn't have moved to so soon.

ATI has been using 130 nm low-k since the RV360, and the RV360 and R420 are clocked at almost the same frequency, so what do you mean by "too soon"?

As for 110 nm, the low-end X300 (RV370) cards use it. It won't be used on next-gen high-end GPUs, though, since 90 nm low-k high-speed will be available around Q4 this year, so they get some time to test it out first with their low end.
 
I thought the Half-Life 2 beta ran in OpenGL, something about D3D not being implemented yet. I, no wait, a friend remembers not being able to select the D3D path in the options.

Can anyone comment on this?
 
I think they will be close in HL2, with a slight lead to ATI, but I'll wait 'til official benchmarks at HardOCP before I pick a card to definitively own in HL2 :p
 
Jonsey said:
^empty^, you have a daily post count average of around 45, with many being in the video card forum on 6800 threads. Many times, you post two or three times in a row before anyone else responds. No one spends that much time on a topic if they haven't taken it personally. I think you care more about 6800 GTs than just about anything. :)

Thanks for the analysis; I'm so glad you care...

What I care about is people making informed decisions when they buy video cards... I don't take it personally, I just really, really like to talk about it... it's kinda sick, really...
 
OriginalReaper said:
I thought the Half-Life 2 beta ran in OpenGL, something about D3D not being implemented yet. I, no wait, a friend remembers not being able to select the D3D path in the options.

Can anyone comment on this?

My understanding is that it's a glitch in the alpha and the whole game is D3D only...
 
I guess a lot of peeps went by the link on the front page of [H], where Gabe (chief cook and bottle washer at Valve) said there is a significant difference in fps, with the advantage going to ATI. Maybe it has something to do with the $6 million ATI gave Gabe, much like the D3 advantage may have something to do with the fact that it is a TWIMTBP game, i.e., big $$$$$ for Carmack. If you think Carmack isn't in it for the money, why is the "game only" version delayed, while the "game + $1 figurine" version will be out en masse on Aug 4th? It's a business, and it's about $.
 
I really don't care. I'm in the market for a new vid card anyway. I originally wanted a Radeon X800, BUT the 6800 is looking mighty tasty (yup, the vanilla one; it's almost as fast as an X800 Pro for $100 less).
 
Just a quick comment to those saying this performance may not be indicative of the final version: has anyone considered that this is an actual beta? If Newell is to be believed that HL2 is about to be released, then they would have passed the beta stage quite some time ago.
 
ShowMeThe$$$$$$ said:
Careful. Mommy or Daddy might find some of your grass in the laundry and wanna have a talk. Get a life offline, for the love of Jesus.

^empty^, you have a daily post count average of around 45, with many being in the video card forum on 6800 threads. Many times, you post two or three times in a row before anyone else responds. No one spends that much time on a topic if they haven't taken it personally. I think you care more about 6800 GTs than just about anything.

AHEM !!!!

Nothing like posting the same exact information across every thread. Here is a sample.

Yeah, the 6800 owns in Doom 3

Yeah, ATI sucks at everything

I would suggest stepping away from the computer for a good two months to get those things worked out.

/rant.... sorry.
 
Ardrid said:
Just a quick comment to those saying this performance may not be indicative of the final version: has anyone considered that this is an actual beta? If Newell is to be believed that HL2 is about to be released, then they would have passed the beta stage quite some time ago.

There is no proof that it is a recent build, so unless you can find some, it's just the incomplete leak from a year ago.
 
I didn't say it was; I just said don't discount the possibility that it might be.
 
Oh, I haven't. It's most of the 6800 owners who are close-minded on the issue.
 
God damn it, people, ATI was forced to disable their filtering optimizations for these tests. Wait, let me say it my way: ATI was forced not to cheat this time with these benchmarks... :eek:
 