More HL 2 Source Benches at Gamers Depot...

Most important part:

Given this early peek at Source performance with today's current top hardware, it looks as if Half-Life 2 will run excellently on whichever brand of GPU you happen to put in your PC. Sure, there's a slight nod towards ATI, but we won't count NVIDIA out until we test a shipping version of the game and give both companies yet another chance to release timely drivers once the game hits retail.

:)
 
The Batman said:
Why is everyone ignoring the fact that the GT beats the Pro in those benches, with OPTS OFF I might add? I mean WHY is everyone throwing up a big shit about the XTPE? You can't even buy the damned thing at this point.

dude...you're the only one I see getting excited in this 4 post thread...:p
 
The Batman said:
Why is everyone ignoring the fact that the GT beats the Pro in those benches, with OPTS OFF I might add? I mean WHY is everyone throwing up a big shit about the XTPE? You can't even buy the damned thing at this point.
It's OVERCLOCKED. OVERCLOCKED. DO YOU READ IT? Okay.
 
Although the benchmark is only a possible preview of real game performance, it's still interesting. For all those making a big deal about the performance disparity between ATI and nVIDIA, take a look at these percentage values and relax. It's not a big deal, and remember, the game still hasn't gone gold.

Most importantly, if this benchmark is representative of actual gameplay, then Half-Life 2 and its derivatives will be very playable.

Well, here are the quick calculations I did based on that Gamer's Depot benchmark chart to give an overall view:

Percentage difference in performance between the X800XT PE and 6800 Ultra

1600x1200, No AA, No AF: 7.95% (ATI)
1280x1024, No AA, No AF: 11.9% (ATI)
1024x768, No AA, No AF: 0.25% (ATI)

1600x1200, 4xAA, 8xAF: 26.9% (ATI)
1280x1024, 4xAA, 8xAF: 17.2% (ATI)
1024x768, 4xAA, 8xAF: 11.2% (ATI)

Percentage difference in performance between the 6800GT and X800 Pro:

1600x1200, No AA, No AF: 2.56% (nVIDIA)
1280x1024, No AA, No AF: 2.47% (nVIDIA)
1024x768, No AA, No AF: 1.79% (nVIDIA)

1600x1200, 4xAA, 8xAF: 4.10% (nVIDIA)
1280x1024, 4xAA, 8xAF: 1.80% (nVIDIA)
1024x768, 4xAA, 8xAF: 1.85% (nVIDIA)
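For anyone wanting to reproduce these figures from the raw chart, here's a minimal sketch of the math in Python. The FPS values in it are hypothetical (the chart's raw frame rates aren't reproduced in this thread), and the formula assumed is the faster card's lead expressed as a fraction of the slower card's frame rate.

```python
def percent_lead(winner_fps, loser_fps):
    """Percentage by which the faster card leads the slower one."""
    return (winner_fps - loser_fps) / loser_fps * 100

# Hypothetical example: an XT PE averaging 68 fps against an Ultra at 63 fps
# works out to a lead of about 7.94%, in the ballpark of the 1600x1200
# no-AA figure above.
print(round(percent_lead(68, 63), 2))  # prints 7.94
```

Note this is the lead relative to the slower card; dividing by the faster card's frame rate instead would give slightly smaller percentages.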
 
SuperTurtle said:
It's OVERCLOCKED. OVERCLOCKED. DO YOU READ IT? Okay.

you should be banned on general principle...

it's a 6800 GT OC...that's the name of the card...it's only clocked to 370 and it sells at retail at that speed, so it doesn't count as an overclock...besides, nearly everyone who owns a GT is running at at least 400MHz...
 
Optimummind said:
Although the benchmark is only a possible preview of real game performance, it's still interesting. For all those making a big deal about the performance disparity between ATI and nVIDIA, take a look at these percentage values and relax. It's not a big deal, and remember, the game still hasn't gone gold.

Most importantly, if this benchmark is representative of actual gameplay, then Half-Life 2 and its derivatives will be very playable.

Well, here are the quick calculations I did based on that Gamer's Depot benchmark chart to give an overall view:

Percentage difference in performance between the X800XT PE and 6800 Ultra

1600x1200, No AA, No AF: 7.95% (ATI)
1280x1024, No AA, No AF: 11.9% (ATI)
1024x768, No AA, No AF: 0.25% (ATI)

1600x1200, 4xAA, 8xAF: 26.9% (ATI)
1280x1024, 4xAA, 8xAF: 17.2% (ATI)
1024x768, 4xAA, 8xAF: 11.2% (ATI)

Percentage difference in performance between the 6800GT and X800 Pro:

1600x1200, No AA, No AF: 2.56% (nVIDIA)
1280x1024, No AA, No AF: 2.47% (nVIDIA)
1024x768, No AA, No AF: 1.79% (nVIDIA)

1600x1200, 4xAA, 8xAF: 4.10% (nVIDIA)
1280x1024, 4xAA, 8xAF: 1.80% (nVIDIA)
1024x768, 4xAA, 8xAF: 1.85% (nVIDIA)

Thanks for the numbers...not to forget, optimizations are off on the 6800s and they aren't using the new drivers...that could yield up to a 15-20% boost in performance...I think it'll turn out to be a lot closer than people seem to think...
 
SuperTurtle said:
It's OVERCLOCKED. OVERCLOCKED. DO YOU READ IT? Okay.

IT'S RETAIL. IT'S RETAIL. IT'S ALSO BY 20MHZ WHICH IS NO BIG DEAL. Okay.

BTW, you can knock the clock down by 20MHz; that'll take away about 1 FPS [this I know from actually HAVING a GT and knowing my performance benefits from OCing], but then you gotta enable optimizations. Those opts typically give me 4-5 FPS. The GT wins by an even greater margin.

In other words, it's clear that the GT wins.
 
Let's calm down, boys. No need to get your panties in a bunch over something as minimal as this.
 
Optimummind said:
Although the benchmark is only a possible preview of real game performance, it's still interesting. For all those making a big deal about the performance disparity between ATI and nVIDIA, take a look at these percentage values and relax. It's not a big deal, and remember, the game still hasn't gone gold.

Most importantly, if this benchmark is representative of actual gameplay, then Half-Life 2 and its derivatives will be very playable.

Well, here are the quick calculations I did based on that Gamer's Depot benchmark chart to give an overall view:

Percentage difference in performance between the X800XT PE and 6800 Ultra

1600x1200, No AA, No AF: 7.95% (ATI)
1280x1024, No AA, No AF: 11.9% (ATI)
1024x768, No AA, No AF: 0.25% (ATI)

1600x1200, 4xAA, 8xAF: 26.9% (ATI)
1280x1024, 4xAA, 8xAF: 17.2% (ATI)
1024x768, 4xAA, 8xAF: 11.2% (ATI)

Percentage difference in performance between the 6800GT and X800 Pro:

1600x1200, No AA, No AF: 2.56% (nVIDIA)
1280x1024, No AA, No AF: 2.47% (nVIDIA)
1024x768, No AA, No AF: 1.79% (nVIDIA)

1600x1200, 4xAA, 8xAF: 4.10% (nVIDIA)
1280x1024, 4xAA, 8xAF: 1.80% (nVIDIA)
1024x768, 4xAA, 8xAF: 1.85% (nVIDIA)

May I congratulate you on making the first goddamned useful post during this entire saga since the DH benches were released. Your numbers are very useful and show that it's not the end of the world...the 6800GT and X800 Pro are very close...nothing to worry about. The 6800U and XTPE show more disparity, but as you have pointed out this may be due to driver-related issues (such as opts not being enabled, etc.).
 
doh-nut said:
because it kicks all other cards' asses maybe?

Vaporware victories don't impress me much. Now if the card was actually available...funny how you choose to leave that part of my post out of your quote.
 
The Batman said:
Vaporware victories don't impress me much. Now if the card was actually available...funny how you choose to leave that part of my post out of your quote.

funny i know several people that have it, maybe you're not looking hard enough. oh ill repeat it in the other thread in case you miss this.
 
doh-nut said:
funny i know several people that have it, maybe you're not looking hard enough. oh ill repeat it in the other thread in case you miss this.

Instead of talking down to me why don't you put your damned money where your mouth is. You wanna win me over you post a link with an XTPE in stock at MSRP. If you can't do that...go home titto.
 
The Batman said:
Instead of talking down to me why don't you put your damned money where your mouth is. You wanna win me over you post a link with an XTPE in stock at MSRP. If you can't do that...go home titto.

you are a bitter little boy who is anti-ATI. this is true. but you don't have to act like an asshole on this forum to justify your purchase.
 
He has a valid point, though. It's NOT available retail.

Remember when the 5800 Ultra 'came out'? It was so rare, and completely non-existent in retail channels, that [H] refused to use it in benchmarks; they would always clock it down to the non-Ultra levels as 'that was the only card ACTUALLY available'.

Yeah, some people actually HAD 5800 Ultras...hell, some people still DO. Several manufacturers came on board to produce them! But, they are rare as HELL because they never entered mass production, and always sold way over MSRP wherever they DID show up.

In any case, as has been pointed out several times: turning OFF driver opts on nVidia cards but leaving them ON for ATI, AND running the latest speed-enhancing Cats against month-old ForceWares, AND running a stress test using SM2.0b - ATI's highest level - instead of SM3.0 - nVidia's - when the final game will end up with SM3.0 support...and nVidia still only loses by a little bit! Compare that to the utter pwning nVidia dealt ATI in Doom3.

Give this another month to come out, and a few weeks after that new drivers and an SM3.0 patch....

You know what's going to happen, and you're just in denial. The holiday season graphics card roundups are going to show nVidia cleaning ATI's clock in HL2 *and* Doom3 (although, granted, much closer in HL2 than Doom3). Bet on it!
 
dderidex said:
You know what's going to happen, and you're just in denial. The holiday season graphics card roundups are going to show nVidia cleaning ATI's clock in HL2 *and* Doom3 (although, granted, much closer in HL2 than Doom3). Bet on it!

you assume im an ATI fanman. well im not. im an anti-fanman. im about to get a 6800gt anyway. the point is, batman is ridiculous the way he bashes ATI, and any benchmark that shows ATI doing well he disregards. its like nvidia has wired his brain.
 
Well, from all the benchies I've seen since these high-end cards came out, this is my conclusion:

ati > nvidia in direct3d
nvidia > ati in opengl

As most games are in Direct3d my vote goes for ATI :)
 
hmm... yes, the GT beat the Pro, but isn't the GT nVidia's 16-pipe non-ultra card, while the Pro is ATi's 12-pipe card? wouldn't it be more fair to compare the XT and the GT, not the Pro and the GT? Eh, whatever.
 
The Batman said:
Vaporware victories don't impress me much. Now if the card was actually available...funny how you choose to leave that part of my post out of your quote.

Funny, the X800 XT PE that has been in my system since mid-June seems very real to me.
 
milling_hordesman said:
hmm... yes, the GT beat the Pro, but isn't the GT nVidia's 16-pipe non-ultra card, while the Pro is ATi's 12-pipe card? wouldn't it be more fair to compare the XT and the GT, not the Pro and the GT? Eh, whatever.

While the number of pipelines might be interesting to us, a normal customer will only compare cards at the same price point, not those with the same number of pipelines. That's why the 6800GT with an MSRP of $400 is compared to the $400 X800 Pro. Btw, they should include the Ultra Extreme in those benchmarks too.
 
The closest I've gotten to getting an actual X800XT PE was yesterday, when the PowerColors came in on Newegg, and the only reason I didn't get one was because I've owned a 6800 Ultra for the past 3 weeks. That's the funny part.

I've been actively pursuing a reasonably priced one for about 2 months now, almost obsessively at some points. Anyone who says that the XTPE at this time is virtual vaporware, considering its availability and price, is not full of crap nor a !!!!!!. They're being realistic.

As for the recent benchmarks concerning 6800 Ultras and X800XTs on the currently available HL2 Source demo, the discrepancy between Driver Heaven's results and all other sites/gamers posting results based on this demo should be cause for alarm in any hardware consumer's mind. I have been going to DriverHeaven for some time now, and there is some truth to the accusation of an ATI bias concerning a matchup between nVidia and ATI. This is not a unique situation, however. FiringSquad has shown a similar bias in recent months.

Whether or not this bias is intentional is questionable, of course. The problems start to come in when (as HardOCP consistently touts) an "Apples to Apples" comparison is attempted between the two cards.

Problem 1: Optimizations

By default, ATI's X800 performs texture optimizations that cannot be turned off. For some reason, hardware sites seem unable to grasp this, or cannot comprehend how to configure a "fair" test based on this difference. For a base test in this type of comparison the answer is simple, really: turn on similar optimizations on the nVidia card and run the test.

Problem 2: Drivers

This is probably an even larger problem, considering the various differences between updates for the two cards. It is not as if each company makes an update and they go tit-for-tat in the technologies they are updating. Not only do both cards feature different graphics technologies, they also have inconsistent performance in the technologies they do share, due to a multitude of hardware and software differences. Therefore it is not enough to simply say "well, this is the newest available driver each has out" and go from there, nor is it enough to say "these are the latest beta drivers each has out" and go from there. Unfortunately, concerning this matter, the aforementioned standards are the best any hardware site can hope to apply.

Problem 3: Environment

The proverbial Thunderdome for all hardware is what it is running against/with. What's your CPU? What's your memory? What's your chipset? What are your BIOS settings? What OS are you running? What API versions are we dealing with (DirectX, OpenGL)? What order did you install drivers/software in? Are there any conflicts between drivers/software? Is this a fresh install of hardware?... This is the major downfall of all benchmarks, not just these. Not only can each have an effect on your test, but they can have an effect on each other, which can indirectly affect your test.
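That environment checklist is exactly the kind of thing a review site could publish as structured data next to each result. A minimal sketch in Python follows; every field value is hypothetical and not taken from any of the reviews discussed here.

```python
from dataclasses import dataclass, asdict

@dataclass
class BenchEnvironment:
    """Settings that should be reported alongside any GPU benchmark run."""
    cpu: str
    memory: str
    chipset: str
    os: str
    directx_version: str
    gpu_driver: str
    fresh_install: bool

# Hypothetical test rig for illustration only.
env = BenchEnvironment(
    cpu="Athlon 64 3500+",
    memory="1GB DDR400",
    chipset="nForce3",
    os="Windows XP SP2",
    directx_version="9.0c",
    gpu_driver="ForceWare 61.77",
    fresh_install=True,
)
print(asdict(env))
```

Publishing a record like this with every chart would let readers spot exactly which variable (driver build, opts, OS) differs between two sites' results.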


I have never seen a benchmark I did not have some cause for question in the setup or results.
 
thrawn42 said:
The closest I've gotten to getting an actual X800XT PE was yesterday, when the PowerColors came in on Newegg, and the only reason I didn't get one was because I've owned a 6800 Ultra for the past 3 weeks. That's the funny part.

I've been actively pursuing a reasonably priced one for about 2 months now, almost obsessively at some points. Anyone who says that the XTPE at this time is virtual vaporware, considering its availability and price, is not full of crap nor a !!!!!!. They're being realistic.

As for the recent benchmarks concerning 6800 Ultras and X800XTs on the currently available HL2 Source demo, the discrepancy between Driver Heaven's results and all other sites/gamers posting results based on this demo should be cause for alarm in any hardware consumer's mind. I have been going to DriverHeaven for some time now, and there is some truth to the accusation of an ATI bias concerning a matchup between nVidia and ATI. This is not a unique situation, however. FiringSquad has shown a similar bias in recent months.

Whether or not this bias is intentional is questionable, of course. The problems start to come in when (as HardOCP consistently touts) an "Apples to Apples" comparison is attempted between the two cards.

Problem 1: Optimizations

By default, ATI's X800 performs texture optimizations that cannot be turned off. For some reason, hardware sites seem unable to grasp this, or cannot comprehend how to configure a "fair" test based on this difference. For a base test in this type of comparison the answer is simple, really: turn on similar optimizations on the nVidia card and run the test.

Problem 2: Drivers

This is probably an even larger problem, considering the various differences between updates for the two cards. It is not as if each company makes an update and they go tit-for-tat in the technologies they are updating. Not only do both cards feature different graphics technologies, they also have inconsistent performance in the technologies they do share, due to a multitude of hardware and software differences. Therefore it is not enough to simply say "well, this is the newest available driver each has out" and go from there, nor is it enough to say "these are the latest beta drivers each has out" and go from there. Unfortunately, concerning this matter, the aforementioned standards are the best any hardware site can hope to apply.

Problem 3: Environment

The proverbial Thunderdome for all hardware is what it is running against/with. What's your CPU? What's your memory? What's your chipset? What are your BIOS settings? What OS are you running? What API versions are we dealing with (DirectX, OpenGL)? What order did you install drivers/software in? Are there any conflicts between drivers/software? Is this a fresh install of hardware?... This is the major downfall of all benchmarks, not just these. Not only can each have an effect on your test, but they can have an effect on each other, which can indirectly affect your test.


I have never seen a benchmark I did not have some cause for question in the setup or results.


hence my sig
 
Wow

Yea...I can't get any 6800 Ultras either. Seems like that is just as "vaporware" to me.
That's a retarded statement too...vaporware. Just because it's not in your hands right away doesn't mean it doesn't exist and isn't coming.

I also find it funny how people (especially nvidia owners) are a little too passionate about the benchmarks. When Doom 3 came out and did very well with nvidia cards, no one wanted to even think "Hey, maybe ATI will work on drivers to help close that gap" - which isn't unreasonable or impossible (Far Cry for nvidia).
Now that ATi has winning benchmarks, it's all *yelling* "It's a driver problem! Cheating! Biased! You didn't enable this and that!" Holy crap, people need to relax. Seems like these boards have been more "violent" lately.

It just seems like the same thing is going to happen with HL2 that happened with doom 3. ATi cards are going to perform better (game out of box) for HL2 (much like Nvidia out of box for Doom3). Patches/drivers will come out for BOTH cards that will speed up the game.

Here's the deal too with both cards...you're going to buy a new one in 2 years to play whatever great thing comes along. It ALWAYS happens. The top dog card now is going to be mid-level later, so there is also no "future-proof" card. When I read that, it's funny.
Both companies did a GREAT job this time of making good cards. [H] even had a little comment on the main page stating that, and it's true. We as consumers won this time and don't have to worry about hating gaming for a while.
 
Considering that ATI paid Valve US$6 million: is anyone surprised? nVidia asked Carmack what he wanted and they built it, money not necessary. Oh, ATi could have had fun too, had their employee(s) been a little more honest and not leaked the alpha. Just like nVidia could've tossed Valve US$7 million just to outperform ATi by a little. Pretty soon, listing who invested in the game is going to become commonplace, methinks...

We knew it would be faster, but that difference has to stay about 15% higher than nVidia, or they are in sales trouble. !!!!!!'s aside anyway.

Me, I bought my rig to play D3. The two DX3D games I play are SWG & UT. I never liked the *original* HL, and if I was going for realism, well I was gonna play AA not CS. Of course tastes vary so one must always choose based on what they will play.

ATi's First *major* problem: The XT PE is just not available to most people, and they keep seeing the Pro getting its nuts handed to it (at least that's how most will view it). The company I work for has *never* seen an ATi XT. Powercolor yes, but no one else yet. (I've had 2 Powercolors incidentally, 1 Pro and 1 XT, in my store.) I'll get told, "Sure, we'll send you some," only to find their "allotment has been cut." The real stickler here is that all the while this is happening, 6800GT's / OC and Ultra / OC's (from eVGA & BFG) are coming in, and walking out near same day if not the same day.

The "cheaper chip", according to ATi, has more problems getting out of the foundry? It's harder to get out of the foundry because of the clock speed, maybe? Cheaper chip? Yeah, until you factor in the failure rate at the foundry, I bet. That is the *only* logical explanation why you cannot get an XT. Nothing else fits, unless they just sold their stock to a company that doesn't plan on selling them.

ATi's Second problem: C'mon, you didn't see the GT coming? Your Pro is getting spanked simply because you went cheap. Did you have to lower the Clock Speed and cut the number of pipes? You followed your predictable strategy and nVidia spanked you for it.

ATi's Third problem: Anyone ever try overclocking their XT's without water-blocking? The kid who bought the XT from me came back and bought a TT Aquarius 3 and Video Block, because he couldn't bump it 10MHz without crap mystically appearing on his desktop and his game refusing to run. He waterblocked it and I haven't heard back from him yet, but he's supposed to come in and tell me how that went ;)

ATi's Fourth problem: nVidia has been dancing shoulder to shoulder with Valve throughout the creation process - hasn't it occurred to you? Now, when the game starts shipping, will nVidia pull off one of their driver optimizations which will evaporate your lead? You have yet to diminish their Doom destruction of your cards. Will this slow down your "new" OpenGL driver? You can't cry foul either, oh great ATi optimizers, for the moral high ground you tried to take fell when you put in your own optimizations for mips.
 
My own feeling is simple: until ATI allows their optimizations to be shut off in the driver without having to use several registry hacks to do it, the optimizations should be left on for the nVidia cards. Had they been left on in the Gamers Depot review, I think you would have seen the 6800 Ultra and X800 PE performing much closer to each other.

Not to mention the GT would be thoroughly trouncing the X800 Pro ;)

I am nonetheless pleased to see the GT getting 57 fps on average in the Driver Heaven review; despite its biased reviewers, it's actually running faster than D3 does at 16x12 4xAA 8xAF, roughly speaking, which bodes well for playability across the spectrum of vid cards.

I played through Doom 3 16x12 4xAA 8xAF and my GT at ultra speeds tore through the game like it was buttah.
 
Netrat33 said:
Wow

Yea...I can't get any 6800 Ultras either. Seems like that is just as "vaporware" to me.
That's a retarded statement too...vaporware. Just because it's not in your hands right away doesn't mean it doesn't exist and isn't coming.

May I suggest you can't use Google? Ultras pop on and off sites every few days. The easiest to get your hands on at the moment seems to be eVGA's version. Again, I could give two shits about this ATI vs. nVidia thing. This is coming from someone who has been doing extensive searches for both cards over the past two months.
 
Yeah, Ultras are in stock all over the place. I got my BFG Ultra in the UK from PC World of all places...

But XTPE's...can't find one for the life of me...
 
thrawn42 said:
May I suggest you can't use Google? Ultras pop on and off sites every few days. The easiest to get your hands on at the moment seems to be eVGA's version. Again, I could give two shits about this ATI vs. nVidia thing. This is coming from someone who has been doing extensive searches for both cards over the past two months.

Both are still hard to find. People are still buying XTs as soon as they hit the shelves, much like Ultras keep flying off the shelves.

The moment either shows up in stock on a website, they're gone fast. I've seen XTs available and then gone just as quickly. I see people posting all the time how they just got their XTs. There's huge demand for both cards.
 
Inglix_the_Mad said:
The "cheaper chip", according to ATi, has more problems getting out of the foundry? It's harder to get it out of the foundry because of the clock speed maybe? Cheaper chip? Yeah until you factor in failure rate at the foundry I bet. That is the *only* logical explanation why you cannot get an XT. Nothing else fits, unless they just sold their stock to a company that doesn't plan on selling them.

ATi's Second problem: C'mon, you didn't see the GT coming? Your Pro is getting spanked simply because you went cheap. Did you have to lower the Clock Speed and cut the number of pipes? You followed your predictable strategy and nVidia spanked you for it.

ATi's Third problem: Anyone ever try overclocking their XT's without water-blocking? The kid who bought the XT from me came back and bought a TT Aquarius 3 and Video Block, because he couldn't bump it 10MHz without crap mystically appearing on his desktop and his game refusing to run. He waterblocked it and I haven't heard back from him yet, but he's supposed to come in and tell me how that went ;)

Correct me if I'm wrong, but the Pro and XT are the same board, no? I mean, the 16 pipes are there, just not enabled or cut. I think the only difference is the Theater chip, but the Pro VIVOs have them.

I have a Sapphire X800 Pro VIVO that I picked up for $390. I'm very pleased with it. I guess 1280x1024 2xAA and 8xAF in Doom3 is a really low setting to play at, but I'll just have to cope with that.

Now ATi vs Nvidia is good for everyone. I've owned both, but only recently switched to ATi. I laugh when people write these messages about how Nvidia destroys ATi or vice versa like they are the ones who did the work. Seriously, when you go into what the opposition can't or doesn't do, you're immediately shutting off most folks. If you explain what your company of choice CAN do, people will be more receptive.
 
All this nVidia !!!!!! bullshit is making me upset. You all forget that if it wasn't for ATI, nVidia would still be producing utter shit like the 5800s. ATI put the hurting on with the R3xx series chips. That pressure was enough for nVidia to get the sand out of their heads and stop resting on their laurels. I am an ATI !!!!!!, but I will say that nVidia has won the first battle in a long time with the NV40 and NV43. Kind of reminds me of when the Yankees lose to the Red Sox and the Red Sox fans get all anal. Calm the f*ck down, enjoy the innovations each company is coming out with, and stop being immature zealots.
 
covertclocker said:
All this nVidia !!!!!! bullshit is making me upset. You all forget that if it wasn't for ATI, nVidia would still be producing utter shit like the 5800s. ATI put the hurting on with the R3xx series chips. That pressure was enough for nVidia to get the sand out of their heads and stop resting on their laurels. I am an ATI !!!!!!, but I will say that nVidia has won the first battle in a long time with the NV40 and NV43. Kind of reminds me of when the Yankees lose to the Red Sox and the Red Sox fans get all anal. Calm the f*ck down, enjoy the innovations each company is coming out with, and stop being immature zealots.


another noobie to go to my ignore list
 
The Batman said:
Instead of talking down to me why don't you put your damned money where your mouth is. You wanna win me over you post a link with an XTPE in stock at MSRP. If you can't do that...go home titto.

LOL, actually doh-nut, that was kinda funny. But guys, calm down.
 
I also find it funny how people (especially nvidia owners) are a little too passionate about the benchmarks. When Doom 3 came out and did very well with nvidia cards, no one wanted to even think "Hey, maybe ATI will work on drivers to help close that gap" - which isn't unreasonable or impossible (Far Cry for nvidia).

Well, the thing is, we've been seeing 'Catalyst' drivers for several years now, and nVidia's 'ForceWare' for about a year and 'Detonators' a few years before that.

I don't know what it says about nVidia's driver development team (not good), but the original driver that comes with a card is...functional. It performs okay, but EACH AND EVERY SINGLE CARD nVidia has put out has had at *least* 3 driver releases after that that improve performance by 10% or better. There is no exception - the 60.xx/61.xx delivered the first bump for the 6800s, the 65.xx (whatever they WHQL as) look to be providing #2, and we thus all expect at least one more bump by Christmas. It's happened every other time (what, 8? 9? times?); it'll happen again.

ATI, on the other hand, has no such luck with theirs. Or, rather, they have better luck with theirs - at release. The final Catalyst build that 'improves performance' over the initial release has a maximum boost of around 10% in a couple of example apps - not even 'across the board' as the Dets/ForceWare do.
 
milling_hordesman said:
hmm... yes, the GT beat the Pro, but isn't the GT nVidia's 16-pipe non-ultra card, while the Pro is ATi's 12-pipe card? wouldn't it be more fair to compare the XT and the GT, not the Pro and the GT? Eh, whatever.


Price-wise, no, the XT doesn't compare to the GT. The XT compares to the Ultra, and the XT PE compares to the Ultra Extreme.
 
dderidex said:
Well, the thing is, we've been seeing 'Catalyst' drivers for several years now, and nVidia's 'ForceWare' for about a year and 'Detonators' a few years before that.

I don't know what it says about nVidia's driver development team (not good), but the original driver that comes with a card is...functional. It performs okay, but EACH AND EVERY SINGLE CARD nVidia has put out has had at *least* 3 driver releases after that that improve performance by 10% or better. There is no exception - the 60.xx/61.xx delivered the first bump for the 6800s, the 65.xx (whatever they WHQL as) look to be providing #2, and we thus all expect at least one more bump by Christmas. It's happened every other time (what, 8? 9? times?); it'll happen again.

ATI, on the other hand, has no such luck with theirs. Or, rather, they have better luck with theirs - at release. The final Catalyst build that 'improves performance' over the initial release has a maximum boost of around 10% in a couple of example apps - not even 'across the board' as the Dets/ForceWare do.


Very true; ATi's drivers are very mature, since these cards are based on the R300, so they can't milk out much more performance.
 
Corleonee said:
Well from all the benchies i seen since these high-end cards are out this is my conclusion:

ati > nvidia in direct3d
nvidia > ati in opengl

As most games are in Direct3d my vote goes for ATI :)


Right, and get your head out of the sand: do the tests without AA and AF if you want to see DX performance.
 