HardOCP ATI CrossFire Preview

I've seen a bit of flaming from both of the fanboy camps.

1. It would be stupid for nVidia to intentionally sabotage the NF4 SLi + Crossfire combo. Fewer motherboard sales for them.
2. It would be stupid for ATi to intentionally sabotage the NF4 SLi + Crossfire combo. Fewer video card sales for them.
3. nVidia has a more mature and better-featured chipset, get over it. They've been working on chipsets since 2000; ATi didn't release any chipsets until what, 2003? Even then they weren't targeting the high end. I would be disappointed if nVidia's solution wasn't better, given how much longer they have had in this market.
4. Crossfire is a terrible initial investment at this point. Unless you already own an X850 PCIe, there is no point in investing in THIS generation considering that the NEXT generation of nVidia will be here before Crossfire. And Crossfire will cost approximately the same as a 7800GTX. (Especially if you consider that you can use your current PCIe board with the 7800)
 
FanATIc said:
Would appear ATI did a lot more work on that northbridge than anyone knew. It has to be a rather complex piece of hardware to be so versatile. I highly doubt it's a fluke. Though I guess we now know where the "needs a bridge" / "doesn't need a bridge" comments came from.
Kyle made an update to the front page on this: DFI is effectively having the two cards render alternate frames without ever putting them back together.
 
^eMpTy^ said:
What can I say...I'm a !!!!!!...I admit it...I hate ATi for absolutely no reason...sue me

^eMpTy^ said:
but let's be honest...I wouldn't buy an ATi card if it came with a free bj and a porsche...

And then you say this...

^eMpTy^ said:
I would happily switch over to ATi if they came out with the better product...and R520 may be that product...

Why do you expect anyone to believe any of your ramblings, with posts like these? I mean, really.
 
DFI has been showing two slave cards running in a dual configuration at Computex as seen above, and they have in fact been showing 3DMark 2005 scores that increase by nearly 80% when they turn on "CrossFire" in the driver. Well, to make a long story short, the two slave cards are sharing the workload and nearly doubling the 3DMark 2005 score, but you are only seeing every other frame, supplied by one of the cards. So keep in mind that rendering double the frames is pretty "easy," but putting them back together again is not just going to happen by accident.

Anyway, chalk the confusion up to pre-release hardware and drivers. Kudos to DFI for finding something cool to share, but don't ever think you are going to reap the rewards of CrossFire rendering without a CrossFire hardware compositing engine to put all the frames or tiles back together again.

And just for the record, DOOM 3 showed absolutely no benefit from the configuration as you might guess.
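
To make the front-page point concrete, here is a minimal sketch (purely illustrative; the function names are made up and this is not ATI's actual implementation) of why splitting the workload between two cards is the easy half, and why re-merging the two frame streams needs a dedicated compositing step:

# Purely illustrative sketch of alternate-frame rendering vs. compositing.
# All names are hypothetical; nothing here reflects ATI's real hardware.

def split_workload(frames):
    """Alternate-frame rendering: card A takes the even frames, card B the odd ones."""
    card_a = [f for i, f in enumerate(frames) if i % 2 == 0]
    card_b = [f for i, f in enumerate(frames) if i % 2 == 1]
    return card_a, card_b  # raw throughput (and a 3DMark score) roughly doubles

def composite(card_a_frames, card_b_frames):
    """What the CrossFire master card's compositing engine has to do:
    re-interleave both streams into one sequence for the display."""
    merged = []
    for a, b in zip(card_a_frames, card_b_frames):
        merged.extend([a, b])
    return merged

frames = list(range(8))
a, b = split_workload(frames)
print(composite(a, b))  # [0, 1, 2, 3, 4, 5, 6, 7]

# Without composite(), the monitor plugged into card A only ever shows
# card A's half of the frames, which is what the DFI demo appears to do.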
 
tranCendenZ said:
hopefully the R520 will be able to do SLI without the messy external cables.

Why is one extra cable on the back considered messy?

^eMpTy^ said:
My only bitch with Crossfire right now is that the x850s/x800s don't have FP blending and PS3.0, so they make a very unwise high-end videocard investment right now...everything else about Crossfire is just peachy...

You keep forgetting that xfire adds better AA options. I have a PCIe X800 XL atm. I am an AA junkie. So for under $500-ish I can upgrade to xfire and get better AA modes and more frame rate than I have now. How is that a bad move? Granted, the G70 may be faster, but given the fact that NV has really fallen behind on AA for years I don't expect anything better AA-wise out of them. Don't get me wrong, I think the G70 will rock. And I think that xfire and SLI are both very niche markets. But saying it's a terrible investment is a bit shortsighted....
 
fallguy said:
cman, don't forget you got that from the front page, not yourself.

I was talking about it last night...long before it was posted on the front page...not that I care, mostly just being silly...
 
Why doesn't ATI just put that compositing chip on all their cards and get rid of this master/slave nonsense? I think it's purely about cash, so they can charge an extra $100 or $200 for the master version of the exact same card.
 
JL_Audio_User said:
And of course you have an Nvidia card in your sig.....and an SLi board
No bias there..... :rolleyes:

I go with whatever manufacturer has the best deal at the time.

I've gone from an XpertXL > TNT2 > GF2 > GF3 > R7000 > GF4 > R9700 > GF6600

I got a really good deal on my SLi board and video card, totaled $200 for both of them. Simple as that. I'm actually a little bit biased in favor of ATi, but nVidia had the best deal. Besides, name one A64 chipset that is better for PCIe than the NF4.
 
^eMpTy^ said:

Nothing in your posts looks sarcastic to me. You've made comments like that many times.

^eMpTy^ said:
I was talking about it last night...long before it was posted on the front page...not that I care, mostly just being silly...

I wasn't replying to you. cman posted news from the front page, without a quote or link, which made it look like it was his own.
 
fallguy said:
Nothing in your posts looks sarcastic to me. You've made comments like that many times.



I wasn't replying to you. cman posted news from the front page, without a quote or link, which made it look like it was his own.

That's how you perceive it. It's the content that matters, not where it came from. It's too early to say how Crossfire will really work anyway. All I'm saying is it's going to be expensive, unlike Nvidia where you can use (and can get) cheaper cards for SLI.
Everything right now on Crossfire is speculation. You can ignore my posts if they bother you, or better yet put me in your kill file.
 
fallguy said:
I wasn't replying to you. cman posted news from the front page, without a quote or link, which made it look like it was his own.

Right, I had just posted something that would have warranted such a response...and yet now I can't find that post...so I must just be really confused or something...
 
Dew said:
And Crossfire will cost approximately the same as a 7800GTX. (Especially if you consider that you can use your current PCIe board with the 7800)
You can? Sorry I'm late to the party and read through a couple pages of posts, but didn't see anything about using current PCIe cards with the G70. If that's the case, realistically, we can go for days about which solution is better (which is part of the reason this thread is so long; the other part being useless flaming). IMO, the best option if you have an existing card is to stick with your existing brand. i.e. if you have an X850 XT PCIe (which happens to be the fastest card available now), buying an xfire mobo and master card would be a hell of a good investment. If you are doing a full system upgrade, you really can't go wrong. People get too fired up about ATi vs nVidia being better, when really both companies have their advantages, and both companies make damn fine cards. Just buy the one you want and stop trying to make everybody else do the same.
(BTW, nvidicrap suxxorz) :p

 
IMO the SLI implementation is very messy. That you need to have special identifiers so that certain games will run is not a good thing.

I'm not particularly happy with CrossFire either. Although as previewed it looks cleaner than SLI (you don't need special drivers/application detection for it to run), older games still do not get the same benefit as newer ones. This tells me it's not a true native dual-GPU solution.

Both Nvidia and ATi still have a ways to go in the multicore/card arena. All in my opinion.
 
With the two boards in there can you actually use any of the other PCI slots? It seems like all you can have is the video cards and nothing else. I would want to have at least a sound card and my SCSI card in there. Is that going to be a possibility?
 
jebo_4jc said:
IMO, the best option if you have an existing card is to stick with your existing brand. i.e. if you have an X850 XT PCIe (which happens to be the fastest card available now), buying an xfire mobo and master card would be a hell of a good investment. If you are doing a full system upgrade, you really can't go wrong.

That's a terrible investment. Why the hell would you buy a new motherboard and video card to support your old video card? You would be better off moving to the next-gen cards (G70, R520) and selling your 850.

ATI should not have wasted their time releasing CrossFire for their current generation of cards. They should have released it alongside the R520 and with their new chipset.
 
PRIME1 said:
That's a terrible investment. Why the hell would you buy a new motherboard and video card to support your old video card? You would be better off moving to the next-gen cards (G70, R520) and selling your 850.

ATI should not have wasted their time releasing CrossFire for their current generation of cards. They should have released it alongside the R520 and with their new chipset.
You are exactly right. I should have specified that I was thinking the X850 XT is a good card now, and you can buy a new mobo and an R520 master card later and still reap the benefits of the X850 XT. Sorry.

Edit: Although, if the rumors we hear are true, an X850 XT will seem like an X300 compared to the R520/G70. So the benefit may not be as great as it seems.

 
^eMpTy^ said:
QFT...on the front page...told ya it sounded hokey...

Interesting, just read Kyle's update. No performance benefit in DOOM 3 with that configuration. That settles it then :)
 
BoogerBomb said:
Why doesn't ATI just put that compositing chip on all their cards and get rid of this master/slave nonsense? I think it's purely about cash, so they can charge an extra $100 or $200 for the master version of the exact same card.

So that everyone who already owns an X800 or X850 series video card, and all the cards that are still on the shelves, can go ahead and set up a dual-card system.
 
ZenOps said:
I'm not particularly happy with CrossFire either. Although as previewed it looks cleaner than SLI (you don't need special drivers/application detection for it to run), older games still do not get the same benefit as newer ones.

How do you know? We have not seen benchmarks or performance results for a CrossFire solution that didn't come from ATI. As Kyle mentioned, older games could be run with much higher resolution and AA settings versus what they were before. I still play RTCW and wouldn't mind going to 1600x1200 with the highest AA and AF if possible. Quake 2 & 3 should zip right along (not that they don't already).
 
ZenOps said:
older games still do not get the same benefit as newer ones. This tells me it's not a true native dual-GPU solution.

Well, an older game that is CPU-limited rather than video-card-limited would obviously not receive a performance benefit from two video cards with either system. That's just the nature of the beast. At least with CrossFire you can run super high AA modes in them (12X MSAA / 2X SSAA).
 
R1ckCa1n said:
I started to read through this thread but came to the quick realization that empty has "RUINED" it.

The board looks nice and overclocks like a champ, but I will wait for the dual R520s.


No sh!t. Normally I actually like his banter ( :eek: ) but this time it just seems sad.

And yeah, I'm just waiting for the R520s too, to see what they do.
From there I'll decide if I want dual G70s or dual R520s.
 
^eMpTy^ said:
Well here's what I think about G70...nvidia held off on a refresh of the NV40 so they could beat ATi to the punch and release G70 ahead of R520...and then they'll quickly refresh G70 on 90nm and trump R520 when it's released...or at least...if I were them...that's what I would do...

The show's not over yet...ATi can still announce R520 at Computex...and if they do...all will be forgiven on my end...I just can't believe that they would launch Crossfire like this without R520...just doesn't make any sense...Dave over at B3D had similar sentiments...


I believe it's being reported that the G70 isn't going to be 90nm as planned and is going to be 110nm or something like that. Also a power hog. (Top of my head and could be very wrong.) I think it was posted at like 114 watts of power. Needs it from the PCI-e slot and an external power connector. I figured someone would have posted it.
 
fallguy said:
Nothing in your posts looks sarcastic to me. You've made comments like that many times.



I wasn't replying to you. cman posted news from the front page, without a quote or link, which made it look like it was his own.

And your point is?
 
Netrat33 said:
I believe it's being reported that the G70 isn't going to be 90nm as planned and is going to be 110nm or something like that. Also a power hog. (Top of my head and could be very wrong.) I think it was posted at like 114 watts of power. Needs it from the PCI-e slot and an external power connector. I figured someone would have posted it.
150W according to this. Interestingly, both the article FS linked to, as well as another article one of the posters linked to, have disappeared. Perhaps because it's a load of crap and these sites don't want to spread false information, but also perhaps it's true and nVidia doesn't want us to know that the G70 is a power-sucking, heat-producing monster (as their cards have tended to be in the past). ATi has typically seemed to hold an advantage in the power consumption/heat dissipation areas.

 
jebo_4jc said:
150W according to this. Interestingly, both the article FS linked to, as well as another article one of the posters linked to, have disappeared. Perhaps because it's a load of crap and these sites don't want to spread false information, but also perhaps it's true and nVidia doesn't want us to know that the G70 is a power-sucking, heat-producing monster (as their cards have tended to be in the past). ATi has typically seemed to hold an advantage in the power consumption/heat dissipation areas.


The 6800U and X800XT PE were only ~7 watts apart in power consumption at load. Both operated at around 75-80 watts. You can see what a leap even 115 watts is, let alone 140.
 
FanATIc said:
The 6800U and X800XT PE were only ~7 watts apart in power consumption at load. Both operated at around 75-80 watts. You can see what a leap even 115 watts is, let alone 140.
Anandtech:
[Anandtech power consumption charts: 5728.png, 5729.png]

I'd like to point out the 11W difference between the X800 XL and the 6800 GT. Not an insignificant difference between two of the most popular cards available right now.
Note: these figures are for a whole FX-55 system, not individual cards.
 
Brent_Justice said:
Well yeah it does matter where the info came from.
Well if that's the case, most of the posts in here are worthless. Nobody really knows how well it works or what will be required; it's all speculation until the shipping units are available.
 
CMAN said:
Well if that's the case, most of the posts in here are worthless. Nobody really knows how well it works or what will be required; it's all speculation until the shipping units are available.

You're missing the point. You copied and pasted news from the front page, without a link or saying what you did, making it look like you typed it yourself. It's not the same as someone sharing their opinion.
 
jebo_4jc said:
Anandtech:
[Anandtech power consumption charts: 5728.png, 5729.png]

I'd like to point out the 11W difference between the X800 XL and the 6800 GT. Not an insignificant difference between two of the most popular cards available right now.
Note: these figures are for a whole FX-55 system, not individual cards.


It's not really the watts that were crazy, but needing 26A on the 12V is quite a leap from the 18A the 6800 runs on. I'll venture to believe most people, even those with 6800s and X800s, don't have 26A on their 12V.

Btw, I was dead on with the X800 vs 6800U. :)
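
For reference, the simple arithmetic behind those rail figures (a rough sketch; the 18A and 26A numbers come from the posts above, the rest is just illustration):

def rail_power_watts(amps, volts=12.0):
    # Maximum power a single 12V rail can deliver at the given current.
    return amps * volts

print(rail_power_watts(18))  # 216.0 W - roughly what a 6800-class setup asks of the 12V rail
print(rail_power_watts(26))  # 312.0 W - the rumored G70 requirement, a much bigger ask of the PSU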
 
CMAN and fallguy, please just stop posting if you have nothing to contribute to this thread but petty bickering and flaming. I'm getting sick and tired of reading it. Thread unsubscribed.
 
fallguy said:
You're missing the point. You copied and pasted news from the front page, without a link or saying what you did, making it look like you typed it yourself. It's not the same as someone sharing their opinion.
No, you miss the point. Don't read my posts. You don't understand what I'm saying; please put me on your ignore list. So what if I copied and pasted? That's what those functions are for.
Show me the rule where you can't do that; give me a link or copy and paste it.
 
So - does anyone know the actual date when the public will be able to buy a CrossFire setup? Mobo and cards?
 