HardOCP's ATI CrossFire Evaluation

Nice review guys. The refresh thing is a bummer.

I'm glad I just got the N7800 GTX TOP. I was waiting, but I had a feeling.
This is one fast card, and boy, it looks incredible.
 
First of all, great review.


Second of all, I cannot believe that anyone is touting the old "don't compare the 7800 GTX with the X850." Of course that is what HardOCP is going to compare, because that is what is currently on the market. You compare products at the same PRICE POINT. At the moment the 7800 series is at the same price point as the X850, so that is what it should be compared to. And CrossFire gets annihilated. Anyone who would buy a CrossFire setup right now has a few screws loose, IMHO. Maybe when the R520 comes out they'll have something to show, but now? Why did they even launch the thing? No one is going to buy it.
 
The way CrossFire was implemented is completely unsatisfactory. No one to blame but ATI itself.

In the rush to counter NV marketing, they jumped at a non-solution.

Given that we all by now understand that NOTHING on the motherboard is required in order to support SLI or XFire, the requirement for a "special" motherboard for either one is total marketing BS, anti-consumer, anti-enthusiast. I say to both companies: FK U on that point.

Any motherboard with 2 x16 connectors (operating at x8 and x8) is all that is needed outside the video cards and drivers to implement "SLI/XFire"-ness.

Any assertion by either company to the contrary is utter, total BS, bald-faced LIES.

I will stick with my DFI NF4 SLI board because it is simply THE BEST A64 board out there so far. I get the most out of my A64 chip and my memory using this board, PERIOD.

ATI chose NOT to put a bridge connection on their video cards, forcing them to use the DVI connector to make the interconnect between boards. They were NOT precluded by patent/legal constraints from using a bridge of some kind. They CHOSE not to.

OK, then wake up and BE SMART. Having made your XFire a DVI-based interconnect, simply go a step further: make an external BLACKBOX with 2 DVI-I inputs and a DVI-I output.

Inside the BBox is a frame buffer and a control chip; maybe use a USB connection to allow the PC to "configure" the BBox, or just embed info in the DVI signal, whatever.

The BBox would just take the 2 outputs and knit them back together into a single monitor signal.

ANY TWO video cards with DVI-I could then be used for "SLI-ness". Only the driver need be altered to "KNOW" that the video frame is to be divided up (by a host of methods, AFR, etc.) across the 2 cards and output, and the BBox outside will knit them back together appropriately.
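
Just to make the "knitting" concrete, here is a rough sketch in Python of what the BBox compositing step would amount to. The function names, the top/bottom split, and the buffer shapes are purely my own illustration, not anything ATI actually ships:

Code:
import numpy as np

def knit_split_frame(top_half, bottom_half):
    """Split-frame mode: card A rendered the upper portion of the frame,
    card B the lower; the BBox stitches the two DVI inputs back into one
    full frame for the single monitor output."""
    if top_half.shape[1] != bottom_half.shape[1]:
        raise ValueError("both cards must render at the same width")
    return np.vstack([top_half, bottom_half])

def knit_afr(frames_a, frames_b):
    """AFR mode: whole frames alternate between the two cards, so the BBox
    just interleaves them onto the output."""
    for frame_a, frame_b in zip(frames_a, frames_b):
        yield frame_a   # even frame from card A
        yield frame_b   # odd frame from card B

# Example: two 1600x600 halves become one 1600x1200 frame.
top = np.zeros((600, 1600, 3), dtype=np.uint8)
bottom = np.full((600, 1600, 3), 255, dtype=np.uint8)
assert knit_split_frame(top, bottom).shape == (1200, 1600, 3)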

Piece of FKing Cake. Moreover, the BBox would have OTHER uses as a product. An example: notice how HDTVs have only ONE DVI input? What if you want to use your HDTV with BOTH an HDTV TiVo cable/DirecTV box and also with your HTPC?

What, you get up and swap cables all the time? Use some DVI switch device? Or the BBox, which could even (under the control of your HTPC) implement picture-in-picture with 2 DVI sources to one HDTV.

Further, the 2 video cards could output at their max resolutions/refresh rates, and the BBox can use the NEW TMDS transmitter that handles the high-res, high-refresh output that the individual cards can't quite handle. So CURRENT production X850s would be ready to go, with no "SPECIAL" cards needed, etc.

I chose NOT to be a member of TEAM SPED. ATI's new team... TEAM SPED.

Word to ATI: nobody's gonna buy this crap. And no one is going to toss a perfectly good NF4 motherboard JUST to be able to use your silly, convoluted, botched Crossfire nonsense. It ain't happening; talk about being IN DENIAL.

Hint: the only way you could get me to move would be to make a FAR SUPERIOR motherboard design... which you haven't come close to.


And this is all from the owner of 2 X850 XT PE PCIe cards and 1 X800 XT PE AGP, and 0 NVidia cards. If you can't sell ME on any of this, you have already LOST the battle. Knock off the BS and get your lazy, good-for-nothing asses in gear.
 
A 60Hz refresh rate is only bad if you have a CRT. I am completely confused why this limit even matters when using DVI...

people "petting" their widescreen LCDs and whatnot wondering about this "limit"... do you people not use the DVI connection?
 
Yashu said:
A 60Hz refresh rate is only bad if you have a CRT. I am completely confused why this limit even matters when using DVI...

people "petting" their widescreen LCDs and whatnot wondering about this "limit"... do you people not use the DVI connection?


Not that many LCDs will do 1600 res (for an affordable price), but CRTs can with no real penalties. Many gamers still prefer CRTs (and like playing at 1600 res), as you don't have to worry about LCD timings and swapping resolutions. Therefore, running at anything less than 75Hz at 1600 is a big no-no (or headache).

I won't be giving up my Iiyama CRT for some time yet.
 
I'm really surprised adding another $300 video card only gets you another 2x AA. I mean, what's the point? :confused:
 
well as I said, I understood the gripe when it was coming from a CRT user. hehehe...

See, here is the problem I see... written all over ATI's products, features, drivers, etc., is the word n00b!!! in giant letters.

Their whole company "image" looks like it was designed by a 13 year old kid that loves "transformers" or something.

Are you a n00b on a 1024x768 CRT (but just bought $1000s of new hardware)? Great! We have the perfect system for you! Do you like red and black and hardware that looks like it came from Walmart? We have the PERFECT system for you! Do you *hate* having to choose a driver? We have the perfect system for you! If you are a n00b and don't care about OpenGL... (Microsoft pwns?) We have the perfect system for you! Do you want a watered-down solution, like the beer at the Astrodome? Boy, do we have the system for you!

ATI is shortsighted once again.

When one spends that much cash on something it should have some quality and class. Right now ATI is more like the official videocard of NASCAR or something...
 
I'm puzzled: is the refresh limitation only on the DIGITAL signals of the DVI, or does it (under Crossfire) also hit the analog output as a limitation?

DVI-I is simply DVI-D with the addition of analog VGA RGB signals in the same connector... the analog signals COULD have been on their own connector, but the back panel on our cases really only leaves room for ONE VGA, ONE DVI, and something like the VIVO/S-Video. Technically an X850 could have 2 DVI-D and 2 VGA HD15 connectors as well as the S-Video; instead they pack them into, say, 2 DVI-I connectors and you use a DVI-I-to-VGA adapter to break out the analog signals.

So is the limit in the "compositing" chip, which would affect BOTH the digital AND analog final output? Or only in the digital DVI transmission of the final result?

I.e., are analog CRTs still free to display 1600x1200 @ 75-80Hz as normal, or not?

Again, the entire thing is a botched abortion, and simply not up to the capabilities of the ATI engineering staff. What a mess.
 
I'm still trying to figure out why neither company has moved to a dual-core chip. Maybe I have missed something, but the move to dual core in CPUs has met nothing but enthusiasm, not to mention that such a solution solves other problems such as the dongle and SLI bridge. As I understand it, ATI is partnered with AMD in some way, so why not go to them for dual core and implement it as one elegant, simple solution? Perhaps next year will be the time for that move. I also hope that I am not the only one with this thinking.
 
Good review! ...spend a little more and get a little less... that seems to sum it up for me... looks like CrossFire is not done baking to me... but in all fairness, I think similar things were said about SLI when it first emerged... for the 2nd time. heh..
 
Willsonman said:
I'm still trying to figure out why neither company has moved to a dual-core chip. Maybe I have missed something, but the move to dual core in CPUs has met nothing but enthusiasm, not to mention that such a solution solves other problems such as the dongle and SLI bridge. As I understand it, ATI is partnered with AMD in some way, so why not go to them for dual core and implement it as one elegant, simple solution? Perhaps next year will be the time for that move. I also hope that I am not the only one with this thinking.

GPUs are already "multiple core" in a sense, with all their different levels of pipelines and features, etc....

Maybe you mean dual GPUs on one PCB?
 
Crossfire can be summed up with one word: kludge.

I can't think of any reason at all why anybody would choose to build a crossfire system over SLI.
 
What would be sweet is if graphics card and motherboard manufacturers utilised dual-core CPUs properly. I mean, if those folk with AMD64 X2s (and the Intel equivalent) had a CPU core running for each graphics card, surely that Would Be A Good Thing.

SLI and CrapFire use only 1 core of an X2 chip, while the other core plods on doing background stuff or nowt at all. Think of the performance a single 7800 GTX has on a single CPU core, and then 7800 GTX SLI on a single core (with the CPU holding it back). Then imagine if each 7800 GTX in SLI had a core almost to itself!

Both companies are missing a trick here I reckon. Perhaps that's the future.
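
To illustrate the sort of thing I mean, here is a toy sketch of the structure only, not how any real driver actually works; the queue-per-card scheme and the names are made up:

Code:
import queue
import threading

def gpu_worker(card_name, work):
    """Pretend driver thread dedicated to one card: drains the frames
    queued for that card. The point is that the OS could park each of
    these threads on its own X2 core."""
    while True:
        frame = work.get()
        if frame is None:        # shutdown sentinel
            break
        # a real driver would build and submit the command buffer here
        print(f"{card_name} rendered frame {frame}")

# One queue plus one worker thread per card.
queues = {name: queue.Queue() for name in ("card0", "card1")}
workers = [threading.Thread(target=gpu_worker, args=(name, q))
           for name, q in queues.items()]
for w in workers:
    w.start()

# AFR-style split: even frames go to card0, odd frames to card1.
for frame in range(8):
    queues[f"card{frame % 2}"].put(frame)
for q in queues.values():
    q.put(None)
for w in workers:
    w.join()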
 
ATI shouldn't have bothered with a dual-card solution; rather, as "Willsonman" was trying to say, they should make a card that has a 2nd GPU and really have an SLI killer.

Instead they invested a pile of money in a system that is being considered a failed idea right off the bat.


CROSSFIRE SHOULD HAVE ONLY BEEN RELEASED WITH THE X1800!
 
Brent_Justice said:
GPUs are already "multiple core" in a sense, with all their different levels of pipelines and features, etc....

Maybe you mean dual GPUs on one PCB?

Multiple GPUs, yes, but on the same die... like the X2. Just enrich the level of the card rather than reuse what has already been done. I think this launch really should have pushed the technological envelope a bit more. Perhaps dual GPUs on one card, such as what Gigabyte has done with the 6600 and now the 6800 in SLI, but that still requires an SLI-chipset MoBo... what is the point in that? One card for one slot... not two... not to mention you have to get a Gigabyte MoBo. I like the ideas that are put out there by both companies, but I will continue to refuse the dual-card configuration for myself. I do love HL2, but my X700 will have to do for now. I'm just not excited about either solution at the moment, given the technology available and the products as a result. Maybe I am picky, but that is how I feel, and God bless America for letting me feel that way!!
 
the review became worthless in a couple of places b/c the controls weren't implemented properly. You need to keep the AA and AF the same across all cards, and if a card won't support it then ditch the benchmark altogether, b/c it's scientifically confounded by variables in anisotropy and anti-aliasing (referring to the Doom 3 benches).
 
AndoOKC1 said:
the review became worthless in a couple of places b/c the controls weren't implemented properly. You need to keep the AA and AF the same across all cards, and if a card won't support it then ditch the benchmark altogether, b/c it's scientifically confounded by variables in anisotropy and anti-aliasing (referring to the Doom 3 benches).

Isn't that what the Apples-to-Apples tests were for? [H] reviews of video cards always follow this format: different settings that result in similar speeds, then the same settings that result in different speeds.
 
AndoOKC1 said:
the review became worthless in a couple of places b/c the controls weren't implemented properly. You need to keep the AA and AF the same across all cards, and if a card won't support it then ditch the benchmark altogether, b/c it's scientifically confounded by variables in anisotropy and anti-aliasing (referring to the Doom 3 benches).


The minute it showed/mentioned dongles, it was all over, so I wouldn't be too concerned about that. :)
 
You guys need to stop and think for a minute. Given current manufacturing materials and technology, there is no way you could make a dual-core chip, like the X2, out of current video card chips. Right now the cores are already 300+ million transistors. Can you even begin to imagine what it would be like to get yields on a single chip with 600+ million transistors?

It's nice to dream, but don't expect it to happen any time soon.
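
Rough numbers, if you want them. This uses the textbook first-order Poisson yield model, Y = e^(-A*D0); the defect density and die area below are invented just to show the shape of the problem, not real process data:

Code:
import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    """First-order yield model: Y = exp(-A * D0)."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

D0 = 0.5               # assumed defects per cm^2 (illustrative only)
single_die = 2.9       # assumed die area in cm^2 for a ~300M-transistor GPU
double_die = 2 * single_die

print(f"single-GPU die yield: {poisson_yield(single_die, D0):.0%}")  # ~23%
print(f"doubled die yield:    {poisson_yield(double_die, D0):.0%}")  # ~6%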

To the X-Fire naysayers: I notice most are NV and/or SLI owners. Better stop and look at SLI's history. It most certainly didn't hit the ground running at 100%, and some problems, such as widescreen support, were only very recently fixed. Also, just because ATi opted to include the X8xx series as supported in no way represents the entire package that X-Fire has the potential to be once the R520 comes along.

I think there are a few nvidiots that are a little jealous that they no longer have the only SLI solution and can claim that as a major selling point to go NV when people ask/argue over product recommendations.
 
Un4given said:
You guys need to stop and think for a minute. Given current manufacturing materials and technology, there is no way you could make a dual-core chip, like the X2, out of current video card chips. Right now the cores are already 300+ million transistors. Can you even begin to imagine what it would be like to get yields on a single chip with 600+ million transistors?

It's nice to dream, but don't expect it to happen any time soon.

I think it should be relatively feasible to work the problem, given that the current AMD Athlon 64 X2 has 233.2 million transistors http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2484
and the others of the X2 series follow suit.
http://www.simhq.com/_technology/technology_042a.html
I think that the X800 XL, with its 160 million transistors, could possibly push technology a bit further and make a larger die to accommodate current manufacturing processes up to the 320 million mark. Cooling is certainly an issue with an enlarged core, but when ISN'T cooling an issue with the latest cards? I feel the technology that is available can create a card that is worthy of competing in the SLI/CrossFire arena and give BOTH options a good run for their money.
 
AndoOKC1 said:
the review became worthless in a couple of places b/c the controls weren't implemented properly. You need to keep the AA and AF the same across all cards, and if a card won't support it then ditch the benchmark altogether, b/c it's scientifically confounded by variables in anisotropy and anti-aliasing (referring to the Doom 3 benches).
You are confusing the goal of the benchmarks. The goal isn't to show which card produces faster results with the controls held the same. The goal is to keep "playability" constant (let's say a 50fps average) and modify the other controls to see which card can meet the "playability" standard with the highest values of those controls (res, AA, AF).

Really, the way [H] does it, they don't even need FPS graphs anymore. Their graphs might as well look like this:
http://www.whelehonconsulting.com/gallery/main.php?g2_view=core:downloadItem&g2_itemId=2141&g2_serialNumber=1
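
If you wanted to automate that idea, the search is simple enough. A minimal sketch follows; the settings ladder and the toy fill-rate cost model are made up for illustration and are obviously not [H]'s actual tooling:

Code:
from itertools import product

RESOLUTIONS = [(1024, 768), (1280, 1024), (1600, 1200)]
AA_LEVELS = [0, 2, 4, 6]
AF_LEVELS = [0, 8, 16]

def measure_avg_fps(fillrate_budget, resolution, aa, af):
    """Toy stand-in for an actual benchmark run: pretend performance is a
    fixed budget divided by a pixel/filtering cost."""
    pixels = resolution[0] * resolution[1]
    cost = pixels * (1 + 0.15 * aa) * (1 + 0.03 * af)
    return fillrate_budget / cost

def highest_playable(fillrate_budget, target_fps=50):
    """Walk the settings from heaviest to lightest (resolution first, then
    AA, then AF) and return the first combo that still averages at least
    target_fps."""
    for res, aa, af in reversed(list(product(RESOLUTIONS, AA_LEVELS, AF_LEVELS))):
        if measure_avg_fps(fillrate_budget, res, aa, af) >= target_fps:
            return res, aa, af
    return None

print(highest_playable(2.2e8))   # e.g. ((1600, 1200), 6, 0) with this toy model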
 
jebo_4jc said:
You are confusing the goal of the benchmarks. The goal isn't to show which card produces faster results with the controls held the same. The goal is to keep "playability" constant (let's say a 50fps average) and modify the other controls to see which card can meet the "playability" standard with the highest values of those controls (res, AA, AF).

Really, the way [H] does it, they don't even need FPS graphs anymore. Their graphs might as well look like this:
http://www.whelehonconsulting.com/gallery/main.php?g2_view=core:downloadItem&g2_itemId=2141&g2_serialNumber=1

If I could get a static frame rate across the cards, that is exactly the way I would do it. :)
 
Please stop talking about dual-core GPUs. As Brent said, the current X800s are already quad-core, if you want to think in terms of multi-core.
 
Also, I will say, if you do not like the way we evaluate how well video cards play games, I can certainly appreciate you not finding value in it. I kindly invite you to investigate the tons of other links we post on a daily basis so you can find sites that give you exactly what you need.
 
well, now I understand the purpose and it makes sense... however, maintaining a static frame rate by adjusting the resolution, AA, and AF is impossible to achieve, b/c in some instances you might need odd settings such as 1245x1077, 3.75x AA, or 7.25x AF for a card to consistently run 50fps. Since software doesn't really allow for this, it seems irrelevant to try to bench this way, b/c there isn't precision. I'm not trying to push for a change in the methodology, as I don't really care how they bench, but since this is a thread evaluating their evaluation, I thought I would comment on the methodology of the bench and its imperfections. I don't know how many people here have written research papers and had them peer reviewed, but if you have, then you understand where I'm coming from. :)
 
AndoOKC1 said:
well, now I understand the purpose and it makes sense... however, maintaining a static frame rate by adjusting the resolution, AA, and AF is impossible to achieve, b/c in some instances you might need odd settings such as 1245x1077, 3.75x AA, or 7.25x AF for a card to consistently run 50fps. Since software doesn't really allow for this, it seems irrelevant to try to bench this way, b/c there isn't precision. I'm not trying to push for a change in the methodology, as I don't really care how they bench, but since this is a thread evaluating their evaluation, I thought I would comment on the methodology of the bench and its imperfections. I don't know how many people here have written research papers and had them peer reviewed, but if you have, then you understand where I'm coming from. :)

They try for the best "playable IQ" in the [H] reviews: the highest res with the highest quality settings, highest AA, highest AF, etc., while still keeping playable FPS. Not flawed, just different. I prefer it that way myself. They have an apples-to-apples comparison up as well, although true "apples to apples" is not really possible anymore.
 
AndoOKC1 said:
different is fine, but you might as well stick as close to the scientific method as possible.

a "scientific method" would then not be testing the "gaming experience" delivered by each card

the way we test is exactly the way all gamers go about their gameplay when they play a game

you install your game and then what do you do? you go in and find the best resolution, the best aa/af settings, the best in-game settings to play at; you choose the highest quality settings you can without dropping below your level of acceptable performance for said game

by using this same model when comparing video cards we can see which one allows a higher level of gameplay experience
 
well, if this is the case and you are trying to demonstrate the gamer's experience, why are you using the FX-55 and Raptors? This is hardly a good representation of a typical gamer here at HardForums. So the review tells me that in order to get around 50-something FPS at 1600x1200 with 4xAA/8xAF in a CrossFire setup in a certain game, I am going to have to shell out for an FX-55. Why not choose a 3200+ Venice overclocked to somewhere around 2.4GHz and then tell me what settings I should use to get to 50fps? A top-end processor like that is hardly a real-world demonstration, as the population of hardcore gamers with FX-55s is pretty small. Essentially, this is hardly showing me a contrast between the scientific method and the real-world application you are trying to achieve.
 
AndoOKC1 said:
well, if this is the case and you are trying to demonstrate the gamer's experience, why are you using the FX-55 and Raptors? This is hardly a good representation of a typical gamer here at HardForums. So the review tells me that in order to get around 50-something FPS at 1600x1200 with 4xAA/8xAF in a CrossFire setup in a certain game, I am going to have to shell out for an FX-55. Why not choose a 3200+ Venice overclocked to somewhere around 2.4GHz and then tell me what settings I should use to get to 50fps? A top-end processor like that is hardly a real-world demonstration, as the population of hardcore gamers with FX-55s is pretty small. Essentially, this is hardly showing me a contrast between the scientific method and the real-world application you are trying to achieve.

we don't want to bottleneck the video cards, we want them to achieve their fullest potential when gaming so that we are reviewing the video cards and not the cpu

with a fast cpu then we can look at the gaming experience being delivered and compare video cards to each other

for our mainstream video card testing we do use a more popular flavor, the 3500+

the fx-55 is used for the high-end enthusiast level video cards
 
Sounds good to me... but can you tell me how much a 3200+ @ 2.4GHz bottlenecks these video cards? I've always been confused as to exactly how much of a bottleneck exists with a nicely overclocked CPU. How about the CPU in my sig? It's very close to an FX-55. Will it give about the same bottleneck? Thanks for all the clarifications and explanations. :)
 
I need to stop and think about when enough is enough; it's a matter of which card works best at the high-ass resolution I'm going to spend the money to hit anyway. ATI, NVIDIA, it's the same old thing: I'm going to spend the cash and hit the resolutions I wanted, so WTF do I care about 10 fps for?
It's just dawned on me now how silly this is. Good luck to you, however, and with whatever is important to you.
 
the actual results were irrelevant to me. The point is that the methodology for achieving those results is skewed due to some imprecision and inaccuracy. As far as the best at high resolutions... as long as I get over 50fps with vsync on with my 2005FPW, I don't care which card it is. However, if two cards are roughly the same, then the benchmarks make a difference in choosing which one will give better framerates, especially if they cost the same. I'm not here to nitpick, but HardOCP has a good reputation, and it didn't come by doing what the average joe does. Processes like these are what give you a trusted website with reliable data. I understand that some may not enjoy discussing methodology, but I find it very important to HardOCP's credibility. I've seen people here bash other benching websites... why... b/c they thought they were inaccurately portraying the data. For HardOCP's sake, I hope they never go down that path.
 
AndoOKC1 said:
the actual results were irrelevant to me. The point is that the methodology for achieving those results is skewed due to some imprecision and inaccuracy. As far as the best at high resolutions... as long as I get over 50fps with vsync on with my 2005FPW, I don't care which card it is. However, if two cards are roughly the same, then the benchmarks make a difference in choosing which one will give better framerates, especially if they cost the same. I'm not here to nitpick, but HardOCP has a good reputation, and it didn't come by doing what the average joe does. Processes like these are what give you a trusted website with reliable data. I understand that some may not enjoy discussing methodology, but I find it very important to HardOCP's credibility. I've seen people here bash other benching websites... why... b/c they thought they were inaccurately portraying the data. For HardOCP's sake, I hope they never go down that path.

Credibility is not a fact, but an earned reward for doing something that people deem of value. If the [H] did the same thing as everyone else... why not just shut down the site? Just like the Atkins diet... it may not be the "approved" way to lose weight... but damn if it doesn't work. It has credibility.

The [H]'s method of holding the FPS near a constant and then trying to maximise settings around it is an excellent method in my book for most enthusiasts and average people. Once you get above a certain level of FPS, a large majority of people can't tell the difference. The whole 30-60 FPS window they try to focus on is exactly what needs to be done, with a focus on 60 FPS because of the enthusiast nature of this site.

Also, they do give apples-to-apples comparisons like all the other sites, thus showing another point of view. I think the [H] overall does a better job. They don't just run canned scripts, print graphs, and then try to make an advertising buck by hoping a bad click sends you to an ad. They try to understand what is going on and draw logical conclusions.

As for the whole dongle/installation thing, I think that is a point where I disagree with them. An enthusiast will never "bitch" about something so trivial. It is like cutting off your nose to spite your face. Crossfire has shown that, compared to a 6800 series, it is as good as, if not better than, SLI. It may not be as simple/fluid as an SLI installation... but in no way is it bad.

-tReP
 
Trepidati0n said:
As for the whole dongle/installation thing, I think that is a point where I disagree with them. An enthusiast will never "bitch" about something so trivial. It is like cutting off your nose to spite your face.

I agree with everything you said except this. To me, an enthusiast is ALWAYS picky about the little things. That's why they're (or we're) enthusiasts to begin with. Examples of this reasoning:

a.) We want a 12ms LCD instead of an old, crappy 16ms one.
b.) No way am I getting that slooooow CAS 3 RAM, I'm gettin' the CAS 2.
c.) I ain't paying Newegg $200 for that RAM, I'm going to Monarch and getting it for $198.
d.) I squeezed another 0.02GHz outta my Venice today!
e.) I lowered my ambient temperature from 35C to 34C by using a 120x120x35mm fan instead of my wimpy 120x120x25mm fan!
f.) I don't want to use a stupid dongle on the back of my PC to hook up 2 video cards, I just want one cable.

This be the way I see it.
 
Credibility is not a fact, as I never said it was... but what I did say is that credibility is derived from having the facts.

I agree... enthusiasts are all about nitpicking b/c we care more about this stuff. So why wouldn't I be concerned or bring up issues?

p.s. - Atkins is a load of crap... with your body, any means to an end isn't cool... does bulimia ring a bell... take it from a doctor ;)
 
ahh, I knew this new idea was going to be a flop. Maybe the X1800s will do better, more along the lines of internal data transfer.

A dongle... heheahah. But I do admit, at least you don't need to go out and buy a new X850 along with a master card.
 
Brent_Justice said:
we don't want to bottleneck the video cards, we want them to achieve their fullest potential when gaming so that we are reviewing the video cards and not the cpu

with a fast cpu then we can look at the gaming experience being delivered and compare video cards to each other

for our mainstream video card testing we do use a more popular flavor, the 3500+

the fx-55 is used for the high-end enthusiast level video cards

HEHE, this post is funny. Seems like you guys can't win. One post says "use the scientific method"; this one is bitching because you are removing the bottleneck of the CPU and testing just the cards, which is supposedly not "real world". (BTW, on the full systems we have sold, the slowest A64 we have had a person buy with a 7800-series card is a 4000+. Could be just an odd trend, but in five years of business, most people who plunk down big bucks for a video card also buy a beast CPU to go with it.)

In many ways I like how you compare the cards' max IQ to FPS. Also, you DO do an apples-to-apples comparison (AKA the scientific method... only one variable changes). The latter is more important to me, as I want to see a direct level comparison of the two cards at max settings. I think your reviews work fine. I am interested to see the one on Tuesday. ;)
 
Anyone else getting flashbacks to the nVidia GeForce FX 5800 Launch?

*shudder*

:eek:
 