HardOCP's ATI CrossFire Evaluation

Oh man, Kyle laid the smack down on ATI on the front page. We ain't yo bitch, ATI... talk to the hand, mmmhmmm :D
 
AndoOKC1 said:
Sounds good to me... but can you tell me how much a 3200+ @ 2.4 GHz bottlenecks these video cards?

That's the exact speed of an A64 3800+. All you need to do is look up benchmarks for those chips on the web; they are out there, trust me. That's only a stone's throw from an A64 4000+. The only difference between the 3800+ and the 4000+ is 512 KB of extra cache on the 4000+.

Those two chips aren't all that far from an FX-55 either in terms of overall performance. Where you'll see the largest drop is below the 3500+ or the 3200+, depending on the game and the resolution.

There are some reviews out there that have tested the same high-end configuration with different CPUs at different clock speeds. The results are pretty close together until you get down to the 3200+; below that, performance falls off very quickly.
 
I searched this thread and could not find a good answer to this question... perhaps I missed a post, but it seems the topic was only partly covered, and not quite in the way I would phrase it.

In this review it is stated that:

"The problem with this is that although you paid for an X850 XT, your performance might be “dumbed-down” to match your current video card. You will not be getting everything out of the video card that you paid for. That just doesn’t seem like a smart buy to me."

But in the tech preview linked from this review, it is stated:

"What does not need to be the same are the clock speeds between both video cards. Unlike what you might think, the controller video card will not downclock itself to match a slower secondary video card. The clock speeds of both video cards can be different as long as the pixel-pipeline count and framebuffer size are the same. With the buffers in the compositing engine, it is able to buffer up data if one card is still drawing while the other is waiting for the data before blending and still maintain a performance advantage."

I am wondering which statement is true. The one in this review would make CrossFire seem rather pointless to even attempt if you have a card below the X850 XT or X800 XL, while the statement in the tech preview suggests the higher-end card would simply pick up a little more of the workload, as if dividing the work 60/40 instead of 50/50 rather than "dumbing down" the faster card.

I am not trying to point out fault or anything like that; I am just trying to inform my decision on which way to go with future upgrades.

I have two machines (one for me and one for my wife), and I normally upgrade my machine and then my wife uses my old one. SLI or CrossFire has never made a lot of sense for me, since I upgrade an entire machine at a time. Our machines are not too outdated (an A64 3200+ at 2337 MHz (275 x 8.5) with an X850, and a mobile Athlon 2600+ at 2500 MHz (200 x 12.5) with a 6600 GT), so I may upgrade my motherboard and use CrossFire instead of buying a whole new machine this time. I could just buy a new video card, but then I would need to update her motherboard and processor (and perhaps some other parts) in order to use my current card in her machine.
 
flatliner said:
I am wondering which statement is true. The one in this review would make CrossFire seem rather pointless to even attempt if you have a card below the X850 XT or X800 XL, while the statement in the tech preview suggests the higher-end card would simply pick up a little more of the workload...


The one where each card retains all of its RAM and clock speed is correct. The CrossFire compositing engine does some load balancing, giving more work to the higher-performing card if the two are mismatched.
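For what it's worth, here is a minimal sketch (Python, purely illustrative; this is not ATI's driver logic) of what a proportional split of frame scanlines between two mismatched cards could look like if the work were divided by relative raw fill rate (core clock times pipeline count). The card clocks and the 1600x1200 example are assumptions for illustration, not measured CrossFire behavior.

[CODE]
# Illustrative only: split a frame's scanlines between two mismatched cards
# in proportion to their raw fill rates (clock x pipeline count).
# Per the tech preview, the pipeline counts must match, so they cancel out
# and the split is effectively driven by the clock-speed difference.

def split_scanlines(total_lines, master_clock_mhz, slave_clock_mhz, pipes=16):
    master_rate = master_clock_mhz * pipes   # hypothetical X850 XT-class master
    slave_rate = slave_clock_mhz * pipes     # slower card, same pipe count
    master_share = master_rate / (master_rate + slave_rate)
    master_lines = round(total_lines * master_share)
    return master_lines, total_lines - master_lines

# Hypothetical example: 520 MHz master paired with a 400 MHz slave at 1600x1200
print(split_scanlines(1200, 520, 400))   # -> (678, 522), roughly a 57/43 split
[/CODE]

The general point stands either way: neither card is downclocked, the faster one just renders a larger slice of each frame.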
 
forcefed said:
Oh man, Kyle laid the smack down on ATI on the front page. We ain't yo bitch, ATI... talk to the hand, mmmhmmm :D

Brent and Kyle do this a lot in their reviews. They favor NVIDIA, and it's apparent in the reviews. That's why they released a statement to their readers addressing the bias; too many people are noticing it. I wonder if it has anything to do with the apparent drop-off in ATI advertising money paid to this site. :rolleyes:
 