Does ANYONE make a dual-card solution that WORKS?

InorganicMatter

http://www.guru3d.com/article/content/302/1/

ATI promised that Crossfire would show much better gains than SLI when adding the second card. I guess not, as these benchmarks show the X1800 XT having the same performance quirks that SLI has. Some games show Crossfire being only a few frames better than a single card, and some show Crossfire being worse than a single card. What's up with this? I was kind of hoping these issues would be specific to SLI and its driver problems, but the same thing exists in Crossfire! Why can't these people (both red and green) release a product that actually works? :rolleyes:

</rant>
 
Brent_Justice said:
Sometimes it is out of their control, such as when a game is CPU limited.

Like what? They used an FX-57 in that review, so I highly doubt CPU bottlenecking would be an issue.
 
sac_tagg said:
Like what? They used an FX-57 in that review, so I highly doubt CPU bottlenecking would be an issue.

Even with an FX-57 there is extreme CPU bottlenecking in some games with dual cards, keeping them from scaling up as much. This is not just an opinion, either; I have cold, hard experience (using an FX-55) to back this up, seeing as I've tested every dual-card solution. This topic has also been discussed by us editors at editor-day events that both ATI and NVIDIA have held. CPU bottlenecking is real with dual cards, depending on the game, whether you believe it or not.
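
To put rough numbers on it, here's a toy model (figures invented, nothing from the review) of why a CPU-limited game caps dual-card scaling: every frame needs both CPU work and GPU work, the slower of the two stages sets the frame rate, and a second card only shrinks the GPU side:

```python
# Toy frame-rate model: the slowest pipeline stage sets the pace.
# A second GPU (idealized, perfect scaling) only halves the GPU stage,
# so once the CPU stage dominates, the extra card buys almost nothing.
def fps(cpu_ms, gpu_ms, num_gpus=1):
    frame_ms = max(cpu_ms, gpu_ms / num_gpus)  # bottleneck stage wins
    return 1000.0 / frame_ms

print(fps(cpu_ms=10, gpu_ms=25))              # 40 fps, GPU bound
print(fps(cpu_ms=10, gpu_ms=25, num_gpus=2))  # 80 fps, near-2x scaling
print(fps(cpu_ms=20, gpu_ms=25))              # 40 fps, GPU barely ahead
print(fps(cpu_ms=20, gpu_ms=25, num_gpus=2))  # 50 fps, only +25% from card #2
```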
 
SuperAA seems to work great - much better than SLI-AA - although its actual impact on IQ isn't all that great now that we have adaptive/transparency AA.
 
I would like to see Crossfire again after a few months (to allow the drivers to mature). LOL, I throw that excuse around so much now that it may seem like a poor one, but a good driver implementation is vital with these kinds of things.
 
The Carrier level in Far Cry, especially the battle towards the end on the upper/flight deck, is CPU limited, even with high resolution and AA turned up.
 
Brent, I hope your review addresses some quality issues, namely vsync and triple buffering. A high-quality image not only has AA and AF and looks good in a screenshot; it also doesn't tear when in motion.
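
For anyone wondering why triple buffering matters, here's a toy model (Python, made-up numbers, not any real graphics API) of vsync'd frame delivery: with double buffering the GPU stalls until the next refresh after each frame, so render times get rounded up to whole refresh intervals (60 -> 30 -> 20 fps at 60 Hz), while a third buffer lets the GPU keep rendering at its own pace:

```python
import math

def vsync_fps(render_ms, refresh_hz, buffers):
    """Toy model of vsync'd frame delivery (illustrative only).

    Double buffering: the GPU must wait for the next refresh after each
    frame, so effective frame time rounds UP to whole refresh intervals.
    Triple buffering: the GPU starts the next frame in the third buffer
    instead of stalling, so it delivers frames at its own render rate,
    capped at the refresh rate.
    """
    interval_ms = 1000.0 / refresh_hz
    if buffers == 2:
        intervals = math.ceil(render_ms / interval_ms)
        return 1000.0 / (intervals * interval_ms)
    return min(1000.0 / render_ms, refresh_hz)

# A frame that takes 17 ms just misses a 60 Hz refresh (16.7 ms):
print(vsync_fps(17, 60, buffers=2))  # 30.0 - half the refresh rate
print(vsync_fps(17, 60, buffers=3))  # ~58.8 - near the card's real rate
```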
 
GURU3D Article said:
Mind you that at 1920x1200 you see "0" scores for the 512 GTX, these cards simply have not been tested at that resolution just yet.

Huh??? Why not??? In my eyes that renders their benchmarks incomplete. 16x12 was here with the 68xx & X800 cards; now it's time to move on with the 7 series and X1800 series.
 
Brent_Justice said:
Sometimes it is out of their control, such as when a game is CPU limited.


I say if you have a condition where an FX-57 is the limiting factor in a game, you should crank the resolution or FSAA until the limiting factor becomes the video card.

Brent, don't you think that's the point where dual cards will show a huge advantage?
 
BBA said:
I say if you have a condition where an FX-57 is the limiting factor in a game, you should crank the resolution or FSAA until the limiting factor becomes the video card.

Brent, don't you think that's the point where dual cards will show a huge advantage?

The problem then becomes that at the resolutions where you're graphics bound and not CPU bound... you're getting unplayable framerates...
 
kuyaglen said:
Huh??? Why not??? In my eyes that renders their benchmarks incomplete. 16x12 was here with the 68xx & X800 cards; now it's time to move on with the 7 series and X1800 series.

Yea, the 7 and X1800 series are the "now." There's no way that a 6 or X800 series card is CPU bottlenecked in anything that came after D3/HL2/Far Cry. BF2/FEAR/COD2/Q4/Serious Sam 2 are all much more graphics intensive than SLI/Crossfire 6 and X800 series setups can handle.

Brent_Justice said:
Only if the game and drivers are actually utilizing both cores.

NVIDIA's recent 80-series drivers do seem to show a large improvement with dual core over single, so multithreading has started to happen with NVIDIA.

------

I don't believe that 512MB GTX SLI can handle FEAR/COD2/Q4 at 16x TR SSAA, 16x AF, 1600x1200 and higher (and not even close at 1920x1200 and 2048x1536), max settings (Ultra quality on Q4 for sure), and never drop below 30-40 fps. No way. Until that happens, the CPU is not a bottleneck in my eyes. Increasing AA and res only taxes the GPU more, not the CPU. And if people can run these games at 1024x768 just fine, then it's not the CPU's problem.
 
That's just one aspect; the game itself also needs to utilize both cores to relieve CPU dependency.
 
Couldn't you use a monitoring utility to see whether or not the CPU is the bottleneck? I find it extremely hard to believe that the CPU is a bottleneck with high-end solutions, given that most of the time little performance change is seen between 2.2 and 3.2 GHz (running a decent res/IQ, not 1024x768). So, how about someone "proves it"?
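
Something like the sketch below would do it. This is a rough modern illustration using Python's psutil library (back then you'd eyeball Task Manager or perfmon instead); if one core sits pinned near 100% while the framerate stalls, the CPU is the likely bottleneck:

```python
# Sample per-core CPU load while the game runs in the background.
# A core pegged near 100% alongside a flat framerate suggests a CPU
# bottleneck; idle headroom on every core points at the GPU instead.
import psutil  # third-party: pip install psutil

def sample_cpu(seconds=30, interval=1.0):
    for _ in range(int(seconds / interval)):
        per_core = psutil.cpu_percent(interval=interval, percpu=True)
        print("per-core load:", per_core, "| busiest:", max(per_core), "%")

if __name__ == "__main__":
    sample_cpu()
```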
 
You have to give Crossfire some time, and SLI to an extent too. The technology is still in its infancy (don't talk about multi-GPU Voodoo cards or whatever). If you think about it, there isn't really a 2nd-generation SLI or Crossfire mobo chipset yet. The 7800 GTX and GT are second-generation multi-GPU-capable cards, but IMHO games aren't being optimized enough for dual-GPU solutions (I'm not sure how much they could be optimized).

Give the tech time to mature, but you will never see a 100% gain in performance by adding a second card, if ya ask me.
 
No matter how good your hardware may be, poor coding and poor utilization by the people who write the games and design the drivers is the problem. Hardware is always ahead of the software that runs on it. It sucks, but it's the truth. As time goes on, the coders will learn to exploit the hardware like they have in the past. It's a cycle.
 
TheTMan said:
You have to give Crossfire some time, and SLI to an extent too. The technology is still in its infancy (don't talk about multi-GPU Voodoo cards or whatever). If you think about it, there isn't really a 2nd-generation SLI or Crossfire mobo chipset yet. The 7800 GTX and GT are second-generation multi-GPU-capable cards, but IMHO games aren't being optimized enough for dual-GPU solutions (I'm not sure how much they could be optimized).

Give the tech time to mature, but you will never see a 100% gain in performance by adding a second card, if ya ask me.
From the [H] front page: NV50 coming out late Q1 next year (I'm kinda confused over the date; to me late Q1 is March, and it also says summer somewhere else in the article, so one of those). That's nForce5, right? If you are right about needing a second-gen SLI chipset, that's where it'll take off, especially since it will be timed with the release of the "G71, 72, and 73" (90nm refreshes). This is also the time of AM2 (supposedly). It's gonna be an interesting year come March/June/whenever!
 