Who's faster at adapting to new games? SLi or Crossfire??

SX2233

If a new game just came out and there's no optimized driver for it yet, which would stand out more? CF or SLi??
 
I had been using SLI for the last 18 months until I added a Vista drive to my sig rig. There just wasn't any support for the Quad SLI setup for my 7950GX2s, so I went with an 8800 Ultra.

I would say that SLI is better supported overall. Even if a game doesn't have a built-in SLI profile, you can usually create one. And to be honest, with the current 2900XT GPU, CrossFire is a very bad investment. One 8800 Ultra comes close to matching 2900XT CF, surpassing it in most cases, for roughly the same price with less heat, lower power consumption and fewer problems: http://techreport.com/reviews/2007q2/radeon-hd-2900xt/index.x?pg=1
 
If you are just asking about OFFICIAL driver support for new games, Crossfire is the obvious candidate.

nVidia has only been putting out official driver updates every 6 months or so (well...ALMOST every six months - the current nVidia 7-series driver for XP is over 7 months old, now). ATI puts out an official Catalyst release EVERY month.

On the other hand, Catalyst 'leaks' that add beta support for things are pretty rare, while Forceware releases get leaked pretty much weekly.

So...that's some information, anyway.
 
All of you sort of misunderstood me. What I'm asking is: if there's a new game out that doesn't have optimized CF or SLi drivers for it yet, which would be faster? SLi or CF??
 
All of you sort of misunderstood me. What I'm asking is: if there's a new game out that doesn't have optimized CF or SLi drivers for it yet, which would be faster? SLi or CF??

SLi, Crossfire is still behind SLi.
 
Do I need to prove to you the sky is blue?

It's common knowledge, backed by tons of articles in PC magazines, online magazines, and various tech sites, that SLi is the superior technology.

With the new R600 series, one of the things ATi actually did improve is Crossfire, but it still isn't on par with SLi.

But in all honesty, SLi and Crossfire are both quite wasteful unless you're playing games at 1920x1440 or above.

I use a single GTX, and it does fine all the way up to 2048x1536, the highest my monitor would go.

I had a 7950GX2, which is basically an "SLi" card, and SLi can cause you quite a few headaches.
 
Do I need to prove to you the sky is blue?
With the new R600 series, one of the things ATi actually did improve is Crossfire, but it still isn't on par with SLi.
Can you provide a link to an article proving that R600-series Crossfire is still inferior?
 
All of you sort of misunderstood me. What I'm asking is: if there's a new game out that doesn't have optimized CF or SLi drivers for it yet, which would be faster? SLi or CF??

I posted this before but it has CF vs SLI vs single card performance benchmarks side by side and I think it provides answers to your question: http://techreport.com/reviews/2007q2/radeon-hd-2900xt/index.x?pg=1

However, there's no definitive answer to your question. Looking at the benchmarks in the TechReport link, games that are based on the Source engine (Half-Life 2) would probably do better in CF with no optimized drivers; all others with SLI.

And as I said before, why would you even consider CF when a single 8800 Ultra provides similar performance, even better in some cases, never worse enough to be noticeable, at the same cost in a single card, with less power, less heat and fewer problems (no SLI or CF issues to deal with)?

Look at all the benchmarks from all the reputable review sites. There is absolutely no reason to get 2900XT CF. None at all. I'm not trying to flame, I simply don't see what an $800 2900XT CF solution provides that is significantly better, if not actually worse, than what an $800 8800 Ultra or even a $600 8800 GTX provides.

Single card solutions that provide similar performance to multi-card solutions are inherently superior, as they are inherently simpler. Occam's Razor at work.
 
Valve has been friends with ATi for quite some time, so you can always expect HL2 to run better on an ATi card just like Doom3 engine-powered games usually run better on Nvidia cards. :)
 
Do I need to prove to you the sky is blue?

It's common knowledge, backed by tons of articles in PC magazines, online magazines, and various tech sites, that SLi is the superior technology.

With the new R600 series, one of the things ATi actually did improve is Crossfire, but it still isn't on par with SLi.

But in all honesty, SLi and Crossfire are both quite wasteful unless you're playing games at 1920x1440 or above.

I use a single GTX, and it does fine all the way up to 2048x1536, the highest my monitor would go.

I had a 7950GX2, which is basically an "SLi" card, and SLi can cause you quite a few headaches.

the sky is not blue, it's only the sun that makes it look blue lol
 
I posted this before but it has CF vs SLI vs single card performance benchmarks side by side and I think it provides answers to your question: http://techreport.com/reviews/2007q2/radeon-hd-2900xt/index.x?pg=1

However, there's no definitive answer to your question. Looking at the benchmarks in the TechReport link, games that are based on the Source engine (Half-Life 2) would probably do better in CF with no optimized drivers; all others with SLI.

And as I said before, why would you even consider CF when a single 8800 Ultra provides similar performance, even better in some cases, never worse enough to be noticeable, at the same cost in a single card, with less power, less heat and fewer problems (no SLI or CF issues to deal with)?

Look at all the benchmarks from all the reputable review sites. There is absolutely no reason to get 2900XT CF. None at all. I'm not trying to flame, I simply don't see what an $800 2900XT CF solution provides that is significantly better, if not actually worse, than what an $800 8800 Ultra or even a $600 8800 GTX provides.

Single card solutions that provide similar performance to multi-card solutions are inherently superior, as they are inherently simpler. Occam's Razor at work.

Um, I seriously think a pair of max-OC'd 8800 GTSs ($600) would be better than a max-OC'd 8800 Ultra ($800), judging from some of the Lost Planet DX10 benchmarks that I've seen.
 
Well, here's the problem: the 9800GTX will be more powerful than three 8800GTX cards, so why not just use one 8800GTX and then go to the 9800GTX?
 
Um, I seriously think a pair of max-OC'd 8800 GTSs ($600) would be better than a max-OC'd 8800 Ultra ($800), judging from some of the Lost Planet DX10 benchmarks that I've seen.

These DX9 benchmarks show GTS 640 SLI vs 8800 Ultra performance as nearly the same: http://firingsquad.com/hardware/lost_planet_demo_directx_10_performance/page8.asp

I doubt throwing in overclocking would change things a whole lot, and looking at these and benchmarks all around, GTS 640 SLI and a single 8800 Ultra are practically dead even across the board.
 
I still don't understand why a game has to support SLI. Can anyone explain this to me?

nvidia's implementation is similar (if not identical) to 3dfx's implementation, and 3dfx didn't need a game to support it. Also, what I've read, either at nvidia's website or elsewhere (can't remember), is that no special game coding is needed.

I understand video card drivers need to be written to support SLI, but as for games? I don't think so.

Also, the main advantage (to me) of SLI is the ability to play with 8x AA at 1920x1200 and get great framerates. I don't care if a non-AA'd game is faster or not if it's already fast enough, but Oblivion with 8x AA and the grass slider maxed out (the way I like it) is almost too much for my GTX. Noticeable jerkiness in framerates. So I have to back down on the grass slider or reduce AA. I would love another GTX to enjoy this game to its fullest.
 
It's common knowledge, backed by tons of articles in PC magazines, online magazines, and various tech sites, that SLi is the superior technology.

Good, then you should be able to find at least one. Have at it.
 
There is no spoon, Neo...

Sigh... Crossfire is not worth it, IMO, for the current generation of ATi cards. If you want to, feel free, nobody will stop you, but several people may question the rationale.


Why? Sigh... please refer to the multiple articles (here and elsewhere) about heat, noise, and benchmarks. I'm not trying to flame ATi, but this product really should have more "pounce for the ounce".

Hopefully, this is nothing more than a "lessons learned" stepping stone for the next gen.
Be that as it may, ATi has not, repeat NOT, impressed me, or many, many others, with the 2900XT.
 
Probably Nvidia, since it seems like they have new beta drivers out every 3-4 days, lol (or at least that's the way it was when I had one).

I remember with my 6800GT I upgraded every time a new beta or WHQL driver came out. I gave up a few weeks later.
 
SLI has one major advantage over CF right now, and that is the ability to create a game profile and force whatever SLI mode you want for every game out there on the planet. So if there is no built-in driver profile enabling an SLI mode for that game, you can create one and force AFR or SFR yourself.
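For what it's worth, a user-created profile boils down to mapping a game's executable to a forced rendering mode. Purely as a hypothetical sketch (plain Python for illustration, not the actual NVIDIA profile format or any real tool's API, and the executable names are made up), it's conceptually something like this:

# Hypothetical illustration only -- NOT the real NVIDIA profile format.
# A user-defined SLI profile conceptually just maps a game's executable
# to a forced rendering mode when no built-in profile exists.
user_profiles = {
    "newgame.exe": "AFR",    # force alternate frame rendering
    "othergame.exe": "SFR",  # force split frame rendering
}

def sli_mode_for(executable, built_in_profiles, user_profiles):
    """Pick an SLI mode: built-in profile first, then user profile, else nothing forced."""
    return built_in_profiles.get(executable) or user_profiles.get(executable) or "no SLI mode forced"

print(sli_mode_for("newgame.exe", built_in_profiles={}, user_profiles=user_profiles))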
 
I still don't understand why a game has to support SLI. Can anyone explain this to me?

Games do not have to support SLI. The NVIDIA drivers are the ones that need to support the game.

nvidia's implementation is similar (if not identical) to 3dfx's implementation, and 3dfx didn't need a game to support it. Also, what I've read, either at nvidia's website or elsewhere (can't remember), is that no special game coding is needed.

Not even close. 3DFX's solution was a pure hardware implementation that wasn't driver dependent. With more modern games and GPU designs, this has apparently had to change. The name SLI is the same, but it stands for something completely different: 3DFX's implementation was Scan-Line Interleaving, while the NVIDIA moniker stands for Scalable Link Interface. The definition of each company's acronym illustrates different concepts of design and implementation.

I understand video card drivers need to be written to support SLI, but as for games? I don't think so.

The NVIDIA drivers need to have SLI profiles for a game in order for SLI to work. If no profile exists for a given game, the user can create a profile of their own. Granted, it is up to the user to test the different rendering modes and settings to get the best performance out of their user-defined profile. If I am not mistaken, current NVIDIA drivers default to some kind of split frame or alternate frame rendering, and because of this some games work with no changes, assuming that the default SLI configuration is indeed optimal for a given title with no pre-existing SLI profile. Of course, if the default profile or user-created profile is not optimal, it can actually reduce performance in that game versus running the system in single card mode alone.
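For anyone wondering what those modes actually mean, here's a rough conceptual sketch (plain Python, nothing to do with the real driver; the frame and GPU names are made up) contrasting the old 3dfx scan-line approach with the AFR and SFR modes you can force in a profile:

# Conceptual sketch only -- not driver code. Shows how each scheme divides
# rendering work between two GPUs.
GPUS = ["GPU0", "GPU1"]
FRAMES = ["frame0", "frame1", "frame2", "frame3"]
SCANLINES = range(8)  # pretend the screen is 8 scanlines tall

def scan_line_interleaving(scanlines, gpus):
    """3dfx-style SLI: within a single frame, alternating scanlines go to different GPUs."""
    return {line: gpus[line % len(gpus)] for line in scanlines}

def alternate_frame_rendering(frames, gpus):
    """AFR: whole frames are handed out round-robin, so each GPU renders every other frame."""
    return {frame: gpus[i % len(gpus)] for i, frame in enumerate(frames)}

def split_frame_rendering(frames, gpus):
    """SFR: every frame is split (here simply into top/bottom halves) across both GPUs."""
    return {frame: {"top half": gpus[0], "bottom half": gpus[1]} for frame in frames}

print("Scan-line interleaving:", scan_line_interleaving(SCANLINES, GPUS))
print("AFR:", alternate_frame_rendering(FRAMES, GPUS))
print("SFR:", split_frame_rendering(FRAMES, GPUS))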

Also, the main advantage (to me) of SLI is the ability to play with 8x AA at 1920x1200 and get great framerates. I don't care if a non-AA'd game is faster or not if it's already fast enough, but Oblivion with 8x AA and the grass slider maxed out (the way I like it) is almost too much for my GTX. Noticeable jerkiness in framerates. So I have to back down on the grass slider or reduce AA. I would love another GTX to enjoy this game to its fullest.

Ultra-high-resolution gameplay is where SLI shines. There are no current games that I am aware of where a single 8800GTX or Ultra is not sufficient below 1920x1200, even when some AA and AF are used.
 
SLI has one major advantage over CF right now, and that is the ability to create a game profile and force whatever SLI mode you want for every game out there on the planet. So if there is no built-in driver profile enabling an SLI mode for that game, you can create one and force AFR or SFR yourself.

Precisely. I've had several SLI rigs and two Crossfire rigs of my own (not counting SLI and Crossfire motherboard testing for the [H]), and I always enjoyed having that control with SLI and hated not having it in Crossfire. Having had SLI since the GeForce 6 series, I have been able to enjoy games using SLI relatively pain-free since that time, though in the beginning it was much worse.
 
Thank you Dan_D for your response. It's the first knowledgeable explanation I've seen.
 
Thank you Dan_D for your response. It's the first knowledgeable explanation I've seen.

You are welcome. :cool:

I currently own both a Crossfire rig and an SLI rig and have owned both in the past, so I've got a lot of experience gaming with both of them. Overall I prefer SLI, though Crossfire got a much needed improvement with the X1950 Pros, which use bridges like SLI does. It has been problem free, and my girlfriend games on that box and has never had issues with it. I took one card out to use in another box temporarily and she noticed right away that it was gone, so I guess it worked in every game she played without any tweaking.
 
Well, here's the problem: the 9800GTX will be more powerful than three 8800GTX cards, so why not just use one 8800GTX and then go to the 9800GTX?

Three 8800 GTXs!? A 9800 GTX?? I doubt we'll even see those by the summer of 2008.
 
These DX9 benchmarks show GTS 640 SLI vs 8800 Ultra performance as nearly the same: http://firingsquad.com/hardware/lost_planet_demo_directx_10_performance/page8.asp

I doubt throwing in overclocking would change things a whole lot, and looking at these and benchmarks all around, GTS 640 SLI and a single 8800 Ultra are practically dead even across the board.

Yeah, but the Ultra is still way too expensive to be had, 'cause nVidia is playing monopoly on the highest end.
 
Three 8800 GTXs!? A 9800 GTX?? I doubt we'll even see those by the summer of 2008.

We don't know what NVIDIA's next card and GPU will be called. Right now we only know the codename G92.

Well, here's the problem: the 9800GTX will be more powerful than three 8800GTX cards, so why not just use one 8800GTX and then go to the 9800GTX?

I am not sure where you are getting the data necessary to reach this conclusion about a product that, for all intents and purposes, doesn't even exist, much less to speculate on its performance. If you are basing this off of NVIDIA's near-1-Tflop claims, then I'd say you are being optimistic at best. The thing is, even if the GPU can do this, there will likely be other factors in the card or GPU design limiting its performance. If history repeats itself and NVIDIA continues its track record, then I'd estimate that what we will see at best is an 85%-100% increase in performance. NVIDIA has never leapfrogged its last generation by a margin greater than that in the past.

Unless ATI pulls a seriously kickass refreshed R600 out of its ass in three months' time, I don't think NVIDIA will bother to kick that much ass. They'll likely release a revamped G80 to compete against the R600 refresh. That would make good business sense, but if G92 is really that powerful and they can produce it at a reasonable cost, they might as well go for the kill. In any case, we'll have to wait and see. Until we get closer to show time, we won't know anything for certain.
 