HardOCP's SLI Upgrade Guide

RAutrey said:
Anyway, I know this article is meant to feature the video card tech, but will we see a review of the A8N-SLI soon?

Also, you guys are pretty good about this, but I think this one will deserve a few updates as SLI matures.
The motherboard review is a ways off. Brent still has some SLI stability testing to do with the motherboard; then it will come back to me for benchmarking, then go to Morry or Keith for the rest of the review, photos, and graphs, then come back to me to edit and kick corrections back, and then it will go live on the right day.

If there are any significant SLI changes we will surely note them.
 
Just wondering if it's better to just go with 6800GT SLI now. Cost would be about $1,000 ($200 motherboard + two 6800GTs at $400 each). Or you could spend $600 now ($200 motherboard + two 6600GTs at $200 each); then, if 6800GT prices drop by half later, you would still have to buy two 6800GTs for $400 more to replace the 6600GTs, putting you at the same $1,000.
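
Laid out as a quick sketch (same ballpark prices as above), the two paths cost the same in the end:

```python
mobo = 200
price_6800gt, price_6600gt = 400, 200  # ballpark street prices

buy_6800gt_sli_now = mobo + 2 * price_6800gt            # $1000 up front
start_with_6600gt = mobo + 2 * price_6600gt             # $600 up front
# if 6800GT prices drop by half later, swapping in a pair adds $400
upgrade_later = start_with_6600gt + 2 * (price_6800gt // 2)

print(buy_6800gt_sli_now, upgrade_later)                # 1000 1000
```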
 
Just a gripe, but,

The 6600 SLI vs. single 6800GT bench comparisons would be more useful if run under the same resolution/AA/AF settings... apples to apples.

With the current graphs, the SLI rig consistently scores a lower minimum framerate than the single 6800...sometimes with as much as a 38% spread. Having different resolutions and AA settings makes comparison and analysis difficult.

Is the choke because of the lower memory bandwidth/capacity of the 6600s (vs. the 6800), or because of the higher resolution/AA?

To my eye, significant framerate drops are more noticeable than peaks.
The 6600 single and SLI rigs consistently hit lower lows, and more often, at least on your FRAPS graphs.
 
All I can say is: top-notch review :D
Hopefully I can get my SLI setup soon.

Good job again :D
 
Oh yeah, I forgot to say: great review! You did a great job comparing everything, so we can see what we would get.
 
Hello all,

Is it true that Kyle is suggesting we buy a 6600GT SLI configuration to start out with?

But wouldn't it be better to go with a single 6800GT, since they are almost comparable, and then go SLI when prices drop?

-Victor
 
Xilikon said:
Indeed, if the fps matches the refresh rate of the monitor, that is as good as it gets. I can see it with my own eyes too.

However, Brent has a point too, because the eye cannot really discern between 60 fps and 80 fps.

It is different for different people, and it also depends on what you're looking at; the more detail and light there is, the easier changes are to see. Personally, I can see the difference between 100 and 60, and I'm definitely not alone on that. If you can't see the difference, great: crank up your IQ and enjoy the eye candy a little more. But that's still no reason to benchmark with a 60 fps cap. Why do you think benchmarks since the beginning of 3d gaming have been done with vsync off?
 
Suicider said:
Why do you think benchmarks since the beginning of 3d gaming have been done with vsync off?

Because of the "3dfx mentality," and it seems a lot of people are still stuck in it.

But today we run with it off because with vsync on your framerate snaps to a fraction of the refresh rate: for example, at 60Hz, if your card can't quite hold 60FPS you drop straight to 30FPS. With vsync off you keep whatever the card can render, which helps your minimum framerate as well. The downside is tearing, so you just have to decide which is worse for you...
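
Here's a quick sketch of that quantization, assuming plain double buffering (triple buffering softens this):

```python
import math

def vsync_fps(refresh_hz, render_ms):
    """Effective fps with vsync on: each frame waits for the next
    refresh boundary it can actually make (double buffering)."""
    interval_ms = 1000.0 / refresh_hz
    refreshes_per_frame = math.ceil(render_ms / interval_ms)
    return refresh_hz / refreshes_per_frame

print(vsync_fps(60, 16.0))  # 60.0 -- the card keeps up with every refresh
print(vsync_fps(60, 17.0))  # 30.0 -- barely misses one, framerate halves
```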

Personally, I'd love to see more games with FPS capped at 60.

The less change in FPS you have, the less you'll notice it and the less annoying framerate dips will be.

For example, say you are in a game at 100fps and a huge battle sequence comes on the screen, dropping you to 50fps. You are going to notice that change, and it's going to be bad.

But if the framerate was capped at 60fps and it dropped to 50fps in that sequence, there would be much less change in fps and you wouldn't perceive it as much.
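
Put in frame times, which are what you actually feel, the difference is obvious (quick sketch):

```python
def frame_time_ms(fps):
    return 1000.0 / fps

# uncapped: 100fps dropping to 50fps in the battle
print(frame_time_ms(50) - frame_time_ms(100))  # 10.0ms longer per frame
# capped at 60: the same battle costs you far less
print(frame_time_ms(50) - frame_time_ms(60))   # ~3.3ms longer per frame
```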


We are seeing a shift away from worrying about maximum framerates and toward finding the highest playable quality settings a card can offer with smooth framerates.

What is important is the minimum fps and what quality settings you can play a game at with smooth performance, and smooth performance means 30-60FPS with consistency and fewer swings in fps.
 
For most cases, in single player at least, that's completely true: 30 fps is OK as long as it's consistent. I played through Doom on a Ti4600 and it was just fine. Multiplayer is a whole different thing for me, though... I want the highest fps I can get at every given instant. If that means I get a huge and noticeable drop every now and then, that's better for me than sticking at a cap that's off-sync from my refresh.

Anyway, all this stuff is entirely subjective, personal, and beaten into the ground, so I'll shut up and just say I'd *like* to see the performance graphs without the cap, because they give a much better indication of what the hardware is capable of, whether or not going over 60 fps is something someone's interested in. :)
 
enricht said:
Hello all,

Is it true that Kyle is suggesting we buy a 6600GT SLI configuration to start out with?

But wouldn't it be better to go with a single 6800GT, since they are almost comparable, and then go SLI when prices drop?

-Victor
I think if you are only going to buy one card at first and have the cash, the 6800GT is a great investment. If you only have 400 bucks, the 6600GT is a great place to start as well.
 
Suicider said:
For most cases, in single player at least, that's completely true: 30 fps is OK as long as it's consistent. I played through Doom on a Ti4600 and it was just fine. Multiplayer is a whole different thing for me, though... I want the highest fps I can get at every given instant. If that means I get a huge and noticeable drop every now and then, that's better for me than sticking at a cap that's off-sync from my refresh.

Anyway, all this stuff is entirely subjective, personal, and beaten into the ground, so I'll shut up and just say I'd *like* to see the performance graphs without the cap, because they give a much better indication of what the hardware is capable of, whether or not going over 60 fps is something someone's interested in. :)

I agree. In single player I am good with a solid 30 or 40. In DM I want as many as I can get.
 
What are the PSU requirements for an SLI setup? Remembering the 460W recommendation for the 6800 Ultra, I wonder what you need to run a 6800U SLI and FX-55 CPU system?


How much does the CPU matter in games like Doom 3, Far Cry, and Half-Life 2? If you only have a 3000+ CPU, will buying an SLI setup of two 6600GTs be a waste?
 
Yeah, I should have clarified: personally, in single player games I'd like to see more games capped at 60FPS. In multiplayer, that's a different story.
 
Ruiner said:
Just a gripe, but,

The 6600 SLI vs. single 6800GT bench comparisons would be more useful if run under the same resolution/AA/AF settings... apples to apples.

I agree. I know the benchmarks are meant to demonstrate the highest playable frame rates a card can do, but it's hard to make sense of the results when they're not apples to apples.

For example, I read the article at work and had no idea how the Ultra stacked up against my card because the AA and AF were completely different from the article done last week. But now I'm home with my results, and they're shocking.

[H] score with the Ultra SLI rig at 1600x1200x32 Reflect world All 2xAA/16xAF = 94.2
My single BFG 6800 Ultra Waterblock @ 451/1200 FX53 (2400) 1600x1200x32 Reflect world All 2xAA/16xAF = 103.88

I win. I even ran the benchmark twice to make sure. Same driver, too. The big difference here is that the memory on my BFG is OCed. I've found that makes a huge difference in my scores, usually 5 fps.
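
Rough numbers on why the memory OC matters, assuming the stock Ultra runs its 256-bit bus at 1100MHz effective (treat the clocks as approximate, I'm going from memory):

```python
bus_bytes = 256 / 8                       # 256-bit memory bus
stock_eff_mhz, oc_eff_mhz = 1100, 1200    # assumed stock vs. my OC

stock_gbs = bus_bytes * stock_eff_mhz / 1000   # ~35.2 GB/s
oc_gbs = bus_bytes * oc_eff_mhz / 1000         # ~38.4 GB/s
print(f"{oc_gbs / stock_gbs - 1:.1%} more bandwidth")  # ~9.1%
```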
 
While it only concentrates on the high end, it looks thoroughly disappointing in my opinion. It may be a CPU limitation, but after seeing these numbers, the chance of me spending $1600 (Canadian) to gain 10 average FPS and under 10 minimum FPS over what I'm at right now is absurd. Let's hope higher numbers come out at some point.

Going from 1600 with 2xAA / 16xAF in Far Cry to 4xAA / 16xAF at a price of over $1500 is just... not intelligent, and that's with a PE at stock; I already play Far Cry at those settings. I personally was expecting a lot more from two 6800Us :(

Kind of depressing :(

On the flip side, another great article from my favourite video card reviewers: in-depth, informative, and well done. Don't you dare stop :)
 
NoGodForMe said:
I agree. I know the benchmarks are meant to demonstrate the highest playable frame rates a card can do, but it's hard to make sense of the results when they're not apples to apples.

For example, I read the article at work and had no idea how the Ultra stacked up against my card because the AA and AF were completely different from the article done last week. But now I'm home with my results, and they're shocking.

[H] score with the Ultra SLI rig at 1600x1200x32 Reflect world All 2xAA/16xAF = 94.2
My single BFG 6800 Ultra Waterblock @ 451/1200 FX53 (2400) 1600x1200x32 Reflect world All 2xAA/16xAF = 103.88

I win. I even ran the benchmark twice to make sure. Same driver, too. The big difference here is that the memory on my BFG is OCed. I've found that makes a huge difference in my scores, usually 5 fps.
You cannot compare those scores, as the demo you ran in no way represents the real gameplay we used in the SLI article. :( Sorry. That is one of the reasons we try to stay away from timedemos.
 
joecuddles said:
While it only concentrates on the high end, it looks thoroughly disappointing in my opinion. It may be a CPU limitation, but after seeing these numbers, the chance of me spending $1600 (Canadian) to gain 10 average FPS and under 10 minimum FPS over what I'm at right now is absurd. Let's hope higher numbers come out at some point.

Going from 1600 with 2xAA / 16xAF in Far Cry to 4xAA / 16xAF at a price of over $1500 is just... not intelligent, and that's with a PE at stock; I already play Far Cry at those settings. I personally was expecting a lot more from two 6800Us :(

Kind of depressing :(

On the flip side, another great article from my favourite video card reviewers: in-depth, informative, and well done. Don't you dare stop :)
From a personal point of view, I would not buy two Ultras now, as there is really no need to, at least not the way I game. Buy one now and buy another when they are $250 or so, and then you have a bargain.
 
FooKerama said:
What are the PSU requirements for an SLI setup? Remembering the 460W recommendation for the 6800 Ultra, I wonder what you need to run a 6800U SLI and FX-55 CPU system?

How much does the CPU matter in games like Doom 3, Far Cry, and Half-Life 2? If you only have a 3000+ CPU, will buying an SLI setup of two 6600GTs be a waste?
500 to 550W is suggested, but honestly, we did numbers on what our test system was drawing and it came in right at 300W. Still, a PSU is not something you want working at 100%... ever, if possible. I would say if you want to SLI Ultras, get the 550 and sleep well at night.
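
For the curious, here's the headroom math with our measured draw (a quick sketch):

```python
measured_draw_w = 300          # whole test system under load, as measured
for psu_w in (460, 550):
    print(f"{measured_draw_w / psu_w:.0%} load on a {psu_w}W unit")
# ~65% load on a 460W unit, ~55% on a 550W unit -- comfortable headroom
```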
 
Another thing that I'm sure we all want to know: how far can you go with the SLI 6800 Ultras? You mention 8xSAA at 1600x1200 wasn't playable in HL2. At what resolution is it playable? Does that resolution + 8xSAA look better or worse than 1600x1200 with 4xAA?
What are the results running games (that support it) at 2048x1536? At what AA and AF levels? Where are the limits of the SLI 6800 Ultras? Dammit ;)

Informal testing with result posted here perhaps, eh Brent?

Oh, "Must have hardware"? Despite all the issues and bugs? We are not even sure the bugs are game or hardware/driver related yet.
 
I'm really dying to find out how OC-friendly these SLI mobos are... do the locks work properly, etc.?

A few other things I'd like to know about the mobo: when does one use the power connector on the mobo next to the PCI-E slot... only in SLI mode?

And do we still need to power both cards with the 6-pin PCI-E connectors?
 
Personally, I'm not really THAT impressed with SLI. It seems nice, but I think I'll wait it out a generation or two until it's perfected.
 
kick@ss said:
Personally, I'm not really THAT impressed with SLI. It seems nice, but I think I'll wait it out a generation or two until it's perfected.

heh I hear that....

still though, I am impressed...impressed with my x800pro @ x800xt pe

must have hardware?....?...
 
RandomTrend said:
heh I hear that....

still though, I am impressed...impressed with my x800pro @ x800xt pe

must have hardware?....?...
I'd be impressed if I had an X800 Pro @ XT PE speeds, too.
 
kick@ss said:
Here is SLI at Anandtech: http://www.anandtech.com/video/showdoc.aspx?i=2284.

It seems like SLI 6600GTs aren't as fast as a single 6800GT with AA and AF on at high res.

That's expected. The memory bus on the 6600 is half as wide, IIRC. Did they bench twin 128MB 6600 cards? 2x128MB in SLI does not equal 256MB.
I'm more interested in comparing the valleys on those FRAPS graphs.
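
Back-of-the-envelope bandwidth numbers, assuming the reference clocks I remember (both cards around 1GHz effective memory):

```python
def peak_bandwidth_gbs(bus_bits, effective_mhz):
    """Peak memory bandwidth = bus width in bytes x effective data rate."""
    return (bus_bits / 8) * effective_mhz / 1000

print(peak_bandwidth_gbs(128, 1000))  # 6600GT: ~16 GB/s per card
print(peak_bandwidth_gbs(256, 1000))  # 6800GT: ~32 GB/s
```

And SLI doesn't pool the framebuffers, which is why 2x128MB still behaves like 128MB for textures.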
 
googles said:
I'm really dying to find out how OC-friendly these SLI mobos are... do the locks work properly, etc.?

A few other things I'd like to know about the mobo: when does one use the power connector on the mobo next to the PCI-E slot... only in SLI mode?

And do we still need to power both cards with the 6-pin PCI-E connectors?

With both cards in SLI you have to use the power connector on the mobo; if you don't, it could be unstable. A red light comes on to let you know to plug it in.

And yes, you still need the PCI-E power connectors on the cards as well.
 
Brent,

Do you know if two Ultras or 6800GTs will fit on that Asus board in SLI with the NV5 Silencer coolers on?
 
I noticed in the last issue of PC Mag that Voodoo and Falcon Northwest have been selling SLI setups using P4 Xeon mobos with NVidia video cards. I understand pricing will keep that solution from the mainstream, but how exactly are they making this work? Custom drivers?

I'd imagine the Forceware drivers and/or the goldfinger connector wouldn't work right outside of NF4 boards... at least for the time being. Or am I wrong? I'm just curious with respect to alternate future offerings... Will each company's (ATI/NVidia) SLI setup require custom boards made by themselves? Where does this leave VIA's SLI board? I imagine compatibility issues will only become more apparent as you think through all of this...
 
Is it possible to run dual monitors in SLI mode? Either on the desktop, or with some sort of cool dual-screen spanning shit going on while gaming?
 
Two points:

(1) That A8N is one gorgeous board, much better than that crappy mustard color Asus uses.

(2) Kyle or Brent: do you think the SLI bridge adaptor should be flexible as opposed to the rigid design, since the rigid design might be a potential weak point? Or is the Nvidia Certified SLI program supposed to take care of that? (This has been brought up somewhere else; I can't remember where.)
 
I just wonder when a good supply of this hardware will kick in. It seems we always read about new mobos coming out with this or that, and we don't see them at a respectable price point until 3 months later. :mad:
 
Karash said:
I just wonder when a good supply of this hardware will kick in. It seems we always read about new mobos coming out with this or that, and we don't see them at a respectable price point until 3 months later. :mad:

I'm not worried about the mobo; I'm concerned about the supply of PCI-E 6800s. Look how badly BOTH Nvidia and ATI have mismanaged their high-end AGP card supply. With Asus and Abit, I don't think you'll have a problem getting a mobo at a decent price and in decent time. MSI is another story.
 
Thanks for the article. I won't be moving to SLI; from what I've read, there's not enough of a jump in frame rates versus price. You tell me why an ATI PE isn't being put out to pasture by SLI. Shoot, there is something bogus when, with all that horsepower under the hood, that's the best SLI can do. Heck, I'm an addict when it comes to computers, but even I can tell, and back away from, this HYPE. It's a marketing con job. I agree with some of the above: we're reaching a plateau and things are slowing down, so let's get the advertising HYPE in motion and keep the products moving out the door.
 
Quote:
Originally Posted by spine
You don't have to buy the 2nd card straight away. You don't even have to buy the mobo. So you can buy the card today knowing that sometime in the future you might be able to get into an SLI setup.
It's like RAID on mobos in the early days. I bought one thinking that I'd maybe use it. Then, a year later, I bought a 2nd hard disk and did indeed use it.


With that thinking, let's say I have a 9700 Pro and now I want SLI. Tell me where I can find a 9700 Pro at my local distributor. Heck, they quit making the card. Oh, I forgot, I could maybe find one on eBay. Yeah, right.
 
Kyle, just wondering if you could do a benchmark to see what kind of performance EQ2 can get out of an SLI setup, at maximum resolution and IQ settings. I'm looking to build a new computer to be able to play EQ2 the way it was meant to be played, but have not been able to find any reviews done on SLI with EQ2. I would hate to spend all that money and find out that the engine is either not SLI compatible or is so CPU bound that an SLI configuration does nothing for the IQ.
 
This question has been asked a few times in this thread but I have not seen it answered.

IS THERE ENOUGH ROOM ON THE ASUS SLI BOARD TO HAVE 2 6800GT'S WITH NV5 SILENCERS ON THEM???
 
poach said:
IS THERE ENOUGH ROOM ON THE ASUS SLI BOARD TO HAVE 2 6800GT'S WITH NV5 SILENCERS ON THEM???

WHEN YOU TURN THE CAPS LOCK ON, IT MAKES IT LOOK LIKE WHAT YOU'RE SAYING IS IMPORTANT!!!
DON'T FORGET ABOUT CHANGEABLE SIZE AND COLOR, TOO!!!




NV5's have a 2-slot design, so YES there is enough room.

-SEAL
 