Triple SLI benchmarked...

Hmm. Interesting. 800W is fucking ridiculous. C'mon nVidia... get a new high-end card out rather than tripling up :mad:

I still think "TriSli" sounds better though :D
 
I don't know German. Do they use PCIe-to-Molex adapters to connect all six power connections? The pics don't look like PCIe-to-Molex adapters.
 
Still pretty damn impressive. The scaling was good, a nice jump from regular SLI.
 
When I was screwing around and put 3 video cards on my 680i, the cooling fan inlets were practically covered. Looks like a good way to bake a $500 video card to me.
 
Maybe if they were able to use three 8800GTs with the new wider fans it would be a little cooler, more stable, take up fewer slots, and require a lot less power. Why would Nvidia not allow the new 8800GT and 512MB 8800GTS to be triple-SLI'd? Maybe because people would be spending less money, I guess.
 
I'm cooling on water right now so I know I could handle the heat, but it would be expensive for me to grab an 8800GTX, water cool it and give it a go. I do already have the necessary SLI bridges. I've got a short SLI bridge that I kept from my Tyan S2895 build. I kept it even after I sold the board because the bridge was purchased separately; my board didn't include the bridge when I bought it.

So I can do it technically, but damn what an expensive undertaking. Though I am not 100% positive the third card would clear my south bridge waterblock.
 
That's an insane amount of electricity going on there...

Instead of triple SLI, why can't they just put more cores on a GPU?

Seems like overkill... e-penis... that's all. After looking at triple SLI, my 8800GTS 640 looks, for lack of a better word, flaccid.
 
Interesting question: why can't they make something like the Intel Quad Cores or Core 2 Duo, but on a GPU? Wouldn't that make it handle more streams of data quickly?
 
That's an insane amount of electricity going on there...

Instead of triple SLI, why can't they just put more cores on a GPU?

Seems like overkill... e-penis... that's all. After looking at triple SLI, my 8800GTS 640 looks, for lack of a better word, flaccid.

That concept doesn't work the same on GPUs as it would with CPUs. Someone explained it to me pretty well once, but now I can't think of it. I hate how people think that's just the solution to everything.
 
Probably due to the size of the thing. Right now the G92 GPU is extremely large, though still smaller than G80. In any case, heat combined with cost, as well as die size, are the likely reasons why they can't do that. Modern high-end GPUs already have around 700 million transistors. Imagine how many there would be with that multiplied by two or four. You'd have a GPU whose size, power draw, and heat output would probably exceed what's realistic for today's PC architecture.

They could always go with a less powerful solution that uses an internal SLI approach, but that's probably less efficient than a single more powerful GPU core.
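A rough back-of-envelope sketch of the scaling point above. The 700 million transistor figure is from this thread; the 150W board power is an assumed placeholder, and the near-linear power scaling is a simplification:

```python
# Back-of-envelope: what doubling or quadrupling a monolithic GPU implies.
# base_transistors comes from the ~700M figure discussed above;
# base_power_w is an assumed placeholder, not a measured number.

base_transistors = 700e6
base_power_w = 150

for factor in (2, 4):
    transistors_b = base_transistors * factor / 1e9
    power = base_power_w * factor  # assumes near-linear power scaling
    print(f"{factor}x: ~{transistors_b:.1f}B transistors, ~{power}W")
```

Even with generous assumptions, a 4x monolithic part lands in billions of transistors and hundreds of watts, which matches the "exceeds realistic size, power draw and heat" argument.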
 
If it makes the difference between LotRO w/ DX10 eye candy as a slide-show, a moderately satisfying experience, or smooth as glass, I'll be doing this. . . with the next gen parts.

Why? Because I need to wag my e-peen around? No, because I'm spending so much time on LotRO that I haven't been spending any real-world money for months. . . and with the wife's help, we've had our credit cards paid off for months now. No credit card payments = savings. :)

So. . . got money to burn, and an enthusiast's lust for tinkering and screwing around. . . = Triple SLI.

Gotta get this geekery in there before we start having kids!
 
So, it appears a system that will play Crysis at 30 FPS at 1920x1200 with everything on Very High actually now exists.

[Attached image: 1197459453197.PNG]
 
That concept doesn't work the same on GPUs as it would with CPUs. Someone explained it to me pretty well once, but now I can't think of it. I hate how people think that's just the solution to everything.

GPUs are stream processors, which are parallel (multithread sorta) already.
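To illustrate the stream-processing point, here's a minimal pure-Python sketch: one small made-up "shader" function applied independently to every element of a pixel stream, which is the pattern GPUs already run in parallel across their stream processors (the brightness function and pixel values are invented for the example):

```python
# A "shader": one function applied independently to each pixel.
# On a GPU, thousands of these run at once; map() here just models
# the same-function-over-all-data pattern.

def shade(pixel):
    r, g, b = pixel
    # brighten each channel, clamped to the 0-255 range
    return (min(r + 20, 255), min(g + 20, 255), min(b + 20, 255))

stream = [(100, 150, 200), (0, 0, 0), (250, 250, 250)]
result = list(map(shade, stream))
print(result)  # [(120, 170, 220), (20, 20, 20), (255, 255, 255)]
```

Because no pixel depends on any other, the work is embarrassingly parallel, which is why adding "cores" to a GPU is a different problem than adding cores to a CPU.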
 
If it makes the difference between LotRO w/ DX10 eye candy as a slide-show, a moderately satisfying experience, or smooth as glass, I'll be doing this. . . with the next gen parts.

Why? Because I need to wag my e-peen around? No, because I'm spending so much time on LotRO that I haven't been spending any real-world money for months. . . and with the wife's help, we've had our credit cards paid off for months now. No credit card payments = savings. :)

So. . . got money to burn, and an enthusiast's lust for tinkering and screwing around. . . = Triple SLI.

Gotta get this geekery in there before we start having kids!

I understand where you are coming from. I'll probably be going with Triple SLI on my next build or massive system overhaul, though that really depends on what G80's proper successor turns out to be like.
 
Tri-SLI is worthless: three cards to gain 2x performance? WTF! Not to mention the money you have to spend and the amount of heat it produces. I think I'll pass :)
 
Interesting question: why can't they make something like the Intel Quad Cores or Core 2 Duo, but on a GPU? Wouldn't that make it handle more streams of data quickly?

The GPU has been multicore for years.

Other than that, the only option is a GX2 with multiple GPUs on one board, like they did with the 7-series cards. But that has its drawbacks: huge power requirements, limited area for components, and a single PCI Express slot simply doesn't have the bandwidth, and that's the biggest reason right there.
 
Interesting question: why can't they make something like the Intel Quad Cores or Core 2 Duo, but on a GPU? Wouldn't that make it handle more streams of data quickly?

Bah, that's old news. Good 'ole Voodoo 5. ;)
 
Tri-SLI is worthless: three cards to gain 2x performance? WTF! Not to mention the money you have to spend and the amount of heat it produces. I think I'll pass :)

According to the link provided, the performance was almost tripled. It was at least twice as fast as a single card, and then some.
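A quick sketch of what "almost tripled" means as per-card scaling efficiency. The FPS numbers below are hypothetical placeholders, not figures from the article:

```python
# Hypothetical single-card vs triple-SLI framerates (placeholders only).
single = 20.0
triple = 55.0  # an "almost tripled" result

speedup = triple / single        # overall speedup vs one card
efficiency = speedup / 3         # fraction of ideal 3x scaling achieved
print(f"speedup: {speedup:.2f}x, per-card efficiency: {efficiency:.0%}")
```

Anything much above 2x for three cards was notable at the time; 2x for two cards would be perfect scaling, so "2.x for three" is where the worthless-or-not argument lives.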

The GPU has been multicore for years.

Other than that, the only option is a GX2 with multiple GPUs on one board, like they did with the 7-series cards. But that has its drawbacks: huge power requirements, limited area for components, and a single PCI Express slot simply doesn't have the bandwidth, and that's the biggest reason right there.

In a sense yes.

So SLI works decently with Crysis now? Does CrossFire work well?

SLI is getting better in regard to Crysis performance. CrossFire still doesn't work correctly, if at all, in that particular game, if I'm not mistaken. Though I'm sure a time will come when that will change. Hopefully soon.

Bah, that's old news. Good 'ole Voodoo 5. ;)

Indeed. :D
 
That concept doesn't work the same on GPUs as it would with CPUs. Someone explained it to me pretty well once, but now I can't think of it. I hate how people think that's just the solution to everything.



B3D forums have some great threads on 'multi-core' GPUs and what to expect, as well as how it will be damn hard to implement, even with the best of Nvidia's or AMD's engineers plugging away at it. :( It's going to be insanely complicated and expensive.

Might be better to keep going with a monolithic die (1 billion+ transistors) for a while yet...
 
Any links about Tri-SLI where there are two PCIe 2.0 slots and one PCIe 1.0 slot? I think the idea was that one card would be used for physics or something to that end.
 
Any links about Tri-SLI where there are two PCIe 2.0 slots and one PCIe 1.0 slot? I think the idea was that one card would be used for physics or something to that end.

On a PCIe 2.0-compliant motherboard, all the PCIe slots are 2.0-compliant. Additionally, the cards being produced now are designed with a PCIe 1.0/1.0a interface, so throwing them into PCIe 2.0 slots won't change anything.
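For reference, the per-lane bandwidth difference between the two generations can be worked out from the signaling rates: 2.5 GT/s for PCIe 1.0/1.0a and 5 GT/s for PCIe 2.0, both using 8b/10b encoding (10 bits on the wire per data byte):

```python
# Per-lane, per-direction PCIe bandwidth after 8b/10b encoding overhead.
def lane_mb_s(gt_per_s):
    # GT/s -> bytes/s: divide by 10 wire bits per data byte, then to MB/s
    return gt_per_s * 1e9 / 10 / 1e6

for gen, rate in (("1.0a", 2.5), ("2.0", 5.0)):
    per_lane = lane_mb_s(rate)
    print(f"PCIe {gen}: {per_lane:.0f} MB/s per lane, "
          f"x16 slot: {per_lane * 16 / 1000:.0f} GB/s per direction")
```

So an x16 PCIe 1.0a slot offers roughly 4 GB/s per direction and a 2.0 slot roughly 8 GB/s, but a card with a 1.0a interface only ever negotiates the lower rate regardless of the slot.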
 
On a PCIe 2.0-compliant motherboard, all the PCIe slots are 2.0-compliant. Additionally, the cards being produced now are designed with a PCIe 1.0/1.0a interface, so throwing them into PCIe 2.0 slots won't change anything.
Have you heard of a setup where you have two same-model cards linked and a third, different-model card in the configuration?
 
I'm curious... I have the 680i MB, but as I recall, the third GPU slot is x8 while the two "normal" SLI GPU slots are x16, correct? I'm not sure if that screws anything up or not...

I did some figuring this afternoon...
1) New 8800 GTX BFG OC = $479 after rebate
2) New PSU, Ultra X3 1000W = $249
3) New case (mine would be too small), CoolerMaster Stacker 830 = $179
4) Four new 120mm fans = $40
5) Tri-SLI bridge = $10

Total to minimally upgrade = $957... holy guacamole. :eek:

And my sound card would have to go. :mad:
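A quick sanity check that the line items above do add up:

```python
# Upgrade cost breakdown from the post above, in dollars.
parts = {
    "8800 GTX BFG OC (after rebate)": 479,
    "Ultra X3 1000W PSU": 249,
    "CoolerMaster Stacker 830 case": 179,
    "four 120mm fans": 40,
    "tri-SLI bridge": 10,
}
total = sum(parts.values())
print(f"total: ${total}")  # total: $957
```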
 
I'm curious... I have the 680i MB, but as I recall, the third GPU slot is x8 while the two "normal" SLI GPU slots are x16, correct? I'm not sure if that screws anything up or not...

I did some figuring this afternoon...
1) New 8800 GTX BFG OC = $479 after rebate
2) New PSU, Ultra X3 1000W = $249
3) New case (mine would be too small), CoolerMaster Stacker 830 = $179
4) Four new 120mm fans = $40
5) Tri-SLI bridge = $10

Total to minimally upgrade = $957... holy guacamole. :eek:

And my sound card would have to go. :mad:

Yes, the middle PCIe x16 slot actually is only an x8 slot, electrically speaking. I'm currently running an x4 PCIe RAID controller in mine. The nice thing is that since my cards are water cooled they are single-slot solutions, and I can use all my PCI and PCIe slots.

If I pull that, I can run three 8800GTXs; however, I would have to get another waterblock for the third 8800GTX, and that's another $100 because I would have to get one that matched the ones I already have. (I'm a perfectionist.)

So I'd be looking at a $500 upgrade to tide me over for just over a month. I think I'll pass and I'll be waiting to see what D9E looks like in retail form.
 
Have you heard of a setup where you have two same-model cards linked and a third, different-model card in the configuration?

Yes, and currently there is little to no support for GPU physics. I researched this a bit when I tried it (two 8800GTS cards and an 8400GS):

[Attached image: threevid.jpg]
 
The OP's link talks about a patch for Crysis optimized for Tri-SLI. I wouldn't be surprised if Crysis also gets a nice GPU physics patch, considering it has a number of command-line settings for sys_GPU_physics.

Now that would be interesting. So does regular SLI (two video cards) work with Crysis now? Do you see a big performance boost compared to one card, or is there still not much benefit? Possibly they skipped two and fixed it for three? lol
 
Now that would be interesting. So does regular SLI (two video cards) work with Crysis now? Do you see a big performance boost compared to one card, or is there still not much benefit? Possibly they skipped two and fixed it for three? lol

I got a nice improvement with SLI in Crysis running on XP... the demo was about 25 FPS, and when the retail came out with the 169.09 drivers I averaged 37 FPS, running settings as high as XP allowed with no AA at 1920x1200.
I was disappointed in Vista: I ran high settings, no AA, at 1600x1200 and rarely went over 26 or 27 FPS.

I'm hoping the new patch will make an improvement, unless you're right and Crytek and Nvidia are conspiring to make tri-SLI the norm.
 
Yes, the middle PCIe x16 slot actually is only an x8 slot, electrically speaking. I'm currently running an x4 PCIe RAID controller in mine. The nice thing is that since my cards are water cooled they are single-slot solutions, and I can use all my PCI and PCIe slots.

If I pull that, I can run three 8800GTXs; however, I would have to get another waterblock for the third 8800GTX, and that's another $100 because I would have to get one that matched the ones I already have. (I'm a perfectionist.)

So I'd be looking at a $500 upgrade to tide me over for just over a month. I think I'll pass and I'll be waiting to see what D9E looks like in retail form.

I'd have to give up my X-Fi PCI. And that middle card would be too long and hit my drive cage... so I'd need a Stacker.
I saw an X-Fi PCIe card on Creative's site that I could install in the top PCIe x1 slot, but I can't imagine it would have the same quality as the Extreme Music I'm using now.
So I would have to do as you did and water cool my three cards... arrrrgh.
February isn't that far off; I'm with you.
 
The OP's link talks about a patch for Crysis optimized for Tri-SLI. I wouldn't be surprised if Crysis also gets a nice GPU physics patch, considering it has a number of command-line settings for sys_GPU_physics.

Once the patch comes out, I might have to try it. :)
 