First Nvidia dual-GPU card

rayman2k2 said:
remember the XGI Volari Duo?


yeah...

Shitty drivers more than anything else...

What I don't understand about this card, however, is that two regular 6600GTs in SLI are slower than a single 6800GT. Now just because they put two GPUs on a single card, it makes them faster than a 6800 Ultra? Is the latency that bad?
 
Jason711 said:
Shitty drivers more than anything else...

What I don't understand about this card, however, is that two regular 6600GTs in SLI are slower than a single 6800GT. Now just because they put two GPUs on a single card, it makes them faster than a 6800 Ultra? Is the latency that bad?
Exactly what I was thinking.
 
Two GPUs and 256MB of 256-bit memory. Wouldn't that mean each GPU is running 128MB of that 256-bit memory?

That means the memory is a lot better than the memory your average 6600 is packing...

Unless I'm just completely wrong.
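For what it's worth, here's a rough back-of-the-envelope comparison (a sketch assuming stock 6600 GT memory clocks and treating the quoted 256-bit as two independent 128-bit buses, one per GPU; the actual 3D1 clocks aren't confirmed here):

```python
# Back-of-the-envelope peak memory bandwidth.
# Assumptions (not confirmed specs): 1000 MHz effective GDDR3 and a
# 128-bit bus per GPU, i.e. the 3D1's 256-bit / 256 MB total is really
# two independent 128-bit / 128 MB pools, one per GPU.

def peak_bandwidth_gbs(bus_bits: int, eff_clock_mhz: int) -> float:
    """Peak bandwidth in GB/s = bytes per transfer * transfers per second."""
    return (bus_bits / 8) * (eff_clock_mhz * 1e6) / 1e9

stock_6600gt = peak_bandwidth_gbs(128, 1000)  # one GPU, its own 128-bit bus
per_gpu_3d1 = peak_bandwidth_gbs(128, 1000)   # each 3D1 GPU sees the same

print(f"stock 6600 GT : {stock_6600gt:.0f} GB/s")
print(f"3D1 per GPU   : {per_gpu_3d1:.0f} GB/s")
print(f"3D1 combined  : {2 * per_gpu_3d1:.0f} GB/s")
```

On paper, then, each GPU would see the same bandwidth as a regular 6600 GT rather than better memory; the "256-bit" figure is just the two 128-bit buses added together.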
 
This would not only be almost twice the performance of a regular 6600 GT card, but also more than ATI's Radeon X850 XT Platinum Edition, which achieved in Gigabyte's test environment 13,271 points and Nvidia's GeForce 6800 Ultra, which posted 12,680 points.
Umm... whoa :eek:
I know benchmarks don't mean much to most of you guys, but that line kinda hit me, lol. But it does look like it will cost more. Not really sure, though.

RAutrey said:
What's with all the arousing talk? This ain't no pr0n site. :D
Haha, I actually laughed out loud at this comment. :p
 
Regardless of the performance of SLI video cards, I still can't help but feel that a dual-GPU design is ultimately a better idea. It's nice to know that you don't have to buy two SLI-compatible cards to get the most out of graphics performance. You don't have to buy a specialized SLI motherboard (although granted, we love doing that), and who cares if it's cheating Moore's Law? It's making the most of existing hardware. And uhhh... I dunno, I'm drawing a blank. I guess ultimately I see no reason to knock it.
 
tshen_83 said:
if the Voodoo 5 wasn't a failure (compared to Nvidia's TNT2 and GeForce), 3dfx wouldn't have gone out of business...


If this were true, then ATI and Nvidia would have gone under a long time ago. 3dfx didn't go under because of one card idea. It's called bad management.

To the post above this, this dual 6600GT requires SLI.
 
MeanieMan said:
If this were true, then ATI and Nvidia would have gone under a long time ago. 3dfx didn't go under because of one card idea. It's called bad management.

To the post above this, this dual 6600GT requires SLI.


It is SLI, but it doesn't require that you have two PCI-E slots, only one.
 
draksia said:
It is SLI, but it doesn't require that you have two PCI-E slots, only one.


Granted, the card only requires one PCI-E slot, but it still requires the SLI nForce4 chipset. How many motherboards are you going to find with the SLI nForce4 chipset but only one PCI-E slot?
Even the Tom's Hardware article shows the card in a dual-slot motherboard.
 
MeanieMan said:
Granted, the card only requires one PCI-E slot, but it still requires the SLI nForce4 chipset. How many motherboards are you going to find with the SLI nForce4 chipset but only one PCI-E slot?
Even the Tom's Hardware article shows the card in a dual-slot motherboard.

Err, no. What gave you that idea?

==>Lazn
 
Lazn_Work said:
Err, no. What gave you that idea?

==>Lazn

Tom's Hardware quote:
"Gigabyte will announce Friday a graphics card running two graphics processors on one board. According to sources, the SLI card will lift current 3DMark2003 record levels by a significant margin while being priced lower than ATI's and Nvidia's single-GPU high-end cards."

Another quote:
"The 3D1's two processors communicate through Nvidia's SLI interface and achieved 14,293 points in 3DMark2003, sources at Gigabyte said.



How many companies make "Nvidia's SLI"?
I know there are several levels of nForce4 chipsets, but from what I've seen, only the SLI-branded one supports SLI.
 
MeanieMan said:
Tom's Hardware quote:
"Gigabyte will announce Friday a graphics card running two graphics processors on one board. According to sources, the SLI card will lift current 3DMark2003 record levels by a significant margin while being priced lower than ATI's and Nvidia's single-GPU high-end cards."

Another quote:
"The 3D1's two processors communicate through Nvidia's SLI interface and achieved 14,293 points in 3DMark2003, sources at Gigabyte said.



How many companies make "Nvidia's SLI"?
I know there are several levels of nForce4 chipsets, but from what I've seen, only the SLI-branded one supports SLI.

Any chipset with TWO slots that are capable of running video cards and are close enough together for the SLI connector will run SLI. In fact, Xeon motherboards with Intel chipsets have been running SLI video cards for some time.

NVIDIA's SLI interface is the little connector on the top of SLI-capable video cards, and has NOTHING to do with the chipset.

==>Lazn
 
Lazn_Work said:
Any chipset with TWO slots that are capable of running video cards and are close enough together for the SLI connector will run SLI. In fact, Xeon motherboards with Intel chipsets have been running SLI video cards for some time.

NVIDIA's SLI interface is the little connector on the top of SLI-capable video cards, and has NOTHING to do with the chipset.

==>Lazn

"Any chipset with two slots" is right: Alienware has SLI right now with an Intel chipset on their X2 motherboard. The space between the slots doesn't matter; some boards have them 2 slots apart, others 3.
Some SLI boards don't even use the SLI connector piece, and this card doesn't use one either. Rumors on the 1GB transfer rate of these boards are all over the place.

And the number of video cards supported has almost everything to do with the chipset; Nvidia's nForce4 SLI is just the best buy. Have you seen the price of Intel's version? :eek:

If you are using your wallet and brain when shopping for a dual-card solution, the nForce4 SLI is the answer. This dual 6600GT won't work with a chipset that doesn't support dual PCI-Express cards.
 
MeanieMan said:
"Any chipset with two slots" is right: Alienware has SLI right now with an Intel chipset on their X2 motherboard. The space between the slots doesn't matter; some boards have them 2 slots apart, others 3.
Some SLI boards don't even use the SLI connector piece, and this card doesn't use one either. Rumors on the 1GB transfer rate of these boards are all over the place.

And the number of video cards supported has almost everything to do with the chipset; Nvidia's nForce4 SLI is just the best buy. Have you seen the price of Intel's version? :eek:

If you are using your wallet and brain when shopping for a dual-card solution, the nForce4 SLI is the answer. This dual 6600GT won't work with a chipset that doesn't support dual PCI-Express cards.

The SLI connector is "Nvidia's SLI interface"; that is how those two GPUs on the Gigabyte board are communicating. The connection is built into the board itself (with both GPUs on one card, the pins that normally go to that connector run directly to the corresponding pins on the other GPU).

As for the boards that don't use that interface, that is called XLI, and it is slower than using the connector for normal SLI: http://www.hexus.net/content/reviews/review.php?dXJsX3Jldmlld19JRD05NTY=

And because the connector between the two video cards is provided by the motherboard manufacturer, they can adjust the distance between the two video PCI-E slots (like you said, some make them 2 apart, some 3). See: http://tech-report.com/reviews/2004q4/sli/index.x?pg=1

My point is that this Gigabyte SLI-on-one-card video card will work on any PCI-E motherboard that has ONE video PCI-E slot: any chipset, any manufacturer. Not just Nvidia nForce4 SLI motherboards, but any PCI-E motherboard.

==>lazn
 
Lazn_Work said:
The SLI connector is "Nvidia's SLI interface"; that is how those two GPUs on the Gigabyte board are communicating. The connection is built into the board itself (with both GPUs on one card, the pins that normally go to that connector run directly to the corresponding pins on the other GPU).

As for the boards that don't use that interface, that is called XLI, and it is slower than using the connector for normal SLI: http://www.hexus.net/content/reviews/review.php?dXJsX3Jldmlld19JRD05NTY=

And because the connector between the two video cards is provided by the motherboard manufacturer, they can adjust the distance between the two video PCI-E slots (like you said, some make them 2 apart, some 3). See: http://tech-report.com/reviews/2004q4/sli/index.x?pg=1

==>lazn


After you just rewrote what I posted, I figured something out. You actually understand that this card will only work with a supporting chipset (the original reason I posted was to correct someone saying it didn't).
Our terminology is just different, something that happens a lot with products that aren't even released yet.
 
MeanieMan said:
After you just rewrote what I posted, I figured something out. You actually understand that this card will only work with a supporting chipset (the original reason I posted was to correct someone saying it didn't).
Our terminology is just different, something that happens a lot with products that aren't even released yet.

Erm, no.

I was rewriting it, but it will work with ANY PCI-E motherboard, ANY chipset.

As long as it has at least one PCI-E video-capable slot.

==>Lazn
 
Lazn_Work said:
Erm, no.

I was rewriting it, but it will work with ANY PCI-E motherboard, ANY chipset.

As long as it has at least one PCI-E video-capable slot.

==>Lazn

Then you don't understand. It won't work with any PCI-Express motherboard. Why do you think Gigabyte is packaging a few with its SLI boards and not other boards?
The chipset has to support dual PCI-E slots in order for this card, using SLI, to work.

SLI by design takes a PCI-Express connection at 16x and splits it into two 8x lanes, since video cards still haven't saturated that bandwidth anyway.
In order to run two lanes, you need a chipset that supports two lanes.
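For context, the raw numbers behind that split (a quick sketch using first-generation PCI-E rates of 2.5 GT/s per lane with 8b/10b encoding, i.e. 250 MB/s per lane per direction):

```python
# First-generation PCI Express: 2.5 GT/s per lane, 8b/10b encoding,
# so 250 MB/s of usable bandwidth per lane, per direction.
MB_PER_LANE = 2.5e9 * (8 / 10) / 8 / 1e6  # = 250.0 MB/s

for lanes in (16, 8, 4, 1):
    print(f"x{lanes:<2} link: {lanes * MB_PER_LANE / 1000:.1f} GB/s per direction")

# x16 link: 4.0 GB/s per direction
# x8  link: 2.0 GB/s per direction -- what each card gets in SLI mode
```

Even at 8x, each card still has 2 GB/s each way, which is more than current cards actually use.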
 
MeanieMan said:
Then you don't understand. It won't work with any PCI-Express motherboard. Why do you think Gigabyte is packaging a few with its SLI boards and not other boards?
The chipset has to support dual PCI-E slots in order for this card, using SLI, to work.

No, you don't understand. It doesn't have to support dual PCI-E to work.

Gigabyte is packaging it with its newest motherboards to show both off.

Nvidia's SLI interface that it communicates through is built onto the card, or rather onto the GPUs.

I don't get where you think it needs an SLI motherboard from...

==>Lazn
 
MeanieMan said:
Then you don't understand. It won't work with any PCI-Express motherboard. Why do you think Gigabyte is packaging a few with its SLI boards and not other boards?
The chipset has to support dual PCI-E slots in order for this card, using SLI, to work.

SLI by design takes a PCI-Express connection at 16x and splits it into two 8x lanes, since video cards still haven't saturated that bandwidth anyway.
In order to run two lanes, you need a chipset that supports two lanes.

Then why does it work with motherboards that have one PCI-E 16x slot and one PCI-E 4x slot?

The splitting of lanes is not a requirement for SLI to work; that is just because current PCI-E chipsets don't have enough lanes for two 16x slots. It is a matter of ease of manufacturing, not that SLI requires it.

http://xtremesystems.org/forums/showthread.php?t=47627

==>Lazn
 
Lazn_Work said:
No, you don't understand. It doesn't have to support dual PCI-E to work.

Gigabyte is packaging it with its newest motherboards to show both off.

Nvidia's SLI interface that it communicates through is built onto the card, or rather onto the GPUs.

I don't get where you think it needs an SLI motherboard from...

==>Lazn

SLI needs two lanes to work; if the chipset can't support two PCI-Express lanes, it won't work. SLI just takes one PCI-Express slot and cripples it into one 8x lane; do that twice and you have PCI-Express at 16x. If any chipset could support two lanes, there would have been no need for three versions of the nForce4 chipset, or for Alienware to pick a server board for a gaming rig.
Only the bridge piece is built into the card, since it's only one card.
Why would Gigabyte show off a board that you won't even use completely? That's a lot of wasted money and card slots if you think about it.

The bridge piece only helps combine the image; you still need a chipset that can communicate with two PCI-Express lanes at one time.
 
bboynitrous said:
Yeah, I read about this a while ago. It's going to be dual 6600GT chips on a single board and run faster than a 6800 Ultra, all still at about the cost of a 6800GT. I think it's an awesome idea for people looking for a better bang for their buck :D

But dual 6600GT cards still lose to the GT and Ultra at most benchmarks at or above 1280x1024 and with any sort of AA...

This is probably going to be the only core available in this setup (considering the heat factor)... I wonder why the company went out of their way to do this...
 
DropTech said:
But dual 6600GT cards still lose to the GT and Ultra at most benchmarks at or above 1280x1024 and with any sort of AA...

This is probably going to be the only core available in this setup (considering the heat factor)... I wonder why the company went out of their way to do this...
Two 6600GTs beat the 6800GT at ALL resolutions with NO AA or AF, but once AA or AF is introduced, the 6800GT takes the lead.
 
MeanieMan said:
SLI needs two lanes to work; if the chipset can't support two PCI-Express lanes, it won't work. SLI just takes one PCI-Express slot and cripples it into one 8x lane; do that twice and you have PCI-Express at 16x. If any chipset could support two lanes, there would have been no need for three versions of the nForce4 chipset, or for Alienware to pick a server board for a gaming rig.
Only the bridge piece is built into the card, since it's only one card.
Why would Gigabyte show off a board that you won't even use completely? That's a lot of wasted money and card slots if you think about it.

The bridge piece only helps combine the image; you still need a chipset that can communicate with two PCI-Express lanes at one time.

Do you have any idea how PCI-E works? A PCI-E 16x slot is 16 1x lanes in one slot: 16 separate lanes. You can put a 1x card into a 16x slot, and 15 lanes will go unused. So any PCI-E 16x card communicates with 16 PCI-E 1x lanes at one time, much less just two. http://www.pcisig.com/specifications/pciexpress/
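To put that in code, a minimal sketch of how link width gets negotiated (a hypothetical model, not real driver or chipset code):

```python
# Hypothetical model of PCI-E link training: the link comes up at the
# widest width supported by BOTH the slot and the card, and any unused
# lanes simply go idle. A x1 card in a x16 slot just runs at x1.

SUPPORTED_WIDTHS = (16, 8, 4, 2, 1)

def negotiate_link(slot_lanes: int, card_lanes: int) -> int:
    """Return the negotiated link width for a card in a slot."""
    for width in SUPPORTED_WIDTHS:
        if width <= slot_lanes and width <= card_lanes:
            return width
    raise ValueError("no common link width")

print(negotiate_link(slot_lanes=16, card_lanes=16))  # 16 -- full x16
print(negotiate_link(slot_lanes=16, card_lanes=1))   # 1  -- x1 card, 15 lanes idle
print(negotiate_link(slot_lanes=4, card_lanes=16))   # 4  -- x16 card in a x4 slot
```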

See here how it works just fine with an Ultra, non-SLI chipset:
http://xtremesystems.org/forums/showthread.php?t=47627

==>Lazn
 
Just to resolve this, I went ahead and emailed Gigabyte the following question:

"Just a quick question, will the new Gigabyte 3D1 Video card mentioned here: http://www.tomshardware.com/hardnews/20041216_115811.html require a SLI motherboard, or because it is on a single PCI-E card will it work with any PCI-E motherboard that has a 16x video slot?
Thanks."

I will get back to you when I get a response.

==>Lazn
 
It still defies all logic that it is faster than a 6800U and an XTPE.
 
MeanieMan said:
SLI needs two lanes to work; if the chipset can't support two PCI-Express lanes, it won't work. SLI just takes one PCI-Express slot and cripples it into one 8x lane; do that twice and you have PCI-Express at 16x. If any chipset could support two lanes, there would have been no need for three versions of the nForce4 chipset, or for Alienware to pick a server board for a gaming rig.
Only the bridge piece is built into the card, since it's only one card.
Why would Gigabyte show off a board that you won't even use completely? That's a lot of wasted money and card slots if you think about it.

The bridge piece only helps combine the image; you still need a chipset that can communicate with two PCI-Express lanes at one time.

Hmm, you were right, but only because of Nvidia's screwing around.
http://www.anandtech.com/video/showdoc.aspx?i=2315
There is no technical reason why it would not work if Nvidia did not have SLI setups search for an SLI motherboard on boot:

"The reader should understand this before beginning the review: these solutions are somewhat limited in application until NVIDIA changes its philosophy on multi-GPU support in ForceWare drivers. In order to get any multi-GPU support at all, the driver must detect an SLI capable motherboard. This means that we had to go back to the 66.81 driver in order to test Intel SLI. It also means that even if the 3D1 didn't require a special motherboard BIOS in order to boot video, it wouldn't be able to run in SLI mode unless it were in an SLI motherboard. "

That sucks because, like I said, there is no technical reason for it not to work (with the older drivers it worked on the Intel MB), but Nvidia intentionally broke it to require an "SLI motherboard".

==>Lazn
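A minimal sketch of the gating the AnandTech quote describes (hypothetical logic; nothing here reflects actual ForceWare internals):

```python
# Hypothetical sketch of the multi-GPU gating the AnandTech review
# describes: the driver refuses SLI mode unless it recognizes the
# motherboard as SLI-capable, even when two GPUs are physically
# present and already linked on one card.

def sli_mode_allowed(gpu_count: int, board_is_sli_certified: bool) -> bool:
    """Newer ForceWare behavior, per the review: a board check gates SLI."""
    if gpu_count < 2:
        return False
    # At this point the technical requirements are already met; the
    # certification check below is a pure policy decision.
    return board_is_sli_certified

# The 3D1 on a plain Intel PCI-E board: two GPUs, one x16 slot...
print(sli_mode_allowed(gpu_count=2, board_is_sli_certified=False))  # False
# ...and the same card on an nForce4 SLI board:
print(sli_mode_allowed(gpu_count=2, board_is_sli_certified=True))   # True
```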
 
Duh, of course Nvidia would do that; it can't allow its video cards to run in SLI mode on mobos that were not certified by it for SLI. Otherwise it's gonna be a support nightmare for video card manufacturers and, in the end, Nvidia itself.
 