Is the DFI SLI-DR Expert SLI X16?

Status
Not open for further replies.
SLI mode
- Use 2 SLI-ready PCI Express x16 graphics cards (identical cards) on the PCI Express x16 slots.
- Each x16 slot operates at x8 bandwidth. When the graphics cards are connected via the SLI bridge, the total bandwidth of the two graphics cards is x16.

So they'd operate at x8 if you're in SLI.
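To put rough numbers on the x8 vs. x16 distinction: PCIe 1.x carries about 250 MB/s per lane, per direction, after 8b/10b encoding overhead. A quick back-of-the-envelope sketch (the 250 MB/s figure is the usual approximation, not from DFI's specs):

```python
# PCIe 1.x usable bandwidth per lane, per direction (approximate,
# after 8b/10b encoding overhead)
MB_PER_LANE = 250

def slot_bandwidth(lanes):
    """Approximate usable bandwidth in MB/s for a PCIe 1.x slot `lanes` wide."""
    return lanes * MB_PER_LANE

# Single-card mode: one slot at x16
print(slot_bandwidth(16))       # 4000 MB/s

# SLI mode on the regular NF4 SLI chipset: the 16 lanes split x8/x8
print(slot_bandwidth(8))        # 2000 MB/s per card
print(2 * slot_bandwidth(8))    # 4000 MB/s combined, same total as one x16 slot
```

Which is exactly why the specs say the two x8 slots "total" x16: the combined bandwidth is no more than a single card gets by itself. A true 2x16 board would give each card the full 4000 MB/s.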
 
obsolete said:
SLI mode
- Use 2 SLI-ready PCI Express x16 graphics cards (identical cards) on the PCI Express x16 slots.
- Each x16 slot operates at x8 bandwidth. When the graphics cards are connected via the SLI bridge, the total bandwidth of the two graphics cards is x16.

So they'd operate at x8 if you're in SLI.


Not to be an ass, but we know that's how regular SLI works. I don't think he realizes that there's a new SLI chipset that'll run x16 on both PCIe connectors at once.

I would like to see more evidence one way or another. I think that if that fucker costs 200 bones, it should fucking have the new x16 sli, whether or not that actually does something.

If you knew that, obsolete, then sorry. It just kind of looked like you didn't know what he meant.
 
^^^Sorry to burst your bubble, but yes: the DFI Expert at $200 runs SLI only at 2 x 8 lanes, not 2 x 16, as stated in the specs on DFI's site. And if you notice, it uses only one chipset, not the two used by the current 2 x 16-lane boards from Asus and MSI (shipping soon, I think).

http://us.dfi.com.tw/Product/xx_product_spec_details_r_us.jsp?PAGE_TYPE=US&PRODUCT_ID=3872&SITE=US
To quote DFI's specs:
"SLI / Single VGA Mode

# SLI mode
- Use 2 SLI-ready PCI Express x16 graphics cards (use identical cards) on the PCI Express x16 slots.
- Each x16 slot operates at x8 bandwidth. When the graphics cards are connected via the SLI bridge, the total bandwidth of the two graphics cards is x16.

# Single VGA mode
- 1 PCI Express graphics card on the PCIE1 slot operates at x16 bandwidth.
- The other PCI Express x16 slot (PCIE4) operates at x2 bandwidth."

Amazing what you can find if you read the manufacturer's specs on their website, wow.
 
2x16 bandwidth does nothing for you. Sure, the Asus "seems" to perform better with the extra video card bandwidth, but if you check the benches, it also performs noticeably (about 5%) better with a single video card (which runs at x16, like on the rest of the boards). That suggests it probably isn't the dual x16 slots giving you the extra performance, but the chipsets used instead of the regular NF4 SLI one.

That being said, the DFI Expert is a better OCer. If you're using WC or phase, you should check out the DFI.
 
I agree that 2x16 really isn't worth the price premium right now, but the DFI Expert seems to have some price gouging going on for now at $200. Unless you need the extra space between the video cards in SLI, and don't mind moving jumpers to switch into SLI and set the memory voltage rather than doing it in the BIOS, get the DFI SLI-DR. Then, down the road, if and when 2x16 makes a difference, upgrade. I'd love to get the Expert, but the extra 40 to 50 bucks between it and the SLI-DR doesn't seem justified.

54YW4T, you could also just mod your Ultra-D to do SLI and buy an SLI bridge off the net. You won't get 100% of the same performance, but it'd do until things pan out. Heck, by that time AMD's new socket will be out and we'll all be screwed into upgrading to the next best thing :D

Here's a Tyan bridge from Monarch computer for 26 bucks shipped:
http://www.monarchcomputer.com/Merchant2/merchant.mv?Screen=PROD&Store_Code=M&Product_Code=270461
 
Are there benches showing that the Ultra-D doesn't perform at 100% of the SLI-D? Just wondering, because AFAIK once you mod the Ultra-D it essentially becomes the SLI-D, even at the BIOS level.
 
Here's an article with benches for the NF4 Ultra-D modded for SLI (a little less than normal SLI):
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2322

However, I'm not sure whether Nvidia has since disabled the mod in their drivers, after seeing the above article again and this update in their original Ultra review below. Do any modded Ultra-D users out there know whether SLI is indeed disabled on the modded chipset with an SLI-D BIOS loaded?

http://www.anandtech.com/mb/showdoc.aspx?i=2337

From Anandtech (whether the Ultra-D can be modded may also depend on when it was made):

"UPDATE 2/05/2005: nVidia has acted to prevent, or at least make it more difficult, to mod the Ultra board to SLI. First, DFI has advised us, and posted on their website, that they will NOT sell the SLI bridge to buyers of the Ultra board. Second, nVidia has advised us that future shipments of the Ultra chipset have been modified so that the mod to SLI will no longer be possible. An additional side effect of this second action is that the "Dual Video" mode, which performs at about 90% of SLI performance levels, will only work with nVidia SLI drivers 66.75 or earlier. If you do a quick check of web driver postings you will see it is now very difficult to find 66.75 drivers. With a chipset modded to SLI the "Dual Video" mode worked through 70.xx versions of the nVidia driver. nVidia also made it clear they will continue to make driver changes to prevent functioning of any "non-standard" (8X/8X) operation of their SLI driver. This also throws into question whether the VIA "dual graphics" mode on the 894 Pro chipset will ever work with nVidia graphics cards. If you are interested in the current UT Ultra-D we suggest you buy one now if you can find it. Future versions of the UT Ultra-D will not have the same capabilities as a result of these actions."
 
That review used an Ultra-D running in mismatched SLI (x2 and x16)... basically driver SLI. If you physically mod the chipset, you get true SLI with no performance difference (since the mod tricks the chipset into thinking it's the NF4 SLI instead of the Ultra chipset). You also need to come up with an SLI bridge.
 
I copied and pasted a review for the DFI Expert and just restated the fact that it can't do dual x16, so I wasn't being an ass; I was answering what I thought his question was.

apHytHiaTe said:
Not to be an ass, but we know that's how regular SLI works. I don't think he realizes that there's a new SLI chipset that'll run x16 on both PCIe connectors at once.

I would like to see more evidence one way or another. I think that if that fucker costs 200 bones, it should fucking have the new x16 sli, whether or not that actually does something.

If you knew that, obsolete, then sorry. It just kind of looked like you didn't know what he meant.
 
kirbyrj said:
That review is from an Ultra-D that was running in mismatched SLI (x2 and x16)...basically driver SLI. If you physically mod the chipset, you get true SLI with no performance difference (since it tricks the chipset into thinking it's the NF4 SLI instead of the Ultra chipset). You also need to come up with a SLI bridge.

My bad, you're totally right; there is no difference at all when modded. I misread the graphs: I took the "Ultra"-labeled stats to mean the modded chipset, when they were really the x2 and x16 setup. If this mod still works, then an Ultra-D at, say, 119 bucks at some places is a great bargain with a little conductive paint. But then again, if you don't already own an Ultra-D, it's 25 bucks more for a bridge, and you might as well buy an SLI-D, which I think is selling at around 140 anyway.
 
NACZ3 said:
My bad, you're totally right; there is no difference at all when modded. I misread the graphs: I took the "Ultra"-labeled stats to mean the modded chipset, when they were really the x2 and x16 setup. If this mod still works, then an Ultra-D at, say, 119 bucks at some places is a great bargain with a little conductive paint. But then again, if you don't already own an Ultra-D, it's 25 bucks more for a bridge, and you might as well buy an SLI-D, which I think is selling at around 140 anyway.

Agreed...buying the SLI bridge for $20 kills the whole point of modding to save money ;). Plus, you have to take off the HSF, scrape away the epoxy, connect some bridges with a conductive pen, remount the HSF, and then buy the SLI bridge.
 
apHytHiaTe said:
Not to be an ass, but we know that's how regular SLI works. I don't think he realizes that there's a new SLI chipset that'll run x16 on both PCIe connectors at once.

I would like to see more evidence one way or another. I think that if that fucker costs 200 bones, it should fucking have the new x16 sli, whether or not that actually does something.

If you knew that, obsolete, then sorry. It just kind of looked like you didn't know what he meant.


Umm, this response is uncalled for. Not to be an ass, but nobody cares what you "think" it should do versus how much it costs. It is NOT an SLI x16 motherboard, meaning it does NOT use the new C51D chipset. Why don't you spend 3 minutes of your time on the DFI website and do some research?
 