Motherboard that supports 6 x8 PCIe slots

According to the website you posted:

7 x PCIe 3.0/2.0 x16 (single x16 or dual x16/x16 or triple x16/x16/x16 or quad x16/x16/x16/x16 or seven x16/x8/x8/x8/x8/x8/x8)

And then in the manual...

[Attached screenshot from the manual: Capture.JPG]


That CPU isn't technically supported according to the CPU support list, and obviously 6 x8 = 48 PCIe lanes are needed; I'm also not sure how the 24 chipset lanes get split up.
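
To spell out the math I'm doing in my head (a rough sketch; the 44-lane figure and the DMI detail are my assumptions, not something from the manual):

```python
# Rough PCIe lane budget for a 6-GPU rig (my assumptions, not the board's real topology).
GPUS = 6
LANES_PER_GPU = 8      # the advice is x8 per card
CPU_LANES = 44         # a 44-lane HEDT CPU; cheaper SKUs only offer 28
CHIPSET_LANES = 24     # hang off the chipset, shared with SATA/USB/NICs

needed = GPUS * LANES_PER_GPU
print(f"GPU lanes needed: {needed}")                # 48
print(f"CPU lanes available: {CPU_LANES}")          # 44
print(f"Shortfall without a PLX switch or 2nd CPU: {max(0, needed - CPU_LANES)}")  # 4
# The 24 chipset lanes don't really help here: they all funnel through one
# DMI uplink back to the CPU, so they aren't suited to bandwidth-hungry GPU slots.
```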
 
The more I think about it, the less likely it seems that there's even a single-CPU option. Do I need to go with a dual-processor system?

I found this post about scaling up GPUs:
https://www.quora.com/Can-I-double-the-PCIe-lanes-in-a-dual-CPU-motherboard

They mention Supermicro X9DRH-7TF, which is an $800 board :/
https://www.supermicro.com/products/motherboard/xeon/c600/x9drh-7tf.cfm

I also notice it has x8 slots, so I'm guessing I would have to get these risers to attach the x16 GPUs:
https://www.moddiy.com/products/PCI%2dExpress-PCI%2dE-8X-to-16X-Riser-Card-Flexible-Ribbon-Extender-Cable-w{47}Molex-%2b-Solid-Capacitor.html

This is getting crazy.
 
What is the use case for the PC you're building?
 
Oh wait a second, I just came across this article:
https://www.cgdirector.com/best-hardware-for-gpu-rendering-in-octane-redshift-vray/

They mention that the ASUS WS X299 Sage does support up to 7 GPUs at x8 speed with a 44-lane processor, by using PLX chips to fan the CPU's lanes out across more slots.
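
As I understand it (my own sketch of the idea, not ASUS's description), the PLX switch doesn't add CPU lanes; it fans a narrower upstream link out to more downstream slots, so the slots share upstream bandwidth when they all talk to the CPU at once:

```python
# Sketch of PLX/PEX switch oversubscription on a 44-lane CPU (assumed figures).
CPU_LANES = 44
SLOTS = 7
LANES_PER_SLOT = 8

downstream = SLOTS * LANES_PER_SLOT            # 56 electrical lanes at the slots
print(f"Downstream: {downstream} lanes, upstream: {CPU_LANES} lanes")
print(f"Oversubscription: {downstream / CPU_LANES:.2f}x")   # ~1.27x
# Each card still links at x8 to the switch; the shared upstream only matters
# when several cards transfer scene data to/from the CPU at the same time.
```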

The build is for rendering with Redshift. When I asked on the forums over there, they said the cards need x8 PCIe lanes per card.
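
Once the thing is built, I figure I can confirm each card actually negotiated an x8 link with a quick script like this (assumes the NVIDIA driver's nvidia-smi tool is on the PATH; just my own sanity check, nothing Redshift-specific):

```python
# Check that every GPU negotiated at least an x8 PCIe link (needs nvidia-smi installed).
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,name,pcie.link.width.current,pcie.link.gen.current",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.strip().splitlines():
    idx, name, width, gen = (field.strip() for field in line.split(","))
    status = "OK" if int(width) >= 8 else "BELOW x8!"
    print(f"GPU {idx} ({name}): x{width} gen {gen} -> {status}")
```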

Right now I'm leaning towards an ASUS WS X299 Sage + Intel i7-9800X. I also got a suggestion to make the RAM twice the total VRAM (6 x 8GB x 2 = 96GB), which rounds up to 128GB total, plus the power supplies.
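
For reference, the RAM arithmetic (the "twice total VRAM" rule is the forum suggestion; rounding up to a standard kit size is my own call):

```python
# "System RAM = 2x total VRAM" rule of thumb, rounded up to a common kit size.
gpus = 6
vram_per_gpu_gb = 8

total_vram = gpus * vram_per_gpu_gb          # 48 GB
suggested = 2 * total_vram                   # 96 GB
common_kits = [32, 64, 128, 256]             # assumed typical kit sizes
kit = next(size for size in common_kits if size >= suggested)
print(f"Total VRAM {total_vram} GB -> 2x rule {suggested} GB -> buy {kit} GB")  # 128 GB
```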

I was hoping that by building a specialized rig for GPU rendering I could optimize the cost, after seeing what crypto miners are doing, but it looks like the only setups that support more than 4 GPUs for rendering are specialized HEDT systems.
 
Ooh, actually the processor specs say non-ECC :/

The ECC RAM may work, but it won't work in ECC mode with an Intel CPU without a 'workstation' chipset on the board. If ECC is important, then the above-mentioned C422 board would be necessary, or you could consider Threadripper. If not, then move on :).
 
You'll be very hard pressed to find a motherboard that has 6 x8 PCIe slots and doesn't use a PLX chip; you're basically looking at a dual-socket solution on Intel or Epyc on AMD. Whichever solution you go with, unless you watercool the cards you're going to need PCIe risers and a mining chassis to hold them, as almost all modern GPUs are at least 2-slot designs.

Do you really want/need 6 GPUs? Would 4 not be an easier and much cheaper option?

As an FYI, my Asus Z9PE-D8 WS will do x8, x8, x8, x8, x16, x8, x16 if I populate both sockets and all 7 PCIe slots. With 4 slots occupied it does x16, 0, x16, 0, x16, x8, x16.
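
Quick sum for context (my arithmetic, assuming two 40-lane E5 Xeons, which is what that platform takes):

```python
# Lane total for the Z9PE-D8 WS slot layout quoted above, vs. two 40-lane Xeons.
full_config = [8, 8, 8, 8, 16, 8, 16]    # both sockets populated, all 7 slots in use
cpu_lanes = 2 * 40                       # assumed 40 lanes per E5 CPU

print(sum(full_config), "slot lanes vs", cpu_lanes, "CPU lanes")   # 72 vs 80
# Fits without any switch chip; the slots just split across the two sockets,
# which is why the full x8/x16 mix only appears with both CPUs installed.
```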
 
Are you open to server boards?

Plenty of dual-socket Supermicro boards have configurations like this. On server boards, x8 slots seem to be the preferred size.
 
The deal I was looking at was 6x 1070s for $1320, but after doing a bunch of research I realized I'd be dropping another $3k to assemble a system that would support them.

Also, it would easily be outpaced by half as many RTX 2070s once render engines optimize for those, so I turned it down :/

Would have been a fun project though.
 