Best X399 board for 1 CPU - 4 GPU Multihead LAN Gaming workstation?

Archaea

[H]F Junkie
Joined
Oct 19, 2004
Messages
11,826
I watched one of the Linus Tech videos where he made a multiheaded LAN station off a single CPU with multiple GPUs using VMware.

I still have ten 1080 Tis from my mining adventures. I enjoy LAN parties, and I have kids who would enjoy playing computer games on such an outfit. I could make 4 identical LAN machines for less than what it'd cost to build 4 unique machines.

To this end, I'm investigating buying a Threadripper 1950X system so I can have four quad-core virtual machines with a 1080 Ti in each, probably with 4GB or 8GB of RAM assigned to each.

Anyone have any experience with such a setup? On the boards I've looked at, the PCI-E lanes are 3.0 x16 x 2 and x8 x 2 - not that big of a deal, but that's really the only minor fault I can see in building something like this.

It can be a LAN outfit, and it can mine while not gaming (paying itself off in a year or two) on the four 1080 Tis and the Threadripper.
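For planning purposes, here's a rough Python sketch (my own back-of-the-napkin math, not from the Linus video) of how the 1950X's 16 cores / 32 threads could be split into four 4-core/8-thread guests, printing the matching virsh vcpupin commands. The guest names are made up, and the assumption that thread N's SMT sibling is N+16 should be checked against /sys/devices/system/cpu/cpu*/topology/thread_siblings_list on the real box.

# Sketch only: split a 1950X into four 4-core/8-thread guests and print
# the "virsh vcpupin" commands that would pin each vCPU to a host thread.
VMS = ["lan1", "lan2", "lan3", "lan4"]   # hypothetical guest names
CORES_PER_VM = 4
SMT_OFFSET = 16                          # assumed sibling-thread offset on a 16-core part

for vm_index, vm in enumerate(VMS):
    first_core = vm_index * CORES_PER_VM
    for vcpu in range(CORES_PER_VM * 2):         # 8 vCPUs per guest (4 cores x 2 threads)
        core = first_core + vcpu // 2
        host_cpu = core if vcpu % 2 == 0 else core + SMT_OFFSET
        print(f"virsh vcpupin {vm} {vcpu} {host_cpu}")

Note that this hands every physical core to a guest and leaves nothing reserved for the host, so in practice you'd probably want to steal a core or two back from one of the VMs.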


Any particular x399 board you'd specifically recommend for this use case?
 
Having enough discrete USB controllers for all those logical systems (4 + 1 for a usable host) might be the bitch part. Software USB passthrough sucks for gaming; USB is something that gets overlooked in most VT-d builds. If you really get stumped by a need for additional USB or other controllers, there are ways to turn the M.2 slots into fully functional x4 slots.

For lane spacing, extra board power, and the longest history of VT-d and bifurcation support of any prosumer brand, I'd go with ASRock.
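If you want to see up front how many discrete, passable USB controllers a given board actually exposes, a quick sketch like this (my own, run on the live system, nothing board-specific assumed) lists every PCI USB controller and the IOMMU group it lands in - roughly one controller per group is what you want for clean pass-through:

# Sketch only: list PCI USB controllers and their IOMMU groups from sysfs.
import glob, os

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    with open(os.path.join(dev, "class")) as f:
        pci_class = f.read().strip()
    if not pci_class.startswith("0x0c03"):       # 0x0c03xx = USB host controller
        continue
    group_link = os.path.join(dev, "iommu_group")
    group = os.path.basename(os.readlink(group_link)) if os.path.islink(group_link) else "none"
    print(f"{os.path.basename(dev)}  class={pci_class}  iommu_group={group}")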
 
Looks like these are my options (4 full-length, properly spaced PCI-E x16 slots):
https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007625 601301150&IsNodeId=1&bop=And&Order=RATING&PageSize=36

The ASRock is the cheapest and has the highest user scores.

For USB connections I think you could either A) run a USB hub to each PC, or B) purchase monitors that have built-in USB hubs.

He's talking about the ability to pass through dedicated USB controllers to each specific VM, which you will be limited by on the X399 platform.
 
So this might be a deal breaker?

Unraid requires a dedicated video card to initialize at startup, according to Linus in his original 1 CPU, 2 GPU videos?

I don't think that's going to fit with four 1080 Tis.


EDIT --- OR MAYBE NOT?
 
With your mining adventures (are you selling/out of the game now as far as GPUs are concerned?) I assume price is not a major issue. I can't rule out some "gotcha" for this particular use case that I'm not thinking of, but have you looked into the Asus ROG Zenith Extreme? I understand it to be one of the best OCing boards, with a ton of features, and I know it supports 4 GPUs. When I was considering a (gaming+other) Threadripper build, it was at the top of my list. Other manufacturers had some decent highest-end offerings as well, but much like my previous experiences at that tier, Asus seemed to stand above the rest.
 
Having enough discrete USB controllers for all those logical systems (4 + 1 for a usable host) might be the bitch part. Software USB passthrough sucks for gaming; USB is something that gets overlooked in most VT-d builds. If you really get stumped by a need for additional USB or other controllers, there are ways to turn the M.2 slots into fully functional x4 slots.

For lane spacing, extra board power, and the longest history of VT-d and bifurcation support of any prosumer brand, I'd go with ASRock.

For USB I use this add on card : https://www.amazon.com/gp/product/B00XPUHO10/ref=oh_aui_detailpage_o08_s00?ie=UTF8&psc=1

It has 4 individual controllers, and each one can be passed through to a different VM. So you can hook up a hub to each port. Works great on my Zenith Extreme, although I don't know if he would run into spacing issues with that card + the 4 GPUs. I only run one extra GPU in my setup.

Also, I use regular KVM instead of unRAID. For X399 there are some PCI-e reset issues that need to be patched in the latest kernel; not sure if unRAID has this built in. See here: https://forum.level1techs.com/t/threadripper-reset-fixes/123937
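For what it's worth, splitting a card like that between guests under plain KVM/libvirt just means one <hostdev> PCI entry per VM. A sketch of generating those snippets and the matching virsh attach-device calls is below - the PCI addresses and VM names are made-up placeholders, so read the real ones from lspci:

# Sketch only: write a libvirt <hostdev> snippet per USB controller and print
# the virsh command that would cold-plug it into each guest's config.
ADDRESSES = ["0000:09:00.0", "0000:0a:00.0", "0000:0b:00.0", "0000:0c:00.0"]  # hypothetical

def hostdev_xml(addr):
    domain, bus, slotfunc = addr.split(":")
    slot, func = slotfunc.split(".")
    return (
        "<hostdev mode='subsystem' type='pci' managed='yes'>\n"
        "  <source>\n"
        f"    <address domain='0x{domain}' bus='0x{bus}' slot='0x{slot}' function='0x{func}'/>\n"
        "  </source>\n"
        "</hostdev>"
    )

for vm, addr in zip(["lan1", "lan2", "lan3", "lan4"], ADDRESSES):   # hypothetical guest names
    with open(f"usb-{vm}.xml", "w") as f:
        f.write(hostdev_xml(addr))
    print(f"virsh attach-device {vm} usb-{vm}.xml --config")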
 
So this might be a deal breaker?

Unraid requires a dedicated video card to initialize at startup, according to Linus in his original 1 CPU, 2 GPU videos?

I don't think that's going to fit with four 1080 Tis.


EDIT --- OR MAYBE NOT?

I don't think it has been an issue since 5.x/6.x; however, it's been a while since I moved away from Unraid. (Read and write speeds were less than ideal, though I loved the idea of their parity setup.) The one thing Unraid did was make that whole process a lot easier to deal with.

But I only remember having a single GTX 980 in my system at the time that was passed through to my Win10 gaming VM. My setup was an S2600CP2J/2x E5-2670s and a 3rd-party Japanese USB 3.0 controller passed through to the VM for gamepad, keyboard, and mouse.

I want to say that after you dedicate it to the VM and restart the system, Unraid sees that it is allocated for use elsewhere and does not initialize it for itself. Could be wrong though... Before that, you would have to use a cheapo card in the first PCIe slot (for Unraid), and any others could be passed to VMs.
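One easy way to sanity-check that behaviour on any hypervisor is to look at which driver each GPU is bound to after boot - the cards destined for guests should show vfio-pci (or a similar stub), and only the host's boot card should be on a normal display driver. A small sketch (my own, not Unraid-specific):

# Sketch only: list VGA-class PCI devices and the host driver currently bound to each.
import glob, os

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    with open(os.path.join(dev, "class")) as f:
        pci_class = f.read().strip()
    if not pci_class.startswith("0x0300"):       # 0x0300xx = VGA-compatible controller
        continue
    drv_link = os.path.join(dev, "driver")
    driver = os.path.basename(os.readlink(drv_link)) if os.path.islink(drv_link) else "(no driver bound)"
    print(f"{os.path.basename(dev)}  driver={driver}")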
 
For USB I use this add on card : https://www.amazon.com/gp/product/B00XPUHO10/ref=oh_aui_detailpage_o08_s00?ie=UTF8&psc=1

It has 4 individual controllers, and each one can be passed through to a different VM. So you can hook up a hub to each port. Works great on my Zenith Extreme, although I don't know if he would run into spacing issues with that card + the 4 GPUs. I only run one extra GPU in my setup.

Also, I use regular KVM instead of unRAID. For X399 there are some PCI-e reset issues that need to be patched in the latest kernel; not sure if unRAID has this built in. See here: https://forum.level1techs.com/t/threadripper-reset-fixes/123937

Nice, 4 dedicated controllers on one card. For a 4 GPU setup I would actually get an M.2 to x4 converter; this is one of those odd-ball builds where it makes sense. In a larger case with more than 8 PCI bracket mounts you could still have it directly attached to the back like any other card. I would expect a pretty big case and cooling setup for 4x 1080 Ti anyway.

Or, if at least one card is watercooled, the open-ended x1 slot in the middle is available; that is plenty of bandwidth for regular peripherals. The caveat is that those lanes come from the PCH, which sometimes complicates VT-d pass-through further. For the previous idea, the M.2 lanes come direct from the CPU on Threadripper [fuck yeah].

Regardless, this is an extremely complicated build, and he should be prepared for unforeseen issues/expenses.
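On the PCH point: the practical test is whether the slot shares an IOMMU group with a pile of other chipset devices, because everything in a group has to be handed to the guest together. A quick sketch for checking that (the default address is just a placeholder - pass the real slot/controller address as an argument):

# Sketch only: show everything sharing an IOMMU group with a given PCI device.
import os, sys

addr = sys.argv[1] if len(sys.argv) > 1 else "0000:03:00.0"   # hypothetical placeholder address
group = os.path.basename(os.readlink(f"/sys/bus/pci/devices/{addr}/iommu_group"))
members = sorted(os.listdir(f"/sys/kernel/iommu_groups/{group}/devices"))
print(f"{addr} is in IOMMU group {group} together with {len(members) - 1} other device(s):")
for m in members:
    print("  " + m)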
 
Looks like these are my options (4 full-length, properly spaced PCI-E x16 slots):
https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007625 601301150&IsNodeId=1&bop=And&Order=RATING&PageSize=36

The ASRock is the cheapest and has the highest user scores.

For USB connections I think you could either A) run a USB hub to each PC, or B) purchase monitors that have built-in USB hubs.

I guess that's because of their user review program. But they don't have to say positive things to be in it, so I guess it's actually a good mobo.
 
If you are still looking for a board, Wendell from Level 1 has shown pcie passthrough to be working on the MSI X399 Gaming Pro Carbon AC.
 
If you are still looking for a board, Wendell from Level 1 has shown pcie passthrough to be working on the MSI X399 Gaming Pro Carbon AC.

This^ I have that motherboard and it does indeed work with the two 1080 Tis I have in there.
 