mining rigs

So I'm thinking it is time to upgrade my mining setup. Right now I have 6 computers running, each handling a single video card.

I have no expansion left other than to get more computers to run more cards.

So I was thinking of getting a rack, a mining motherboard, and risers to at least consolidate everything into one computer and give me expansion. I don't want to spend like 200+ bucks on a board though. Hoping this also lets me use a little less power since I don't need to run 6 CPUs/boards/HDs/etc.
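Rough math on what dropping 5 of the 6 hosts might save; the per-host idle overhead and power price below are assumed placeholder numbers, not measurements:

```python
# Rough estimate of the overhead saved by consolidating hosts.
# The 50 W per retired host (CPU, board, drive, PSU losses at idle)
# and $0.12/kWh are assumptions for illustration only.
hosts_retired = 5            # going from 6 machines to 1
overhead_w = 50              # assumed idle draw of each retired host
kwh_price = 0.12             # assumed electricity cost, $/kWh

watts_saved = hosts_retired * overhead_w
dollars_per_month = watts_saved / 1000 * 24 * 30 * kwh_price
print(f"~{watts_saved} W saved, ~${dollars_per_month:.2f}/month")
# prints: ~250 W saved, ~$21.60/month
```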

I see mining boards on Newegg/eBay for like 150 to 200 bucks, but I don't know enough about them.

I would probably try to consolidate PSUs as well, since right now each machine has its own PSU, but I can deal with that as I continue along.

What is the best bang for the buck for doing this without taking up more space with more motherboards?
 
If you want to grow your operation, it's imperative that you consolidate: lower power requirements, fewer parts, and fewer machines to watch.

Find racks premade locally and spend the $500 it usually costs. I see them constantly here in that range, and that's Canadian, so you will likely be in the $350 range.

They usually have a rack, risers, board and psu for that price.
 
People swear by the Octominer boards, but I like more CPU and prefer to keep my rigs to 6 GPUs. Right now I'm running these https://www.newegg.com/p/N82E16813145016?Item=9SIARTEFNN7753, but they use 6th/7th-gen CPUs. Super easy to set up and they've been solid.

I just ordered one of these https://www.newegg.com/p/N82E16813119163?Item=9SIA4REF347677 and a 9th-gen i5 to see if it's as solid.

EDIT - also consider using server-style PSUs with breakout boards for your GPUs; it's much easier and the cost is about the same.
 
I may not need a dedicated rack. I am sure I can build something that would do the job; it was more about the other hardware. I am trying to do as you said: consolidate and be able to add more cards as I can get them. I'm limited on what I get because I try to keep the card cost around 500 bucks, plus or minus maybe 50.

Most setups seem to be Intel based. Do all Intel CPUs have integrated graphics, or do you use one of the video cards for the output to a monitor?

I am running Windows and NiceHash. I will probably stick with that setup unless there is an alternate/easy way to keep mining. I'm not really motivated to get into the nitty gritty of overclocking and setting up different software.
 
Should also add I don't want to spend a fortune on a CPU for this either, if possible. Are there any AM4 boards that could handle like 6+ cards? I have two AM4 CPUs I could use.
 
Seems to me like most full-size boards out there have the standard x16 PCIe slot and then a plethora of x1 slots.
 
OP, do all your existing computers only have a single PCI Express slot? Are they mini-ITX or something?
 
Reason I ask is that anything mATX or larger usually has 3 PCIe slots: 1 full-size and 2 x1 slots. If you are gonna use risers anyway, you should be able to consolidate down to 2 computers just by using risers. You can use multiple PSUs from existing rigs by using add2psu adapters, which are cheap. This would be the cheap option.
 

Sadly, yes. They are a mix of ITX and mATX, or really OLD crap that still has PCI slots!

What about those PCIe adapter cards that have the USB ports for attaching risers? It looks like you can use one PCIe slot and get 4 cards attached to it, and also use the m.2 slots with adapters to get more cards.

EDIT:
so I am dumb, two of the boards have 3 PCIe slots. One has 1 full-length and 2 stubbies, the other has 2 full-length and 1 stub, so I guess I could get more than 1 card. Although I did try 2 on the one board with the 2 full-length slots, and one of the cards would always be completely unstable and crash the machine. I'm not sure if it is the card being unhappy or the whole system.
 
Yeah, so start with just risers and see how it goes. The PCIe splitter thing you mentioned is also a good option.
 
And other than using multiple PSUs for now, how do you suggest cutting those down? I saw kits somewhere for using server PSUs and they were not badly priced.

Right now each rig has its own PSU, and I have one good 850 W PSU that can handle multiple cards. That is probably good enough for 3 cards, depending. The 1080s are like ~130 W and the 1080 Tis are ~205 W.
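A quick sanity check of that 850 W PSU against those card numbers. The card draws are the ones quoted above; the 80% sustained-load ceiling and the ~60 W rig overhead are assumptions:

```python
# Check which card combos fit on the 850 W PSU using the draws above.
psu_w = 850
headroom = 0.80              # assumed: keep sustained load under ~80% of rating
rig_overhead_w = 60          # assumed draw for CPU/board/risers on the same PSU

card_w = {"1080": 130, "1080 Ti": 205}
gpu_budget = psu_w * headroom - rig_overhead_w   # 620 W left for GPUs

for combo in [("1080 Ti",) * 3, ("1080 Ti", "1080 Ti", "1080"), ("1080",) * 4]:
    draw = sum(card_w[c] for c in combo)
    print(combo, draw, "W:", "fits" if draw <= gpu_budget else "too much")
# 3x 1080 Ti = 615 W just squeaks under the 620 W budget; the mixed
# combo (540 W) and 4x 1080 (520 W) both fit comfortably.
```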
 

So run what you can on the existing PSUs, steal PSUs from rigs you are decommissioning as needed, and join them with one of these: https://www.amazon.com/BAY-Direct-M...uPWNsaWNrUmVkaXJlY3QmZG9Ob3RMb2dDbGljaz10cnVl
 
Does that just allow you to chain a bunch of PSUs under one power switch?
 
You can also try these: https://tinyurl.com/2p8h2c65
I've used them and they seem to work well. On some motherboards, though, it's been kinda weird with the synchronization and I had to turn them both on manually (might've installed the cable wrong on that one).

I would recommend skipping frames and just getting 4U cases, because it makes managing airflow much easier and will save you on cooling costs. Moving things around is also a lot easier. They're usually around $400 now because of mining, but sometimes you can find them cheaper.

If you're lucky enough to find a good GPU case from a server, those can be even better, since they'll usually come with almost everything you need minus the GPUs and they'll have features like IPMI that you might want. I was able to pick up an old Supermicro GPU server (minus CPUs, RAM, and GPUs) for $700 that houses 12 GPUs, so the total cost of the system was very similar to having 2 frame rigs with ATX power supplies and a low-end CPU+motherboard combo, but it's much cleaner.

The downside to the server cases is that you typically need smaller double-width GPUs in them, so a lot of the higher-end commercial GPUs won't fit. The consumer-type 4U mining cases don't usually have that problem.

Consumer-type 4U GPU case:
https://www.amazon.com/Hydra-III-Server-Mining-Case/dp/B07B4PPQK8
Server-type GPU cases:
https://www.supermicro.com/en/products/GPU/


As far as motherboards and CPUs for consumer stuff go, I don't like to go with mining motherboards, because keeping things to around 6 GPUs keeps everything relatively simple, and mining motherboards are expensive right now. I also wouldn't want to move around a 12-GPU frame rig due to the weight.

I CPU mine as well, so I typically go for a cheap Ryzen 3xxx CPU and a cheap motherboard (X470 if possible). I used to go for open-box returns and whatever was cheapest, but that starts getting confusing knowing which can run headless, which can't, which BIOS options need to be set, etc., so now I just stick to one model that I know works well and is cheap enough, and buy them whenever I see them for <$100. I don't like having all my eggs in one mining basket (GPUs), so this helps spread the risk out a little bit, I guess. And I'm not wasting power idling a CPU, even if it's minimal.

Some AM4 motherboards will have 6+ PCIe slots and m.2 slots but will only actually support like 5 GPUs without tweaking, so make sure to check before buying a particular model. Something like the ASRock B450 Pro4 (https://www.asrock.com/mb/Amd/B450 Pro4/) with 6 PCIe slots is usually cheap. Another consideration is the CPU: depending on what you get, you might end up in a scenario where your motherboard will support 6+ GPUs but your CPU doesn't support that many PCIe lanes, and that can cause issues (usually lower-end CPUs in a product line have this limitation).
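The slots-versus-lanes check from that last paragraph, sketched out. The lane counts here are illustrative placeholders, not specs for any particular CPU or board:

```python
# Each mining GPU only needs an x1 link, so the practical ceiling is
# whichever runs out first: physical connectors (including m.2 slots
# converted with adapters) or lanes the CPU/chipset can route to them.
# All three numbers below are assumed examples; check your own specs.
physical_links = 6 + 1     # assumed: 6 PCIe slots plus one m.2-to-PCIe adapter
cpu_slot_lanes = 20        # assumed: lanes the CPU routes to slots
chipset_lanes = 6          # assumed: general-purpose chipset lanes

ceiling = min(physical_links, cpu_slot_lanes + chipset_lanes)
print("best-case GPU count:", ceiling)   # 7, before BIOS quirks shave it down
```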
 

I think my plan right now is to try the PCIe cards with the USB risers and use either the 3200G or 2700X I have. I have some other odd issues: two of the cards I have are water-cooled from the factory and another has an AIO, so I can't really use a case. I need an open-frame kind of thing. After what others have said, I'm trying to use what I have and get away with some cheaper parts without having to buy a new mobo/CPU. I have 6 cards, so I can start with one and go from there and expand as I need to. Hopefully....
 
Ah, ok. Water cooling makes things annoying, but if it's an AIO then it's not too different, just takes up more space. If any of your existing motherboards have 6 or so PCIe lanes, you could probably get away with something like this: https://tinyurl.com/5fx7nz73
I haven't used them personally, but others report success with them. I think as long as your CPU and motherboard support, say, 6 PCIe lanes but the board doesn't have 6 physical PCIe/m.2 slots, you can use one of these in a PCIe slot to split one of the x8 or x16 lanes into more x1 lanes and run more GPUs off of the motherboard. You usually need to tweak BIOS settings like CSM, Above 4G Decoding, and PCIe lane settings, though.
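For reference, the usual BIOS changes collected in one place. Menu labels vary by vendor, and the values below are the commonly recommended ones for multi-GPU mining, not taken from any specific board's firmware:

```python
# Common BIOS tweaks for running more than ~4 GPUs on a consumer board.
bios_checklist = {
    "Above 4G Decoding": "Enabled",    # lets GPU BARs map above 32-bit space
    "CSM": "Disabled",                 # UEFI-only boot enumerates many GPUs better
    "PCIe Link Speed": "Gen1 or Gen2", # slower links are more stable on cheap risers
    "Primary Display": "Onboard/iGPU", # keep the monitor off the mining cards
}
for setting, value in bios_checklist.items():
    print(f"{setting}: set to {value}")
```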
 
That riser adapter card doesn't require you to have an equal number of PCIe lanes to GPUs - you just need a single lane on the PCIe x1 slot it plugs into. It uses a PLX switch to provide 4 x1 connections on a single x1 slot (at theoretically 1/4 the bandwidth, AFAIK, if all 4 devices plugged into it were running at 100% bus utilization, but it works fine for mining). A bifurcation card would be one that requires enough lanes, as well as BIOS support, but those are not generally used for mining.
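Worked numbers for that 1/4-bandwidth point, using the rough ~985 MB/s figure for a PCIe 3.0 x1 link after encoding overhead:

```python
# Worst-case share of the uplink when 4 cards sit behind the PLX switch.
x1_gen3_mb_s = 985                  # approx. usable PCIe 3.0 x1 bandwidth
cards_on_switch = 4

per_card = x1_gen3_mb_s / cards_on_switch
print(f"~{per_card:.0f} MB/s per card if all four saturate the link at once")
# ~246 MB/s; mining traffic (work units and share submissions) is tiny,
# so the shared x1 uplink isn't the bottleneck in practice.
```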
 
thanks all!

Going to try the PCIe card that allows 4 cards, use risers, and see how it goes. Figuring out my PSUs will be the fun one, I think.... I can run 3 cards off the 850 I have, 2 off the 550 I have, and the other one can just deal with another PSU for now. Curious how that will go. This will be very interesting and hopefully fun. If it works, then at least I know what to do as I, hopefully, get more cards.

I think eventually I will go with something like this for the power
https://www.parallelminer.com/produ...g-pcie-cables-6-pin-to-8-pin-62-zec-dash-eth/
 
SO......I tried to do this last night and it went terribly....I started at 7:30 pm and finally stopped at 5:30 am.

3 of the risers I got didn't work. I could not get more than 2 cards running from one machine off the card with the 4 USB ports. If I had 2 of the same cards, Windows would go nuts and not find anything but a "VGA Adapter"; if I had a 1080 Ti and a 1080, it would work with only 2 attached.

I ended up putting everything mostly back the way it was, just in a messy way. One of the computers may not even boot now. One of the 1080s, luckily the crappy one, only works now in one computer in one of the PCIe slots for some reason. Put it anywhere else and it powers up and then powers off, and only the DVI port works now as well.

I wonder if my motherboards are just not up to the task. I tried both of the newest ones I had and they are both B450s. Have to figure out if the USB card is bad or not. It seems to work with 2 cards in any of the USB ports, and that is it. I couldn't get 2 cards working with just the little PCIe x1 to USB adapters that the risers come with. No matter what, that didn't work, even with the m.2 SSD removed to free up the slots. It didn't seem to be a power issue, as an 850 watt should run at least 3 cards no problem. I even tried daisy-chaining PSUs and having each card/riser powered by its own power supply, which didn't fix anything.

In the end I accomplished having 1 less computer....I still have 5 running.

I guess I need those dedicated long boards for 6 cards, and then I wouldn't have all this stuff going on. I do like the riser setup though; it worked amazingly with my ghetto mining shelf. Well, it would be awesome, if it worked.
 
Have you tried Linux? I can't imagine dealing with Windows unless you absolutely have to (470 drivers for Nvidia, for example).
 
I went back downstairs and I was able to get more of the cards working. 3 machines now instead of 6.

It took Windows AGES to find the other cards when they were installed on risers in their own PCIe slots, like 10 minutes.
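One way to watch the enumeration instead of staring at Device Manager: poll nvidia-smi (it ships with the NVIDIA driver on both Windows and Linux) until the expected card count shows up. A small sketch, assuming 5 cards:

```python
# Poll nvidia-smi until all expected GPUs are visible to the driver.
import subprocess
import time

EXPECTED = 5   # number of cards physically installed

while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=index,name", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    gpus = [line for line in result.stdout.splitlines() if line.strip()]
    print(f"{len(gpus)}/{EXPECTED} detected:", gpus)
    if len(gpus) >= EXPECTED:
        break
    time.sleep(30)   # check again in 30 seconds
```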


Hopefully I can get it down to 2 PCs and 2 PSUs once my other PSU gets here and the replacement risers show up from Amazon. I am also going to try another brand of the 4-USB PCIe thing just to see what happens.

I've had no issues with Windows until this little 4-USB thingie.

I think I need to just get better motherboards with more PCIe slots and get rid of the B450 ones. That way I hopefully won't even have to use the 4-USB card thing.

Now the only issue I have is the crappy 1080: once I get into Windows, the fan goes 100% all the time. Outside of Windows, it is fine and doesn't do that. So as soon as the driver loads, it just becomes a jet engine.
 
Got it working! It just takes Windows forever to find all the cards. It took like 20 min. Crazy.

Other than the oddball card, all 5 others are on one machine!

[Photo attachment: the consolidated rig (20220118_160615.jpg)]
 