New MB for multi GPU mining

Hey!

I would like to buy a MB dedicated to mining, supporting 8 or 16(?) GPUs with risers etc.
I stopped keeping up with this in 2018; in 2021, what MB can do the job?

Thank you for your help,

Electrolyse
 
Amazon.com: ASUS B250 MINING EXPERT LGA 1151 Intel B250 HDMI SATA 6Gb/s USB 3.1 ATX Intel Cryptocurrency Mining Motherboard with HDMI Cable: Computers & Accessories

I have two of these and they work, but if I were starting new rigs today I would keep it to 6 GPUs each and build multiple rigs; anything more is just too complicated and really taxes the rest of the system.
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
I can't imagine trying to get 20 GPUs working on a single board. I wouldn't build one over 8 and would probably stick with 6.
12 is the most I ever got working comfortably, because that's the most cards from one vendor that Windows 10 supports. If you want 20 under Windows you'd have to put in 12 (some say 13 is the actual number, but I think officially it was 12) of one brand, and the other 8 of another.
I.e., 12 Nvidia cards and 8 AMD cards.

MSI Afterburner only works cleanly with eight cards in total. So unless they are identical cards, it's difficult or impossible to manage clock speeds on cards beyond 8 in Windows. You can do it with a custom BIOS on the AMD cards, but that's a hassle, and Nvidia won't let you flash a custom BIOS.

So 8 is the easy-button limit for Windows, unless all 12 cards you have are the exact same model; then you can manage the undervolt and overclock on all of them as one card with Afterburner. I did that with 12 Nvidia 1060 cards. It worked great.

When I tried mixing AMD and Nvidia cards on the same mining system, it was a headache. Video card driver updates for one vendor seemed to lose the Afterburner configuration settings for those cards, and it made me want to pull my hair out. Whereas if you just stay with one vendor, Nvidia or AMD, on a single mining board and OS, it was way less hassle.

You'll note these motherboards have three PSU connectors. You can use just one PSU to get started with a few cards and add the other one or two PSUs as needed, as you acquire more cards.

Hints:
Target using only about 80% of your PSU's max capacity for the greatest stability.
Set your Windows page file manually. Set it to be >= the combined VRAM of all your cards. Use an SSD for your OS/page-file drive.
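Those two rules of thumb are easy to sanity-check with a quick script. This is a minimal sketch; the card counts, VRAM sizes, and wattages below are hypothetical examples, not figures from this thread:

```python
# Rig-planning helpers for the two hints above:
# keep the PSU at ~80% of rated capacity, and size the
# Windows page file >= combined VRAM across all cards.

def min_pagefile_gb(cards):
    """Page file should be at least the sum of all cards' VRAM."""
    return sum(vram_gb for _, vram_gb, _ in cards)

def psu_headroom_ok(cards, base_system_watts, psu_rated_watts, target=0.80):
    """True if estimated draw stays under ~80% of the PSU rating."""
    draw = base_system_watts + sum(watts for _, _, watts in cards)
    return draw <= target * psu_rated_watts

# Hypothetical 6-card rig: (name, VRAM in GB, mining draw in W)
rig = [("gpu%d" % i, 8, 120) for i in range(6)]

print(min_pagefile_gb(rig))             # 48 GB minimum page file
print(psu_headroom_ok(rig, 150, 1000))  # 870 W > 800 W -> False
print(psu_headroom_ok(rig, 150, 1200))  # 870 W <= 960 W -> True
```

With a 1000 W PSU the estimated 870 W draw blows past the 80% target, so this sketch would tell you to step up to a bigger PSU (or add the board's second PSU) before adding more cards.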

The boards work well. My Asus B250 Mining Expert with a Pentium G4560 (dual core / 4 threads) ran at about 50-60% load feeding the 12x 1060 and 12x RX 580 rigs I built. Really reliable, well-tuned boards.

If you want more than 12 cards of one vendor on these boards, then move to a Linux OS. I didn't explore much in that area, so I can't advise.
 
I've done 12 with vBIOS mods for clocks (and had to mod some chips to the consumer version of the FirePros to get the drivers to work under Windows).

My favorite mining board was an X8DTE. I would just throw 6 on each and call it a day. Those boards were dirt cheap (<$50 for CPU + mobo + RAM).

I also commonly ran 8-GPU rigs (RX 480s) under Linux for a client, and they were pretty reliable.
 

I had 15 Nvidia cards working on one of my B250s (PSU limited), but it was a PITA to keep running, and there were times the i3 CPU was limiting hash rate. It's just not worth the trouble.
 
Bump, because I'm curious what boards people are mining with these days. I only have one mining motherboard, the Asus B250 Mining Expert. It's been good to me, but in the current climate it's stupidly overpriced.
 
I bought a used ASRock H110 Pro BTC+.
It was super easy to get 8 cards running in Windows. It has 13 slots, but I didn't try past 8 cards. It ran for weeks on end with no hiccups.
 
Less than 50 W (probably ~25 with just one chip).

You don't need 2 chips installed to use the PCIe lanes.
 
Is an 80 or 120 mm chassis fan over that server heatsink sufficient? Or how do you recommend cooling the CPU heatsink? It definitely looks made for a server chassis with active airflow blown through it.
 
Any airflow at all works with those tall heatsinks.
I have a trash 80 mm fan attached to mine and it's more than sufficient.

The board may only take 5 GPUs happily; if I remember right, it fought me on anything more.
 
Same board here. Only running 5 cards in it right now.
 
Thanks for the idea, cdabc123.

I made an offer of $100 all-in on this one and he took it. If it's good for 5 cards, that's not bad, and I can use it as a dedicated host at LAN parties in the future for random games. It's still a pretty beefy system for $100 shipped, with 12 cores / 24 threads and 48 GB of ECC RAM.
https://www.ebay.com/itm/284169281190
 
As an eBay Associate, HardForum may earn from qualifying purchases.
Man, I wish I'd read this before.

Intel Xeon L5630s ($4.50/pair, 40 W CPUs) + that motherboard ($50) is a lot cheaper than the other options.
 
You guys know an AM3 900-series board with a $15 Athlon 250 and DDR3 can be used with 6-9 GPUs, right? It's like $100 for the complete rig, but really, you probably have parts that will work for it in the scrap bin. The PCIe expanders have increased in price, but they are still under $25; with the MB and 1 expander you get 6 GPUs, and 2 expanders will get you up to 9 GPUs. (Yes, I have run them configured that way; it works fine in Linux and Windows.)

Technically you could run 3 expanders for 12 GPUs; I never have great luck past about 8, though, so I have not tried that.
Link to expander (these do NOT work with many Intel boards):
EXPANDER
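The counts quoted (6, 9, 12) fall out of simple arithmetic if each 1-to-4 expander occupies one board slot and exposes four ports, netting three extra GPUs per expander. A minimal sketch; the three-usable-slot assumption is mine, inferred from the numbers in the post rather than stated there:

```python
def max_gpus(board_slots, expanders, ports_per_expander=4):
    """Each expander occupies one board slot and exposes 4 ports,
    so every expander nets (ports_per_expander - 1) extra GPUs."""
    if expanders > board_slots:
        raise ValueError("more expanders than slots to plug them into")
    return board_slots + expanders * (ports_per_expander - 1)

# Assuming 3 usable PCIe slots on the AM3 board:
print(max_gpus(3, 1))  # 6  -> "MB and 1 expander you get 6 GPUs"
print(max_gpus(3, 2))  # 9  -> "2 expanders will get you up to 9"
print(max_gpus(3, 3))  # 12 -> the theoretical 3-expander maximum
```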
 

I've thought about something like that, but I'm trying to stay away from PCIe expanders. I've heard others not have very good luck with them.
 
They work really well actually, as long as you understand what they can and cannot do. The short story is: do NOT even bother on Intel, and only certain ports can run them on AMD rigs. On 900-series AMD chipsets, no matter how many PCIe slots the board has, you can probably only use 3 expanders. Really, Windows doesn't do well over 8 GPUs anyway, so it's not a big deal. Even with HiveOS and a 1600 W PSU, you probably aren't going over 12.

Ryzen can work with them, but there are limitations, as some slots (most?) are connected directly to the CPU. Still, you end up with about the same number of expanders supported, 2 or 3, as long as it is a Ryzen CPU and not an APU. APUs on AM3 and AM4 are NOT going to work; some will work with 1 expander at best.
 
She's alive.

It fired right up. I installed Windows 10 on an SSD. It's fun to have this much hardware power on tap, even if it's old. My gaming rig is much faster with an i7-6950X (10 cores / 20 threads), but this old legacy Xeon hardware is fun just because!
 

Attachments

  • 98BDFDBD-6BE0-43FB-AFBB-7418D444EE00.jpeg (927.3 KB)
  • E9FD543C-1DFC-4C12-836E-1B8798B82853.jpeg (648.8 KB)
cdabc123

When I go to the Supermicro website, the only driver listed for Windows 10 is for the LAN. Did you install other drivers, or do you just use Windows' built-in drivers?

Also, those heatsinks are meant for high airflow, and they are throttling at 80°C under benchmark load, so I guess I'll have to buy new heatsinks, as there doesn't appear to be a clean way to mount a fan on them. How are you cooling your heatsinks?
 
I never had to install any drivers for the board, not even LAN. I ran them under Linux and Server 2012 R2.

I don't know what kind of airflow you're dealing with, but those 2U heatsinks were never hard for me to cool. I would HVAC-tape an 80 mm fan to one side (so all the airflow was forced through the heatsink) and it would stay plenty cool. Reapply thermal paste? For my mining board I didn't even bother with that; I just leaned a 120 mm fan against the side and the RAM (single-CPU config).

Edit: looking at the link you attached, I would definitely rotate one of the heatsinks so the airflow is parallel to both of them.
 
Do you suggest other motherboards for 4 to 6 RTX 3060 12GB cards? :unsure:
I already tried and failed with the ASUS H110 Pro... I didn't solve the no-boot and no-signal issues with 3 brand-new motherboards :cry:
 
Did you try putting just one video card in at first, connecting the display to the graphics card, and then enabling the onboard graphics and connecting the display to the motherboard's video out? Some boards require you to do that.
 
Unfortunately, yes, I tried with the RTX 3060 (HDMI port) attached in the PCIe x16 slot... 😭
 
I didn't look at which board you guys are discussing, but my Supermicro G34 board wouldn't let me use the PCIe slot until I moved a jumper. Might be something to look for in the manual.
 
I already tried clearing the CMOS :(
I will try to buy a new one (probably not the same model...)
 
I didn't say to reset the CMOS. I said there may be a jumper needed to enable the PCIe slot.
OK, sorry, I added that as an additional note!
However, nothing is mentioned in the manual or on the motherboard :(

Do you think the Biostar TB360-BTC D+ will be compatible with 4x RTX 3060? (n)
 
I don't see why not. It appears to accept 8 cards. Be aware it requires SO-DIMM memory.
 
I can't find the specific specs on that board, but mining boards like that with a bunch of x16 slots are usually physically x16 but electrically x1. They negate the need for risers, but if it's not running at PCIe 3.0 x16 or x8, the RTX 3060 will be hashrate-limited even with the special driver and an HDMI dummy plug. All the specs say is 8x PCI Express x16 slots; they don't say specifically what the slots run at.
 
You are right...
Maybe the Asus WS Z390-PRO is a better choice to start with 4x 3060 without any problems...
Anyway, I can't understand how a compatible processor, e.g. an i3-8300 or another from the official list, can manage 4 GPUs at x8 when the processor doesn't have enough lanes... Can someone explain it?
 
Since when will an x1 lane limit hashrate??? Provided the DAG fits within the VRAM, there really isn't much data being transferred over the PCIe lane.
 
The RTX 3060 is the first card released by Nvidia with a limiter that cuts the hashrate if it detects Ethereum mining. They "accidentally" released a driver with the hashrate limiter missing, but it only works under very specific conditions, such as the card being in a PCIe 3.0 x8 or x16 slot and having a monitor (or dummy plug) attached.

I guess nowhere in this thread was it specifically stated that they're mining Ethereum, but that was my assumption given current profitability. If they're trying to mine Ravencoin or something, it wouldn't matter; the limiter only targets Ethereum mining algorithms. The RTX 3060 specifically being involved is why the number of lanes matters.
 