Mining Motherboards

1) None of those choices (that fit a regular ATX case) are serious mining boards.
2) More GPUs may tempt you to pack them closer than can be effectively cooled.
3) More GPUs may tempt you to overburden a marginal supply.
4) More GPUs increase the chance that mining will stop for some silly non-reason.
5) More GPUs increase troubleshooting time to locate the specific cause.
6) GPU downtime losses are multiplied by every card sharing the rig.

Around four to eight GPUs on one or two supplies seems optimal.
I own the ASRock 13-GPU board, and it hasn't failed me in any way.
But I still wouldn't recommend it, because more GPUs multiply too many headaches (a rough sketch of the downtime math is below).
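To put rough numbers on points 4 through 6, here is a minimal sketch of the downtime math (Python; the stall rate and repair time are made-up illustrative assumptions, not measurements):

```python
# Toy model: each GPU/riser independently stalls the whole rig at some
# rate, and every stall idles ALL cards until someone notices and fixes it.
# Both constants are illustrative assumptions, not measured values.
STALLS_PER_CARD_PER_MONTH = 0.5   # assumed nuisance-failure rate per card
HOURS_IDLE_PER_STALL = 6          # assumed time to notice and fix a stall

def idle_gpu_hours_per_month(n_gpus: int) -> float:
    """Expected idle GPU-hours per month for a single n-GPU rig."""
    stalls = n_gpus * STALLS_PER_CARD_PER_MONTH     # more cards, more stalls
    return stalls * HOURS_IDLE_PER_STALL * n_gpus   # each stall idles every card

for n in (4, 8, 13, 19):
    print(f"{n:2d} GPUs on one rig: ~{idle_gpu_hours_per_month(n):4.0f} idle GPU-hours/month")
```

In this toy model the loss grows with the square of rig size, so a 19-card rig eats roughly (19/8)^2, or about 5.6x, the idle GPU-hours of an 8-card rig; splitting the same cards across three rigs cuts the total by roughly a factor of three.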
 
I see it differently. I have 3 rigs and they are 12 cards each. They've been up and running straight for the last 3 1/2 months. No downtime. It did take about 2 weeks to get all 3 up and running after tuning them (settings). Right now I am building 2 more 12-card rigs.
 
We just need a mini-ATX board with two power headers, which leaves enough room for a second power supply.
One that can run from 12V connections alone and needs no ATX supply; the ATX header would just be there for on/off control.
Give me an ATX on/off button on the backpanel. Same for Reset and BIOS, but maybe recessed.

I don't give a damn about PCIe slots unless they are full length and separated by at least 3 slot widths.
Given mini-ATX, that means ONE. The rest should be USB-riser PCIe, with fair-warning plugs of dayglo orange.
At least seven of these, and not more than sixteen. No problem if those are packed a bit tight.

And we need PWM fan headers, lots of them... At least six beefy enough to provide 5A all day long,
as fans of up to 3.9A are not an unusual requirement for mining in a closed chassis or wind tunnel.

A Ryzen 3 SoC offers plenty of lanes and basic peripherals, especially if the primary GPU slot
need not burn more than four or eight lanes. I don't see any need here for a southbridge. Embed it as if
it were a laptop. An upgradable CPU socket is wasted money. One stick of RAM is plenty; lay it flat please.

We need an M.2 SATA and a regular SATA, and I think that's as much as you get from the SoC.
If anyone demands to waste four lanes on NVMe, I suppose they could use the main PCIe slot.

And stop putting video and audio on mining boards. Every GPU provides audio through HDMI.
 
Because I exhaust my heat out the dryer duct, and save 3500W of air conditioning.
Delta 252CFM 5,500RPM fans are rated 3.9A, but draw less than 2A, even at full speed.
But the proposed mobo would still have to handle the worst-case peak.

Closed 4U chassis are a bitch to keep cool, and I regret that my plan was not
better thought out before I went that way. I should have just given up a room.
Instead I gave up the dryer, to be close to a 220V outlet. Bronze to Gold...

If you can externally ventilate a whole room, then open air makes more sense.
Box fans are fairly efficient, and don't further burden the 12VDC power supply.
 
This is my setup. Notice the bigass exhaust fan. The cards on the bottom and top left are 1060's, which pull so little wattage that denser is pretty easy.

[Image: rig photo with the exhaust fan]
 
Well, let's forget about the ATX form factor and talk about real mining motherboards for a second:
My next purchase would first depend on the choice of power supply.
For a PC supply: the six-slot Onda D1800 BTC at $135 with embedded Celery looks to be a steal.
For an Ant supply: the eight-slot Colorful C.B250A-BTC PLUS V20 for $190 seems well thought out.

The Colorful board is Ant-powered from one side, plus a 4-pin near the CPU that never seems
to have anything attached in any build photo. The row of 6-pin power connectors across the front are
not inputs, but convenience outlets for routing to the top of individual GPUs.

It's less clear whether the Onda's front power headers are ins or outs. I'm thinking ins...

Both of these suffer from fixed 3-slot spacing, which seems a little tight for optimal ventilation.
Certainly anything dual-fan RX580-ish isn't going to enjoy being packed quite that closely.
But low-power 1060s would probably do just fine. DIY Pandabox...

[Image: Onda D1800 BTC board]

[Image: Colorful C.B250A-BTC PLUS V20 board]
 
[Image: Biostar board with 104 risers]

Or we can get stOOpid. The whole rig is down, and one riser is flaky. Go find it...
One of my risers just cratered a buck converter after running fine for 2 weeks.
Of course Murphy would never dare visit YOU. Pile on as many GPUs as you like.
 
Or for $30, an ASMedia switch can expand almost anything into a mining platform...
[Images: ASMedia switch card, front and back]

Again with the three-slot spacing, but you can buy a switch and individual risers separately.
Switch cards come in many shapes; not all make sense...
 
That's funny, 104 cards. The OS that can handle that will be fun. SMOS can do 15 cards. Windows 10 can do 16, but they have to be 8 Nvidia and 8 AMD.



 
Yeah, there's a happy medium I think. For me it's 6- or 8-card rigs: 6x 580s or 8x 1060s, in a 4U case with 3 120mm fans up front, and not the quiet ones either. My wife hates the server cabinets enough; I couldn't imagine an open-air 104-GPU monster sitting in my office... I'd have to move out.
 
I use 5-GPU rigs, as I can buy a motherboard, CPU, and RAM for under $70. Windows sucks with any more than 6-ish cards, and it's just too much of a pain to have 10+ GPUs go down for whatever reason.
 
The ASUS is built for using 3 PSUs, and again... as seems to get covered here often, don't power much from PC power supplies! Go with server ones already.

I have the ASUS and will only use that now. You don't have to fill it.
 
Nothing wrong with an ATX supply; you can sometimes find good deals almost anywhere, and without a 2-week wait for the slow boat.
It requires a little more thought to the purchase, however. Some are garbage for mining purposes, and those need to be avoided.
Just be sure to select one with plenty of PCIe power cables offering three 12V wires, same as any 6-pin converted mining supply.

Never power more than two GPU risers from a modular SATA peripheral cable, unless you want to smell melted wire and plastic.
I made that mistake recently with four 1050 Tis (something like 275W) on a single 12V modular pin and wire. It didn't destroy the
entire supply, just rather obviously voided the warranty (user error). I need to scavenge a replacement connector somewhere and
it will be fine. Anyone on the north end of Dallas care to volunteer a scrapped modular supply with some other issue for that purpose?
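For a sense of why that cable cooked, here is a back-of-the-envelope sketch (Python; the wire and pin ratings are typical ballpark assumptions, not specs for any particular supply):

```python
# Rough current math for feeding several riser 12V inputs from one
# modular peripheral cable. Ratings are ballpark assumptions, not the
# spec of any specific PSU, cable, or connector.
LOAD_WATTS = 275.0        # roughly four 1050 Ti class cards fed via risers
RAIL_VOLTS = 12.0
WIRE_AMP_RATING = 8.0     # assumed rating for a single 18 AWG lead
PIN_AMP_RATING = 3.0      # assumed rating for one peripheral-connector pin

amps = LOAD_WATTS / RAIL_VOLTS
print(f"Current through the single 12V lead: {amps:.1f} A")
print(f"Wire overload:      {amps / WIRE_AMP_RATING:.1f}x the assumed rating")
print(f"Connector overload: {amps / PIN_AMP_RATING:.1f}x the assumed rating")
```

Roughly 23 A down one wire and one pin is several times anything they were built for, so the insulation and connector housing are the first things to let go.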

I like the 850W Thermaltake Tough Power Grand. $104 today at Microcenter; I just bought another one last week for $10 more.
The useless RGB light wastes negligible power and is easy to turn off. True specifications, and the supply is very hard to kill. Not that it
prevents a stupid user from burning a peripheral connector. You get three GPU cables with two connectors each, which is more
than you should realistically ever hang on an 850W supply, and they are plenty long enough. With 220V, it's also a free upgrade
from Gold to Platinum.

And the reverse is true of server-grade mining supplies as well. The popular 1200W units downgrade to 750W and much lower efficiency
when running from an ordinary 110V outlet. That is just the way universal input and active power factor correction work. Rectify and
boost whatever AC you are given to roughly 390VDC, before isolated downconversion. Less required boost is always more efficient.
110VAC rectifies to 0-155VDC. 220VAC rectifies to 0-310VDC. Obviously the latter lives closer to the target of 390VDC.
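A quick worked version of that arithmetic (Python; the 390VDC bus target is the figure from the paragraph above, the rest is just peak = RMS x sqrt(2)):

```python
import math

# Universal-input supplies rectify the incoming AC, then a PFC boost
# stage raises it to a high-voltage DC bus before isolated step-down.
PFC_BUS_TARGET_VDC = 390.0   # bus target quoted above

for vac in (110.0, 220.0):
    peak = vac * math.sqrt(2)          # rectified peak of a sine wave
    boost = PFC_BUS_TARGET_VDC / peak  # how hard the PFC stage must boost
    print(f"{vac:.0f} VAC -> rectified peak ~{peak:.0f} VDC, "
          f"boost to 390 VDC ~{boost:.2f}x")
```

110 VAC peaks near 156 VDC (about a 2.5x boost); 220 VAC peaks near 311 VDC (about 1.25x), which is why the same supply runs more efficiently, and often carries a higher rating, on a 220V circuit.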
 

Power always confounds me... I'm getting ready to build a 6-GPU system (probably AMD), but will an 850W supply like the one above be enough? Thanks!
 
Eight 550s or six 560s, sure, why not, except those make little economic sense.

If you are thinking 570s and 580s, six at full speed on a single 850W is not likely.
I think you are gonna need at least 1000W for that project, and far more if you
want enough margin to operate efficiently.
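A minimal sizing sketch for that question (Python; the per-card wall-draw and overhead figures are assumptions in the right ballpark for tuned cards, not measurements of any particular build):

```python
# Rough PSU sizing: sum an assumed wall draw per card plus platform
# overhead, then leave headroom so the supply isn't running flat out.
# All wattages here are illustrative assumptions.
CARD_WATTS = {"RX 560": 75, "RX 570": 140, "RX 580": 170, "GTX 1060": 100}
PLATFORM_WATTS = 80          # mobo + CPU + risers + fans (assumed)
TARGET_LOAD_FRACTION = 0.8   # keep the PSU at or below ~80% of its rating

def recommended_psu_watts(card: str, count: int) -> float:
    total = count * CARD_WATTS[card] + PLATFORM_WATTS
    return total / TARGET_LOAD_FRACTION

for card in ("RX 570", "RX 580"):
    need = recommended_psu_watts(card, 6)
    verdict = "850 W is marginal at best" if need > 850 else "850 W is fine"
    print(f"6x {card}: want roughly a {need:.0f} W supply ({verdict})")
```

With those assumptions, six 570s or 580s land in the 1150-1400W range once you want efficiency headroom, which is why a single 850W doesn't cut it.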

Get a $4 relay card, or hack your own, and split the work across two supplies.
Or the ASUS Expert Miner, a bit spendy if supply relay was the only advantage.

Unboxed all my AMD cards because they run far too hot to survive a Rosewill 4U chassis.
They will all eventually be put back into service, hanging from an open wire rack.

One RX 580 + two RX 480s + four 1050 Tis drew 750W while plugged into 110VAC.
And that was absurdly hot; no closing the lid even with the Delta fans at full whine.
Eight 1050 Tis then pulled a reasonable 550W under the same conditions. Go figure.

Both of the above measurements are open-box with a 20" fan, and do not include
the Deltas, which add another 120W; with the AMD cards that took me slightly over
the power supplies' max rating. I didn't operate like that for long.

I can't use my 125VAC Kill-A-Watt to prove 220VAC power savings at home.
But I have an Agilent 6813B at work, and plenty of dummy loads. And I measure
that sort of circuit professionally all the time.

Also verified my Kill-A-Watt to about 2%. They claim a tolerance that calibrated
measurements prove to be a total lie, but it's still better than a blind guess. I'm in
the market for a 220V variant if anything like that comes up cheap.
 
I've been running two Biostar TB250-BTC Pro motherboards for the last 2 months or so. I think it's a better option than the ASRock H110 Pro BTC+. Maybe I'm a slave to marketing, but the us-vs.-them thing on the Biostar site convinced me.

I agree, though: in Windows it's easiest if you don't mix cards. MSI Afterburner only supports up to eight cards in the GUI, but you can link the common cards it can't see. So if you have eight matching 1060 cards and four matching RX 580 cards, for instance, you can still control them all with group settings on Windows. If you have several different types of cards, you can't always uniquely modify each one for undervolting/overclocking.
 
Also, you'd need a beast of a CPU for one of those ASUS 19-GPU boards. A Celeron isn't going to cut it. My cousin can max out a Celeron to 80 or 90% CPU usage while mining with 10 cards on the 13-slot ASRock or the 12-slot Biostar. My Pentium G5460 hit about 45-50% with 10 cards mining.

Also, for that 19-slot ASUS they recommend 32GB of RAM if mining 19 cards.
 
Running eight GPUs on the ASRock with a 4.2GHz i3.
"Unlocked", except the chipset disables that feature.
Still enough CPU left over to crunch two threads of XMR for 80 H/s.
All while typing this post, and observing through the iGP.
Task Manager says about 65% CPU... Also says I'm only
using 2.9GB of memory.

I do have 16GB of DDR4-2400. Originally for my Ryzen, but I
threw some faster Samsung at that box, so this 16 was
sort of not doing anything better than sitting around...
Disabled the swapfile to keep from beating up the SSD.
 
The advantage of that 19-slot board would be that for just over $5k you could have a 19-GPU rig with something like 19 RX 570 cards. Use one of anorak.tech's custom BIOSes to pull right at 100-110 watts each for 28MH/s and you have a ~500MH/s stable rig for really quite cheap.

I have close to $10k in the two rigs I built a couple months back that have similar hashing power and more power draw (8x 1080 Ti, 2x RX 580, 8x 1060, 2x Vega 56).
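Using just the numbers quoted in that comparison, the cost-per-hashrate math comes out like this (Python; the figures are the ones above, rounded):

```python
# Cost-per-MH/s comparison using only the figures quoted above.
def dollars_per_mh(cost_usd: float, mh_s: float) -> float:
    return cost_usd / mh_s

rx570_rig_mh = 19 * 28       # ~532 MH/s from 19 BIOS-modded RX 570s
rx570_rig_watts = 19 * 105   # ~2 kW at roughly 100-110 W per card

print(f"19x RX 570 rig: {rx570_rig_mh} MH/s, ~{rx570_rig_watts} W, "
      f"${dollars_per_mh(5000, rx570_rig_mh):.2f} per MH/s")
print(f"Mixed $10k two-rig build at similar hashrate: "
      f"${dollars_per_mh(10000, rx570_rig_mh):.2f} per MH/s")
```

Call it roughly $9.40 per MH/s versus $18.80 per MH/s, before electricity.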
 
That B250 requires you to have at least 6 mining cards on the board. From what I saw, ASUS states this in their manual. Here's a video from Bits Be Trippin'.

But with a rig of 19 cards there are always problems. They're like having children. If it goes down, and it will, that's 19 cards making no money. Now you could go with two 9-card rigs or three 6-card rigs. This way if one goes down you still have the others up and making money!


 
At timestamp 1:30 he says no OS out of the box even supports 19 cards yet. He tried four different Linux operating systems - and we know Windows doesn't support more than 16. (8 AMD, 8 Nvidia)

Odd that Asus would put 19 on the board without the capability to use that many???
 
I just started out with a rig using some old unused parts around the office here recently. I had a Cubix PCIe expander, two Titan X 12GB cards, and a 1080. Ran that for about a month and a half and made enough to buy a 1070, so now the Cubix is full with 4 cards. Does about 82MH/s. The Titans only do 14MH/s each and I can't seem to figure out how to get any more out of them, if it's even possible. Also funny: the 1070 gets about 30MH/s and the 1080 only gets 24MH/s.
 
Here's another video of someone else getting 19 cards working. The video was a reply to Bits Be Trippin'.




 
There's nothing forcing you to max out the B250 19-slot board. The power supply management ALONE makes it a good option for a rig in the 12-GPU ballpark, the price isn't all THAT much higher than the 6-card board options, and when you factor in only needing one CPU, one set of RAM, and one HD/SSD, the economics start making sense; a rough cost sketch is below.
There is the "if it goes down you lose a lot more" issue, but it's been quite stable so far in my limited testing; in fairness I've not gotten mine past 9 cards yet (mostly GTX 1080 Ti so far).

Supposedly AMD and NVidia are working on driver versions that will support more than 12 cards.


The slots on the Onda and Colorful motherboards are close enough together to make thermal management a pain; both get a workout in phillipma1957's 6th thread on bitcointalk.
 
Post #443 of that thread clearly illustrates the crowding problem.
https://bitcointalk.org/index.php?topic=2138550.443

Reference blower cards might be less trouble for this packing.
Otherwise populate every other mobo slot, plus maybe three risers.

I've got 8x 1050 Tis thermalling fine while all crammed together.
Not on the Onda mobo, on risers. Be extra careful how you
deliver power to multiple "low power" GPUs from below. The
peripheral power cables and modular supply connectors smoke.
I imagine you could smoke a motherboard just as quickly...

When I had five non-reference AMD RX cards packed at three-slot
spacing, there was no keeping any GPU (except the one that was
watercooled) from overheating. It's just too tight and not practical.

The Onda is $104 on Ali right now. Which is excellent if you like
embedded Celery. But the price of four-way ASMedia switches
is down to $13. Any junker motherboard with such a switch can
space risers to your preference. Maybe the better deal.

Few $ more on the egg, ebay, amazon. Easy to find though.
[Image: four-way ASMedia switch card]
 
Easy to find? I just hit ebay, newegg and amazon for ASMedia risers and ASMedia switches with nothing like that coming up. I found them on ebay with the cheapest US seller at $17, but in CA, so it'll take a week to get to me. Some of the Chinese vendors get me stuff in about that timeframe.
 
I have three Biostar TB250-BTC Pro boards

12 GPUs each


They work fine but when a riser dies it’s a pain to figure out which one.

I end up unplugging everything and plugging them back in one at a time for 24 hours to test. Is there a better way?
 

Maybe set the fan speed for all the GPUs to 100% and see which ones react?
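If the rig is Nvidia-based, another option is to script an inventory check so a dead riser identifies itself by PCIe bus address. A minimal sketch (Python; it assumes nvidia-smi is on the PATH, and EXPECTED_BUS_IDS is whatever your own known-good rig reports):

```python
import subprocess

# Bus IDs recorded once while every card was alive (example values only);
# fill this in from your own known-good rig.
EXPECTED_BUS_IDS = {
    "00000000:01:00.0",
    "00000000:02:00.0",
    "00000000:03:00.0",
}

def detected_bus_ids() -> set:
    """Ask nvidia-smi which GPUs the OS can currently see."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=pci.bus_id", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return {line.strip() for line in out.splitlines() if line.strip()}

missing = EXPECTED_BUS_IDS - detected_bus_ids()
if missing:
    print("Check the risers at these PCIe addresses:", ", ".join(sorted(missing)))
else:
    print("All expected GPUs are visible.")
```

Once you know which bus address dropped out, a one-time map of slot to address tells you which physical riser to reseat; the same idea works with lspci for AMD cards.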
 
"Few $ more on the egg, ebay, amazon. Easy to find though." <----------- This.... The Amazon link is $30 and the only one of those in the US... the newegg and ebay links are also Chinese, wait a month to get. The closest to reasonable I've come across was this... https://www.amazon.com/gp/product/B0746H1KY3/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1 like $50, but in the US and with 4 shitty risers, but it can get here fast.

I don't consider stuff sitting in China as actually on newegg/amazon, since at that point they're just shitty third parties and worse than ebay. I did order a couple different ones from ebay though for slightly under $13 a pop. I was hoping there'd be something around like $20 in the US.
 