Seriously Irritated With Mining Setup

username00

1x R9 280x
1x R9 270x
1x HD 7950
1x 850 watt PSU
1x 650 watt PSU
1x Add2PSU (I can't figure out if this was the biggest waste of money and source of problems ever)

Asus A88X-Pro mobo.

The system works with the HD 7950 as the primary video card (no riser) and the other two cards on 1x risers. In that config, the main PSU powers all the risers, the primary video card, and the next one down (R9 280x), and the secondary PSU powers the last card (HD 7950). I'm trying to split the load between the two PSUs, so I moved the weakest card (270x) into the primary video card slot and put the other two on risers. Now Windows just plain will not detect all 3 cards. I moved the last card to a different PCI-E slot and it detected it and installed the drivers; then, right as the drivers were finishing installing, Windows rebooted. It boots back up, reboots again before finishing... black screen, hard reboot. Unplug one of the cards and everything is fine.

I've swapped cards around and played with the PCI-E slots as much as I can; there doesn't seem to be any rhyme or reason to what's going on. The manual lists the shared IRQ assignments, but even when everything is plugged in where it shouldn't conflict (ALL cards in x16 slots), it still doesn't detect all of the cards. Am I missing something stupidly simple? Between yesterday and today I've put well over 15 hours into this, and I'm not exaggerating. Things worked great before my last Intel board blew, but even then I wasn't able to get the 270x to work as the primary card with everything else working too.
 
Are you at any point in your non-working configurations powering a card with two different supplies?
Say, the card that is plugged directly into the MB gets some power through the slot from the MB power supply, while its 6-pin connectors come from the second supply?
Or the powered risers are powered by one supply, while the cards plugged into those risers have their 6-pin connectors powered by a different supply?
 
I wouldn't bother with 2x PSUs on that setup, depending on the PSU. The 850W (if it's a decent one) will power the 280x, 270x, and 7950 just fine. The cards would do something along the lines of 300W + 150W + 225W + 25W overhead = 700W off the wall, meaning the PSU would actually be operating at ~600W (assuming 85% efficiency) or 50A on the 12V rail, well within its operating specs. My 850W PSUs have 70A on the 12V rail.

Have you tried using just a single PSU yet? At least just to boot in Windows, if you're not confident your PSU can handle all 3 cards fully loaded mining.
 
In my experience, you cannot mix and match power to the cards. If you are powering the risers from the PSU that is essentially powering the board, you should also be powering the card from that same PSU. Any time I ran power from a secondary PSU with the riser powered by a different PSU, the card either would not be detected or would not even power on. What happens if you power the mobo and 270x with the 650W (risers connected to this PSU) and the 280x & 7950 with the 850W PSU (risers connected to this PSU)?
 
I wouldn't bother with 2x PSUs on that setup, depending on the PSU. The 850W (if it's a decent one) will power the 280x, 270x, and 7950 just fine. The cards would do something along the lines of 300W + 150W + 225W + 25W overhead = 700W off the wall, meaning the PSU would actually be operating at ~600W (assuming 85% efficiency) or 50A on the 12V rail, well within its operating specs. My 850W PSUs have 70A on the 12V rail.

Have you tried using just a single PSU yet? At least just to boot in Windows, if you're not confident your PSU can handle all 3 cards fully loaded mining.

You also have to factor the mobo/CPU/memory and any drives into the power equation, and 270x's pull 180W. Also, that calculation is what the PSU needs to provide; it will then pull more than that from the wall, depending on efficiency. So say you need 800W of power all-in for the cards, board, etc. and are using a 92% efficient PSU, it's going to pull 800/0.92 ≈ 870W from the wall. The extra wattage is lost as heat due to the efficiency losses. Ideally, you should be running the PSU at around 75-90% load for efficiency. At least this is my understanding of PSU ratings. I may be wrong, and by all means feel free to correct me.

Depending on the CPU though, he very well could undervolt those three cards and most definitely get them all on the 850W.
 
Wow you guys replied fast, today might be a good day lol. Anyways, I read on another forum that ALL the risers NEED to be powered by the first/primary PSU. Then after that, the 6/8 pin pcie connectors can connect to either the first or second PSU. I'm not sure if that's true, but that's how I've connected them and they all worked that way in my previous configuration.

Is 850 watts enough to power all 3 cards, plus the motherboard/CPU/HD? I have a feeling I'll fry it; it's a Thermaltake PSU, but it does get quite warm. My secondary (StarTech) PSU has never felt even close to this warm, even when it was in a "normal" gaming setup, so that's why I think I'm already stressing the 850 watt.

I'm trying to run all the cards at 100% stock settings.

I want to figure this out with both power supplies, because I thought that since I bought a board with a total of 5 PCI-E slots, I could use a total of 5 GPUs (provided that they all get enough juice).

I have 4 risers (so some spares) and have swapped them all around; they all appear to be good. Using risers on ALL 3 cards changes absolutely nothing vs. 2 risers and one card in the mobo. All the risers are powered. I've also tried it without connecting the power to the risers; same thing.

Can someone clear something up about the add2psu? I bought it because I thought I needed it, but if the ONLY power I want to draw from the second PSU is 6/8-pin video card power, I don't even need this thing, do I? Looking at it, I have no idea how it's supposed to distribute the power load; there's no way in hell I'm going to get 650 watts through that single molex connector. Documentation on connecting video cards with this seems to be nonexistent... can't find anything from the company, YouTube, forums, anything.

I can get into Windows 7 just fine with all 3 cards installed and no drivers; once I try to install the drivers, it blacks out. If I move the cards to different PCI-E slots on the mobo, it will get to Windows but still won't detect the 3rd card.

edit for clarity: when I say that it's rebooting with all the cards installed, I mean the PSUs are completely turning off, as if there were some kind of short circuit, and then they turn back on and it tries to boot again. The Windows boot loader always insists on running chkdsk when this happens for some reason. If I don't touch anything and just tell it to boot normally, it will loop like this nonstop. I just don't get it; it's acting like it has power issues, but only when the drivers are installed? And it never detects all 3 GPUs unless the HD 7950 is the primary one without a riser.
 
I think you just explained it. The second PSU, the one being controlled by the add2psu, DOES NOT SUPPLY POWER via that cable. It only turns it on/off.
You MUST use PCIe power cables from the second supply to your GPUs or you are getting nothing from it. Period.
That would also explain why the 850 is hot and the 650 is cold. :)
 
AFAIK the ADD2PSU just turns the second PSU on/off when your primary PSU turns on/off.

I'd also power all 12V-related things from 1 PSU, so risers/GPUs all on the secondary PSU while the rest is on the primary. This way I can get away with a tiny cheap PSU for the mobo/CPU and a server/high-wattage PSU for the GPUs. This may not work though if the secondary PSU has a delay in turning on, which may be where others/yourself are having issues.
 
OK, so in that case I do have the PSUs hooked up right. I wish add2psu had been a bit more straightforward and advertised that it turns on 2 PSUs at the same time, instead of advertising that it allows you to connect 2 power supplies together (apparently it's always been safe and I just didn't read enough into it). The fans do look like they start at the same time on both PSUs. It really is worthless; I thought it provided load balancing or a single common rail to access the total wattage from.
 
You need to do more reading.

1. Are your powered risers cut? Meaning, are the wires that would normally draw current from the PCI-E slot isolated so that they cannot get juice from the board? If they are like normal Chinese risers, then they are not cut, and you have the 12V from the mobo (coming from PSU1) mixing with the 12V from PSU2.

Likely the reason you can't get it to work that way.

2. Add2psu is designed to just turn on the 2nd PSU when the first powers on. That's the only purpose of that connector. Distribute load? No idea what you are thinking with that. Connect PCIe 6/8-pin connectors as normal off of the second PSU when using it.

3. Ideally, using cut powered risers, any video card drawing power from the 2nd PSU should also have its riser powered by the 2nd and ONLY the 2nd PSU.

However, I have had no problem allowing the card to draw power from the mobo on my 2-PSU rigs. JUST DON'T LET THE 12V FROM BOTH PSUs MIX AT THE RISER. ISOLATE THE POWER WIRES.

Check this guy's page and scroll down to the PCI-E power cables.
http://www.xodustech.com/projects/bitcoin-mining-farm
 
OK, so in that case I do have the PSUs hooked up right. I wish add2psu had been a bit more straightforward and advertised that it turns on 2 PSUs at the same time, instead of advertising that it allows you to connect 2 power supplies together (apparently it's always been safe and I just didn't read enough into it). The fans do look like they start at the same time on both PSUs. It really is worthless; I thought it provided load balancing or a single common rail to access the total wattage from.

Oh no, it's an overpriced piece of crap. You would have been better off either paperclipping the green and black leads for a free turn-on solution, or doing what I did (since I wanted both PSUs to turn on and off with the power button) by using this $5 cable at Newegg. http://www.newegg.com/Product/Product.aspx?Item=N82E16812201037
 
Just pointing out an alternative theory. Some boards need IOMMU enabled in the BIOS to correctly see more than 2 cards. My AM3+ MSI 970A-G43 is one of these boards. This setting is found either in the chipset section of the BIOS or on CPU features (don't ask me why). I will browse your A88X manual (slow day at work).

Edit - Found it in your manual. Top of page 3-15 under CPU Configuration. Disabled by default it looks like. Worth a shot.

http://dlcdnet.asus.com/pub/ASUS/mb/SocketFM2/A88X-PRO/E9072_A88X-PRO.pdf

Related thread with similar issue, enabling IOMMU fixed it: https://litecointalk.org/index.php?topic=11484.0
 
The idea regarding the cut risers sounds promising; I'm googling it now, but if anyone has a link with pictures, that would be awesome. To clarify, I have this problem even when I don't hook up the molex connector to the risers, but this should still fix it?
 
Just pointing out an alternative theory. Some boards need IOMMU enabled in the BIOS to correctly see more than 2 cards. My AM3+ MSI 970A-G43 is one of these boards. This setting is found either in the chipset section of the BIOS or on CPU features (don't ask me why). I will browse your A88X manual (slow day at work).

Edit - Found it in your manual. Top of page 3-15 under CPU Configuration. Disabled by default it looks like. Worth a shot.

http://dlcdnet.asus.com/pub/ASUS/mb/SocketFM2/A88X-PRO/E9072_A88X-PRO.pdf

testing this right now...
 
I know on some mobos you need to short the presence/detect pins on PCIe 1x slots to get them to work with risers.
 
No dice. After googling a bit more, it looks like one reason I'm having issues is because I'm using 1x-16x risers in a 16x slot, and apparently 16x slots need 16x-16x risers on some motherboards. I'm digging up some info on this now.

Now I'm even more confused; my cards work fine when plugged into the 1x slots, but not the 16x slots.
 
Here:
https://bitcointalk.org/index.php?topic=102547.0

Only needed for some boards to detect a card using a riser. The picture in the second post is what you need to do. Maybe.

This is for mobos that don't detect GPUs in a 1x slot; mine are detected and working fine in the 1x slots, but when I plug any 1x riser into a 16x slot, they aren't detected. I'm at a complete dead end though, so it can't hurt; I'll give it a try.
 
Wait a minute, that picture shows the 1x slot being shorted with the wire. Are you saying that I should short the exact same pins shown in that picture, except try it on a 16x slot?

edit: reading more of that link, it looks like it worked for another person. Jesus, this might be my cure!
 
Wait a minute, that picture shows the 1x slot being shorted with the wire. Are you saying that I should short the exact same pins shown in that picture, except try it on a 16x slot?

edit: reading more of that link, it looks like it worked for another person. Jesus, this might be my cure!

Yes.

My MSI AM3 board needs the x16 slots to be shorted for the x1 risers to work.

Good discussion here:
https://bitcointalk.org/index.php?topic=36061.0
 
So the wire trick worked for one of my 16x PCIe slots, but had no effect on the others. So now it still does this insta-power-off thing as soon as the drivers load :(
 
^^ That still sounds like a power issue.
In words, can you describe exactly what you have plugged into where?
GPU, MB and PSU connections.
 
^^ That still sounds like a power issue.
In words, can you describe exactly what you have plugged into where?
GPU, MB and PSU connections.

Exactly, don't try to power the card and the risers from +12v lines from two different PSUs. That's a start right there.
 
It's just not gonna happen. I dunno what the deal is with my 7950, but the ONLY way I can get it to work is if it is the primary GPU without a riser. Any other way and it reboots, no matter what, no matter how many different wiring combinations. Once, I got it to detect without rebooting, but it was detecting the 7950 as "R9 200 series" for some reason. Right now, I have the 7950 in the primary spot, with no riser. The R9 280x is in the bottom x16 slot, without a riser. They are both being powered by the 850 watt PSU. The R9 270x is in the top 1x slot and is being powered by the secondary PSU, with the riser being powered by the primary PSU.
 
You still have PSUs mixed on one card so there is that.

The mix up in card type reported gives me another idea.
What does your configuration file or command line for mining look like?
And are you changing it every time you move cards around in slots?
If you are not changing the mining parameters to match each card's new slot every time you move them around, you may be feeding a card settings that were meant for whatever card was previously in that slot, and it may not tolerate them.
I have seen rig reboots on mining start up when I had really bad parameters assigned to a card.
 
It's just not gonna happen. I dunno what the deal is with my 7950, but the ONLY way I can get it to work is if it is the primary GPU without a riser. Any other way and it reboots, no matter what, no matter how many different wiring combinations. Once, I got it to detect without rebooting, but it was detecting the 7950 as "R9 200 series" for some reason. Right now, I have the 7950 in the primary spot, with no riser. The R9 280x is in the bottom x16 slot, without a riser. They are both being powered by the 850 watt PSU. The R9 270x is in the top 1x slot and is being powered by the secondary PSU, with the riser being powered by the primary PSU.

Have it powered by the same PSU.......
 
Have it powered by the same PSU.......

That's the reason for this whole thread: I want the power to be at least somewhat distributed between the power supplies. I don't want to burn my house down trying to make a few hundred bucks a month. What you are suggesting is that I run an R9 280x, an R9 270x, an HD 7950, plus motherboard, CPU, HD, memory, etc., all off an 850 watt power supply? You're nuts. The 270x is the only one not powered by the same PSU, and it's working fine.
 
You still have PSUs mixed on one card so there is that.

The mix up in card type reported gives me another idea.
What does your configuration file or command line for mining look like?
And are you changing it every time you move cards around in slots?
If you are not changing the mining parameters to match each card's new slot every time you move them around, you may be feeding a card settings that were meant for whatever card was previously in that slot, and it may not tolerate them.
I have seen rig reboots on mining start up when I had really bad parameters assigned to a card.

I don't run cgminer with a config file, so to speak; I just save a command line entry as a batch file and run it. No matter where the cards are located, when all 3 are present, the 280x is always gpu0, the 7950 is gpu1, and so on. I'm having these issues before even starting cgminer, though.
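
For reference, the batch file is just a one-liner along these general lines (the pool, worker, and password here are placeholders, not my actual ones):

REM hypothetical example - pool/worker/password are placeholders
cgminer --scrypt -o stratum+tcp://pool.example.com:3333 -u workername.1 -p password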
 
OK, I had a problem just today getting my ASRock board (Z87 Extreme3) to accept a riser in a 1x slot with two other 280x's in the system directly on the board in 16x slots. It just wouldn't detect the riser card (270). After a little research I found that sometimes you need to go into the BIOS and set the PCI-E speed to 1x instead of auto or 2/3x. That did the trick.

My riser card runs on a separate PSU, with the MB PSU providing power to the powered riser. The extra PSU only provides the 6-pin power.
 
OK, I had a problem just today getting my ASRock board (Z87 Extreme3) to accept a riser in a 1x slot with two other 280x's in the system directly on the board in 16x slots. It just wouldn't detect the riser card (270). After a little research I found that sometimes you need to go into the BIOS and set the PCI-E speed to 1x instead of auto or 2/3x. That did the trick.

My riser card runs on a separate PSU, with the MB PSU providing power to the powered riser. The extra PSU only provides the 6-pin power.

That's exactly what I'm trying to do. I'll take a look and see if I have some PCI-E speed settings in the BIOS.
 
My BIOS doesn't have any PCI-E speed settings other than 8x and 16x. I've dedicated such an absurd amount of energy and time to mining since starting last month; I'm just going to let my PSU fry and be done after that. My setup, even with my last mobo/CPU combo, has been fighting me all the way and it's just not worth it. At this point I would get more satisfaction out of destroying everything now than actually getting it to work. If it's not one of the cards dying, it's TeamViewer; if it's not TeamViewer, it's Windows' overall stability... it just needs constant nonstop babysitting. I'm gonna sleep on it. At least I've almost broken even. Thanks for the help guys.
 
That's the reason for this whole thread: I want the power to be at least somewhat distributed between the power supplies. I don't want to burn my house down trying to make a few hundred bucks a month. What you are suggesting is that I run an R9 280x, an R9 270x, an HD 7950, plus motherboard, CPU, HD, memory, etc., all off an 850 watt power supply? You're nuts. The 270x is the only one not powered by the same PSU, and it's working fine.

By mixing 12V from different power supplies you are MORE likely to fry something or set something on fire. A guy on a different forum cooked a few of his 7950s due to using those riser cables and mixing power between 2 PSUs... make sure no component gets power from more than one PSU, period.
 
My BIOS doesn't have any PCI-E speed settings other than 8x and 16x. I've dedicated such an absurd amount of energy and time to mining since starting last month; I'm just going to let my PSU fry and be done after that. My setup, even with my last mobo/CPU combo, has been fighting me all the way and it's just not worth it. At this point I would get more satisfaction out of destroying everything now than actually getting it to work. If it's not one of the cards dying, it's TeamViewer; if it's not TeamViewer, it's Windows' overall stability... it just needs constant nonstop babysitting. I'm gonna sleep on it. At least I've almost broken even. Thanks for the help guys.

Why are you not willing to try hooking the cards and risers up to the same shared PSU instead of mixing and matching? It has been beaten to death already. Try it. :eek:
 
Tomorrow, I'm absolutely beat.


edit: last update for today, I just can't leave it alone, it's racking my brain....


The reason I didn't want to use just the 850 watt PSU:

I started with the 650 watt PSU; after reading some litecoin mining guides, the general consensus was that the ADD2PSU is a great way to get to a 6+ GPU miner. But what you guys are telling me is that there is basically no way to hook up two power supplies? Because even if I power the risers and cards with just one PSU, the PCI-E slots themselves are all still going to be powered from the PSU plugged into the motherboard. I have to be overlooking something.

I also don't want to do that, because my original plan was to get possibly around 5 GPUs on one mobo and be done; I don't have room for a large setup. If I can't use two power supplies, that means that 1) I wasted $30 on this stupid adapter, 2) I only thought I was saving $150 by buying an 850 watt PSU instead of 1200 watts, and 3) I can't add any more cards at all without either shelling out $300+ for a power supply or building a new miner altogether. This might all be true, but I'm having a hard time accepting it, as that means all my plans for this system are basically trashed.

OK, I'm done for the night for real this time; I'm just going to hurt my brain if I keep thinking about it lol.
 
I understand you're frustrated. All of us have been at one point in this game for sure.
I won't try and convince you not to give up but will make a couple more points about items you seem to misunderstand.
First, spreading the power between two PSUs across the rig is just fine and done all the time.
What is being recommended above correctly is to make sure that all the power to any one individual GPU comes from one PSU only.
So, if you have the cable resources, try this setup:
1) Make your 650W supply the primary PSU (it now controls the 850W supply via the fancy cable you bought)
2) 650W supply is now connected to all MB power input connectors
3) Put your choice of GPU directly into the #1 slot (x16 I suppose) of the MB
4) Connect a PCIe power cable between the primary 650W supply and all 6/8 pin connectors of this first GPU
5) Via powered riser, connect the second GPU to the MB
6) Run all power for this riser from the secondary PSU, 850W model
7) Connect any 6/8 pin connectors of this second GPU only to the 850W supply
8) Via powered riser, connect the third GPU to the MB
9) Run all power for this riser from the secondary PSU, 850W model
10) Connect any 6/8 pin connectors of this third GPU only to the 850W supply
Now you have the power distributed pretty well between the two supplies but no card is connected to both supplies in a bad way.
One GPU and all the MB components on the smaller supply and the other two GPUs on the larger supply.
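
Laid out another way, as a rough sketch (exact connector counts depend on what cables your units actually have):
650W (primary): 24-pin ATX + CPU power to the motherboard, plus PCIe 6/8-pin cables to GPU #1 (the one in the x16 slot, no riser).
850W (secondary, switched on by the add2psu): molex to the powered risers for GPU #2 and GPU #3, plus PCIe 6/8-pin cables to GPU #2 and GPU #3.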

Now about this statement:
I don't run cgminer with a config file, so to speak; I just save a command line entry as a batch file and run it. No matter where the cards are located, when all 3 are present, the 280x is always gpu0, the 7950 is gpu1, and so on.
Sorry, but unless I am crazy, that is a total misunderstanding of how the GPUs are numbered.
They are numbered by the system when it detects them. And when you swap them in slots they WILL take on a new number, thus needing a new order in the command line.
If you want, paste your cgminer command line here and we can help you sort through it.
Running "cgminer -n" no quotes and no other flags should report to you what GPU is numbered what in their present locations.

Good luck and hope this helps.
 
Now about this statement:
Sorry, but unless I am crazy, that is a total misunderstanding of how the GPUs are numbered.
They are numbered by the system when it detects them. And when you swap them in slots they WILL take on a new number, thus needing a new order in the command line.
If you want, paste your cgminer command line here and we can help you sort through it.
Running "cgminer -n" no quotes and no other flags should report to you what GPU is numbered what in their present locations.

Good luck and hope this helps.


It doesn't matter what slot or how many cards are in (unless it's only 1); they all are assigned the same GPU number, every time. No matter what, the 280x is always gpu0, and I can confirm this by starting cgminer on gpu0 only and physically putting my hand behind it to feel that card getting warm. It's like they have permanent numbering.
 
Read Core32's post. He's being generous with his time typing all that out, and basically echoes what like five of us are saying, lol.
 
It doesn't matter what slot or how many cards are in (unless it's only 1); they all are assigned the same GPU number, every time. No matter what, the 280x is always gpu0, and I can confirm this by starting cgminer on gpu0 only and physically putting my hand behind it to feel that card getting warm. It's like they have permanent numbering.

Have you at least tried cgminer -n?
Just for my sanity.
What you are telling me goes against what I have observed over two months' worth of mining.
 