Best options for folding on multiple RTX 4--- cards?

Doozer

So, I'm still waiting for the last house to sell to get my inheritance from when my Mom died.


I want to build something for folding at home with multiple RTX 4--- GPUs to do cancer research.
What's the best plan of attack to get the most PPD for the least heat production?
I don't want to walk into an oven when I go to my office.

I want to just throw a couple of 4090 cards in something to maximize PPD but I don't know how hot they get.

As far as the motherboard, etc. goes, what would support multi-card folding? Just anything that has enough PCIe slots?
Would one of the mining boards make sense?

For those of you running more than one RTX card on one PC for folding, what is your setup? I'm looking for ideas.
 
If you are looking at 3 or more GPUs, a Threadripper or EPYC server motherboard is one way to go. Here is a picture of the setups of a DC member on another team, with 7 Titan Vs in each: one air cooled and the other water cooled (single slot, which lets you sardine-pack the GPUs). Both run on EPYC motherboards with Rome CPUs providing 128 lanes of PCIe 3.0. I believe all the cards are fed with at least PCIe 3.0 x8, which is more than enough for folding and other BOINC projects.

Not a big fan of doing PCIe bifurcation, which is what lets you run more than 2 cards on a consumer-grade motherboard. An older Threadripper like the 2950X can already provide 64 lanes, but if your inheritance is sizable, get the latest and greatest one, lol.
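To put rough numbers on the lane math above, here is a quick sketch. The EPYC and 2950X lane counts come from this post; the "typical consumer CPU" figure and the x8-per-card requirement are assumptions for illustration only:

[CODE=python]
# Rough PCIe lane budget check for a multi-GPU folding box.
# The consumer-CPU lane count below is an assumption for illustration.

def lanes_needed(num_gpus, lanes_per_gpu=8):
    """Total CPU lanes the GPUs alone would consume at x8 each."""
    return num_gpus * lanes_per_gpu

platforms = {
    "EPYC Rome (single socket)": 128,
    "Threadripper 2950X": 64,
    "Typical consumer CPU": 24,   # assumed ~24 usable CPU lanes
}

for name, lanes in platforms.items():
    for gpus in (2, 4, 7):
        need = lanes_needed(gpus)
        verdict = "fits" if need <= lanes else "needs bifurcation/switches"
        print(f"{name}: {gpus} GPUs @ x8 -> {need}/{lanes} lanes ({verdict})")
[/CODE]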

Cooling the cards will obviously be the main factor, so if you have good cooling and a beefy PSU, go for 4090s :)



[Attached images: the air-cooled and the water-cooled 7x Titan V EPYC builds]
 
My opinion:

I would not use a mining board. Your project warrants a better platform, and the F@H workloads definitely want the PCIe bandwidth you won't get from a 1x slot. Those 1x mining rigs work fine for most BOINC projects, but for the competition we just had, I bought a half dozen PCIe 3.0 riser ribbons to swap gear off the 1x slots and onto full slots. It made a difference.

Figure out what is most important, priority-wise: efficiency, speed, etc. Start there. I have a couple of 3060s that are going to keep folding for a few more days, and they are enough on their own to heat up a home bedroom/office. But they are more efficient than a similar system with 3070 Tis. 4090s? My 4090 is water cooled and it kicks out a lot of heat when running at 350-450 watts constantly. Times two? Is one 4090 better than 2x 4070 / 4080 Supers? How much juice will those chew through? Does it make sense to get bigger, hotter hardware and gimp it? Or get mid-range gear and let it run?

Watch the PCIe lanes and bandwidth you'll need. F@H gives a bonus based on how quickly your gear returns work units. Keep that in mind if you are trying to max out points. So putting a fast GPU at 60% power on a 1x slot is not a good plan.
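For reference, that bonus is usually described as a quick-return bonus of the form final points = base points x max(1, sqrt(k x deadline / elapsed)). Here is a minimal sketch of why finishing WUs faster pays off more than linearly; the base points, k, and deadline values below are made up for illustration, not real project numbers:

[CODE=python]
import math

def wu_points(base_points, k, deadline_days, elapsed_days):
    """Quick-return bonus as it is commonly described for F@H:
    final = base * max(1, sqrt(k * deadline / elapsed))."""
    return base_points * max(1.0, math.sqrt(k * deadline_days / elapsed_days))

base, k, deadline = 50_000, 0.75, 3.0   # illustrative values only
for hours_per_wu in (8, 4, 2):          # same WU finished progressively faster
    pts = wu_points(base, k, deadline, hours_per_wu / 24)
    ppd = pts * 24 / hours_per_wu       # more WUs per day on top of the bonus
    print(f"{hours_per_wu} h/WU -> {pts:,.0f} pts/WU, ~{ppd:,.0f} PPD")
[/CODE]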

Make sure your breaker can handle it and whatever else you have in the room. I only tripped one breaker last week.

Are you planning on CPU DC projects? If not, get a power-efficient CPU that supports PCIe 4.0 and enough PCIe lanes for the hardware you are running. If you are going to crunch DC on the CPU, consider the projects you'll run. It looks like consumer/commercial AMD has an edge in many projects these days, not to mention PCIe bandwidth support. I'd seriously look at an EPYC box. Add up the space and energy needed.

Chassis? Power?

Personal experience: I have a couple of old 20-core Xeons that power some consumer 2011-v3 motherboards and keep 3-4 GPUs busy, mostly 3070 Tis, since they were OK bang/buck when I built them. They are dual-use CPU/GPU crunchers - miners. Whatever I'm doing. I'm happy with them, but they are older, not nearly as power efficient, and getting creamed by newer gear. I also have a couple of 6700/7700 Skylake systems doing GPU-only duty. I'm OK with those, but I'm seriously considering swapping them for AMD-based platforms. I like the versatility of CPU/GPU production.
 
My really quick research could be wrong, but according to it there's no advantage to running a multi-GPU system versus two separate systems, other than saving space and cost where that matters.

Especially if we are talking about going EPYC/Threadripper instead of two used, regular cheapo affairs, it may not even save cost, and I don't imagine the data-to-compute ratio is a big one here (i.e. with only 2 GPUs, I would verify before worrying that running at, say, PCIe 4.0 x8 is an issue at all).

For a 2-GPU system like the one being talked about, maybe anything with two full PCIe slots spaced far enough apart would do; having one of the two 4090s be a water-cooled model to save vertical space, if needed, would be the simple solution.
 
Titans in a dual-Xeon box for last week. The RTX Titan ran in "Unicorn" mode for many of its WUs. She's very proud of herself, as you can tell. They're going back into their boxes now. I had the CPUs doing some BOINC stuff during the F@H challenge. Fun little box.


[Attached image: the dual-Xeon Titan rig]
 
So, dual Xeons or just a single current-gen CPU if only running two GPUs?
This will be a full-time 24/7/365 folding@home box.

Water cooling would be nice but my only experience with it is using the occasional AIO so I'm not sure about going that route.
 
I would recommend a current-gen Intel CPU to feed them. In the last few months I've built several machines, the first two being 4090-based, but one on a 13600K and the other on a Ryzen 7800X. The AMD system was consistently a few seconds off-pace from the Intel system (doing the same WUs, of course). Now, while a few seconds might not seem like much, it comes out to a few million PPD per card. If you're building now, I see no reason to NOT maximize all you can get.
 
Here's my $0.02 as someone who is currently building multiple 4-GPU folding boxes.

If using Gen4 GPUs, use Gen4 mobos that support x16 lanes per GPU. Could you get away with x8 per GPU? Sure, BUT I am thinking of future GPU releases.

My current builds are W790 and X670E w/ a ROMED8-2T in the works atm as well. My preference is DDR5 systems, but I always like good deals on used server hardware. CPUs and GPUs are all watercooled and being shoved into a 42U HP Rack. Building my own custom CDU and rack manifolds for this project to handle all the cooling along with using four MO-RA3 420 rads for plenty of capacity.

All rigs will be in Alphacool 4U chassis as, from what I've found, those give me the best bang for the buck in terms of flexibility and options.

Quick run down of builds underway:

W790 SAGE, W9-3475X, x4 4090 w/ EKWB Pro Blocks
X670E ProArt, 7800X3D, 4090 w/ EKWB Pro Blocks(Gaming rig mostly but helps when it can)
ROMED8-2T, EPYC 73F3, x4 4090 w/ EKWB Pro Blocks
WRX90E SAGE or W790 SAGE, 16-36 core cpu, x4 4090 w/ EKWB Pro Blocks
GENOAD8X-2T/BCM, 16-36 core cpu, x4 4090 w/ EKWB Pro Blocks


Looking to have 4-5 4U chassis with 4 GPUs each. I currently have 40 A of 240 V and 40 A of 120 V available solely for the server rack, so powering these beasts shouldn't be an issue.
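As a sanity check on that power budget, here is the back-of-envelope math. The 80% continuous-load derating is the usual rule of thumb, and the per-rig wattage below is an assumption for illustration, not a measurement:

[CODE=python]
# Rough circuit-capacity check for the rack feeds described above.

def usable_watts(amps, volts, derate=0.8):
    """Continuous-load capacity of a circuit at the usual 80% derating."""
    return amps * volts * derate

rig_watts = 4 * 450 + 300   # assumed: four 4090s at stock plus platform overhead

for label, amps, volts in (("240 V / 40 A", 40, 240), ("120 V / 40 A", 40, 120)):
    watts = usable_watts(amps, volts)
    print(f"{label}: ~{watts:,.0f} W usable -> ~{watts / rig_watts:.1f} rigs of this size")
[/CODE]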


If you ARE going with dense systems with 40-series GPUs, though, water cooling IMO is the ONLY way to keep them in check temperature-wise.
 
One other thing that I will say: IMO, going for something like an older 8-GPU Supermicro chassis is not worth the headache if you're going with consumer GPUs.
 
Water cooling would be nice but my only experience with it is using the occasional AIO so I'm not sure about going that route.
You can get an AIO card (at least at the 4090 level); MSI has a popular one:
https://www.msi.com/Graphics-Card/GeForce-RTX-4090-SUPRIM-LIQUID-X-24G

You probably don't need that much CPU for running only 2 GPUs (at least, that's something to verify); maybe a nice 13100 would be more than enough. It's the GPU doing the work, one thread per card would do, and a 13100 has 8 really strong ones.
 
Does anyone have any experience with running two or three AIOs in the same case? I see plenty of cases on Newegg that support AIOs in multiple spots but nothing to say if you can run 2 or 3 of them at the same time (two GPUs and CPU). Googling is not giving me much either.
 
It's no problem if you can fit them. Two is pretty straightforward - CPU and GPU, probably top and front mounted. Maybe side-mounted on the back panel if you get a case that supports it that way. Measure thrice.

If you are looking at three, I'd say look to a custom loop. Might be a little more $ up front, but you will get lots of benefits down the road.

I had three on my SR2 / 970 build years ago. Swapped the Corsair AIOs for EVGA 280s when they came out. Still works. The original case was a Corsair 900D.

[Attached image: the SR2 / 970 triple-AIO build]
 
Now that looks sweet. I had an SR2 back in the day. I don't remember what happened to it.
 
I'm thinking about a nice big case that supports two 360 mm rads for the GPUs (Gigabyte makes one with a 360 mm AIO), plus an NH-D15 for the CPU. Probably nothing over a 14600K for a CPU with that cooler, though. Mine gets up to the low-to-mid 90s while folding.
 
I would go custom loop if going with three or more pieces of hardware. 2P E5-2695 v2 and 2 Tesla P100 GPUs. Simple, old school.

[Attached image: the 2P Xeon / dual Tesla P100 build]
 
I'm wary of doing my first loop on hardware this expensive. I'll consider it though.
I did find a case that Google says supports three 360 mm radiators though, the NZXT H9 Flow.
 
The trouble with big cases is that sometimes the water lines are too short to reach the front, and you have to mount the rad with the lines entering at the bottom, so the air goes to the top of the rad and stays there.
With smaller cases, sometimes an AIO will not fit at the top because of the heatsink on the MB chipset or the 8-pin CPU wires. It can be very tricky.
 
Yeah, there may be some trial and error when it comes time to build to see what works. I'm not above modifying some sort of open-air tray/frame.
 
It would be so nice if you could order AIOs with different-length lines. There are AIOs with quick-disconnect fittings; I believe they come prefilled, and maybe you can get line extensions.
I think they run a little more $$, but they're maybe worth looking into. See what their reviews are like.
 
I didn't see any suggestions for power limiting the cards. Basically all of the Nvidia cards going back to the 980 hit their maximum efficiency at a 55% power limit, +/- 5%. My 3080 Ti kicks out dramatically less heat at 55%, and bumping it up to 110% really cranks up the heat but adds less than 1M PPD on average.
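If you want to script that, a minimal sketch with nvidia-smi (run with admin/root rights) looks like the following. The 250 W figure is just an example of roughly 55% of a 450 W card; check your card's supported range with nvidia-smi -q -d POWER first:

[CODE=python]
# Cap each card's power limit via nvidia-smi (requires admin/root).
import subprocess

def set_power_limit(gpu_index, watts):
    """Apply a power cap to one GPU (lasts until reboot unless reapplied)."""
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

# Example: roughly 55% of a 450 W board power limit on GPUs 0 and 1.
for idx in (0, 1):
    set_power_limit(idx, 250)
[/CODE]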
 
As I've been thinking about it, if I were going to try to fold with 4x cards, I'd probably build one system now with 2x 4090s and use it as a learning experience. The benefits: I think you can get away with running dual 4090s on air in the right case, you don't have to worry about exotic power supplies (a 1200 W Seasonic Titanium will be fine), you don't have to worry about overloading a circuit, you can heat 2 different rooms with a PC in winter, you can work out the Linux install (Linux folds faster), and you can establish how much power limiting will affect points per day. Once you work out all the details on the first system and have it working exactly the way you want, you might be really close to the release date for 5090s, in which case you can either sell your 4090 system and build a 4x 5090 system, or you can just build a 2x 5090 system and let them run in parallel.

Questions you probably want to think about:
Do you want to let the CPU be idle to maximize PPD out of your video cards? Do you want to CPU fold (points per watt is much less than GPU folding)? Do you want to run BOINC on the CPU?

Will Intel or AMD be better? If you're using the CPU for folding or BOINC, a 7950X is almost certainly the best option. If you're just letting the CPU idle (aside from feeding the GPUs), maybe a 7800X3D or an Intel CPU would be better?

How much RAM do you need? If you are letting the CPU idle and you only need RAM for folding, you can probably go for 16 GB of very fast RAM. If you're going to use the CPU for other things, you'll want more, probably 32 GB or 64 GB.

Do you want to try a custom loop? If so, maybe try it on something a little smaller than a 4090 system. If you've never done a custom loop before, I can almost guarantee that you'll want to change things after you build your first loop, and you'll learn a lot. I've been doing custom loops for 20 years and I'm still not 100% happy with what I have :D

How much do you want to power limit the cards? Lots of people have tested, and I've confirmed in my own setups, that you'll generate more points per day running two 4090s at a 50% power limit than one 4090 at 100% (for more info on power limiting, here is a nice site with data: https://greenfoldingathome.com/ ).
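To make that two-at-50% argument concrete, here is the back-of-envelope version. The fraction of PPD retained at a 50% cap is an assumption for illustration; real numbers vary by WU, so lean on the measured data at the site above:

[CODE=python]
# 2 cards at a 50% power cap vs 1 card at 100%, at roughly the same wall draw.
FULL_POWER_W = 450        # approximate stock power limit of one 4090
RETAINED_AT_50PCT = 0.75  # assumed share of full-power PPD kept at a 50% cap

one_card_full = 1.0                       # normalized PPD of one card at 100%
two_cards_capped = 2 * RETAINED_AT_50PCT  # two capped cards, same total watts

print(f"1 x 4090 @ 100%: {one_card_full:.2f} PPD units on ~{FULL_POWER_W} W")
print(f"2 x 4090 @  50%: {two_cards_capped:.2f} PPD units on ~{FULL_POWER_W} W total")
[/CODE]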
 
Doozer, I have one of these that I used for mining lying around; it has all the cabling and the original box, and it was used for 4-5 months before I started selling off while GPU prices were still elevated. We could work out a fellow Hardforum DC price if you are interested. I also have an open-air mining frame similar to the black square rail frame with the GTX Titans in the photo above, but it is unpainted aluminum.
 

Attachments: IMG_4183.jpeg, IMG_4042.jpeg