Pictures Of Your Dually Rigs!

Too much work. Pound out replies and post! I never understood how more posts are a problem, since a forum's 'activity' is based on posts and so is a user's 'rank'. Kinda hurts everyone to consolidate--and it takes more time to boot.

Please keep posting as you always have. I enjoy your posts and will never complain.
Problem with this board and many others is there are too many wannabe mods and not enough indians. :p
 
I wasn't necessarily complaining, either. I was just being more of a smartass than anything. I'll let Kyle and crew do the mod duties around here. I'm good running a Discord server of ~60 people lol

I'm just another little injun. Who wants the peace pipe? :ROFLMAO:
And the mod crew around here is no joke if you get tangled with them! I know a bunch of users who have in the past. ;)
 
Even the older LGA1366 and LGA2011 Z-series machines were stout. I have a Z600 and a Z420 and love them both. Both have seen duty at up to 90F and never complain. I have purposely set their fans to 100%, though, considering the service duty.
 

4 posts in under 2 mins... seriously? Consolidate your posts like you do your servers :p

Postcount++

:p

More seriously though, sometimes when you are replying to one post, you don't know you are going to be drawn to reply to another shortly, and you don't necessarily want to edit, because then the person you are replying to may miss the reply.

I don't mind the separation between replies though, it can make things more organized and readable.
 
This is 100% the only reason I'm guilty of it from time to time.
 
Yep, pretty much happens to me as I will open the reply in a new window so I can continue reading where I left off and get my thoughts out and posted. Sometimes that backfires a bit as someone else addresses the same issue or there's more info later in the thread, but hey this is free 'content' for the website so I'm not going to spend more time than necessary. :D
 
I've been here since 2001 and have only 2415 posts. Call me a conservative. :p
I've been on some forums for over a decade and don't even have 100 posts, lol. It depends on what the activity is and the community, really. I dig the vibe of the people here, so I'm back regularly. :)
 
I heard about the Z840 workstation series from a friend. I like the internal layout a lot. I also like that HP doesn't skimp out on the DIMMs per channel like Dell did in the Precision T3600/7600 (I think it was common across the entire SB-EP line; please correct me if I am wrong!), giving you only 1 per channel instead of 2.

There's an auction house that has a location here for some of their stock; if I see a Z440/840 there, I'm definitely going to nab one to play around with.
I really love my Z840. It's on 4th-gen-era Intel Xeons, but with the right config it's able to keep up with or exceed many current desktop CPUs.

With that said, as much as I love it, would I buy one now? I'm sort of on the fence. Great box, don't get me wrong, but it's going to be harder to justify as it ages (as far as getting one that's "new to you").

But still, there's a part of me that says "Buy it!!"

One thing: it is very custom. The passive heatsinks on the CPUs are specially designed around very custom ductwork and a fan shroud. So much so that I purchased a spare.
 
I like my Z840. I bought it a couple years ago specifically to play around with a new-to-me dual-CPU system that I use for distributed computing. The v4 Xeons are cheap and the HPs are pretty versatile. I ran it during the last couple challenges we've had for PrimeGrid and the Pentathlon. It fits a couple low-end NV xx60-type GPUs and has 40c/80t for CPU work in a fairly small space. And it just runs. No issues. So, lots of CPU crunching and a pair of small 3060s make a tight little box.

However, I just bought a single-CPU Z6 off lease and will either add the 2nd CPU module to it or, if looking for a dual system off lease today, I'd look at the newer Z8s. 2nd-gen Intel Scalable CPUs are plentiful. You just have to watch compatibility, as there are a bunch of cheap off-roadmap CPUs that may not work with the motherboard in some systems.

(photo of the Z840 build)
 

Same,

While my Threadripper 3960x is not a "dual CPU" solution, and is newer than an E5-2698 v4, the comparison is somewhat relevant from a "many core" perspective. Having launched in 2019 vs. 2016, even it is starting to lose relevance compared to newer consumer parts with fewer cores (as long as you don't need quad-channel RAM or a large number of PCIe lanes).

While it is really cool to pick up these older many core dual socket systems, and you can get a lot of rendering power for not very much money if you time it right, that window is pretty small.

My 24C/48T Threadripper 3960x scores ~32,500 in Cinebench R23, which was amazing when it was new, but compared to newer consumer chips it's not as impressive. A MUCH cheaper Ryzen 9 7900x is nipping at its heels with ~29,500, and a Ryzen 9 7950x beats it handily at ~38,700. And my Threadripper absolutely has its ass handed to it by both of these newer chips in single-core loads (2072 and 2042 vs. my paltry 1337).

It seems like just yesterday I bought this thing, but I guess a lot happens in 3.5 years. At least my single core score is 1337 :p
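To put those side by side, here's a quick back-of-the-napkin comparison using just the scores quoted above (nothing official; the Ryzen core counts are from memory):

Python:
# Cinebench R23 numbers from this post, normalized to the 3960X
scores = {
    "Threadripper 3960X (24C/48T)": {"mt": 32_500, "st": 1337},
    "Ryzen 9 7900X (12C/24T)":      {"mt": 29_500, "st": 2072},
    "Ryzen 9 7950X (16C/32T)":      {"mt": 38_700, "st": 2042},
}

base = scores["Threadripper 3960X (24C/48T)"]
for name, s in scores.items():
    mt_ratio = s["mt"] / base["mt"]
    st_ratio = s["st"] / base["st"]
    print(f"{name}: {mt_ratio:.2f}x multi-core, {st_ratio:.2f}x single-core")

The 7950x works out to roughly 1.2x the multi-core, and both Ryzens land around 1.5x the single-core, which tells the whole story in two numbers.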

My old dual-socket E5-2650 v2 server feels like it is hobbling along at this point, using more power and having less to show for it than a newer system would. I really should replace it with something newer, but I am a homeowner now, and something always seems to have higher priority than spending a few thousand bucks on a new server...
 
With that said, as much as I love it, would I buy one now? I'm sort of on the fence. Great box, don't get me wrong, but it's going to be harder to justify as it ages (as far as getting one that's "new to you").
The thing that will always make the justification easy is the price/performance ratio. When these workstations enter the used market, they usually drop massively from their new price. And as they age, the price falls off even faster. And then there are the private parties that no longer need them--which is where the real bargains are.

Still, the cheap performance is always in rack-mount servers (when it comes to duallys)--people will basically give them away once they're older, even when their workstation twin still commands some dough. And you can put one in the garage and RDP into it and all is good!
 
One challenge in recent times is that it can be hard for an old dual rig to beat a simple 3900x-5900x type of build for MT performance if 64-128GB of RAM is enough, and those can be simple and cheap to get.

A simple 3900x has a ~33k MT PassMark; a 64GB system with a Radeon 6700 and a 1TB SSD goes for $650 on eBay, and maybe you can find something under $500 with no GPU.

You need a dual Intel Xeon E5-2699C v4 and their 44 cores / 88 threads to match that, and you will have half to 66% of the single-thread performance and will only match the 3900x for things that can scale well past 80 threads.
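Another way to see it is per-thread throughput, just dividing the MT scores above by thread count (a rough illustration, not a benchmark):

Python:
# Rough per-thread contribution, using the MT PassMark figures quoted above
ryzen_mt, ryzen_threads = 33_000, 24   # Ryzen 9 3900X, 12C/24T
xeon_mt, xeon_threads = 33_000, 88     # dual E5-2699-class v4, 44C/88T, roughly matching

print(f"3900X per thread:     {ryzen_mt / ryzen_threads:,.0f}")
print(f"dual Xeon per thread: {xeon_mt / xeon_threads:,.0f}")
# ~1,375 vs ~375 per thread: the dual box only pulls ahead on jobs that
# genuinely scale across 80+ threads; anything lighter favors the 3900X.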
 
I have a very old Supermicro 1U dually server based on an X7DWU with two E5472 Xeons and 32GB of RAM. It also has an add-in IPMI daughter card and four 10k SAS HDDs. It has a Dell PERC H310 SAS controller in IT mode, so it can handle just about any drive I could throw at it, and it still functions flawlessly. I fire it up maybe 2 to 3 times a year and would not hesitate to gut it and toss it in the dumpster if it ever failed. The only reason I don't utilize it for a useful purpose is that it consumes gobs and gobs of power when running.
 
One challenge in recent times is that it can be hard for an old dual rig to beat a simple 3900x-5900x type of build for MT performance if 64-128GB of RAM is enough, and those can be simple and cheap to get.
Yep, single-thread performance is where a lot of these workstations fall short compared to more modern CPUs, which have come up in core count as well, so they can now do similar 'thready' jobs, and faster, thanks to the higher IPC.

But IME, this is where the cost factor comes in, as the 'slower' systems will be cheaper--sometimes so much cheaper that you can just pick up 3-4, cluster them, and have more performance for the same price.
 
That old Supermicro is still a capable machine to do something with if you need it to. Reliability is probably its strongest point, as it was built to a much higher enterprise standard.
 
How do you guys deal with the lack of HD capacity in newer cases? I've had to move to rackmount chassis to get the capacity I need, and then I have to deal with the noise.
 

My Dual CPU system is my NAS (among other things). It's essentially a VM server.

Noise is an issue with many rackmount cases, but you can mitigate it.

I have a SuperMicro SC846 4U case for mine. The noise in that case comes from three places: the power supplies, the fan wall, and the exhaust fans.

You can replace the two 80mm exhaust fans in the back with desktop equivalent PWM fans (I went with Noctua).

I took the entire fan wall out, and replaced it with a custom solution with three 120mm desktop Noctua PWM fans, which is much quieter.

They fit perfectly across. I just made my own wood strip to block the top and hold them in place.

The power supplies that usually ship with the case are very loud, but there are alternative quiet PSU's for it that make it much more reasonable.

(photo of the three 120mm Noctua fans mounted across the SC846 fan wall)


It's reasonably quiet (though maybe not quite as quiet as a desktop can be; not sure, as it sits in my rack with other noisy stuff).

And you get 24 drive bays.
 

Awesome. 😎
 
Finally decided to replace my Supermicro X8DTi (a pair of X5670s, 64GB of RAM, and an R9 290) with something a lot more reasonable.

Making a fun little project out of (heavily) modifying an old Dell OptiPlex 5070: an i7-8700 with 64GB of RAM and an AMD 5600 XT. This little Dell uses less power at full load than my dual-Xeon system was using at idle!

I'm a bit sad because it's breaking a nice streak I had going of dual processor home PC's.

* HP XW6200, 2x Xeon 3.8GHz (2C/4T)
* HP XW6400, 2x Xeon 5160 (4C/4T)
* Supermicro X8DTi, 2x Xeon X5670 (12C/24T)
 
Here's my spaghetti mess of a server

(photos of the server build)

2 x Intel Xeon E5-2697 v2 @ 2.70GHz (24 cores, 48 threads)
128GB of DDR3 ECC
Supermicro X9DRi-LN4F+ V1.20A
2 x LSI controllers flashed to P20 IT-mode firmware
Coolers came off AliExpress :p

It's a TrueNAS Core 13.0-U5 server (10 or so iocages: Plex, Sonarr, NZBGet, Bazarr, etc.; I also have a MinIO iocage running so my Unraid server can back up to MinIO via restic).
I recently added three additional 12TB helium drives, which are not shown in the picture. I managed to cram one more at the very top and the other two at the bottom left by the PSU. This brings my total to 15 hard drives and 4 SSDs. I also opted to 3D print the bay mounts using glow-in-the-dark filament; I think buying them would have cost at least $120-$140, so no thanks.
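For anyone curious, the Unraid-to-MinIO leg is just restic pointed at MinIO's S3 endpoint. Here's a minimal sketch of that kind of job (the endpoint, bucket, credentials, and paths are placeholders, and it assumes restic is installed wherever the script runs):

Python:
import os
import subprocess

# Placeholder MinIO endpoint/bucket and source path -- substitute your own.
REPO = "s3:http://192.168.1.50:9000/unraid-backups"
SOURCE = "/mnt/user/appdata"

env = dict(
    os.environ,
    AWS_ACCESS_KEY_ID="minio-access-key",    # MinIO credentials
    AWS_SECRET_ACCESS_KEY="minio-secret-key",
    RESTIC_PASSWORD="repository-password",   # encrypts the restic repository
)

def restic(*args):
    """Run a restic command against the MinIO-backed repository."""
    subprocess.run(["restic", "-r", REPO, *args], env=env, check=True)

# restic("init")  # one-time repository setup
restic("backup", SOURCE)                     # incremental snapshot
restic("forget", "--keep-daily", "7", "--keep-weekly", "4", "--prune")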
 

Nice.
Which case is that?
 
It is the Fractal Design Define 7 XL. There are two versions available: one featuring a side panel with a glass window, and the other, like mine, with a regular side panel. This case is highly modular: all the internal walls, floors, and hard drive mounts come off and can be reconfigured depending on your setup. It was also the only case I found that is compatible with both E-ATX and EE-ATX motherboards. I'm currently using an EE-ATX motherboard in this case.
 

Ah yes... I remember considering buying it for the exact same reason (EE-ATX), but then I modded a Lian Li I already had. :)
 
Dell Precision Tower T7910
2x Intel Xeon E5-2630 v3 @ 2.4GHz
2 sockets, 16 cores, 32 logical processors
64GB memory
1300W power supply
Mushkin 500GB M.2 boot drive
4x Samsung 500GB SATA SSDs set up as a 1TB RAID array (LSI hardware RAID)
6TB WD Red HDD
Nvidia Quadro K4200 video card
Ubuntu 22.04.3
(photo of the T7910)
 
Nice! And nice socks too! I just looked down and noticed I'm wearing the same, lol. :D
 
Yeah, this beastie is to replace an Intel NUC as the host for my UniFi network controller and my Jellyfin setup, and as my general Linux mess-around box.
Was only $400 on fleabay.
(Compression socks FTW)
 
The Dell and HP workstations are killer when bought used. What's even more killer, if you can stomach server noise, are the servers--a lot of times they're half the price of the desktops for the same amount of computing power, and they usually have more storage bays. The downside is that you need to rack them somewhere, but when you have the ability to do that, it's a game changer. ;)
 
Yeah, I don't have a rack, my storage needs are met by a Synology DS1821+ with 8x16TB drives :p
 
I've seen stands that turn them on their side, and honestly you can even just leave them on a shelf, etc. Xpenology and TrueNAS work well on these servers as well. ;)
 
Better cool those Cheetahs! :eek: Those things ran pretty hot and didn't like heat buildup at all.
Ya, almost burnt my hand after I took that pic, didn't realize that they would get that hot.

I took mine apart a few years ago and the internals are really beefy, especially the magnet assembly--it looks like a tiny brake caliper.
And the platters are a bit smaller than a normal 3.5" HDD's, and there are a bunch of them.
The platter stack is sitting on a standard 3.5" platter for reference.
(photo: the Cheetah's platter stack sitting on a standard 3.5" platter)
 

Sadly, my SC846 dually above is nearing decommissioning.

While there was a cool factor involved with dual socket, this time around single socket was more effective for my next server upgrade.

I'm moving to an Epyc 7543 (Milan, 32C/64T, 2.8GHz base, 3.7GHz boost).

It's twice the cores in half the sockets. While there is less cool factor, the performance will see a huge increase, the power use will drop significantly, and the license fees will drop by half.
 
I'll have to drag out my old server; it's an Asus A7M266-D with a pair of Barton MPs in an Antec SX1240 case. The thing weighs a metric ton, though, and is a pain to move. Pretty sure the PSU in it died after the last time I had it running to pull all my old data off the drives--it was a capacitor-plague Antec PSU... amazing it worked the last time I booted it up.
 

I lied.

New server is up and running, and I couldn't bring myself to decommission my dually, so I used it to upgrade my testbench build.

The server fans were a little much, though, so I grabbed some 92mm Noctuas to replace them.

(photo of the testbench build)


Two 8C/16T Ivy Bridge Xeon E5-2650 v2s with 256GB of ECC RAM. This would have been quite the workstation back when it was new :p

All the PCIe lanes definitely help with all of my testbench/backup/imaging work though. And the fact that it has ECC makes me feel better about using ZFS on it.

The question is what I do with the old Sandy Bridge-E X79 workstation board I was using in the testbench. It has been with me since 2011. I almost get a little misty-eyed at the thought of it no longer being in service somehow.

(photo of the X79 workstation board)


It was my main desktop from 2011 to 2019. I bought it when Bulldozer sucked at launch and used it until I upgraded to my Threadripper in 2019; then it went into the testbench, where it has been since.
 