Pictures Of Your Dually Rigs!

[Ion] said:
Thank you! :)

I have another system with four AMD Opteron 8350s (2GHz quads) that subjectively just feels much slower--according to well-threaded CPU benchmarks, it is a bit faster, but at probably twice the power consumption. I wish DBWillis had more of the DP Xeon combos available--definitely a great price, although finding a case will be a challenge.

I got lucky with mine. The place I used to work at decided to chuck three full servers; I parted out and sold one, and I have another in its full glory. If you find the case, get a hand truck: the thing weighs like 60 lbs.
 
Yeah, Ion, I have one more dual Xeon combo for sale, much faster and newer: X8DAi, 2x quad-core 2.4GHz Xeons, and 24GB RAM.
 
I wish I had 24GB of RAM; my server only has 8GB. I'm trying to get to 16 so that I can run some more VMs.
 
My work in progress.

kUWJfbZl.jpg


Dual Xeon E5-2650 (8 cores, 2.0GHz, 2.8GHz turbo) ES CPUs
48GB DDR3 ECC RAM
ASUS Z9PE-D8 WS motherboard
Antec 1000W modular PSU
Radeon 7970
Azza Genesis 9000W case

Waiting for (in the mail):
500GB WD VelociRaptor 10K RPM HDD
256GB Samsung 840 Pro SSD
Second Radeon 7970 for CrossFire
 
May I ask what you do with such a rig? The dual server CPUs would suggest that it's a workstation, but the 7970 is a straight-up gaming card.

Just a friendly inquiry. :)
 
It's a workstation for photogrammetry (which uses GPU compute for some tasks and a tonne of RAM), CAD work, and other 3D artwork and modelling, but who's to say I won't do a bit of gaming :)

I'm almost thinking I may need a third 7970 when I Eyefinity 3x old-school 30-inch Apple Cinema Displays @ 2560x1600. Not sure if the two will be able to keep up... we'll see though.
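
Rough pixel math for that setup (just a back-of-the-envelope sketch, not a benchmark):

```python
# Rough pixel-count math for the triple 30" Eyefinity setup above.
single = 2560 * 1600       # one 30" Apple Cinema Display
eyefinity = 3 * single     # three of them side by side
full_hd = 1920 * 1080      # common 1080p panel as a baseline

print(f"{eyefinity:,} pixels total")                 # 12,288,000
print(f"{eyefinity / full_hd:.1f}x a 1080p screen")  # ~5.9x
```

Nearly six 1080p screens' worth of pixels, so two cards keeping up is a real question.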
 
Posted this in the storage server sticky, but it also counts here.

Specs:

Lian-Li P80B
4x X-Case 5-in-3 HDD caddies
Tyan S7025
2x Xeon E5530 quad-cores (2.4GHz)
12GB RAM
Supermicro MV8
Modified so it's now near silent

120GB OCZ Solid 3 (Windows Server 2008 boot drive)
11x Samsung 2TB
2x Hitachi 2TB
1x Samsung 1TB
1x Hitachi 1TB

2x Seagate 3TB (backups, sitting on a shelf)

Each drive is currently configured as JBOD, which seemed easier at the time for managing backups of my media files (i.e. if I lose a tenth of my data, I can see what I've lost and replace it).
Tempted to move to RAID, but not sure which way to go. Any suggestions?

Currently it's in place as my file server/game server. I want to change to VMware but don't want to risk fudging up my current config!
Current capacity: 34TB

2013-03-04%2015.54.47.jpg
 
That's a lot of storage! :)

I have always been a fan of RAID-5. If you were to put your 11x Samsung 2TB drives into RAID-5, you'd have about 18.5TB of usable storage and a fault tolerance of one failed drive.
 
I don't mind losing a drive or two. I'm just worried about what happens to a RAID array if, for instance, the power cord gets detached (my GF lives with me, so the possibility of an accidental power disconnection is a reality).
I want to move over to VMware machines to practise with VMs. Also, the board only has built-in RAID and the card I use is not hardware RAID; it worries me!
 
I've been using one Promise 8-port and three Areca 8-port SATA RAID cards (each in a RAID-6 configuration) for quite a few years; they work great. A bit expensive but worth it; I've never lost data when a drive or two failed. UPSes help with those occasional power problems.
 
Tempted to move to RAID, but not sure which way to go. Any suggestions?

There are several ways to go here:

If you are looking for performance, go with RAID-10 wherever possible. You'll take a 50% hit in storage capacity, but what remains will be the fastest-performing RAID configuration.

If you are looking for resiliency against data loss, go with RAID-6 (if your cards support it), RAID-5 otherwise. With a RAID-6 configuration you can lose two drives out of an array and still keep on plugging. This is significant because some controllers can take several days to rebuild a volume after a disk swap, leaving you exposed to total data loss if/when a second drive goes.

In order of performance from highest to lowest: RAID-10, RAID-5, RAID-6

In order of data resiliency: RAID-6, RAID-5, RAID-10

If you have a higher-end RAID card, you can do hybrid RAID levels like RAID-50 or RAID-60 (basically striping RAID-5 or RAID-6 volumes), giving you a blend of performance and resiliency.
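
To put rough numbers on the capacity trade-offs, here's a quick sketch using the 11x 2TB drives above. These are raw capacities; formatted space comes out a bit lower, which is where figures like the ~18.5TB RAID-5 estimate come from:

```python
# Quick usable-capacity math for the RAID levels discussed above,
# using the 11x 2TB Samsungs as the example array.

def usable_tb(drives: int, size_tb: float, level: str) -> float:
    """Approximate raw usable capacity for common RAID levels."""
    if level == "RAID-10":
        return (drives // 2) * size_tb   # mirrored pairs: ~50% of raw space
    if level == "RAID-5":
        return (drives - 1) * size_tb    # one drive's worth of parity
    if level == "RAID-6":
        return (drives - 2) * size_tb    # two drives' worth of parity
    raise ValueError(f"unknown level: {level}")

for level in ("RAID-10", "RAID-5", "RAID-6"):
    print(f"{level}: {usable_tb(11, 2.0, level):.0f}TB usable")
# RAID-10: 10TB, RAID-5: 20TB, RAID-6: 18TB (raw, before formatting)
```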

...but this is the dually-rigs thread not the storage thread so I'll stop here. :D
 
TeH PuRe SeXX :cool:

Love that dual E5-2650 setup. Probably cost more than my first car...
 
My HP xw8600
RTJxHhp.jpg


Specs:
- 2x Xeon E5420 (2.5GHz Harpertown)
- 4x 1GB FB-DDR2
- 74GB WD Raptor
- HP 800W 80 Plus PSU
- HP D5400 motherboard
- DVD-RW
- GeForce 8500 GT

I'm not excessively fond of the big string of cables at the bottom, but they are stiff and there isn't really anything to do with them (also, I can't imagine that they impede airflow). HP decided that it would be OK to have a pair of 80W CPUs plus a GPU in a case with no intake fan and a 1000RPM exhaust; I feel otherwise, so I put a Corsair fan in the front (cable ties!).
I'm contemplating de-riveting the HDD cage and then putting in a second 120mm intake--I think that delivering cool air directly to the CPUs might help a bit (load temps are mid-to-upper 60s C at standard room temperature).
 
My little build:
Supermicro X7 board
2x Xeon E5410 (BSEL mod later)
16GB RAM
GTX 470
Custom water-cooling loop and customized CM Gladiator case
Linux Mint x64 (Win 7 later)
128GB SSD
DVD-RW
(PERC 5/i waiting, and 750GB Seagates on order)
8610672803_ee6955a9d5_b.jpg
 
How are you going to cable the PERC 5/i to SATA disks? I thought they only worked with the weird Dell cable that connected the card to the backplane in a Dell server. Is there a 4-disk or 8-disk fan-out cable available or something?
 
Updated pic. I sent the ASUS 7970 back because of artifacts. Got my SSD and VelociRaptor installed now, along with 2x MSI 7970s (the rebranded Lightning ones; they actually put black tape on the PCB where it says "Lightning" to cover it up).

DIQSLisl.jpg
 
2x Xeon E5410 (BSEL mod later)

Sweet build.

Regarding the part I quoted: you can use SetFSB to OC on X7 boards. When I get home I can let you know exactly which PLL to use. I have the low-voltage E5410s on an X7DWN board and can get to 3.0GHz with no voltage increase.
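
For reference, the clock math behind the BSEL/SetFSB mods is just base (FSB) clock times multiplier. A quick sketch: the E5410's 7x multiplier and 333MHz stock base clock are published specs; the 429MHz figure is just my guess at what lands on 3.0GHz:

```python
# FSB-overclocking arithmetic for the E5410 mods mentioned above.
# Core clock = base (FSB) clock x multiplier; the E5410 runs a 7x
# multiplier on a 333MHz base clock (1333MT/s quad-pumped bus).

MULTIPLIER = 7

def core_ghz(fsb_mhz: int) -> float:
    return fsb_mhz * MULTIPLIER / 1000

print(core_ghz(333))  # stock:    ~2.33GHz
print(core_ghz(400))  # BSEL mod: 2.80GHz (bus strapped to 1600MT/s)
print(core_ghz(429))  # SetFSB:   ~3.00GHz, like the 3.0GHz mentioned above
```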
 
New pair of dualies :D
2x C6100 XS23-TY3 nodes
2x L5520 and 48GB DDR3 ECC REG each

2.jpg
 
Well, that's the end of that storage server! I got 8x 4GB 1333 sticks from work, installed them, and on powering it back up something went bang! It makes no real sense: they were the same brand of sticks (Hynix), just faster, and I thought that would only be better for it.

Now when power is attached to the PSU (tried another PSU, same outcome), the fans connected to the mobo spin up, though not at full whack.

I get no POST beeps at all, and no monitor output either.

So far I have tried booting without any RAM, changing the PSU, and discharging the board. That's it, as this only happened last night and I was in no mood to strip it down just to find out it's well and truly dead! I'm going to look at it tonight. Does anyone have any ideas where to go next, or am I dead in the water till I get a new board?

I spoke to a retailer who deals with Supermicro boards, and he reckons that since the RAM was HP-branded (still Hynix, just with an HP serial on it) it's the likely culprit; he's seen such sticks blow motherboards before. I've never heard of this?

If someone can figure out what this could be and it gets the thing working again, I would happily donate to a beer fund for you!
 
@Kerbysj

Try turning it on without the PSU connected. That way, if it doesn't work, you know it's either the motherboard or the PSU.

If that doesn't work, you could try willing it to life. Just think big happy thoughts. Positive energy!

On the other hand, best of luck to you. I've had a computer do that to me once. A very expensive once.
 
I think you mean try it with a different PSU.
 
Tried that :(

I ended up migrating to a MicroServer till I decide on hardware. Not sure whether to go for an i7-3770, an AMD FX-8350, or go big with a multi-CPU Opteron system... I know, stupid place to ask!!
 
Never stupid to ask! I'm actually thinking about selling my huge Supermicro box and getting something smaller; it's a little too much for what I currently need. Post up whatever solution you decide on!
 
I would like to see what kind of configurations you guys came up with, especially a multi-Opteron setup. I want to build something like that too, either 2x or 4x Opterons, but I don't know where to start: what kind of motherboard to get, where to buy the CPUs, whether to look for a used mobo+CPU combo from another user, etc.
 
Check out the Distributed Computing thread. There are all kinds of higher-end builds going on there with 2P and 4P rigs, and guys often post rigs for sale as they upgrade.
 
9593641444_3c209e42a6_c.jpg


Dual Xeon 5050s with a ghetto 120mm CM fan zip-tied on for cooling
Supermicro X7DA8
Supermicro case (the black one that is a tower but can be rack-mounted)
8GB Hyundai RAM
MSI 8800GT OC
250GB WD
800W PSU
 
Dual E5-2680s, 8x 8GB DDR3-1600 registered
P222 RAID card running 4x 3TB Seagate 7200.14 drives
Boot off an onboard Samsung 830 128GB; old data on 2x 1.5TB WD Blacks, mirrored
7970 GFX powering the ZR30w and the peripheral LP2065s
Xonar DGX powering the pair of Polk Monitor 30 Series IIs

IMG_20130901_014553.jpg
 
My new ESX whitebox build. I had been planning this for a while but did not want to order until the new Ivy Bridge Xeons were out. I've found myself digging into Microsoft products like Exchange, SharePoint, and Hyper-V more and more, and I was spinning up lots of VMs in my home lab. The previous server I had just wasn't cutting the mustard anymore. None of the single-proc boards supported enough RAM for running lots of VMs, so I went the MP route. From what I saw, Supermicro was pretty much the only option. I finally settled on this one:
http://www.supermicro.com/products/motherboard/Xeon/C600/X9DR3-LN4F_.cfm
srv_mboard1.jpg


I had never even heard of the EE-ATX form factor before, so I embarked upon a search for a case that would accommodate it. I don't have a server rack or anything like that at home, so I was looking for a pedestal case. I really liked the Corsair Carbide 540 case, and it looked pretty huge. On top of that, I found all of the server pedestal cases to be either really lackluster, grossly expensive, or both. I ended up calling Corsair support, and a tech confirmed that the motherboard should fit but that some of the cable management would be blocked.

When everything arrived I kinda crapped myself a bit, because at first it looked like the motherboard might not fit :( I ended up having to remove the two front case fans and mount them on top, which allowed everything to fit. The board did indeed block pretty much all of the cable management except near the bottom, which made running the motherboard power a little sloppier than I would have liked. There was also an issue with not all of the motherboard mounting holes lining up, but enough did to ensure the board was securely installed.

The final pitfall came after getting everything together: I went to power on and the server wouldn't POST :( After checking everything over, I called Supermicro support and discovered that this board had shipped with an old BIOS. I had to get my hands on a Sandy Bridge Xeon just long enough to update the BIOS, but now everything is up and running. With 192GB of RAM, I have lots of headroom for running whatever I need.

Pics:
whitebox1.jpg


whitebox2.jpg


Specs:

Motherboard: Supermicro X9DR3-LN4F+
CPUs: 2x Intel Xeon E5-2630 v2 (Ivy Bridge-EP, 2.6GHz)
Power Supply: SeaSonic SS-750KM3 750W
RAM: 24x Samsung 8GB ECC 1066 modules
Case: Corsair Carbide Air 540
Storage: NFS mount on a Synology DS1511+, plus some local SSD
 
That case is really mean lookin', Kohl, I like it. 192GB of RAM? Dude, hahaha! The servers where I work don't have that much.
 
Thanks man! Awesome-looking case you have there. The cable management looks flawless. I am jealous!

The Corsair 540 is a great case; tons of space if you don't mind the cube shape.
 
I have an oldie but goodie.
Mm-mm-mmm! That bona fide 2006 juicy goodness!

uugj.png




I always get people asking me what I do with such a computer. I tell them I use it like anyone would use a normal 4-core computer; it just draws a lot more power... the system pulls 300W with both CPUs pinned at 100% on all cores.

The 200GB WD is the boot drive, and it does just fine for a machine of this caliber in terms of speed. It reached the ripe old age of 8 years today, going by the manufacturing date on the drive.

The wireless card pictured has since been replaced with a USB 2.0 one. I have an internal webcam with built-in LEDs pointing at the front heatsink, so I can just peer in through the Skype webcam options and see if I need to shut down and clean it:

55qn.png


and of course, the mess behind the scenes! :D


This is all stuff I either bought on the cheap or had gotten for free by the time I built this back in September of 2012.
At that point I didn't have a good camera to take pictures with--now I do! These were all taken back in February, aside from the screenshots.
 
I guess it counts as dual-socket, but that machine is about 3 times slower than my laptop. Literally.
 
It's 3 times slower than everyone's laptop!
But it gets the job done, and that's what's important at the end of the day.

The next build is going to be a dual-socket G34 with a couple of 16-core Opteron 6300s in it, when money allows. I'm looking at that to last me somewhere in the ballpark of 10 years once it's all finally built, but prices have to come down some before I can even think about it more.
 