
Need advice on an external SAS array/case

EvanWasHere
Weaksauce | Joined: Apr 24, 2008 | Messages: 68
So, in my current home server, I have twelve 1TB drives on two 3ware SATA RAID cards. This is in a full tower case. The drives are pretty stacked up, and there is no room for any upgrade. I have filled almost all of the 9TB available, so I am looking to purchase something by the end of this week.

My new server will be HTPC bound, running 3 SSD drives RAIDed, so boot and load times will be practically nil. This will be in a small case I put in my AV center. Inside, I want to put a SAS RAID card, and then run a SAS cable from the media area, about 10 feet into my kitchen (which is open to the side of my living room). This SAS cable will then connect to a case holding 24 drives.

I plan to take the twelve 1TB drives I currently have, plus purchase another eight 2TB drives, creating two RAID 6 arrays.
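As a sanity check on capacity, RAID 6 sacrifices two drives' worth of space per array to parity. A quick sketch of the raw usable space this plan yields (vendor terabytes, before filesystem overhead):

```python
def raid6_usable_tb(drive_count, drive_tb):
    # RAID 6 stores two drives' worth of parity, so usable space is (n - 2) drives
    return (drive_count - 2) * drive_tb

array_a = raid6_usable_tb(12, 1)  # twelve 1TB drives -> 10 TB usable
array_b = raid6_usable_tb(8, 2)   # eight 2TB drives  -> 12 TB usable
print(array_a, array_b, array_a + array_b)
```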


So, what I need is a case that will let me hold twenty-four SATA drives in total (since I plan to add more later down the line). I need screwless access to the hard drives for when they fail and need replacement. I also need this case to be quiet, since a case with loud fans would be audible from the kitchen in the living room. I would also like the SAS connection to be dual linked for increased throughput.

Does anyone know of a case like this? I looked at the Norco line, but it has room for motherboards and other parts I do not need. Those cases would just take up too much space and be way too loud. I just need something to hold and power all the SATA drives with a SAS backplane.

Lastly, which SAS card would you recommend for dual linking to this array?
 
Just build another file server instead of dumping it all in the HTPC. All 24 bay cases use screws for drive trays as well and unless I'm mistaken, a screwless version does not exist. You're just not going to find a product like you described and the proposed implementation isn't really the best. You're also worried about noise...just dump another server in a remote closet and run an ethernet cable.
 
I want all the hard drives on the same machine.. I will be doing a lot of file processing, such as having a tuner card with 4 signals recording, converting video formats, uncompressing files, etc.. Doing that over a gigabit network would still be a tenth of the speed of doing it on the same SAS-connected machine.
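To put rough numbers on that "a tenth of the speed" claim (ballpark assumptions, not measurements): practical gigabit Ethernet tops out around 110 MB/s, while one 4-lane 3Gb/s SAS wide port is good for roughly 1,200 MB/s:

```python
GIGABIT_MBS = 110   # practical gigabit Ethernet throughput in MB/s (assumption)
SAS_LANES = 4       # lanes in one SFF-8088 wide port
LANE_MBS = 300      # usable MB/s per 3Gb/s SAS lane after 8b/10b encoding

sas_mbs = SAS_LANES * LANE_MBS
print(sas_mbs)                  # 1200
print(sas_mbs // GIGABIT_MBS)   # gigabit is roughly a tenth of the SAS link
```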

I think I was misunderstood when I said screwless.. I don't mind if the trays have screws for the HD to be put into.. I just want to be able to remove the trays easily from the bays if a drive dies..

I was thinking of a solution like this maybe:
3 of these: http://www.sansdigital.com/towerraid/tr8xb.html
These are external enclosures.. 8 drives each.. so 3 of them would give me 24 drives.. and I would have hot swap.. They each require 2 mini SAS SFF-8088 connections.. so 6 connections needed total.
Purchase here: http://www.mwave.com/mwave/SKUSearch_v3.asp?px=FO&scriteria=AA75145
So $350 x 3 = $1,050

Then to run them, I would need a SAS RAID card with a total of 6 ports available.
Most cards either only have 2 or 5 ports available.
I found 3 with more ports.
4 external/2 internal: http://www.newegg.com/Product/Product.aspx?Item=N82E16816401189
1 external/6 internal: http://www.newegg.com/Product/Product.aspx?Item=N82E16816118137
1 external/6 internal: http://www.newegg.com/Product/Product.aspx?Item=N82E16816118141

So roughly $1,300 for the card.

Plus $200 for a battery backup: http://www.newegg.com/Product/Product.aspx?Item=N82E16816118118

And to convert the internal 8087 ports to 8088 ports, I would get these $50 converters: http://www.pc-pitstop.com/sas_cables_adapters/AD8788-2.asp
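Tallying the big line items above (converter brackets left out, since how many are needed depends on which card is chosen):

```python
enclosures = 3 * 350   # three SansDigital 8-bay towers
raid_card = 1300       # rough price of the 6-port SAS RAID card
battery = 200          # battery backup unit

total = enclosures + raid_card + battery
print(total)   # 2550, before the 8087-to-8088 converters
```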

Thoughts?
 
I still think a separate server is a better idea. If you are worried about speed, you could get a 10gbit ethernet card. They're only ~$200 or so on eBay and it would be much cheaper to buy a Norco 4224 instead of the rather complex setup you proposed. I would also look into buying an 8 port SAS card and then using an expander (there is a huge thread on them) as you'll save even more money over the 24 port card (with no real disadvantages).
 
If I got a 10GB card, I'd still need to get a 10GB switch..

That would be an extra (minimum) $2,500.. plus 2 cards at $200 each, would be $3,000 total.

And that doesn't include the extra cost for CPU, motherboard, memory, etc. for this server, plus the monthly power cost for an additional PC running in my apartment.

If I kept it all in the same machine, I would get throughput of hundreds of MB/s... If I did it via network, that number would drop by a huge percentage...

The cost for my version is about $2,500.

If I'm wrong, please inform me?
 
I have 2 10GbE cards, get 400-600MiB/s, connect two servers to each other.
Nice to xfer 5-10TB quickly.
 
And then gigabit for the regular network of course; the 10GbE switches are $12k-$25k
 
$300 - HP SAS Expander - PM Synergy Dustin for exact Price
$110 - Antec Truepower New TP-750 750W PSU
$90 - 6x NORCO C-SFF8087-D SFF-8087 to SFF-8087 Internal Multilane SAS Cable - OEM
$400 - NORCO RPC-4224 4U Rackmount Server Case
---
$900 plus tax and shipping

To actually power the HP SAS Expander card, you can do one of these two methods:
http://www.servethehome.com/sas-expanders-diy-cheap-low-cost-jbod-enclosures-raid/
http://www.servethehome.com/sas-expanders-build-jbod-das-enclosure-save-iteration-2/

That's an additional $35 to $60 right there.

Then all you need is this card as well as any SFF-8088 to SFF-8088 cable (another $100):
$660 - Areca ARC-1680X PCI-E x8 SAS PCI-E to SAS RAID Adapter

That's a total of $1720 give or take. That's a full $780 less than your planned setup but far less complicated and uses only one external SFF-8088 cable rather than two. Not to mention it'll be more power efficient since the included PSUs in those RAID enclosures have pretty crappy efficiency.
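Itemizing that total as a check (the expander power mod taken at the high end of the quoted $35-$60 range):

```python
norco_parts = {
    "HP SAS expander": 300,
    "Antec TP-750 PSU": 110,
    "6x SFF-8087 cables": 90,
    "Norco RPC-4224 case": 400,
}
base = sum(norco_parts.values())   # 900
extras = 60 + 100 + 660            # expander power mod + SFF-8088 cable + Areca ARC-1680X
print(base + extras)               # 1720
```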
 
If I got a 10GB card, I'd still need to get a 10GB switch.. That would be an extra (minimum) $2,500.. plus 2 cards at $200 each, would be $3,000 total. ... The cost for my version is about $2,500.

You could always look at something like a Dell PowerConnect 6224 switch. I believe it's about $1,200 and has 24x GbE ports, with 4x 10GbE ports which can be used for stacking, or just as straight 10GbE ports. If you can get two of your ~$200 cards, that's a $1,600 solution, which beats your $2,500.

For a small home implementation like this, you could connect your primary server(s) and your storage solution to the 10GbE ports, and still have gigabit connectivity for everything else in your home. It's a nice middle ground without having to invest $2,500 - $15,000 in 10Gb switches, and it has the added benefit of allowing multiple servers (if you add more later) to have 10GbE access to the storage as well. Up to 3 @ 10GbE, and the rest at 1GbE. IMO, not a bad setup.
 
Not going to list a configuration as one was just mentioned, but you don't have to buy a 10gbit switch. Just run a cable between the two NICs.
 
NORCO RPC-4224 4U Rackmount Server Case

The standard fans are quite loud, and the OP said noise was a concern. But he could get the 120mm fan bracket adapter and put in 3 quiet fans. Also, the rear two 80mm fans would need to be replaced with something quieter.

All of that would lower the airflow, so it would be a good idea to get the lowest-heat HDDs possible. But neither the Samsung F4EG drives nor the WD Green drives are on Areca's hardware compatibility list. It is hard to find green drives on any hardware RAID compatibility list.

I wonder if the OP would be able to use a non-hardware RAID solution. Maybe Windows software RAID / dynamic disks, or maybe FlexRAID.
 
The standard fans are quite loud, and the OP said noise was a concern. But he could get the 120mm fan bracket adapter and put in 3 quiet fans. Also, the rear two 80mm fans would need to be replaced with something quieter.

True. Forgot about that. Still, even with the 120mm fan bracket adapter, it would be cheaper and simpler than the OP's planned setup.
 
Similar to what danny said.
What you are looking for is a JBOD chassis with 16 drives and a SAS expander. You can check out www.cineraid.com, as they manufacture JBOD chassis specifically for Areca controllers, targeted toward video editing.
 
10GBASE-T runs great with just a cat-6 cable between two machines. I picked up a pair of Dell XR997's (Intel 10G AT Adapters) for $250 off ebay, put one in my server, one in my main workstation, and plugged a cat-6 cable between them. They use their Gig-e to talk to the rest of the network, and the 10G cards when they talk to each other. Only problem is that noisy little fan on the 10G NICs. Doesn't bother me in my office, but it would in my HTPC. Newer cards are fanless, but will also cost a lot more.
 
You could always look at something like a Dell PowerConnect 6224 switch. I believe it's about $1200 and has 24x GbE ports, with 4x10GbE ports which can be used for stacking, or just as straight 10GbE ports.

The Dell 6224 doesn't actually come with the 10G ports; it has a couple of expansion bays that you can buy modules for, only one of which can be the 10GBASE-T module. Haven't seen any 10GBASE-T ones on eBay yet, but they are $1,350 from Dell. Doesn't make much sense if you can only have 2 ports; might as well direct connect. You'd have to go to something like the Dell 8024, which is $7,500+, to get more than 2 10GBASE-T ports.
 
The Dell 6224 doesn't actually come with the 10G ports... Doesn't make much sense if you can only have 2 ports, might as well direct connect.

I think you are right; looking back, it was only 2 ports in the module. Of course, you could go to a cheaper vendor switch like a Netgear, maybe something like the NETGEAR GSM7328S-200NAS that comes with 4x 10GbE ports. Of course, now you're near $1,900 and it may be getting too pricey to bother.
 
Ok Ok.. I see the light and your points.. The ServeTheHome.com website explains a LOT!

The network option is not the way I want to go..

But the Norco option isn't bad.. I like the price saving, but it seems like such a huge waste of space. The Norco will be 2/3 empty. I have a Manhattan apartment, and will have to remove some shelving to fit this..

I would definitely use the fan bracket adapter and replace all the fans. I will also want to dial down the speed of the fans to lessen the noise.

This system has to be compatible with the (12) Seagate 7200.11 1TB Barracuda SATA drives I currently have and the (8) Hitachi Deskstar 7K2000 2TB SATA drives I am buying.

So my questions for this Norco setup:

1.) Why the HP SAS expander? Why not the Chenbro, which has a molex connector and doesn't need a motherboard or other kind of PCI-E power?

2.) Can I go with a fanless PSU for even more silence? Like the Seasonic 460W? I currently have one powering the 12 drives AND an i7-based CPU for the last year with no issue. http://www.newegg.com/Product/Product.aspx?Item=N82E16817151099

3.) Why the Areca 1680? Why not the Areca ARC-1880 so I can have dual linking? And would I see a noticeable difference in speed using that?

4.) Speaking of speed, if I don't do the dual link, will I notice any slowdown using the single port? Or will I not even max out the 1.2GB/s that the port is capable of?

Thanks for everyone's help! I want to buy all the parts by the end of this week, so I can start building before the New Year!
 
1) The HP SAS Expander is compatible with more cards out there than the Chenbro while slightly cheaper IIRC.
2) I would recommend no for 24 hard drives.
3) I probably should have made this clearer: You put the Areca card in your PC. You're connecting the Areca 1680 card in your PC to the HP SAS Expander in the Norco case via External SFF-8088 ports using a single SFF-8088 to SFF-8088 cable. Thus dual linking will not work out in this configuration.
4) Not really.
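Rough numbers behind that answer (the per-drive sequential figure is a generous assumption for 7200rpm disks):

```python
LANE_MBS = 300               # usable MB/s per 3Gb/s SAS lane after 8b/10b
single_link = 4 * LANE_MBS   # one SFF-8088 wide port: 1200 MB/s
dual_link = 8 * LANE_MBS     # dual-linked: 2400 MB/s

array_peak = 20 * 100        # 20 spinning disks at ~100 MB/s sequential each
print(single_link, dual_link, array_peak)
# Only a full-array sequential workload would exceed the single link,
# and typical workloads rarely sustain that, hence "not really".
```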
 
If you are going to be putting in a lot of 7200rpm Hitachi HDDs, and replacing the fans and dialing them way down, you will probably want to keep a close eye on the temperatures of the Hitachi HDDs. They could get quite hot. The Norco 4224 packs them in there quite close together.
 
Ok..u guys are gonna HATE me.. LOL..

I took all the advice from you guys, and tweaked it a bit with a friend to where I was happier with the results..

I am basically going to have my boy with a CNC machine create a custom case for me that will be half the depth, as I really do not need a motherboard tray or other space in there. The metal and metalwork will cost me a total of $500.

Inside I will purchase:

$315 Chenbro - CK13601 with UEK (Universal Expander Kit)
http://www.compubees.com/chenbro-uek-13601-universal-expander-kit-w-ck13601-sas_288_12104.html

$420 - (6) x $70 Norco SS-400
http://www.newegg.com/Product/Product.aspx?Item=N82E16816133035&Tpk=ss-400

$160 Seasonic Fanless SS-460FL
http://www.newegg.com/Product/Product.aspx?Item=N82E16817151099&Tpk=seasonic fanless

$126 - (6) x $21 Silverstone AP121
http://www.newegg.com/Product/Product.aspx?Item=N82E16835220038&Tpk=ap121

$108 - (6) x $18 C-SFF8087-4S
http://www.excaliberpc.com/593826/norco-c-sff8087-4s-discrete-to-sff-8087.html

Total $1,129 + $500 for the case = $1,629
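Tallying those line items as a check:

```python
parts = {
    "Chenbro CK13601 + UEK": 315,
    "6x Norco SS-400": 6 * 70,
    "Seasonic SS-460FL": 160,
    "6x Silverstone AP121": 6 * 21,
    "6x Norco C-SFF8087-4S": 6 * 18,
}
subtotal = sum(parts.values())
print(subtotal, subtotal + 500)   # 1129 in parts, 1629 with the custom case
```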

I will also need a 10-15 foot SFF-8088 to SFF-8088 cable to go from the
expander to the raid controller. Need to find one.. hopefully in the $100 range.

Plus a $600 RAID card with battery. I won't need a 6Gb/s card, as the expander won't support that with the SATA drives.

Compatible raid card list is here:
http://usa.chenbro.com/corporatesite/service_download_result.php?type=pro&mk=123&sk=

So, using this method, the case will be half the depth.. and will be 30dBA with all the fans at full power.. of course we plan to put a few resistors in there to bring the fans down to 22dBA or so. Each Norco SS-400 has a fan directly on the HDs, so cooling would be more optimized.

So for a smaller, customized, slick looking, ultra quiet case... it will cost me about $2,400 for the total setup... versus $1,900 (the Norco method plus fan upgrades)..
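Checking that comparison with the thread's own figures (the ~$100 cable and $600 card apply to the custom build; the fan-upgrade cost for the Norco route is an estimate):

```python
custom = 1129 + 500 + 100 + 600   # parts + CNC case + SFF-8088 cable + RAID card
norco = 1720 + 180                # earlier Norco total plus quieter-fan upgrades (estimate)
print(custom, norco, custom - norco)
```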

Do u guys think it's worth the $500 extra? Any comments on the parts we selected?

Oh.. and Merry Xmas!
 
Biggest issue that I have with that setup is the PSU: For $25 less, you can get this PSU:
$134 - Seasonic X650 Gold 650W Modular PSU

I'm more than willing to bet that once that system is up and running (if it even can with that 460W PSU), you won't notice an iota of noise level difference between the Seasonic 460W and the Seasonic 650W. At least the Seasonic 650W PSU will be more than capable of handling all 24 drives loading at the same time, as well as the hot-swap bays themselves.
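The sizing concern can be roughed out; the per-drive figures below are ballpark assumptions for 7200rpm 3.5" drives, not measured values:

```python
DRIVES = 24
SPINUP_W = 30   # rough peak draw per drive during spin-up (mostly the 12V surge)
IDLE_W = 8      # rough steady-state draw per spinning drive

print(DRIVES * SPINUP_W)   # 720 W if all 24 drives spin up at once
print(DRIVES * IDLE_W)     # ~192 W once everything is spinning
```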

Second issue is the fans themselves: You can get slightly quieter fans for $5 cheaper per fan:
$15 - Scythe S-Flex SFF21E 120mm Case Fan

Not only are the Scythe fans slightly quieter, they actually push more air than the Silverstone fans.

Third issue is that you haven't stated exactly how you're going to be powering that Chenbro expander.
 
Awesome.. I will check out the PSU and fans in a few min.

The Chenbro has a 24-pin connector on the board for power from the PSU, so I can plug directly into the PSU.

Any other input?
 
LOVE the fans.. LOVE the PSU.. Changed to those..

Any other advice? Buying EVERYTHING on Sunday.

This should be 26" wide.. and only about 12" deep!!! Compared to the 25" depth of the Norco!!!
 
I also think separate machines are the way to go. Why would you want 24 drives plus all the rest of the hardware running in your entertainment center?

You can have all the processing done on the remote machine and a low-power front end for playback. I haven't messed with tuner cards in years and I don't know what format you're using, but decent gigabit equipment should give you 80MB/s easily; surely recording a single stream can't use 20MB/s.
You could even trunk two gigabit ports together if you're that worried about it.
 
Seems like a waste of money. You can buy a 45 bay Supermicro SAS expander chassis for essentially the same price.
 
Correct me if I'm wrong, but reverse breakout is for SATA Host connections to SFF-8087 Target, and regular/forward breakout is for SFF-8087 Host to SATA Target connections.
So if you were intending on powering those SS-400s, you would need regular breakout cables, not reverse breakout.
 