Data Storage System Build Advice Needed

pirivan

Limp Gawd
Joined
Feb 22, 2009
Messages
346
Hi All,

I've looked through many of the incredible systems here and have been contemplating my build, but I could use some feedback on my components (the controller card choice and motherboard are giving me the most trouble). Like many of the system builders I see here, my storage system has been a bit evolutionary: first I added just an external USB drive, then more drives internally, then a 5-disk eSATA enclosure onto my HTPC, then another 5-disk eSATA enclosure, and now here I am considering a rack-mountable "NAS/storage server". First let me lay out some of my priorities/goals for the build, along with some explanations:

  • Large drive capacity, in terms of how many drives I can stuff into the enclosure (I am tired of buying new 5-bay eSATA enclosures and hooking them onto my HTPC)
  • Around $1000 total for the system was my original goal (I don't believe I will hit this based on what I have chosen so far, probably more like $1500 or $1600, especially since I will need a small rack)
  • Fast enough to stream 2 1080p movies (max; currently I only ever do 1 to my HTPC, which runs MediaPortal). Performance is not a huge concern. I get 22MB/s to the eSATA drives connected to my HTPC and that is generally fast enough for me for file copies.
  • Fairly quiet. If I can avoid it I would rather not have this sound like a small tornado, I don't mind paying a bit extra for quieter components.
  • Implement RAID5. I currently have one drive of equal size to back up each of my storage drives, which I sync using SyncBack. Utilizing RAID5 would give me a ton more usable space, since what I am doing now is essentially mirroring. I am aware that RAID5 probably isn't as good a 'backup' solution as the primitive daily SyncBack mirroring I am doing now, but I don't believe I can afford to fully back up a RAID5 array; RAID5 might be the best fault tolerance I can afford. I am also happy to use software RAID5 because A) it sounds like it will be fast enough for my purposes, B) hardware controller cards that support 20 drives are incredibly expensive, and C) if I am using multiple controller cards to get to 20 drives, as I understand it, I will have to use software RAID to build one RAID5 array from drives hooked onto 2-3 different controller cards.
  • Rack mountable. I considered filling my huge LIAN-Li case with hard drives (looks like I could get at least 20 in there) but that just feels like a "step". My guess is that in a few years I would tire of that and want to build a rack-mounted hot-swap drive bay storage server. I figure, why not just skip ahead to that now when it is so close to affordable?
  • Utilize FreeNAS or Server 2008 (I am leaning toward FreeNAS but worry it won't support the hardware I choose). I want Server 2008 for larger-than-2TB partition support and a new OS to play with; I am not interested in WHS "pooling", I want RAID5 (maybe RAID6?).
  • A motherboard with 2 PCI-E slots, 2 PCI-X slots, and as many onboard SATA ports as I can get. My preference is Intel but I am not wholly opposed to AMD. I also may not be married to the PCI-X slots; it just seems like they are still useful for controller cards, though I really am not sure. It would also be nice to have a PCI-E x1 slot, since that is what the controller card for my two eSATA 5-bay external enclosures uses; this is not totally necessary.
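To convince myself the parity idea actually works, I played with a toy Python sketch of XOR parity reconstructing a single failed drive (purely illustrative; no real controller or OS implements it this way):

```python
# Toy model of RAID5 parity: one stripe across N data drives plus
# one parity block that is the XOR of all the data blocks.
from functools import reduce

def parity(blocks):
    """XOR the data blocks together byte-by-byte to form the parity block."""
    return bytes(reduce(lambda a, b: a ^ b, chunk) for chunk in zip(*blocks))

def rebuild(surviving_blocks, parity_block):
    """Recover the one missing block by XORing the survivors with parity."""
    return parity(surviving_blocks + [parity_block])

data = [b"AAAA", b"BBBB", b"CCCC"]   # three data drives, one stripe each
p = parity(data)

# Simulate losing drive 1: its contents can be rebuilt from the rest.
lost = data[1]
recovered = rebuild([data[0], data[2]], p)
assert recovered == lost
```

Usable space is (N-1) drives' worth, which is exactly the savings over 1:1 mirroring that I am after.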

So given some of my goals, here is what I have chosen so far, with comments on each component about why I chose it or why I am having second thoughts and need advice. For the most part I am not married to ANY of this beyond the case, so feel free to suggest based on my goals:

  • Rack: Kendall Howard 12U Rack Without door $338
    I couldn't find many affordable racks in the 8-12U range besides this one. I had considered using a cheaper rack meant for audio equipment, but I couldn't find one that looked good and was affordable. I could just set the case on some kind of shelf, but that seems a bit crude and not nearly as much fun. Plus, what if I want another piece of rack-mounted equipment someday?
  • Case: NORCO RPC 4020
    20 drives, rack mountable, incredibly affordable, sounds like the fans are quiet (?). What else could I ask for? I am sure there is a good reason why a lot of people here are using it, it fits my plan perfectly as well.
  • Motherboard: SUPERMICRO MBD-X8SAX-O LGA 1366 Intel X58 ATX Server Motherboard
    I am having an incredibly tough time picking out a motherboard. This one in particular makes me very nervous given that it has some shaky reviews and shaky-sounding RAM support. I chose it purely because it has 2 PCI-E x16 slots and 2 PCI-X slots (I wish it had a PCI-E x1 slot as well). It seemed smart to have both PCI-E and PCI-X so that I could put some older, more affordable controller cards in the PCI-X slots and, if I could afford it later, have the flexibility to put a more expensive controller card (or a NIC) in a PCI-E x16 slot. I also liked that it was an Intel board with support for their latest CPUs. I am not thrilled about the cost, though, for a basic file-server motherboard.
  • Another Motherboard Option ASUS M2N32-WS Pro AM2 NVIDIA nForce 590 SLI ATX Server Motherboard
    This one looked like it might fit the bill as well; however, it is deactivated on Newegg, is AMD (which I am not TOO partial to at the moment) and is clearly a bit of an older board. I am sure I could find it from another retailer, but it makes me a bit nervous to start purchasing older components. I worry that vendor support might be dodgy; on the plus side, the older the board, the better the chances that FreeNAS supports it.
  • CPU: Intel Core i7 920 Nehalem 2.66GHz
    This would go with the first motherboard I linked. I am sure I would be happy performance-wise with a Core 2 Duo or Core 2 Quad (that is what I was looking for initially) but this fits the motherboard I found. I believe this is the most cost-effective Core i7. I wouldn't be TOO terribly opposed to going the AMD route as long as I could get one of their new, cheap Phenoms on a motherboard with the right slots/SATA ports. However, I had a lot of trouble finding a new AMD board with 2 PCI-E x16 slots, 2 PCI-X slots, etc. Again though, I may be able to be convinced that 2 PCI-X slots aren't worth having for controller cards.
  • RAM: Patriot 4GB (2 x 2GB) 240-Pin DDR3 SDRAM DDR3 1333
    I may as well have picked this out of a hat. It was DDR3 1333 and affordable; it didn't matter to me if it was 2GB or 4GB. Though it sounds like the chances are very slim that it would work with my motherboard. If I go with another motherboard I am fine with 2-4GB of just about ANY RAM that I know works with the board (DDR2 800, 1066, DDR3, I don't really care). I don't believe RAM will have a huge impact on performance for what I want the file server to do.
  • Power Supply: Antec TPQ-850 850W ATX12V / EPS12V
    I am partial to Seasonic due to their low noise and great efficiency, but I like Antec as well when I want to save some money. I assume this is enough to power up to 21 drives plus all my other components, but I am really not sure. From the sounds of it, it is a quiet PSU, which is quite important to me. Modular cabling is a must for me these days. Is there any reason this wouldn't fit in the case I picked? Do I need more cabling? Is this powerful enough?
  • Boot drive: Western Digital Caviar Black WD6401AALS 640GB
    I wasn't particular about the boot drive at all. I only chose it because A) people on this forum liked the performance and B) it had a 5-year warranty. These days I have learned that warranty and quality of support are very important to me. There is a reason that I will only ever buy an eVGA card when I buy NVIDIA: fantastic support.
  • Storage drives: Seagate Barracuda 7200.11 ST31500341AS 1.5TB
    I already own two of these and have had no issues whatsoever. I will check the firmware and upgrade accordingly if I order 2 more of these. I'd go with 2TB drives but that is just not cost-effective currently. So, for now, I am thinking four 1.5TB drives in one RAID5 array and then my other four 1TB WD 'green' drives in another RAID5 array.
  • Disc Drive: Sony Optiarc Black 8X DVD-ROM 24X CD-R 24X CD-RW 24X CD-ROM 2MB Cache SATA Slim Combo
    I need a slim drive for the case; I chose this basically at random. I am open to better/cheaper suggestions, of course, as with all the other components. I chose SATA because, frankly, I am sick of IDE cables, and if I have ports to spare I'd rather use SATA.
  • SATA Controller Card PCI-X Option: SUPERMICRO AOC-SAT2-MV8 64-bit PCI-X 133MHz
    Frankly, I like it because it is cheap. Two of these, combined with the onboard SATA ports on my motherboard, will allow for 22 drives plus the SATA DVD-ROM drive at an affordable price (I assume I can create a RAID5 array including drives from either card plus the motherboard). I would LOVE to buy a $1000 RAID controller card but I just can't justify doubling my system price for it. However, it may be that these cards would cripple RAID5 performance; I really have no idea. If I weren't looking at these cards I probably also wouldn't feel it was as necessary to have PCI-X on the motherboard, though it looks like some of the other cheap controller cards are PCI-X as well.
  • Alternative SATA Controller Card: Adaptec 21610SA
    I saw this suggested as a cheap alternative RAID card; it seemed like an affordable way to add more SATA ports. Does this card have some major pitfalls? Are there other cards with lots of ports like this I should look at?
  • More Expensive/Newer SATA Controller Card: HighPoint RocketRAID 2340 PCI-Express x8 SATA I & SATA II
    I don't think I can justify spending this much given that the costs of the other components seem to be growing but I thought I would throw it into the mix to see what people think.
  • Video Card : It barely matters. I will remote into the server if it is 2008 or use the web management utility for FreeNAS. I plan to choose a cheap PCI slot video card so that I don't waste a PCI-E x16 or x1 slot.
  • Fans: No idea what/how many extra I might need to purchase with the case. If I needed any, I would go for the quietest fans that still move a decent amount of air.

After this is all purchased, my idea is to put it all together, throw FreeNAS or Server 2008 onto the boot drive, create two big GPT software RAID5 volumes (one RAID5 set of the 1.5TB drives and one of the 1TB drives), share it all out to my HTPC and two other machines, and stream/copy away. I know that with only 4 drives to begin with I don't NEED space for 20, BUT I know I will expand. That is the point of this whole project: I want a platform that I can grow with for a while without having to start from scratch again.
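To sanity-check the space savings, here is the quick arithmetic (nominal drive sizes, ignoring formatting overhead) comparing my current mirroring scheme to the two planned RAID5 sets:

```python
def raid5_usable(n, size_tb):
    # RAID5 spends one drive's worth of capacity on parity.
    return (n - 1) * size_tb

def mirrored_usable(n, size_tb):
    # 1:1 mirroring: every data drive has a same-size backup twin.
    return (n // 2) * size_tb

# Two planned sets: four 1.5TB drives and four 1TB drives.
raid5_total  = raid5_usable(4, 1.5) + raid5_usable(4, 1.0)        # 4.5 + 3.0
mirror_total = mirrored_usable(4, 1.5) + mirrored_usable(4, 1.0)  # 3.0 + 2.0

print(raid5_total, mirror_total)  # 7.5 vs 5.0 TB usable from the same 8 drives
```

So the same 8 drives yield 2.5TB more usable space under RAID5 than under my current mirroring.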

I am a bit nervous about using RAID5 in this way as opposed to the primitive 'mirroring' I have been doing for "backup", but I like the space I would free up with RAID5, and it makes me feel slightly better that I would get some fault tolerance. I may purchase a "hot spare" for each RAID5 set, but I am not sure. I'd love to back this server up with another 4U unit just like it, but I just don't think that is affordable for me at the moment. If anyone has other cheap, easy ways to back up this much data, I am all ears. Technically I suppose I am willing to entertain the idea of continuing to do what I have been doing (mirroring all my storage drives with backup software) if people think that using RAID5 is really a bad idea. I just love the idea of losing only a little bit to parity :). Plus, having never really used RAID on any of my desktop systems, it sounds like an interesting project.

Anyhow, any feedback on the above thoughts/components would be HIGHLY appreciated. Please let me know if you need additional details or clarification on anything. I am having a lot of trouble selecting the proper/affordable motherboard and RAID controller card; it's a lot more difficult than I thought to figure out a way to get 22 SATA ports or so. I'm open to cheaper suggestions as well, if they will meet my needs; I am not locked into Intel or the Core i7 platform (it's just highly attractive). In the end, this is just a basic file-server. I need something that can perform that task reliably and fairly quickly, with a lot of room for expansion (like when I decide 2TB hard drives are affordable and want to make another RAID5 for those :)).
 
The Adaptec controller has had reports of rather lackluster performance, though you did say that performance wasn't a main priority. As for the Norco case, I believe the stock 80mm fans are server-grade, which means it's not the quietest, but the case can be quieted down by swapping in a Panaflo 80mm for each of the stock fans.
 

I agree with Syntax here. The stock Norco fans are pretty loud, but replace them with Panaflo L1A's or equivalent and it'll be almost as quiet as most desktops.

The rack you've got listed looks like it will work pretty well. I bought a used rack locally though and got a much better deal. I looked at audio racks as well since I have a lot of experience in professional sound systems. The problem with audio racks (or flight cases) is depth. The Norco is ~26in. long. Most audio cases only have a rackable depth in the neighborhood of 21in.

The mobo I'm running is the Asus P5BV/SAS. I specifically got it for the 8 SAS ports on the built-in LSI 1068E controller and the 2 PCI-X slots. The LSI 1068E will do RAID5 if you get a ZCR card, but then you lose one PCI-X slot to that. I'm running WHS, so RAID was not a concern to me; I wanted the additional onboard SATA ports to start with. This mobo plus 2 SAT2-MV8 cards gives me 28 SATA ports. I'm using the 4 off the ICH7R for the boot drive and DVD, 20 for the hotswap bays, and the last 4 will go to eSATA or external multilane for an offsite backup connection.

The Core i7 and X58 are way overkill for a file server unless you just want to go that route. There are a lot of other CPU/mobo combinations that I bet will give you the same real world performance at a much cheaper price.

I've got a Samsung SN-T083A slim SATA DVD drive in my server. I've never liked the slim tray drives so I searched long and hard to find this slot load drive. I bought mine at MWave, but they don't seem to carry it anymore.

I know you said you weren't interested in WHS, but if you're nervous about RAID5 or 6, then I think WHS is the best alternative.
 
Well

1) Stay away from the Adaptec 21610SA. It sucked for JBOD; I can only imagine what kind of nightmares you would have with RAID5.

2)How do you plan on doing software raid 5 with server 2008?
I thought it was removed in 2008.

3)X58?!?!?
Overkill
Unless this is going to be used for gaming and/or encoding and/or vms you don't need a quad core much less an i7 setup.
Get a low power dual core setup. It will be plenty fast and run cooler and draw less power.

4)I know you said you didn't want WHS but there is a reason that those of us with the largest storage systems use it.
 
Since you're not going WHS (I'm not a WHS guy either, at least for my primary storage servers), consider going cheap on the proc and RAM and spending more on the RAID card. For a long time I went with a cheap Fry's combo (AMD X2) and ran an Areca 1280 out of the PCI-E slot intended for a video card. Prices and tech upgrades for procs and mobos seem to happen a lot quicker than for RAID cards, so you can cheaply upgrade processing power as you need to and get the most bang for your buck. Not saying you should drop $1k on a RAID card, but something to think about.

Also, I'm in the geeky minority for cases, but I love my Hardigg rack/shipping case. You can find them used fairly cheap; the main cost is shipping. But check eBay or your local Craigslist. Here's a pic of mine. It's an 11U, but I recently got a couple of 14Us for cheap that I'm gonna move to :)

http://www.muppetlabs.com/~tom/hardigg_full.jpg
 
Your processor, RAM, and motherboard are overkill for your application.

You can dump the two controllers and get something like a Dell Perc 5/i controller off eBay for massive savings and real hardware RAID performance.
 
Thanks for all the replies! I really appreciate it, I'm ecstatic to get so much good feedback! So, there seem to be a couple of themes in the replies so let me try to explain myself a bit.

1. The processor choice. I am not in ANY way dead-set on a Core i7. Literally the only reason I went with it is that I wanted a motherboard with 2 PCI-E x16 slots and 2 PCI-X slots for maximum RAID card flexibility, and that X58 board was one of the few I found with those ports. I am very flexible on this. What motherboard/processor combination would people suggest? As long as the CPU and RAM are fast enough to support a (potentially) 20-disk software RAID5, I am happy. Realistically, it will probably be one RAID5 of 1TB hard drives, one RAID5 of 1.5TB hard drives, one RAID5 of 2TB hard drives, etc., as I buy different-size drives to fill it. There is a POTENTIAL that I would consider using the server for VMs someday, but only if I went with Server 2008. Either way, a lesser processor that draws less power and creates less heat is a good thing.

2. The RAM choice. Again, totally based on the motherboard; I just found the cheapest DDR3 RAM I could. I really don't care as long as it is fast enough for a file server running multiple software RAID5 arrays and streaming 1-2 1080p movies at a time. I'd be happy with DDR2 800 if that is what the motherboard supported.

3. The motherboard choice. The primary concern here was finding the proper slots (2 PCI-E x16, 2 PCI-X, PCI-E x1, etc.) for a flexible NAS. I would love to find a much cheaper, relatively recent motherboard with a good mix of slots, 6 or so SATA ports, and support for low-end Core 2 Duos or low-end Phenoms (assuming either of those processors is good enough for software RAID5).

4. RAID cards. I will stop considering the Adaptec 21610SA; I want something people consider stable and decent. Croakz, what you said made a lot of sense to me. If I can get a motherboard with the right ports, I would love to save $500 or so on the proc/RAM/motherboard and sink it into a better RAID controller card. What would you suggest? How many SATA ports does a Dell Perc 5/i support (that is a PCI-X card, correct)? Again though, keep in mind that my goal is to use software RAID. I am sort of preferential to software RAID for two reasons: A) I understand that with a decent CPU the performance isn't half bad, and B) with hardware RAID, as I understand it, you end up pretty tied to the controller card you choose; it's difficult to migrate the array to a new system, and if the card fails you need to replace it with the exact same card to avoid losing the entire array.

5. Fans. I will look into the Panaflo L1A's; that sounds like a great suggestion, thanks Syntax Error and Epimetheus.

6. OS. The only reason I decided that WHS might not be for me is that its focus seems to be on the 'storage pooling' system instead of RAID, and my understanding is that storage pooling does not provide parity and thus fault tolerance. So that is why I chose to try either FreeNAS with software RAID5/6, or, if my hardware isn't supported under FreeNAS, software RAID5/6 on Server 2008. Based on my thoughts here, is there a reason I should circle back and reconsider WHS? Also, as far as I am aware, software RAID in Server 2008 is still fully implemented; I have not read otherwise.

7. Racks. Thanks for the suggestions/feedback on this (nice rack, Croakz; that Hardigg rack looks nice to me, and I like the idea of something movable, which is why I chose one with wheels). I might investigate whether there is somewhere local I could purchase a small 8-12U rack before I pull the trigger on all the other components. In the end a rack is a pretty simple metal cage, and while I would love to get something that looks decent I don't want to waste money on it either.

Overall, based on what people have said, it sounds like choosing a different motherboard can really reduce my overall component cost while maintaining a decent level of performance for a software RAID5/6 file server. I'm open to more suggestions of motherboard/CPU/RAM combinations with plenty of PCI-E/PCI-X slots. It also sounds like I should spend a bit more on a controller card. Something with 18 ports or so would be ideal, so that I could use the 6 or so motherboard ports to get to 22 drives or so. I doubt I could still afford $1000 on a RAID card, but $500 might be workable if I shave cost on other components. Though this all does make me wonder: would a better RAID controller card even affect performance if I use software RAID?

Again, thanks for all the suggestions and advice. I look forward to hearing more from everyone so that I can make a better/more cost effective decision for what I want to accomplish.
 
4) The PCI-E Dell Perc 5/i supports 8 SATA drives. While a decent CPU would allow for decent performance with software RAID, the problem is that you're about to spend enough money that you could put it towards an actual hardware RAID card, which can easily outperform software RAID by at least a factor of two. A few links about Perc 5/i cards:

If you don't mind buying used hardware with little or no warranty, you could buy a Dell Perc 5/i card off eBay for ~$125 plus two of these SFF-8484 to 4 x SATA cables for ~$25. Thus, you get 8 ports and a true hardware RAID controller for ~$150 or so. Add a battery backup unit for ~$50 and you're set in case of power outages. Not a bad deal considering that a new true PCI-E hardware RAID controller runs around $300 (Areca ARC-1210) for 4 ports. But those Dell Perc 5/i cards are finicky about motherboards, and the card will take up a PCI-E x16 slot if you don't have a PCI-E x8 slot.
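Rough per-port math using just the prices quoted above (no other assumptions):

```python
# Cost per SATA port: used Perc 5/i bundle vs. a new 4-port Areca ARC-1210.
perc_cost  = 125 + 25 + 50    # card + two SFF-8484 fanout cables + battery unit
perc_ports = 8

areca_cost  = 300
areca_ports = 4

print(perc_cost / perc_ports)    # 25.0 dollars per port
print(areca_cost / areca_ports)  # 75.0 dollars per port
```

So even with the battery unit included, the used Perc works out to a third of the per-port cost of the new card.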

Read these threads for more info:
Dell Perc 5/i RAID Card: Tips and Benchmarks
Finally went to a Hardware Raid5 controller...
Dell Perc 5/i - Mainboard Compatibility List
Solution for Dell Perc 5/i for Intel Chipsets
Add 8 device SAS/SATA 256MB BBU Enterprise class RAID card to your rig for about $100 w/ PERC 5i (LSI 8480E OEMed to Dell)

6) There is a feature in WHS called duplication. It's almost like RAID1: a second copy of whatever files/folders you have set duplication on will be placed on another hard drive. So should a drive die, whatever files on that drive had duplication on would not be lost, since there is a backup copy of those files on another drive. I highly recommend looking through the WHS FAQ stickied at the top.
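A toy Python model of the idea (hypothetical structure, nothing like WHS's actual storage driver): duplicated folders survive a drive loss, non-duplicated ones don't.

```python
# Two pretend drives; duplicated items get a second copy on the other drive.
drives = {"drive1": {}, "drive2": {}}

def store(name, data, duplicate):
    drives["drive1"][name] = data
    if duplicate:
        drives["drive2"][name] = data   # second copy lands on another drive

store("photos.zip", b"...", duplicate=True)
store("tv-rip.mkv", b"...", duplicate=False)

del drives["drive1"]                    # simulate drive1 dying
survivors = drives["drive2"]
print("photos.zip" in survivors)        # True  - duplicated, still safe
print("tv-rip.mkv" in survivors)        # False - the single copy is gone
```

The trade-off versus RAID5 is space: duplication costs a full second copy of whatever you protect.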

As an aside, look through Ockie's Project Galaxies. The man had $1000+ hardware RAID cards at one point yet ditched all of that for a WHS setup. That should tell you something about the usefulness of WHS:
http://networkisdown.com/showthread.php?t=276
 
About software RAID5 in Server 2008: it is supported... my head was in Vista Land when I wrote that.

4)
a. Well, I guess it depends on the person, but software R5 is not for me. Yeah, the reads are decent, but writes suck. Filling your drives is going to take you a very long time.

b. If you want to move systems you really just have to move the card with the drives, because the config is all stored on the card.

6) Reasons to use WHS.
a. Solid Server 2003 Base
b. Massive storage without the massive price
c. Addins
d. You will probably see better speeds from a WHS setup than from software R5.
Yeah, software R5 may have higher read speeds, but you already said you just need it to play 2 1080p clips. With WHS you can also write to the server at the same time, since the writes will go to another disk (assuming there is space available on another disk).
With my WHS I can do sequential writes at 80MB/s over the network.
If you go the software RAID route and try to do anything else while writing, your video will skip, stutter, and maybe even crash.
With software R5 you will be lucky to get 20MB/s writes, and that will be closer to 3MB/s if you are playing a movie from the array.
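To put those write rates in perspective, here is the quick fill-time math for a 4.5TB array (decimal units, assuming sustained throughput):

```python
# Hours needed to fill 4.5TB of usable space at various sustained write rates.
usable_tb = 4.5
usable_mb = usable_tb * 1_000_000   # decimal TB -> MB

for rate_mb_s in (80, 20, 3):
    hours = usable_mb / rate_mb_s / 3600
    print(f"{rate_mb_s} MB/s -> {hours:.1f} hours")
```

That works out to roughly 16 hours at 80MB/s, 62.5 hours at 20MB/s, and over 400 hours at 3MB/s, which is why slow software R5 writes matter even for a media box.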
 
Have you considered Solaris 10 with RAID-Z and ZFS? It took about 5 minutes to configure the array and share it out via CIFS (Windows file sharing) to my Windows boxes. The performance is excellent (I can max out a 1Gb link no problem, and probably two+ if I wanted), and point-in-time snapshots of the entire filesystem with no performance hit are sweet! I have snapshots going back 2 months that I can literally time-slide to and then copy and paste files out of. I went with the following build for my Solaris NAS:
Asus M2N-LR - 6 SATA, dual PCI-X, dual PCI-E, dual Ethernet
Athlon X2 5050e processor (low power with plenty of horsepower for Solaris and VirtualBox)
4GB of Patriot DDR2
5 x 750GB Samsung F1's

Solaris natively supports that Supermicro card you listed and a lot of people use them as you only need a JBOD card for Solaris as all the magic is in the software layer.

The best thing is you aren't tied to a specific platform or RAID type. I actually moved the array to this config after installing it on a more basic mainboard; I reinstalled Solaris, which immediately saw the old array config and mounted it up without any interaction from me (Solaris writes the array config to the drives in the array).

There are a load of guides on setting up a Solaris NAS, and having played with FreeNAS and OpenFiler (which I think is better than FreeNAS at this point), Solaris is way more stable and much faster. PM me if you want more details on the setup, but it's really straightforward.

Oh and there isn't a volume pool size limitation (I'm sharing out a 3TB pool to my Windows Boxes right now).
 
Wow, more great replies. Argh! I have been given so much to think about that I have changed my mind several times already. Based on the research I have done on all the answers, I will try to reply to everyone.

Danny Bui, you've changed my mind about a few things; goodbye to software RAID5. I've given up on that idea for the moment. I'd entertain spending up to $500 (more?) on a good hardware RAID card that could get me close to the 20-drive max I am looking for (based on the enclosure size). However, based on reading your links I don't believe I want to save that much money and go with the Dell Perc 5/i. That sounds like more hassle getting it to work than it is worth to save some money. I'd rather have a RAID card that I am using in a supported way, rather than trying to hack an enterprise-class RAID controller into a (relatively) consumer-grade NAS.

Also, I've turned around on WHS based on some research. I do find it a bit odd that it seems to only come in a 32-bit edition and that it does not natively support GPT partitions larger than 2TB (it sounds like you have to do some hacking to get this to work). However, based on the way the storage pooling works, this isn't a huge concern: it can 'pool' together much more than 2TB, and it doesn't really work with RAID volumes anyhow, so you still interact with a pool that "looks" larger than 2TB. Also, I will admit I really like the idea of WHS duplication. That does make me feel a bit more secure than RAID5; it sounds like it is probably easier to manage and recover from a failure.

As well, I have read through Ockie's Project Galaxies; amazing and fascinating. It makes me wonder, though: you seem to be making two conflicting points. If I go with WHS, is there really any need for expensive RAID cards? I suppose they could be useful if I decided to go down the Server 2008 route in the future, but as far as I can tell I wouldn't see the advantages of a good hardware RAID controller on WHS, would I? It looks like Galaxy 6.0 simply used the SuperMicro SAT2-MV8 PCI-X cards. This makes me wonder what the performance of WHS is like compared to hardware RAID and whether it is even worth concerning myself over. That gives me much to consider.

Nitrobass24, as stated above, you have also helped sway me; I'm moving away from the software RAID5 approach. Also, as mentioned above, I am really entertaining WHS as well. Based on what you have said, it sounds like I could accomplish my playback goal with WHS, which is really what I am going for anyhow. Do you have duplication enabled on all of your WHS shares (I know that I would if I went with WHS)? If so, what is the performance like when doing a file copy and some playback with duplication enabled? Also, is there an x64 version of WHS? I can't seem to find one on Newegg. Would having more than 4GB of RAM do any good anyhow? I can't see that it would unless I used it to host VMs (which might be fun).

As an aside, I should also note that I was mainly considering Server 2008 because I have access to it through work for the low, low price of $0. I wouldn't even CONSIDER it based on purely the price if that wasn't the cost. This is also why I was initially considering something like FreeNas as well; the cost was right. But, I don't mind spending the $$ for WHS if it makes sense for what I want to accomplish; especially at that price point.

Have you considered Solaris 10 with RAIDZ and ZFS? It took about 5 minutes to configure the array and share it out via CIFS (Windows file sharing) to my Windows boxes. The performance is excellent (I can max out a 1Gb link no problem, and probably two+ if I wanted), and point-in-time snapshots of the entire filesystem with no performance hit are sweet! I have snapshots going back 2 months that I can literally time-slide to and then copy and paste files out of. I went with the following build for my Solaris NAS:
Asus M2N-LR - 6 SATA, dual PCI-X, dual PCI-E, dual Ethernet
Athlon X2 5050e processor (low power with plenty of horsepower for Solaris and VirtualBox)
4GB of Patriot DDR2
5 x 750GB Samsung F1s

Solaris natively supports that Supermicro card you listed and a lot of people use them as you only need a JBOD card for Solaris as all the magic is in the software layer.

The best thing is you aren't tied to a specific platform or RAID type. I actually moved the array to this config after first installing it on a more basic mainboard; I reinstalled Solaris, which immediately saw the old array config and mounted it up without any interaction from me (Solaris writes the array config to the drives in the array).

There are loads of guides on setting up a Solaris NAS, and having played with FreeNAS and OpenFiler (which I think is better than FreeNAS at this point), Solaris is way more stable and much faster. PM me if you want more details on the setup, but it's really straightforward.

Oh and there isn't a volume pool size limitation (I'm sharing out a 3TB pool to my Windows Boxes right now).
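For anyone curious what that setup looks like, here is a minimal sketch. The pool name (tank), the device names, and the snapshot label are placeholders, and the CIFS method varies: the in-kernel `sharesmb` property came with OpenSolaris, while a stock Solaris 10 box typically shared via Samba instead.

```shell
# Build a 5-disk RAIDZ pool (single parity, like RAID5); device names are examples
zpool create tank raidz c1t0d0 c1t1d0 c1t2d0 c1t3d0 c1t4d0
zfs create tank/media                   # a filesystem inside the pool
zfs set sharesmb=on tank/media          # share to Windows boxes, where sharesmb is available
zfs snapshot tank/media@2009-04-01      # point-in-time snapshot, effectively free
zfs list -t snapshot                    # the snapshots you can copy files back out of
```

Because the array config lives on the drives themselves, the same `zpool import` works after moving the disks to a new board, which is the portability described above.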

Very interesting post. I had not considered such an approach. I will openly admit I am a complete Nix/BSD noob (I play with Ubuntu and openSUSE on a laptop), but I know even LESS, by leaps and bounds, about Solaris. This sounds like a fascinating idea, but based on some of the reading I have done, I am not sure I would feel wholly comfortable basing my entire NAS platform, which stores all my precious data (currently 5TB or so), on an OS that I know absolutely nothing about and would have a very tough time troubleshooting. It makes me a bit nervous to throw myself into the proverbial Solaris pool that way as a semi-competent Windows admin with limited UNIX/Linux/Solaris experience. Either way, I am going to seriously keep this in mind as an alternative to FreeNAS or OpenFiler if I decide to take the open source plunge.

So, overall, what I can take away from this second round of comments is that I need to reconsider my hardware build, find a cheaper motherboard/CPU/RAM combo (probably AMD, something that has PCI-E and PCI-X), and explore spending some money on a hardware RAID controller card. It also sounds like it wouldn't hurt to get 1 or 2 of the SuperMicro cards along with whatever hardware RAID I choose. If they are good enough for the Galaxy 6.0, why not for my build? Additionally, it sounds like I should firmly examine the WHS route and perhaps go with that, at least to start with. It's not that expensive, people here seem to love it for exactly what I want to do, it has built-in data duplication features, and the performance sounds like it would fit my needs. If I dislike it, I can always go the Server 08/open source route, right? :) As soon as I have time to rummage around Newegg, I will try to find some new components that meet what I mentioned above. Thanks again all for the great comments; this is a good learning experience.
 
You definitely have received some different approaches, but I would not say conflicting advice.

With Ockie's Galaxy: he had expensive RAID arrays, but when he saw the WHS light, he switched and moved the drives onto the Supermicro SAT2-MV8 controllers (I also use and recommend these).

I use duplication on some but not all things.
I have not noticed a performance difference when copying to shares that are duplicated vs. those that are not.
NOTE: The duplication feature is not real-time like parity.
When a drive fails, WHS will not re-duplicate until you tell it to, i.e. no hot spares.
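The capacity trade-off between the two approaches is easy to put numbers on; a sketch assuming a hypothetical 8 x 1.5TB array with every share duplicated:

```shell
# Work in tenths of a TB so plain shell integer math is enough
disks=8; size=15                         # 8 drives x 1.5TB
raw=$((disks * size))
dup=$((raw / 2))                         # duplication keeps 2 copies of everything
r5=$(((disks - 1) * size))               # RAID5 loses one drive to parity
echo "raw:             $((raw / 10)).$((raw % 10)) TB"
echo "WHS duplication: $((dup / 10)).$((dup % 10)) TB usable"
echo "RAID5:           $((r5 / 10)).$((r5 % 10)) TB usable"
```

In practice WHS lands between the two numbers, since duplication is per-share and replaceable media can be left un-duplicated.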

All that said I still find it a great option for those that need mass storage for the home. I use mine to serve all my media files across my house and can play stuff on 2 computers simultaneously without a hitch.
Also, you don't need high-end, expensive hardware.

There is no x64 version of WHS.....yet. WHS v2 will be 64-bit AFAIK.
I have 4GB of RAM and a quad core on my WHS because I also use it for VMs.
When I first started using WHS I had a PIII 733MHz and 1GB of RAM. It ran well too.
Really, there is no need for more than 2GB unless you want to do VMs or something other than serve files.
At the same time, I would not go less than 2GB.

If you're going to cheap out anywhere, I would do it in the CPU department. It's the least important part, IMO.
 

Oh sorry for the confusion. If you don't go with WHS and plan to use another OS, then I recommend getting the RAID cards. If you do go with WHS, go with the cheap Supermicro 8 Port PCI-X cards (which will work in a PCI port just fine) for around $100 for 8 ports:
SuperMicro AOC-SAT2-MV8 64-bit PCI-X 133MHz (PCI Compatible) SATA Controller Card - $98

I haven't seen any performance numbers comparing WHS and a hardware RAID card. However, I don't think it's worth concerning yourself over unless you plan on having 20 users accessing the server at the same time.

If I dislike it, I can always go the Server 08/open source route right? :)

Yup. In fact, there's a free 120 day trial of WHS available:
http://www.microsoft.com/windows/products/winfamily/windowshomeserver/eval.mspx

So it still won't cost you any money (for the OS) if you try out WHS. If you don't like it, then go with Server 08 or any of the open source software.
 
Common to all suggestions: the Norco 4020 is good (I like the two I have a lot), get a good PSU, and your drive choices are fine. A cheaper mobo \ cpu \ ram combo would do you OK too, if it's just a fileserver.

Now it gets directional. The suggestions at the OS level of "try before you buy" are good, so maybe get the cheap HBA \ controller cards and forgo hardware RAID while you're "trying". If that doesn't work out, you should be able to resell the controllers for little loss, then pick up your hardware card.

BTW, if you're doing arrays over 8 drives with hardware RAID, I'd suggest RAID6 or at least RAID5 + hot spares. I woke up to 1 failed drive in a large RAID5 once, and I was sweating bullets the entire rebuild time. But I'm the paranoid type :)
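That rebuild anxiety can be roughly quantified. A sketch assuming a hypothetical 8 x 1.5TB RAID5 and the 1-error-per-10^14-bits unrecoverable read error (URE) spec common on consumer SATA drives of this era:

```shell
# Probability the rebuild reads all 7 surviving drives without hitting a URE
awk 'BEGIN {
  bits = 7 * 1.5e12 * 8        # bytes read during the rebuild, times 8 bits
  p_ok = exp(-bits / 1e14)     # Poisson approximation for zero errors
  printf "P(clean RAID5 rebuild) ~ %.0f%%\n", p_ok * 100
}'
```

A second parity drive (RAID6) survives exactly this case: a read error discovered while the array is already degraded.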
 

Based on some serious consideration of what has been said here, I think I will take what Ockie did and go in that same direction. I will simply start out by purchasing two of the SuperMicro SAT2-MV8 controllers and go with WHS. However, is it necessary to run the SuperMicro SAT2-MV8s in PCI-X 133MHz slots? Does it make a real performance difference as opposed to plain PCI? Basically, if it matters, I should buy a server motherboard with 2 PCI-E x16 slots, 2 PCI-X 133MHz slots, and a PCI-E x1 (or x4 or x8) if I can get it. If it doesn't matter, and there is really no need for me to have 2 PCI-X 133MHz slots, then essentially I can choose from a massive number of consumer motherboards (I will detail some of the server motherboards I have found with these slot configurations in my next post).


As I mentioned above, I will go with the SuperMicro SAT2-MV8 cards to start with and see how I like that. You also mention that they will work in a PCI slot just fine. Any idea how the performance is, or are you just noting that they function, not necessarily that they perform decently with 8 drives on each card?

Just for the heck of it, I also researched what RAID controller cards I would want if I decided to go that route in the future for hardware RAID; these seemed like solid, affordable choices:

HighPoint RocketRAID 2680 PCI-Express x4 SATA/SAS Controller Card - $260. PCI-E, and it could do 8 SATA drives, I believe, using the two mini-SAS ports.

HighPoint RocketRAID 2340 PCI-Express x8 SATA I & SATA II Controller Card - $430, but 16 SATA ports using the mini-SAS ports; doesn't seem too bad.


What did you think of my PSU choice:

Power Supply: Antec TPQ-850 850W ATX12V / EPS12V. Too expensive, not powerful enough, not enough power cables/connections? I am partial to Seasonic/Antec, both for their reliable power and low noise (and modular cabling is a must these days). Any idea if it would have issues fitting in the Norco, or is that case pretty flexible? Also, I am working on the motherboard choices; I will post my thoughts in the next post. That will essentially determine my CPU/RAM choices; so far it seems difficult to find a cheap motherboard with the right slots.

As mentioned above, I am going to go for WHS with pooling/duplication with just the SATA controllers, and go for hardware RAID perhaps later. However, I do fully agree with you: if I go RAID, I want RAID5 with a hot spare for sure, or RAID6. I don't want to worry about my data or have to sweat through a horrible data recovery process.
 
However, is it necessary to run the SuperMicro SAT-MV8's in PCI-X 133Mhz slots? Does it make a real performance difference opposed to PCI?

Not necessary AFAIK. I don't think it would make a real performance difference, but I could be wrong about that.

Just for the heck of it, I also researched what RAID controller cards I would want if I decided to go that route in the future for hardware RAID; these seemed like solid, affordable choices:

HighPoint RocketRAID 2680 PCI-Express x4 SATA/SAS Controller Card - $260. PCI-E, and it could do 8 SATA drives, I believe, using the two mini-SAS ports.

HighPoint RocketRAID 2340 PCI-Express x8 SATA I & SATA II Controller Card - $430, but 16 SATA ports using the mini-SAS ports; doesn't seem too bad.
Those aren't true hardware RAID cards. Those are hardware-assisted software RAID (or fakeRAID) cards, meaning they still use the PC's CPU to do all of the parity calculations, just like software RAID. They're only a little bit better than actual software RAID.

What did you think of my PSU choice:

Power Supply: Antec TPQ-850 850W ATX12V / EPS12V. Too expensive, not powerful enough, not enough power cables/connections?

Not that great of a choice. The Corsair 850TX is of equal quality to that Quattro 850 yet costs $60 less. And, as much as I hate to say this, you might be better off with a dual- or single-rail PSU due to the sheer number of hard drives the Norco can hold. Since you're going to use all 20 + 1 hard drive bays anyway, the whole point of modular cables is a bit moot.

I recommend the Corsair 850TX if you don't need modular cables (a slightly bigger version of the Corsair 750TX that Ockie used in one of the Galaxy 6.0s):
Corsair 850TX 850W PSU - $140

The only modular PSU I can recommend for your system is this:
Corsair 1000HX Modular PSU - $233
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
Due to the way Newegg provides sorting categories for server motherboards, I have had a helluva time locating boards with the following slots:

  1. 2 PCI-E x16 or x16 2.0 slots for RAID controllers in the future or a fast NIC
  2. 2 PCI-X 133MHz slots for the SuperMicro SATA controllers
  3. An assortment of other lesser PCI-E slots; an x1 would be great, or an x4 or x8 would be nice as well. I want flexibility for additional hardware RAID controllers

For whatever reason, it doesn't properly sort 133MHz PCI-X slots (there is no option for this, only 100MHz, so it completely excludes 133MHz from the sorting), and it appears to exclude PCI-E 2.0 slots from the PCI-E x16 sorting. So, I have manually pawed my way through some motherboards, found these, and weighed some pros and cons:

ASUS M2N32-WS Pro AM2 NVIDIA nForce 590 SLI ATX Server Motherboard

It's a deactivated item, but I like that it is DDR2 800 (cheap) and has 2 x16 slots, 2 PCI-X 133MHz slots and 6 SATA ports. I also like that it is the AM2 socket type; also cheap. The PCI-E x1 slot is a welcome addition as well; not sure why it isn't sold there anymore.

SUPERMICRO MBD-C2SBX LGA 775 Intel X38 ATX Server Motherboard

I like that this is a Core 2 Duo socket type (cheap), but I am unhappy that it is an X38 with DDR3 1333 memory, as that tends to be pricier (though I am fairly certain I could get a 2-4GB 1333 kit fairly cheap). At least it isn't a Core i7; still $199, though (not horrible, but consumer motherboards are cheaper). I also like that it has two PCI-E 2.0 x16 slots, a PCI-E x1 slot and two PCI-X 133MHz slots WITH 2 PCI slots. That's a whole SLEW of slots. With 6 SATA ports onboard, that's a fairly flexible motherboard; it seems like a fairly tempting choice.

SUPERMICRO MBD-PDSGE-O LGA 775 Intel 955X

Sadly as pricey as the above board, $199, but with cheaper RAM choices (DDR2 667). It's also an LGA 775 board. It has the disadvantage of only having 1 PCI-E x16 slot vs. the 2 above, though it does have an x1, 2 PCI-X 133MHz slots and 3 PCI slots.

ASUS KFN32-D SLI/SAS Dual 1207(F) NVIDIA nForce Professional 3600 + 3050 Server

I am barely even considering this because it is $200 and has dual AMD procs (I have no need for two processors). But it fits the bill with the right slots: 2 PCI-E x16 slots, 1 PCI-E x8 slot, 2 PCI-X 133MHz slots and 2 PCI slots, plus 6 SATA ports and DDR2 667 (cheap). It doesn't seem any better than the other motherboards I found, plus it has funky CPU support.

So, clearly I need to do some more manual looking at motherboard specs, but here is my conclusion so far: the combination of slots I am looking for A) requires a server-grade motherboard to get the PCI-X 133MHz slots, and B) only tends to be present on more expensive motherboards.

I think if I want to save on cost, the basic sacrifice is to stop looking for anything with PCI-X slots to put the two SuperMicro AOC-SAT2-MV8 64-bit PCI-X 133MHz SATA cards in. If PCI slots would be more than sufficient for these cards, PCI-X wouldn't really matter, would it? As far as I can tell (and I may be very wrong), PCI-X is on its way out. It seems like the only real reason to have it is the Supermicro cards. The future in SATA expansion cards (even the non-RAID-oriented ones) appears to be moving toward PCI-E; if not x16, there appear to be a lot of x4 or x8 cards. So, if PCI-X isn't really necessary for the SATA controller cards I might get, and it's not necessary for future expansion, then I don't need it, right? Is my thinking sound here, or should I invest in one of the above motherboards (the X38 looks like the best choice)?
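A quick bandwidth sanity check shows when PCI-X would actually matter for an 8-port card; the per-drive throughput below is an assumption for a circa-2009 7200rpm SATA drive:

```shell
drive_seq=90   # MB/s sequential per drive, assumed
# Worst case: all 8 ports on one SAT2-MV8 streaming at once (e.g. a rebuild or benchmark)
echo "8 drives at once:  $((8 * drive_seq)) MB/s"
echo "plain PCI bus:     ~133 MB/s shared  (a bottleneck in that worst case)"
echo "PCI-X 64-bit/133:  ~1066 MB/s       (plenty of headroom)"
# JBOD-style WHS usage: typically only 1-2 drives active at a time
echo "1-2 active drives: ${drive_seq}-$((2 * drive_seq)) MB/s"
```

So PCI-X only pays off when many drives on one card stream simultaneously; for one-file-at-a-time serving, plain PCI is close enough.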

Depending on the responses, I will either keep looking for the slots I outlined or start over from scratch looking at consumer-style motherboards with 2 PCI-E x16 slots, PCI-E x1 slots, PCI-E x4/x8 etc., plus a couple of PCI slots to put the SuperMicro cards in.

Anyhow, I look forward to reading the responses; I appreciate all of the advice!
 
First off, like Danny said, those HighPoints are not hardware RAID cards. If you're going to spend that much, get a real card.

PSU:
Those Corsairs are good, but I think a little overkill, and slightly expensive.
If you want Corsair, I think the HX750W would be plenty.

I use a PC P&C 750W and it's an awesome PSU; I highly recommend it. There are a few for sale in the forums for really good prices.

The SAT2-MV8s run fine in a regular PCI slot for WHS, because with WHS you are essentially doing a JBOD. And PCI is a 133MB/s bus, more than a single drive's sequential throughput.
Not to mention you are really going to be limited also by your network.
I run one in a PCI slot and one in a PCI-X slot....the only time I can tell the difference is when I force writes to certain drives for benchmarking.
In actual use I have never been able to tell.
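The network ceiling really is the binding constraint for the stated goal of two 1080p streams; a sketch with an assumed, generously high 40Mbit/s per-stream bitrate:

```shell
gige_mbit=1000                     # gigabit Ethernet line rate, megabits/s
echo "gigabit Ethernet:  $((gige_mbit / 8)) MB/s theoretical ceiling"
stream_mbit=40                     # assumed bitrate of one high-quality 1080p stream
streams=2
echo "2 x 1080p streams: $((streams * stream_mbit / 8)) MB/s needed"
```

Even with protocol overhead cutting the real ceiling to something like 80-110 MB/s, two streams use a small fraction of the link, which is why a single drive behind a plain PCI slot keeps up.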
 
I also would recommend the PCP&C 750w. I have 4 running various storage servers and I've had no problems with them. I've seen them as low as $80 from Newegg, with rebate.

Also, you mentioned doing 4 x 1.5TB and 4 x 1TB, so a single SAT2-MV8 will do you OK if you're gonna go the WHS non-RAID route. And then you can always pop in another card when you need more drives.

Also, if you look at, say, Ockie's recent builds, he's using cheaper motherboards and procs. These will take SAT2-MV8s in a standard PCI slot, and if you feel the need to go hardware RAID, some will work with hardware RAID cards in the PCI-E slot usually meant for graphics. Just tossing out cheap methods here. I've had plenty of storage servers where I went overkill on the mobo\proc and wished I'd spent more on drives instead :)
 

Thanks for the feedback again here Danny. It seems that the consensus is that I don't need PCI-X slots for the SuperMicro cards. In that case I can look at much cheaper consumer motherboard/CPU combinations with a couple PCI-E slots instead.

Also, good to know about the RAID cards. From the sounds of it, might as well use SuperMicro cards and software RAID instead of buying something like that. Very good to know.

As for the PSU, in an effort to save money, I have changed over to the 850W Corsair PSU you recommended. In this particular case you make a good point about modular cabling; I'm going to need all the cables I can get, most likely. However, I may also take a look at the 750W PC P&C people recommended. Though honestly, it seems like in this case the more power the better (saving money doesn't hurt either, though).


Thanks for the information, Nitrobass24; I will avoid those HighPoint cards. Additionally, I am going to add that PC P&C 750W to the list of possibilities; it sounds like another good (cheaper) option, just with 100W less. Also, as I mentioned in response to Danny's post, I think I will go with the SAT2-MV8s in regular PCI slots; it seems like holding out for PCI-X slots isn't worth it. Thank you for posting your actual experience with this; very helpful.


PC P&C is under strong consideration for me based on some of the recs here; it sounds like a great value. Also, it's good to know that you find them reliable; I am sick of RMAs and poor tech support.

I think to begin with I will just get 2x SAT2-MV8s so that I have them at the ready if/when I add more storage. For now, there's no need for real hardware RAID that would require a PCI-E slot anyhow.

It is good to hear that the CPU/motherboard don't play a huge part in overall performance. Something to keep in mind when building a storage system. That much is clear when I really look at Ockie's builds; you see where the money really goes.

-------------------------------------------------------------------------------------------------------------------------------

So, overall based on the last few posts I have been able to synthesize that I don't need a server style motherboard. In the end I had decided that if I wanted PCI-X slots the SuperMicro X38 board was the best choice:

SUPERMICRO MBD-C2SBX LGA 775 Intel X38 ATX Server Motherboard

However, as it sounds like that's not really necessary anyhow, I am going to start looking at cheaper consumer-style motherboards: low-end LGA 775 or AM2+/AM3, so that I can get a lower-end (read: cheap) Core 2 Duo or Phenom. I'll look for a motherboard that ideally has 2 PCI-E x16 slots, a PCI-E x1 slot or 2, and 2 PCI slots; something that has room for the SuperMicro cards but also room to grow into hardware RAID controller cards. Once I spec something like that out, after reading a few reviews, I'll post it.

Now, the other plan I am considering will cost a bit more but benefits me in two ways. Currently, in my "primary" Vista/gaming PC I have the following:

GIGABYTE GA-P35C-DS3R LGA 775 Intel P35 ATX

Intel Core 2 Quad Q6600 Kentsfield 2.4GHz 2 x 4MB L2 Cache LGA 775

CORSAIR XMS2 2GB (2 x 1GB) 240-Pin DDR2 SDRAM DDR2 1066

I am thinking, for fun, I could build myself a nice new Core i7 system with a new CPU, X58 motherboard and DDR3 1600MHz RAM. Then I keep all the other existing components in my current PC (case, PSU, disc drive, etc.) and move the above components (CPU, motherboard and RAM) from my old PC into the NAS. The only major disadvantage of the P35 Gigabyte motherboard for the NAS is that it only has 1 PCI-E x16 slot (it's heavier on PCI-E x1 and PCI slots). The CPU is more than adequate (Q6600), and 4GB of that Corsair RAM is also just fine. I thought this might be a fun way to upgrade my main PC and create a NAS at the same time (even if the components in my old gaming PC aren't perfectly 'NAS suited') :). Sadly, though, this adds a few hundred dollars to the build, as obviously X58 boards and even the cheapest Core i7 aren't that cheap; either way, something for me to consider before I spec out a cheaper consumer setup for the NAS with dual PCI-E x16 etc.

In case you are curious the components I was considering for replacement "primary PC" parts were (I haven't researched these fully, this was to give me a quick idea of how much this plan would cost):

EVGA 132-BL-E758-TR LGA 1366 Intel X58 ATX Intel Motherboard

Intel Core i7 920 Nehalem 2.66GHz 4 x 256KB L2 Cache 8MB L3 Cache LGA 1366 130W

OCZ Gold 6GB (3 x 2GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800) Triple Channel Kit Desktop Memory
 
If you want good data storage, get a system with about 2-4GB of DDR2 RAM and as much storage as you want. And get a NOS on it :).
(Network operating system, if you weren't sure.) :)
 
So, I've spent some time putting together a number of potential NAS builds. Essentially, the only differences between the builds are the CPU, motherboard and RAM. I will comment on some of my choices for each build in a list after each link. Basically, I looked for motherboards of various types with 2x PCI-E x16 slots, 1 PCI-E x1, and a minimum of 6 SATA ports, that had decent reviews and were affordable. For RAM I looked for 4GB with a good balance of low cost to high average user reviews. CPU choices were made on the basis of the low end of newer processors with decent performance and fairly low wattage. The other components, present in each build, I have pretty well decided on:

NAS Without CPU/Motherboard/RAM - $1,175.84

Here are the components that don't vary between any of the builds and a brief explanation for my choice of each:

  • Norco case - A given, everyone likes this
  • 2x SuperMicro cards - Also a given
  • The Corsair 850TX PSU - Darn good value and I like having that much max power
  • WD 650GB - Decent OS drive, everyone seems to like it
  • Seagate 1.5TB x2 - Good $/GB ratio
  • Sony Optiarc Slim SATA drive - Picked it because it was a slim drive and SATA
  • Windows Home Server - For the pooling/duplication features and ease of use
  • Panaflo 80mm Case Fan - The one I have linked MAY be the wrong fan people were referring to before; I couldn't find it on Newegg, so I added 7 of these to the build to estimate how much 7x quieter fans would cost me

NAS With Intel C2D SUPERMICRO MBD-C2SBX LGA 775 X38 4GB DDR3 1333 - $1,615.81

  • Server-style X38 motherboard, so it has LGA 775, 2 PCI-E slots, 2 PCI-X slots, 1 PCI-E x1, dual 1Gbps NICs, and DDR3 1333 support. I like that I can throw a cheaper Core 2 Duo into it but still have decent slots, etc. It even has PCI-X slots to put the SuperMicro cards into (though it sounds like this isn't really necessary)
  • I chose the C2D E8400 (for this build and my other LGA 775 builds) as it is a 65W part, seemed pretty powerful and had great reviews. Pricing also seemed affordable.
  • The G.Skill DDR3 1333 4GB RAM I chose purely based on A) low cost and B) a decent user rating on Newegg. Normally I would choose Corsair or OCZ, as I am more familiar with those vendors, but I was looking at cost here. I employed this RAM in my other DDR3 builds.


NAS With Intel C2D GIGABYTE GA-EP45-UD3P LGA 775 P45 4GB DDR2 1066 - $1,530.81


  • The motherboard was chosen because it is affordable: LGA 775, 2 PCI-E x16, 3 PCI-E x1, 2 PCI slots, 8 SATA ports and dual NICs. It also has pretty fantastic user reviews. For whatever reason, lately Gigabyte's cost/features ratio has been pretty good, so I have ended up liking a lot of their boards. I am not married to that vendor in any way; to be honest, I like eVGA more, PURELY based on their fantastic support.
  • Same Core 2 Duo Wolfdale as I used with the other builds: LGA 775, 65W
  • More G.Skill RAM. This was chosen as it also had a great overall user rating and was a wonderful price for 4GB of DDR2 1066


NAS With Intel Core i7 EVGA 132-BL-E758-TR LGA 1366 Intel X58 6GB DDR3 1600 - $1,867.81


  • This needs slightly more explanation. If I employed this plan I wouldn't use this motherboard/RAM/CPU in the NAS. I would put these in my current primary gaming PC and move that machine's motherboard/RAM/CPU into the NAS. That is why this "build" is the most expensive; I would be building a NAS and upgrading my current system. I also chose parts here based MOSTLY on performance/reviews rather than low price.
  • This motherboard was chosen mostly because it is an AFFORDABLE X58 board (there aren't many) and it is an eVGA. Triple-channel DDR3 RAM support, tri-video-card support, 9 SATA ports, Core i7; what else could a gamer want?
  • The i7 920 Nehalem 2.66GHz was chosen as I could not justify the cost of the other Core i7 CPUs, even for my main gaming box.
  • The OCZ 6GB kit (triple channel) was chosen for its darn good performance, stability and very competitive pricing
  • Overall this option is pricier than I wanted, but it is still appealing. However, you get the most gaming performance out of upgrading a GPU anyhow, so I probably wouldn't see a whole lot of real-world gains from this (though it would be fun).

NAS With Intel C2D GIGABYTE GA-EP45T-UD3P LGA 775 P45 4GB DDR3 1333 - $1,555.81

  • This motherboard was chosen as it had LGA 775 support, dual PCI-E x16 slots, DDR3 support and 8 SATA ports. It is basically a DDR3 version of the other Gigabyte board I liked the look of.
  • Same Core2Duo Wolfdale as I used with other builds, LGA 775, 65W
  • The same GSkill DDR3 1333 4GB RAM I used in other builds

NAS With AMD AM3 ASUS M4A78T-E 790GX 4GB DDR3 1333 - $1,504.82

  • The motherboard here was chosen as it was on a short list of AM3 motherboards I could find; it had DDR3 support, 2x PCI-E x16 slots, 5 SATA RAID ports and support for the Phenom II. I also like that this motherboard has onboard graphics: one less slot taken up.
  • The Phenom II X3 710 2.6GHz CPU was chosen as it was 95W and one of the more affordable Phenom II choices.
  • The same GSkill DDR3 1333 4GB RAM I used in other builds

NAS With AMD AM2+/2 ASRock A780GXE/128M 780G 4GB DDR2 1066 - $1,425.82

  • I chose this motherboard as it was AM2+/AM2, damn affordable, with 2 PCI-E x16 slots, 6 SATA ports, cheap DDR2 1066 RAM and onboard graphics. Honestly I had barely even heard of ASRock but based on customer reviews it sounds like a pretty darn solid motherboard. I could even put an AM3 CPU in here but I am a bit wary of how well that would work.
  • I chose the Phenom 2.3GHz CPU as it was quad core, a great price and 95W.
  • Same G.Skill RAM DDR2 1066 as I had chosen before

Current Primary Gaming PC - Older Build, 2007

  • This was included so that people could see what the NAS would have for a CPU, motherboard and RAM if I decided to upgrade my primary PC. The main disadvantage of the motherboard for the NAS is that it only has 1 PCI-E x16 slot, plus 3 PCI-E x1 slots and 3 regular PCI slots. It does have 8 SATA ports.
  • The CPU is a bit overkill for a NAS but, if I am upgrading to a Core i7, I may as well use it for the NAS
  • I actually have 4GB of this RAM: 4x 1GB DDR2 1066 sticks. For whatever reason I couldn't get the system to run stably at DDR2 1066 and had to clock the RAM down to 800MHz, which stopped the random lockups I was getting. This should be plenty of RAM for a NAS.

I should mention that I did spec out Ockie's CPU/motherboard/RAM combination and it came to $1,325 or so with my other parts; really not that much less than the options I have listed above.

A few notes: I know the fans may not be right. They were just a placeholder for value until I find the right ones (or I will go with them if they are quiet enough). Also, the overall system cost will be a bit higher as I want to buy the Kendall Howard rack as well for $338 (unless I can find a cheaper local 12U rack, which I haven't been able to so far). As well, I know that I don't have a graphics card listed for any of the systems. I didn't pick one out as I will either use onboard graphics or pick out the cheapest PCI/PCI-E x1 card I can find.

Overall, I am almost leaning toward the AM2+/2 ASRock system just based on pure overall value. However, it is tempting to get any of the systems using DDR3 as DDR2 is on the way out and DDR3 is now affordable. Obviously my X58 "plan" is the most expensive, but I get a NAS and a new primary system out of the deal. Still, that pushes the overall plan easily into $2,000 territory, over double what I planned to spend. I planned to spend $1,000, fully expecting that it would end up around $1,500, but $2,000 is a little steep for so few drives.

Anyhow, I will be interested to hear what everyone thinks of my build options! It was both fun and difficult pricing out items that were affordable, (sort of) future-proof and well featured.
 
If you're going to go with a WHS route, you really don't need to get a fancy dual core or i7 processor for the server, a low-power, low-energy, and low-heat 45W AMD processor would do you well, too.
 

This is true. However, I like the idea of being flexible enough to move to software RAID5/6 on Server 2008, or to run the NAS as a VM server in the future if WHS doesn't meet all my needs. I would feel slightly nervous about building a min-spec box around an OS platform that I have never used. That is why I specced up a bit from the Galaxy build to add some flexibility. But yeah, it may be better to use a lower-power 45W CPU to save on power. If I stick with a decent motherboard (AM2/2+ or 3) with AMD, I can always trade up in CPUs if I need to someday, I suppose.
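Since the main draw of RAID5 over my current Syncback mirroring is reclaiming usable space, here's a quick back-of-the-envelope Python sketch of the capacity math (the 20-drive/1.5TB numbers are just illustrative, not my actual shopping list):

```python
def usable_capacity(num_drives, drive_tb, scheme):
    """Rough usable capacity (TB) for a few redundancy schemes.

    mirror: half the drives hold copies (my current Syncback setup).
    raid5:  one drive's worth of capacity goes to parity.
    raid6:  two drives' worth of capacity go to parity.
    """
    if scheme == "mirror":
        return (num_drives // 2) * drive_tb
    if scheme == "raid5":
        return (num_drives - 1) * drive_tb
    if scheme == "raid6":
        return (num_drives - 2) * drive_tb
    raise ValueError(f"unknown scheme: {scheme}")

# With 20 x 1.5TB drives:
for scheme in ("mirror", "raid5", "raid6"):
    print(scheme, usable_capacity(20, 1.5, scheme), "TB")
```

So with the same 20 drives, RAID5 nearly doubles the usable space versus mirroring, at the cost of only surviving a single drive failure (RAID6 gives up one more drive's worth of space to survive two).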
 
I really liked the idea of upgrading your workstation and moving the old parts over to the server. Yeah, you want some specialized stuff for the server (case, PSU, controller card) but for the most part file servers don't do a whole lot, so why not put your new toys somewhere you'd use them more. Like as your main desktop :)
 

Yeah I must admit that this option is the most tempting as it's like a 2-for-1 deal.

So based on some of the feedback I have narrowed it down to three motherboard/CPU/RAM choices and tried to adjust them based on power consumption/value. I am enjoying posting my thought process on the board; it will give me a sort of "work log" to look back on when I want to know what decisions I made and why. The primary two choices are:

NAS With AMD AM2+/2 ASRock A780GXE/128M 780G 4GB DDR2 800 - $1,369.81

  • I changed this choice a bit to both A) Lower the cost and B) reduce power consumption.
  • This build is now using a 5050e 65W CPU and DDR2 800 RAM (you can't use DDR2 1066 with an AM2 CPU on this board; that's AM2+ only). Technically I could someday use an AM3 CPU (though of course without DDR3).
  • I still like that this board is pretty full featured for a low price. Up to DDR2 1066 support, 6 SATA ports, 2 PCI-E x16 2.0, 2 PCI, 1 PCI-E x1.
  • It's a big pro that it has onboard video so I don't have to waste a slot on that, whereas I would on the LGA775 board.

NAS With Intel C2D GIGABYTE GA-EP45T-UD3P LGA 775 P45 4GB DDR3 1333 - $1,555.81

  • I decided that ultimately I liked this choice because it was middle of the road in terms of pricing and supports fairly recent Intel technology (DDR3, LGA775)
  • The motherboard has the pros of: DDR3 RAM support, 2x PCI-E x16 2.0 slots, 2 PCI slots, 3 PCI-E x1 slots and 8 SATA ports (two more than the previous motherboard)
  • This choice would require buying a graphics card versus the AM2 solution that has onboard
  • I considered swapping a cheaper CPU into this build (the current CPU is a 65W part) but ultimately I decided that didn't make much sense. There really aren't any cheaper CPUs that run at a 1333MHz FSB. This CPU is a bit overkill for what will initially be a file server but it seemed to "fit" the motherboard the best.
  • I also toyed with the idea of a cheaper RAM type (DDR3 1066 etc.) but I decided that it was best to get CPU, RAM and motherboard FSB values that all match up, i.e. a CPU with a 1333 FSB, a motherboard with 1333 FSB support and DDR3 1333MHz RAM.
  • This motherboard also has dual NICs so if I ever decide to try teaming I could, whereas with the previous choice I could not.
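For anyone curious why I care about matching FSB and RAM speeds, here's a rough theoretical-peak bandwidth sketch (real-world throughput is of course lower; this is just the standard 64-bit-bus arithmetic):

```python
def peak_bandwidth_gbs(megatransfers_per_sec, bus_width_bits=64):
    """Theoretical peak bandwidth in GB/s for a 64-bit bus
    at the given transfer rate (in millions of transfers/sec)."""
    bytes_per_transfer = bus_width_bits / 8
    return megatransfers_per_sec * 1e6 * bytes_per_transfer / 1e9

fsb = peak_bandwidth_gbs(1333)        # 1333MHz (quad-pumped) front-side bus
ddr3_1333 = peak_bandwidth_gbs(1333)  # one DDR3-1333 channel
print(f"FSB 1333: {fsb:.1f} GB/s, DDR3-1333 single channel: {ddr3_1333:.1f} GB/s")
```

In other words, a single DDR3-1333 channel already roughly matches the 1333 FSB's peak, so on this platform paying for faster RAM than the FSB can feed mostly goes to waste.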

The third choice I still haven't completely ruled out is upgrading my primary workstation detailed here (I will paste what I had before so my 3 primary choices are in one post). I am honestly leaning away from this choice based purely on cost and that it doesn't produce an 'ideal' NAS but still, new hardware for my primary workstation would be fun to play with:


NAS With Intel Core i7 EVGA 132-BL-E758-TR LGA 1366 Intel X58 6GB DDR3 1600 - $1,867.81


  • This needs slightly more explanation. If I employed this plan I wouldn't use the motherboard/RAM/CPU in this build for the NAS. I would take these and put them in my current primary gaming PC and take the motherboard/RAM/CPU out of that. That is why this "build" is the most expensive; I would be building a NAS and upgrading my current system. I also chose parts here based MOSTLY on performance/reviews rather than low price.
  • This motherboard was chosen mostly as it was an AFFORDABLE X58 board (there aren't many) and it was an eVGA. Triple-channel DDR3 RAM support, tri-video card support, 9 SATA ports, Core i7; what else could a gamer want?
  • The i7 920 Nehalem 2.66GHz was chosen as I could not justify the cost of the other Core i7 CPUs, even for my main gaming box.
  • OCZ 6GB of RAM (triple channel) was chosen for its darn good performance, stability and very competitive pricing
  • Overall this system option is pricier than I wanted it to be but it is still appealing. However, you get the most gaming performance out of upgrading a GPU anyhow, so I probably wouldn't see a whole lot of real-world gains from this (though it would be fun).

Current Primary Gaming PC - Older Build, 2007

  • This was included so that people could see what the NAS would have for a CPU, motherboard and RAM if I decided to upgrade my primary PC. The main disadvantage of the motherboard for the NAS is that it only has 1 PCI-E x16 slot, plus 3 PCI-E x1 slots and 3 PCI slots. It does have 8 SATA ports.
  • The Q6600 CPU is a bit overkill for a NAS but, if I am upgrading to a Core i7, I may as well use it for the NAS
  • I actually have 4GB of this RAM: 4x 1GB DDR2 1066 sticks running at 800MHz.

--------------------------------------------------------------------------------------------------------------------------------

Ultimately, I feel like it's between the AM2 board and the LGA775 option. The cost difference between the two is fairly negligible, so it's really the feature-to-feature comparisons that matter here. I am honestly a bit torn between the two; they trade pros and cons in my mind.

The wildcard option of upgrading my current workstation and using the old parts to build a NAS, while extremely tempting, is probably not what I will go with, based purely on cost. None of these builds factors in rack costs yet, which I will tackle in my next post (along with some questions about fans and rails for the Norco 4050).
 
I'd either go with the AMD setup, or convert your current box and upgrade the gaming rig.

I'm in the process of converting my 780G into a media server via WHS and the MV8, and in time ripping it out of my Lian-Li and putting it in a Norco. The response I've gotten is that it should work just fine for streaming Blu-rays. So as others have said, don't just throw cash at a badass processor/RAM; rather, buy more drives.
 

Those Supermicro controllers will work in your PCI slots. And you can always look at 16+ port RAID cards that will go into your single PCI-E x16 slot.
 
Well, I am still thinking about the motherboard choice but it really does seem to be between the AMD setup or upgrading my primary workstation to Core i7 and using my old primary workstation for the NAS (I wish this wasn't such a pricey option). Perhaps I will have to figure out some way to justify the cost! Either way I appreciate the feedback on my final 3 system choices; more is always welcome.

So, with my motherboard/CPU/RAM choices etc almost finished I am considering some of the rack/case accessories as well and I could use some assistance. So let me start with fans.

FANS

Everyone recommended the Panaflo L1A's for the Norco 4020 case to quiet it down. However, I couldn't find these on Newegg's site; all I could find was:

Panaflo search results

So, based on those search results I had chosen 7 of these to incorporate into my build:

Rexus NMB-MAT (Panaflo) 80mm Case Fan

They sounded quieter than the Norco 4020 stock case fans and were affordable. Anyone have better suggestions or is this fine?

Yes, I am aware I could order fans from somewhere other than Newegg but I like dealing with their customer support and their prices are USUALLY extremely competitive.

--------------------------------------------------------------------------------------------------------

RAILS

I think this should be a simple question. I found one set of rails I think will work with the Norco 4020 and I believe should work with pretty much any rack:

Rosewill RSV-R26 26" 3-sections Ball Bearing Sliding Rail kit for rackmount chassis

They looked affordable and not terrible for the Norco 4020. Any other suggestions for 4020 rails/screws/etc?

--------------------------------------------------------------------------------------------------------

RACKS

Sadly, I was unable to find anything nearby on Craigslist or E-bay. However, I have found a few good options; I think one is probably ideal.

HUNTEC CLM-2168-15B 15U Built

The best part about it: $250, 15U AND it comes from a local company. I could literally drive over and pick it up without having to pay shipping. Obviously I would still have to pay the lame WA sales tax but that would be less than shipping. I believe it would fit the Norco 4020 case just fine.

Kendall Howard 12U Without Door Compact Series SOHO Server Rack

Very nice looking 12U rack for $338.20, however, I would pay $98 in shipping. Looks very sturdy and I believe would fit the Norco 4020.

iStarUSA WO15AB 15U 4 Post Open Frame Rack

Pretty plain-looking rack but it is 15U for $269 and isn't that heavy (which I like). I also like that it has wheels (as all of them do, for mobility). However, even combined with my other Newegg parts it would cost me $150 in shipping as a large item. Ouch.

StarTech 4POSTRACK25 DuraRak 25U 4 Post Open Rack

I don't know that I want to go this big as I don't exactly need that much rack space, but I thought I would throw it into the mix for consideration given that it is $339 for 25U. It looks sturdy as hell but I have a feeling I won't go for this initially. I think I want to stay quite a bit smaller/more compact to begin with (something that can fit in a closet, since I don't have a garage).

----------------------------------------------------------------------------------------------------------------------------------

Ultimately, I think the HDNW rack is the most appealing. It's a bit bigger than I was initially shooting for (8-12U) but it's a pretty fantastic price for a new rack of that size, with no need for shipping (I could pick it up on a Saturday in 30 minutes). I am a bit worried about whether the Norco 4020 would fit properly in it, but I believe that it would.

As usual any feedback on this would be appreciated! :)
 
Don't always look towards Newegg for the best prices; I bought my Panaflo L1As from Directron.com.

Sometimes Newegg just has horrible pricing for peripherals (cables, fans, little stuff) because of tax (I live in CA) as well as their shipping charge for ONE SATA cable. :mad:

For that, there's Monoprice, Directron, and other stores that you do some hunting on Google Shopping or wherever.
 
@pirivan: You in the Seattle area? So am I.. I have an extra 14U Hardigg case I'd sell you for less than any of the options you have now. PM me and I'll give you my contact info.

Oh yeah, I also have some other smaller racks I'd let go for cheap. Like an open 20U and a full 42U I think.
 
@Croakz PM sent

@Syntax Error

Thanks for the feedback. I don't think I will go with Newegg for the Panaflo L1A's; they don't appear to sell them. I was thinking of getting them from:

TechSunny $6.00 Panaflo L1A

Never heard of the company but the price is right for 7 decent quiet fans. As for cabling, you bring up a really excellent point. Is this going to be an issue? What length SATA cables should I get for the Norco 4020 case? Also, do I need longer power cables etc. for the Corsair 850W PSU I have been using in my builds? I hadn't really put much thought into how much extra internal cabling might be necessary.

As a side note, I still haven't decided what to purchase. I'm even now considering doing a 3-way 'bump': upgrading my primary system, bumping that down to my HTPC and then bumping the HTPC components into the NAS (that motherboard is an AM2 with DDR2 800, 2x PCI-E x16 and 6 SATA ports). Either way it's clear that I like making things more complicated for myself. :)
 
pirivan - this was a very informative thread as I am trying to do something very similar.

Have you been able to confirm that the fans are the right ones?

I have an ATX motherboard, P4 and some RAM lying around so I am going to order the same server case and get started with WHS to see how well it works. It also will give me a chance to evaluate the fans that come with the case.
 

Hi Sphinx99,

I have the fans but have not yet received the case (tomorrow). As I am migrating 3 sets of components I probably won't touch the Norco 4020 until the weekend at the earliest. However, I will probably check to see if the fans fit; I expect that they will (I ordered the Panaflo L1A's from TechSunny). The fans came in a pretty random unpackaged box but if they work right, who cares, right?

As I decided in the end to do a 3-part build (new components into the main PC, old main PC components into the HTPC, HTPC components into the NAS), the only parts I didn't order were power cables and SATA cables. Since I am doing the other two migrations first, I will have a little bit of time to physically look at the case and the components in it before I decide what type and length of power/SATA cabling I should buy for it. I will probably either order it all at once from Newegg or head on down to Fry's once I decide.

I've written up a little migration plan for each PC to make sure everything is backed up. I've only covered the migration of the new components into the primary PC and it's already up to a few pages. This is my first time using Acronis Universal Restore to restore OS images between different hardware, so I wanted to be sure I was prepared to fully rebuild the machine and restore all my application configurations from backups if I have to start from scratch. Really though, the main PC is simply a test of the Universal Restore functionality; all it has on it is the OS and some games, essentially.

The PC I really don't want to rebuild is the HTPC. That has a fully configured MediaPortal install on it and THAT would be a pain to set up again. Even with all the configs and databases backed up, I would still have to set up the codecs again and probably a few other 'tweaks' to get it just right. So I am hoping Acronis Universal Restore will let me migrate from an AMD setup to an Intel one. At the very least it will all be very interesting. Then I get to start planning the NAS build, which, aside from making sure all my data stays intact when migrating drives that already have data on them, should be a from-scratch setup; so quite a bit easier.

Anyhow, this was far more than an answer to your question but for my own sake I felt like updating this thread a bit with where I am currently in the project for my own future reference perhaps (and of course in case anyone cares).
 
Not at all, it was helpful. Again, I am doing some very similar things to you.

I have an old PC I built some years ago (IC7-G motherboard, P4/2.8, 2GB RAM, working hard drive) in a case that's fallen to pieces, so my plan is to order the same Norco case, move those components into it and set up WHS with some drives I have lying around (400GB, 500GB, a couple of 72GB, 80GB) to see how well it performs. I use laptops for most of my work, so the desktop has fallen into disrepair, and this is a chance to salvage some of it. It's also an excuse to build a new Core i7 workstation, which I do need.

The only thing holding me back is a desire for the workstation to have fast local block-level storage (for database work, which is what it's for), and WHS doesn't serve out storage suitable for high-performance RDBMS work. Ideally I would love a storage system that lets me expose a slice of disk out to a workstation via eSATA or some such, a la LUN presentation, but I don't think such things are possible at these price points. Having accepted that, I plan to go ahead with the WHS solution once I'm comfortable with my ability to recover it from failure.
 