The Goal: Redundant, safe, fast storage for my wife's wedding photography clients' files. I would prefer to connect her machine to the SAN/NAS via InfiniBand or 10GbE.
The previous/current setup: 2TB disks spread across various workstations in the house, so that after every wedding the data/photos can be dropped onto at least 2 different drives/locations. Usually within a week or two the data/photos get backed up to LTO3 tape and Blu-ray disc. This makes file management a slight chore, and it's a little more difficult to ensure proper, safe duplication. It also offers no speed improvement.
The "idea": The original idea was to build an ESX/ZFS all-in-one box using an LSI SAS card I already had, an HP SAS expander I got cheaply, and hardware I had lying around (a Core 2 Quad on an ASUS board). Then I read that the advantages ZFS gives are muted without proper ECC RAM ("pointless" was the term I saw). So I started trying to find a decent Xeon setup with ECC RAM that wasn't too expensive or power-hungry. I ended up stumbling on a dual-CPU Supermicro board with 7 PCIe slots and 12 RAM slots, which seemed like an awesome setup to me ($150 board, $50 L5520 CPUs, and 24GB of RDIMMs on 6 sticks). So I got ESX and Nexenta running, but it seems Nexenta isn't as friendly with the VM NICs as I need it to be (it's kind of slow), and worse, it has an 18TB *RAW* limit, meaning a 9TB limit for a RAID10 setup. Nexenta also doesn't seem to play nicely with SAS expanders that have SATA disks on the other end (which IMHO limits the usefulness of an expander quite a lot). OpenIndiana is next on the list to try.
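To make the capacity math above concrete, here's a minimal sketch of why an 18TB raw cap translates to roughly 9TB usable under a mirrored (RAID10-style) layout. This assumes the limit applies to raw disk capacity and ignores ZFS metadata/overhead, so the numbers are approximate; the function name is just for illustration.

```python
RAW_LIMIT_TB = 18  # Nexenta Community Edition raw-capacity cap, as described above

def usable_tb(raw_tb: float, layout: str) -> float:
    """Rough usable space for a given raw capacity, ignoring ZFS overhead."""
    if layout == "mirror":   # RAID10-style: every byte is stored twice
        return raw_tb / 2
    if layout == "stripe":   # no redundancy: all raw space is usable
        return raw_tb
    raise ValueError(f"unknown layout: {layout}")

print(usable_tb(RAW_LIMIT_TB, "mirror"))  # 9.0
```

The takeaway: any mirrored pool built under that cap tops out at half the raw number, which is what makes the limit feel so tight for a growing photo archive.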
But here is where I'm at: I'm tired of all of it. I've ended up spending way more than I should have (my fault, obviously) and have come to the conclusion that this setup would be so stupidly complex that any problem will require extensive troubleshooting from me, which can be a real issue if my illness is acting up. And if it cuts off access to her files, it disrupts her business and our income, so I don't want that either.
I'm back to considering a hardware RAID card so I can just plug and play on a Windows box, share out some directories, and go. But after 5 hours of digging tonight I've rediscovered why I originally started looking at ZFS: TLER and other idiotic drive issues.
Help!! I'm very close to going on a reverse eBay spree, clearing out the stupid rack, and going back to the single-drive/multiple-machines + tape + disc solution and giving up. It stopped being fun a month or two ago.
(Just as an FYI) The background: I have 15 years of IT experience but have been out of the field for 5 due to a disability. I've used RAID, VMware, and Server 03/08/08R2, etc., a fair amount in the past (my work experience stopped at ESX 3.5 and Server 2003).