I'm jealous; I've been scoping 10GbE, but honestly can't justify the cost for my house. Hoping it starts coming down in price, because it's my biggest bottleneck at the moment. I can get 1 GB/s from my drives, but 1 Gb/s of network throughput only nets me about 100 MB/s. I do have quad Ethernet ports but only a 4-port router, so I can't bond all of them, and even if I did I'd still only have 1 Gb/s to a single endpoint. 750 MB/s for 4 drives is good. I have 6 drives and was going to do 8, but 2 were from different manufacturers and slightly slower. Sadly my server's onboard controller is SATA II, so my SSD only gets about 230 MB/s, but it has great random I/O. I put things like my Plex transcoding on the SSD, large movies/files on the RAID, etc. I have backups of my programming stuff, and for all my movies I still have the original discs, so nothing is critical in any fashion. Like I said, it was mostly just to play and learn, and... why not? I have dual Xeons with 96GB of RAM; the least I can do is use a fast RAID.

I use four 8TB USB pulls in RAIDZ1 (RAID5) to back up four 6TB IronWolfs in RAID10, still on ZFS...
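A quick back-of-envelope on the 1 Gb/s vs ~100 MB/s numbers above: divide the line rate by 8 to go from bits to bytes, then knock off a few percent for Ethernet/IP/TCP framing. The 0.94 efficiency figure here is an assumption for standard 1500-byte MTU frames, not a measurement.

```python
# Rough arithmetic for why a 1 Gb/s link tops out well under 125 MB/s in
# practice. The efficiency factor is a ballpark assumption, not measured.

def link_payload_mb_per_s(line_rate_gbps: float, efficiency: float = 0.94) -> float:
    """Convert a line rate in Gb/s to approximate usable payload in MB/s.

    efficiency ~0.94 approximates Ethernet framing plus IP/TCP header
    overhead at 1500-byte MTU (assumed).
    """
    raw_mb_per_s = line_rate_gbps * 1000 / 8  # Gb/s -> MB/s (bits to bytes)
    return raw_mb_per_s * efficiency

print(link_payload_mb_per_s(1.0))   # ~117 MB/s usable on gigabit
print(link_payload_mb_per_s(10.0))  # ~1175 MB/s on 10GbE
```

The observed ~100 MB/s sits a bit below the ~117 MB/s theoretical payload, which is typical once SMB/NFS protocol overhead and drive latency are in the mix.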
Still getting ~750MB/s sequential across 10GbE. I had planned to add a few drives to that just to get performance up to 1GB/s, but I simply have no use for the space (yet).
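The ~750 MB/s figure and the "add a few drives" plan line up with simple RAIDZ1 scaling: one drive's worth of capacity goes to parity, so sequential throughput grows roughly with (n - 1) drives. A sketch, assuming ~250 MB/s per drive (illustrative, not the author's measured per-drive number):

```python
# Back-of-envelope RAIDZ1 sequential throughput: data is striped across
# n-1 of the n drives (one drive's worth of parity), so sequential reads
# scale roughly with (n - 1) * per-drive speed. Numbers are illustrative.

def raidz1_seq_mb_per_s(n_drives: int, per_drive_mb_per_s: float) -> float:
    return (n_drives - 1) * per_drive_mb_per_s

# Four drives at ~250 MB/s each lines up with the ~750 MB/s seen above:
print(raidz1_seq_mb_per_s(4, 250))  # 750
# One more drive would hit the ~1 GB/s target:
print(raidz1_seq_mb_per_s(5, 250))  # 1000
```

So a single extra drive would, on paper, close the gap to 1 GB/s, though real-world scaling is rarely perfectly linear.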
I don't think hardware controllers would be helpful in my case, though at work we use them for two-drive mirrors... ...because...
It doesn't sound like you'd gain much from hardware RAID. Me, I'm looking into a larger rackmount chassis that can hold a bunch of drives, and switching to an external RAID card. It wouldn't make sense to do all that and then run to each disk individually just to use software RAID. The upgraded card I'm scoping also has battery backup, faster I/O, and 1GB of onboard RAM. Seems a waste not to use any of it.