Steam library on NAS?

/dev/null · [H]F Junkie · Joined Mar 31, 2001 · 15,182 messages
Hey guys,

I keep running out of disk space on my local Win10 machine for my Steam library. I've got a recycled HP DL380 G6 with 12 x 3TB SAS drives, a 400GB SLC cache, and a 10Gbit adapter running FreeNAS. I've traditionally only used this for VM backups, but I've been thinking of carving out a 1-4TB slice and serving up an SMB share for my Win10 box.

Rather than running locally on SSDs, do you think there is going to be any noticeable difference between running it off a network drive over 10Gbit fiber vs. a local SSD? I have an older 1TB NVMe SSD (a PM951, I think) that seems to max out at sub-1GB/s reads.
 
I recently added another NVMe drive for my Steam library due to space, but I'm curious whether a NAS solution would be beneficial in the long run. I'll keep an eye on your thread to see how it works out if you try it.
 
Not an expert, but in my experience in enterprise environments, any NAS is always slower than a local HDD, let alone an SSD.
 
There is only one sure way to find out......... :)

On paper, you shouldn't see much of a difference as long as the following are true:

1) The drives in the SAN/NAS are able to match the speed of your local storage option
2) You limit the work the SAN/NAS is doing for other systems during game sessions
3) Your 10Gbit switch actually has the capacity to switch at full line rate
4) Your 10Gbit network adapters are good ones that can also keep up at full 10Gbit line rate

I really like the idea!
 
Try carving out an iSCSI volume and serving it as a block store directly. It should have less latency than SMB, since it avoids the extra protocol overhead and has less abstraction. A separate, isolated VLAN for the iSCSI traffic should help as well.
 

I think that is too complex for what I want to do. I don't really want to add iscsi onto the nas.

Any benchmarks you guys want me to run?
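If it helps, here's a quick Python sketch for a sequential-read comparison, share vs. local SSD (the demo writes a throwaway temp file; you'd point `path` at a big file on each drive — note the OS page cache will inflate repeat runs, so use files bigger than RAM for honest numbers):

```python
import os, tempfile, time

def measure_seq_read(path, block_size=1024 * 1024):
    """Sequentially read `path` in 1 MB chunks and return throughput in MB/s."""
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:   # buffering=0 skips Python-level caching
        while f.read(block_size):
            pass
    elapsed = time.perf_counter() - start
    return size / (1024 * 1024) / elapsed

# Demo on a throwaway 64 MB temp file; swap `path` for a large file on
# the SMB share, then the local SSD, and compare the two numbers.
with tempfile.NamedTemporaryFile(delete=False) as tf:
    tf.write(b"\0" * (64 * 1024 * 1024))
    path = tf.name
print(f"{measure_seq_read(path):.0f} MB/s")
os.remove(path)
```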

So I copied some files there initially, and I was seeing (on the large files at least) around 870 MB/s (megabytes, not megabits) on transfers, so a bit less than 7 Gbit/s.
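For reference, the unit conversion behind that number (870 is the observed figure from above, nothing else assumed):

```python
mb_per_s = 870                      # observed large-file transfer rate, MB/s
gbit_per_s = mb_per_s * 8 / 1000    # 8 bits per byte, 1000 Mbit per Gbit
print(f"{gbit_per_s:.2f} Gbit/s")   # -> 6.96 Gbit/s
```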
 

1) I can get ZFS scrubs at > 1GB/s, so I think I should be OK here.
2) Backups only run while I'm sleeping, so it should be idle when I'm playing.
3) I have a CRS317, which is ASIC-based and switches at line rate.
4) I have a Mellanox ConnectX-2... so not the best, but I could probably spend $30 to pick up a ConnectX-3 if this becomes an issue.

The only big weakness I see is that I'm pretty much out of PCIe lanes on my board with two GTX 1070s. The "x16" slot for the 10Gbit card is PCIe 2.0 with just 4 lanes.
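Back-of-envelope check on that slot, assuming roughly 500 MB/s of usable bandwidth per PCIe 2.0 lane (after 8b/10b encoding overhead):

```python
lanes = 4
per_lane_mb = 500               # PCIe 2.0: ~500 MB/s usable per lane, each direction
slot_mb = lanes * per_lane_mb   # -> 2000 MB/s for the x4 slot
nic_mb = 10_000 / 8             # 10 Gbit/s line rate -> 1250 MB/s
print(slot_mb >= nic_mb)        # -> True: the x4 slot shouldn't bottleneck the NIC
```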
 
The main issue is that you'd be stacking TCP/IP latency on top of OS/filesystem latency and drive latency, versus a local SSD. iSCSI would limit some of that, and since I already have a 10Gbps setup with a 4 x 6TB 7200RPM mirror, I may look into doing this myself.

Not sure how complicated it is to share iSCSI from FreeNAS 11.2 to Windows 10 Pro, though.
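To put rough numbers on that stacking, here's a toy model of a small random read on each path. Every figure below is an assumption for illustration, not a measurement from my setup:

```python
# All latencies in microseconds; assumed ballpark figures, not measurements.
local_nvme_us = 100        # assumed NVMe 4K random read
local_sata_ssd_us = 250    # assumed SATA SSD 4K random read
net_rtt_us = 100           # assumed 10GbE round trip incl. switch
smb_overhead_us = 200      # assumed SMB protocol/processing cost
iscsi_overhead_us = 50     # assumed lighter block-protocol cost
hdd_seek_us = 8000         # assumed 7200 RPM seek + rotational latency

smb_spinner = net_rtt_us + smb_overhead_us + hdd_seek_us
iscsi_ssd = net_rtt_us + iscsi_overhead_us + local_sata_ssd_us
for name, us in [("local NVMe", local_nvme_us),
                 ("iSCSI -> SATA SSD", iscsi_ssd),
                 ("SMB -> spinner", smb_spinner)]:
    print(f"{name:18s} ~{us / 1000:.1f} ms per small read")
```

The takeaway from the toy numbers: the network adds a fixed cost, but the spinner's seek time dominates everything else, which matches the "iSCSI + SSD felt fine" reports in this thread.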
 
I understand that, and it makes sense... however, even if it's higher latency, is it going to be noticeable? If it's 3% slower, I don't care. If everything takes 50% longer to load, I'm going back to SSD.
 

That's going to be game dependent. I load random stuff to a spinner, AAA stuff to an SSD, and really don't worry about load times. At worst, be specific about what you put where. One example of terrible spinner performance is the Battlefield series.
 
You're going to add network latency, plus SMB file-share overhead on top of it, plus spinning disks... compared to a local SSD.

Yeah, you'll probably notice a difference, and I doubt it's only 3%. Maybe not in sequential read benchmarks, but in real-life use you're going to feel it.

If you can do an iSCSI mount to the box over 10Gbit, I'd look into that. Much less overhead.
 
Well, I have moved Metro 2033 Redux...don't notice any speed difference, yet. So far so good :)
 
So I've been running a test: a pair of 500GB SATA SSDs across the 10GBase-T link (Aquantia NICs and an HP switch), shared using iSCSI and mounted on the workstation. I can't tell the difference in terms of load times.
 