Hello,
I'm interested in running something between my server and my main PC to accelerate network transfer speed. I often move large files, and while gigabit works, I'd like something faster: the RAID 6 arrays and my SSD can sustain 370 MB/s or more reading and writing large files. 10G Ethernet was my original thought, but a 10G switch is still very expensive (and noisy) at this point. A lot of my searching has landed me on SFP+ DAC, but that has a maximum length of 10 m, and taking the most direct path (back of PC to ceiling, straight line over the server, straight shot down) I'm looking at about 9.95 m (roughly 32'-8"). That is cutting things way too close for my liking, and it would put the cable right against the drop-ceiling track.
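For reference, here's the quick arithmetic I used to convince myself gigabit is the real bottleneck (a minimal sketch; the ~94% goodput factor is my rough assumption for TCP/IP-over-Ethernet overhead):

```python
# Quick sanity check: how long does a 100 GB transfer take on each link?
DISK_MBPS = 370  # measured RAID 6 / SSD sequential throughput, MB/s

def transfer_seconds(file_gb, link_gbps, efficiency=0.94):
    """Time to move file_gb gigabytes, capped by whichever is slower:
    the link's goodput or what the disks can deliver."""
    link_mbps = link_gbps * 1000 / 8 * efficiency  # link goodput, MB/s
    effective = min(link_mbps, DISK_MBPS)
    return file_gb * 1000 / effective

for gbps in (1, 10):
    print(f"{gbps:>2} Gb/s link: {transfer_seconds(100, gbps):4.0f} s per 100 GB")
# 1 Gb/s:  ~851 s (link-bound at ~117 MB/s)
# 10 Gb/s: ~270 s (disk-bound at 370 MB/s)
```

So gigabit leaves roughly two thirds of the disks' speed on the table, while on 10G the disks themselves become the limit.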
So I'm looking for some advice on how best to get this connection going. I'll present a few options below. There's also a lot of conflicting information on which cards to get, so I'm sure there's a more up-to-date suggestion out there. I've been looking at new equipment, but I'm not opposed to considering used gear on eBay and the like.
1) Buy a 10-gigabit Ethernet PCI Express card for each PC (two cards total), and run a 30' Cat 6a cable between the server and the patch panel and a 7' Cat 6a cable between the patch panel and my PC. It looks like I can pick up the cards for $100 each, and the cables will run me about $20. I'm only finding Cat 6 keystone jacks, though; maybe that's fine for a run this short?
2) Inspired by this fiber keystone jack I saw at Monoprice (https://www.monoprice.com/product?c_id=104&cp_id=10426&cs_id=1042606&p_id=2875&seq=1&format=2), maybe I should run my own fiber between the two machines to approximate a DAC? This would also have the nice benefit of leaving the door open for a switch later, although I don't foresee ever doing that. Monoprice sells fiber in various colors pretty affordably (I think?), but the colors correspond to different fiber types and I'm a bit lost there. I would need two SFP+ PCI Express cards, a 10 m run of fiber, the keystone jack, a 3 m run of fiber, and two SFP+ transceiver modules so I can connect the fiber to the cards. The transceivers seem pretty expensive, but I probably just have no idea what I'm looking at. It doesn't have to be Monoprice; that was just the reference I was using.
3) I could raise the server and go the DAC route anyway. I would have to buy or fabricate some kind of support for it, and I'd still be cutting it close on length. I'd also be making a new hole in the ceiling directly over my PC. Not my top choice, but cost-wise this may make the most sense: I'd only need the two cards and the cable. (A rough length-budget comparison of all three options is sketched below.)
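And to keep the distance question straight, here's how the three runs stack up against the usual spec limits (a rough sketch; the limits are the commonly quoted numbers: 100 m for a 10GBASE-T channel over Cat 6a, 300 m for 10GBASE-SR over OM3 multimode, which is my guess at the fiber type for option 2, and 10 m for a passive SFP+ DAC):

```python
# Rough headroom check: run length vs. the usual maximum for each option.
FT_PER_M = 3.28084

runs = {  # option: (run length in m, commonly quoted limit in m)
    "1) Cat 6a copper, 30 ft + 7 ft": (37 / FT_PER_M, 100.0),
    "2) OM3 fiber, 10 m + 3 m":       (13.0,          300.0),
    "3) SFP+ DAC, direct":            (9.95,           10.0),
}

for option, (run_m, limit_m) in runs.items():
    print(f"{option}: {run_m:5.2f} m of {limit_m:3.0f} m max "
          f"({run_m / limit_m:5.1%} of budget)")
```

Even allowing for the keystone jacks eating into the copper or fiber budgets a little, options 1 and 2 have enormous headroom; only the DAC sits at 99.5% of its limit, which is exactly what worries me.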
Thank you for reading. I'm open to comments and suggestions, including methods I haven't explored here.