10 GbE starting to become reasonable for home?

Noticed today that switches with 10 GbE ports are getting quite cheap on eBay. Is this a sign that prices are finally starting to move, or are these switches junk: lacking features, or not usable for attaching a server or two and playing with the speeds?

I'm mostly seeing HP 2900s, but there are others too.

edit: maybe those are just rear stacking ports? The front ports must just be 1 GbE SFP?
 
There are even quite affordable motherboards from Asus and ASRock with integrated 10 Gb/s interfaces, so the technology for local networks is available. Filling the 10 Gb/s pipe with data seems to be the bigger issue.
 
Filling the 10 Gb/s pipe with data seems to be the bigger issue.

Yup.

I run a 10 GbE backbone (SAN, file servers, backup servers, ESXi) at work, and 90% of the time it does nothing that 1 GbE can't do. It really takes a lot of things happening at once to fully utilize that kind of bandwidth. Many of my bare-metal servers still run on aggregated 1 GbE because they can't really use 10 GbE effectively; it's usually the disk subsystem that can't keep up with the writes for long.

Now with a multi-member SAN or SSD storage on a beefy controller, you can very effectively use 10GbE, but it has to be end to end on the server hardware side. A weak server on one side will slow it all down.
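
A quick way to see whether the network itself can move the bits, independent of any disk subsystem, is a memory-to-memory throughput test. iperf is the proper tool; what follows is just a rough Python sketch, and the port number is an arbitrary choice:

# throughput.py - minimal memory-to-memory network throughput test.
# Run "python3 throughput.py server" on one box, then
# "python3 throughput.py client <server-ip>" on the other.
# iperf3 does this better; the port below is arbitrary.
import socket, sys, time

PORT = 5201                    # arbitrary test port
CHUNK = 1024 * 1024            # 1 MiB per send
TOTAL = 10 * 1024**3           # push 10 GiB through the pipe

def server():
    with socket.create_server(("", PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            received, start = 0, time.time()
            while True:
                data = conn.recv(CHUNK)
                if not data:
                    break
                received += len(data)
            secs = time.time() - start
            print(f"{received / secs / 1e6:.0f} MB/s received")

def client(host):
    buf = b"\0" * CHUNK
    with socket.create_connection((host, PORT)) as conn:
        sent, start = 0, time.time()
        while sent < TOTAL:
            conn.sendall(buf)
            sent += CHUNK
        secs = time.time() - start
        print(f"{sent / secs / 1e6:.0f} MB/s sent")

if __name__ == "__main__":
    server() if sys.argv[1] == "server" else client(sys.argv[2])

If that tops out near 1,200 MB/s but your real file copies don't, the pipe is fine and the storage is the wall.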

My 2 cents. :D
 
My view on this, for my whims: if I have Server A transferring one or more 500 GB files to Server B, I want to see more than 1 GbE worth of movement. I really don't care if it sits idle most of the time. Computing to me is not about how (over)utilized a given resource is; it's about how quickly the task completes when called upon.
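
The back-of-the-envelope math makes that case well enough, assuming typical real-world throughput of roughly 110 MB/s on 1 GbE and 1,100 MB/s on 10 GbE (both assumptions, not guarantees):

# Rough transfer-time math for a 500 GB copy.
size_gb = 500
for label, mb_per_s in [("1 GbE", 110), ("10 GbE", 1100)]:
    minutes = size_gb * 1000 / mb_per_s / 60
    print(f"{label}: ~{minutes:.0f} min")
# 1 GbE:  ~76 min
# 10 GbE: ~8 min

Waiting over an hour versus under ten minutes for the same copy is exactly the difference being argued for here.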
 
Depending on the switch and the end point hardware, you may not see much improvement at all.

Those switches have 10GbE uplinks, so you will need SFP+ adapters to use those ports.
 
But won't work with Hyper-V.
 
Okay, my 2 cents:

Standard layer 2 10-gig Ethernet is indeed becoming doable. If you want one fat pipe between your storage box and main workstation, switches are available that make life very easy for you.

There is an affordable 24-port copper 1 GbE switch with four SFP+ ports from D-Link (DGS-1510-28X):
http://www.dlink.com/uk/en/support/product/dgs-1510-series-gigabit-stackable-smart-managed-switches

If you buy one of these, plus Brocade 1020 cards from eBay, you can have up to four fast connections to your regular Ethernet network. Much simpler than running a parallel InfiniBand network...

One quick order to fiberstore.com for the SFP+ modules and cables and you are all set.

Stay clear of the 10GBASE-T copper crap. SFP+ is cheaper in the end, and really no hassle. Use OM3 LC-LC patch cords.
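
Once the cards and optics are in, it's worth confirming the links actually negotiated 10 Gb/s. On Linux, "ethtool <iface>" reports it, or you can read it straight out of sysfs; the interface name below is a placeholder for whatever your 10G port is called:

# Print the negotiated link speed from Linux sysfs.
# "eth2" is a placeholder; reading speed on a down link raises OSError.
from pathlib import Path

iface = "eth2"
speed = Path(f"/sys/class/net/{iface}/speed").read_text().strip()
print(f"{iface}: {speed} Mb/s")  # expect 10000 on a healthy SFP+ link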
 
Depends on what you count as "reasonable". Fun hobbies are never cheap :).

I've been running a "10G network" at home for about a year now:

Juniper EX3300-24T - $1000, new
Intel X520-DA2 NICs in each server/desktop - $100 each, new from China
MM OM3/4 cables - maybe $50 total?, new
Optics for NICs and switch - $500 total?, new
 
Intel X520-DA2 NICs in each server/desktop - $100 each, new from China

May I ask where exactly you got them for this price? Because that's stupid cheap.

I'm building a little portable mITX box that will act as an initial-config PXE boot server for 30-200 servers at a time, so the 10G ports would help a lot.
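
For what it's worth, a box like that doesn't need much software: dnsmasq alone can serve proxy-DHCP and TFTP for PXE. A minimal sketch, assuming a pxelinux setup; the subnet, TFTP root, and boot file are placeholders to adjust:

# /etc/dnsmasq.conf - proxy-DHCP + TFTP for PXE booting
port=0                       # disable DNS, PXE only
dhcp-range=10.0.0.0,proxy    # piggyback on the existing DHCP server
dhcp-boot=pxelinux.0
enable-tftp
tftp-root=/srv/tftp
pxe-service=x86PC,"Network Boot",pxelinux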
 
You could also look at the Mellanox ConnectX-2 cards; you can get the EN 10 Gb version on eBay for around $35-50 each. This is what I have been using in my machines for the last year or two.
 
I paid:
$170 per NIC and SFP - eBay X520-SR1
$50 per genuine Cisco SFP
$1,700 for a Cisco WS-C4948E

A bit over $2,200 got me 10 Gbit on 3 systems with the possibility of a 4th. I'm extremely happy with it; I had no problem maxing out my SSD-to-disk (cache) array at almost 700 MB/s (screenshot), which works out to about 5.6 Gb/s. Even the standard 1 Gbit ports seem faster than the HP 1810G it replaced. Yeah, maybe it's too expensive, but compared to what this thing cost at retail, it's a steal.
 
I have 10 GbE between servers; the limitation is the disks.
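
Easy enough to confirm where the wall is: time a sequential write big enough to blow past the page cache. fio or dd are the usual tools; here's a crude Python sketch where the target path and size are placeholders (the size should exceed your RAM):

# seqwrite.py - crude sequential-write benchmark.
import os, time

path = "/tank/testfile"        # placeholder: put this on the array under test
chunk = b"\0" * (4 * 1024**2)  # 4 MiB writes
total = 32 * 1024**3           # 32 GiB; pick something bigger than RAM

start = time.time()
with open(path, "wb") as f:
    written = 0
    while written < total:
        f.write(chunk)
        written += len(chunk)
    f.flush()
    os.fsync(f.fileno())       # make sure it actually hit the disks
secs = time.time() - start
os.remove(path)
print(f"~{written / secs / 1e6:.0f} MB/s sustained")

Anything under roughly 1,200 MB/s sustained means the disks, not the 10 GbE link, set the ceiling.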
 