10GbE for OmniOS?

danswartz

2[H]4U
Joined
Feb 25, 2011
Messages
3,704
I have a Dell R905 off eBay. Works great, with a couple of Mellanox ConnectX-3 EN cards between it and the 2 ESXi hosts. I'm thinking of switching to OmniOS, but am having trouble coming up with known, supported 10GbE cards. I have a couple of old ConnectX-2 cards, and I know they work (especially with SRP) with Solaris distros, but the ESXi 6.0 Mellanox drivers don't seem to support SRP (dunno when they will, if ever). If I boot OmniOS, it doesn't even recognize the ConnectX-3 cards (no surprise there). Whichever way I go, I can skip a switch for the moment and connect the 2 ESXi hosts directly to the storage server. If I can't use the Mellanox cards, I'd buy 3 of something (at least one of them dual-ported, for the storage server) and go that route. I'm not sure I know enough to care whether I went RJ-45/Cat-X or with some SFP+ solution. Any thoughts/tips would be appreciated.
 

_Gea

2[H]4U
Joined
Dec 5, 2010
Messages
4,051
For me it's quite clear: the winner and the mainstream choice is 10G Ethernet, with a view to 40G.
More and more server-class mainboards arrive with 10G Ethernet onboard, together with quite affordable switches, for example from D-Link or Netgear, up to switches like the HP 5900 with 48 x 10G Base-T and 40G QSFP+ that I use for my pools. Cheaper 10G adapters are on the way.

In my own setup all buildings and floors are connected via 10G fiber, single-mode or multimode depending on distance. In my server room I have a mix of older CX4 with SFP+ and 10G Base-T copper. More and more of my Mac and PC desktop systems, and two pools with 3D, AV or publishing software, are connected via 10G Base-T to offer storage access with a performance that is comparable to a local SSD.

In my newer OmniOS systems and all desktops I use the Intel X540-T1 or T2 (dual port). They are well supported, as is the X520 with SFP+. You should stay away from the X550 and Pentium-D, as they are not yet supported. Performance-wise, I was able to achieve > 900 MB/s via SMB 2.1 to the newest OmniOS with an Intel P750; see http://napp-it.org/doc/downloads/performance_smb2.pdf
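If you already have a candidate card on hand, one way to check whether OmniOS has a driver for it is with the standard illumos admin tools. A minimal sketch, assuming an X540/X520 (which bind to the `ixgbe` driver); the link name and addresses are illustrative:

```shell
# List the physical datalinks OmniOS has recognized
# (a card only appears here once its driver has attached)
dladm show-phys

# Cross-check that the device tree shows the expected driver bound;
# the X540/X520 should attach to ixgbe
prtconf -D | grep -i ixgbe

# Once the link shows up, plumb it and assign a static address
# (interface name and address are examples only):
# ipadm create-if ixgbe0
# ipadm create-addr -T static -a 10.0.0.10/24 ixgbe0/v4
```

A card like the ConnectX-3 that has no illumos driver simply never appears in `dladm show-phys`, which matches the behavior described above.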

IB is dead, at least on the free distributions of the Solaris family. I do not expect that Illumos (or Delphix, Joyent, Nexenta or OmniTi behind) are willing to add support for newer IB hardware. Same with Mellanox as the Illumos platform is too small.
 

rkd29980

Limp Gawd
Joined
Oct 19, 2015
Messages
167
I was able to achieve > 900 MB/s

 

danswartz

2[H]4U
Joined
Feb 25, 2011
Messages
3,704
Thanks, I will look into the X540. Probably a T2 for the storage server and a T1 for each ESXi box (unless there is no real price difference...)
 

danswartz

2[H]4U
Joined
Feb 25, 2011
Messages
3,704
Hmmm, here is a question: as currently planned, I would need an X540-T2 for the storage box and 2 X540-T?s for the 2 ESXi boxes. Given that the ConnectX-3 cards work fine on ESXi (and have QSFP connectors), is it feasible to get an X520-DA2 for the storage server and 2 QSFP-to-SFP+ cables? If not, I might just go with X520s all around, as eBay is showing me these as MUCH cheaper than the X540-T1 or T2... I would then need two SFP+-to-SFP+ cables, but those seem relatively cheap?
 

danswartz

2[H]4U
Joined
Feb 25, 2011
Messages
3,704
I ended up getting a decent deal on eBay: 3 X520-DA2s with 2 m direct-attach cables, for a total a little under $450.
 