Network pics thread

Nice! That is sweet and simple! How does the Wi-Fi work in that rack? Looks like you've got it in a Faraday cage.
Small house, a little over 900 sq ft, so it's able to hit everywhere it needs to... but its range is definitely reduced being in there. I figured if I needed to I could mount it on the side, but so far so good.
 
I’m proud of you.
Thanks, that was my old job. I'll grab a pic of the really nice setup I did at my new one (here's a preview of a couple of layouts we tried).
5xOUcjbm.jpg
X6SZ5Zrm.jpg
 
Small house, a little over 900 sq ft, so it's able to hit everywhere it needs to... but its range is definitely reduced being in there. I figured if I needed to I could mount it on the side, but so far so good.
Nice! That is all that matters!
 
Ignore the HDMI-over-Ethernet adapters and such; this was all done with 6" cables.
We had LEDs in the cable management at one point, but those were taken out for whatever reason.
Each cubicle gets one port, so all of them need to be hot; the VoIP phone passes through to the computer.
Red is executive offices and do-not-touch ports, green is APs, white is conference phones.

zO4PGiDl.jpg

The nice thing is that even when it gets messy like the above it still looks nice, plus tracing and cable replacement are a piece of cake.
 
Ignore the HDMI-over-Ethernet adapters and such; this was all done with 6" cables.
We had LEDs in the cable management at one point, but those were taken out for whatever reason.
Each cubicle gets one port, so all of them need to be hot; the VoIP phone passes through to the computer.
Red is executive offices and do-not-touch ports, green is APs, white is conference phones.

View attachment 148969

The nice thing is that even when it gets messy like the above it still looks nice, plus tracing and cable replacement are a piece of cake.

That looks really nice. But what do you do when you need to add another switch? You'll have to move everything to keep it evenly spaced.
 
That looks really nice. But what do you do when you need to add another switch? You'll have to move everything to keep it evenly spaced.
As I mentioned, all the ports are hot, i.e. every single patch port is live to a switch, so we wouldn't need to add more. If we did, we'd be adding a new patch panel and switch together anyway, so we could space and organize them accordingly.
It's 48 RJ-45 ports per switch (with 2 SFP+ for the inbound fiber) and 48 ports per patch panel, so it's 1:1.
I think it was about 612 patches run to this location. There were a few at the end that weren't needed/punched down, but the switches have a comparable number of open ports if they're ever added, up to the maximum.


This is one of our two IDFs, so everything is access ports. Nothing's changed in the past 3 years since it was installed and none of the cables have gone bad, but we could easily change out a cable if we had to (which surprised me; I thought those 6" cables would fail every now and then, being so short).
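
If it helps picture it, here's a rough Python sketch of that 1:1 patch-to-switch layout. The 48-port counts and the red/green/white roles are what I described above; the blue "default" colour and all the names are just made up for the example.

```python
# Rough sketch (not our actual records) of the 1:1 patch-panel-to-switch layout.
# Port counts are from the posts above; names and the blue default colour are illustrative.

PORTS_PER_SWITCH = 48   # 48x RJ-45 per switch, plus 2x SFP+ uplinks that aren't patched
ROLE_COLOURS = {        # patch-cable colour -> what the port serves
    "red":   "executive office / do-not-touch",
    "green": "wireless AP",
    "white": "conference phone",
    "blue":  "regular cubicle drop (assumed default colour)",
}

def switch_port(panel: int, port: int) -> tuple[int, int]:
    """Panel N, port X lands on switch N, port X, because every patch port is hot
    and each 48-port panel pairs 1:1 with a 48-port switch."""
    if not 1 <= port <= PORTS_PER_SWITCH:
        raise ValueError("port must be between 1 and 48")
    return panel, port

# Tracing a cable is just reading the label: panel 3, port 17 -> switch 3, port 17.
print(switch_port(3, 17))   # (3, 17)
```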
 
Here's my humble home submission...

In the house:
Supermicro 5018D-FN8T running Untangle HomePro
Ubiquiti EdgeSwitch ES-12F as the "core" switch that handles server connections and the fiber uplinks from the shop
Cisco SG350-28P powers 8 IP cameras as well as the APs in the house (a UAP-AC-PRO and a UAP-AC-HD)
Netgear GS510TP handles the overflow since I keep adding crap to the network :)

At the bottom is another GS510TP as well as an EdgeRouter X that I use as a "lab".
KIMG0598.jpg


In the shop (connected back to the house via 5 OM3 runs in conduit):

Poor picture :-(
Cisco SG250-26P powers the other 6 IP cameras as well as the APs in the shop and extra garage (2 x UAP-AC-IW)
Backup Netgear ReadyNAS RN102 with external enclosure to back up my Netgear ReadyNAS RN516
KIMG0383.jpg
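
Rough sketch of how it all hangs together, in Python just so it's compact. The shop switch really does come back over the OM3 fiber to the ES-12F, but the other uplinks and the names here are just illustrative, not an exact config dump.

```python
# Illustrative map of the house/shop gear listed above. Device roles come from the post;
# exactly which switch each device uplinks to (beyond the shop fiber going back to the
# ES-12F core) is an assumption for the sake of the example.

topology = {
    "Supermicro 5018D-FN8T (Untangle HomePro)": "ISP",
    "Ubiquiti EdgeSwitch ES-12F (core)":        "Supermicro 5018D-FN8T (Untangle HomePro)",
    "Cisco SG350-28P (PoE: 8 cams, 2 APs)":     "Ubiquiti EdgeSwitch ES-12F (core)",
    "Netgear GS510TP (overflow)":               "Ubiquiti EdgeSwitch ES-12F (core)",
    "Cisco SG250-26P (shop: 6 cams, 2 APs)":    "Ubiquiti EdgeSwitch ES-12F (core) via 5x OM3 runs in conduit",
}

# Print each device and what it uplinks to.
for device, uplink in topology.items():
    print(f"{device:45} <- {uplink}")
```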
 
I have a VERY small network set up at home finally.
The ancient Dell on top is handling firewall and adblock duties via pfSense and pfBlockerNG.
It connects to an inexpensive D-Link fully managed gigabit switch that provides network connectivity for the R710, my home PC, and a TRENDnet mesh Wi-Fi hub.
The R710 on the bottom is running ESXi 6.7 and hosts a few VMs running Plex, Emby, Nextcloud, and a few other things.
Once we move I plan on getting a rack and adding more hardware.
servers.jpg

I'll post up some pics of our submerged racks from work tomorrow if I remember to get some pictures.
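
For anyone curious about the adblock side, here's a tiny Python sketch of the DNS-blocklist idea pfBlockerNG is built around. This is not pfBlockerNG code, just the concept; the feed contents and the resolver stub are made up.

```python
# Minimal sketch of the DNS-blocklist (DNSBL) idea: domains on a blocklist feed get
# answered with a sinkhole address instead of their real IP, so ads never load.
# Not pfBlockerNG's implementation; the feed and resolver below are stand-ins.

SINKHOLE = "0.0.0.0"

# Stand-in for a downloaded blocklist feed (real feeds hold many thousands of domains).
blocklist = {"ads.example.com", "tracker.example.net"}

def resolve(domain: str, upstream_lookup) -> str:
    """Answer with the sinkhole for blocked domains, otherwise ask upstream."""
    if domain.lower().rstrip(".") in blocklist:
        return SINKHOLE
    return upstream_lookup(domain)

# Example with a fake upstream resolver:
print(resolve("ads.example.com", lambda d: "203.0.113.10"))   # 0.0.0.0 (blocked)
print(resolve("some.forum.org", lambda d: "203.0.113.10"))    # 203.0.113.10
```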
 
I have a VERY small network set up at home finally.
The ancient Dell on top is handling firewall and adblock duties via pfSense and pfBlockerNG.
It connects to an inexpensive D-Link fully managed gigabit switch that provides network connectivity for the R710, my home PC, and a TRENDnet mesh Wi-Fi hub.
The R710 on the bottom is running ESXi 6.7 and hosts a few VMs running Plex, Emby, Nextcloud, and a few other things.
Once we move I plan on getting a rack and adding more hardware.
View attachment 183547

I'll post up some pics of our submerged racks from work tomorrow if I remember to get some pictures.
Yes to the submerged rack pics. Never seen those before.
 
Cisco SG250-26P powers the other 6 IP cameras as well as the APs in the shop and extra garage (2 x UAP-AC-IW)
Backup Netgear ReadyNAS RN102 with external enclosure to back up my Netgear ReadyNAS RN516
View attachment 151893
I like the idea of using that small temp/humidity/clock thingy for a quick monitor of the environment. Where did you get it, and how much was it?
 
Whoa! That is unreal! I thought they would be custom servers or something. What is the liquid being used? That can't be water, can it?
 
It's probably Fluorinert or some such stuff along those lines. Non-conducting liquid.

It's cool.
 
Whoa! That is unreal! I thought they would be custom servers or something. What is the liquid being used? That can't be water, can it?

The liquid is called Electrosafe dielectric liquid coolant. It is both electrically and chemically inert and has the consistency of mineral oil. The server chassis are custom made, but the hardware is nothing special: normal motherboards, RAM, NVMe, SSDs, and GPUs (for the GPU nodes; we also have a lot of CPU compute nodes that are not in oil). In the second two pictures, those servers have 4 RTX 2080 Tis each. 3 quads (4 tanks per quad), 42 chassis per tank (168 GPUs per tank) = 504 GPUs. In the first picture, those tanks are pretty much all 1080 Ti tubs.

They all run a 10Gb network (the black cable) and dedicated IPMI (the white cable). The PDUs and switches are mounted on the back of the tanks, but we have had submerged switches in the past as well.
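
Quick sanity check on the per-tank math (GPU nodes only). The 4 GPUs per chassis and 42 chassis per tank are the real figures; how the tanks group into quads varies, so this stops at the per-tank number.

```python
# Per-tank GPU count from the figures above: 4 GPUs per submerged chassis,
# 42 chassis per tank. Totals per quad/row depend on how the tanks are grouped,
# so only the per-tank figure is computed here.

gpus_per_chassis = 4      # e.g. 4x RTX 2080 Ti in each chassis
chassis_per_tank = 42

gpus_per_tank = gpus_per_chassis * chassis_per_tank
print(gpus_per_tank)      # 168, matching the "168 GPUs per tank" figure
```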

Not related, but I got this sweet LED-backlit blanking panel today.
supermicro.jpg
 
The liquid is called Electrosafe dielectric liquid coolant. It is both electrically and chemically inert and has the consistency of mineral oil. The server chassis are custom made, but the hardware is nothing special: normal motherboards, RAM, NVMe, SSDs, and GPUs (for the GPU nodes; we also have a lot of CPU compute nodes that are not in oil). In the second two pictures, those servers have 4 RTX 2080 Tis each. 3 quads (4 tanks per quad), 42 chassis per tank (168 GPUs per tank) = 504 GPUs. In the first picture, those tanks are pretty much all 1080 Ti tubs.

They all run a 10Gb network (the black cable) and dedicated IPMI (the white cable). The PDUs and switches are mounted on the back of the tanks, but we have had submerged switches in the past as well.

Not related, but I got this sweet LED-backlit blanking panel today.
View attachment 183760
That's some serious fucking gear. What does your company use the GPU farm for?
 
The liquid is called Electrosafe dielectric liquid coolant. It is both electrically and chemically inert and has the consistency of mineral oil. The server chassis are custom made, but the hardware is nothing special: normal motherboards, RAM, NVMe, SSDs, and GPUs (for the GPU nodes; we also have a lot of CPU compute nodes that are not in oil). In the second two pictures, those servers have 4 RTX 2080 Tis each. 3 quads (4 tanks per quad), 42 chassis per tank (168 GPUs per tank) = 504 GPUs. In the first picture, those tanks are pretty much all 1080 Ti tubs.

They all run a 10Gb network (the black cable) and dedicated IPMI (the white cable). The PDUs and switches are mounted on the back of the tanks, but we have had submerged switches in the past as well.

Not related, but I got this sweet LED-backlit blanking panel today.
View attachment 183760
That is awesome. Where did you get that?
 
That's some serious fucking gear. What does your company use the GPU farm for?

Analyzing seismic data.

That is awesome. Where did you get that?

Seriously. I've got 3U that need to be blanked thanks to rear-facing top-of-rack networking stuff. I'd LOVE something like that for my rack.

Came from Supermicro long before I started this job. We decommissioned a bunch of racks and these were in them.
 
I was considering doing that; I will look into it, thanks.
I'm running a regular USG on a gig ISP connection (940/50) with no issues. I even have DPI turned on. That being said, it does run very hot sitting horizontally with the status light facing up (even in an open area). It runs about 15 °C cooler now that it's mounted vertically. The fanless passive cooling is great too; it sits on my desk or in my closet.
 
What are temps like in that thing?

Not sure if you're asking me, but I'm going to assume you are. Not sure of the exact temperature, as I haven't had a need to measure it. All the equipment stays rather cool inside, as there is a strong AC case fan mounted inside the top of the case.
 
Very nice setup. When we hired contractors for my parents' house, that's what I expected; instead we got a surface-mount patch panel that wasn't even complete and tons of jacks that failed gigabit. Morons.
 
Small, but potent little sucker this one is.

Looks nice, but you need to get the cleaning lady to do some cleaning for all the dust :) I would have loved to have my 42U rack mounted that way, with a back door on it and only the front facing out.
 
Edit: Update

I had to get a slightly deeper rack to fit my new SilverStone 2U 8-bay rackmount server. It worked out for the best, as I like the new layout better. The server is an all-in-one ESXi 6.7 U2 host. It will host Plex with a P400 GPU passed through to it, FreeNAS with an H200 passed to it, and a few other misc VMs.
 

Attachments

  • IMG_20200515_221637_02.jpg
  • IMG_20200515_221637_01.jpg
Power supply in the pfSense server died. Had to rig this up until the replacement comes in.
 

Attachments

  • 08845B02-C697-4C8B-A40C-0C3F1E68CCDF.jpeg
Power supply in the pfSense server died. Had to rig this up until the replacement comes in.
I'd at least set the cover on top of it, as the CPU needs those fans at the back to pull air across it for cooling. (y)
 