Network pics thread

Now just imagine what performance you'd get if you rendered a video with AviSynth+x264+MeGUI versus Adobe After Effects. :D
 
My company is getting a setup with 1.2TB of RAM and 144 physical cores + HT (= 288 logical cores), around 360GHz accumulated.. will attempt to snag a screenshot, and see if I can get a virtual machine using 100% of the resources for a Task Manager shot. :D
 
LOL, no, he doesn't win. Maybe among those who can post pictures, but no -- not overall.
 
you could fold like 1 protein in seconds!

do i dare ask the cost of one of those suckers?

I was goofing off on the Dell Enterprise site and set up an R920 with 40 cores (80 with HT), 2TB of RAM, and a bunch of SSDs and 15k SAS in near-line storage for $220,000 retail. (OS and DB software extra.)

I printed it to PDF and sent it to my boss saying that I'd found a replacement unit for our data warehouse build. I suggested that five should suffice for our SQL cluster. :D
 
I was goofing off on the Dell Enterprise site and set up a R920 with 40 cores (80 with HT), 2TB of RAM... /snip/

The box is a Cisco UCS C460 M2 with 4x E7-4870 procs, 40 cores/80 threads. We have several and use them exclusively for SQL Server 2008 R2; we run many VLDBs in the 3TB - 8TB range.

WP_000089.jpg
 
So,

I figured I'd share my network at work. Please keep in mind, what you see in the picture I inherited; I've only been here since April in my current capacity. Yes, I'm very aware that we have a lot of work left to do... a lot...

worknetwork.jpg
 
I figured I'd share my network at work. /snip/

I like the one practically empty patch panel at the bottom lol

Finished the Unifi deployment, getting rid of these POS.
SonicWall makes good firewalls, but their wireless APs suck.

stackofsonicpoints.jpg
 
The almost empty patch panel in the middle goes with the almost empty switch just above it, which only has one thing plugged into it.
 
I work for the group that supports the network for the world's fastest supercomputer now :p

I guess I win? :p
 
I work in a telco central office that is the hub for the region. I probably win in terms of miles of cable we own, especially if we break it down to individual wire strands. :D I keep wanting to ask if I can take pics, but I'm pretty sure the answer will be no.

But who says this is a contest anyway? :p It's still cool to see all the different setups.
 
I figured I'd share my network at work. /snip/

Looks familiar. I know your pain. Have fun and post an updated picture when it is cleaned up!
 
I should have grabbed a screenshot of the last cluster we shipped: 200 physical cores at 2.66GHz and 3TB of RAM. It was awesome. However, it did like to trip the three 30-amp breakers...
Anyways, enough e-penis talk, let's get back to networks!

I finally rebuilt my home lab's SAN: 4x 160GB SSD and 4x 1.5TB spindle, in RAID 10 and RAIDZ respectively. 200Mbps read / 600Mbps write on some older SATA 2 controllers/drives. Not bad. Still loving OI151a/Napp-It.
This is tied via a dual-port Intel NIC to my ESXi host. More than I need.
Q6fTVl.jpg


Started to put the Cisco lab back together; I have two more 1841s and two 3750Gs on the way. I have not decided if it is worth using the 2610/11s I have.
NSwiul.jpg

wnOSdl.jpg



Oh, and I almost forgot: the PSU fan in my pfSense box finally died the other week... well, more like 3 months ago, and I finally replaced it. The wife was tired of the random bearing squeaks coming from the closet. I found a Pico power supply at work and asked if I could have it. Ended up with a box of them, haha. Anyways, it will not fit my chassis, and I finally decided I just was not using the features here at home. So I dusted off the Cisco 881W and set it up last night. One day I will find another chassis, but for now this will work fine.
30ckfl.jpg
 
I finally rebuilt my home lab's SAN... Still loving OI151a/Napp-It. /snip/

Have you got your home lab's SAN (OI151a/Napp-It) installed on the ESXi host, or is it installed on another server? I'm having terrible performance with FreeNAS and ESXi installed on the same HP MicroServer host.
 
Out of curiosity, for those of you with data centers: do you stack your gear (firewalls, routers, load balancers, etc.) on top of each other in your network racks, or do you leave at least a 1U spacing in between devices? I'm thinking of doing, from the bottom up (heaviest to lightest):

etc.
1U blocking panel or cable mgmt
2U appliance
1U blocking panel or cable mgmt
2U appliance
 
I think it would all depend on what type of devices you are putting in the racks and how many devices will be in the rack. For switches, I'd definitely leave a 1U space and use that 1U for cable management.
 
Have you got your home lab's SAN (OI151a/Napp-It) installed on the ESXi host, or is it installed on another server? I'm having terrible performance with FreeNAS and ESXi installed on the same HP MicroServer host.

I tried the same setup and it was painfully slow. I have a Supermicro quad-core Xeon host that this links to. Having 4x 3.2GHz cores and 32GB of RAM is 10x better than trying to do an all-in-one with an N40L.
 
Why don't you like them?

1. Some of the APs would set themselves to international mode and change their channel. Manually overriding them would sometimes fix it, but more often than not they would reset back to international mode.

2. Bizarre incompatibilities with certain wireless NICs. We had a number of students' and staff members' computers that would not fully connect to the network. Some would get connected but then fail to get an IP address. After trying multiple things, updating drivers usually fixed the problem. They never had issues connecting to any other wireless network; they only had issues with our SonicPoints. This was just with an open guest network, no security, no RADIUS, etc.

3. The APs supported 802.11n, but it was extremely flaky. Most of the time they could not handle N and would either reboot or just stop moving traffic. We had to disable N on all the APs.

4. Randomly they would disconnect from the network. The link would stay up and they would stay powered on, but they dropped off the MAC table, so they did not appear on the switch. The manager would also be unable to communicate with them, and only power-cycling the switch port would get them to come back up.

5. Some would lose the SSID but would continue to function normally if you connected to the default SSID.

6. Some of them got stuck updating firmware. Occasionally they would finally get updated and work for a while, but then they would lose their firmware and repeat the process.

7. Way too overpriced for what they are. I just installed one today; $300 for an AP, what a rip-off!


After a year of dealing with crappy wireless, I am so glad to have ripped them out and put in good wireless.
Unifi is working great, with zero issues.
Heck, I came in one morning when we were still testing Unifi and there were 60 students on one Unifi AP! The two SonicPoints in that room were down and the only thing left was the Unifi. The students were only doing Blackboard testing, so no major network traffic, but they had no issues.
 
Out of curiosity for those of you with data centers do you stack your gear (firewalls, routers, load balancers, etc) on top of each other in your network racks or do you leave at least a 1U spacing in between devices? /snip/

I will leave 1U for cable management if the device has forward-facing ports.
 
No, because no photo attached!

Haven't had time for a tour yet.

Lawrence Livermore, or IBM?

It was IBM; they backed out and now it's Cray. It hasn't fully launched yet thanks to the contract dispute. Supposed to reach petascale when done.


Out of curiosity for those of you with data centers do you stack your gear (firewalls, routers, load balancers, etc) on top of each other in your network racks or do you leave at least a 1U spacing in between devices? /snip/

How much space to leave depends on what the device is and how its airflow runs.

I have old Cisco gear with fans on the bottom, so those get spacing; most newer stuff is side-to-side or front-to-back, so you can stack them without starving them.
 
48 port patch panel
48 port switch
48 port patch panel
48 port switch
48 port patch panel
48 port switch

Then 1-foot patch cables between them. It looks ugly, but it's easy to trace a cable from patch panel to switch port.
 
It was IBM; they backed out and now it's Cray. It hasn't fully launched yet thanks to the contract dispute. Supposed to reach petascale when done.
Huh? I thought the LLNL/DOE system was already running in excess of 15 petaflops.
 
48 port patch panel / 48 port switch, alternating, with 1-foot patch cables between them. /snip/

We do something similar, though it's a lot neater.

24 port switch
1u cable management
48 port patch panel
1u cable management
48 port switch
1u cable management
48 port patch panel
1u cable management
48 port switch
1u cable management
48 port patch panel
1u cable management
24 port switch

We also use 1-foot patch cords, and are able to tuck them into the cable management panels.
 
Similar as well

24 port patch kit
1u cable management
48 port switch
1u cable management
24 port patch kit
1u spacer kit
24 port patch kit
1u cable management
48 port switch
1u cable management
24 port patch kit
1u spacer kit
24 port patch kit
1u cable management
48 port switch
1u cable management
24 port patch kit
1u spacer kit
24 port patch kit
1u cable management
fiber optics switch

That's how my main one looks. We use four different colors of 1' patch cables; each color designates one of the four rooms that are Ethernet-only. A fifth color we use for office machines, and those cables are labeled.
 
ours is-
48 port patch panel
custom 8" patch cord
48 port switch
48 port patch panel
custom 8" patch cord
48 port switch
48 port patch panel
custom 8" patch cord
48 port switch

It's nice not having to mess with the organizers; the switches stack in the back.
 