Network pics thread

Mobility achieved.






Waiting on a super slow eBay shipper to get the OptiPlex 980 (3.6GHz i5) to me so the Silverstone case can become the rack-mount TFTP/VMware server.

Any recommendations on processor/RAM for the VMware box? I'd say I'll be running as many as 4-5 instances at a time, max.

 
I'm assuming enough RAM for each virtual machine to have as much as you'd give a live system? Or can the instances share RAM?
 
Often you can get away with pretty small amounts of RAM for each VM, maybe double the OS requirements. The stated requirements tend to be too low for actual usage, though; just because it installs and runs does not mean it does so very well. :p My main server at home maxes out at 8GB as the motherboard can't fit more, and I've had as many as seven VMs running at once before. This was mostly production stuff (not many users though, mostly me). For Linux I usually allocate around 256MB, for Windows 512MB to 1GB; it depends what the VM is for. Currently I don't run as much stuff, but I do have a Minecraft server and I ended up giving it 4GB. My VPN server has something like 64MB.

If building a server now, given how cheap RAM is and how much more modern motherboards can support, I'd go with no less than 32GB, probably even 64GB.
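For what it's worth, sizing a box like that is just addition. Here's a quick back-of-the-envelope sketch in Python using the rules of thumb above; the VM mix and overhead figure are made up for illustration:

```python
# Back-of-the-envelope RAM budget for a small VMware lab.
# Per-VM figures follow the rules of thumb above; the VM mix is hypothetical.
host_ram_gb = 32.0
hypervisor_overhead_gb = 2.0      # rough allowance for the hypervisor itself

vms_gb = {
    "linux-dns":  0.25,           # ~256MB for a small Linux guest
    "linux-vpn":  0.25,
    "win-ad":     1.0,            # 512MB-1GB for Windows
    "win-app":    1.0,
    "minecraft":  4.0,            # game servers are the exception
}

allocated = sum(vms_gb.values())
headroom = host_ram_gb - hypervisor_overhead_gb - allocated
print(f"{len(vms_gb)} VMs, {allocated:.2f} GB allocated, {headroom:.2f} GB headroom")
```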
 
Don't forget that VMware does memory deduplication (transparent page sharing) really well, so if you have two Windows VMs with 2GB of RAM each running at 50% memory utilization, you aren't actually consuming a full 2GB of RAM on the hypervisor.

So you actually need less RAM than you think for a lab.
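For anyone curious how that works: the rough idea behind page sharing is hashing page contents and backing identical pages once. A toy Python illustration of the concept, with made-up page data, nothing like the real ESXi internals:

```python
# Toy model of content-based page sharing: hash each page's contents and
# count how many distinct pages the host actually has to back.
import hashlib, os

PAGE_SIZE = 4096  # bytes per page

def host_pages_needed(*vms):
    """Count distinct page contents across all VMs -- identical pages are stored once."""
    return len({hashlib.sha256(page).digest() for vm in vms for page in vm})

zero = bytes(PAGE_SIZE)                              # zeroed pages all dedup to one
os_page = b"common-os-code".ljust(PAGE_SIZE, b"\0")  # identical guest OS pages
vm1 = [os_page] * 300 + [zero] * 100 + [os.urandom(PAGE_SIZE) for _ in range(112)]
vm2 = [os_page] * 300 + [zero] * 100 + [os.urandom(PAGE_SIZE) for _ in range(112)]

print(len(vm1) + len(vm2), "guest pages,",           # 1024 pages the guests see
      host_pages_needed(vm1, vm2), "actually backed")  # 226 pages of host RAM
```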
 
Depends what you intend to run, though; things like Exchange and SharePoint are big-time RAM hogs.
 
Found some deals in the past couple of weeks and bought some goodies for a poor man's ESXi lab build:

i7 3770
ASRock Z77 mobo w/ VT-d
32GB DDR3-1600 RAM
Quad-port Intel NIC

Need a 256GB+ SSD soon and a 1TB drive for repositories, and I'll be good to go.

Pics to follow soon.
 
Here's what we've been working on lately. This is our Global Fusion Center. We're a security integration company and this is at our HQ; it allows us to monitor our clients' systems remotely 24/7. That big "Sentry VMS" screen you see is one of those electrically switchable windows: press a button and it disappears. We also do rear projection onto it, so it serves two purposes. The last two images are taken from our Media Room, where we hold presentations/training.

9051055586_781beb1c88_b.jpg


9048822789_c71e2848cd_b.jpg


9048823757_6c98fa59cf_b.jpg


9051050998_843f7b49dc_b.jpg
 
100 points for the rear-projected security glass into the NOC, -100 points for the goofy American flag...
 
Whoa, that's a nice-looking NOC. I keep forgetting to ask if I can take pics of ours, as well as all the equipment, such as the DMS-100 etc.
 
Here's a shot of the NOC where I work. The wall on the left of the image actually has a different layout now. Had the pleasure of seeing it in person last month; I work in Calgary and the NOC is in Toronto.

nVk2SGl.jpg
 
Nice, looks pretty fancy! And windows! Not something you see often in a NOC. :p Or are those offices?


Here are some pics of my latest build, a file server.

In case anyone's interested, my build thread is here.





Removing my dual-port fibre HBA card from my environmental control server to put in the new file server. I originally built that server to control the IBM SAN when I first got it, as I thought I could add my own drives to it and use it for production. That server eventually got converted into an environmental control/monitoring server (it controls my HVAC and monitors lots of stuff around the house).


Added fibre channel HBA to file server


Closer look at the cards


Back of server. I'll probably end up moving that "card 3" label up a bit, because it's going to bug me. :p


This server is pretty much done at this point. Just need to do some more burn-in testing and then start planning the migration of my data from my old server. The old server will continue to be production; it's just that the data will live on this server now. The easiest way to do this would be iSCSI, as it would be 100% seamless to the old server, but I think I will fix up all the permissions so they match across all of my servers, and maybe even look at Kerberos/LDAP and then use NFS. So that will be a side project as part of the migration. The idea is to offload file operations from my old server, as it's on its last legs. The next upgrade is replacing that old server.
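If the NFS route wins, the classic gotcha is users having different UIDs on different servers. Something like this quick Python check would catch mismatches before the cutover; the file paths are hypothetical, assuming local copies of each server's /etc/passwd:

```python
# Sanity-check that username -> UID mappings agree across servers before
# switching to NFS (mismatched UIDs are the usual NFS permissions headache).
def load_uids(passwd_path):
    """Parse a passwd-format file into {username: uid}."""
    uids = {}
    with open(passwd_path) as f:
        for line in f:
            if not line.strip():
                continue
            name, _, uid, *_ = line.split(":")
            uids[name] = int(uid)
    return uids

old = load_uids("passwd.oldserver")   # hypothetical local copies of each
new = load_uids("passwd.fileserver")  # server's /etc/passwd

for user in sorted(set(old) & set(new)):
    if old[user] != new[user]:
        print(f"{user}: uid {old[user]} vs {new[user]} -- fix before going NFS")
```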

I also can't access the back of my server rack at this point, so I won't be racking this till I can access it; I'm in the middle of basement renovations. It's also very dusty from cutting wood, especially the DRIcore, so I might wait till all that is done just to avoid the dust. Surprisingly, my environmental server is not all that dirty. My PC upstairs gets dirtier than that in a single day of running, no joke. Guess cat hair is worse than sawdust.
 
Envious of the light in that NOC.

My desk...is in the dungeon. I'll try to remember to take a pic tonight.


My home rack is coming along though, and so are the racks that I'm attempting to sell to CCNA/CCNP candidates on eBay.

www.ccnporbust.blogspot.com

20130620_115149.jpg


Surprisingly, the rack is only using 430 watts, even with a PC running an ASUS GTX 660 Ti and three monitors.



CCNA/CCNP rack (the price is about to come down to under a grand; believe it or not, that cuts margins pretty close).

http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&item=251291203439

 

Do you run your MC server in a VM? Just curious.
 
Do you run your MC server in a VM? Just curious.

Yeah, the Minecraft server is in a 2003 VM. It was actually my old UO test server VM; I just gave it more RAM.

Eventually I want to move the Minecraft server to my online dedicated server though. I don't have a lot of bandwidth here, so if more people start playing it will get laggy.
 
How do you guys run MC servers in a VM? How many people do you typically have? I run www.nyminecraft.com and I have a dedicated server for it, and it still is not enough... the thing requires some serious RAM. I have an i7 chip in there with 16GB of RAM. My world is also over 100GB in size, I think.
 
Nothing special for a VM; you run it the same way you would on a physical machine. I suppose there may be some VM-specific tweaks out there, but I've never looked into it. The VM is Windows 2008 with all the dependencies required to run .NET 4.0 (probably irrelevant for MC, but it was needed for something else before) and the latest Java, or at least whatever was latest at the time, since that thing updates almost daily. :p

It's only me and a friend of mine that play, so we're two on it at most. Nothing crazy. The world is only about 340MB right now.

But yeah, the bigger the world, the more RAM you'll need. Mine uses around 3GB of RAM! I only have 8GB total on that server. I need to build a new server that uses DDR3 so I can put in more RAM; probably my next build once the file server is paid off. I *could* use the file server as a VM server too, but I really want to keep it separate.
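For reference, that 3-4GB figure is ultimately just the heap cap the server's JVM gets launched with. A minimal sketch of a launch wrapper; the jar path and the 1G/4G values are hypothetical, and -Xmx is the flag that matters:

```python
# Launch a Minecraft server with an explicit JVM heap cap.
import subprocess

subprocess.run([
    "java",
    "-Xms1G",                       # heap the JVM starts with
    "-Xmx4G",                       # max heap -- scale with world size / player count
    "-jar", "minecraft_server.jar", # hypothetical jar path
    "nogui",
])
```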
 
Was going to wait till I'm done with my basement renos (dust) before racking my server, but I was too eager, and with it done I can clear my office workbench, so I just did it now.




Yeah, I know, it's a MESS back there, but part of my goal with the basement renos is to build some type of cable management system for the server rack, with some PDUs. So for now I'm not bothering with any cable management, as I will be completely redoing everything from scratch and labelling as I go. Probably a one-day job right there.



ICMP reply and alarm clear! Helps if I turn the port up and put it in the right VLAN. :p
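For the curious, that fix is a few lines on most managed switches. A hedged sketch using the netmiko library, assuming an IOS-style switch; the IP, credentials, interface, and VLAN number are all made up:

```python
# Bring a switch port up and put it in the right access VLAN.
from netmiko import ConnectHandler  # pip install netmiko

switch = ConnectHandler(
    device_type="cisco_ios",
    host="192.0.2.10",          # hypothetical management IP
    username="admin",
    password="secret",
)
switch.send_config_set([
    "interface GigabitEthernet0/1",  # port the server hangs off
    "switchport mode access",
    "switchport access vlan 20",     # the "right vlan"
    "no shutdown",                   # turn the port up
])
switch.disconnect()
```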



Which drive do I format first? Actually, it's more like: which one is which, so I can reconstruct the old RAID array on the SAN. :p

Also, the bottom two enclosures are currently unusable, because I went and used a couple of the drives as regular SATA drives, and that broke the firmware lock on them (I wasn't aware of that till after), so they no longer work in the SAN. I have to figure out which drives those are and decommission them. This is exactly why I built this server instead of just using the SAN: I can't put just any SATA drive in there. So I will use it as extended storage since I have it, but I won't rely on it for actual production stuff; it will be mostly for backups/archiving. I don't have enough UPS capacity to protect it, so I rarely turn it on.

I won't be leaving this on till I'm done the renovations though; since it's brand new and all clean, I want to keep it that way. When the server room is done, it will have proper HVAC with air filtration.
 
Some pics from the lab (click for 1600x1200):

Installed it last week. Can you guess what product? ;)


some VNX love (with messy cabling)


My biggest "love" around the block: Avamar Gen4 grid with a Media Access node on top.


Also some backup software. Any guesses?


Atmos Gen2.


The guts of a DMX3.


Rocking the flashy blue LED bars :).
 
Eventually I want to move the Minecraft server to my online dedicated server though. I don't have a lot of bandwidth here, so if more people start playing it will get laggy.

Gotcha, and yeah, same. I only have 5Mbit up, so I want to do the same eventually.
 
I just finished racking the gear from my lab in my colo to keep it running full time. I pulled it out of my house due to power issues, after cabling it all up only to have to tear it all down. Hopefully I can keep it intact here for a while (because I'm sick of rebuilding it :D).

IMG_3042s.png


IMG_3033s.png


I've always been irritated by these cabinets, as their posts aren't set at a standard depth to accommodate a bunch of half-U cases at the top of the cabinet. That, plus the lack of space for cable management, makes the backside a lot messier than I'd prefer. But whatev'. I'll take what I can get.

SonicWALL PRO 2040
Dell PowerConnect 6248P switch
5x Supermicro servers (8-core Opteron, 16GB)
HP ProLiant DL160 G5 (2x 8-core Xeon, 48GB)
2x HP ProLiant DL120 G5 (2x 4-core Xeon, 16GB)



Virtualization hosts (two ESXi 5, two Hyper-V)

IMG_3044s.png


IMG_3036s.png






Utility servers, database server, and iSCSI target

IMG_3045s.png


IMG_3039s.png




Closeup of network components

IMG_3035s.png
 
lol, a colo for a lab, damn, that must be expensive. Guessing this is stuff that doesn't need any hands-on then? I've thought of building a server to colocate for my web stuff instead of leasing, as leasing tends to gouge you on things like disk space and RAM. Though the issue is that if something fails, you have to pay for remote hands, which is very expensive. That's a really nice setup though. Just curious, how much is that costing you per month to colo? Guessing you're renting a half or full rack?
 
I just changed over from physical hardware for my colocated server to a virtual server. This was after the second hardware failure in two weeks.

Though it looks like I have to move a friend's website off my virtualized box and onto another shared hosting box due to just how damn big his site is.
 
lol, a colo for a lab, damn, that must be expensive.

If I remember right his company has lots of colo space. Prime to just drop on in ;)
 
Just curious, how much is that costing you per month to colo? Guessing you're renting a half or full rack?

Free.

If I remember right his company has lots of colo space. Prime to just drop on in ;)

That's correct, and good memory, Calvinj.

Oh nice, so physically accessible at a low/no cost? Yeah that would be pretty sweet.

Free space/power/bandwidth. And it's a 15-minute drive, so it's convenient. I'm stoked on it.


Good times. :D
 
Did you do the install yourself or let EMC do it? That wiring is atrocious :(. Fix it fix it fix it!

I am EMC :p. This is a view from our own lab. I don't have any pictures from before the cleanup, but boy, was that a mess. Needless to say, we don't really care THAT much about structured cabling best practices; a lot of stuff comes and goes, so there's no point in spending hours on something that'll only stay for a few weeks. We will probably get an Avamar Gen4S grid, so I'm planning on moving the current Gen4 to the Isilon rack together with the Gen4S.

Obviously when we install equipment at a customer site, we make the wiring job as clean as possible.

@Shockey
First picture: Isilon X200.
Second one: EDL 4106.

EDIT:
Friday is present day! The Gen4S grid arrived at the office. Going to rearrange some racks. Pictures incoming :).
 
Any chance you have some updated pics of said colo? LOL, you had that big build thread.

Nothing has changed there in a long time. VMs get upgraded and whatnot, but the hardware is still the same... and pushing five years old. :(


Probably the reason why I keep futzing with my lab hardware. I gotta keep the chops up! :D
 