Discussion in 'Virtualized Computing' started by agrikk, Dec 18, 2008.
Exactly. Something I can manage
Just upgraded from a NUC i5 1.2GHz w/ 12GB RAM, 256GB SSD + 1TB data to a Dell PowerEdge T320 from work. Xeon 1.9GHz hex core w/ 96GB RAM, 3x256GB SSD & 3x1TB 7200 RAID 5 arrays.
Kind of makes my laptop less necessary...I can go back to nesting things on a server instead of carrying around my lab and not using it.
Although, whenever I get a proper license, I'll likely go back to a few NUCs and a NAS. But, for now, this will suffice. Especially with the price.
So I got an IBM x3550 M3 from work. I'm not sure if it's dual 8-thread or dual 12-thread, but it will have 192GB of RAM. I only have three HDD cages right now, so I'll need to order more. I plan on running a 240GB SSD and two 1TB Reds in RAID 1 for now. Should have it up and running over the weekend.
Anyone have any know-how on running ESXi 6.5 on one of these?
Just go check the HCL.
ughhh I have been so busy.. everything is powered off and I haven't been on forums or anything.. so my bad.. bigtime.. 8 months late!!!
that is the one I followed...
Since I did some cleanup. Thought I would share the progress.
Was able to keep the front door but had to remove the rear.
Sorry for flash reflection, didn't realize it was there until I uploaded.
Going to try color coding cables. Made the purple ones for management. Black is standard network use for now.
Updated my lab this year.
Dell R720 - ESXi 5.5 - File Server, pfsense, linux box, miner
Dual Intel Xeon E5-2650L V2 10c/20T 1.7ghz
192GB DDR3 ECC
400GB Intel Enterprise SSD for the VM's
Two 1TB Barracudas (storage)
Dell R620 - ESXi 5.5 - Miner
Dual E5-2667 6c/12T 2.9ghz
64GB DDR3 ECC
250GB 5400RPM 2.5" VM drive
Dell R620 - ESXi 5.5 - Miner
Dual E5-2620 6c/12T 2.0ghz
32GB DDR3 ECC
320GB 5400rpm 2.5" VM drive
VMs on spindles?
No pics? where's the HW pron?!
I do too, on the SAN and in RAID.
Laptop spindles at that, LOL. They're non-critical VMs; I couldn't care less about their boot time or performance. This is my home lab, and all they're going to do is run miners for now.
Pics maybe to come lol, the cable management is horrendous. I hardly get any downtime to work on it, since my wife is home when I'm home and everything runs through the internet. Guess I could clean it up and move the servers one PSU at a time. I'll take some tonight, and if I remember I'll post 'em lol.
Found this on my phone, maybe it will hold you over until the HW pics come lol. This is my R720.
Here you go. Don't mind the cables lol.
Updated my lab with a Pure m20 with 10TB raw.
Getting 3.6:1 data reduction with about a dozen various VMs on it.
Pics of that pure?
I posted pics in the Network pics thread, but I finally got my ESX lab up and running around a much more performant iSCSI SAN.
Supermicro H8SGL-F motherboard
Opteron 6128 8 cores @ 2 GHz
32GB PC1333 RAM
Mellanox ConnectX 10g HBA
Fujitsu XG2000 10Gig switch
SAN1 (iSCSI SAN)
Supermicro X8DTE-F motherboard
2x Xeon E5620 4 cores @ 2.4GHz each
48GB Registered ECC PC1333 RAM
IBM M1015 HBA (Flashed to LSI 9240-IT mode)
8x 256GB Samsung EVO 850 SSD
Mellanox ConnectX 10gig HBA
I am able to generate 150,000 read IO/s sustained with a combined non-sequential read/write load of 40,000.
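For anyone curious how sustained IOPS figures like these are usually measured, here's a hedged sketch using `fio`; the target path, file size, and job counts below are placeholders, not the actual test parameters used on this SAN:

```shell
# Hypothetical fio job: 4K random reads, 8 workers, queue depth 32.
# Replace /mnt/iscsi/testfile with a file on the iSCSI datastore under test.
fio --name=randread --filename=/mnt/iscsi/testfile --size=4G \
    --rw=randread --bs=4k --ioengine=libaio --direct=1 \
    --numjobs=8 --iodepth=32 --runtime=60 --time_based --group_reporting
```

`--direct=1` bypasses the page cache so the numbers reflect the array rather than host RAM, and `--group_reporting` sums IOPS across all eight jobs.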
Finding the combination of firmware and drivers and OSes that would make the ConnectX cards work together (in either ethernet mode or Infiniband mode) was an absolute street fight but it all finally plays nicely together.
If I were to do it over again, I would have gone with the ConnectX-3 SFP cards, but I would have had to use NexentaStor CE for my SAN and pay a premium for a managed Infiniband switch.
Should be posting here in a few days, but.. the lab will be just about finished up (LOL that's what they all say).
Not as elaborate as a full rack, but it will be simple and get the job done.
Two DL380 G6s [dual L5630s + 64GB + a few local WD VelociRaptors]
Netgear RR2304 4-bay 1u NAS
TP-Link gigabit switch of some sort .. will have a more detailed post once all wired up
Detailed with pictures would be nice. I am running a single server like yours; it's not bad, but man, I hate myself for picking the 2.5" SAS model.
I love the 2.5" SAS. More drives per RU. What issues/concerns are you having?
For one, no, because this only has a cage for eight 2.5" SAS drives. But my biggest issue is the cost of drives that size. To get 1TB SAS drives in that form factor, I'm looking at 200 bucks give or take per drive, whereas if it was a 3.5" setup, I could find drives under a hundred.
This is why I went with the DL380P G8 with 12 LFF drives! 3.5 drives are way cheaper!
Looking back last year when I purchased my two DL380's... there is sometimes a little regret in not going with possibly the Dell R510's or R710 LFF.
Camera phone picture... not the best quality. Need to tidy up and dust the top off. The TP-Link switch got removed and I ended up scoring a Netgear GS748T v5. Sweet.. hopefully it works out and is a decent switch. Traded my old dual-socket 771 beast away for it. The little 2U Rosewill server has no use as of yet; it has an ASRock QC5000M board in it, an A4-4000 quad core, and 2GB of RAM. Not hooked up, just filling space currently. Might make it a physical domain controller once I get the lab rolling.
Looks like you have plenty of gear. What's keeping you from getting that lab going?
Plus, a few months back, I had some drives crap out, so I've been down for a bit.
Ready to fire up everything from scratch , again.
Depends on your needs. If you just want to run one as a lab, 146GB SAS drives are cheap. If you're patient, 300s are as well.
I'm at a loss; I don't know what I'm going to do right now. I need faster speed than I'm getting out of 10K drives. Three or four VMs per disk and it's crawling.
Could go SSDs, but then I lose storage space, so I don't know.
Just started my journey into the ESXi world. I have been using VirtualBox on my desktop for years and recently built a new PC, so I have a spare PC for dedicated VM hosting.
My ESXi server spec is as follows:
CPU: i7 2600k
Just started playing with the configuration, and I'm very happy to see that it has an option to see all VLAN traffic. I have an EdgeRouter X at home, so now I can host VMs and they can reside in multiple VLANs.
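For reference, the "all VLANs" option is VLAN ID 4095 on a standard vSwitch port group, which passes tagged frames straight through to the guest (VGT mode). A hedged sketch of setting that up from the ESXi shell (the port group name here is made up for illustration):

```shell
# Create a port group on vSwitch0 (names are examples).
esxcli network vswitch standard portgroup add \
    --portgroup-name=Trunk-AllVLANs --vswitch-name=vSwitch0

# VLAN ID 4095 trunks all VLANs to the VM; the guest OS does the tagging.
esxcli network vswitch standard portgroup set \
    --portgroup-name=Trunk-AllVLANs --vlan-id=4095
```

With a normal port group you'd instead set a single `--vlan-id` per VLAN and attach each VM NIC to the matching port group.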
I use mine as a lab machine with some minor internal server hosting on the side. It's my old desktop, so besides the p3700 I already had all of it.
CPU: i7 4930K
Ram: 48 GB @ 1600
ssd1: 500 gig mx 200
ssd2: 1.6tb intel p3700
Graphics card: r9 fury - because I didn't have another one hanging around when I built it. It'll be changed soon enough, lol.
Supermicro 826 SAS3 (12 bay 2U) chassis with:
- Single socket basic X10 board
- Quad core Xeon E5 v3 (Haswell)
- 16 GB ECC DDR4
- 2x128GB SSD
- 2x4TB WD Gold
- quad gigabit PCI-E Intel ethernet
Pretty basic compared to most here but it does everything I'm asking of it, which is basically run Proxmox serving a 4TB mirrored ZFS volume and run a pfsense router VM. There's other stuff I'll play around with eventually but for now I'm just enjoying having real network storage and a good router.
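A minimal sketch of how a mirrored volume like that 4TB pair might be created on a Proxmox host; the disk-by-id paths below are placeholders, assuming the two WD Golds, not the actual device names:

```shell
# Two-way ZFS mirror named 'tank' (disk IDs are placeholders).
# ashift=12 aligns the pool to 4K physical sectors.
zpool create -o ashift=12 tank mirror \
    /dev/disk/by-id/ata-WDC_WD4002FYYZ_AAAA \
    /dev/disk/by-id/ata-WDC_WD4002FYYZ_BBBB

# Cheap inline compression is a common default for general storage.
zfs set compression=lz4 tank

# Confirm both halves of the mirror are ONLINE.
zpool status tank
```

Using `/dev/disk/by-id/` paths rather than `/dev/sdX` keeps the pool stable if the kernel reorders drives between boots.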
ESXI1 - running off a flash drive:
Some random cheap MSI Motherboard
24 GB DDR3-1600
1x256 GB SSD, 1x1TB Samsung F3 Hd, 1x3TB Hitachi Ultrastar 7K4000
2 port Intel NIC
ESXI2 - running off a flash drive:
Some random cheap Gigabyte Motherboard
16 GB DDR3-1333
1x120 GB SSD, 1x3TB Hitachi Ultrastar 7K4000
2 port Intel NIC
The servers also have access to my NAS via iSCSI
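Wiring ESXi hosts to a NAS over iSCSI generally looks something like this from the host shell; the adapter name and portal IP are assumptions for illustration, not this poster's actual values:

```shell
# Enable the software iSCSI initiator.
esxcli iscsi software set --enabled=true

# Point dynamic discovery at the NAS portal (IP is a placeholder).
esxcli iscsi adapter discovery sendtarget add \
    --adapter=vmhba33 --address=192.168.1.50:3260

# Rescan so newly discovered LUNs show up as candidate datastores.
esxcli storage core adapter rescan --adapter=vmhba33
```

The software adapter name (`vmhba33` here) varies per host; `esxcli iscsi adapter list` shows the real one.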
All VM Run as single function appliances:
R Studio Server ( https://www.rstudio.com/products/rstudio/ )
Bind DNS Resolver #1
Bind DNS Resolver #2
Owncloud Calendar service
ZoneMinder Security Camera PVR
I have been playing around with a Lenovo D30. Dual Xeon (forget cpu at the moment) and just put 128gb ram in it.
From my experience, even at work, I run out of memory before CPU, so the dual Xeon works great. On drives, I do not have RAID set up at all. Just single SATA drives, set up as default...
datastore1, datastore2, etc. I then balance my vms across them and it seems that I never really see any performance issues.
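When hand-balancing VMs across standalone datastores like this (no Storage DRS on the free edition), a quick way to see where the space is from the host shell:

```shell
# Lists each datastore/filesystem with its size and free space.
esxcli storage filesystem list

# Alternative view: per-datastore summaries (name, capacity, free).
vim-cmd hostsvc/datastore/listsummary
```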
I'll fire it up later and try to take a pic or two.
Just tried googling this and I'm not finding anything. What is this? Looks interesting...
Ahhh, I saw that in my searching, but reading it, it didn't say Minecraft, so it confused me.
Here is my basic setup.
Supermicro X9DRI-F motherboard
Dual Xeon E5-2643
32GB ECC memory
Samsung 850 PRO 256gb boot/vm drive
2 x WD RED 4TB data drives
running esxi 6.5 free edition
Ubuntu Plex server
Windows 10 pro (for testing)
Window server 2012R2 - IP camera / fileshare / handbrake encoding for plex
I never went to 6.5 because I couldn't find the free edition. I gotta look again.
Create a free account and login, it will let you download the setup and license file.
Older pic, missing some drives, but here's my little home lab. Excuse the dust. The oldschool C2100 isn't doing any crunching anymore, but the 1U rig is my home test-bed + media server + NVR, on ESXI 6.5 right now. 136TB of raw disk here.
House has 1000/1000 GPON, from same provider I use at work for our colo. So that's convenient (I work from home a lot).
Dell X1052P (picked this up open box on eBay for $200, score. And yes, I know this belongs on the other side of the rack. I like to look at the blinky lights though, sorry.)
2x E5-2640 V4
4x 10TB IronWolf Pro
This thing is due to be replaced w/ a JBOD enclosure
12x 8TB WD Red (thanks Best Buy)
Corsair case is my old X99 rig, she gone. Sold to a member here actually.
What a PITA keeping that case in a rack. Not doing that again, heh.
2x Emotiva XPA-100
Worked great... and ummm, 6.7? Aight, checking it out..
Just got my 'new' ESXi 6.7 server set up. Was running the hypervisor off the datastore disks before; now running off a 16GB ADATA USB 3.0 flash drive. Cloned the USB stick for a quick restore in case of failure.

Previously I was on ESXi 6.5 running a Supermicro X9SCM + 16GB DDR3-1333 ECC + Intel E3-1240v2. Now running an ASRock Rack EP2C612 WS + 64GB DDR4-2133 ECC Reg + dual Intel Xeon E5-2667v3s. Transferred over my VMX files and VMDKs to the new datastore, reimported the VMs, and it all booted up just fine. Very smooth transition to the new hardware, no issues at all.

Next up is creating some new VMs since I have a lot more resources available, and completely redoing my VM strategy. Right now I have a Network Management VM with PRTG and a few other applications; I'll likely keep that. My other VM is bloated with SnapRAID + Emby Server + ServerWMC + Ubiquiti NVR + backup management, and the list goes on. I'm going to break it out into several dedicated VMs, and also virtualize my pfSense firewall. I have two additional 250GB Samsung Evo 850s I plan on dropping in whenever I get a chance to start reconfiguring everything and laying out my final strategy. I'll likely upgrade the storage array soon too, likely with FreeNAS and some new HGST drives.
Norco RPC-4220 4U Rackmount Case
1000w Corsair RM1000 Power Supply
Asrock Rack EP2C612 WS Motherboard w/ IPMI/KVM
2x Intel Xeon E5-2667v3
64GB (8x8GB) Samsung DDR4-2133 Registered (8x M393A1G40DB0-CPB)
1x Intel Pro 1000 Dual Gigabit PCIe Ethernet Adapter
3x IBM m1015 SAS Controllers Flashed to IT Mode
2x 250GB Samsung Evo 850 SSD's (VM Datastore)
7x 4TB Western Digital SSHD's
10x 2TB Western Digital Green HDD's
5x SFF-8087 to SFF-8087 cables
3x 120mm Arctic Cooling F12 PWM Fans (HDD Wall)
2x 80mm Arctic Cooling F8 Rev.2 PWM Fans (Rear Case)
2x Noctua NH-U9DXi4 PWM CPU Fans
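For the USB-stick cloning mentioned above, a common approach is a raw `dd` image of the boot stick; a hedged sketch, where `/dev/sdX` is a placeholder you must verify against your own device list before running:

```shell
# Image the ESXi boot stick to a file (device path is a placeholder;
# triple-check it points at the USB stick, not a data disk).
dd if=/dev/sdX of=esxi-boot-backup.img bs=4M status=progress
sync

# Restore to a replacement stick the same way, reversed:
# dd if=esxi-boot-backup.img of=/dev/sdX bs=4M status=progress
```

Since ESXi mostly runs from RAM after boot, imaging the stick from another machine while the host keeps running is a low-risk way to keep a spare.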
Well, finally got around to getting it to 6.7, and also took this pic of it...
works great so far...
Nice setup. I have a decent lab at work I can tinker with. Tempted to post an image of the lab setup here. (the esxi cluster showing resources and such.) But I think this is for home use only. And some of your home systems are SICK.