How long do your ESXi machines take to boot?

Viper GTS · n00b · Joined Jun 12, 2006 · Messages: 63
I am getting rather annoyed at how long my home ESXi machine takes to boot. It's 30+ minutes, maybe more like 45. 90% of that time is spent at one line: s.v00. I've found lots of references on Google to crashes during boot at s.v00, but very little about what it actually is or why it could take so long when everything works fine once it's up.

This is a whitebox build running fairly standard hardware, licensed with vSphere Essentials (to allow me 5.0 Update 1 + my current 48 GB RAM). I was hoping the update to 5.1 would fix it given this:

http://kb.vmware.com/selfservice/mi...nguage=en_US&cmd=displayKC&externalId=2007108

Unfortunately no luck, still the same exact issue. It's not like I'm rebooting it all that often but when I do it puts whatever I'm doing on pause for the better part of an hour.

Anybody else seeing this?

UPDATE:

Resolved, issue was the USB stick being incredibly slow. Added a Samsung 830 SSD and it now boots <30s.

Viper GTS
 
from the time esxi starts loading, maybe 60-90 seconds. A couple of minutes on all the preboot bios stuff (raid controller, bmc, drac etc)
 
Mine seems to take 2-3 minutes. It boots from SAN using a qlogic iSCSI HBA, which seems to take a significant chunk of said time.
 
Mine's about 2-3 minutes... mostly the RAID controller BIOS doing its stuff.
 
That sounds like a really long time and is longer than any server we have at work. I'd consider a clean reinstall. NOTE: that will delete your data, so if that is important be sure to back it up.
 
I've got some at work that take 10 minutes but they're far more complex environments (FC attached cluster nodes with a dozen ethernet connections etc). I don't mind them taking 10 minutes.

I'm leaning towards a reinstall as well; this is kind of absurd. I think there's an in-place reinstall...

And...

The installer goes slow at the same point. Bizarre.

Viper GTS
 
What hardware is ESXi installed on (hard drive specifically). I've had a similar issue in the past but it ended up being a bad hard drive causing the problem.

In general, my Secondary ESXi server in my sig boots up in about 3 minutes. The primary takes a couple minutes more for some reason.

I will mention that I originally built the primary using a USB key for the ESXi install. That took FOREVER to boot in my opinion, and it was only about 10 minutes. 30+ is absolutely crazy....
 
One of mine takes 5-6 minutes including the BIOS/HBA stuff. The other is maybe 2 minutes, but it has no add-in cards or native drives. Both use USB keys to run ESXi from.
 
What is your boot media? (USB, disc, etc.) If it's a USB key then have you tried switching USB keys? Maybe the key is just slow or, worse, going bad. (This might just be a shot in the dark.)
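If you do suspect the key itself, one crude sanity check is to mount it on any machine and time a synced sequential write to it. This is a hedged sketch, not a proper benchmark: the target directory below defaults to a temp dir and you'd point it at the key's mount point (a hypothetical path), and it measures writes rather than the reads that matter at boot, but a key that can't sustain even a couple of MB/s here is a strong suspect.

```python
import os
import tempfile
import time

# Point TARGET_DIR at the suspect key's mount point, e.g. "/mnt/usbkey"
# (hypothetical path). Defaults to the system temp dir so the sketch runs anywhere.
TARGET_DIR = tempfile.gettempdir()
SIZE_MB = 32
chunk = b"\0" * (1024 * 1024)  # 1 MB of zeros per write

path = os.path.join(TARGET_DIR, "throughput_test.bin")
start = time.monotonic()
with open(path, "wb") as f:
    for _ in range(SIZE_MB):
        f.write(chunk)
    f.flush()
    os.fsync(f.fileno())  # force the data to the device, not just the page cache
elapsed = time.monotonic() - start
os.remove(path)

print(f"wrote {SIZE_MB} MB in {elapsed:.2f} s -> {SIZE_MB / elapsed:.1f} MB/s")
```

A healthy USB 2.0 key should manage several MB/s on a test like this; a worn or counterfeit one can drop well below 1 MB/s.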
 
It's on a USB key but I just booted off an IPMI mounted 5.0U1 installer ISO and it does the same thing off the installer (at the same point).

I don't think it's USB key related.

Viper GTS
 
Are you able to see exactly where it's hanging up for an extended period of time? Like what driver, etc.?
 
Yep all the excess time is at:

Loading /s.v00

I've had no luck figuring out what it is.

Viper GTS

Guess I shoulda read your OP more closely hehe. Sounds related to a faulty hardware piece, or a piece that is going bad.
RAID card or other. Even if you aren't booting from any device on the hardware it may still be causing an issue.
 
I'm assuming you've tried switching the placement of your USB key to different ports? Not sure if you have more than one USB controller on your system. What about a RAM test? Perhaps a failing chip is causing a problem while it's loading everything into memory. I have a small setup but it boots in 2-4 minutes tops including POST. At work we have some older servers running 4.1 that boot via FC SAN that only take 3-5 minutes.
 
My machine with 2x raid controllers usually takes about 10 mins to boot

Other one takes around 4 mins with a few NICs
 
So, I can't say what's in s.v00, but IIRC it's the largest file. It's SO not supposed to do that though. That's even before the vmkernel loads - that's syslinux bootstrapping.
 
Are you sure everything is up and running when you boot? There are documented bugs in 5.0, which I think were fixed in U1, that would cause 30-45 min timeouts if an iSCSI/FC target was configured but not available at boot time. This has bitten me, causing 15-30 min boot times where normally it is 5 min (including BIOS).
 
Both my boxes boot from a 4gb USB and from button-push to full access is about 2-3 min. Maybe 5min max. Doesn't seem to matter which port I use or brand of USB (I have two for each system).

Bill
 
Are you sure everything is up and running when you boot? There are documented bugs in 5.0, which I think were fixed in U1, that would cause 30-45 min timeouts if an iSCSI/FC target was configured but not available at boot time. This has bitten me, causing 15-30 min boot times where normally it is 5 min (including BIOS).

This is the functional equivalent of ~miles~ before that point. You're worried about the landing - we haven't even gotten off the ground yet. The vmkernel hasn't even loaded at that point; we're still loading the bootbanks.
 
Update for all who care:

My test of an IPMI mounted ESXi installer got me thinking, maybe this is all just a performance issue?

I took a trip up to MicroCenter today & picked up a 64 GB Samsung 830 (the smallest, cheapest drive they had) and an external DVD drive (been meaning to get one anyway, figured it would be another useful data point).

I installed the 830 and removed the USB key I had been using. I booted off a burned copy of the 5.0U1 installer, it still took quite a while on s.v00 but seemed marginally faster than the IPMI boot. I ran a clean install to the 830, rebooted and...

ESXi now boots in under 30 seconds. So having a fast storage device makes all the difference in the world.

I have other USB mounted systems that don't take anywhere near so long to boot so maybe the flash drive that I was using is just a flaming pile in spite of reviews showing it was a fast drive. Or maybe this Tyan dual socket board has issues, who knows. Between this and the inability to log locally I think I am officially done using USB as boot devices. The convenience of booting off an internal USB port is nice, but from now on I think it's dedicated spindles/SSD for me.

Thanks for all the ideas guys.

Viper GTS
 
I think USB is just slower. I used to boot off a USB key and it was meh for time; since I switched to boot from SAN, it's quite a bit faster (although I haven't clocked it - just subjective...)
 
I think USB is just slower. I used to boot off a USB key and it was meh for time; since I switched to boot from SAN, it's quite a bit faster (although I haven't clocked it - just subjective...)

I would imagine that the IOPS aren't particularly high for the controller in a USB thumb drive.

That, and if you are in USB 2.0 mode, data transfer rates tend to peak at just over 20 MB/s.
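The media speed alone roughly explains the symptom. As back-of-the-envelope arithmetic (assuming s.v00 is on the order of 150 MB, which is plausible for ESXi 5.x but not measured in this thread), loading one large module at the sustained rates mentioned in this thread gives:

```python
# Estimated load times for one large boot module at various sustained read rates.
# MODULE_MB is an assumption (~150 MB, roughly the size of s.v00 on ESXi 5.x);
# the media rates are rough figures from this thread, not benchmarks.
MODULE_MB = 150

media_mb_per_s = {
    "USB 2.0 hi-speed (realistic peak)": 20.0,
    "USB 1.1 full speed (~12 Mbit/s)": 1.4,
    "Misbehaving/worn flash key": 0.1,
    "SATA SSD (e.g. Samsung 830)": 250.0,
}

for name, rate in media_mb_per_s.items():
    seconds = MODULE_MB / rate
    print(f"{name}: {seconds:.0f} s ({seconds / 60:.1f} min)")
```

A key limping along at ~0.1 MB/s would take about 25 minutes on that one file, which lines up with the 30+ minute boots described above, while an SSD clears it in under a second.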
 
Glad you got it solved; your server probably doesn't have USB 2.0.

For example: my dual Opteron Rackable server has SATA3, but only USB 1.1. Installing ESXi took forever, but installed on an HD, it boots in about 60 seconds.
 
Just for fun, check to see if the "USB 2.0 Controller Mode" setting in the BIOS is set to "hi speed" or "full speed".

Strangely enough, "hi speed" is 2.0 and "full speed" is 1.1.

Might help you in the future if it is set wrong.
 
Just for fun, check to see if the "USB 2.0 Controller Mode" setting in the BIOS is set to "hi speed" or "full speed".

Strangely enough, "hi speed" is 2.0 and "full speed" is 1.1.

Might help you in the future if it is set wrong.

Yep, the "full speed" standard is part of the 1.1 spec.

2.0 introduced hi speed, which is faster than full speed.

It's confusing.
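For reference, the signaling rates behind those names (these are the figures from the USB specs, listed here just to make the naming mismatch concrete):

```python
# USB mode names vs. signaling rates in Mbit/s. The naming is counterintuitive:
# "full speed" predates and is far slower than "hi-speed".
USB_SIGNALING_MBIT = {
    "low speed (USB 1.0/1.1)": 1.5,
    "full speed (USB 1.1)": 12,
    "hi-speed (USB 2.0)": 480,
}

# A BIOS stuck in "full speed" mode caps the bus at 12 Mbit/s (~1.5 MB/s).
ratio = USB_SIGNALING_MBIT["hi-speed (USB 2.0)"] / USB_SIGNALING_MBIT["full speed (USB 1.1)"]
print(f"hi-speed is {ratio:.0f}x faster than full speed")
```

So a BIOS wrongly set to "full speed" costs you a factor of 40 in raw bus rate, which alone can turn a minute-long module load into most of an hour.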
 
Wow, 30+ minutes is quite a while. That would get annoying for me.


I've got some at work that take 10 minutes but they're far more complex environments (FC attached cluster nodes with a dozen ethernet connections etc). I don't mind them taking 10 minutes.

Ours take around 5-7 minutes, but we have a setup similar to yours. We run ours off of SD cards, so that may be the primary reason it's taking even that long - the cards are probably Class 2 or something. I can't remember since it's been a while since I actually installed them.

But we also have some specific drivers that have to be loaded and it has FC cards for shared storage...but we aren't purposely trying to reboot these things either. :)
 