Hyper-V takes forever to start/stop

MavericK

Hello,

Just starting to play with Hyper-V for the first time. I've got a rig up and running to experiment with, and regardless of which VM I try to spin up, it takes 10-20 minutes to start or stop. Is this normal? I wouldn't say the machine is particularly slow, but it's not dedicated server hardware, either.

Rig is a 4790K, 32 GB RAM, 1.5 TB RAID1 running Windows 10 Enterprise.

It's a brand new OS install. I'm just not sure what to look for. I've checked the event logs and nothing indicates a specific issue. I don't see any excessive resource usage, either.
 
Is the VM fresh as well? Definitely not normal, boot times are like normal PC startup times on my Hyper-V hosts.
 
Yeah, it is. I wonder if it has something to do with it being RAID1? They aren't slow drives, but I keep reading that RAID10 or SSDs are the way to go. I really can't imagine it should take as long as it does to start/stop the VMs, though.

EDIT: Worth noting that once the VMs are running, they perform just fine and aren't taxing the system much at all. It's only the insane start/stop times.
 
What is the file system on the raid 1?

If it is ReFS, then you need to disable integrity streams on the Hyper-V folder and its child items. Integrity streams aren't designed to work on persistently open files and cause major performance issues for them. Once the streams are disabled, you also need to copy the affected files and delete the originals, because integrity streams are a persistent setting applied to files at creation.

NTFS is still the recommended file system for Hyper-V, but ReFS works just as well so long as integrity streams are off (I'm using ReFS on my Server 2016 Hyper-V volumes).
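For reference, the check and the fix described above can be done with the `Get-FileIntegrity`/`Set-FileIntegrity` cmdlets from the Windows Storage module. A rough sketch, assuming the VMs live in `D:\Hyper-V` (that path is just an example, substitute your own storage folder):

```powershell
# Check whether integrity streams are enabled (only meaningful on ReFS).
Get-ChildItem 'D:\Hyper-V' -Recurse |
    ForEach-Object { Get-FileIntegrity -FileName $_.FullName }

# Disable integrity streams on the folder and all child items.
Set-FileIntegrity -FileName 'D:\Hyper-V' -Enable $false
Get-ChildItem 'D:\Hyper-V' -Recurse |
    ForEach-Object { Set-FileIntegrity -FileName $_.FullName -Enable $false }

# Note: the setting is fixed when a file is created, so after disabling it
# you still need to copy the existing VHDs/config files to new files and
# delete the originals for the change to actually take effect.
```

On an NTFS volume these cmdlets simply report that integrity is unsupported, so running the check is harmless either way.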
 

It's NTFS, but thanks for the tip. I'm going to try fully uninstalling and reinstalling Hyper-V to see if that makes any difference. Failing that, I may just try to get a RAID 10 or even RAID 0 running. Not terribly worried about data integrity since I am just using it as a test bench.
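If it helps anyone following along, on Windows 10 the Hyper-V role can be fully removed and re-added from an elevated PowerShell prompt rather than clicking through the Features dialog. A minimal sketch (a reboot is typically required after each step):

```powershell
# Remove the Hyper-V role entirely.
Disable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-All

# ...then, after a reboot, add it back:
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-All
```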
 
RAID1 should not be an issue. I run most of my VMs on RAID1 volumes on smaller server builds with no issue.
 
Yeah, for kicks I just tried Hyper-V on my main rig (in sig) as it's very similar, using a single HDD rather than the SSD, and the boot time was nearly instant. So something else is going on with the test lab machine.
 
So, I may have figured out the issue, or at least it seems to be working now. I did a full Win 10 refresh on the host machine, reinstalled Hyper-V and also updated my NIC driver (sometimes it would hang trying to create an External virtual switch). Now the switch adds fine and the VMs seem to boot almost instantly. Weird!
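For anyone hitting the same hang when creating an External virtual switch, it can also be done from PowerShell, which sometimes gives a clearer error than the GUI. A sketch, where `'Ethernet'` is an example adapter name (check yours first):

```powershell
# List physical adapters to find the NIC to bind the switch to.
Get-NetAdapter

# Create an External switch bound to that NIC, sharing it with the host OS.
New-VMSwitch -Name 'External' -NetAdapterName 'Ethernet' -AllowManagementOS $true
```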
 