ESXi 5 with an IBM m1015

MattGG (n00b · Joined Aug 19, 2010 · Messages: 11)
Setup:
CPU: Core i7-2600
Mobo: ASRock Extreme4 Gen3
RAM: 16GB DDR3
HDD: 6x 2TB Seagate Green (already has ZFS courtesy of FreeNAS)
RAID card: IBM M1015

So I recently purchased an M1015 from eBay, as recommended to me, and since it arrived I have hit a wall trying to get ESXi 5 to install. Before I had this card I had no problems installing. It hangs on "Loading module megaraid_sas ...".

I just tried to install ESXi 4.1 Update 2 (the IBM customization, just to see what happens) and it hangs in the exact same spot. I'm new to virtualization, but does anyone have any advice?
 
Try installing with the HBA pulled out. I assume you will pass it through to FreeNAS using VT-d? If so, you will need another drive (on a mobo SATA port?) for the ESXi local datastore (i.e., this is an all-in-one?). If not, it's not clear to me how the existing ZFS data pool will be useful in any way, since ESXi won't grok it.
 
I wonder if someone had previously tried to flash that card and gave up when they couldn't finish. That sounds very similar to the problems I had when I was having trouble flashing the firmware/ROM back on after I cleared it off.

I since got it working by buying a different motherboard that let me flash the card properly, and then it worked in the original system again.
 
I would start by updating to the latest firmware; I had a similar problem and that was the fix.
 
Thanks for all the replies. I had already flashed it to the latest firmware. I was, however, able to pull out the M1015, install ESXi 5, and then put the HBA back. I was even able to mark it for passthrough and install FreeNAS as a guest OS. I then edited its settings in vSphere and passed it through as a PCI device. After all that, though, FreeNAS sees no disks.

I did notice that the RAID card, while booting up, shows 5 JBOD drives and 1 unconfigured bad drive. It should be 6 JBOD. Is the RAID card configuration blocking my virtual FreeNAS from seeing the disks?

With no prior experience with RAID, I'm afraid to "configure" the RAID card for fear of losing all the data I have on the ZFS volume.
 
More info: I've been using reverse breakout cables from the backplane (I have an RPC-4220) to the motherboard while booting FreeNAS from a thumb drive. I just tried to boot FreeNAS from the thumb drive using the HBA and no breakout cables... it did not see any drives. So this leads me to believe the fault lies in the RAID card configuration. Thoughts?
 
Yeah, I would try flashing the card to IT mode using the instructions found here:

http://hardforum.com/showpost.php?p=1037513464&postcount=8

If you have issues with a PAL error, try flashing on a different motherboard. Afterwards, go into the card's BIOS (Ctrl+C) and make sure to select "Disabled" where it asks if you want the disks to be presented to the BIOS. Then reboot and give that a shot.
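For anyone following along, the crossflash sequence in that guide typically goes along these lines (a rough sketch from memory, run from a DOS boot disk; file names like sbrempty.bin, 2118it.bin, and mptsas2.rom come with the guide's download and may differ, and the SAS address on the last line should be the one printed on your card's sticker):

```text
megarec -writesbr 0 sbrempty.bin          (wipe the SBR on controller 0)
megarec -cleanflash 0                     (erase the existing IR firmware)
(reboot back into the DOS disk)
sas2flsh -o -f 2118it.bin -b mptsas2.rom  (flash LSI 9211-8i IT firmware plus boot ROM)
sas2flsh -o -sasadd 500605bxxxxxxxxx      (restore the card's own SAS address)
```

If you don't need to boot from the card, you can skip the `-b mptsas2.rom` part and the controller will initialize faster with no boot ROM at all.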
 
You should be using forward breakout cables.

I had mistakenly bought those first and they didn't work. I then found out I needed reverse breakout because I'm not connecting the HBA directly to the disks. The disks plug into a backplane as part of the case I have (RPC-4220), which connects either to the mobo SATA ports (reverse breakout) or to the HBA (regular SFF-8087 cable).
 

Thanks, I will try this after work today.
 
Did you try putting the card into a different PCI Express 2.0 slot?

Set onboard SATA to disabled or IDE mode.

good luck
 
I had this happen on two boards recently and all I had to do was change a setting in the BIOS from "legacy ROM" to "EFI-compatible ROM" - voila, boots fine.
 
That was one of the first things I read, but it didn't do anything for me.

I have flashed the M1015 into IT mode. It only took me two motherboard tries! But now the virtual FreeNAS doesn't boot all the way.

It hangs saying:
run_interrupt_driven_hooks: still waiting after 60 seconds for xpt_config mps_startup
run_interrupt_driven_hooks: still waiting after 120 seconds for xpt_config mps_startup
run_interrupt_driven_hooks: still waiting after 180 seconds for xpt_config mps_startup
.
.
.

I was never able to press Ctrl+C to enter the card's BIOS. It says to press Ctrl+C, but it doesn't do as I say!
 
I saw that message myself. Google seemed to indicate the current mps driver is borked. I gave up.
 
Small update: I was able to boot FreeNAS from the thumb drive using the HBA and no breakout cables. It even says the volume is healthy. I wasn't able to do this before the IT mode flash. So this tells me it has something to do with ESXi passing the HBA to the virtual FreeNAS, right?
 
So you're saying that you booted directly to FreeNAS bypassing ESXi and all was good? If that's the case, perhaps your motherboard is having an issue with VT-d. Have you googled around for other people using your board + VT-d + ESXi?
 

MattGG:

See my post here: http://hardforum.com/showthread.php?p=1038483037#post1038483037 for solving your FreeNAS and SAS2008 card problem when running under ESXi.

In short:
Remove the card from the VM.

Boot the FreeNAS VM.

Edit loader.conf and add the following to it:

hw.pci.enable_msi="0" # Driver interrupts problem, SAS2008
hw.pci.enable_msix="0" # Driver interrupts problem, SAS2008

Reboot the VM and make sure it boots OK.

Shut down the VM and add the LSI card back to the VM.

Boot the VM and all should be fine.
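For reference, the edit from the FreeNAS console might look roughly like this (a sketch: on a stock embedded FreeNAS 8 install the root filesystem is mounted read-only, so you would run `mount -uw /` first, edit /boot/loader.conf, then `mount -ur /`; the LOADER_CONF variable below is just a stand-in path so the snippet can be tried safely outside the VM):

```shell
#!/bin/sh
# Append the SAS2008 MSI/MSI-X tunables to loader.conf, but only once.
# On the real system: mount -uw /, point this at /boot/loader.conf, then mount -ur /.
LOADER_CONF="${LOADER_CONF:-./loader.conf.demo}"
if ! grep -q 'hw.pci.enable_msi=' "$LOADER_CONF" 2>/dev/null; then
    printf '%s\n' \
        'hw.pci.enable_msi="0"   # Driver interrupts problem, SAS2008' \
        'hw.pci.enable_msix="0"  # Driver interrupts problem, SAS2008' \
        >> "$LOADER_CONF"
fi
cat "$LOADER_CONF"
```

The grep guard keeps the tunables from being appended twice if you run it again.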


As for your LSI card BIOS not showing up:
Are you using a consumer motherboard or a server-grade motherboard?

I have found the following:

Pressing Ctrl+C on a server-grade motherboard works flawlessly; it invokes the LSI BIOS every time.

When I flashed my M1015/Dell H200 cards to LSI IT firmware, though, I encountered the same problem you are encountering: pressing Ctrl+C does not invoke the BIOS.

It seems to be a problem where consumer Abit/Asus/Gigabyte motherboards do not map the LSI BIOS into the correct spot in memory; they map it after the default boot drive point.

In my case it certainly was not the LSI card's fault.

If you remove all bootable drives and USB devices from your PC, start it up, and press Ctrl+C at the LSI prompt, the card's BIOS will be shown when the PC attempts to boot; it executes the jump into the card BIOS after it finds no bootable drives.

Or try the following:

When your motherboard boots, select the option that lets you choose a boot device from the CD-ROM/hard drive list (F11 on mine); then, when the LSI card initialises, press Ctrl+C. Wait for your motherboard to display the boot drive selection menu and choose the LSI card.

That should also allow you to jump into the LSI card's BIOS.

Good Luck.
 

Thank you very much! I edited loader.conf and that did it; it works perfectly! Granted, I had to set it up as a single-core machine. I originally had it as a single-socket dual-core emulation, but after the loader.conf edit there were a bunch of interrupt storm messages on IRQ 18. Google told me to limit it to one CPU. That's OK; it doesn't need a lot of CPU power anyway.
 
I originally had it as a single-socket dual-core emulation, but after the loader.conf edit there were a bunch of interrupt storm messages on IRQ 18. Google told me to limit it to one CPU. That's OK; it doesn't need a lot of CPU power anyway.

You are welcome:

See my post here about multiple CPUs:

http://hardforum.com/showpost.php?p=1038483037&postcount=1

See this part:

Under ESXi 5, if you add more than one CPU to the FreeNAS VM, I encounter "Interrupt Storm on IRQx" (IRQ 18 in my case).
To solve it, boot into the FreeNAS VM's BIOS and disable the floppy drive, COM ports, printer ports, and anything else you are not going to use.
Save changes.
Reboot.
Solved!
 
Thanks for the solution.

I used System -> Add Tunable to add:

hw.pci.enable_msi="0" # Driver interrupts problem, SAS2008
hw.pci.enable_msix="0" # Driver interrupts problem, SAS2008

I also got the IRQ 18 interrupt storm; I solved it by adding another e1000 network adapter and then removing it afterwards.
 