Cannot do S1, S3 or S4 with LSI 9260-8i M5015?

FireDemon · Limp Gawd · Joined Feb 10, 2006 · Messages: 256
Hoping for a little help here. Hopefully it's not just the nature of the beast.

I'm running a ServeRAID M5015 (9260-8i) card updated to the latest firmware (v12.15.0-0248). I currently have a 2x1TB Samsung 870 RAID0 array (expandable to 6 drives) and will be adding a 2x8TB Barracuda Pro RAID1 array next week. The card/driver is not properly resuming from standby or hibernate states. For S1/S3, the system enters standby fine but crashes on resume. I was first using the out-of-box Windows driver, which identified the card as an Avago and was released in 2015. With that driver the system would resume, but the card would show an error in Device Manager, and after about 60 seconds the system would bluescreen with "DRIVER_POWER_STATE_FAILURE". I updated to the IBM ServeRAID M5015 driver released in 2018, which operates the card properly, but now the system does not resume from S1/S3 at all: you get a blank screen for a few moments on wake, then the whole thing resets and begins the POST cycle.

When attempting an S4 hibernate, the system powers down and resumes properly; however, on wake the ServeRAID BIOS is not invoked during POST. This results in a longer-than-normal resume at the Windows logo, and once Windows finishes resuming from S4, the card shows up in Device Manager with an error and any arrays/drives connected to it are not identified until the next full reboot. I verified it was the card by removing it from the system; everything functions normally in its absence. Fully shutting down the system or restarting with the card installed gives no issues.

System specs:

Asus Crosshair VIII Hero
Ryzen 5800X
32GB Corsair Vengeance LPX
Asus RTX 3070 Dual OC
Cudy WiFi 6 AX200 Bluetooth/WiFi card
Windows 10 Pro 64 bit
500GB NVMe Samsung 970 for OS drive


Is there a fix for this? Or is it the nature of the beast with this type of card?
 
Maybe don't use sleep on a server RAID card. It's really not supported because of how the card works: it has its own mini PC and OS and doesn't like going to sleep, since it's designed for 24/7 operation. You were lucky it worked at all in the past.

Just shut down the computer (it's only about 60 seconds longer to get past the RAID card's start-up).
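If you want to make sure Windows never tries the problematic sleep states at all, a hedged sketch using the standard `powercfg` utility (run from an elevated prompt; the timeout values here are illustrative, not from the thread):

```shell
:: List which sleep states (S1-S4) this system supports and has enabled
powercfg /a

:: Disable hibernate entirely, removing S4 (and hybrid sleep) as options
powercfg /hibernate off

:: Set the sleep timeout to 0 (never) on AC power in the active plan,
:: so the machine won't drop into S1/S3 on its own
powercfg /change standby-timeout-ac 0
```

After this, the machine can only be powered off or rebooted, which sidesteps the resume problem entirely.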
 
I would be OK with not using S1 or S3. The part that really confuses me is S4. I discovered that roughly 15-20% of the time it resumes normally from an S4 state; the majority of the time, what I described in the OP happens. The card's event log does show "Hibernate command received from host", though, which makes me think there's something wrong with the card.
 
None of these devices were intended for sleep states of any kind - that doesn't happen with a server.
 
Agreed but FYI, I upgraded the card to a 1GB M5016 (9270CV-8i).

With the latest firmware and drivers, S4 is no longer an issue. S1 and S3 are still a no-go, but that's not a huge deal given the circumstances.

Not sure why resuming from S4 would only work about 25% of the time with the M5015; the rest of the time it would just skip the BIOS and fail to boot properly.
 