Why can't Windows Server 2016 find the hard drives after loading RAID card driver?

I went to install Windows Server 2016 on my Gigabyte 7PESH3 system, which is housed in a Supermicro 4U/pedestal SC747-1400B-TX chassis, and I still cannot get an operating system installed on it. During the partitioning step of the Windows installer, I load the driver (Load driver > Browse > USB drive > Windows 10 and Server 2016 RAID driver folder > starport64, then select either the Areca Starport-64 SATA RAID Host Adapter or the Areca Virtual S/8I RAID driver), but the installer still can't find the drives.

I have a similar problem with Linux, except that Linux is supposed to find the driver on the USB flash drive automatically, for the most part, and it can't because I don't know where to put it. No Linux distro's documentation tells you how or where; they only cover software RAID, which doesn't help. Also, everyone on here just tells me RAID complicates things, and that's not helping either. So basically I have a $3,000+ 4U server that I can't get any use out of because I can't get a server operating system installed on it. Can anyone help, or do I need to try Microsoft's forum for Windows Server 2016 and the appropriate distro's forum for Linux?
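For what it's worth on the Linux side, the RHEL/CentOS installer does have a documented place to look for a driver: a driver update disk, pointed at with the `inst.dd` boot option. A sketch, assuming the vendor ships the driver as a .iso (the label and file name here are examples, not real paths):

```shell
# At the installer boot menu, edit the kernel line (Tab on BIOS,
# 'e' on UEFI) and point inst.dd at the driver update disk.
# LABEL and the .iso path are examples -- match your own stick.
vmlinuz ... inst.dd=hd:LABEL=DRIVERS:/arc1264-dud.iso

# Alternatively, label the USB stick OEMDRV; anaconda then finds
# and loads a driver update disk from it automatically.
```

Either way the driver just sits on the stick's filesystem; there is no special directory to place it in.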
 
Is it a nice new shiny driver from the manufacturers' website or off a disk that came in the box?
 
Is it a nice new shiny driver from the manufacturers' website or off a disk that came in the box?

The driver is from the manufacturer's website and was placed on a USB flash drive for the Windows installer to read from, which it does. However, after loading the driver from that source it still says it cannot find any drives. The drivers on the disc that came in the box are probably outdated and obsolete by now, so of course I'm not using those.
 
I would be sending an email to the RAID manufacturer and see if they have a work around or if there is some other known issue.

I've seen lots of peeps use the disk that came with hardware too many times not to ask that question.
 
Stupid question but... you already set up the array in the RAID config?
 
Stupid question but... you already set up the array in the RAID config?

Yes, I think I did, but I don't remember if it could find the drives either, and yes, the RAID card is connected to the backplane of the hot-swap bays.
 
I would be sending an email to the RAID manufacturer and see if they have a work around or if there is some other known issue.

I've seen lots of peeps use the disk that came with hardware too many times not to ask that question.

Sounds like a good idea, but either I did this before and they didn't reply, or I didn't because I tried to call them and couldn't find the number. Either way, I'm going to send them an email now.
 
Yes, I think I did, but I don't remember if it could find the drives either, and yes, the RAID card is connected to the backplane of the hot-swap bays.

wait... so you can't remember if you even set up any of the disks on the RAID card, and you can't even remember if the RAID card even detected the disks. Yet you're troubleshooting a Windows install?

Might want to verify some of those things before you even get to installing Windows.
 
^ what he said.

You need to jump into the disk controller setup during POST, it will tell you the keys to press (Control-G or something).
Then verify that the controller can see the drives and verify the health of the RAID array virtual drive if you created one.

.
 
Stupid question but... you already set up the array in the RAID config?

I checked, and yes, there is a RAID array set up in the config/firmware. However, drive 1 was free out of the 4 drives, so I'm using it as a pass-through, if that's OK.
 
Stupid question but... you already set up the array in the RAID config?

I tried to install an operating system again using drive 1 as a pass-through volume, but neither CentOS 7.2 nor Windows Server 2016 would install because they can't find a hard drive.
 
Try this... it's the long way around but it works.

Install the OS on a single drive, even from a normal SATA port on the mobo.
Once you have the OS running, add the drivers for the RAID controller and make sure the OS sees it.

Image the OS, and then restore it to the RAID array. It should boot right up.

.
 
might also try a linux live boot.
help determine if the issue is with the Windows drivers or something else.
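To make that live-boot check concrete: /proc/partitions lists every block device the kernel has detected, so if the Areca volumes don't show up there the problem is below the installer entirely. A minimal sketch (this parsing helper is mine, not part of any distro tool):

```python
def detected_disks(partitions_text):
    """Return whole-disk names (e.g. 'sda') from /proc/partitions text,
    skipping the header and numbered partition entries like 'sda1'.
    Rough filter: names ending in a digit are treated as partitions,
    which holds for the sd* names SATA/SAS disks get."""
    disks = []
    for line in partitions_text.splitlines()[2:]:  # skip header + blank line
        fields = line.split()
        if len(fields) == 4 and not fields[3][-1].isdigit():
            disks.append(fields[3])
    return disks

# On a live system: detected_disks(open("/proc/partitions").read())
```

If that list is empty apart from the USB stick, no driver fiddling in the installer will help; the controller or cabling needs attention first.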
 
This happened with a very old HP Athlon64 system with an nvidia chipset when I was trying TP5.. I gave up because there weren't a lot of drivers to choose from. 2016 probably drops support for some raid controllers...
 
Check and see if there are 2012 drivers for the disk controller.
Try the 2012 drivers if you can't get the 2016 drivers working.

I had to use 2008 disk controller drivers on my 2012 server.

.
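If Setup keeps refusing the driver from the Load Driver dialog, another route (a sketch; the mount directory and driver folder are example paths) is to inject the storage driver into the install media's boot image with DISM from another Windows machine:

```shell
:: Mount the Windows Setup boot image (index 2 is the Setup environment).
dism /Mount-Wim /WimFile:D:\sources\boot.wim /Index:2 /MountDir:C:\mount

:: Add every driver found under the folder (example path) recursively.
dism /Image:C:\mount /Add-Driver /Driver:C:\drivers\areca /Recurse

:: Commit the change back into boot.wim.
dism /Unmount-Wim /MountDir:C:\mount /Commit
```

With the driver baked into boot.wim, Setup should see the controller as soon as it starts, with no manual loading step.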
 
I checked and yes there is a RAID array setup in the config or firmware. However, drive 1 was free out of the 4 drives, so I'm using it as a pass-through if that's ok.

Why are you using a separate RAID card for passthrough when that board has an LSI 2008 that you can use for it?

Can you list exactly what cards you have in the board and what ports all of your drives are plugged into?
 
wait... so you can't remember if you even set up any of the disks on the RAID card, and you can't even remember if the RAID card even detected the disks. Yet you're troubleshooting a Windows install?

Might want to verify some of those things before you even get to installing Windows.

Yeah, I sort of forgot whether I even set up a RAID volume; I didn't document it until now. And yes, I'm troubleshooting a Windows installation too, but this is the strangest error I've ever encountered: the installation media is detected, and the RAID card's firmware setup interface can see that the drives are there. Yet no operating system can detect the drives, not even Windows, which lets me manually browse for and load the driver during installation. Linux requires putting the driver in the proper directory, or extracting it to a flash drive correctly so the installer knows where to look, or writing an ISO to a flash drive to do the same thing. No matter what I do, neither Windows nor Linux can find the drives.
 
Why are you using a separate RAID card for passthrough when that board has an LSI 2008 that you can use for it?

Can you list exactly what cards you have in the board and what ports all of your drives are plugged into?

I'm using a separate RAID card for pass-through because I don't know how to connect the SFF-8643, or whatever type of SAS-to-SATA connector it is, to the SATA connector on my backplane, but I do know how to connect the SFF-8086/SFF-8087 mini-SAS connector to it because it's included with the card.

Also, I've been told that software RAID is not as reliable as hardware RAID, even though I've also been told that I'm using a software RAID card, because it requires a driver and operating systems don't just detect the RAID card. Whatever, though; I paid about $400 for the card, and it seems more advanced and has a more elaborate setup than software RAID cards from companies like SYBA, even if that still doesn't make it a hardware RAID card.

The following is the only reasoning I can find in the book we used for one of my Linux/Unix classes, but I think it might be saying hardware RAID is bad, if not both, depending on the situation:


"The latter feature does improve performance because it makes writes appear to complete instantaneously. It also protects against a potential corruption issue called the “RAID 5 write hole,” which we describe in more detail starting on page 241. But beware: many of the common “RAID cards” sold for PCs have no non-volatile memory at all; they are really just glorified SATA interfaces with some RAID software onboard. RAID implementations on PC motherboards fall into this category as well. You’re really much better off using the RAID features in Linux or OpenSolaris on these systems."


Nemeth, Evi; Snyder, Garth; Hein, Trent R.; Whaley, Ben (2010-07-14). Unix and Linux System Administration Handbook (Kindle Locations 6386-6390). Pearson Education. Kindle Edition.
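For reference, the "RAID features in Linux" the book recommends are the kernel's md driver, managed with mdadm. A minimal sketch of a 3-disk RAID 5, with placeholder device names (and note the config file path varies by distro):

```shell
# Build a software RAID 5 array from three whole disks (placeholders).
mdadm --create /dev/md0 --level=5 --raid-devices=3 /dev/sdb /dev/sdc /dev/sdd

# Watch the initial resync and confirm the array assembled.
cat /proc/mdstat

# Record the array so it reassembles at boot (path is distro-dependent).
mdadm --detail --scan >> /etc/mdadm.conf
```

The resulting /dev/md0 is then partitioned and formatted like any single disk, and the array is readable by any distro with the md driver.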
 
Yeah, I sort of forgot whether I even set up a RAID volume; I didn't document it until now. And yes, I'm troubleshooting a Windows installation too, but this is the strangest error I've ever encountered: the installation media is detected, and the RAID card's firmware setup interface can see that the drives are there. Yet no operating system can detect the drives, not even Windows, which lets me manually browse for and load the driver during installation. Linux requires putting the driver in the proper directory, or extracting it to a flash drive correctly so the installer knows where to look, or writing an ISO to a flash drive to do the same thing. No matter what I do, neither Windows nor Linux can find the drives.

I'm sorry, but you're completely lost. If you configured a RAID in the firmware, you now have one drive (as far as the operating system sees it) that is an array of n drives. Also, if your card has a firmware configuration utility, it's most likely not a fake-RAID card and won't need any special drivers to operate. Linux doesn't 'require' you to do anything with the drivers except have them on any suitable filesystem, i.e. a USB drive.

Either your device is not working, you configured the hardware wrong, or you simply don't know what you're doing.
 
I'm sorry, but you're completely lost. If you configured a RAID in the firmware, you now have one drive that is an array of n drives. Linux doesn't 'require' you to do anything with the drivers except have them on any suitable filesystem, i.e. a USB drive.

Whatever, B00nie, it won't read them correctly even if that's all it requires. And yes, I'm a little lost about it, because the procedure isn't in the CentOS documentation the way software RAID is, and I can't find any better help elsewhere.
 
Check and see if there are 2012 drivers for the disk controller.
Try the 2012 drivers if you can't get the 2016 drivers working.

I had to use 2008 disk controller drivers on my 2012 server.

.

There's a thought, but I'll have to get them from Areca because I don't think I have them. I think I tried Server 2012 with the 2012 drivers, with no luck, but OK: try an older Windows version's driver with a newer version of Windows. That doesn't sound like a good idea, though, and makes very little sense to me.
 
This happened with a very old HP Athlon64 system with an nvidia chipset when I was trying TP5.. I gave up because there weren't a lot of drivers to choose from. 2016 probably drops support for some raid controllers...

No, that's not the issue, because Areca has a Windows 10/Server 2016 driver, but once it's loaded the installer still can't detect the drives.
 
Whatever, B00nie, it won't read them correctly even if that's all it requires. And yes, I'm a little lost about it, because the procedure isn't in the CentOS documentation the way software RAID is, and I can't find any better help elsewhere.

Your problem is that you're reading textbooks without understanding the big picture. What exactly are the make and model of your RAID card? It's possible you've simply misconfigured the hardware so that the backplane is not compatible with the RAID card.
 
^ what he said.

You need to jump into the disk controller setup during POST, it will tell you the keys to press (Control-G or something).
Then verify that the controller can see the drives and verify the health of the RAID array virtual drive if you created one.

.

It's not Ctrl-G, but it is something; I believe it's Tab.
 
Try this... it's the long way around but it works.

Install the OS on a single drive, even from a normal SATA port on the mobo.
Once you have the OS running, add the drivers for the RAID controller and make sure the OS sees it.

Image the OS, and then restore it to the RAID array. It should boot right up.

.

The reason I'm not doing that is the same reason I said I'm using a single drive in pass-through connected to a RAID card, but thanks. Anyway, even for a direct (or close to direct) SATA connection to the drive, I would still need to connect one of the board's SATA ports to the backplane for the hot-swap bays, so maybe I'll give this a try. However, I don't know how, or whether it's even possible, since I don't have the motherboard manual, and even if I get the PDF version I don't believe I have a normal SATA cable. It also defeats the purpose of having a RAID card that can control all the drives connected to the backplane.
 
I don't know your RAID controller, but with some of the RAID controllers I've worked with, you can't use an unconfigured drive the way you can with the "RAID" functionality built into motherboards; you had to configure a RAID array even for a single disk before it would be accessible.

Don't know if that's your issue; just shooting from the hip here.
 
So on your Areca, in the firmware, you have it set to JBOD when you TAB/F3 into the firmware so that's it's just doing passthrough?

Also, why won't you tell us the exact hardware configuration, what Areca card is it and how many disks are plugged into it?

If you want to use the onboard LSI instead of using the Areca with your backplane, you just need to buy a 8087 reverse breakout cable: https://smile.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B018YHS9GM/
 
So on your Areca, in the firmware, you have it set to JBOD when you TAB/F3 into the firmware so that's it's just doing passthrough?

Also, why won't you tell us the exact hardware configuration, what Areca card is it and how many disks are plugged into it?

If you want to use the onboard LSI instead of using the Areca with your backplane, you just need to buy a 8087 reverse breakout cable: https://smile.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B018YHS9GM/

Okay, so you're saying that by using pass-through I'm using JBOD.

Also, I told you the motherboard, case, and Areca RAID card, but you and everyone else didn't read my post carefully enough. I know I didn't tell you the hard drive models, which are Seagate 2TB SSHDs. The graphics card shouldn't matter, but it's an Nvidia Quadro. There are 4 disks; didn't I make that clear enough?

Thanks, but this: https://smile.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B018YHS9GM doesn't help, because I already have it for my RAID card, and it's not what I need to connect the onboard LSI to the backplane; I need the SATA-like SAS-to-SATA connector, whatever it's called.
 
I don't know your RAID controller, but with some of the RAID controllers I've worked with, you can't use an unconfigured drive the way you can with the "RAID" functionality built into motherboards; you had to configure a RAID array even for a single disk before it would be accessible.

Don't know if that's your issue; just shooting from the hip here.

I don't know either, but I have an array set up with pass-through for drive 1 and RAID 5 for drives 2-4, and the disks still can't be detected by the OS.
 
Okay, so you're saying that by using pass-through I'm using JBOD.

Also, I told you the motherboard, case, and Areca RAID card, but you and everyone else didn't read my post carefully enough. I know I didn't tell you the hard drive models, which are Seagate 2TB SSHDs. The graphics card shouldn't matter, but it's an Nvidia Quadro. There are 4 disks; didn't I make that clear enough?

It's getting more frustrating trying to help you because you're not answering questions and you are clearly out of your depth. What Areca card is it, all you've said is it's an Areca, which model?

Thanks, but this: https://smile.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B018YHS9GM doesn't help though because I already have it for my RAID card and this is not what I need to connect the onboard LSI to the back plane considering I need the SATA like SAS to SATA connector, which I don't know what it's called.

This is a reverse breakout cable, you use it to connect 4 of the LSI ports on board to the 8087 port on your backplane. But since we don't know your exact Areca card with what ports it has on it and I can't tell from the TigerDirect picture of your case what ports it has on the back of the hotswap it's all a shot in the dark. What port(s) are on the backplane?
 
There's a thought, but I'll have to get them from Areca because I don't think I have them. I think I tried Server 2012 with the 2012 drivers, with no luck, but OK: try an older Windows version's driver with a newer version of Windows. That doesn't sound like a good idea, though, and makes very little sense to me.


lol.... dude.... it makes perfect sense if there is no driver for the OS you are installing and the OS accepts older drivers. For example on my Dell server, there was no 2012 driver for the on-board RAID controller. The solution was to use the 2008 driver that the 2012 OS accepts and uses perfectly. It works fine that way. Does that make sense?

It also makes perfect sense to use different drivers if the drivers you are trying to use don't work. Sometimes brand new drivers that should work, don't work on your equipment.
So you find drivers that do work. Been there done that many times. Does that make sense?



The reason I'm not doing that is the same reason I said I'm using a single drive in pass-through connected to a RAID card, but thanks. Anyway, even for a direct (or close to direct) SATA connection to the drive, I would still need to connect one of the board's SATA ports to the backplane for the hot-swap bays, so maybe I'll give this a try. However, I don't know how, or whether it's even possible, since I don't have the motherboard manual, and even if I get the PDF version I don't believe I have a normal SATA cable. It also defeats the purpose of having a RAID card that can control all the drives connected to the backplane.

>>I have a normal SATA cable and defeats the purpose of have a RAID card that can control all the drives connected to the back plane.


You're not getting the concept here.... it's a TEMPORARY install not using the RAID controller so that you can at least get a running OS installed on your server.
Once you have the TEMPORARY install running on a plain SATA drive, then you can add the drivers for the RAID controller.

*** It's a big time saver since you are no longer fighting with the OS install. ***

Once that is complete, you can use the free Macrium Reflect imaging software to create a backup image of your working OS install with working RAID drivers.
You can either use a boot disk on the server you are on, or connect the SATA drive to another computer to get the image of it.

When you have the image, you can boot up the TEMPORARY SATA drive on the server again, and use Macrium Reflect to RESTORE the image to the RAID array.
Then you shut down, remove the plain SATA drive, and the server should boot from and use the RAID array. Got it?

I'm an IT guy, been in the electronics and IT biz for around 30 years or so. I've used these methods, "they make sense", and they work.

You're getting good advice here from people who know what they're doing.
It's probably a good idea to listen to the advice and try to be a bit more polite when responding to people that are helping you.

ETA: See post #28. Don't try to use your single drive on the RAID controller. Connect it to the mobo SATA.


.
 
I hesitate to even jump into this can of worms, but I feel compelled for some reason.

While there are a number of things that could cause what you are experiencing, what hasn't been mentioned is whether you have the BIOS configured for UEFI or Legacy. If your motherboard BIOS is in UEFI mode, you need a UEFI-compatible BIOS on your RAID cards too. If you don't have a UEFI BIOS on your RAID cards, you need to set your motherboard to Legacy or UEFI + Legacy. You should also disable any onboard storage controllers you aren't using, and you need to make sure your add-in Areca card is set to be bootable in your motherboard BIOS. Finally, if you are installing via UEFI, you also need to boot your Windows installation media via UEFI. Mixing and matching these UEFI/Legacy settings will cause exactly what you are describing (assuming all your hardware is physically installed and otherwise working correctly).
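One quick way to verify half of that from a Linux live boot: the kernel creates /sys/firmware/efi only when the system was booted via UEFI. A small sketch (the helper name is mine, not a standard tool):

```python
import os

def booted_via_uefi(sysfs="/sys"):
    """True if this Linux system booted via UEFI: the kernel exposes
    /sys/firmware/efi only on UEFI boots, never on legacy BIOS boots."""
    return os.path.isdir(os.path.join(sysfs, "firmware", "efi"))

# On a live system: booted_via_uefi() -> True on UEFI, False on legacy.
```

That tells you how the live media itself was booted, which is the same distinction the installer media faces.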

Sorry to say this to you, but as others have alluded to, you do seem to be more than a little bit over your head on this one.
 
It's getting more frustrating trying to help you because you're not answering questions and you are clearly out of your depth. What Areca card is it, all you've said is it's an Areca, which model?



This is a reverse breakout cable, you use it to connect 4 of the LSI ports on board to the 8087 port on your backplane. But since we don't know your exact Areca card with what ports it has on it and I can't tell from the TigerDirect picture of your case what ports it has on the back of the hotswap it's all a shot in the dark. What port(s) are on the backplane?

Sorry, I thought I told everyone it was an Areca ARC-1264il-12, but apparently not; that's what it is. A breakout cable is not needed for the card, as it's already using one to connect from the card to the backplane, and it won't work for connecting the motherboard to the backplane because the motherboard doesn't have a mini-SAS connector, only SATA.
 
lol.... dude.... it makes perfect sense if there is no driver for the OS you are installing and the OS accepts older drivers. For example on my Dell server, there was no 2012 driver for the on-board RAID controller. The solution was to use the 2008 driver that the 2012 OS accepts and uses perfectly. It works fine that way. Does that make sense?

It also makes perfect sense to use different drivers if the drivers you are trying to use don't work. Sometimes brand new drivers that should work, don't work on your equipment.
So you find drivers that do work. Been there done that many times. Does that make sense?


Using different/older drivers worked, Spartacus, so there's no need to temporarily bypass the RAID card and connect the drive to a SATA controller on the motherboard. Thanks.

If only I could get CentOS to install, too, but that's for another thread.


>>I have a normal SATA cable and defeats the purpose of have a RAID card that can control all the drives connected to the back plane.


You're not getting the concept here.... it's a TEMPORARY install not using the RAID controller so that you can at least get a running OS installed on your server.
Once you have the TEMPORARY install running on a plain SATA drive, then you can add the drivers for the RAID controller.

*** It's a big time saver since you are no longer fighting with the OS install. ***

Once that is complete, you can use the free Macrium Reflect imaging software to create a backup image of your working OS install with working RAID drivers.
You can either use a boot disk on the server you are on, or connect the SATA drive to another computer to get the image of it.

When you have the image, you can boot up the TEMPORARY SATA drive on the server again, and use Macrium Reflect to RESTORE the image to the RAID array.
Then you shut down, remove the plain SATA drive, and the server should boot from and use the RAID array. Got it?

I'm an IT guy, been in the electronics and IT biz for around 30 years or so. I've used these methods, "they make sense", and they work.

You're getting good advice here from people who know what they're doing.
It's probably a good idea to listen to the advice and try to be a bit more polite when responding to people that are helping you.

ETA: See post #28. Don't try to use your single drive on the RAID controller. Connect it to the mobo SATA.


.

This might work, but it's just weird. I like the advice about using an older Windows Server 2012 driver better because it worked, it was a lot easier, and there was no need to crack the case.
 
I hesitate to even jump into this can of worms, but I feel compelled for some reason.

While there are a number of things that could cause what you are experiencing, what hasn't been mentioned is whether you have the BIOS configured for UEFI or Legacy. If your motherboard BIOS is in UEFI mode, you need a UEFI-compatible BIOS on your RAID cards too. If you don't have a UEFI BIOS on your RAID cards, you need to set your motherboard to Legacy or UEFI + Legacy. You should also disable any onboard storage controllers you aren't using, and you need to make sure your add-in Areca card is set to be bootable in your motherboard BIOS. Finally, if you are installing via UEFI, you also need to boot your Windows installation media via UEFI. Mixing and matching these UEFI/Legacy settings will cause exactly what you are describing (assuming all your hardware is physically installed and otherwise working correctly).

Sorry to say this to you, but as others have alluded to, you do seem to be more than a little bit over your head on this one.

Thanks, because this may be true, but I think I do have UEFI enabled on both the card and the motherboard. I'll check, though.
 
>>I like your advice about using an older Windows Server 2012 Driver because it worked when I did that alot easier and there was no need to crack the case.

Wait..... the 2012 driver worked?

.
 
scharfshutze009 in waaaay over his head again!

I don't know what it is with you and raid, but my gawd, have you ever got it to work? On any OS apart from Windows?

There is nothing wrong with software raid under Linux. When your hardware raid solution fails and your card or even the controller on the card isn't made anymore and your backups turn to shit, you'll be cursing that hardware raid configuration. The beauty of a software raid configuration under Linux is that your array will pretty much be visible on any distro.

How can you not at least know the specifics of your hardware? Can you upload some pictures that people can use as a guide for assistance?
 
Sorry, I thought I told everyone it was an Areca ARC-1264il-12, but apparently not; that's what it is. A breakout cable is not needed for the card, as it's already using one to connect from the card to the backplane, and it won't work for connecting the motherboard to the backplane because the motherboard doesn't have a mini-SAS connector, only SATA.

With a reverse breakout cable you go from the 4 SATA ports on your motherboard to the 8087 SAS port on the backplane. It sounds like you don't have an 8087 SAS port on the backplane, just SATA connections? If that's the case, you just use SATA cables to wire it up.

Thanks for the card info; we know it uses 8087 SAS connectors. I think this is your backplane: https://www.thomas-krenn.com/en/wiki/BPN-SAS-747TQ_SAS/SATA_Backplane. If so, you should be able to run normal SATA cables directly from your motherboard to the backplane.
 