3.2TB SSD for $240

Hi,
I have the HP 6.4TB version and I cannot attach the device. I get the error "Failed to read from or write to the device". How can I fix it?
What driver did you try to run? I remember with the 6.4TB, the one I got was a NIB 2017-era Lenovo-branded SX350 model. I seem to recall it wouldn't work with one of the available drivers I tried, unlike the 3.2s; I had to use one prior generation to get it operational before the latest firmware could be added, or something along those lines. I don't remember if I used the Lenovo driver or WD's. It's obviously been over a year since I did this, so I don't remember all that well, as I haven't had to touch it since.
 
I tested the 4.3.4 and 4.3.3 from HP and the 4.3.7 from WD. The device is from eBay, and I think the previous owner flashed it to the latest firmware...
 
I want to say I used one version previous of the Lenovo- or WD-sourced driver, but that was before the firmware was flashed. Try the one-previous WD version if you can for the moment, and I'll try to backtrack how I installed mine.

I dunno if you're in a return window, but I'm kind of in a situation with the two computers I used these in, so I don't know how fast I can get to them. The 6.4 is in my upstairs media server/gaming computer; I recently got a 3090 for it, so I'm re-casing it for better ventilation, but I'm waiting on some PSU extension cables, which won't be here until Monday or Tuesday. The 3.2s are in my Threadripper, and I'm currently in the process of building a RAID, but my drives won't all be here until Thursday; I've got the bays all hooked up where I normally have it plugged in, though.
 
For anyone who needs drivers for HP or Dell/SanDisk, I put two installers in a zip here:


These two installers have worked for every FusionIO I've tried: one 1.3TB Dell, one 3.2TB SanDisk, and two 6.4TB HPs.

If you don't trust my upload: for HP it's the HPE_VSL-4.3.4_Windows package (use the storepoint installer); for the rest it's the Dell_IO_Management_3.2.15.1699_x64.
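
Either way, once an installer finishes you can sanity-check it from an elevated command prompt; if the package matches your card, the first stanza of the output should report the driver as loaded:

Code:
        :: Run from an elevated Command Prompt after installing.
        :: "Driver: loaded" means the package took; "driver not loaded"
        :: usually points to a driver/firmware version mismatch.
        fio-status -a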
 
My PC crashed from CPU overheating during a stress test after I installed my setup in a new case, and when it came back up, the Fusion IO 3.2TB drive was no longer recognized. I tried running the fio-status -a command and it says the drivers need to be loaded.



I uninstalled and reinstalled the drivers with no success. I have a single solid orange light on the back of the card, which indicates what I already know.




I'm quite sure the 3.2TB drive didn't get too hot, as I've got two 200mm fans blowing directly up onto it in my new Cooler Master SL600M case. I guess I'll try reseating the card or resetting my BIOS or something, but right now the card isn't being recognized. I did get a Windows update recently, and I'm half wondering if it's that instead.



Can't do much of anything if I can't get the drivers to load.
 
I think either it died or it's a driver issue. Have you tried the drivers I linked in a previous post?

Might want to try it in a different windows install or even better another system if possible.
 
I would try your idea of reseating the card as well. Also, you could roll back the last Windows update (do a shadow copy restore) and see if the behavior changes.
 
I got it working again. As I mentioned, my CPU had crashed due to heat while overclocking during a purposeful stress test. I had skipped the health scan at reboot, and it wasn't warning me again. The full solution was to uninstall all the drivers, reseat the card in a different PCIe slot, and reinstall the drivers. Then the card was recognized by Windows and I could kick off a health scan. Once the health scan had run, I moved the card back to the original PCIe slot, and it works normally. I'm not really sure all of that was necessary, or which single piece fixed it, but anyway, it's working now and is recognized again via the admin command prompt and drivers.
 
Ah, yes. Skipping the health scan disables the entire drive, though it usually prompts you after another reboot.

Glad you were able to fix it though.
 
Just wanted to mention this since I ran into an issue. It seems running the health scan does do something: after a couple of crashes that BSODed the machine, it had to do a health scan on each reboot. I let each one run and it always came back clean, but I started getting crashes or severe slowdown in some of the games installed on the drive. The solution turned out to be simply uninstalling and reinstalling them. If I find another game that seems broken, I'll try just moving the files to a different drive and copying them back, to see if that can also fix it.
 
This has been a great drive for the money. I really enjoy mine. I’ve had no issues except the annoyance of the long health scan requirement if and when the PC crashes.
 
These are awesome drives for desktops. A word of caution to homelab folks, though: they're only supported through ESXi 6.7 U3; the ability to run the old drivers these require was dropped in ESXi 7.

Edit: I'm very curious to see the drive stats on these; it wouldn't surprise me to find out a lot of them were snapped up by Chia farmers a year ago.
 
I've plotted about 1.2PB on two 6.4TB FusionIO drives that had about 95% life remaining when I purchased them, IIRC. They are both sitting at around 85% now. I used MadMax with 110GB RAM disks to cut writes by about 75% for roughly 800TB of that; the first 400TB was pre-MadMax, when partial RAM-disk plotting wasn't really a thing outside of PrimoCache / Linux write caching.

Code:
        Internal temperature: 38.88 degC, max 41.83 degC
        Internal voltage: avg 1.01V, max 1.03V
        Aux voltage: avg 1.80V, max 1.83V
        Reserve space status: Healthy; Reserves: 100.00%, warn at 10.00%
        Active media: 100.00%
        Rated PBW: 22.00 PB, 86.48% remaining
        Lifetime data volumes:
           Physical bytes written: 2,974,089,492,319,072
           Physical bytes read   : 2,931,707,006,698,176

Code:
        Internal temperature: 37.40 degC, max 39.37 degC
        Internal voltage: avg 1.01V, max 1.03V
        Aux voltage: avg 1.80V, max 1.83V
        Reserve space status: Healthy; Reserves: 100.00%, warn at 10.00%
        Active media: 100.00%
        Rated PBW: 22.00 PB, 84.55% remaining
        Lifetime data volumes:
           Physical bytes written: 3,397,929,662,450,112
           Physical bytes read   : 3,335,191,475,706,720

Realistically, even at only 10% life remaining (2.2PB of write endurance left), they would still be very usable for desktop use for a long time to come. Many consumer QLC drives have less than 500TB of write endurance; even a decent Sabrent Rocket 4TB TLC SSD is only rated for about 3.6PB.

The FusionIO SSDs actually aren't great for Chia plotting, since they aren't fast enough for modern processors, and older server hardware makes RAM cheap enough to plot fully in memory, which not only eliminates SSD wear but also increases plotting speed. I'm guessing most Chia farmers who did end up with FusionIO SSDs were very early farmers like myself.

Pre-MadMax (or MadMax without partial RAM plotting), it took about 1.2TB of writes to make a k32 plot; with MadMax it takes about 400GB. MadMax also allows full-RAM plotting with a 256GB RAM disk, instead of the 330GB RAM disk needed pre-MadMax. Basically, if you can put 288GB of RAM in a system, you don't need high-endurance SSDs. I think it's much more common for people to build 5900X/5950X systems with a couple of NVMe SSDs and 128GB of RAM, or to buy an older server and load it up with cheap RAM.
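
For reference, the partial RAM-disk trick on Linux just amounts to pointing MadMax's second temp dir at a tmpfs; a rough sketch below, with the mount points, thread count, and keys as placeholders:

Code:
        # ~110GB tmpfs RAM disk for MadMax's tmpdir2 (needs that much free RAM)
        sudo mkdir -p /mnt/ram
        sudo mount -t tmpfs -o size=110G tmpfs /mnt/ram/

        # -t stays on the SSD, -2 goes to the RAM disk; this is what cuts
        # per-plot SSD writes from ~1.2TB down to ~400GB
        chia_plot -n 1 -r 16 \
            -t /mnt/fusionio/ \
            -2 /mnt/ram/ \
            -d /mnt/plots/ \
            -f <farmer_key> -p <pool_key>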

TL;DR: I think even very used FusionIO SSDs are likely to have far more write endurance remaining than you would ever use in a high-write home/homelab environment.
 
All those references to MadMax, and this came out of my head, lol

Madmax: Beyond SSD (To the tune of We Don't Need Another Hero)
We don't need another crypto
We don't need another scam-o
All we want is [drive] life beyond
these cryptos...

:ROFLMAO::ROFLMAO::ROFLMAO::ROFLMAO::ROFLMAO:
 
I'm tight on money and can't afford them, but it seems like they would work great for ESXi, since I won't run past 6.5 or 6.7. Maybe more will crop up later. I have my eyes on a pair of LiFePO4 100Ah batteries.
 
Just put it on a credit card?
 
More just turned up from another seller: https://www.ebay.com/itm/125201613623
He took $175 as an offer. I'm not far from this seller, so I should have it in a few days. I'll check the remaining life. My three from the last go-around are all working fine; the 6.4TB SX350 is too.
Thanks for the heads-up on this; I bought one of the last two remaining as a backup to my primary 3.2TB ioScale SSD. I'll report health and drive stats once I receive it in the mail. FWIW, the seller accepted my $175 offer as well, so thanks man114 for the heads-up on that idea! :)
 
Nice! Can't wait for the 6.4TB prices to drop as time goes on--they should still be alive thanks to the endurance, but cheap by then because they're 'old'. :)

Keep an eye out; I have seen new Lenovo ones pop up from time to time. I paid $400 or so for a new one. The issue is I'm sort of out of computers to put a 6.4TB in. Also, it's PCIe x8, so depending on your slot arrangement you could have issues on a HEDT board. I'm using this one on an X299 board simply for game storage, but otherwise on that board I'd be stuck with an x4 slot.
 
It's going to be years before I get one of the 6.4TB ones--I'm cheap. :D My idea would be to just stick it in some system and share the drive as a nice, reliable network SSD. I don't know if something like TrueNAS or openmediavault supports it, but that would be the other way to get it on the network.
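
If neither of those can load the driver, sharing it from a plain Linux install would do the job; a minimal Samba sketch, assuming the card is already formatted and mounted (the /mnt/fusion path and the share name are made up):

Code:
        # /etc/samba/smb.conf -- add at the end, then: sudo systemctl restart smbd
        [fusion]
            path = /mnt/fusion
            read only = no
            browseable = yes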
 
Well, it's here. It looks brand new, with 97.72% life remaining.
 

Keep an eye out for the Samsung MZ-4LB3T8HMLA as well. eBay seems to have the price falling pretty fast: $375 for a 3.84TB Gen3 x4 drive. Just be aware it needs a sled or card, as it's "M.3". I used this, but I think I will eventually switch to this U.2 sled and this M.2 converter.
 
I currently have one 3.2TB ioScale in my 5950X system, and I bought another 3.2TB HP one off eBay (coming Tuesday) for only $150 after sending an offer.

Has anyone got this working on Windows 11? My current desktop is on Windows 10, and it's been great for massive photo drops/duplication processes. I'm going to test it out on my other desktop, where I'll probably be trying Windows 11.
 
I've successfully installed a 3.2TB on Windows 11 for my father's PC, using the same drivers I posted a while back. He has no issues with it.
 
Great to know. Hopefully these drivers keep working for a few more iterations of Windows.
 
Just wanted to add that my 6.4 version has been flawless since I installed it on Windows 7.

What is the command to check the health of the drive again?
 
I think it’s fio-status -a

Thanks. Here is what it spits out on my drive:

Code:
        Microsoft Windows [Version 6.1.7601]
        Copyright (c) 2009 Microsoft Corporation. All rights reserved.

        C:\Windows\system32>fio-status -a

        Found 1 VSL driver package:
        4.3.6 build 1173 Driver: loaded

        Found 1 ioMemory device in this system

        Adapter: ioMono (driver 4.3.6)
        ioMemory SX300-6400, Product Number:F13-005-6400-CS-0001, SN:1643D0074, FIO SN:1643D0074
        ioMemory Adapter Controller, PN:pA006012114
        Product UUID:78ba9478-3116-5b6f-b77b-3b1101588361
        PCIe Bus voltage: avg 12.04V
        PCIe Bus current: avg 0.82A
        PCIe Bus power: avg 9.85W
        PCIe Power limit threshold: 74.75W
        PCIe slot available power: unavailable
        Connected ioMemory modules:
        fct0: 04:00.0, Product Number:F13-005-6400-CS-0001, SN:1643D0074

        fct0 Attached
        ioMemory Adapter Controller, Product Number:F13-005-6400-CS-0001, SN:1643D0074
        ioMemory Adapter Controller, PN:pA006012114
        Microcode Versions: App:0.0.30.0
        Powerloss protection: protected
        PCI:04:00.0
        Vendor:1aed, Device:3001, Sub vendor:1aed, Sub device:3001
        Firmware v8.9.9, rev 20170222 Public
        6400.00 GBytes device size
        Format: v501, 12500000000 sectors of 512 bytes
        PCIe slot available power: 75.00W
        PCIe negotiated link: 8 lanes at 5.0 Gt/sec each, 4000.00 MBytes/sec total
        Internal temperature: 62.51 degC, max 66.44 degC
        Internal voltage: avg 1.02V, max 1.02V
        Aux voltage: avg 1.79V, max 1.83V
        Reserve space status: Healthy; Reserves: 100.00%, warn at 10.00%
        Active media: 100.00%
        Rated PBW: 22.00 PB, 94.77% remaining
        Lifetime data volumes:
           Physical bytes written: 1,149,526,094,090,960
           Physical bytes read   : 1,804,410,505,689,120
        RAM usage:
           Current: 3,932,444,224 bytes
           Peak   : 3,938,056,064 bytes
        Contained Virtual Partitions:
        fct0: ID:0, UUID:ec914d1d-c606-4ec8-ac6e-d1625e84b95e

        fct0 State: Online, Type: block device, Device: \\?\PhysicalDrive4
        ID:0, UUID:ec914d1d-c606-4ec8-ac6e-d1625e84b95e
        6400.00 GBytes device size
        Format: 12500000000 sectors of 512 bytes
        Sectors In Use: 11328377895
        Max Physical Sectors Allowed: 12500000000
        Min Physical Sectors Reserved: 12500000000

        C:\Windows\system32>
 
Anybody get this working in Linux? It works great in Windows 11, but Linux can't even see it.
 
I had good luck with this repo:

https://github.com/RemixVSL/iomemory-vsl4

I had no luck with DKMS or the dpkg route, but just building from source worked.

You can get the card working without fio-utils, but it's a bit annoying not having those tools available. Here is a link to a zip of the .deb files I needed to get commands like fio-status -a working, assuming you are on a Debian-based OS.



EDIT: the card should show up as /dev/fioa once the drivers have been loaded and you've restarted. You can format/mount from there. If you create a partition on it you should see it as /dev/fioa1
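
To save the next person a search: once the module is loaded, the format/mount steps are the usual ones. A quick sketch (ext4 and the /mnt/fusion mount point are just my choices, nothing the driver requires):

Code:
        # After the iomemory-vsl4 module loads, the card shows up as /dev/fioa
        lsblk /dev/fioa

        # One partition across the whole card, then a filesystem on it
        sudo parted -s /dev/fioa mklabel gpt mkpart primary 0% 100%
        sudo mkfs.ext4 /dev/fioa1
        sudo mkdir -p /mnt/fusion
        sudo mount /dev/fioa1 /mnt/fusion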
 