A few days ago I posted that I had gotten the AMD R9 Fury working with GPU passthrough in ESXi in this thread:
Passthrough Gaming Rig Upgrade
At the time I didn't have a step-by-step procedure for how I got things working (other than "I went to bed"); I just wanted people to know that it was possible, since I hadn't seen any literature online saying that the R9 Fury would work, though there is lots of similar literature about the AMD Radeon 280/290/380/390 series etc.
I'd also like to note that this "procedure" seems to be very dependent on which GPU you're using. I see lots of other procedures out there, some much simpler than this one. The procedure seems to be generally easier with older AMD GPUs....?
With a more structured troubleshooting approach, I spent the last few days documenting a step-by-step procedure for how I got this working. It turned out to be difficult to recreate even with a VM that I knew worked, and it took a long time to determine which steps were really required and which were optional, but here it is:
Motherboard: Supermicro X10SRH-CLN4F
CPU: Xeon E5-1650v3
RAM: 128GB DDR4 PC4-1700 ECC REGISTERED DIMM MEM-DR432L-SL01-ER21
Video Card: Sapphire R9 Fury Tri-X 4GB HBM PCI-E (not the OC’ed version)
OS: ESXi 6.0.0
- Install ESXi 6.0
- Directly attach monitor to GPU (using DisplayPort in my case)
- Boot up ESXi
- Add R9 Fury GPU adapter to DirectPath I/O Configuration screen in vSphere Client. Note that you will be forced to add both the video and audio adapters.
- Reboot ESXi
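As a sanity check around the DirectPath step above, it can help to confirm that the GPU really exposes two PCI functions (video and audio) that must both be passed through. The sketch below only demonstrates the filtering against made-up sample output; on a real host you would pipe the output of `lspci` from the ESXi shell instead (the addresses and names here are illustrative, not captured from my machine):

```shell
# Sample listing standing in for real "lspci" output on the ESXi host;
# the PCI addresses and device names below are illustrative assumptions.
sample='0000:04:00.0 Display controller: AMD Fiji [Radeon R9 Fury]
0000:04:00.1 Audio device: AMD Fiji HDMI Audio
0000:05:00.0 Ethernet controller: Intel I350'

# Filter for the AMD functions; both .0 (video) and .1 (audio) should show up,
# since the DirectPath screen forces you to pass both through together.
printf '%s\n' "$sample" | grep -i 'amd'
```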
- Add new Win10 64-bit VM.
- (Optional) Dedicate 32GB RAM to the VM (doesn’t seem to matter whether the RAM is dedicated or not)
- (Optional) EFI boot mode (can be BIOS as well, doesn’t seem to matter)
- Install Windows 10
- Install VMWare Tools
- (Optional) Do all Windows critical updates (these can also be done later, but rebooting the VM is a pain, since the VM doesn’t seem to release the GPU on reboot, causing it to freeze before the login screen)
- Shut down VM
- (Optional) Take VM snapshot (to make troubleshooting easier if required)
- Boot Win10 VM up
- Either:
- Disable Windows driver search (right-click This PC, Properties, Advanced System Settings, Hardware tab, Device Installation Settings, select “No, never install”, and uncheck “Automatically get device app info…”)
OR
- Install Display Driver Uninstaller v15.3.1, run it, and allow it to reboot into safe mode. Once in safe mode, do an AMD driver uninstall by clicking “Clean and Restart”. DDU disables automatic updates of Windows device drivers, effectively doing the same thing as the first option.
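If you'd rather script the driver-search setting than click through the dialogs, the same toggle can be set via the registry. This is my own addition, not part of the original steps: the SearchOrderConfig value (0 = never search Windows Update for drivers) should match what the Device Installation Settings dialog changes on Windows 10:

```
Windows Registry Editor Version 5.00

; Prevent Windows from automatically searching Windows Update for drivers.
; 0 = never search; set back to dword:00000001 later to re-enable
; automatic driver updates (see the step near the end of this list).
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching]
"SearchOrderConfig"=dword:00000000
```

Import it with regedit (or `reg import`), then reboot the VM for good measure.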
- Shut down Win10 VM
- Add R9 Fury video adapter as a PCI Device to VM in VM Settings
- Add R9 Fury audio adapter as a PCI Device to VM in VM Settings
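For reference, the two “Add PCI Device” steps above end up as pciPassthru entries in the VM’s .vmx file. This is a sketch for orientation only; the IDs below are illustrative assumptions (0x1002 is AMD’s PCI vendor ID, but the device IDs and PCI addresses are examples), and the vSphere client fills in the real values for your host:

```
# Illustrative .vmx entries created by the two "Add PCI Device" steps.
# All IDs below are examples/assumptions; let the vSphere client write the real ones.
pciPassthru0.present = "TRUE"
pciPassthru0.vendorId = "0x1002"     # AMD PCI vendor ID
pciPassthru0.deviceId = "0x7300"     # assumed device ID for the Fiji GPU
pciPassthru0.id = "04:00.0"          # example PCI address of the video function
pciPassthru1.present = "TRUE"
pciPassthru1.vendorId = "0x1002"
pciPassthru1.deviceId = "0xaae8"     # assumed device ID for the HDMI audio function
pciPassthru1.id = "04:00.1"          # example PCI address of the audio function
```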
- Start up Win10 VM
- Once Win10 boots up, Device Manager should still show a yellow exclamation mark on a generic Microsoft Basic Display Adapter (in other words, it should not say AMD Radeon quite yet)
- Wait ~2 minutes or so to make sure Windows isn’t attempting to update drivers in background
- Install the AMD drivers. You can install all of AMD’s bloatware as well; no need to exclude anything. My Sapphire GPU has a blue LED that turns on when the card is active; it lights up at this point, and the DP-connected monitor turns on.
Driver versions Crimson 15.12, 16.3.1, and 16.3.2 have all been tested with this procedure, and all seem to work.
- Once driver installation completes, shut down Win10 VM
- Reboot the ESXi physical host (this frees up the GPU; driver installation effectively consumes the GPU, as shown by the blue light turning on, and it doesn’t get released until the host reboots)
- Start up Win10 VM using vSphere
- Open VM Console window in vSphere, click to give focus.
- This vSphere console window acts as the “secondary” monitor, and your directly-connected monitor is set up as the “primary” monitor. Move the mouse around in the vSphere console and see if you can get it over to the primary monitor. The cursor should weirdly disappear and then reappear on the primary monitor as it crosses from the right half of the console window to the left half.
- Click to enter login/password
- Log in and have fun!
- Make a mental note that you can’t reboot just the VM. If you have to reboot the VM, you have to reboot the entire ESXi physical machine.
- If you want, re-enable automatic updates of Windows device drivers (they were disabled in the driver-search step above)
Hopefully this helps someone out there. If you have any questions let me know. Thanks.
--BroccLee