Virtualizing an HTPC

MadsRC

n00b
Joined
Jul 26, 2012
Messages
4
Currently I'm in the process of building a few HTPCs for the house, and while looking for parts it struck me... why not virtualize them and run them off my ESXi 5.1 server?

I did a little googling but couldn't find all the answers I was looking for, so here goes.

I plan on virtualizing a Windows 7 machine (for Blu-ray) with XBMC, and then I need to find a way to present said machine to the TV.

For that I would need a low-powered machine that needs no more computing power than what it takes to display full HD (right? All the processing would be done on the server?).

Guess the 2 questions are:

1: Would that be a viable option, or would it be better/cheaper to just buy and install a physical PC with XBMC?

2: How do I present the virtual XBMC machine to the TV? I'm guessing some client software installed on a Linux (I prefer Linux) OS running on a cheap, small machine. How would the screen/image quality be? Would it be better if I had a physical XBMC machine?
 
If going the VM route, couldn't you just get one of those Android stick PCs that plug into your TV's HDMI and run the VM from that?
 
For XBMC you will need to pass through a video card to the VM, then just run a cable from the video card to the TV. Without the video card, the VM will not have the graphics processing required for XBMC to play smoothly.

But doing this will allow you to use HDMI audio out. Then you would just need to pass through the USB IR receiver.
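For anyone wondering what that looks like on the VM side: once the card is marked for passthrough on the host, the guest's .vmx file ends up with entries roughly like the following. This is only a sketch - the PCI address and vendor/device IDs are placeholders for your own card:

```
# Sketch of the .vmx entries for a passed-through GPU.
# The address and IDs below are placeholders -- yours will differ.
pciPassthru0.present = "TRUE"
pciPassthru0.vendorId = "0x1002"
pciPassthru0.deviceId = "0x68f9"
pciPassthru0.id = "04:00.0"
# Often needed so the guest can map the card's memory below 4GB:
pciHole.start = "2048"
```

In practice the vSphere Client writes these for you when you add a PCI device to the VM; the pciHole.start tweak is the one people commonly have to add by hand.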

Some people have been working on doing this configuration for a while with little luck so far, mostly because ESXi has had issues with video card passthrough support.
 
Windows 7 with Windows Media Center VM

Xbox 360 at the TV, set up as a Windows Media Center extender

That's what I do; works great.

Cheap, easy, multipurpose hardware at the TV.

With an HDHomeRun Prime tuner you could even ditch your cable boxes and use the 360 as a STB.
 
As said above, to run an independent VM as an HTPC off ESXi, you basically have to pass through a GPU and a USB hub (unless the USB receiver you have can be passed through individually - the ones I've seen act similar to mice/keyboards and can't).

With that said, passthrough in itself has caveats, and GPU passthrough even more so. If you haven't played with it before, the bottom-line sticking point is having VT-d available - and even if you have, GPU passthrough is a sticky subject that is not an exact science. To date, I've only seen ESXi GPU passthrough succeed on AMD cards, not NVIDIA. For plenty of on-topic discussion, there's an active thread on the VMware forums:

http://communities.vmware.com/thread/297072?start=600&tstart=0

For clarification, I have personally passed through several GPUs successfully on different platforms, so it's definitely doable.
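Since GPU passthrough stability often comes down to how the card gets reset, the host-side file people in that VMware thread end up editing is /etc/vmware/passthru.map. A sketch of the format (these entries are illustrative, not a guaranteed fix for any given card):

```
# /etc/vmware/passthru.map
# Format: vendor-id  device-id  resetMethod  fptShareable
# AMD (vendor 1002) cards are often forced to a D3 power-state reset:
1002  ffff  d3d0    default
# NVIDIA (10de) cards are sometimes tried with a secondary-bus reset:
10de  ffff  bridge  false
```

A host reboot is needed for changes here to take effect.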

The other caveat I'd mention is that 5.1 broke USB passthrough in several instances (I believe it traced back to certain USB hubs being on the PCI bus). I'm not sure if there's been any fix for this in the last month, as I've been mostly out of town and not tinkering, but my home lab servers are still on 5.0u1.

Edit: to answer your other question, and to add some comments - image quality is the same. For testing, I've had full gaming at 1600p running on a VM that performed the same as native, so using it for an HTPC would work, especially with cheaper cards that you could pick up easily. Also, if your goal is to run XBMC, I'd consider just running a direct-boot XBMC and skipping the W7 part - people seem to have had great results with that, and those XBMC installs appear to have come a long way (not to mention the installs would be smaller and less resource-intensive).

As to the benefit of running multiple HTPCs off of one ESXi host, it has the upside of central management. Upgrading can come down to updating one VM image and having all the HTPCs updated; hardware upgrades can come out of a central pool, which means fewer wasted resources (a real concern with HTPCs, which individually probably have a fair amount of downtime/non-concurrent use).
 
This all sounds like a great idea but a heck of a lot of work and expense. I just have a couple of NASes with my media on them, connected to my wired network, and a couple of Boxee Box media players (hardware), which works for me.
 
openelec.tv would be a great install for this. I have it running on hardware, since my server doesn't have PCIe slots open for video cards.

The main issue with this setup is that you need one video card per HTPC VM. Currently I think the best option is a nettop PC loaded with openelec.tv.

I have an Acer R3610 running openelec.tv with the Aeon Nox skin. It works great. Plays 1080p video just fine.
 
The route of a Win7 VM and using an Xbox 360 as an extender will work; however, directly connecting an output display to a VM with a passed through GPU barely works and I wouldn't recommend it.

Test setup:
Supermicro X9SRI-F
Intel E5-2620
32GB ECC RAM
PowerColor AX7750 1GBD5-DH Radeon HD 7750
SiliconDust HDHomeRun Prime
ESXi 5.0

The main problem is that you cannot remove the virtual GPU, which is not HDCP compliant. As a result, you will not be able to send TV or any copy-protected movies, etc. to the display. Despite being disappointed, I continued to test by just running YouTube videos. For the most part, I was able to watch videos just fine, but every now and then you get pops and pauses in the audio. That was unacceptable, so I scrapped the project altogether. It is a shame, because I already have lots of hardware and just adding a video card would have been great, especially since my ESXi server is on all of the time.

Keep in mind that I did not test with the latest version of ESXi...perhaps it's better.
 
My HDHomeRun Primes will be here in a couple days; I'll let you know if the ESXi 5 VM WITHOUT video card passthrough works with live, encrypted TV.

My setup will be as follows:
2x Dual Core Opteron 270
16GB DDR ECC
Xbox 360 set up as a media extender
Windows 7 Ultimate VM
3D enabled in VM settings.

Right now I know for a fact that the setup works with two 360s streaming H.264 off the VM, but I'll update you on the live TV when I get it all set up.
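For reference, the "3D enabled" checkbox in the VM settings corresponds to a .vmx flag; a minimal sketch (the vRAM size here is just an example value):

```
# .vmx equivalent of ticking "Accelerate 3D graphics" in the VM settings
mks.enable3d = "TRUE"
# Optionally raise the virtual SVGA vRAM (in bytes; 128MB shown as an example)
svga.vramSize = "134217728"
```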
 
The 360s are doing the video processing so yes they would work. All the VM is doing is acting as a file server.
 
The 360s are doing the video processing so yes they would work. All the VM is doing is acting as a file server.

Not entirely true for video that isn't natively supported by the 360 (MKV for example). In that case the PC is transcoding, which also works just fine with my setup.
 
Not entirely true for video that isn't natively supported by the 360 (MKV for example). In that case the PC is transcoding, which also works just fine with my setup.

However, since it is just transcoding, the CPU can do this; you do not need a video card to handle this task.

I run a SageTV server on an ESXi VM and it works great. Note: if you're going to use any tuners for DVR-type use, I strongly recommend a networked tuner like the HDHomeRun; that way you have no issues getting it to work with a VM.
 
Here's a really stupid question, I apologize for derailing in advance, thinking out loud here. If I should post as a new topic LMK I will.

Could I just thin client the house?

Build a server with a GPU and Server 2012, then RDSH to thin clients, including Media Center or XBMC.
 
This all sounds like a great idea but a heck of a lot of work and expense. I just have a couple of NASes with my media on them, connected to my wired network, and a couple of Boxee Box media players (hardware), which works for me.

Your setup is simpler to set up, but it is worse in every other way. More components. Non-centralized. More power draw (one of virtualization's top benefits being that of increased power efficiency due to fewer machines needing to be run). Limited to whatever your low-end proprietary boxes support. Limited customization possibilities. Any customization you do on one box not being able to automatically replicate to the others. etc. Your statement is like asking why a race car driver would bother to modify a car to be fast when they could just buy a Civic like you have. Nothing wrong with owning a Civic, but the Civic really doesn't compare in the situation being discussed, you know?

Those saying to pass through a video card are correct. You will need IOMMU for that.
 
This is a good idea, if the HTPC could be virtualized with a CableCARD tuner set up for extenders.
 
I currently use VMware 5.0 with a Ceton PCIe card. I also passed a GPU through to the Windows 7 VM.

So my VMware server sits behind the TV and an HDMI cable is directly connected to the TV. I then use an Xbox in the bedroom for TV.

Some things to note...
1) For the Windows 7 VM, I had to override the DCA to get the cable card activated
2) When changing channels, it takes some time on the VMware TV to complete the change. The Xbox changes right away.
3) DVDs do not work on the VMware TV. VLC is what I use in its place.
4) I am able to record 4 channels, watch a TV program on the VMware TV, and watch another program on the Xbox TV.
 
I have done this, and would probably recommend against it.

Gigabyte 990FXA-UD3
FX8150 CPU
32GB DDR3 Ram
60GB OCZ Agility 3
Asus HD6450
Asus HD5450
Unknown HD4350

3 films running in three separate Windows 7 VMs



The issues I had:
- Sometimes the host would crash randomly
- DXVA would not work at all
- VMs wouldn't always start
- ESXi 5.1 won't work, need to use 5.0

I came to the conclusion that in the future it may be the way forward, but for now it's still too early to use properly.
 
Technically, Xen is probably better for this purpose anyway. It still has the best GPU passthrough support, as far as I know.
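For the curious, GPU passthrough on Xen is driven from the domU config file; a minimal sketch (the PCI addresses are placeholders for the card's video and HDMI-audio functions):

```
# Fragment of a Xen HVM domU config (e.g. /etc/xen/htpc.cfg)
gfx_passthru = 1
pci = [ '01:00.0', '01:00.1' ]
```

The host still needs the IOMMU enabled, and the device has to be hidden from dom0 (e.g. via pciback) before the guest can claim it.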
 
I would recommend a VM for server/data storage in a remote part of the house. Anything with a number of drives gets noisy, and you would not want that noise in the room you want to watch TV in. Then either use prebuilt extenders like the PS3, Xbox, etc., or build your own low-power, QUIET PC to do the display driving.

You can build a small, very quiet machine for the TV itself. I would use a small SSD for the OS, or maybe even a flash drive, since you do not need much storage. Use low-power CPUs and even a pico-style PSU and you can have a silent client.

Then for the VM host, build a freakin' monster of a machine without worrying about size or noise level, since you will put it somewhere those issues do not matter.
 
With VMs you can still place the server in a remote room, depending on how long the cable runs will be. You can get 100 ft HDMI cables and 82 ft USB cables.

Once VMware fixes the issues with GPU passthrough this will be a great project, but with the current status you would be better off using Xen or lightweight media players.

I would not recommend the PS3/360 for this use, as their media players are nowhere near as advanced as XBMC or Plex. They will also require you to transcode the videos, which will double the power usage compared to a small, lightweight system.
 
I would recommend a VM for server/data storage in a remote part of the house. Anything with a number of drives gets noisy, and you would not want that noise in the room you want to watch TV in. Then either use prebuilt extenders like the PS3, Xbox, etc., or build your own low-power, QUIET PC to do the display driving.

Not nearly as good using a console. I use the PS3 for DLNA and it sucks. Seeking is crazy slow. Folder load times are crap. I need to make sure the media server software I run supports all the weird video formats I need. The 360 is more useful than the PS3, though.

One could just run HDMI from a remote VM box over a Cat 6 HDMI balun and a USB-over-Cat 5/6 balun and have silent, good, and lower power overall. Just harder to set up, and it requires IOMMU, which Intel is very stingy about.
 
I was more interested in the feasibility of an RFX (RemoteFX) setup, not Xen or VMware. I've done the VM-and-access-via-RDP thing and yeah, it's rough. My line of thinking was more like this:

Large back-end file/VM server: socket 2011 6-core w/64GB RAM and a high-end GPU w/3-4GB vRAM, running Server 2012, with RemoteFX enabled on the VMs.
Thin-client front end for PCs throughout the house, including HTPCs. Using something like an HP or ViewSonic thin client, or better yet a zero client like ThinLinx, hooked to a screen and whatever HID fits that terminal (keyboard/mouse, remote, gamepad). Then remote-session-host to the local terminal. Not sure you could do heavy gaming, but I would think HD streaming should work.
Thoughts?
 
The issue with RFX is that Hyper-V only allows each VM to access 200MB of video RAM. It could work, but might be limited by the vRAM.
 
I run an ESXi 5 box with several VMs, including a Win7 VM with Media Center - no passthrough GPU. I've allocated minimal resources to it, as I only have one extender (a Ceton Echo) on the network. I use an HDHR Prime 3-tuner network device with a CableCARD to decrypt FiOS cable. This works perfectly for my needs.

Some things to consider:
1) For a multi-TV setup you definitely want a single media center "server" and extenders. This is the easiest way to share recorded TV throughout the house. You can hack around it - but extenders are relatively cheap (ceton, or xbox 360).
2) Windows 7 media center does not officially support recording to a network disk. You can fake it out, or setup some copy jobs (to run when a recording is complete).

I got around #2 by allocating my W7MC vm a ton of vmfs disk. My ESXi datastore is served by a VM that runs Nexenta and serves up lots of ZFS disk - which is great for VMs.

This is my post on my ESXi setup - still running great: http://hardforum.com/showthread.php?t=158270
 
2) Windows 7 media center does not officially support recording to a network disk. You can fake it out, or setup some copy jobs (to run when a recording is complete).

I am not using Media Center, but SageTV. However, Win7 does support iSCSI; if your file server supports iSCSI, you could have the media VM mount the iSCSI device. That is how I handle my record drive with Sage, mainly because it gives me better performance than SMB - and for a record drive that can have 6 HD streams being written and 3 streams being read at the same time, performance was a concern.
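For anyone wanting to script that inside the Win7 VM, Windows ships a command-line initiator client, iscsicli. A sketch of attaching a record volume (the portal address and IQN are placeholders for your own NAS):

```
:: Attach an iSCSI record drive from inside the Windows 7 VM
iscsicli QAddTargetPortal 192.168.1.10
iscsicli ListTargets
iscsicli QLoginTarget iqn.2012-01.local.nas:recordings
```

After logging in, the LUN shows up in Disk Management like a local disk; a persistent login (iscsicli PersistentLoginTarget) makes it reconnect automatically at boot.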
 
I submitted a ticket to Ceton inquiring about the feasibility of using the Ceton PCIe tuner card with ESX on a Win7 VM. Their "CYA response" was:

The infiniTV is recognized as a network device and can suffer from communication problems when used in conjunction with virtual machines or VPN's. Also, even though it would be used as a server the video adapter would need to be HDCP compliant. As part of the Windows Media Center setup, which is completely separate from the infiniTV, you would need to run the Digital Cable Advisor tool. The tool checks to make sure your system is capable of handling digital cable. This is a Microsoft requirement and HDCP compliance is one of the checks in this test. If the system does not support HDCP then the advisor will not pass and you will be unable to access copy protected content.

Using a server-class board, and then adding to it the complexities of it being a VM - I doubt the graphics will pass this test. How did you guys get around this?

I flat-out do not believe that, with PCI passthrough, the InfiniTV card will have "communication problems".
 
Has there been any update to this? I am going to attempt this as well, with two concerns:
1) Ceton passthrough via VMware, for the Win7 VM to even recognize/use the tuner
2) HDCP compliance, in the form of the Digital Cable Advisor tool

I'm hoping step one is easy, and there is a hack (registry, maybe?) for step two.
 
Since you are going to pass through the video/Ceton, HDCP shouldn't be an issue.
 
For the HDCP/DCA issue, you can use this override utility: http://www.missingremote.com/guide/override-digital-cable-advisor-windows-media-center-7

The Ceton can be passed through under ESXi 5.0 (not 5.1, however). I did this for a while, strictly using the VM as a host/server to run Xbox 360 extenders. I never watched video from within the VM, though.

Perfect!!!!!

That is exactly my goal. Host the Ceton in a VM and use extenders such as the Echo and the 360.

While network tuners are a good option, the extender route allows for consolidated recording while using less powerful machines (extenders vs. HTPCs).
 
The Ceton can be passed through under ESXi 5.0 (not 5.1, however). I did this for a while, strictly using the VM as a host/server to run Xbox 360 extenders. I never watched video from within the VM, though.

Why did you move from this config, if you don't mind me asking? And would you happen to know why 5.0 works but 5.1 doesn't? That seems counterintuitive.
 
Why did you move from this config, if you don't mind me asking? And would you happen to know why 5.0 works but 5.1 doesn't? That seems counterintuitive.

5.1 broke a lot of passthrough devices - stuff that was never supported, like the Ceton, worked fine in 5.0, but caused a PSOD in 5.1.

I ended up moving away from that configuration in part because I could never get the extenders to reliably play back all of my non-TV media without having to re-encode a lot of it, and the hassle eventually became too much. Now I'm just using a dedicated HTPC directly on the TV - more power consumption, but being able to play back full-bitrate BD rips is worth it.

I'm toying with the idea of trying to get virtualization working again and utilizing a dedicated video card passed through to the VM for output, but I have very low expectations of this working properly since I'm not willing to invest in a professional graphics card just for this purpose.
 
I thought about it as well, until I looked into the details. I settled on a relatively low-powered PC that I also sometimes use as a desktop/internet-surfing/remote-desktop client, and as the desktop I use to connect to my virtualization servers, home servers, etc.
 
Bump. Anyone have new experience with this?

Yes. I'm working on an all-in-one. I'm not using ESXi, though. What I have now is working as a gaming machine, but I'm working to consolidate all of my computers (well, most of them) into a large all-in-one, and an HTPC is a part of it.
 
Yes. I'm working on an all-in-one. I'm not using ESXi, though. What I have now is working as a gaming machine, but I'm working to consolidate all of my computers (well, most of them) into a large all-in-one, and an HTPC is a part of it.

I'm sorry, but if I understand your post correctly... you're NOT using your HTPC in a VM currently.
 
I'm sorry, but if I understand your post correctly... you're NOT using your HTPC in a VM currently.

You could basically say yes, I am, because I've already tested the configuration. As of right now I have Windows and Linux working concurrently. The Windows side has a GPU passed through. I'm using MythTV on the Linux side. I'm not passing through the TV cards, because I don't have to. My current setup isn't the final configuration, though.

The only question left is whether I can pass through multiple GPUs. The only reason I want to pass through more than one is that I want to be able to game on the HTPC as well. My configuration is based on this guide:

Guide: GPU Passthrough Using KVM + LVM2 + Ubuntu Gnome - From Beginning to End
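On the KVM side, the general shape of that guide's approach is to bind the spare card to a passthrough-capable driver and hand it to the guest at launch. A rough sketch, assuming the vfio-pci route; the PCI address (02:00.0), vendor:device IDs (1002:68f9), and disk image name are placeholders:

```
# Bind the spare GPU (placeholder address/IDs) to vfio-pci
modprobe vfio-pci
echo 0000:02:00.0 > /sys/bus/pci/devices/0000:02:00.0/driver/unbind
echo "1002 68f9" > /sys/bus/pci/drivers/vfio-pci/new_id

# Then give the card to the guest on the QEMU command line
qemu-system-x86_64 -enable-kvm -m 4096 -cpu host \
  -device vfio-pci,host=02:00.0,x-vga=on \
  -drive file=htpc.img,if=virtio
```

This also assumes the IOMMU is enabled on the kernel command line (intel_iommu=on or amd_iommu=on); whether a second and third card bind cleanly depends mostly on their reset behavior.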
 
Your base OS is KVM and you're running Myth on the KVM? Do I have that right?
 