Virtualized gaming options

captainwin

My objective is to put together a server that can host graphically demanding content for low-end devices over the LAN. I want friends to be able to bring their own low-end laptops and experience high-end gaming. Additionally, I want a device with the minimal specs necessary to create a seamless remote PC experience on my HDTV and monitor. Overall, the system should support between 2 and 4 concurrent users. I have an MSDN Visual Studio Ultimate account, so I have virtually unlimited licenses for all Microsoft products, if that helps.

I'm able to spend up to $1500 on the server, and however much is necessary on the other devices, such as a mini PC, thin client, or zero client for the HDTV and monitor. Though cheaper is always better.

The technologies I've researched that might accomplish my goal are Citrix HDX, Microsoft RemoteFX, and Ericom Blaze Server. Below, I've linked an older article that outlines the differences between HDX and RemoteFX. I think HDX meets my needs, and so I would want a host and client devices that meet the criteria needed to take advantage of HDX. Ericom is optional, but my testing on Windows Azure showed enormous increases in remote desktop responsiveness. I'm not even sure if I can get a single license for Ericom, but the trial seemed very promising for this kind of setup, so I wanted to mention it.

http://www.brianmadden.com/blogs/br...0/28/citrix-hdx-3d-vs-microsoft-remotefx.aspx

I've also researched thin/zero client products from HP, Dell, NComputing, Centerm, Fujitsu, 10ZiG, and ThinLinX. I've also looked at mini-PC options, which are dramatically cheaper than zero and thin clients. However, I would like to understand how such a cheap device can deliver a desktop-like experience. Overall, the product that advertises compatibility with HDX and also has a reasonable price point is NComputing's N500. You can get one for $100 to $175. That seems kind of steep, so I'm hoping there's something better available.

I've read all of the content on this forum regarding virtualized gaming, which is really my most important goal. The results people have achieved seem to vary, with some having success even on mediocre configurations. I read about the mouse problems with RemoteFX, and that's absolutely not tolerable. So, I'm hesitant to start buying things before I know how everything will interact.

Here's the server configuration I'm considering:

PCPartPicker part list / Price breakdown by merchant / Benchmarks

CPU: AMD FX-8350 4.0GHz 8-Core Processor ($189.99 @ Newegg)
CPU Cooler: Thermaltake Water 2.0 Performer 81.3 CFM Liquid CPU Cooler ($64.99 @ Newegg)
Motherboard: ASRock 990FX Extreme4 ATX AM3+ Motherboard ($148.47 @ Newegg)
Memory: G.Skill Ripjaws X Series 8GB (1 x 8GB) DDR3-1600 Memory ($46.99 @ Newegg)
Memory: G.Skill Ripjaws X Series 8GB (1 x 8GB) DDR3-1600 Memory ($46.99 @ Newegg)
Storage: Crucial M4 128GB 2.5" Solid State Disk ($107.95 @ Mac Connection)
Storage: Crucial M4 128GB 2.5" Solid State Disk ($107.95 @ Mac Connection)
Video Card: Sapphire Radeon HD 7950 3GB Video Card (2-Way CrossFire) ($305.66 @ Newegg)
Video Card: Sapphire Radeon HD 7950 3GB Video Card (2-Way CrossFire) ($305.66 @ Newegg)
Wired Network Adapter: Intel EXPI9402PT 10/100/1000 Mbps PCI-Express x4 Network Adapter ($72.84 @ Compuvest)
Case: NZXT Phantom (White) ATX Full Tower Case ($119.99 @ Amazon)
Power Supply: NZXT HALE 90 750W 80 PLUS Gold Certified ATX12V / EPS12V Power Supply ($109.99 @ Newegg)
Total: $1618.47
(Prices include shipping, taxes, and discounts when available.)
(Generated by PCPartPicker 2013-02-27 00:15 EST-0500)

Can I get some ideas about different products I might be able to use? Will my hardware selection play nice with remoting? Thanks in advance!
 
I decided to try the Zealz GK802. It's the first quad-core mini-PC, and it's $100 new. I'm optimistic about the performance, but we'll see how this device works out when it arrives in a week or so.

As far as the remoting client, I'm thinking I'll try Splashtop GamePad THD, which has some pretty decent demos on YouTube. It's advertised as remote desktop with optimizations for gaming. The other option is probably Kainy, but I read a few user testimonials saying that Splashtop is the better product.

To pass the time, I'm going to demo the remote gaming idea using my current GTX 570 rig running Windows Server 2012 and Hyper-V. I'll post the results once I can find a suitable client device...
 
So, here are some preliminary results:

Windows Server 2012 running on an i5 with a GTX 570.
Server installed on an SSD.
VM stored on a separate SSD.
3 virtual CPUs allocated to the VM.
Connected to the VM using a variety of remote access products.

For my testing, I simply opened up Diablo 3 and loaded up the character select screen.
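
For anyone trying to reproduce this, the RemoteFX 3D adapter that the tests below toggle on and off is attached through Hyper-V; the sketch below drives the stock Server 2012 cmdlets from Python, with the VM name as a placeholder and the assumption that the RemoteFX host role is already set up. You could of course just run the same cmdlets directly in PowerShell.

```python
# Hedged sketch: attach a RemoteFX 3D video adapter to a Hyper-V VM by calling the
# built-in Server 2012 Hyper-V cmdlets. "GamingVM" is a placeholder name.
import subprocess

VM_NAME = "GamingVM"

def powershell(command: str) -> str:
    """Run one PowerShell command on the Hyper-V host and return its output."""
    result = subprocess.run(
        ["powershell.exe", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# The VM must be powered off before the adapter can be added.
powershell(f'Stop-VM -Name "{VM_NAME}"')
powershell(f'Add-VMRemoteFx3dVideoAdapter -VMName "{VM_NAME}"')

# Confirm the adapter is present, then boot the VM again.
print(powershell(f'Get-VMRemoteFx3dVideoAdapter -VMName "{VM_NAME}"'))
powershell(f'Start-VM -Name "{VM_NAME}"')
```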

Test #1 (no RemoteFX vGPU assigned to the VM): Game managed to start up, but the performance was horrendous - as expected. The system could barely render mouse movement.

Test #2 (RemoteFX vGPU assigned to the VM): Game started up quickly, and the game seemed to run fine on the VM. However, standard Microsoft remote desktop delivered unplayable framerates.

Test #3 (RemoteFX vGPU assigned to the VM, connected using Ericom's remoting protocol): Same VM performance, but the remote access experience was much better. The game was *almost* playable at 800x600, but not even close at 1920x1080. I would estimate about 8 fps at 1080p and 18 or so at 800x600.

Test #4 (RemoteFX vGPU assigned to the VM, connected using Splashtop 2): Splashtop kept giving me an error saying I couldn't run games in fullscreen mode, so I had to put the game in fullscreen windowed. The visual quality was really bad and I couldn't maximize the Splashtop client window. However, the streaming speed approached Ericom's. I would rather use Ericom since with Splashtop you have to use their client.

Overall, I suspect RemoteFX is enabling a completely playable experience server side, but the limiting factor is the streaming protocol. I think if I could figure out the bottleneck there, I'd have a very good setup.
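
One hedged way to narrow that down is to rule out the wire itself: time a bare TCP round trip from the client to the host and compare it against the per-frame budget (about 33 ms at 30 fps). The sketch below is only illustrative; the host address is a placeholder, and port 3389 is used simply because RDP is already listening in this setup.

```python
# Hedged sketch: measure the raw TCP connect round-trip time from the client to the
# Hyper-V host. Sub-millisecond numbers on a gigabit LAN would point at the remoting
# protocol's encode/decode path, not the network, as the bottleneck.
import socket
import time

HOST = "192.168.1.10"   # placeholder: the Hyper-V host's LAN address
PORT = 3389             # RDP is already listening on the host in this setup

def median_connect_rtt_ms(host: str, port: int, samples: int = 20) -> float:
    """Return the median TCP connect round-trip time in milliseconds."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass
        times.append((time.perf_counter() - start) * 1000)
    times.sort()
    return times[len(times) // 2]

print(f"Median connect RTT: {median_connect_rtt_ms(HOST, PORT):.2f} ms")
```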

I wanted to mess around with the remote desktop configuration, but Windows Server 2012 doesn't play well with DNS and Remote Desktop Services installed on the same box. I wasn't able to bring up the remote services management screen after a couple of hours of troubleshooting, so I'm going to try MultiPoint Server 2012 instead, which is supposed to be a little more user-friendly. I'll post the results soon.
 
MultiPoint was more trouble than I expected. It has a very specific purpose, and that purpose isn't gaming.

I reinstalled Win2k12 and also installed Teradici Arch. I can say this combination is the highest performing so far, but it's still completely unplayable. There is a lot of unpredictable chop. I can't discern much difference between 800x600 and 1920x1080 after installing Teradici Arch, so I'm thinking maybe something else is going on. At the character select screen I'm getting at least 25 fps, but then suddenly it will drop down to 0 fps for a second or so, and then jump back up to normal framerates. In game, I was able to run around in town, but there was a lot of screen freeze and choppiness.

Looking at Task Manager, the highest I saw the receive bandwidth reach was around 25 Mbps. So that means the VM was sending about 25 Mbps and the host was receiving about 25 Mbps. 50 Mbps combined doesn't seem like enough to cause a LAN bandwidth issue. CPU utilization never went above 30%. Memory was well below the maximum.
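
As a sanity check on those numbers: the only measured figure below is the 25 Mbps from Task Manager, the rest is arithmetic, and it shows a gigabit link has roughly 20x headroom over the observed traffic.

```python
# Quick arithmetic on the observed streaming traffic versus gigabit Ethernet.
stream_mbps_each_way = 25      # peak receive bandwidth seen in Task Manager
directions = 2                 # VM sending ~25 Mbps, host receiving ~25 Mbps
gigabit_lan_mbps = 1000

total_mbps = stream_mbps_each_way * directions
print(f"Total streaming traffic: {total_mbps} Mbps")             # 50 Mbps
print(f"Gigabit headroom: {gigabit_lan_mbps / total_mbps:.0f}x")  # ~20x, so bandwidth isn't the bottleneck
```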

I'll have to do some more fiddling to see what I can come up with.
 
I was wrong about the VM performance. According to 3DMark 11, the VM is horribly underperforming:

Graphics Score: 2553
Physics Score: 3456
Combined Score: 1131

The VM is using a RemoteFX GPU according to Device Manager. However, Diablo 3 was capping out at around 10 to 14 fps at 1080p. So RemoteFX is either very underwhelming, doesn't play well with my GTX 570, or some feature of the technology is not being used properly.

Outside of the VM, Diablo 3 hits about 30 to 40 fps, so use that as a reference.

I'm going to look at setting up XenServer to try to use HDX 3D Pro. I'll post back my results.
 
I've been doing somewhat similar tests at work, and it seems fruitless so far. We need decent video playback at 1080p and above, and the technology just isn't there.
 
Unless I am missing something, XenServer needs a minimum of two machines: one for XenServer itself and one for XenCenter. If anyone knows a way to run them both on one machine, let me know. I'm thinking XenServer is going to be faster than Hyper-V since it is a bare-metal hypervisor.
 
Unless I am missing something, XenServer needs a minimum of two machines: one for XenServer itself and one for XenCenter. If anyone knows a way to run them both on one machine, let me know. I'm thinking XenServer is going to be faster than Hyper-V since it is a bare-metal hypervisor.

XenCenter is the management interface for XenServer.
You can run a VM on the XenServer that has XenCenter installed.
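
In other words, XenCenter is just a Windows GUI in front of the XenAPI, so a second physical machine isn't strictly required: the host can be driven from its own console with the xe CLI, or from any box (or a VM on the host itself) with the XenAPI Python bindings. A rough sketch, assuming the XenAPI Python module that ships with XenServer and placeholder host credentials:

```python
# Hedged sketch: list the VMs on a XenServer host without XenCenter, using the
# XenAPI Python bindings. The host address and credentials are placeholders.
import XenAPI

session = XenAPI.Session("https://192.168.1.50")
session.xenapi.login_with_password("root", "password")
try:
    # Every non-template, non-dom0 VM -- the same list XenCenter would show.
    for ref in session.xenapi.VM.get_all():
        record = session.xenapi.VM.get_record(ref)
        if not record["is_a_template"] and not record["is_control_domain"]:
            print(record["name_label"], record["power_state"])
finally:
    session.xenapi.session.logout()
```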
 
At this point I'm just waiting for my RAID card to come in to complete my setup. I've been messing around with it already in VMware just because I like my VMs to be portable, but it is buggy as all hell. I'm going to be giving Xen a shot.

For anyone more familiar with it: is it best just to use the XCP boot ISO, or to set up a regular Ubuntu server install and load the XCP-XAPI packages?
 
XenCenter is the management interface for XenServer.
You can run a VM on the XenServer that has XenCenter installed.

This makes sense. Are there any guidance docs on how to get this set up? I couldn't find anything after an exhaustive search. Everything I found just shows how to create VMs after XenCenter is already available.

EDIT: Actually, the XenServer 6.1 free edition doesn't have GPU pass-through. Only the Enterprise edition and above have the feature enabled. So unless someone wants to shell out $2500 for an unproven idea, this one is dead. I guess I'll look into ESXi now...

http://www.citrix.com/products/xenserver/features/editions.html?ntref=next
 
That's the commercial version. The open source version of Xen does support GPU passthrough for free.
 
That's the commercial version. The open source version of Xen does support GPU passthrough for free.

Interesting. Are there any basic guidance documents that can walk me step by step through setting up a single-machine host? One of the great things about XenServer seems to be the ease of getting it installed. However, I'm having a really hard time finding information on how to, for instance, install the Xen hypervisor and then create and access a Windows Server 2012 VM with only a single machine available to me. I'm reading xen.org, but it's a lot of information.
 
So, all of my hardware is in and assembled. I am running VMware ESXi off a USB stick, have Windows 8 Pro running in a VM, and even have both of my 7950 cards recognized by the VM. I am in the process of creating a Windows Server 2008 VM so I can install View Connection Server and take advantage of software PCoIP.

Bad news: Diablo 3 still had unplayable FPS over RDP and the View client console. I haven't tested the performance of the VM, but my feeling is that the Radeon card isn't being used. Is there a way to confirm a process is utilizing a particular device? I will post back more results on this issue soon.

Does anyone happen to know if CrossFire is possible under ESXi? I wasn't able to get the Catalyst Control Center to install, so I'm not sure how I would go about enabling CrossFire.
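
As a rough check on the "is the Radeon actually being used" question above, a quick WMI query from inside the guest can at least show which display adapter Windows considers active. It won't tie a specific process to a GPU, but if only the virtual display adapter reports a current display mode, the Radeons probably aren't doing the rendering. A hedged sketch using the third-party wmi package (pip install wmi):

```python
# Hedged sketch: list the guest's display adapters via WMI. The adapter actually
# driving the desktop normally reports a current resolution; idle adapters usually
# report None for those fields.
import wmi

c = wmi.WMI()
for gpu in c.Win32_VideoController():
    print(
        gpu.Name,
        gpu.Status,
        gpu.CurrentHorizontalResolution,
        gpu.CurrentVerticalResolution,
    )
```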
 
So, all of my hardware is in and assembled. I am running VMware ESXi off a USB stick, have Windows 8 Pro running in a VM, and even have both of my 7950 cards recognized by the VM. I am in the process of creating a Windows Server 2008 VM so I can install View Connection Server and take advantage of software PCoIP.

Bad news: Diablo 3 still had unplayable FPS over RDP and the View client console. I haven't tested the performance of the VM, but my feeling is that the Radeon card isn't being used. Is there a way to confirm a process is utilizing a particular device? I will post back more results on this issue soon.

Does anyone happen to know if CrossFire is possible under ESXi? I wasn't able to get the Catalyst Control Center to install, so I'm not sure how I would go about enabling CrossFire.
Did you pass through the video to the OS?
 
With View you have to use PCoIP to enable 3D hardware acceleration.

That's what I'm working on now. I'm trying to figure out how to get PCoIP working without needing to install vCenter Server. I don't have $6,000 lying around for this little project, unfortunately.
 
Well, not to get too realistic on you - but if you can support a single playable session you've accomplished a great thing. VDI, which is essentially what you're doing here, isn't designed for this type of use case yet. And don't forget all VMware software comes with a 60-day trial. After that time, register with another email address and you can get another 60-day key. Repeat to your heart's content.

You're not getting many replies from the veterans here because most of us don't want to touch this topic with a ten-foot pole. But I will say it's great to read about your adventures, so I'll encourage you to keep at it and keep us informed. If this were an easy thing to do, it would have already been commercialized and turned into the next great gaming console. That said - good luck!
 
Well, not to get too realistic on you - but if you can support a single playable session you've accomplished a great thing. VDI, which is essentially what you're doing here, isn't designed for this type of use case yet. And don't forget all VMware software comes with a 60-day trial. After that time, register with another email address and you can get another 60-day key. Repeat to your heart's content.

You're not getting many replies from the veterans here because most of us don't want to touch this topic with a 10 foot pole. But I will say it's great to read about your adventures, so I'll encourage you to keep at it and keep us informed. If this were an easy thing to do, it would have already been commercialized and turned into the next great gaming console. That said - good luck!

Haha, thanks. I'll happily spend hours of time achieving nothing. It's fun even if only as a learning experience.

I did find a lightweight and incredibly simple product from VMware called Boomerang (one of their Flings). It allows software PCoIP with only the agent installed on an ESXi VM; no vCenter Server is needed. So, I think I have the PCoIP part covered now.

I can't seem to get the VM to use the GPU cards I've passed through. I can see them both in Device Manager, but Diablo is still running at 3 fps. Additionally, if I disable the VMware SVGA 3D adapter, the desktop becomes extremely sluggish. So, I think the passed-through cards are never being utilized. Is there something extra I need to do inside the VM to get the OS to use the passed-through devices?
 
Haha, thanks. I'll happily spend hours of time achieving nothing. It's fun even if only as a learning experience.

I did find a lightweight and incredibly simple product from VMware called Boomerang (one of their Flings). It allows software PCoIP with only the agent installed on an ESXi VM; no vCenter Server is needed. So, I think I have the PCoIP part covered now.

I can't seem to get the VM to use the GPU cards I've passed through. I can see them both in Device Manager, but Diablo is still running at 3 fps. Additionally, if I disable the VMware SVGA 3D adapter, the desktop becomes extremely sluggish. So, I think the passed-through cards are never being utilized. Is there something extra I need to do inside the VM to get the OS to use the passed-through devices?

Yeah, buy high-end Tegra cards and chop them up using the virtualized drivers in ESX - should cost around 5k, give or take.

DirectPath was never intended for this. When you pass the card through, the guest uses it like a REAL card - including the video out on the back of the board. The card doesn't understand the concept of output via PCoIP, because it's a consumer video card - it outputs out the port on its ass (speaking metaphorically). You're trying to redirect the output of a card to a different "port", one that it has no concept or understanding of.

Realistically, you'll have to get a card that is designed to understand that - it'd be one of the Tegra cards that has drivers for PCoIP output. But those are professional-level cards, and they ain't cheap.
 
Well, not to get too realistic on you - but if you can support a single playable session you've accomplished a great thing. VDI, which is essentially what you're doing here, isn't designed for this type of use case yet. And don't forget all VMware software comes with a 60-day trial. After that time, register with another email address and you can get another 60-day key. Repeat to your heart's content.

You're not getting many replies from the veterans here because most of us don't want to touch this topic with a 10 foot pole. But I will say it's great to read about your adventures, so I'll encourage you to keep at it and keep us informed. If this were an easy thing to do, it would have already been commercialized and turned into the next great gaming console. That said - good luck!

We're doing it - we had a big display at VMworld running 3DMark 08 or so. Not the latest, but something ;)

But... you can't do it with consumer cards. Unless you just plug a monitor into each one. Then you have to figure out the keyboard crap...
 
subbed

I am curious to find out the outcome. I had a similar thread a while ago, but was discouraged before I conducted any testing.
 
Yeah, buy high-end Tegra cards and chop them up using the virtualized drivers in ESX - should cost around 5k, give or take.

DirectPath was never intended for this. When you pass the card through, the guest uses it like a REAL card - including the video out on the back of the board. The card doesn't understand the concept of output via PCoIP, because it's a consumer video card - it outputs out the port on its ass (speaking metaphorically). You're trying to redirect the output of a card to a different "port", one that it has no concept or understanding of.

Realistically, you'll have to get a card that is designed to understand that - it'd be one of the Tegra cards that has drivers for PCoIP output. But those are professional-level cards, and they ain't cheap.

Thank you! I was able to verify this. I plugged one monitor into the remote machine and one into the ESXi machine. The mouse and keyboard were plugged into the remote machine. I connected to the VM from the remote machine using Boomerang, and, within the VM, set the desktop to only display on the monitor plugged into the ESXi machine. I started up Diablo 3, and it was silky smooth. The mouse and keyboard were very responsive. PCoIP wasn't being used for the video output because, as you suggested, output was via HDMI from the ESXi box. So, I guess, if anything, I've shown that keyboard and mouse input is fine over PCoIP...

I suppose the only way this will work is if there is some way to redirect the video card output to a destination other than the port on the back of the ESXi box...
 
Which, unless someone has gotten REALLY creative, is limited to the pro cards with special drivers that "carve" them up between guests and hand the rendered frames back to the virtual video card, effectively "faking" a real card in the guest by offloading the processing and forwarding the results via PCoIP.
 
Speaking of getting creative, could I use a VGA-to-LAN adapter, or something like a wireless HDMI transmitter, to stream the video output? That would at least let me get the video output off the back of the ESXi box. It isn't the most elegant solution, but I think it would work for however many GPUs I have inside the machine.
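
For what it's worth, the raw numbers suggest why those extender boxes all compress (and therefore add latency): uncompressed 1080p60 is roughly three times what gigabit Ethernet can carry. A quick back-of-the-envelope:

```python
# Back-of-the-envelope: bandwidth of an uncompressed 1080p60 stream versus gigabit LAN.
width, height = 1920, 1080
bits_per_pixel = 24
fps = 60

raw_mbps = width * height * bits_per_pixel * fps / 1_000_000
print(f"Uncompressed 1080p60: {raw_mbps:.0f} Mbps")   # ~2986 Mbps, about 3x a gigabit link
```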
 
Speaking of getting creative, could I use a VGA-to-LAN adapter, or something like a wireless HDMI transmitter, to stream the video output? That would at least let me get the video output off the back of the ESXi box. It isn't the most elegant solution, but I think it would work for however many GPUs I have inside the machine.

It might, no idea how the lag is on those.
 
Hmm, it is kind of viable using "free" solutions so far. I've gotten to the point where I have a single GPU passed through with a consumer ATI card, and I can stream from that VM to another Windows VM relatively lag- and latency-free. It's good enough for gaming at 1200x800 (the native resolution of that screen), but I still have minor problems with mouse input not registering properly in a 3D environment.

Chopping up the GPU for a multi-seat environment and a better PCoIP implementation is my next goal.
 
Hmm, it is kind of viable using "free" solutions so far. I've gotten to the point where I have a single GPU passed through with a consumer ATI card, and I can stream from that VM to another Windows VM relatively lag- and latency-free. It's good enough for gaming at 1200x800 (the native resolution of that screen), but I still have minor problems with mouse input not registering properly in a 3D environment.

Chopping up the GPU for a multi-seat environment and a better PCoIP implementation is my next goal.

If you don't mind me asking, how did you stream from one VM to another? I was able to pass the card through to the VM, but none of the streaming protocols know what to do with the passed-through device. This was discussed earlier in the thread, so I'm curious what approach you used.
 
I'm still experimenting with other options and will be trying Spice and thin clients soon.

I tried Splashtop since it has an Android app as well, and it worked pretty well for streaming, but it has the limitation that it can't do full-screen apps like games, so you have to run them in windowed mode.
 
I'm still experimenting with other options and will be trying Spice and thin clients soon.

I tried Splashtop since it has an Android app as well, and it worked pretty well for streaming, but it has the limitation that it can't do full-screen apps like games, so you have to run them in windowed mode.

Ah, I tried Splashtop and didn't get playable framerates. I think Splashtop uses software rendering, which gives dramatically lower performance compared to utilizing a passed-through GPU. As long as the resolution stays very low for all the moving parts of the screen, I think Splashtop manages to do okay.
 
Hmm, well, 1200x800 is very playable for me (or perhaps I'm just very tolerant), but I've compared a raw output screen side by side with a screen running Splashtop inside the VM, and the latency between them is small enough that you won't really notice it.

True, Splashtop is not as good as a raw PCoIP solution, but that requires some expensive hardware and software, which is beyond my student budget.

I'm doing this as my final-year project as well as out of personal interest: a LAN in a box, with a cluster doing all the processing and multiple devices like Android tablets and Raspberry Pis acting as thin clients for gaming and various other purposes.

A cluster would provide hardware redundancy and a single point of upgrade.
 
One thing we know is that NVIDIA had to do a lot of their own work to make this stuff work well with NVIDIA GRID, and they still aren't done. So it will take a while for Microsoft, VMware, and Citrix to copy and advance it in a generic way that applies to all graphics cards.

There is a chance that the GeForce Titan has some of the virtualization features of GK110, a chance that some of the drivers on some platforms expose some of those features, and a chance that some of the virtualization platforms would use those features today. Like a 0.01% chance.

Until then, we've got to deal with what we've got: one graphics card per VM and kludgy remoting?
 
Subscribed. I would be interested in how this develops.

Now my input:

I'm running a Windows VM on a Linux Mint (Ubuntu-based) host with the Xen hypervisor and VGA passthrough. The VGA passthrough is awesome and works really well (just like bare metal). My reason for choosing Xen over other solutions (KVM, Hyper-V, VMware, Parallels Workstation, etc.) is simple: a year ago it was the most advanced hypervisor for VGA passthrough, and it probably still is.

My Windows VM uses an Nvidia Quadro 2000 card, which is specified as "Multi-OS"; non-Multi-OS Nvidia cards are tricky or impossible to pass through. AMD cards, like the ones the OP uses, should work just fine. My host OS uses a cheap AMD card.
My how-to can be found here: http://forums.linuxmint.com/viewtopic.php?f=42&t=112013. Note: Ubuntu is even easier to set up, since it has the LVM option in the installer.
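
For a sense of how small the guest definition ends up being in that kind of setup, here's a hedged sketch that writes out a hypothetical xl config with PCI passthrough; the GPU address, disk paths, and ISO path are placeholders to be replaced with your own lspci output and storage layout.

```python
# Hedged sketch: generate a minimal Xen xl guest config with a GPU passed through.
# Every device path and the 01:00.0 GPU address below are placeholders.
XL_CONFIG = """\
name    = "win2012"
builder = "hvm"
memory  = 8192
vcpus   = 4
disk    = ['phy:/dev/vg0/win2012,hda,w', 'file:/isos/win2012.iso,hdc:cdrom,r']
boot    = "dc"
vif     = ['bridge=xenbr0']
pci     = ['01:00.0']   # GPU to pass through, as reported by lspci
vnc     = 1             # VNC console for the install, before the passed-through GPU takes over
"""

with open("/etc/xen/win2012.cfg", "w") as f:
    f.write(XL_CONFIG)

# Afterwards, start the guest with: xl create /etc/xen/win2012.cfg
```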

The tricky part has already been described: How to stream the video over the LAN to a remote thin client. I don't have the answer, unfortunately. PCoIP looks like an interesting technology, but I would try Spice as well. The latter is open source.

Does anyone have experience with redirecting or streaming video? Or does anyone know of Nvidia providing drivers for the Quadro series to accomplish that? Any experience with Spice?
 
The tricky part has already been described: How to stream the video over the LAN to a remote thin client. I don't have the answer, unfortunately. PCoIP looks like an interesting technology, but I would try Spice as well. The latter is open source.

Does anyone have experience with redirecting or streaming video? Or does anyone know of Nvidia providing drivers for the Quadro series to accomplish that? Any experience with Spice?

The Spice features page lists 3D acceleration as future functionality, so Spice probably isn't an option for now.
 
One thing we know is that NVIDIA had to do a lot of their own work to make this stuff work well with NVIDIA GRID, and they still aren't done. So it will take a while for Microsoft, VMware, and Citrix to copy and advance it in a generic way that applies to all graphics cards.

There is a chance that the GeForce Titan has some of the virtualization features of GK110, a chance that some of the drivers on some platforms expose some of those features, and a chance that some of the virtualization platforms would use those features today. Like a 0.01% chance.

Until then, we've got to deal with what we've got: one graphics card per VM and kludgy remoting?

0.01% chance?

We were showing it off at VMworld, running games and 3DMark, live, in View desktops using PCoIP. It just takes the right card. It'll be fully supported in the next release, and it already works in various forms.
 
0.01% chance?

We were showing it off at VMworld, running games and 3DMark, live, in View desktops using PCoIP. It just takes the right card. It'll be fully supported in the next release, and it already works in various forms.

If the performance is where it needs to be, I could see never building myself another dedicated workstation again - just one beastly server.
 
"right" card can be an issue. no reason why a consumer card wont do the job but no nvidia wants you to fork out atlest $2000 for a quadro and you get subpar gaming performance.

the affordability of such card is beyond what a home user and most small business can afford
 