Why Virtualized 3D Graphics Are Moving to GPUs

In separate talks, virtualization experts Rachel Berry, Thomas Poppelgaard and Dane Young each featured NVIDIA GRID vGPU graphics acceleration. The technology also appeared in sessions and demos throughout the show, including those from our partners Cisco, Dell, HP and NetApp. Traditional virtual desktop infrastructure (VDI) offerings relied solely on server CPUs, and the limits of the CPU made it nearly impossible to deliver a satisfactory user experience for virtualized, interactive, media-rich applications.

As a result, virtualization had worked well only for some users, primarily task workers and certain knowledge workers. Left out were those with more graphically intense workloads: graphic designers, developers, and video producers and editors. That's now changing. GRID technology is opening new pathways for these users by offloading graphics processing from the CPU to the GPU. Dell, Citrix and NVIDIA technologies offer a powerful combination to get this done. With Dell PowerEdge R730 servers running Citrix XenDesktop 7 and NVIDIA GRID vGPUs, IT staff can deliver rich PC-graphics experiences and applications to more users, while applications and critical data remain protected and secure in the data center.
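GRID vGPU on XenServer is provisioned through Citrix's and NVIDIA's own tooling, but to make the "one physical GPU shared across many VMs" idea concrete, here is a minimal sketch of the equivalent on KVM, using libvirt's mediated-device (mdev) support through its Python bindings. The guest name and mdev UUID below are placeholders, and it assumes a vGPU mdev instance has already been created on the host:

```python
# Minimal sketch: attach an existing NVIDIA vGPU mediated device (mdev)
# to a KVM guest via libvirt's Python bindings. The guest name and the
# mdev UUID are hypothetical placeholders.
import libvirt

MDEV_UUID = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"  # placeholder mdev instance

VGPU_XML = f"""
<hostdev mode='subsystem' type='mdev' model='vfio-pci'>
  <source>
    <address uuid='{MDEV_UUID}'/>
  </source>
</hostdev>
"""

conn = libvirt.open("qemu:///system")      # connect to the local hypervisor
dom = conn.lookupByName("vdi-guest-01")    # hypothetical guest name
dom.attachDeviceFlags(VGPU_XML, libvirt.VIR_DOMAIN_AFFECT_CONFIG)
```

Each guest gets its own mdev instance carved out of the same physical card, which is what lets several VMs share one GPU.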
 
Virtualized GPU graphics with Nvidia cards were working great in KVM until Nvidia disabled it in their drivers to convince us to buy their overpriced GRID crap!

FU Nvidia! AMD all the way.
 
Ya, if anything I prefer to keep it in-house, so I have my own powerful graphics card in a network closet and thin-client peripherals around the house. With data caps and latency and everything, I trust my own CAT6 gigabit network over a remote service any day of the week.
 
Virtualized GPU graphics with Nvidia cards were working great in KVM until Nvidia disabled it in their drivers to convince us to buy their overpriced GRID crap!

FU Nvidia! AMD all the way.

WOW! That's nuts. That's the first I've heard of that.
 
WOW! That's nuts. That's the first I've heard of that.

Yeah, though I wish I could edit my post and add the disclaimer that the KVM stuff would let you use the GPU with one VM at a time, not share it across a bunch of them. I submitted the first reply a bit too quickly.

But still, the capability to do that within KVM was very useful to me when it was available, and it allowed gaming within a VM no problem.
 
If you just want 1 to 1, can't you just do a PCI passthrough to a specific VM? I'm not terribly familiar with KVM.
 
If you just want 1 to 1, can't you just do a PCI passthrough to a specific VM? I'm not terribly familiar with KVM.

This is what Nvidia blocked in drivers.

You can pass it through. But only one VM at a time can use it. Except that now it's zero at a time with Nvidia because they blocked it in drivers.
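For anyone who hasn't tried it, the 1-to-1 passthrough being discussed amounts to handing a single guest the card's host PCI address. A minimal sketch with libvirt's Python bindings, assuming a GPU at host address 01:00.0 and a guest named win10-guest (both hypothetical):

```python
# Minimal sketch: dedicate a whole GPU to one KVM guest via VFIO PCI
# passthrough, using libvirt's Python bindings. The host PCI address and
# guest name are placeholders. Only one guest can own the card at a time.
import libvirt

GPU_XML = """
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
"""

conn = libvirt.open("qemu:///system")
dom = conn.lookupByName("win10-guest")  # hypothetical guest name
dom.attachDeviceFlags(GPU_XML, libvirt.VIR_DOMAIN_AFFECT_CONFIG)
```

With managed='yes', libvirt detaches the card from its host driver and hands it to VFIO for the guest; that exclusivity is why only one VM at a time can use it.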

I would have edited my original post to clarify after I realized how unclear I was, but the lack of an edit button continues to make this subforum awesome, huh?
 
If you just want 1 to 1, can't you just do a PCI passthrough to a specific VM? I'm not terribly familiar with KVM.

Yes, but only for specific cards... particularly pricey ones. It used to be that you could do it with any card until Nvidia disabled it in their drivers.

They have a habit of doing this with lots of desirable "features" across their products in order to feature-lock or upsell customers. Remember when you used to be able to use an Nvidia GPU as a PhysX card alongside an AMD GPU as the primary video card?
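For what it's worth, the workaround people eventually settled on was to hide the hypervisor from the guest so the driver's virtualization check doesn't trip. A minimal sketch of flipping that switch with libvirt's Python bindings; the guest name is hypothetical, and it assumes the domain XML already contains a <features> block:

```python
# Sketch: hide the KVM signature from the guest so a guest driver's
# virtualization check doesn't trip. Assumes libvirt-python is installed
# and a (hypothetical) guest named "win10-guest" is already defined.
import libvirt

HIDDEN_KVM = "<kvm><hidden state='on'/></kvm>"  # belongs inside <features>

conn = libvirt.open("qemu:///system")
dom = conn.lookupByName("win10-guest")
xml = dom.XMLDesc(libvirt.VIR_DOMAIN_XML_INACTIVE)

# Splice the element into the <features> block if it isn't there yet.
# (A sketch only; real code would edit the XML with a proper parser.)
if "<kvm>" not in xml:
    xml = xml.replace("<features>", "<features>" + HIDDEN_KVM)
    conn.defineXML(xml)  # persist the modified domain definition
```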
 
Didn't know that was something Nvidia could block; I thought it was handled 100% by the hypervisor, invisible to the hardware.

In any case, assuming that's true, it's stupid, and I wonder what Nvidia's true motivation was. No one buys a GRID for 1 to 1, so what's the point in crippling that use case?
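On the "invisible to hardware" point: it isn't, quite. KVM, like other hypervisors, announces itself to the guest through a CPUID bit and a signature string ("KVMKVMKVM"), and a driver is free to check for those. A minimal sketch of reading that bit on a Linux guest, where the kernel exposes it as a flag in /proc/cpuinfo:

```python
# Minimal sketch: check whether this Linux system reports the CPUID
# "hypervisor" bit, which the kernel surfaces as a /proc/cpuinfo flag.
def running_under_hypervisor():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return "hypervisor" in line.split()
    return False

if __name__ == "__main__":
    print("hypervisor flag set:", running_under_hypervisor())
```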
 
In any case, assuming that's true, it's stupid, and I wonder what Nvidia's true motivation was. No one buys a GRID for 1 to 1, so what's the point in crippling that use case?

Because that's just how Nvidia operates. Most companies don't care about you, but Nvidia really goes out of their way to prove it.

They always claim that driver bugs broke these features, yet they never explain how a bug could do that, and they've never fixed those supposed bugs. Very curious, is it not?

Well, anything I say is second-hand info to you, but I will note that I have actually experienced the problem first-hand and am not just parroting stuff I heard elsewhere. Still second-hand info to you, but at least it's not third-hand.
 
I believe you, and Sunny corroborates it.

I'm just trying to think through the reasoning, beyond "Fuck you, customer, eat this!" GRID is for enabling a hundred systems; no one is going to buy one for a single VM, even with passthrough disabled. People will just go physical and buy a top-of-the-line workstation for half the cost of a GRID.

So what the hell is Nvidia gaining? Just doesn't make sense to me is all. Wondering if there's more to it. Too lazy to Google.
 
This will be good for gaming on next-gen computers, which are almost all passively cooled tablets, and it helps push the current clunky, inefficient desktop computing model into obsolete-landia that much faster.

Even better for publishers: services like Steam could just let you play the game on their servers without you ever downloading a copy to your device. That gives them much tighter control over the software, and it prevents people from sharing copies, since the content is never stored on the local computer. They only need to update their server farm instead of offering patches for download, and it keeps users from tampering with game content to add functionality or inappropriate nudity. All good things for the gaming industry in general, since it means more games can be played on more power-efficient devices with little or no local storage.
 
Virtualized GPU graphics with Nvidia cards were working great in KVM until Nvidia disabled it in their drivers to convince us to buy their overpriced GRID crap!

FU Nvidia! AMD all the way.

Yes. Yes. Yes.

On a separate note, I love how so many different types of workloads are being moved to the GPU. The future of the GPU is so bright! :)
 
When I changed from a Radeon 5850 to a 7950, I did try a friend's GTX 280, which did not work, although I only had it for two days, so I can't be entirely sure whether Xen needed patches or not. But my 7950 works flawlessly. And to be clear, I've played AAA games only in a virtual machine since one day before 11/11/11 (Skyrim's release date, for those who don't know/remember). I've also played racing games on three Full HD monitors for over a year.
 