Anyone excited about nV's cloud GPU stuff

tangoseal

[H]F Junkie
Joined: Dec 18, 2010
Messages: 9,743
Like the title says...

Who's jumping on the bandwagon?

Can the tech be applied to your own GPU at home and streamed to any PC you want, like on a road trip?

Does it only use nVidia's datacenter GPUs, so you can't use your own?
 
Cloud GPUs seem like an awesome idea for general compute tasks. Especially if there were a convenient API that made it appear to be a CUDA device on the local machine, it'd probably be great for non-real-time CUDA-accelerated tasks, of which I can imagine a few that might be relevant to the industry I work for.
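For what it's worth, projects like rCUDA already do roughly this by intercepting the CUDA runtime library. A minimal sketch of the idea in Python, with an entirely hypothetical protocol, class, and host names, just to show the shape of it:

```python
# Hypothetical sketch only: a client that makes a remote GPU look like
# a local device by shipping each CUDA-style call over TCP. Real
# systems (e.g. rCUDA) intercept the CUDA runtime library instead.
import json
import socket

class RemoteCudaDevice:
    def __init__(self, host: str, port: int = 9999):
        self.sock = socket.create_connection((host, port))

    def _recv_exact(self, n: int) -> bytes:
        buf = b""
        while len(buf) < n:
            chunk = self.sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("remote GPU service went away")
            buf += chunk
        return buf

    def _call(self, op: str, **args):
        # One length-prefixed JSON message per API call. A real shim
        # would use a binary protocol and batch calls to hide RTTs.
        msg = json.dumps({"op": op, "args": args}).encode()
        self.sock.sendall(len(msg).to_bytes(4, "big") + msg)
        size = int.from_bytes(self._recv_exact(4), "big")
        return json.loads(self._recv_exact(size))

    def malloc(self, nbytes: int) -> int:
        return self._call("cudaMalloc", nbytes=nbytes)["ptr"]

# dev = RemoteCudaDevice("gpu-box.example.com")   # hypothetical host
# ptr = dev.malloc(1 << 20)                       # allocates remotely
```

The hard part any real shim has to solve is hiding the network round trip per API call, which is why this only makes sense for non-real-time batch work.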

I'm not really seeing the utility for games, though. I play games mostly as an excuse to build a hot-rod gaming PC, so the idea of renting time on one from nVidia is completely unappealing to me. I could see it being an easy sell to the crowd that enjoys watching game streams, though.
 
Sure, p2 and p3 boxes in AWS. We use them at work for AI stuff. No need to hook it to your local machine, just ssh to the box (from your local) and it's fairly seamless. Load any data you need to S3, or scp to the box from local if you want to keep it contained to just your box. Most companies that build the software will send you an AMI that has everything you need, so very little effort to get running.
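A rough sketch of that workflow, assuming AWS credentials and SSH keys are already configured; the bucket, hostname, and script names below are made up:

```python
# Sketch of the workflow above: stage data in S3 (or scp it), then
# run the job on the p3 box over ssh. All names are hypothetical.
import subprocess
import boto3

BUCKET = "my-training-data"                          # hypothetical bucket
HOST = "ubuntu@ec2-0-0-0-0.compute-1.amazonaws.com"  # your p3 instance

# Stage the dataset in S3 so the GPU box (or the next one) can pull it.
boto3.client("s3").upload_file("dataset.tar.gz", BUCKET, "dataset.tar.gz")

# Or keep it contained to just your box: scp straight from local.
subprocess.run(["scp", "dataset.tar.gz", f"{HOST}:~/"], check=True)

# Kick off the job over ssh; stdout streams back to your terminal.
subprocess.run(["ssh", HOST, "python3 train.py --data ~/dataset.tar.gz"],
               check=True)
```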

Not for gaming; this is for HPC/AI. I don't see any case where remote rendering would make sense due to latency. If you want super graphics and a high framerate, a 100 ms+ delay for that image to actually get back to you, along with the delay in your controls getting there, will ruin the experience.
 
Cloud GPUs seem like an awesome idea for general compute tasks. Especially if there were a convenient API that made it appear to be a CUDA device on the local machine, it'd probably be great for non-real-time CUDA-accelerated tasks, of which I can imagine a few that might be relevant to the industry I work for.

I'm not really seeing the utility for games, though. I play games mostly as an excuse to build a hot-rod gaming PC, so the idea of renting time on one from nVidia is completely unappealing to me. I could see it being an easy sell to the crowd that enjoys watching game streams, though.
Ha - yea the amount of time spent building/researching/tweaking vs gaming isn't what it used to be...
 
It's still game streaming, which means latency is still the enemy.
Even under ideal circumstances, the image looks noticeably worse when using Steam streaming. That alone kills it for me personally, even if I were to get past my insistence on playing games on my own hardware.

If I wanted to compromise on graphical quality, I'd play on a console.

Edit: Garbled a quote. Fixed now.
 
There's Parsec for those with a great home Internet connection who want to play using their own PC.
 
I think the cloud stuff is pretty cool. Old curmudgeon me who was materialistic probably would have balked, but to be honest I think the mentality of outsourcing is just how it is now. I've spent over $3k in equipment just the last 2 years to up my game experience, and most of the stuff is out of date now, and requires a thousand or two more to get the latest. Realistically I could have paid for a month-to-month service and used what I need at my own time. The convenience of it all being delivered over the net makes it better... so I like the idea. But as always, ideas are cheap and execution is where it matters, so it would be cool to see how this goes when there's actually a high volume of people using it.
 
Even under ideal circumstances, the image looks noticeably worse when using Steam streaming. That alone kills it for me personally, even if I were to get past my insistence on playing games on my own hardware.

If I wanted to compromise on graphical quality, I'd play on a console.

Edit: Garbled a quote. Fixed now.

I haven't used it myself, but people claimed there was latency even when using Steam Link, which just streams within your local network from PC to TV.

With all the random hops and congestion on the Internet itself, I can't see it working all that well unless you're just playing some casual game.
 
I haven't used it myself, but people claimed there was latency even when using Steam Link, which just streams within your local network from PC to TV.

With all the random hops and congestion on the Internet itself, I can't see it working all that well unless you're just playing some casual game.
There is. How noticeable the latency is depends on the game; I don't notice it much in Witcher 3, but it was pretty obvious in Remember Me. I could see it being a serious deal-breaker in games like PUBG or CS:GO, and it would obviously be a huge problem for VR.

The real problem for me with Steam streaming is the compression. Even over a gigabit link, 1080p ends up looking worse than YouTube, which, at least in Witcher 3, makes the game look really bad. Like, if you go out in the areas with lots of trees, it's super obvious.

Anyway, I can see this service maybe having a place among some gamers, especially if they do a good job tying it in with Twitch or YouTube or something, but it's wasted on me. Like I said before, the hobby is as much about building a hot rod to me as it is about actually playing games, and the idea of just renting time on someone else's PC doesn't excite me at all.
 
I’d be more excited about enough bandwidth being commonplace to make a tech like this worth having.
 
Latency can be mitigated through predictive algorithms. Based on expected inputs, adjustments can be made to make latency virtually nonexistent. Nvidia mentioned nearly no latency impact using their tech. Something tells me they will be doing deep learning on a per-user basis and then applying predictive logic to that person's recognized patterns. I know, I know, sigh... it's all sci-fi.
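For reference, the non-sci-fi version of this already exists in multiplayer netcode as dead reckoning: extrapolate the stale state forward by the measured latency and correct when real data arrives. A toy sketch of that baseline (the per-user deep-learning layer Nvidia hinted at is speculation, and would sit on top of something like this):

```python
# Classic dead reckoning: guess where the player is *now* from a
# state that is latency_s seconds stale. Wrong guesses get corrected
# when the next authoritative update arrives ("rubber-banding").
from dataclasses import dataclass

@dataclass
class PlayerState:
    x: float       # position (m)
    vx: float      # velocity (m/s)

def predict(last: PlayerState, latency_s: float) -> PlayerState:
    return PlayerState(x=last.x + last.vx * latency_s, vx=last.vx)

stale = PlayerState(x=10.0, vx=5.0)   # server state, 100 ms old
print(predict(stale, 0.100))          # PlayerState(x=10.5, vx=5.0)
```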
 
Latency can be mitigated through predictive algorithms. Based on expected inputs, adjustments can be made to make latency virtually nonexistent. Nvidia mentioned nearly no latency impact using their tech. Something tells me they will be doing deep learning on a per-user basis and then applying predictive logic to that person's recognized patterns. I know, I know, sigh... it's all sci-fi.

There is only so much you can do when it comes to predicting/mitigating latency for complex real-time applications like modern video games. Light is simply not fast enough sometimes...
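The physics floor is easy to put numbers on. Assuming signals in fibre propagate at roughly two-thirds the speed of light:

```python
# Distance alone sets a hard floor on round-trip time, before
# rendering, encoding, decoding or display add anything on top.
C_FIBRE_KM_S = 200_000            # ~speed of light in glass, km/s

def rtt_floor_ms(distance_km: float) -> float:
    return 2 * distance_km / C_FIBRE_KM_S * 1000

for km in (50, 500, 2000):        # nearby, regional, cross-country DC
    print(f"{km:>5} km away: >= {rtt_floor_ms(km):.1f} ms round trip")
# 50 km: 0.5 ms, 500 km: 5.0 ms, 2000 km: 20.0 ms
```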

I do like cloud gaming because I can't afford to have both a top of the line desktop and a top of the line laptop. Also, I like my laptops compact with a good battery life (and also, silent). So yeah, I set up Steam Streaming over my VPN for that reason. But to be honest, I rarely use it. When I'm not home I'm not always connected to a solid fibre network to start with. And also because, you know, life :)

It does work very well: if you have a lot of spare bandwidth on both ends, GPU encoding & decoding with a high bitrate gives impressively low latency.
 
A better idea is, when not at home, do other things, i.e. life.
Look forward to playing at home in full quality when you get back.

Yeah, that is how I made it sound, so let me rephrase the overarching idea... we're not really talking about having more gaming access so much as: will the industry move to where you no longer need a $600 GPU, and an IGP is fine because heavy gaming is offloaded to a datacenter instead?

I don't think that's realistic... imagine this: PUBG 4 is out 10 years from now and 30 million play it. You would need 30 million GPUs in a datacenter? Not happening unless they are all clustered and part of a giant shareable hive making one or several big-ass logical GPUs.
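Back-of-the-envelope, with entirely made-up concurrency numbers, the picture looks like this:

```python
# Players aren't all online at once, so a datacenter only has to
# cover peak concurrency. Every figure here is an assumption.
players = 30_000_000
peak_concurrent_share = 0.10      # assume 10% online at peak
sessions_per_gpu = 1              # one stream per GPU, worst case

gpus_needed = players * peak_concurrent_share / sessions_per_gpu
print(f"{gpus_needed:,.0f} GPUs at peak")   # 3,000,000 GPUs

# Even with generous multiplexing that's millions of boards, which
# is why sharing/slicing GPUs into logical devices matters so much.
```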
 
Can the tech be applied to your own GPU at home and streamed to any PC you want, like on a road trip?

This has already been available via Nvidia GameStream. Latency is of course going to be an issue.

I use Moonlight GameStream to stream from home to work. The good thing about this is it's pretty self-contained. Just get a portable Chrome install.

As for complaining about the quality and latency, it depends on what your expectations are. I still would only play Overwatch at home, but there are plenty of games where I don't mind the latency and quality hit outside of home either. Or even something like: I might play a game like Deus Ex or Witcher 3 at home primarily, but if I'm doing backtracking cleanup for quests and such, I'm fine with the latency and quality hit and just being more time-efficient doing it over streaming.

I think the cloud stuff is pretty cool. Old curmudgeon me who was materialistic probably would have balked, but to be honest I think the mentality of outsourcing is just how it is now. I've spent over $3k in equipment just the last 2 years to up my game experience, and most of the stuff is out of date now, and requires a thousand or two more to get the latest. Realistically I could have paid for a month-to-month service and used what I need at my own time. The convenience of it all being delivered over the net makes it better... so I like the idea. But as always, ideas are cheap and execution is where it matters, so it would be cool to see how this goes when there's actually a high volume of people using it.

You don't need to spend 3k+ in equipment to get the equivalent quality this will be offering.
 
Latency - no thanks, unless I decide to play something like an RTS against a computer?
 
It's not just Parsec - a far-more-available option in the eastern US (and one that I used until I switched to Pascal) is LiquidSky; in fact, I primarily used it for an RTS (which is not as bound by latency as a shooter is). (The RTS in question is Anno 2205.)
In the case of Anno 2205, it's small (which is not true of many games these days) - the bigger surprise (to me) was that Anno 2205 was smaller than Starcraft II: Wings of Liberty (which is older). But what is still the bigger driver when it comes to gaming in general? It is STILL *twitch* gaming (shooters mostly), which is at the mercy of bandwidth most of all - look at the complaints about ALL the ways to leverage the technology (Parsec, LiquidSky, even AWS directly): the complaints all have to do with bandwidth.
LiquidSky uses nVidia Tesla GPUs - we should be familiar with them as a technology, as they're what nVidia GameStream (for the Shield streaming devices and tablets) has been leveraging, along with the GRID SaaS (Streaming as a Service) appliances that have been in use. Could it be deployed to ordinary "Joe/Jane" user PCs? Of course - GRID SaaS hasn't gone away merely because Pascal and LiquidSky are out there.
 
I've always chased the highest resolutions possible when gaming - currently using a 32" 4K G-Sync monitor. Cloud GPUs make zero sense for those who game at high resolutions... the latency alone completely kills it, even if you do have a massive-bandwidth net connection via fiber.
 
I look at this as a solution in search of a problem.

If I want a GPU, I'll BUY a GPU.

I'm not mining. So I refuse to RENT a GPU.
 
I look at this as a solution in search of a problem.

If I want a GPU, I'll BUY a GPU.

I'm not mining. So I refuse to RENT a GPU.

Now that it's been months, I feel as if this was just a feeler into a test market. A hypothesis and nothing more. Probably won't thrive in this day and age.
 
Now that it's been months, I feel as if this was just a feeler into a test market. A hypothesis and nothing more. Probably won't thrive in this day and age.
With consoles pushing 4K, sound infrastructure, unique games, VR on PS4, etc., I just don't see how blurry 1080p with latency would be enticing. Would be funny if someone hijacked the GPUs into a massive mining endeavour. Maybe Nvidia is already doing that.
 
Remote 720p gaming I could see as useful for people who could then just get thin-and-light laptops with nice RAM, storage, and CPU specs while streaming their game from home or the cloud.
 
I was messing around with it again this evening and this is the message I got... so apparently it is more popular than we naysayers would believe.



[screenshot attached]
 
No need to subscribe to that service if you have a good home internet upload speed and a GeForce 600 through GeForce 10 series card.
 
Latency and video compression artifacts
Sounds like a gamer's hell
 
I have very stable, low-latency gig fiber, and it's still not remotely good enough for smooth or responsive gaming. A typical internet connection is 10+ times worse.

High-quality uncompressed video is 20 Gbps or more, and the tech world is already taking ages just to update the standard for a short cable run in the same fucking room.
I blame DRM and similar inertia to some extent; networking already has 100 Gbps+ over long fiber or short twinax (QSFP28), which could be done way cheaper if mass-produced.
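For reference, here's where figures like that come from: raw 24-bit RGB at common gaming formats.

```python
# Uncompressed video bandwidth = width * height * refresh * bits/px.
def raw_gbps(w, h, hz, bits_per_px=24):
    return w * h * hz * bits_per_px / 1e9

for label, (w, h, hz) in {
    "1080p60": (1920, 1080, 60),
    "4K60":    (3840, 2160, 60),
    "4K120":   (3840, 2160, 120),
}.items():
    print(f"{label}: {raw_gbps(w, h, hz):.1f} Gbps")
# 1080p60: 3.0 Gbps, 4K60: 11.9 Gbps, 4K120: 23.9 Gbps -- hence the
# heavy compression (and artifacts) any internet stream must apply.
```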
 
Inputs need to travel to the server, then it must render the frame, then encode it and send it back; the PC will download it, decode it, and then display it, presumably waiting for v-blank before the buffer swap. Even with a fast connection, this whole encode/decode & display pipeline takes a lot of time, even without including network latency.
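Putting illustrative (entirely assumed) numbers on that pipeline shows how the stages stack up even before jitter or retransmits:

```python
# Hypothetical end-to-end budget for the pipeline described above.
budget_ms = {
    "input -> server (network)": 15,
    "server render (60 fps frame)": 16.7,
    "encode (hardware H.264)": 5,
    "server -> client (network)": 15,
    "decode": 5,
    "wait for v-blank + scanout": 8,   # avg. half a 60 Hz interval
}
for stage, ms in budget_ms.items():
    print(f"{stage:35s} {ms:5.1f} ms")
print(f"{'total':35s} {sum(budget_ms.values()):5.1f} ms")  # ~65 ms
```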

Ideally, for LANs, image encoding could be abandoned and the raw image sent in small chunks as soon as possible, so they can be displayed even before the target PC gets all the data; that could get us decent input lag (see the sketch after this paragraph). But do they do it that way? It would be harder to implement and monitor-refresh-rate dependent. It is possible, so it will be an interesting testing topic for sure.
For the Internet, I would assume they would never try to send uncompressed data. It could still be sent in chunks, to display parts of the image while the monitor is still drawing the ones it received before, but that could easily end up in artifacts due to random latency spikes, so it is very unlikely they would go for that.
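A toy sketch of the LAN slice idea above, purely illustrative: it ships a raw 1080p frame in horizontal bands over a TCP socket so the client could start drawing the top of the frame early, racing the monitor's own scanout.

```python
# Send a raw frame slice by slice instead of waiting for the whole
# image; the receiving side would draw each band as it arrives.
import socket

SLICE_ROWS = 135                 # 1080 rows / 8 slices per frame
ROW_BYTES = 1920 * 3             # 1080p, 24-bit raw RGB

def send_frame(sock: socket.socket, frame: bytes) -> None:
    """frame is 1920*1080*3 raw bytes; ship it in horizontal bands."""
    step = SLICE_ROWS * ROW_BYTES
    for off in range(0, len(frame), step):
        sock.sendall(frame[off:off + step])
```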

Because people are so f***ing ignorant of input lag and would point out any artifact as a major flaw, it is most likely they went with the easiest-to-implement solution and just send a compressed H.264 stream, adding tons of latency, even more just to have time to resend missed frames.
 