GeForce Now Cloud Gaming Preview

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
55,534
Considering that we can't buy video cards anymore due to AMD and NVIDIA selling them all to miners, sometimes directly in the form of naked GPUs, maybe streaming cloud gaming is the answer. NVIDIA's streaming gaming service, GeForce Now, went into beta a couple of weeks ago, and there is currently a wait list for access if you are interested. The video below covers actual gameplay using GeForce Now.

Check out the video.

Everyone knows gaming looks best on a PC, and especially so on a gaming desktop. We also know that gaming PCs are not cheap, but what if there were another option, one where you could play the newest games at the highest graphics settings, but on an older desktop computer or laptop? Nvidia's GeForce Now gives us a glimpse of the future with the concept of cloud gaming.
 
Well, since I pretty much exclusively game in VR now, this is a complete non-starter for me. Even if I had a good enough internet connection and the input lag for controls were OK, the latency to the headset would be a vomit fest.
 
When I tried it, it felt like I was playing on a console and not a PC.
 
Demo the game system that's touted as desktop bypass on the desktop it's supposed to bypass? A fail fail?
 
This is a very VERY sad state of affairs. Still hoping for the whole mining thing to come crashing down.
 
Did we not learn anything from Sony's streaming service? Streaming is not the solution to these high-priced graphics cards. We need a third competitor to AMD and Nvidia that can supply the demand, not a price-gouging service that will never work due to input lag.
 
You know, the Xbox One X has an AMD RX 580 in it, so if someone were to hack it to mine bitcoins...

Sigh, you fuckers just stop! :sneaky: rofl. Man, you miners are like people from North Korea seeing a sign for free food.
 
I hate the fucking cloud. Yet more digital divide, terrific. What next, will they blockchain the games? Someday someone's going to explode a few EMP bombs in a few strategic places and the whole world will just stop functioning... I'm becoming a Luddite; time to quit.
 
You know, the Xbox One X has an AMD RX 580 in it, so if someone were to hack it to mine bitcoins...

Actually it has more than an RX 580. It has to be a 590 or something. The RX 480 I bought for my son has 36 compute units; the Xbox One X has 40 compute units.
 
Everyone is hating on this, but it's an example of Nvidia trying to be prepared for where the world will be in 5 to 10 years, not where we are right now. If you live on a farm somewhere with 5 Mbps DSL, this isn't for you and it will never be for you. If you live in a major metropolitan area or suburb where most ISPs have 100 Mbps as the standard advertised tier and where DOCSIS 3.1 or fiber is either available or being rolled out, bandwidth gets as high as 1 Gbps symmetrical. Clearly this service would work in such a situation. So the next question is, will bandwidth go up or down over the next decade? The answer is up. Saying nobody will have the bandwidth to play this stuff properly is like saying "nobody will ever need more than 640K of memory".
 
I must admit I was pretty impressed with the Steam Link. Granted, the host computer was on a wired connection in the same house. But it allowed me to play Borderlands co-op with my son in the same room. There were very few compression artifacts (usually on certain textures). And input lag? I don't know, I'm not a Level 12 Pro Awesome like most of you, I just play games to have fun.
 
Sigh, you fuckers just stop! :sneaky: rofl. Man, you miners are like people from North Korea seeing a sign for free food.

I'm glad you know I was kidding about mining on this gaming service. LOL. But what do you want to bet that someone develops a hack for that?

Resolution setting... uhh... 8K. Multiplayer? Yea... uhhh... HDR and 140 Hz refresh, G-Sync locked or something. So yea... I need ALLL the GPU power to drive that. (starts bitmine encoding) Oh no, it's great, I'm going to game until I die...
 
Everyone is hating on this, but it's an example of Nvidia trying to be prepared for where the world will be in 5 to 10 years, not where we are right now. If you live on a farm somewhere with 5 Mbps DSL, this isn't for you and it will never be for you. If you live in a major metropolitan area or suburb where most ISPs have 100 Mbps as the standard advertised tier and where DOCSIS 3.1 or fiber is either available or being rolled out, bandwidth gets as high as 1 Gbps symmetrical. Clearly this service would work in such a situation. So the next question is, will bandwidth go up or down over the next decade? The answer is up. Saying nobody will have the bandwidth to play this stuff properly is like saying "nobody will ever need more than 640K of memory".
First, you can have all the bandwidth in the world, but you can't get rid of the latency. Second, you need 12.5 Gbps for uncompressed 4K@60Hz, and a measly 3.2 Gbps for 1080p60. When do you expect that will be standard?
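For what it's worth, those raw-bitrate figures are easy to sanity-check with a back-of-the-envelope calculation (assuming 24-bit RGB and ignoring blanking/transport overhead, which is why the post's numbers come out slightly higher):

```python
# Back-of-the-envelope uncompressed video bitrate, assuming 24-bit RGB
# (8 bits per channel) and no chroma subsampling or blanking overhead.

def raw_bitrate_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed bitrate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

print(f"1080p60: {raw_bitrate_gbps(1920, 1080, 60):.1f} Gbps")  # ~3.0 Gbps
print(f"4K60:    {raw_bitrate_gbps(3840, 2160, 60):.1f} Gbps")  # ~11.9 Gbps
```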
 
A 50-megabit connection requirement? Am I gonna use up my impending bandwidth cap in a day?
And... that means I can't ever change ISPs, as only one ISP in my area offers that speed. *sighs*
 
Sigh, you fuckers just stop! :sneaky: rofl. Man, you miners are like people from North Korea seeing a sign for free food.
I don't mine, but I do realize that unless this problem becomes everyone's problem, including console gamers', it won't be solved.

BTW, Nvidia's streaming service is likely loaded with enterprise-level graphics cards. So how can Nvidia keep up with the demand of a streaming service but not with people purchasing graphics cards?

 
Everyone is hating on this but it's an example of Nvidia trying to be prepared for where the world will be in 5 to 10 years, not where we are right now.
If that's the world in 5 - 10 years from now then I don't wanna be on this planet anymore.

If you live on a farm somewhere with 5 Mbps DSL, this isn't for you and it will never be for you. If you live in a major metropolitan area or suburb where most ISPs have 100 Mbps as the standard advertised tier and where DOCSIS 3.1 or fiber is either available or being rolled out, bandwidth gets as high as 1 Gbps symmetrical. Clearly this service would work in such a situation. So the next question is, will bandwidth go up or down over the next decade? The answer is up. Saying nobody will have the bandwidth to play this stuff properly is like saying "nobody will ever need more than 640K of memory".
Not a matter of bandwidth, but latency. Anyway, there are a lot of other reasons besides delays in game actions.

#1 Can't mod games.
#2 Don't own games. You have to pay a monthly fee for this service.
#3 Worst possible image quality. People forget that this is a compressed video feed that will not look like a 1080 Ti running at max settings.
#4 And of course delays in your actions.

Why the fuck would anyone want this shit? Just to be able to pay Nvidia a monthly fee?
 
You know, the Xbox One X has an AMD RX 580 in it, so if someone were to hack it to mine bitcoins...
No bro it's a Vega 64 in the X, I heard it on the internet it's totally true.
 
Yay, sucktastic latency incoming, which probably won't be solved anytime soon and will probably get worse in the future. Gotta love playing games that look like a badly encoded YouTube video.
 
If that's the world in 5 - 10 years from now then I don't wanna be on this planet anymore.


Not a matter of bandwidth, but latency. Anyway, there are a lot of other reasons besides delays in game actions.

#1 Can't mod games.
#2 Don't own games. You have to pay a monthly fee for this service.
#3 Worst possible image quality. People forget that this is a compressed video feed that will not look like a 1080 Ti running at max settings.
#4 And of course delays in your actions.

Why the fuck would anyone want this shit? Just to be able to pay Nvidia a monthly fee?

Nobody said they would stop selling video cards and implement a North Korea style ban where this is your only option. This is for people who can't afford a super nice gaming rig. $20/mo (or whatever this will cost) will be significantly more palatable for a lot of people versus $700 every 2 years for a new bleeding edge video card. For people who want 4k, who want the lowest latency, who are willing to build custom rigs, for people who want to mod, there will always be the gaming we have right now.


First, you can have all the bandwidth in the world, but you can't get rid of the latency. Second, you need 12.5 Gbps for uncompressed 4K@60Hz, and a measly 3.2 Gbps for 1080p60. When do you expect that will be standard?

You're talking about completely uncompressed video streams, which obviously nobody is doing. I don't know if you're one of those people who says you can only listen to uncompressed FLAC audio because you can totally hear the difference between that and 256 kbit/s MP3s, but that's basically the argument you're making. There is a level of encoding that can be done that, combined with ultra settings, will give someone a better experience than playing a game on low settings with no video compression. That is what this service is currently marketed for. The guy in the video says, multiple times, "you don't need dedicated graphics to run this".

If your thought is "this service is going to suck compared to my dual 1080 Ti SLI rig," you are not the person this service is aimed at. The idea is "does this service offer something better than an i3 or i5 with an old graphics card, or no graphics card at all?" I think the long-term audience, if latency proves not to be an issue, is all of the low-end people, as well as anyone who gets more value out of paying X dollars a month than buying a new video card every couple of years. I doubt it will ever be aimed at the most hardcore gamer with the highest possible demands.


edit:

While I do think it will be possible in the future for people to do cloud gaming as an alternative to owning a card, Nvidia is basically setting this up to completely fail. From their CES presentation regarding pricing:

7:08 pm – GeForce Now will be available in March for early users, for $25 for 20 hours of play. It's basically a GeForce gaming PC on demand. The detail is incredible. It's playing on a GTX 1080. < https://blogs.nvidia.com/blog/2017/01/04/live-nvidia-ces-2017-keynote/ >

I don't know who in their right mind would pay $25 for 20 hours. $25 unlimited would at least be comparable to buying a 1080 or 1070 every 2 years (25*12*2) but for 20 hours? That's just ridiculous.
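To put that pricing in perspective, here's a rough cost-per-hour comparison. The $25/20-hour figure is from the keynote quote above; the card price and hours played below are hypothetical assumptions, not official numbers:

```python
# Cost comparison sketch: GeForce Now's reported early pricing vs. buying a
# card outright. The GPU price and play time are illustrative assumptions.

gfn_price = 25.0          # dollars per 20-hour block (CES keynote figure)
gfn_hours = 20
gpu_price = 700.0         # high-end card, replaced every 2 years (assumed)
hours_per_week = 10       # hypothetical gamer

gfn_per_hour = gfn_price / gfn_hours
gpu_per_hour = gpu_price / (hours_per_week * 52 * 2)

print(f"GeForce Now: ${gfn_per_hour:.2f}/hour")   # $1.25/hour
print(f"GPU:         ${gpu_per_hour:.2f}/hour")   # ~$0.67/hour
```

Even with generous assumptions, the per-block pricing loses to owning the card for anyone who plays regularly.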
 
You're talking about completely uncompressed video streams, which obviously nobody is doing. I don't know if you're one of those people who says you can only listen to uncompressed FLAC audio because you can totally hear the difference between that and 256 kbit/s MP3s, but that's basically the argument you're making. There is a level of encoding that can be done that, combined with ultra settings, will give someone a better experience than playing a game on low settings with no video compression. That is what this service is currently marketed for. The guy in the video says, multiple times, "you don't need dedicated graphics to run this".

If your thought is "this service is going to suck compared to my dual 1080 Ti SLI rig," you are not the person this service is aimed at. The idea is "does this service offer something better than an i3 or i5 with an old graphics card, or no graphics card at all?" I think the long-term audience, if latency proves not to be an issue, is all of the low-end people, as well as anyone who gets more value out of paying X dollars a month than buying a new video card every couple of years. I doubt it will ever be aimed at the most hardcore gamer with the highest possible demands.
Yes, I'm talking uncompressed, or at least losslessly compressed. I haven't spent three decades playing games and watching graphics advance just to start reverting now. I would also rather play at sharp, lag-free lower quality than blurred, added-latency ultra quality. I also like to own my games and be able to replay them when I retire, and not be at the mercy of yet another cloud.
 
So I have been playing a few rounds of Fortnite Battle Royale on GeForce Now, and it's a lot better than I would have guessed. Not perfect, but doable. I am far and away from a "pro" when it comes to online shooters, but I am going to play around with it some more. And I do have the bandwidth needed.

 
Why the heck is this explosion of streaming everything NOW when every ISP AND WIRELESS COMPANY IS CAPPING MONTHLY DATA?? They've gotta be in cahoots.

Also, I'm not sure we needed a 7-minute video where he said the same thing every 30 seconds.
 
Yes, I'm talking uncompressed, or at least losslessly compressed. I haven't spent three decades playing games and watching graphics advance just to start reverting now. I would also rather play at sharp, lag-free lower quality than blurred, added-latency ultra quality. I also like to own my games and be able to replay them when I retire, and not be at the mercy of yet another cloud.

I guess you could test it right now if you wanted. Try playing something like Ghost Recon Wildlands at 1080p medium or low and then try watching a YouTube 1080p ultra video of the game. I would probably rather have the YouTube version.
 
I'm still wondering what all of this latency that everyone is talking about is... Us plebs on coax or DSL might not have the best latency, but I can ping < 5 ms hundreds of miles away if I'm on a fiber connection. Wireless actually has the potential to offer < 1 ms to the tower, and if it's fiber from there back, you might start seeing < 10 ms to anything in your general area. Considering that 60 Hz = 16.6 ms, 5 ms isn't going to be breaking the bank.

If you're using HEVC encoding, 1080p60 is certainly doable with quality high enough that you likely can't tell it was run through the compressor. If using HEVC means the eye candy can be turned up because you have access to a pair of Titans in SLI, you will probably end up with better quality than if you just had an older video card that could only run medium settings. Not saying that will be the case, but the possibility is likely there.
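As a rough sketch of the frame-time argument above (the round-trip numbers are purely illustrative, not measurements):

```python
# Frame-time budget sketch: how much of a 60 Hz frame interval a given
# network round-trip consumes. RTT values are illustrative, not measured.

frame_time_ms = 1000 / 60            # ~16.7 ms per frame at 60 Hz

for rtt_ms in (5, 10, 30):
    frames = rtt_ms / frame_time_ms
    print(f"{rtt_ms} ms RTT ~ {frames:.2f} frames of added delay")
```

On this framing, a 5 ms round trip adds well under half a frame of delay, while 30 ms already costs almost two frames.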
 
I guess you could test it right now if you wanted. Try playing something like Ghost Recon Wildlands at 1080p medium or low and then try watching a YouTube 1080p ultra video of the game. I would probably rather have the YouTube version.
I see a 1060/480 is enough for very high settings.

I'm still wondering what all of this latency that everyone is talking about is... Us plebs on coax or DSL might not have the best latency, but I can ping < 5 ms hundreds of miles away if I'm on a fiber connection. Wireless actually has the potential to offer < 1 ms to the tower, and if it's fiber from there back, you might start seeing < 10 ms to anything in your general area. Considering that 60 Hz = 16.6 ms, 5 ms isn't going to be breaking the bank.

If you're using HEVC encoding, 1080p60 is certainly doable with quality high enough that you likely can't tell it was run through the compressor. If using HEVC means the eye candy can be turned up because you have access to a pair of Titans in SLI, you will probably end up with better quality than if you just had an older video card that could only run medium settings. Not saying that will be the case, but the possibility is likely there.
It will probably be more than 10 ms for >90% of users to the GeForce farm, then you must add the compression latency, then the return latency. I can't see it getting better than 60 ms total, and that's a lot, in a best-case scenario.
 
I see a 1060/480 is enough for very high settings.


It will probably be more than 10 ms for >90% of users to the GeForce farm, then you must add the compression latency, then the return latency. I can't see it getting better than 60 ms total, and that's a lot, in a best-case scenario.

The one upside is that the server farm is probably right on a backbone, so if you had a 60 ms ping to the game server, the actual GeForce server might only be 10 ms away. So instead of 60 ms + 60 ms = 120 ms, you might actually have 60 ms to GeForce + 10 ms to the game server = 70 ms. Then again, with all the money Nvidia has, they might even pay to co-locate in the same facility that some games host in, so you might have even less.
 
Nobody said they would stop selling video cards and implement a North Korea style ban where this is your only option. This is for people who can't afford a super nice gaming rig. $20/mo (or whatever this will cost) will be significantly more palatable for a lot of people versus $700 every 2 years for a new bleeding edge video card. For people who want 4k, who want the lowest latency, who are willing to build custom rigs, for people who want to mod, there will always be the gaming we have right now.
You're making a lot of assumptions here.

Firstly, this gives Nvidia too much power. Nobody should want this, just from a consumer point of view. If Nvidia wanted to make any game exclusive to their platform, you wouldn't have a choice. It's basically like playing games on a console, with an internet connection required. If Bloodborne 2 were on GeForce Now, then that's the only place to play that game. This is the same problem we have with other streaming services that have exclusive content. Now imagine if AMD and Intel got in on this action; we'd have to pay for triple services just to play games. Nothing about this will be cheap.

Secondly, who upgrades their PC every 2 years? Lots of people still rock a 2500K with a GTX 680 and have no problems playing today's games. I still rock my FX-8350 and have no plans to upgrade anytime soon. For people like me, I'd rather continue to own outdated hardware and wait a good 5-7 years before upgrading. For those who want 4K ultra settings, this does them no good either. It appeals to idiots, and nobody else.

You're talking about completely uncompressed video streams, which obviously nobody is doing. I don't know if you're one of those people who says you can only listen to uncompressed FLAC audio because you can totally hear the difference between that and 256 kbit/s MP3s, but that's basically the argument you're making. There is a level of encoding that can be done that, combined with ultra settings, will give someone a better experience than playing a game on low settings with no video compression. That is what this service is currently marketed for. The guy in the video says, multiple times, "you don't need dedicated graphics to run this".
It's been known for a while that ultra settings don't offer much over medium or high settings anyway. Besides, no compression algorithm is going to produce an equal or better image than the one generated on your PC. Streamed ultra settings are not going to be equal to ultra on your end. That's just the nature of compression: you lose something to reduce the file size. You also exchange CPU cycles for a lower file size. There's no free lunch here.
If your thought is "this service is going to suck compared to my dual 1080 Ti SLI rig," you are not the person this service is aimed at. The idea is "does this service offer something better than an i3 or i5 with an old graphics card, or no graphics card at all?" I think the long-term audience, if latency proves not to be an issue, is all of the low-end people, as well as anyone who gets more value out of paying X dollars a month than buying a new video card every couple of years. I doubt it will ever be aimed at the most hardcore gamer with the highest possible demands.
Nobody buys a new video card every couple of years, unless you're always buying the latest from Nvidia. A cheap $100 card is still better than streaming. A good $200 card will last you many years. Those who buy $400+ cards are not going to be interested in this. This has no business existing.

I'm still wondering what all of this latency that everyone is talking about is... Us plebs on coax or DSL might not have the best latency, but I can ping < 5 ms hundreds of miles away if I'm on a fiber connection. Wireless actually has the potential to offer < 1 ms to the tower, and if it's fiber from there back, you might start seeing < 10 ms to anything in your general area. Considering that 60 Hz = 16.6 ms, 5 ms isn't going to be breaking the bank.
The only way this will work is if they put the server next door to your home. For example, there could be servers in NYC and California, which would work for those people but nobody else. Latency is all about distance and the hops made to get where you need to go. It's not worth the trouble.
If you're using HEVC encoding, 1080p60 is certainly doable with quality high enough that you likely can't tell it was run through the compressor. If using HEVC means the eye candy can be turned up because you have access to a pair of Titans in SLI, you will probably end up with better quality than if you just had an older video card that could only run medium settings. Not saying that will be the case, but the possibility is likely there.
That's very unlikely. Also, you lose out on modding and emulation. Have you heard of the Switch emulator?

This service is not the way. Ebola on Nvidia.
 
I see a 1060/480 is enough for very high settings.


It will probably be more than 10 ms for >90% of users to the GeForce farm, then you must add the compression latency, then the return latency. I can't see it getting better than 60 ms total, and that's a lot, in a best-case scenario.

I don't disagree that for most people out of the gate, latency is still going to be high. But if you pick some of the top data-center areas, they are more likely to also have faster internet service as well. However, I'd guess that we can get better latency than 60 ms. A quick Google search talks about ~20 ms latency for sending the video, with the biggest slowdown coming from having to buffer some of a frame before it can be sent off. The return trip definitely won't be 20 ms, as there is no video to encode going back; it's just sending some key presses back to the server. Right now I don't think anyone has a working demo as fast as that, but it's definitely not off the table that you could see < 20 ms, or even < 10 ms, in the future for the entire process.

https://www.design-reuse.com/articles/33005/understanding-latency-in-video-compression-systems.html
https://www.altera.com/content/dam/altera-www/global/en_US/pdfs/literature/wp/wp-cast-low-latency.pdf
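A rough way to frame that estimate is as a sum of per-stage delays. All of the stage values below are illustrative placeholders in the ballpark the linked articles discuss, not measured figures:

```python
# End-to-end streaming latency sketch: sum of hypothetical per-stage delays.
# Every stage value is an illustrative placeholder, not a measurement.

latency_ms = {
    "input -> server (network)":  10,   # key/mouse event upstream
    "game render (1 frame @60)":  17,   # ~16.7 ms, rounded
    "capture + HEVC encode":      20,   # low-latency encoder ballpark
    "video downstream (network)": 10,
    "decode + display":            8,
}

total = sum(latency_ms.values())
print(f"estimated glass-to-glass latency: {total} ms")  # 65 ms
```

Under these assumptions the encode stage, not the network, is the single biggest line item, which is why low-latency encoder presets matter so much for this kind of service.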



You're making a lot of assumptions here.

The only way this will work is if they put the server next door to your home. For example, there could be servers in NYC and California, which would work for those people but nobody else. Latency is all about distance and the hops made to get where you need to go. It's not worth the trouble.

That's very unlikely. Also, you lose out on modding and emulation. Have you heard of the Switch emulator?

This service is not the way. Ebola on Nvidia.


Despite the doom predicted for this solution, most of the issues can and will be solved. There are already servers sitting on your doorstep, so having GRID systems near your home isn't really a far step. Your ISP likely already has an Akamai cluster and/or several other CDNs tied directly into their network, so this is just another system. Will your area get one close by? It's going to depend on whether or not it becomes successful. At this point Nvidia owns GRID, but that definitely doesn't mean that if someone flashes cash at them, someone else couldn't be buying and running the hardware. If EA decided they wanted to experiment with hosting the entire stack of an online game, I wouldn't be surprised if they could work something out to have GRID systems put in place. I think a lot of people are stuck on the idea right now that GRID requires some huge datacenter, but the concept could easily turn into a pair of servers with video cards attached to them. If we're only talking $25K systems to deploy versus an entire rack, or an entire datacenter, the ROI can change significantly. Will that happen? I have no clue. Do I think scaling will be a huge factor in making this a success? Yes.

As for modding, yup, this service would definitely put a stop to that. But you're already SOL on a lot of the AAA games these days anyway, so it's already not an option for you. I have no doubt the portable market would once again be the target for this. If you can hook a game controller to an Android tablet, you could stream 6+ hours of HD gaming to it, but you might only get 3 hours if you had to run the game locally. If the server is > 10 ms away and the overall video latency is 20 ms or less, no one playing on a tablet would likely notice the difference. If you're already buying $400 GPUs, I don't see Nvidia wanting this to cannibalize that market. But notice that Intel accounts for 70% of the graphics market. So there are a lot of people out there who could be playing AAA games if someone else's equipment ran them.

https://wccftech.com/nvidia-amd-intel-gpu-market-share-q3-2017/
 

Despite the doom predicted for this solution, most of the issues can and will be solved.
Technically, no, they cannot. The best solution is to convince people that there isn't "much" of a delay, and that the image quality is the same or better. Basically marketing. And of course, market games that don't rely on twitch-like reflexes. Games like Minecraft and Hearthstone would work. Games like Dark Souls and Overwatch would never work.


There are already servers sitting on your doorstep, so having GRID systems near your home isn't really a far step. Your ISP likely already has an Akamai cluster and/or several other CDNs tied directly into their network, so this is just another system. Will your area get one close by? It's going to depend on whether or not it becomes successful. At this point Nvidia owns GRID, but that definitely doesn't mean that if someone flashes cash at them, someone else couldn't be buying and running the hardware. If EA decided they wanted to experiment with hosting the entire stack of an online game, I wouldn't be surprised if they could work something out to have GRID systems put in place. I think a lot of people are stuck on the idea right now that GRID requires some huge datacenter, but the concept could easily turn into a pair of servers with video cards attached to them. If we're only talking $25K systems to deploy versus an entire rack, or an entire datacenter, the ROI can change significantly. Will that happen? I have no clue. Do I think scaling will be a huge factor in making this a success? Yes.
Keep in mind we're in a post-net-neutrality world here, so this GRID service is also going to face other problems besides technical ones. And I really doubt that Nvidia will ever get these servers to every corner of the world to make this viable, for much the same reason that ISPs don't go to every home.
As for modding, yup, this service would definitely put a stop to that. But you're already SOL on a lot of the AAA games these days anyway, so it's already not an option for you.
You know what they say when you assume: you make an ass out of u and me. No AAA game can stop anyone from modding it. Only multiplayer games can do that, and those aren't the kind of games you normally mod anyway. Though that doesn't mean people don't. For example, something like VRChat would never work, just because it's one giant modding game.
I have no doubt the portable market would once again be the target for this. If you can hook a game controller to an Android tablet, you could stream 6+ hours of HD gaming to it, but you might only get 3 hours if you had to run the game locally. If the server is > 10 ms away and the overall video latency is 20 ms or less, no one playing on a tablet would likely notice the difference.
There's that assumption again. You could just have a gaming PC and run software on it to stream to your Android tablet. It's free and would likely have even less latency, unless you plan to travel far away.
If you're already buying $400 GPUs, I don't see Nvidia wanting this to cannibalize that market. But notice that Intel accounts for 70% of the graphics market. So there are a lot of people out there who could be playing AAA games if someone else's equipment ran them.

https://wccftech.com/nvidia-amd-intel-gpu-market-share-q3-2017/
Good news, I hear that Intel is going to have CPUs with Vega graphics. In other good news, AMD is releasing APUs as well. And even Intel is working on their own graphics.

Right now you have a point, but you'd still be better off having a desktop PC and streaming games from that.
 
I am in the beta for the Apple side of GeForce Now. This is actually really cool, and I find it extremely hilarious that next to my "real" rig with a GTX 1080 Ti, I have an old iMac that plays games in my Steam library just as well, on 2012 technology. The only difference is that on my PC the fps can be way higher, and the resolution of the monitor as well. In all honesty, this is a great thing for gaming companies, but not so good for PC hardware manufacturers, since it literally puts off the need to upgrade for years... if all you do on your computer is play games.
 