Rumor: Google Stadia May Be Getting Shut Down

I remember way back in like 2010, OnLive said they were going to get servers in every local ISP building. Which is funny, because it undercuts the supposed "advantages" of cloud gaming.

And no matter how close you move the servers, it's never going to be enough for the gamers who care about latency, image quality, and FPS. The only real advantages cloud gaming can provide over local hardware are cost and convenience. And they're even struggling with those.
I wouldn't say "never," but it would definitely require leaps in networking beyond what we know now. Enough to effectively deliver lossless video and minimal lag at reasonable distances (i.e., it wouldn't need OnLive's "server at every ISP" strategy). That might not happen for a long time, if ever, but remember this: if you'd told yourself 10 years ago that even your phone could download hundreds of megabits per second in 2022, you would've been skeptical.
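Just for scale, here's a back-of-the-envelope sketch of what truly lossless video would demand (assuming uncompressed 24-bit RGB, which is my assumption; real services compress heavily):

```python
# Back-of-the-envelope bandwidth for truly lossless (uncompressed) video.
# Assumes 24-bit RGB and no chroma subsampling; real streaming codecs
# compress heavily, which is exactly where the artifacts come from.

def lossless_gbps(width, height, fps, bits_per_pixel=24):
    """Raw video bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

for name, (w, h, fps) in {
    "1080p60": (1920, 1080, 60),
    "4K60":    (3840, 2160, 60),
    "4K120":   (3840, 2160, 120),
}.items():
    print(f"{name}: ~{lossless_gbps(w, h, fps):.1f} Gbps")

# 1080p60: ~3.0 Gbps, 4K60: ~11.9 Gbps, 4K120: ~23.9 Gbps.
# Even "hundreds of megabits" phones don't come close to lossless 4K.
```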
 
I mean, the way I would do it is have regional servers all equipped with quantum-entangled network cards to eliminate regional lag, so that you would only have the lag from house to server. lol
 
That kind of thing would never work. Never mind the impossibility of running all the possibilities for input.
The AI part is there so you don't have to run all of them: it predicts not only the likely inputs (in order and quantity) but also the ones where lag would be an issue if they occurred.
 
Wouldn't this make the lag worse? I mean, if the game is ahead of what you see on screen, that is literally the definition of lag. So making the game run even further ahead would logically only make it worse.
Probably, but I'm just trying to figure out what "negative latency" is. It's just marketing wank that someone coined after talking to an engineer for like 5 minutes. You know, like how all marketing wank has come about for the past 40 years.
I wouldn't say "never," but it would definitely require leaps in networking beyond what we know now. Enough to effectively deliver lossless video and minimal lag at reasonable distances (i.e., it wouldn't need OnLive's "server at every ISP" strategy). That might not happen for a long time, if ever, but remember this: if you'd told yourself 10 years ago that even your phone could download hundreds of megabits per second in 2022, you would've been skeptical.
In 2006 I bought a Nokia N-Gage from a local game store because it was considered a failure and was cheap, then modded the device so it could play NES, SNES, and Genesis emulators along with movies like Monty Python and the Holy Grail. I remember walking into RadioShack to buy actual electronic components and being immediately harassed by employees to buy a phone. I asked the employee if their phones could play movies like my N-Gage, which I then pulled out and used to start playing a movie, and the other customers lost their minds wondering if they too could buy a phone that plays movies. The answer was no. Since then I moved on to Windows Mobile devices like the HTC Kaiser, where I put on custom ROMs to maximize the functionality of the device long before the iPhone and Android were ever created. So yeah, I don't know about 10 years, but 16 years ago I was already ahead of the mobile market game. To me the move to Android was just a better OS than Windows Mobile. I even installed Android on my HTC Kaiser and was impressed with the performance.

As for latency, the answer is never. This is a speed-of-light issue that isn't going to be fixed with better routers or faster devices. As schoolslave put it, this is rent-seeking. If you're a company you want recurring revenue, because that's guaranteed income. It's much easier to get you into a service than to get you out of one. Cloud gaming is just another attempt at selling you a service that needs a monthly fee.
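Rough numbers on that speed-of-light floor (a sketch, assuming light in fiber travels at roughly two-thirds of c, and ignoring routing, queuing, and encode/decode time, which only add to it):

```python
# Physical floor on round-trip time over fiber, where signals propagate
# at roughly 2/3 the speed of light. Routing hops, queuing, and
# encode/decode all stack on top of this; nothing removes it.

C_KM_PER_S = 299_792            # speed of light in vacuum
FIBER_KM_PER_S = C_KM_PER_S * 2 / 3

def min_rtt_ms(distance_km):
    """Best-case round trip in milliseconds for a server this far away."""
    return 2 * distance_km / FIBER_KM_PER_S * 1000

for km in (50, 500, 2000):
    print(f"server {km:>4} km away: >= {min_rtt_ms(km):.1f} ms RTT")

# 50 km: ~0.5 ms, 500 km: ~5 ms, 2000 km: ~20 ms, before the server
# renders a single frame or the client decodes one.
```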
 
The AI part is there so you don't have to run all of them: it predicts not only the likely inputs (in order and quantity) but also the ones where lag would be an issue if they occurred.
Ok... Still doesn't fix the other issues. Maybe you are able to take the possibilities down from the thousands or tens of thousands to "just" 50-100... that is still 50-100x the amount of hardware you need to run, and that many more frames to send, and so on. Also there's a real issue with "only some things are calculated ahead," and that is that now you have inconsistent lag, which is even worse. Jittery lag is more noticeable than consistent lag. This is why frame pacing issues are a big deal with games, and why some games on console will choose to cap at 30fps and hold that solid rather than have a jittery experience around 60fps. If you are rendering some, but not all, frames ahead, then you have variable lag between the ones you do and don't. A frame cache hit and a miss will feel different, as the sketch below illustrates.
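A toy sketch of that hit/miss jitter (all numbers invented for illustration):

```python
# Toy model: frames where the server predicted your input arrive with
# low latency; mispredicted frames pay a full correction round trip.
# The mix is what you feel as jitter. All numbers are invented.

import random
import statistics

BASE_MS = 20          # latency on a correct prediction (cache "hit")
MISS_PENALTY_MS = 60  # extra latency on a misprediction ("miss")
HIT_RATE = 0.8        # assume the predictor is right 80% of the time

random.seed(1)
latencies = [BASE_MS + (0 if random.random() < HIT_RATE else MISS_PENALTY_MS)
             for _ in range(1000)]

print(f"mean latency  : {statistics.mean(latencies):.1f} ms")
print(f"jitter (stdev): {statistics.pstdev(latencies):.1f} ms")

# A flat 32 ms every frame would feel smoother than 20 ms most of the
# time with random 80 ms spikes, the same reason a locked 30fps beats
# a stuttery 45-60fps.
```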

It's just not a useful technology for streaming right now. Maybe someday, but not yet.
 
It's just not a useful technology for streaming right now. Maybe someday, but not yet.
Or for a very simple game (say, Street Fighter 2, which has a well-defined, limited set of possible future states, not something analog like mouse or head movement):

https://www.diva-portal.org/smash/get/diva2:1560069/FULLTEXT01.pdf

The results show that the new model slightly outperforms the naive method in prediction accuracy, with the difference being greater for longer predictions. However, it has far higher requirements both in terms of memory and computation cost. It seems unlikely that the model would significantly improve on current rollback netcode implementations. However, there may be ways to improve predictions further, and the effects on user experience remains unknown

But it smells like we would need a quantum-computer type of situation before this could be interesting and work well for anything remotely complicated. Making actions take effect X game frames before the game catches that you did them, and quickly calculating the way back, is more realistic for something that could be called negative latency (sometimes marketed as runahead).

I would not use "never work" type sentences for something like this; maybe in 2400, the fact that this was ever an issue will make for a fun retro talk.
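For anyone curious what that runahead/rollback idea looks like mechanically, here's a minimal sketch (the state layout, `step` function, and naive "repeat last input" predictor are all stand-ins, not any real engine's code):

```python
# Minimal rollback/runahead sketch: simulate ahead with a predicted
# remote input, save a state per frame, and re-simulate from the last
# agreed state when the real input arrives and disagrees.

def step(state, local_input, remote_input):
    """Stand-in for one deterministic frame of game simulation."""
    return {"frame": state["frame"] + 1, "last_remote": remote_input}

states = [{"frame": 0, "last_remote": "idle"}]  # saved state per frame
guesses = []                                     # predicted remote inputs
locals_seen = []                                 # local inputs per frame

def advance(local_input):
    """Run the next frame now, guessing the remote input naively."""
    guess = states[-1]["last_remote"]            # "same as last frame"
    guesses.append(guess)
    locals_seen.append(local_input)
    states.append(step(states[-1], local_input, guess))

def confirm(frame, actual_remote):
    """Real remote input arrives late; roll back and replay if wrong."""
    if guesses[frame] == actual_remote:
        return                                   # prediction held, free win
    guesses[frame] = actual_remote
    replay = states[frame]                       # last state both sides agree on
    for f in range(frame, len(states) - 1):
        replay = step(replay, locals_seen[f], guesses[f])
        states[f + 1] = replay
```

The expensive path is the mispredict: every wrong guess means re-simulating every frame since the last agreed state, which is exactly what the thesis is weighing a fancier predictor against.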
 
Or for a very simple game (say, Street Fighter 2, which has a well-defined, limited set of possible future states, not something analog like mouse or head movement):

https://www.diva-portal.org/smash/get/diva2:1560069/FULLTEXT01.pdf

The results show that the new model slightly outperforms the naive method in prediction accuracy, with the difference being greater for longer predictions. However, it has far higher requirements both in terms of memory and computation cost. It seems unlikely that the model would significantly improve on current rollback netcode implementations. However, there may be ways to improve predictions further, and the effects on user experience remains unknown
As they note there, in addition to needing a model with better prediction to be useful, it also has to be the case that when you mispredict, the impact is minimal; that isn't the case, and it would be an even bigger issue with something where video is sent over the network. This is all for local rendering in their test, so when you miss a prediction, you can render the correct frame and display it at the next monitor refresh. If you miss a prediction with a streaming game, you don't get the correct frame until after the network latency, plus transfer latency, plus compression buffer, etc.
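Rough illustration of that difference (every number here is an assumption for the sketch, not a measurement):

```python
# How long a misprediction stays on screen: local rollback vs. a
# streamed game. Every number below is an illustrative assumption.

FRAME_MS = 16.7       # 60fps: worst case until the next refresh
RTT_MS = 40           # round trip to the streaming datacenter
CODEC_MS = 10         # encode on the server + decode on the client
BUFFER_MS = 16.7      # one frame of client-side jitter buffer

local_fix = FRAME_MS
streamed_fix = RTT_MS / 2 + CODEC_MS + BUFFER_MS + FRAME_MS

print(f"local mispredict   : fixed within ~{local_fix:.0f} ms")
print(f"streamed mispredict: fixed within ~{streamed_fix:.0f} ms")

# ~17 ms vs ~63 ms: locally a miss is one wrong frame; over a stream
# the wrong image lingers for several frames before the correction
# even arrives.
```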

Buffering is another issue people seem to gloss over when talking about streaming games. The way we get things to not suck on higher-latency, unreliable links is with buffers. If you've ever watched data transfers from something like YouTube, you'll find it tends to send you a big bunch of frames as quick as it can to fill a large buffer, then waits and does it again as that buffer depletes. It doesn't have to worry about small jitters, drops, etc. because you have a pile of data to play through before it needs to get you the next bit. Works great for streaming videos, not so good for games. While you can have however big a buffer you like, the bigger it is, the worse your latency is. If you had a one-second buffer, which is kinda short for most streaming, that means you have introduced a second of lag, which is totally unacceptable. You want as little as possible. You can't afford to have NO buffer though; you probably need at least a single frame, so 16-33ms of additional latency depending on frame rate.

Thing is, if you go low with that buffer, the chance of dropouts goes up. Those of us who play with pro audio can tell you all about that. In the pro audio world, you can adjust the size of the audio buffer on your soundcard. The lower the buffer, the lower the latency. Some high-end cards will let you push it down into the range of 1ms or less. That's great for responsiveness: when you press a key you hear the result right away... But man, when you push it that low you are at risk of having things drop out. If your system can't get all the processing done fast enough and get the audio data to your card ASAP, the buffer empties and you get a dropout. Bigger buffers make that less likely, at the cost of more latency.

Basically it means that to have acceptable latency, they have to use a small buffer, and that means that you will have something very sensitive to network stability. Drop a packet, you are probably going to get a visual dropout. Network congestion, dropout. Etc, etc.
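A toy model of that tradeoff (the jitter distribution is invented for illustration; real networks are messier):

```python
# Buffer size vs. dropouts: every frame of buffer adds latency, but a
# frame that arrives later than the buffer can hide becomes a visible
# dropout. The jitter distribution here is invented for illustration.

import random

FRAME_MS = 16.7
random.seed(7)

def dropout_rate(buffer_frames, trials=10_000):
    """Fraction of frames arriving later than the buffer can absorb."""
    budget_ms = buffer_frames * FRAME_MS
    late = sum(1 for _ in range(trials)
               if random.expovariate(1 / 8.0) > budget_ms)  # ~8 ms mean jitter
    return late / trials

for n in (1, 2, 4):
    print(f"{n} frame(s) of buffer: +{n * FRAME_MS:.0f} ms latency, "
          f"~{dropout_rate(n):.1%} dropouts")

# More buffer means fewer dropouts but more lag. Video sites buffer
# whole seconds; an interactive game can barely afford a frame or two.
```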

There's just a whole lot of issues with streaming when it is interactive.
 
I also don’t think AI/ML for latency compensation is appropriate - if the model becomes large enough to “predict” the next frame(s) based on prior inputs and game state then am I really playing a game or “being played”?
 
Probably, but I'm just trying to figure out what "negative latency" is. It's just marketing wank that someone coined after talking to an engineer for like 5 minutes. You know, like how all marketing wank has come about for the past 40 years.
I can only think of one solution to negative lag: the service predicting all possible moves the player could make next and playing 500 instances of the game simultaneously. Then it can decide which one to show to the player. Basically what the multiverse theory is for IRL.
 
I can only think of one solution to negative lag: the service predicting all possible moves the player could make next and playing 500 instances of the game simultaneously. Then it can decide which one to show to the player. Basically what the multiverse theory is for IRL.

Since we're talking about magic: couldn't that service be observing other play sessions in semi-real-time and basing the predictions off that?

That way it would only need to do 397 instances... Also time travel.
 
Since we're talking about magic: couldn't that service be observing other play sessions in semi-real-time and basing the predictions off that?

That way it would only need to do 397 instances... Also time travel.
Of course, and the AI can learn to play the game itself, quit Google, become a Twitch streamer, and create an OnlyFans.
 
I can only think of one solution to negative lag: the service predicting all possible moves the player could make next and playing 500 instances of the game simultaneously. Then it can decide which one to show to the player. Basically what the multiverse theory is for IRL.
This does exist for emulators, but in more complicated games like Halo Infinite it's just not possible. A lot of people forget the stuff Nvidia did back when they made the nForce chipset, with network adapters that had lower latency. The reason we used PS/2 ports instead of USB was lower latency. CRT monitors have lower latency than LCDs. Wired devices have lower latency than wireless. Now add cloud gaming, and latency becomes a big problem. There's a reason a desktop PC will play Stadia with lower latency than a Chromecast on your TV using the Stadia gamepad.
 
Since we're talking about magic: couldn't that service be observing other play sessions in semi-real-time and basing the predictions off that?

That way it would only need to do 397 instances... Also time travel.
I mean it's not magic, just infeasibility. It could actually be done, in theory, with a lot of hardware today. You take the current game state, then all of the possible next game states, and you render out each possible next state. As soon as the player does something, or does nothing, you display the already-rendered frame for that state, then repeat the process. We have the kind of hardware to do this, in theory, if we wanted to. Even if there are 10,000 possible next game states from each current one, we have massive clusters that are 10,000 computers large. You could have them all work in parallel, running multiple copies of the game, and synchronize it all. It would be a hell of a task, but this is the kind of thing that a supercomputer, with its high-speed inter-node network, could do if you wanted to...

...however it would NOT be done, because of the cost. Even though companies like Google and Amazon have the level of hardware you'd need to do that, and maybe the level of interconnection, nobody would actually bother, since the cost would be extreme. It's one thing to stream people games when you have a computer running a copy of it, and maybe more than one person running on the same computer if the game is low-spec enough that it can be split up. It's another thing to need a massive datacenter for each player.

So it isn't magic, it is just not feasible. The emulators show that. They really can do it, it really works, you just need a super high spec computer to do it with extremely old games. It isn't useful technology for modern games.

All that aside, if you did do it, as I noted in my earlier post, you'd have to send all those frames to the client, which would crush the network and negate any latency gains.

There's no "negative latency" to be had with streaming, and there never will be. Streaming will ALWAYS incur the time it takes data to travel in both directions, the transmission time, and the buffering time. That is on top of whatever local latency (monitor, etc.) there is.
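And the arithmetic on sending every candidate frame is brutal (illustrative numbers, assuming a typical compressed 4K stream):

```python
# Why speculative streaming defeats itself: the client has to receive
# EVERY candidate next frame before the input happens. Both numbers
# below are illustrative assumptions.

STREAM_MBPS = 35         # one compressed 4K game stream
CANDIDATES = 100         # branching factor even after heavy AI pruning

needed = STREAM_MBPS * CANDIDATES
print(f"speculative streaming needs ~{needed} Mbps ({needed / 1000:.1f} Gbps)")

# ~3.5 Gbps into the home, per player, to maybe hide a few tens of
# milliseconds, while the propagation/transmission/buffering floor on
# the input path is still fully intact.
```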
 
The funny thing about claiming that negative lag is attempting to do every possible permutation of the future is... it's actually kinda correct.

It's a fighting game. For most attacks/combos there aren't 10,000 variations. There are like... 5. Almost all attacks have a delay between button press and actual hitbox confirmation, and there's a damn good chance, if you follow what a combo is going to naturally do, that you'll guess adequately. Negative lag basically has outcomes set up for the common scenarios, which all, by design, have input delays, and then changes the animation to match the input delay of the move. For 2D games with hitboxes and pre-defined combos, this is actually kinda easy to predict.

Games that have a lot more variables, such as an FPS with zero aiming assistance, can't do "negative lag" nearly as well, because there's always that 1% chance that instead of firing on the enemy, you decide to just... do something else. Aim assistance is about 50% trying to make FPSes usable with dual sticks, and 50% fudging the need for absolutely exact network latency. "Close enough" is basically what aim assistance does.
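A sketch of why the fighter case is so tractable (the moves and frame counts below are invented examples, not real Street Fighter data):

```python
# Once a fighting-game move starts, its frame data commits the
# character for a fixed window, so "predicting" the next frames is
# mostly a table lookup. Moves and frame counts here are invented.

FRAME_DATA = {  # move: (startup, active, recovery) frames
    "light_punch": (4, 2, 7),
    "heavy_kick":  (12, 4, 20),
    "fireball":    (13, 30, 33),
}

def committed_frames(move):
    """Frames during which no new input changes what's on screen."""
    startup, active, recovery = FRAME_DATA[move]
    return startup + active + recovery

for move in FRAME_DATA:
    n = committed_frames(move)
    print(f"{move}: committed for {n} frames (~{n * 16.7:.0f} ms at 60fps)")

# Confirm heavy_kick's first frame and the next ~600 ms of animation
# is already known. A mouse-aimed FPS has no such windows to hide in.
```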

Here is a rather in-depth article explaining how it works: https://arstechnica.com/gaming/2019...g-games-use-delay-based-and-rollback-netcode/
 
This is the part of the story that bothered me:

“And while Stadia’s approach to streaming games for consumers was built on a strong technology foundation, it hasn’t gained the traction with users that we expected so we’ve made the difficult decision to begin winding down our Stadia streaming service.”

Umm... no. This approach was flawed from the beginning. Anybody who's tried to develop or even regularly use an online gaming service on a flaky residential broadband connection knew that this service was going to fail.

This might have worked in some ideal future world where everyone has multi-gigabit fiber connections hardwired to their homes, but that's not the reality we're in now.
 
This is the part of the story that bothered me:

“And while Stadia’s approach to streaming games for consumers was built on a strong technology foundation, it hasn’t gained the traction with users that we expected so we’ve made the difficult decision to begin winding down our Stadia streaming service.”

Umm... no. This approach was flawed from the beginning. Anybody who's tried to develop or even regularly use an online gaming service on a flaky residential broadband connection knew that this service was going to fail.

This might have worked in some ideal future world where everyone has multi-gigabit fiber connections hardwired to their homes, but that's not the reality we're in now.
The problem I've always seen with shit like this is that the people it'll work best for are the people who probably don't want it, and the people who might want it, it isn't likely to work well for. So yeah, if you have a good, stable, high-bandwidth, low-ping Internet connection, and you have a high-quality TV/monitor with low input latency, etc., then you can probably get a good experience. But the kind of people for whom that is a priority are often willing and able to just spend the money on a good gaming setup at home. The kind of person I could see this really appealing to is the kind who DOESN'T want to (or is not able to) spend the money. They need low costs. Well, guess what? That means they are much more likely to have crap Internet and the like, so it won't be a good experience.

Plus they really shot themselves in the foot, quality-wise. I remember the original chatter being that it would be maxed settings, maybe even settings beyond what you could have at home, because big huge cloud power! In reality, you got roughly Xbox One X settings. Man, that's not a very expensive console. Like, maybe you get some more interest if you are legitimately offering settings that only a $6,000 dual-GPU monster PC could deliver, because not many people are willing to buy that. But when you are offering what you get from a few-hundred-dollar console, but worse because of lag and compression artifacts, that's a much smaller market.
 
No idea what the "typical Stadia customer" was supposed to be, but I did have one friend that had one and actually liked it. He was a non-gamer who hadn't owned a console in forever, but was intrigued by the concept. Internet speeds around here are quick, so bandwidth wasn't an issue. He dug it, so I'll have to ask what he thinks of this. For all I know he would've been more happy with an Xbox or PS instead. I'm curious if he plans to buy either of those or if the plan is just to bail on gaming again.
 
No idea what the "typical Stadia customer" was supposed to be, but I did have one friend that had one and actually liked it. He was a non-gamer who hadn't owned a console in forever, but was intrigued by the concept. Internet speeds around here are quick, so bandwidth wasn't an issue. He dug it, so I'll have to ask what he thinks of this. For all I know he would've been more happy with an Xbox or PS instead. I'm curious if he plans to buy either of those or if the plan is just to bail on gaming again.

I've had one for over a year. I've got a decent connection (200Mb) and honestly, it was perfect for me when I wanted to play in the short amount of time I had to dedicate to gaming. I've got 3 boys who all have their own Xboxes (well, one of them is technically mine), but I just hate going to play a game I haven't played in a while and having to wait a few minutes for an update. By then, my gaming time has passed and I've got other duties to move on to. It was perfect for me and the casual-style games that I play (PGA Tour, Wreckfest, etc.). Never had any slowdown to speak of, and within 1 minute of turning my TV on, I was up and playing.

I'm glad they're refunding all the money I spent on my Pro subscription ($10/month) and the couple of game purchases I made, along with the hardware. However, there's really nothing else out there now outside of Luna, which has its own issues. xCloud, as I understand it, is not ready for primetime.

Stadia wasn't perfect by any means, but for this married father of 3 with little time for gaming, it was perfect for ME. RIP
 
The problem I've always seen with shit like this is that the people it'll work best for are the people who probably don't want it, and the people who might want it, it isn't likely to work well for. So yeah, if you have a good, stable, high-bandwidth, low-ping Internet connection, and you have a high-quality TV/monitor with low input latency, etc., then you can probably get a good experience. But the kind of people for whom that is a priority are often willing and able to just spend the money on a good gaming setup at home. The kind of person I could see this really appealing to is the kind who DOESN'T want to (or is not able to) spend the money. They need low costs. Well, guess what? That means they are much more likely to have crap Internet and the like, so it won't be a good experience.

Plus they really shot themselves in the foot, quality-wise. I remember the original chatter being that it would be maxed settings, maybe even settings beyond what you could have at home, because big huge cloud power! In reality, you got roughly Xbox One X settings. Man, that's not a very expensive console. Like, maybe you get some more interest if you are legitimately offering settings that only a $6,000 dual-GPU monster PC could deliver, because not many people are willing to buy that. But when you are offering what you get from a few-hundred-dollar console, but worse because of lag and compression artifacts, that's a much smaller market.

Totally agree. The marketing claimed cloud gaming was some magical thing using a supercomputer to game. The reality is it's just playing on a normal PC over the internet.

Even as the internet gets faster and cheaper, hardware also gets faster and cheaper. That makes it very hard for cloud gaming to be worth it to the majority of gamers.
 