Google sidelining Stadia

Games also have an inherent latency built into them. Higher frame rates produce lower input lag; I'd personally rather buy a 1080p monitor at 144Hz than a 4K monitor at 60Hz just for the improved response time. For years console gamers claimed that 30fps was good enough for anyone, and that's clearly a lie. If frame rates drop then input lag increases, and Google isn't using state-of-the-art GPUs in their servers. Their CPUs aren't high-clock-speed parts either, so again frame rates will drop below 60fps. They could just lower texture quality and turn down graphics settings, but that means the gameplay experience won't be on par with a high-end gaming PC, which is kinda their whole selling point.
It's already not. If you look around for Stadia comparisons, you'll find it's maybe on par with an Xbox Series X, and usually not even that. And that's without taking into account the quality degradation video compression brings (it might look the same in a YouTube comparison, but in person you'd notice the Xbox's lack of compression).
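Worth spelling out how much the frame rate alone contributes. A quick back-of-envelope in Python (the half-frame average is the usual simplification, not a measurement of any particular setup):

```python
# Back-of-envelope: frame time alone puts a floor under input lag.
# A new input can only appear on the next rendered frame, so you
# wait about half a frame on average, a full frame in the worst case.
for fps in (30, 60, 144):
    frame_ms = 1000 / fps
    print(f"{fps:>3} fps: frame time {frame_ms:5.1f} ms, "
          f"avg added lag ~{frame_ms / 2:4.1f} ms")
```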
 
Everyone in the match connects to the same machine, so physics, player input, etc. are all calculated on the same machine, and that machine has an array of GPUs rendering out the visuals individually for each player.
This is basically what I said earlier in the thread. If you're playing CS or WZ or Fortnite or PUBG, you're already playing at 60-100ms even when your local machine adds 0ms, simply based on the speed of light and servers still running at only 64 ticks. This is true even for the kids who type in all caps that their one true love is low latency and who invested their parents' retirement funds in 240i monitors refreshing at 10khz.

For the people who actually understand this, the game then becomes how to allocate the latency budget. If the machine doing the rendering is also the machine that's running the match, then all of the desync problems that plague these games (and, coincidentally, enable various cheats) go away. That doesn't sound so bad by itself - but there's NFW I'm paying $600 to play PUBG.
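To put rough numbers on the speed-of-light-plus-64-ticks point, here's a back-of-envelope sketch; the ~2/3 c fiber speed is the standard figure, the server distance is an assumption, and real-world routing only adds to it:

```python
# Rough floor on networked-shooter latency: light in fiber moves at
# about 2/3 c (~200 km per ms), and a 64-tick server makes you wait
# half a tick on average before your input is even processed.
# The 2000 km server distance is an assumed example, not a measurement.
KM_PER_MS_FIBER = 200

def latency_floor_ms(distance_km: float, tick_rate_hz: float) -> float:
    rtt = 2 * distance_km / KM_PER_MS_FIBER   # client -> server -> client
    tick_wait = 0.5 * 1000 / tick_rate_hz     # average wait for next tick
    return rtt + tick_wait

print(f"{latency_floor_ms(2000, 64):.1f} ms")  # ~27.8 ms before routing,
                                               # rendering, and display
```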

From a practical standpoint, the problem here is that the engines have to be architected this way from day 1, and after the engines are built, the games built on top of them also have to be built around this from day 1. Neither of those will go far without hardware that caters to this concept (see posts re: codecs essentially requiring ASICs (or FPGAs aka GPUs (third nested parenthesis because why not))). Each of these steps requires a tremendous amount of development (read: $$). This is an area where the first major entrant will fail by default, the second will go a bit further and then fail, and the third will finally show promise but also fail. The fourth one will own the market. No business wants to be the first three. The most important bit here is that none of the failures need to be for technical reasons; rather, they'll happen purely as business failures.

I'll admit that, as deeply opposed to GaaS as I am, it is still morbidly interesting to watch everyone running circa 2015 hardware talk about how cloud gaming will always have unplayable latency. Those folks couldn't recognize latency until 120ms after it hit them in the face. Go look at a Steam hardware survey and then tell me about latency. LOL.
 
One thing they could do with cloud gaming that would at least mitigate some lag is to basically set it up the same way consoles do split-screen games.
The best thing to do with cloud gaming is to offer games that don't depend on reaction times. Turn-based RPGs come to mind. The problem is that the most popular games are heavily dependent on reaction times.
I'll admit that, as deeply opposed to GaaS as I am, it is still morbidly interesting to watch everyone running circa 2015 hardware talk about how cloud gaming will always have unplayable latency. Those folks couldn't recognize latency until 120ms after it hit them in the face. Go look at a Steam hardware survey and then tell me about latency. LOL.
Why would 2015 hardware mean anything? PC hardware doesn't evolve as fast as it used to, and you could run a nearly decade-old PC and still get 60fps. Stadia uses a Vega 56 equivalent GPU, and that's 6-year-old hardware. You also don't get compression artifacts on old hardware like you would on cloud gaming services. If you want to get technical, really old games from the 90's typically ran at 60fps and had the lowest input lag, thanks to hard-wired controllers and CRT TVs. The age of the hardware has nothing to do with it so long as the hardware can handle the game at a reasonable frame rate.

 
Why would 2015 hardware mean anything?
In 2015 you would have needed to be on the higher end to get access to USB 1.3, which allowed for a polling rate of 1000Hz and would have gotten you a base of 8ms input latency, but that's only if they didn't cheap out on it; otherwise it was likely only 500Hz, which was standard on USB 1.2, unless they cheaped out there too, which meant USB 1.1's 250Hz polling speeds. So that's 8ms at best for the USB controller to register a key press, 32ms at worst. Then you can add Windows and drivers, bringing it to a range of 18-42ms. Assuming 60fps on a 60Hz screen, that adds at best another 16.6ms if it was a higher-end screen; otherwise it could very easily be double that, bringing a top-end 2015 gaming PC to between 40 and 81ms after including delays between the GPU and monitor interface. Being unable to maintain 60fps would just increase things from there, and if the CPU is pinned it gets dramatically worse, as the USB input interrupts often collide and get dropped.
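If anyone wants to redo this math with their own numbers, the shape of the chain is easy to sketch; note a device polled at f Hz adds at most 1000/f ms, and the OS/display figures below are assumptions rather than measurements:

```python
# Rough input-to-photon chain for a local PC. A device polled at
# f Hz waits at most 1000/f ms for the next poll (half that on
# average). The OS and display figures are illustrative assumptions
# to plug your own numbers into, not measurements.
def chain_ms(polling_hz: float, os_ms: float, fps: float,
             display_ms: float) -> float:
    poll = 1000 / polling_hz   # worst-case wait for the next poll
    frame = 1000 / fps         # worst-case wait for the next frame
    return poll + os_ms + frame + display_ms

print(f"{chain_ms(1000, 5, 60, 10):.1f} ms")  # ~32.7 ms
print(f"{chain_ms(125, 5, 60, 10):.1f} ms")   # ~39.7 ms
```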

still a better experience than Stadia though… cheaper too.
 
Have any of you ever met anyone who prefers cloud gaming? I've never met, heard of, or read about anyone who prefers cloud gaming.
Android Police is one of the tech blogs I frequent (less so lately, since they redesigned their site and its ads are more intrusive), and on any Stadia article they post, you'll see many fanboys down in the comments. There's one in particular named Lars Jepson or something, and he's usually vehemently defending Stadia from any criticism anyone else posts and constantly raving about how awesome it is to play AAA games on his phone or laptop or whatever.

Meanwhile, I don't see how one would enjoy playing many AAA games on a cramped phone or laptop screen. I would rather wait to play until I'm at home on a proper large TV/monitor or my ultrawide monitor, with a nice speaker or headphone setup. Playing a good action/adventure game, or even an FPS like Doom Eternal, on a small phone screen would kind of kill the immersion and the point of playing those types of games IMO. Playing Rocket League or some other casual game, though, I guess I could see the appeal.
 
Android Police is one of the tech blogs I frequent (less so lately, since they redesigned their site and its ads are more intrusive), and on any Stadia article they post, you'll see many fanboys down in the comments. There's one in particular named Lars Jepson or something, and he's usually vehemently defending Stadia from any criticism anyone else posts and constantly raving about how awesome it is to play AAA games on his phone or laptop or whatever.

Meanwhile, I don't see how one would enjoy playing many AAA games on a cramped phone or laptop screen. I would rather wait to play until I'm at home on a proper large TV/monitor or my ultrawide monitor, with a nice speaker or headphone setup. Playing a good action/adventure game, or even an FPS like Doom Eternal, on a small phone screen would kind of kill the immersion and the point of playing those types of games IMO. Playing Rocket League or some other casual game, though, I guess I could see the appeal.
Agreed, but I'm buying a Steam Deck to see if we can stretch this a little bit.
 
Why would 2015 hardware mean anything?
My bad. I thought hardware had an impact on performance. I guess I'll sell all of my stuff and replace it with something 10 years old because, clearly, hardware doesn't mean anything.

Stadia uses a Vega 56 equivalent GPU, and that's 6-year-old hardware.
Yes, I think it's pretty well established by now that Stadia sucked. One of the reasons it sucked is because it ran on GPUs with roughly the performance of an Apple Watch. But, again, hardware doesn't mean anything, right?

You also don't get compression artifacts on old hardware like you would on cloud gaming services.
This is only sort of correct. Instead of compression artifacts, you get pixels the size of your hand and scene poly counts in the 4-digit range. I'll take the compression artifacts 10 times out of 10.

This is also assuming that no technologies like DLSS exist. Then again, I suppose since hardware doesn't matter, DLSS doesn't exist.
 
In 2015 you would have needed to be on the higher end to get access to USB 1.3, which allowed for a polling rate of 1000Hz and would have gotten you a base of 8ms input latency, but that's only if they didn't cheap out on it; otherwise it was likely only 500Hz, which was standard on USB 1.2, unless they cheaped out there too, which meant USB 1.1's 250Hz polling speeds. So that's 8ms at best for the USB controller to register a key press, 32ms at worst. Then you can add Windows and drivers, bringing it to a range of 18-42ms. Assuming 60fps on a 60Hz screen, that adds at best another 16.6ms if it was a higher-end screen; otherwise it could very easily be double that, bringing a top-end 2015 gaming PC to between 40 and 81ms after including delays between the GPU and monitor interface. Being unable to maintain 60fps would just increase things from there, and if the CPU is pinned it gets dramatically worse, as the USB input interrupts often collide and get dropped.

still a better experience than Stadia though… cheaper too.
I'm confused, are you talking USB standards? USB 2.0 has been around for nearly 20 years, and it should handle 1000Hz just fine. USB 3.1 should handle a maximum 8000Hz polling rate, and most motherboards I've owned for nearly 10 years have had these, and I don't buy high-end motherboards either. Maybe your 1's were a typo and you meant 3's?
Meanwhile, I don't see how one would enjoy playing many AAA games on a cramped phone or laptop screen. I would rather wait to play until I'm at home on a proper large TV/monitor or my ultrawide monitor, with a nice speaker or headphone setup. Playing a good action/adventure game, or even an FPS like Doom Eternal, on a small phone screen would kind of kill the immersion and the point of playing those types of games IMO. Playing Rocket League or some other casual game, though, I guess I could see the appeal.
With Valve's Deck coming out, there's certainly a market for it. Portable handhelds have been around for over 30 years, so it's not like this wasn't always a thing. The problem is that mobile phones and tablets aren't the best platform for cloud games. They're barely the right environment to play games in general. Games on mobile have to be made for touch screens, otherwise the game is unplayable. You could attach a gamepad, but that gets clunky and you're better off with a Switch. Cloud gaming assumes it can just stand in place of a console or PC with no need to change how games function, but that's clearly a lie. Even gamepads have changed how first-person shooters work by giving players built-in aim assist.
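For anyone who hasn't seen how that works, here's a toy sketch of the slowdown flavor of aim assist; every number in it is an illustrative assumption, not any particular game's implementation:

```python
# Toy sketch of gamepad "aim slowdown", one common flavor of aim
# assist: look sensitivity drops while the crosshair is near a
# target. The radius and slowdown values are illustrative
# assumptions, not any specific game's tuning.
def assisted_sensitivity(base: float, angle_to_target_deg: float,
                         assist_radius_deg: float = 5.0,
                         slowdown: float = 0.5) -> float:
    if angle_to_target_deg < assist_radius_deg:
        return base * slowdown  # "sticky" aim near the target
    return base

print(assisted_sensitivity(1.0, 2.0))   # 0.5 while on target
print(assisted_sensitivity(1.0, 10.0))  # 1.0 in open air
```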

My bad. I thought hardware had an impact on performance. I guess I'll sell all of my stuff and replace it with something 10 years old because, clearly, hardware doesn't mean anything.
Some people still use a 2500K with a GTX 970 for 1080p gaming and they get 60fps. Maybe not at medium or high settings, but the hardware still works just fine even today.
Yes, I think it's pretty well established by now that Stadia sucked. One of the reasons it sucked is because it ran on GPUs with roughly the performance of an Apple Watch. But, again, hardware doesn't mean anything, right?
No, Stadia sucked for many other reasons. In fact, most people claim that Stadia had the best performance of the streaming services despite the older hardware. The issue is that people either experienced lag or found the cost of entry too high. Stadia needs you to buy games from them and only them; Nvidia did this years ago with Grid, and that failed horribly. Sony does have a large selection of games you can play at no additional cost, but they also had the worst service. What Stadia has is basically techno-feudalism: once you enter their walled garden, you aren't in capitalism anymore. You don't actually own the games, and the service can and will shut down, as we're seeing right now.

This is only sort of correct. Instead of compression artifacts, you get pixels the size of your hand and scene poly counts in the 4-digit range. I'll take the compression artifacts 10 times out of 10.
Exactly what resolution do you expect older hardware to run at to get giant pixels? Is 1080p that terrible for you? Here are the compression artifacts I'm talking about:
[screenshot: game streaming footage with heavy compression artifacts]
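For scale on why those artifacts show up at all: the stream has to throw away something like 99.5% of the raw pixel data. A back-of-envelope (the 15 Mbps bitrate is an assumption in the typical range):

```python
# Back-of-envelope: why game streaming has to compress so hard.
width, height, fps = 1920, 1080, 60
bits_per_pixel = 24                  # uncompressed 8-bit RGB
raw_mbps = width * height * bits_per_pixel * fps / 1e6
stream_mbps = 15                     # assumed stream bitrate
print(f"raw 1080p60: {raw_mbps:,.0f} Mbps vs stream: {stream_mbps} Mbps "
      f"(~{raw_mbps / stream_mbps:.0f}:1 compression)")
```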

This is also assuming that no technologies like DLSS exist. Then again, I suppose since hardware doesn't matter, DLSS doesn't exist.
Who cares about DLSS? It's the poor man's 4K.
 
I'm confused, are you talking USB standards? USB 2.0 has been around for nearly 20 years, and it should handle 1000Hz just fine. USB 3.1 should handle a maximum 8000Hz polling rate, and most motherboards I've owned for nearly 10 years have had these, and I don't buy high-end motherboards either. Maybe your 1's were a typo and you meant 3's?
Yeah, ignore me, I'm old and mixing up my decades. Senile old man here, yelling at the clouds.
With Valve's Deck coming out, there's certainly a market for it. Portable handhelds have been around for over 30 years, so it's not like this wasn't always a thing. The problem is that mobile phones and tablets aren't the best platform for cloud games. They're barely the right environment to play games in general. Games on mobile have to be made for touch screens, otherwise the game is unplayable. You could attach a gamepad, but that gets clunky and you're better off with a Switch. Cloud gaming assumes it can just stand in place of a console or PC with no need to change how games function, but that's clearly a lie. Even gamepads have changed how first-person shooters work by giving players built-in aim assist.
I want one, just not first-gen. That thing is an emulation beast and I would love to turn it into a portable retro console.
Some people still use a 2500K with a GTX 970 for 1080p gaming and they get 60fps. Maybe not at medium or high settings, but the hardware still works just fine even today.
Just donated mine (it was running a GTX 1060, though). The AMD card (I want to say an R7 370) blew out, I got the 1060 6GB as a replacement/upgrade, and I gave them my old Dell 24" 1080p monitor with it. They're getting good use out of it.
No, Stadia sucked for many other reasons
It's a good idea in a vacuum, where literally NO other alternatives exist. But there are lots of alternatives and lots of games, and nothing on Stadia is an "OMG I must play this now" title, so there's no way anybody at Google who plays games looked at this and said "yeah, this will turn a profit."
Who cares about DLSS? It's the poor man's 4K.
I'm not sure that's quite right, but developers have added graphical complexity at a faster rate than GPUs can keep up with, so it's a product for the times that will help bridge the transition periods. You know they'll soon be touting 8K gaming as the next big thing, and I'd love to see what kind of power we'll need to handle that.
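The pixel math is why upscaling buys so much, though. A quick sketch (the render resolutions are typical upscaler input modes, treated here as assumptions):

```python
# Why upscalers buy headroom: shading cost tracks the render
# resolution, not the output resolution. The input resolutions are
# typical quality/performance-style modes, used here as assumptions.
target_px = 3840 * 2160  # native 4K output
for name, w, h in (("1440p -> 4K", 2560, 1440),
                   ("1080p -> 4K", 1920, 1080)):
    print(f"{name}: shades {w * h / target_px:.0%} of native 4K's pixels")
```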
But wow, those screenshots. That is some crap compression; it's like they rendered it at 720p, compressed it with some shitty Twitch streaming software, then used a basic upscale filter to bring it back to 1080p. That may just be a super bad example, but it's worse than I expected.
 
It's a good idea in a vacuum, where literally NO other alternatives exist. But there are lots of alternatives and lots of games, and nothing on Stadia is an "OMG I must play this now" title, so there's no way anybody at Google who plays games looked at this and said "yeah, this will turn a profit."
It's all about the recurring revenue. It's why Microsoft has Game Pass and why everyone wants you to subscribe to their service: they know you'll forget, and the monthly subscription keeps rolling in. The problem is that cloud gaming is supposed to be an alternative to buying expensive hardware, but today's cheap hardware is actually pretty good. AMD's Ryzen laptop APUs are really good for gaming, and even Intel is pretty decent too.
I'm not sure that's quite right, but developers have added graphical complexity at a faster rate than GPUs can keep up with, so it's a product for the times that will help bridge the transition periods. You know they'll soon be touting 8K gaming as the next big thing, and I'd love to see what kind of power we'll need to handle that.
That's been a thing ever since Quake 3 Arena, and it really hasn't prevented people from running games on a Voodoo Rush graphics card or an Nvidia TNT. Nvidia will always push some new technology that even their best hardware can barely handle, and nobody will care about it for a few generations until even integrated graphics can handle it easily. Right now that's ray tracing.
 