https://kotaku.com/google-stadia-streaming-failing-shutdown-report-stream-1848487185/amp
They’re not killing it, but they are moving the focus away from gaming.
It will be forced on everyone eventually. It gives publishers complete control and will pretty much end piracy. The latency will be good enough for the majority of people.
I don't think cloud gaming will ever be for anything but casual games.
The latency issue just kills it.
Your input lag winds up being whatever it normally is for local games plus two times your network latency, once for the mouse/keyboard input to travel to the server, and once for the image to travel back to you.
And then there's the cat fight over licensing that will inevitably restrict availability of titles...
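To put rough numbers on the lag arithmetic in that quote, here's a minimal Python sketch; every figure in it is an illustrative assumption, not a measurement of any real service:

```python
# All figures are illustrative assumptions, not measurements.
local_lag_ms = 40        # assumed button-to-photon lag playing locally
one_way_net_ms = 25      # assumed one-way latency to the streaming server
codec_ms = 10            # assumed encode + decode overhead

# Input travels to the server, the rendered frame travels back:
streamed_lag_ms = local_lag_ms + 2 * one_way_net_ms + codec_ms
print(f"local: {local_lag_ms} ms, streamed: {streamed_lag_ms} ms")
# local: 40 ms, streamed: 100 ms
```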
Yea, when I see kids discussing which grease to put on keyboards to improve key response and speed, I can't imagine that group wanting this type of product. I see the allure of only needing a thin terminal for potentially high-end gaming, but I just don't see a whole lot of demand in that group. And those who can't afford a gaming PC, I don't think, are going to want to pay the monthly fees for such a setup as an alternative.
I don't think cloud gaming will ever be for anything but casual games.
The latency issue just kills it.
Your input lag winds up being whatever it normally is for local games plus two times your network latency, once for the mouse/keyboard input to travel to the server, and once for the image to travel back to you.
And then there's the cat fight over licensing that will inevitably restrict availability of titles...
I see it eventually coming for MMOs or other MMO-like multiplayer games. The game install basically becomes a sandboxed RDP client at that stage, and input lag can be compensated for with various adaptive algorithms and frame timing; a sketch of one such approach follows the quote below.
It will be forced on everyone eventually. It gives publishers complete control and will pretty much end piracy. The latency will be good enough for the majority of people.
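For what "adaptive algorithms and frame timing" could mean in practice, here's a minimal sketch of an adaptive jitter buffer; the thresholds and names are assumptions, not any real client's internals:

```python
import statistics

class JitterBuffer:
    """Delays frame playout just long enough to absorb network jitter,
    trading a little latency for smooth frame timing."""
    def __init__(self, margin_ms=2.0):
        self.transit_samples = []   # recent one-way frame transit times (ms)
        self.margin_ms = margin_ms

    def observe(self, transit_ms):
        self.transit_samples.append(transit_ms)
        self.transit_samples = self.transit_samples[-120:]  # ~2 s at 60 fps

    def playout_delay_ms(self):
        # Hold frames mean + 2*stddev + margin, so late frames usually
        # still arrive before their scheduled display time.
        if len(self.transit_samples) < 2:
            return self.margin_ms
        mean = statistics.mean(self.transit_samples)
        jitter = statistics.stdev(self.transit_samples)
        return mean + 2 * jitter + self.margin_ms
```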
I wouldn't say never, just "not now." Latency can improve, and access to titles depends on the service. Remember, even streaming music services seemed doomed to fail in 2003, when iTunes and other download stores became all the rage. The skeptics were right at the time: many services only had small catalogs, and internet access just wasn't up to the job then. But flash forward a couple of decades and streaming is the dominant form of music consumption, thanks to full catalogs and near-ubiquitous broadband.
I don't think cloud gaming will ever be for anything but casual games.
The latency issue just kills it.
Your input lag winds up being whatever it normally is for local games plus two times your network latency, once for the mouse/keyboard input to travel to the server, and once for the image to travel back to you.
And then there's the cat fight over licensing that will inevitably restrict availability of titles...
Basically, this. While there is absolutely no way to make a streaming service's actual latency match that of a local machine, newer render techniques can retroactively make up for it. Over a streaming service it is possible to measure the delay between the server and the client, so you get an average delay as a known variable. The service can then work some algorithm magic with timestamps on both the user input signals sent to the server and the outputs sent back, and retroactively apply the inputs (a sketch of that idea follows the quoted post below). When I last saw a demo of that tech it resulted in some visual jittering that looked sort of like a frameskip, but that was back in 2018, when getting to tech conferences was still a thing. I have to imagine it has advanced since then, but it will never be flawless, simply "good enough," so it really depends on the titles made available. RPGs, anything turn-based, most MMOs, probably even RTS titles and dungeon crawlers: those are all fine titles for the platform. But any fast-paced FPS, or anything with jumping puzzles, is a straight-up no-go.
Fair, never say never. Technology continually improves.
That said, while bandwidth has improved immensely in the last decade, in that time latencies have been mostly unchanged. At the very least they have improved at a much slower rate.
If we are waiting for:
1.) local input latency +
2.) network send latency +
3.) render time latency +
4.) image compression latency +
5.) image send network latency +
6.) local machine decompression latency +
7.) local display latency
to become competitive with just:
1.) local input latency +
2.) local render time latency +
3.) local display latency
...
I think we'll be waiting for a while.
Certainly they can never become equal, and that isn't even necessary, but the whole 7 stage pipeline needs to become short enough that it doesn't make a difference, and that I think is unlikely any time soon, especially considering how obsessed kids are with the reduced input lag from extreme high refresh rate rendering these days.
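To make the timestamp idea from the reply above concrete, here's a toy sketch; the state and names are invented for illustration, and this is not any vendor's actual algorithm:

```python
class RewindServer:
    """Keeps every input with its client-side timestamp and recomputes
    state from them, so an input that arrives late still takes effect
    at the moment the player actually pressed it."""
    def __init__(self):
        self.inputs = []  # (client_time_ms, dx) pairs

    def on_input(self, client_time_ms, dx):
        self.inputs.append((client_time_ms, dx))

    def state_at(self, now_ms):
        # A real engine would snapshot and rewind rather than replay
        # from zero, but the observable result is the same. The visible
        # correction after a late input is the "frameskip" jitter
        # mentioned in the post above.
        x = 0
        for t_ms, dx in sorted(self.inputs):
            if t_ms <= now_ms:
                x += dx
        return x

server = RewindServer()
server.on_input(client_time_ms=35, dx=+1)  # arrived late over the network
print(server.state_at(now_ms=100))         # still counted: prints 1
```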
Let's not forget who has the power in this relationship.
If gamers decide they won't pay for something, it won't happen. Many of the undesirable things that happen in the industry happen because people continue to pay for things that are anti-consumer, because they are so obsessed with the "gotta have it" mentality, even with all of its shortcomings.
I suspect this one might be a hard sell for at least the PC gaming crowd, but you never know. Consumers' unwillingness to stand on principle never ceases to amaze me.
I don't think cloud gaming will ever be for anything but casual games.
The latency issue just kills it.
Your input lag winds up being whatever it normally is for local games plus two times your network latency, once for the mouse/keyboard input to travel to the server, and once for the image to travel back to you.
And then there's the cat fight over licensing that will inevitably restrict availability of titles...
That's true, but the next time it happens will be the first, or nobody would know what Denuvo is.
If gamers decide they won't pay for something, it won't happen.
Nah, the OS is just going to become more and more bereft of features, and you will need to subscribe to a service to get specific parts of its functionality. Or better yet, pay to opt out of features you don't want.
…seen enough, your OS will be a Cloud OS that you connect to (for a fee)
Challenge accepted.
I'm never wrong.
In 2012, when the first version of the Rift came out, I said Half-Life 3 would be a VR game. I guess I was wrong, because they made Half-Life: Alyx instead of Half-Life 3.
It's one of those technologies where you can see the immediate appeal and how it would ideally work... but reaching that ideal involves a massive amount of work. Like self-driving cars.
I remember a friend right after high school talking about how cloud gaming was the future and was just a couple years (advancements) away... that was 20 years ago.
This is true in many aspects of life. The mild inconvenience of having to deal with third-party DRM, supplemental anti-cheat software, bloated install sizes, etc. can all be accepted, just as each minor erosion of privacy, data breach, etc. is not enough to deter people from the perceived positive aspects of whatever service they're enjoying.
Let's not forget who has the power in this relationship.
If gamers decide they won't pay for something, it won't happen. Many of the undesirable things that happen in the industry happen because people continue to pay for things that are anti-consumer, because they are so obsessed with the "gotta have it" mentality, even with all of its shortcomings.
I suspect this one might be a hard sell for at least the PC gaming crowd, but you never know. Consumers' unwillingness to stand on principle never ceases to amaze me.
If we are waiting for:
1.) local input latency +
2.) network send latency +
3.) render time latency +
4.) image compression latency +
5.) image send network latency +
6.) local machine decompression latency +
7.) local display latency
to become competitive with just:
1.) local input latency +
2.) local render time latency +
3.) local display latency
...
I think we'll be waiting for a while.
Most of the 7 stage pipeline could be engineered down to a level where it wouldn't impact gameplay. Google couldn't do it, but that's because Google can't even write Hello World unless Apple shows them how to do it first - but I digress.
The network send latency could be cut down to just a single digit ms today simply by spending more for better routing. With some focused engineering on the infrastructure side, plus expensive nationwide/global deployment, it could drop to consistently under 3ms.
Image compression and decompression latency could be engineered down to sub-3ms in a single generation. When I remote into my machines via Parsec at 4K, I typically see ~5ms attributed to each side of that. Technologies already exist (but are not yet deployed) which would bring that down to 1ms. I haven't checked what it's like at only 1080p, but I presume it's appreciably lower than what I see at 4K.
So, the latency can largely be brought down to better than acceptable levels with some work. It's a solvable problem. You know what is not a solvable problem, however? Computational resource availability.
1000 simultaneous users who want 3090-level performance will always require 1000 3090s-worth of resources available. Nobody is ever going to pay a subscription price which would allow the provider to recoup that in an acceptable amount of time. Usage is largely cyclic (ie, primetime-focused) with nearly zero usage in off-hours. Since the compute resources need to be in the same region as the users, that also means the provider couldn't simply assign the resources to regions in far-off timezones to increase utilization.
That, IMO, is where the real unsolvable bottleneck lies.
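As a sanity check on this post, here's the quoted seven-stage budget summed with the optimistic figures suggested above, next to the local three-stage budget; every number is an assumption for illustration, not a benchmark of any real service:

```python
# Assumed per-stage latencies in milliseconds, per the targets above.
streamed_ms = {
    "local input":       5,
    "network send":      3,   # "consistently under 3ms" with better routing
    "server render":     8,
    "image compression": 3,   # "sub-3ms in a single generation"
    "image send":        3,
    "decompression":     3,
    "local display":     8,
}
local_ms = {"local input": 5, "local render": 8, "local display": 8}
print(sum(streamed_ms.values()), "ms streamed vs", sum(local_ms.values()), "ms local")
# -> 33 ms streamed vs 21 ms local: close, but never equal, as noted above.
```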
$100 for 6 months, or $100/mo when 6 months are purchased at once? There's a pretty big difference, but either way, it still means a long time to recoup just the basic hardware costs unless the usage pattern looks like a gym membership (<5% of gym members typically use the gym in any given month).
They have an RTX 3080 tier that, from looking at their forums, people are paying for. They're billing it at $100 every 6 months. According to settings in some games, it looked like the basic paid tier ($50 / 6 mo) is running Tesla T10 GPUs.
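A quick back-of-envelope on the recoup point; only the $100-per-6-months price comes from the quote, while the card cost and the users-per-card ratio are hypothetical assumptions:

```python
# Hypothetical recoup math; card cost and sharing ratio are assumptions.
monthly_revenue = 100.0 / 6   # RTX 3080 tier: ~$16.67 per user-month
card_cost = 1200.0            # assumed street price of a 3080-class card alone

print(round(card_cost / monthly_revenue))       # 72 months with one dedicated card per user
print(round(card_cost / (5 * monthly_revenue))) # 14 months if 5 subscribers share a card, gym-style
```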
Every time I forget to change my TV to gaming mode, I can tell within about 3 seconds of using my PS5. 40-50ms of input lag is a deal breaker. People don't want it.
Done right, input lag would be indistinguishable from server lag.
Most people have higher input lag than that running on their local PCs. A mid-range PC driving a 240Hz 1080p screen is, on average, going to have something like 65ms between button press and first frame rendered. It's higher than that for online games; most popular FPS titles will be in the mid-80ms range, assuming a good internet connection.
Every time I forget to change my TV to gaming mode, I can tell within about 3 seconds of using my PS5. 40-50ms of input lag is a deal breaker. People don't want it.
That's why people are paying thousands for PS5s and GPUs, right? They've been trying to force cloud gaming for nearly a decade and it hasn't taken off. Ask Sony, who forced anyone who wanted to play PS3 games through their cloud gaming service. They even killed the PS3's online service for games. Yet nobody bought into it. They either got a real PS3, or they run RPCS3 on PC and enjoy the games at no cost.
It will be forced on everyone eventually.
They sure wish it would, but as history has shown, the more companies try to fight piracy, the easier it gets to pirate. I wonder how many people will share the same cloud service to play a game?
It gives publishers complete control and will pretty much end piracy.
Majority of idiots, you mean. Everyone complains about the latency except those who promote it. I'm a network engineer, and cloud gaming will never be possible; the speed of light is absolute. The only way cloud gaming will have any future is if it's super cheap. Sony wants a lot for their service, but it does offer a lot of games. Nvidia offers no games but does allow you to use some of your existing PC library. Google offers neither and wants you to buy their games to use their service. The fact is, most of these cloud gaming services are going to offer you older games, or games that didn't fare well in the market, if they go with a Netflix-like subscription: games that are so old or so indie that they can run on pretty much any hardware available today for cheap, with no latency. You know, the same hardware that cloud gaming is hoping you'd use to play on their service. Old games can be found cheap, probably cheaper than the monthly fee, or pirated, because fuck morals.
The latency will be good enough for the majority of people.
If you are designing a game specifically to be played online and bake the latency times directly into the engine, it could be done (a sketch of that approach follows the quoted post below), but nobody is going to play CoD, Destiny 2, Halo, or any sort of FPS title on the existing setup and think, "WOW, this is awesome!" It is by its nature a limiting platform that is, and always will be, garbage for AAA twitch-reflex games. But if you wanted to build the most advanced SimCity/Sims/Civilization crossover game, where things could be done with an input poll time of, say, 6s, and design the game to be played remotely, then yeah, that could work just fine. But it would have to be designed to be run remotely over a sort of shitty network, not designed to be run on a state-of-the-art gaming PC and ported to the service. Otherwise it's just not going to fly.
That's why people are paying thousands for PS5s and GPUs, right? They've been trying to force cloud gaming for nearly a decade and it hasn't taken off. Ask Sony, who forced anyone who wanted to play PS3 games through their cloud gaming service. They even killed the PS3's online service for games. Yet nobody bought into it. They either got a real PS3, or they run RPCS3 on PC and enjoy the games at no cost.
They sure wish it would, but as history has shown, the more companies try to fight piracy, the easier it gets to pirate. I wonder how many people will share the same cloud service to play a game?
Majority of idiots, you mean. Everyone complains about the latency except those who promote it. I'm a network engineer, and cloud gaming will never be possible; the speed of light is absolute. The only way cloud gaming will have any future is if it's super cheap. Sony wants a lot for their service, but it does offer a lot of games. Nvidia offers no games but does allow you to use some of your existing PC library. Google offers neither and wants you to buy their games to use their service. The fact is, most of these cloud gaming services are going to offer you older games, or games that didn't fare well in the market, if they go with a Netflix-like subscription: games that are so old or so indie that they can run on pretty much any hardware available today for cheap, with no latency. You know, the same hardware that cloud gaming is hoping you'd use to play on their service. Old games can be found cheap, probably cheaper than the monthly fee, or pirated, because fuck morals.
To give you an idea of how quickly things are evolving, the Stadia service is using Vega 56-equivalent cards to run their games. You know... something that's 6 years old. No ray tracing, no high frame rates, no high resolutions with high frame rates. A Vega 56 is a very capable card today, but not the same as Nvidia's Grid... I mean GeForce Now. I'll be back when Nvidia renames their failed service a fourth time to reflect that they went the Netflix route instead of just allowing you to play some of your PC games.
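For the "bake the latency into the engine" idea above, here's a minimal sketch of the classic fixed-input-delay (lockstep-style) approach; the tick budget and names are illustrative assumptions:

```python
class LockstepSim:
    """Inputs are scheduled a fixed number of ticks in the future, so
    network latency is invisible as long as it stays under the budget."""
    INPUT_DELAY_TICKS = 4   # assumed budget; must exceed round trip in ticks

    def __init__(self):
        self.tick = 0
        self.pending = {}   # execute_tick -> list of commands

    def submit(self, command):
        # Player issues a command now; every client executes it later,
        # on the same agreed-upon tick.
        t = self.tick + self.INPUT_DELAY_TICKS
        self.pending.setdefault(t, []).append(command)

    def step(self):
        for command in self.pending.pop(self.tick, []):
            command()
        self.tick += 1

sim = LockstepSim()
sim.submit(lambda: print("move unit"))  # issued on tick 0...
for _ in range(5):
    sim.step()                          # ...executes on tick 4
```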
Yeah, my PS5 is hooked up to an 11-year-old LG plasma. I mean, it looks beautiful, but I have to imagine the latency is insane. For the RPGs and turn-based games that I like, though, it's more than fine.
You're coming from the perspective of a hardcore gamer. And I agree from that perspective, it will never be good enough for us.
But there are tons of plebs that play consoles on garbage TVs that add 50ms+ of input latency and don't know there's anything better. Even the people buying the new PS5's and Xbox whatever Xs.
Also, to add to that: console gamers played at 30 FPS, and even less, for decades. You didn't hear many complaints from them.
You're coming from the perspective of a hardcore gamer. And I agree from that perspective, it will never be good enough for us.
But there are tons of plebs that play consoles on garbage TVs that add 50ms+ of input latency and don't know there's anything better. Even the people buying the new PS5's and Xbox whatever Xs.
I could see Microsoft making WoW 2 a streamed game; having that running in their data center would be pretty hard to hack/bot/data-mine when none of it lives on the local machine. It's already a subscription service, so it wouldn't be a hard sell.
Yeah. I'm sure there will be some rumbling, but the average person will embrace it. Streaming from Netflix doesn't result in the best video quality, but most people choose to stream movies/TV shows.
Plus, the younger generation will grow up in a world of streaming/cloud subscriptions, and the idea of buying dedicated hardware and having to "bother" with buying a disc or a download, or even downloading a patch on a console, will likely seem like a big inconvenience.
You can do that; it would be similar to how some asymmetric SLI technologies work. The problem would be the amount of data you would have to transmit back and forth: you would basically be moving the frame buffer across the network, and I'm not sure there is a home network in existence that could handle that load (see the arithmetic sketch after the quote below). Intel has proposed a similar concept for the Arc GPUs, offloading some of the minor tasks to an onboard iGPU when it detects one, but it's only really PCIe 5 bandwidth rates that make that feasible.
It would be interesting if someone could craft an engine that decouples complex lighting/shading calculations from simpler geometry calculations.
Something like:
1) Geometry engine just renders textures and geometry on local machine, no lighting calculations done. Use a very light GPU on local machine.
2) Lighting/Shading engine renders everything correctly and sends it as textures to the Geometry engine across the network.
I'm not a graphics engineer, and I certainly don't expect the above to necessarily work, but something in that spirit. Give complex, beautiful calculations to the remote server; leave simple, latency-sensitive geometry calculations local.
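The frame-buffer concern in the reply above is easy to quantify; this is straightforward arithmetic with no assumptions beyond the resolution and pixel format:

```python
# Bandwidth needed to ship an uncompressed frame buffer over the network.
width, height, bytes_per_px, fps = 1920, 1080, 4, 60   # 1080p RGBA at 60 Hz
gbit_per_s = width * height * bytes_per_px * fps * 8 / 1e9
print(f"{gbit_per_s:.1f} Gbit/s")   # ~4.0 Gbit/s raw; 4K is four times that
```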
This is an inherently flawed point of view, because it is so narrow it completely misses... well, almost everything.
I'm a network engineer and cloud gaming will never be possible. The speed of light is absolute.