Google sidelining Stadia

I don't think cloud gaming will ever be for anything but casual games.

The latency issue just kills it.

Your input lag winds up being whatever it normally is for local games plus two times your network latency, once for the mouse/keyboard input to travel to the server, and once for the image to travel back to you.

And then there's the cat fight over licensing that will inevitably restrict availability of titles...
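A quick back-of-the-envelope of that formula, with made-up numbers just to show the shape of it (nothing here is a measurement):

local_input_lag_ms = 40   # whatever the game already has when running locally (assumed)
one_way_network_ms = 20   # client -> streaming server, one direction (assumed)

# one trip for the input going up, one trip for the rendered frame coming back
cloud_input_lag_ms = local_input_lag_ms + 2 * one_way_network_ms
print(cloud_input_lag_ms)  # 80 ms under these assumptions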
 
I don't think cloud gaming will ever be for anything but casual games.

The latency issue just kills it.

Your input lag winds up being whatever it normally is for local games plus two times your network latency, once for the mouse/keyboard input to travel to the server, and once for the image to travel back to you.

And then there's the cat fight over licensing that will inevitably restrict availability of titles...
It will be forced on everyone eventually. It gives publishers complete control and will pretty much end piracy. The latency will be good enough for the majority of people.
 
I don't think cloud gaming will ever be for anything but casual games.

The latency issue just kills it.

Your input lag winds up being whatever it normally is for local games plus two times your network latency, once for the mouse/keyboard input to travel to the server, and once for the image to travel back to you.

And then there's the cat fight over licensing that will inevitably restrict availability of titles...
Yeah, when I see kids discussing which grease to put on their keyboard switches to improve key response and speed, I can't imagine that group wanting this type of product. I see the allure of high-end gaming that only needs a thin terminal to play on, but I just don't see a whole lot of demand in that group. And those who can't afford a gaming PC aren't, I think, going to want to pay monthly fees for such a setup as an alternative.
 
I sorta liked the look of that controller they developed, but they should've had a premium dedicated console to release alongside their cheap streaming boxes. Maybe really given a hard push on Android gaming. Oh well, too late now. Stadia will just be added to a long list of failed Alphabet hardware.
 
It seems unfeasible based on the way things work right now, but one or two unforeseen technical advances can change things in a hurry. I can absolutely see gaming going this direction in the not-too-distant future.
That said, I don't think Google will be the ones pioneering much of anything. They're the ones who compete with themselves at every turn and end up killing off all of their efforts just as people start to care.
 
It will be forced on everyone eventually. It gives publishers complete control and will pretty much end piracy. The latency will be good enough for the majority of people.
I see it eventually coming for MMOs or other MMO-like multiplayer games. The game install is basically a sandboxed RDP client at that stage, and input lag can be compensated for with various adaptive algorithms and frame timing.
Done right, input lag would be indistinguishable from server lag. Throw in some direct VPN tech to minimize the impact of external service providers and, bam, for any MMO-esque title you've got a decent little setup. Take it a step further and utilize those TPM 2.0 modules to run the client in a self-contained, authenticated VM, and you have a system that is going to be hard as hell to bot or generally hack.
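A toy sketch of that rewind-and-re-apply idea in Python, just to make the mechanism concrete. The tick rate, the dictionary world state, and the delay estimate are all invented for illustration; no real service's implementation is being described here.

from collections import deque

TICK_MS = 50                                   # assumed 20 Hz server simulation

def simulate(state, _tick):
    # stand-in physics step: drift the player by its current velocity
    return {"x": state["x"] + state["vx"], "vx": state["vx"]}

def apply_input(state, event):
    # stand-in input handler: a "move" event just sets velocity
    return {"x": state["x"], "vx": event["vx"]}

history = deque(maxlen=40)                     # recent (tick, state) snapshots

state = {"x": 0, "vx": 1}
for tick in range(1, 11):                      # run 10 ticks, recording history
    state = simulate(state, tick)
    history.append((tick, state))

def compensated_apply(event, est_one_way_delay_ms, current_tick):
    # Treat the input as if it happened est_one_way_delay_ms in the past:
    # rewind to that snapshot, apply it there, then re-simulate up to "now".
    rewind = min(len(history) - 1, round(est_one_way_delay_ms / TICK_MS))
    past_tick, past_state = history[-1 - rewind]
    s = apply_input(past_state, event)
    for t in range(past_tick + 1, current_tick + 1):
        s = simulate(s, t)
    return s

# A "stop moving" input that spent roughly 100 ms in flight lands as if it had
# been pressed two ticks ago rather than now.
print(compensated_apply({"vx": 0}, est_one_way_delay_ms=100, current_tick=10))

The point is only that the server can fold the measured transit time back out of the result; the visible cost is the occasional correction when the re-simulated state disagrees with what the client already showed.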
 
I don't think cloud gaming will ever be for anything but casual games.

The latency issue just kills it.

Your input lag winds up being whatever it normally is for local games plus two times your network latency, once for the mouse/keyboard input to travel to the server, and once for the image to travel back to you.

And then there's the cat fight over licensing that will inevitably restrict availability of titles...
I wouldn't say never, just "not now." Latency can improve, and access to titles depends on the service. Remember, even streaming music services seemed doomed to fail in 2003, when iTunes and other download stores became all the rage. They were right at the time — many services only had small catalogs, and internet access just wasn't up to the job then. But flash forward a couple of decades and streaming is the dominant form of music consumption thanks to full catalogs and near-ubiquitous broadband.
 
I wouldn't say never, just "not now." Latency can improve, and access to titles depends on the service. Remember, even streaming music services seemed doomed to fail in 2003, when iTunes and other download stores became all the rage. They were right at the time — many services only had small catalogs, and internet access just wasn't up to the job then. But flash forward a couple of decades and streaming is the dominant form of music consumption thanks to full catalogs and near-ubiquitous broadband.


Fair, never say never. Technology continually improves.

That said, while bandwidth has improved immensely in the last decade, in that time latencies have been mostly unchanged. At the very least they have improved at a much slower rate.

If we are waiting for:

1.) local input latency +
2.) network send latency +
3.) render time latency +
4.) image compression latency +
5.) image send network latency +
6.) local machine decompression latency +
7.) local display latency

to become competitive with just:

1.) Local input latency +
2.) local render time latency +
3.) local display latency

...


I think we'll be waiting for a while.

Certainly they can never become equal, and that isn't even necessary, but the whole seven-stage pipeline needs to become short enough that the difference doesn't matter. I think that's unlikely any time soon, especially considering how obsessed kids are these days with the reduced input lag from extremely high refresh rates.
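Adding both pipelines up with some plausible, made-up per-stage budgets (none of these are measurements) shows the size of the gap being fought:

remote_ms = {
    "local input": 5, "network send": 15, "render": 10, "compress": 5,
    "image send": 15, "decompress": 5, "display": 10,
}
local_ms = {"local input": 5, "render": 10, "display": 10}

print(sum(remote_ms.values()), "ms streamed vs", sum(local_ms.values()), "ms local")
# -> 65 ms streamed vs 25 ms local with these assumed numbers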

It will be forced on everyone eventually. It gives publishers complete control and will pretty much end piracy. The latency will be good enough for the majority of people.

Let's not forget who has the power in this relationship.

If gamers decide they won't pay for something, it won't happen. Many of the undesirable things that happen in the industry happen because people continue to pay for things that are anti-consumer, because they're so obsessed with the "gotta have it" mindset even with all of its shortcomings.

I suspect this one might be a hard sell for at least the PC gaming crowd, but you never know. Consumers' unwillingness to stand on principle never ceases to amaze me.
 
Fair, never say never. Technology continually improves.

That said, while bandwidth has improved immensely in the last decade, in that time latencies have been mostly unchanged. At the very least they have improved at a much slower rate.

If we are waiting for:

1.) local input latency +
2.) network send latency +
3.) render time latency +
4.) image compression latency +
5.) image send network latency +
6.) local machine decompression latency +
7.) local display latency

to become competitive with just:

1.) Local input latency +
2.) local render time latency +
3.) local display latency

...


I think we'll be waiting for a while.

Certainly they can never become equal, and that isn't even necessary, but the whole seven-stage pipeline needs to become short enough that the difference doesn't matter. I think that's unlikely any time soon, especially considering how obsessed kids are these days with the reduced input lag from extremely high refresh rates.



Let's not forget who has the power in this relationship.

If gamers decide they won't pay for something, it won't happen. Many of the undesirable things that happen in the industry happen because people continue to pay for things that are anti-consumer, because they're so obsessed with the "gotta have it" mindset even with all of its shortcomings.

I suspect this one might be a hard sell for at least the PC gaming crowd, but you never know. Consumers' unwillingness to stand on principle never ceases to amaze me.
Basically, this. While there is absolutely no way to make a streaming service's actual latency match that of a local machine, newer render techniques can retroactively make up for it. Over a streaming service it is possible to measure the delay between the server and the client, so you can get an average delay as a known variable. The service can then work some algorithmic magic with timestamps on both the user input signals sent to the server and the outputs sent back, and retroactively apply the inputs. When I last saw a demo of that tech it resulted in some visual jittering that looked sort of like a frame skip, but that was back in 2018, when getting to tech conferences was still a thing; I have to imagine it has advanced since then. It will never be flawless, though, only "good enough," so it really depends on the titles made available. RPGs, anything turn-based, most MMOs, probably even RTS titles and dungeon crawlers: sure, those are all going to be fine titles for the platform. But any fast-paced FPS or anything with jumping puzzles is a straight-up no-go.
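For the "measure the delay and treat it as a known variable" step, a toy version would just keep a smoothed round-trip estimate from timestamped echoes; the smoothing factor below is an arbitrary choice (same spirit as TCP's SRTT), not anything a real service necessarily uses:

import time

class DelayEstimator:
    def __init__(self, alpha=0.125):           # arbitrary smoothing factor
        self.alpha = alpha
        self.srtt_ms = None                    # smoothed round-trip time

    def on_echo(self, sent_at_ms, received_at_ms):
        rtt = received_at_ms - sent_at_ms
        self.srtt_ms = rtt if self.srtt_ms is None else (
            (1 - self.alpha) * self.srtt_ms + self.alpha * rtt)
        return self.srtt_ms

    def one_way_ms(self):
        # crude assumption: symmetric path, so one-way is roughly RTT / 2
        return None if self.srtt_ms is None else self.srtt_ms / 2

est = DelayEstimator()
for rtt in (42, 40, 55, 41):                   # pretend round-trip measurements
    now = time.time() * 1000
    est.on_echo(now - rtt, now)
print(round(est.one_way_ms(), 1), "ms estimated one-way delay")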

But if a company were to come out with the next "MUST HAVE MMO" and made it streaming-exclusive, I could see that being a thing. If it is released on multiple platforms, though, I would see the streaming version being the last-ditch "it's my only option" option for everybody, which isn't really a good place to be from a marketing perspective.
 
I don't think cloud gaming will ever be for anything but casual games.

The latency issue just kills it.

Your input lag winds up being whatever it normally is for local games plus two times your network latency, once for the mouse/keyboard input to travel to the server, and once for the image to travel back to you.

And then there's the cat fight over licensing that will inevitably restrict availability of titles...

Cloud gaming is inevitable, but it will never 100% replace local gaming.

It will first replace console gaming.

It's going to be slowly phased into consoles.
First, the normal consoles will all have the option of streaming games.
Then there will be multiple models, with low-end models only supporting streaming and more expensive models able to play games normally.
They'll even have apps that run directly on TVs, no console required.
The expensive consoles will eventually become niche, and some will even be phased out; the competitive/hardcore gamers who care about latency, image quality, etc. will move on to PC gaming.


However, there are plenty of hardcore PC gamers that actually care about things like latency, image quality, etc.
Real PC gaming will never die and there will be plenty of people saying PC gaming is dead like they always do and they'll be wrong like they always are.


I've already said all the above back when the first cloud gaming service was created, and then again when Stadia, etc. were announced. This prediction will 100% come true just like all my others. I'm never wrong.
 
seen enough your OS will be a Cloud OS that you connect to (for a fee)
Nah, the OS is just going to become more and more bereft of features, and you will need to subscribe to a service to get specific parts of functionality out of it. Or better yet, pay to opt out of features you don't want.
 
I remember a friend right after high school talking about how cloud gaming was the future and was just a couple years (advancements) away... that was 20 years ago.
 
I'm never wrong.
Challenge accepted.

*searches by user name and keyword "wrong"*

In 2012 when the first version of the Rift came out I said Half-Life 3 will be a VR game. I guess I was wrong because they made Half-Life: Alyx instead of Half-Life 3.

By your own admission, you were WRONG! :p

Anyway, to contribute to the topic: I knew this was coming basically since its release. Virtually all reports from multiplayer games that weren't cross-play enabled showed how dead Stadia has been. Also, the infrastructure still isn't there to support cloud gaming anywhere remotely rural or outside of big cities, in the US at least. It's the same deal in Europe, where I've been living for the past few years; I have to use cellular internet to get anything over 20-30 Mbps, and that's only during off-peak hours, since I still typically get a fraction of my rated/advertised speeds.
 
I remember a friend right after high school talking about how cloud gaming was the future and was just a couple years (advancements) away... that was 20 years ago.
It's one of those technologies where you can see the immediate appeal and how it would ideally work... but reaching that ideal involves a massive amount of work. Like self-driving cars.
 
Let's not forget who has the power in this relationship.

If gamers decide they won't pay for something, it won't happen. Many of the undesirable things that happen in the industry happen because people continue to pay for things that are anti-consumer, because they're so obsessed with the "gotta have it" mindset even with all of its shortcomings.

I suspect this one might be a hard sell for at least the PC gaming crowd, but you never know. Consumers' unwillingness to stand on principle never ceases to amaze me.
This is true in many aspects of life. The mild inconvenience of having to deal with third-party DRM, supplemental anti-cheat software, bloated install sizes, etc. can all be accepted, just as each minor erosion of privacy, data breach, etc. is not enough to deter people from the perceived positive aspects of whatever service they're enjoying.
 
I think the biggest problem with Stadia is/was that it requires you to purchase games strictly for use with their service. Nvidia's GeForce Now service, on the other hand, utilizes your existing libraries from Steam, Epic, and Ubisoft.

As for input lag issues, I haven't really noticed much when using it; my son (18), however, says it's more responsive than his Xbox One when playing Destiny 2. This was on a Shield Portable with a Founders priority-level membership; I'm not sure if the experience is the same with the free version or not.
 
Shame. A friend of mine uses Stadia quite a bit. Honestly, the latency issue is way overblown; it worked surprisingly well when I tried it at his place. But then again he has a good internet connection, so that may play some part. Compared to GeForce Now, though, Stadia's business model is just dumb: you have to buy the games twice if you want to stream them AND play locally.
 
If we are waiting for:

1.) local input latency +
2.) network send latency +
3.) render time latency +
4.) image compression latency +
5.) image send network latency +
6.) local machine decompression latency +
7.) local display latency

to become competitive with just:

1.) Local input latency +
2.) local render time latency +
3.) local display latency

...


I think we'll be waiting for a while.
Most of the 7 stage pipeline could be engineered down to a level where it wouldn't impact gameplay. Google couldn't do it, but that's because Google can't even write Hello World unless Apple shows them how to do it first - but I digress.

The network send latency could be cut down to just a single digit ms today simply by spending more for better routing. With some focused engineering on the infrastructure side, plus expensive nationwide/global deployment, it could drop to consistently under 3ms.

Image compression and decompression latency could be engineered down to sub-3ms in a single generation. When I remote into my machines via Parsec at 4K, I typically see ~5ms attributed to each side of that. Technologies already exist (but are not yet deployed) which would bring that down to 1ms. I haven't checked what it's like at only 1080p, but I presume it's appreciably lower than what I see at 4K.

So, the latency can largely be brought down to better than acceptable levels with some work. It's a solvable problem. You know what is not a solvable problem, however? Computational resource availability.

1000 simultaneous users who want 3090-level performance will always require 1000 3090s-worth of resources available. Nobody is ever going to pay a subscription price which would allow the provider to recoup that in an acceptable amount of time. Usage is largely cyclic (ie, primetime-focused) with nearly zero usage in off-hours. Since the compute resources need to be in the same region as the users, that also means the provider couldn't simply assign the resources to regions in far-off timezones to increase utilization.

That, IMO, is where the real unsolvable bottleneck lies.
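Rough math on that last point, with every number assumed (hardware cost per seat, subscription price, and how many subscribers can realistically share one seat given the primetime peak):

gpu_cost_per_seat = 1500   # assumed all-in hardware cost per concurrent seat ($)
monthly_price     = 20     # assumed subscription price ($/month)
subs_per_seat     = 2      # assumed: evening-peak overlap limits oversubscription

months_to_recoup = gpu_cost_per_seat / (monthly_price * subs_per_seat)
print(round(months_to_recoup), "months just to recoup the GPU, before power or hosting")
# ~38 months under these assumptions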
 
Not shocked; Google is famous for cutting off side projects before their time.

Stadia can actually work very well provided you have an awesome internet connection. I did Cyberpunk on it and it was mostly fine (at the time, a year or so ago, it was hard to tell if some bugs were CP's fault or Stadia glitches... I'll blame CP). Definitely better than the console experience. But yeah, it is kind of a bummer that I had to buy it just for Stadia, and when the service eventually dies off I will lose access to the game I paid for. OTOH I got the controller and a 4K Chromecast for "free" with the game purchase.

Also, game selection was not great. Too much kiddie stuff and indie games no one cares about. Like the same stuff in every Humble Bundle deal.
 
Most of the 7 stage pipeline could be engineered down to a level where it wouldn't impact gameplay. Google couldn't do it, but that's because Google can't even write Hello World unless Apple shows them how to do it first - but I digress.

The network send latency could be cut down to just a single digit ms today simply by spending more for better routing. With some focused engineering on the infrastructure side, plus expensive nationwide/global deployment, it could drop to consistently under 3ms.

Image compression and decompression latency could be engineered down to sub-3ms in a single generation. When I remote into my machines via Parsec at 4K, I typically see ~5ms attributed to each side of that. Technologies already exist (but are not yet deployed) which would bring that down to 1ms. I haven't checked what it's like at only 1080p, but I presume it's appreciably lower than what I see at 4K.

So, the latency can largely be brought down to better than acceptable levels with some work. It's a solvable problem. You know what is not a solvable problem, however? Computational resource availability.

1000 simultaneous users who want 3090-level performance will always require 1000 3090s-worth of resources available. Nobody is ever going to pay a subscription price which would allow the provider to recoup that in an acceptable amount of time. Usage is largely cyclic (ie, primetime-focused) with nearly zero usage in off-hours. Since the compute resources need to be in the same region as the users, that also means the provider couldn't simply assign the resources to regions in far-off timezones to increase utilization.

That, IMO, is where the real unsolvable bottleneck lies.

They have an RTX 3080 tier that, from looking at their forums, people are paying for. They're billing it at $100 every six months. According to the settings in some games, it looked like the basic paid tier ($50 / 6 mo) is running Tesla T10 GPUs.
 
They have an RTX 3080 tier that, from looking at their forums, people are paying for. They're billing it at $100 every six months. According to the settings in some games, it looked like the basic paid tier ($50 / 6 mo) is running Tesla T10 GPUs.
$100 for 6 months, or $100/mo when 6 months are purchased at once? There's a pretty big difference, but either way, it still means a long time to recoup just the basic hardware costs unless the usage pattern looks like a gym membership (<5% of gym members typically use the gym in any given month).
 
Done right, input lag would be indistinguishable from server lag
Every time I forget to change my TV to gaming mode, I can tell within about 3 seconds of using my PS5. 40-50 ms of input lag is a deal breaker. People don't want it.
 
Every time I forget to change my TV to gaming mode, I can tell within about 3 seconds of using my PS5. 40-50 ms of input lag is a deal breaker. People don't want it.
Most people have higher input lag than that running on their local PCs. A mid-range PC driving a 240 Hz 1080p screen is, on average, going to have something like 65 ms between button press and the first frame rendered, and it's higher than that for online games; most popular FPS titles are going to be into the mid-80s assuming a good internet connection.

But yes, the PS4 and PS5 have excellent control interfaces, getting controller-to-console latency down to less than 15 ms (as low as 2.8 ms under ideal wireless conditions on the PS5), which over wireless is absolutely insane. Paired with a good TV, that stays in the low 20s for most games, with some popping into the 30s depending on the game engine and when it does its polling.

But yes, Sony puts a shitload of work into their controllers and it shows. I'm actually starting to prefer my PS5 over my PC for many games just because of the little niceties, which at first I wrote off as gimmicky but have come to appreciate.
 
$100 for 6 months, or $100/mo when 6 months are purchased at once? There's a pretty big difference, but either way, it still means a long time to recoup just the basic hardware costs unless the usage pattern looks like a gym membership (<5% of gym members typically use the gym in any given month).

It's $100 every six months, so $200 yearly total. It would certainly be interesting to see their use patterns on those levels. It's still hard for me to imagine it being worth it to sub for the higher tier, at least from my use perspective.
 
It will be forced on everyone eventually.
That's why people are paying thousands for PS5s and GPUs, right? They've been trying to force cloud gaming for nearly a decade and it hasn't taken off. Ask Sony, who forced anyone who wanted to play PS3 games to go through their cloud gaming service. They even killed the PS3's online service for games. Yet nobody bought into it. They either got a real PS3 or they run RPCS3 on PC and enjoy the games at no cost.
It gives publishers complete control and will pretty much end piracy.
They sure wish it would, but as history has shown, the more companies try to fight piracy, the easier it gets to pirate. I wonder how many people will share the same cloud service account to play a game?
The latency will be good enough for the majority of people.
The majority of idiots, you mean. Everyone complains about the latency except those who promote it. I'm a network engineer, and cloud gaming will never be possible. The speed of light is absolute. The only way cloud gaming will have any future is if it's super cheap. Sony wants a lot for their service, but it does offer a lot of games. Nvidia offers no games but does allow you to use some of your existing PC library. Google offers neither and wants you to buy their games to use their service. The fact is, if they go the Netflix-like subscription route, most of these cloud gaming services are going to offer you older games or games that didn't fare well in the market. Games that are so old or so indie that they can run on pretty much any hardware available today, cheaply and with no latency. You know, the same hardware that cloud gaming is hoping you'd use to play on their service. Old games can be found cheap, probably cheaper than the monthly fee, or pirated, because fuck morals.

To give you an idea of how quickly things are evolving, the Stadia service is using Vega 56-equivalent cards to run their games. You know... something that's six years old. No ray tracing, no high frame rates, and no high resolutions with high frame rates. A Vega 56 is a very capable card today, but it's not the same as Nvidia's Grid... I mean GeForce Now. I'll be back when Nvidia renames their failed service a fourth time to reflect that they went the Netflix route instead of just allowing you to play some of your PC games.
 
That's why people are paying thousands for PS5s and GPUs, right? They've been trying to force cloud gaming for nearly a decade and it hasn't taken off. Ask Sony, who forced anyone who wanted to play PS3 games to go through their cloud gaming service. They even killed the PS3's online service for games. Yet nobody bought into it. They either got a real PS3 or they run RPCS3 on PC and enjoy the games at no cost.

They sure wish it would, but as history has shown, the more companies try to fight piracy, the easier it gets to pirate. I wonder how many people will share the same cloud service account to play a game?

The majority of idiots, you mean. Everyone complains about the latency except those who promote it. I'm a network engineer, and cloud gaming will never be possible. The speed of light is absolute. The only way cloud gaming will have any future is if it's super cheap. Sony wants a lot for their service, but it does offer a lot of games. Nvidia offers no games but does allow you to use some of your existing PC library. Google offers neither and wants you to buy their games to use their service. The fact is, if they go the Netflix-like subscription route, most of these cloud gaming services are going to offer you older games or games that didn't fare well in the market. Games that are so old or so indie that they can run on pretty much any hardware available today, cheaply and with no latency. You know, the same hardware that cloud gaming is hoping you'd use to play on their service. Old games can be found cheap, probably cheaper than the monthly fee, or pirated, because fuck morals.

To give you an idea of how quickly things are evolving, the Stadia service is using Vega 56-equivalent cards to run their games. You know... something that's six years old. No ray tracing, no high frame rates, and no high resolutions with high frame rates. A Vega 56 is a very capable card today, but it's not the same as Nvidia's Grid... I mean GeForce Now. I'll be back when Nvidia renames their failed service a fourth time to reflect that they went the Netflix route instead of just allowing you to play some of your PC games.
If you are designing a game specifically to be played online and bake the latency into the engine, it could be done, but nobody is going to play CoD, Destiny 2, Halo, or any sort of FPS title on the existing setup and think, "WOW, this is awesome!" By its nature it is going to be a limiting platform that is, and always will be, garbage for AAA twitch-reflex games. But if you wanted to build the most advanced SimCity/Sims/Civilization crossover game, where things could be done with an input poll time of, say, 6 s, and design the game to be played remotely from said environment, then yeah, that could work just fine. It would have to be designed to be run remotely over a sorta shitty network, though, not designed to run on a state-of-the-art gaming PC and then ported to the service. That's just not going to fly.
 
It will be forced on everyone eventually. It gives publishers complete control and will pretty much end piracy. The latency will be good enough for the majority of people.

Exactly. That is why developers are pushing for it. Latency? Ubisoft laughs in your face. You'll deal with it or stop gaming.

It won't happen soon, but I think within the decade we'll see a big push for streaming/cloud gaming. Not everywhere in the world will have 100% stable internet, but they'll incentivize it. Say Assassin's Creed 50 will be cloud only for 6 months, and then when a traditional version comes out a lot of side missions, weapons, and other content will not be available in that version.
 
Streamed gaming is pretty awesome for some games. FPS and driving games are awful, but RPGs and strategy aren't bad. The Witcher 3 was amazing on the Nvidia Shield with maxed-out settings running in 4K. I'm just sad you can't launch Steam on the Shield anymore and stream those games from the Steam/Nvidia servers.
 
That's why people are paying thousands for PS5s and GPUs, right? They've been trying to force cloud gaming for nearly a decade and it hasn't taken off. Ask Sony, who forced anyone who wanted to play PS3 games to go through their cloud gaming service. They even killed the PS3's online service for games. Yet nobody bought into it. They either got a real PS3 or they run RPCS3 on PC and enjoy the games at no cost.

They sure wish it would, but as history has shown, the more companies try to fight piracy, the easier it gets to pirate. I wonder how many people will share the same cloud service account to play a game?

The majority of idiots, you mean. Everyone complains about the latency except those who promote it. I'm a network engineer, and cloud gaming will never be possible. The speed of light is absolute. The only way cloud gaming will have any future is if it's super cheap. Sony wants a lot for their service, but it does offer a lot of games. Nvidia offers no games but does allow you to use some of your existing PC library. Google offers neither and wants you to buy their games to use their service. The fact is, if they go the Netflix-like subscription route, most of these cloud gaming services are going to offer you older games or games that didn't fare well in the market. Games that are so old or so indie that they can run on pretty much any hardware available today, cheaply and with no latency. You know, the same hardware that cloud gaming is hoping you'd use to play on their service. Old games can be found cheap, probably cheaper than the monthly fee, or pirated, because fuck morals.

To give you an idea of how quickly things are evolving, the Stadia service is using Vega 56-equivalent cards to run their games. You know... something that's six years old. No ray tracing, no high frame rates, and no high resolutions with high frame rates. A Vega 56 is a very capable card today, but it's not the same as Nvidia's Grid... I mean GeForce Now. I'll be back when Nvidia renames their failed service a fourth time to reflect that they went the Netflix route instead of just allowing you to play some of your PC games.

You're coming from the perspective of a hardcore gamer. And I agree from that perspective, it will never be good enough for us.

But there are tons of plebs that play consoles on garbage TVs that add 50ms+ of input latency and don't know there's anything better. Even the people buying the new PS5's and Xbox whatever Xs.
 
You're coming from the perspective of a hardcore gamer. And I agree from that perspective, it will never be good enough for us.

But there are tons of plebs that play consoles on garbage TVs that add 50ms+ of input latency and don't know there's anything better. Even the people buying the new PS5's and Xbox whatever Xs.
Yeah, my PS5 is hooked up to an 11-year-old LG plasma. I mean it looks beautiful, but I have to imagine the latency is insane but for RPGs and turn-based games that I like it's more than fine.
 
You're coming from the perspective of a hardcore gamer. And I agree from that perspective, it will never be good enough for us.

But there are tons of plebs that play consoles on garbage TVs that add 50ms+ of input latency and don't know there's anything better. Even the people buying the new PS5's and Xbox whatever Xs.
Also, to add to that, console gamers played at 30 FPS or even less for decades. You didn't hear many complaints from them.
 
Also, to add to that, console gamers played at 30 FPS or even less for decades. You didn't hear many complaints from them.

Yeah. I'm sure there will be some rumbling, but most average people will embrace it. Streaming from Netflix doesn't give the best video quality, but most people still choose to stream movies and TV shows.

Plus, the younger generation will grow up in a world of streaming/cloud subscriptions, and the idea of buying dedicated hardware and having to "bother" with a disc, a download, or even downloading a patch on a console will likely seem like a big inconvenience.
 
Yeah. I'm sure there will be some rumbling, but most average people will embrace it. Streaming from Netflix doesn't give the best video quality, but most people still choose to stream movies and TV shows.

Plus, the younger generation will grow up in a world of streaming/cloud subscriptions, and the idea of buying dedicated hardware and having to "bother" with a disc, a download, or even downloading a patch on a console will likely seem like a big inconvenience.
I could see Microsoft making WoW 2 a streamed game; having it run entirely in their data center would make it pretty hard to hack, bot, or data-mine when none of it lives on the local machine. It's already a subscription service, so that wouldn't be a hard sell.
 
Fair, never say never. Technology continually improves.

That said, while bandwidth has improved immensely in the last decade, in that time latencies have been mostly unchanged. At the very least they have improved at a much slower rate.

If we are waiting for:

1.) local input latency +
2.) network send latency +
3.) render time latency +
4.) image compression latency +
5.) image send network latency +
6.) local machine decompression latency +
7.) local display latency

It would be interesting if someone could craft an engine that decouples complex lighting/shading calculations from simpler geometry calculations.
Something like:
1) Geometry engine just renders textures and geometry on local machine, no lighting calculations done. Use a very light GPU on local machine.
2) Lighting/ Shading engine renders everything correctly and sends it as textures to the Geometry engine across the network.

I'm not a graphics engineer, and I certainly don't expect the above to work as described, but something in that spirit: give the complex, beautiful calculations to the remote server and leave the simple, latency-sensitive geometry calculations local.
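Not a real renderer, but a hedged sketch of that split: a "remote" thread bakes lighting into an atlas at its own pace, while the local loop draws geometry every frame with whatever lighting arrived last. All the names, rates, and the "lighting atlas" string are stand-ins for illustration.

import queue
import threading
import time

lighting_updates = queue.Queue(maxsize=1)      # only the newest baked lighting matters

def remote_lighting_server(stop):
    version = 0
    while not stop.is_set():
        time.sleep(0.1)                        # pretend a 100 ms GI/shading bake
        version += 1
        try:
            lighting_updates.put_nowait(f"lighting_atlas_v{version}")
        except queue.Full:
            pass                               # local side hasn't consumed the last one yet

def local_render_loop(frames, stop):
    lighting = "flat_fallback_lighting"        # usable before any remote update arrives
    for frame in range(frames):
        try:
            lighting = lighting_updates.get_nowait()   # adopt the newest bake if available
        except queue.Empty:
            pass                               # otherwise keep the last one; never stall
        # latency-critical local part: rasterize geometry with the current lighting
        print(f"frame {frame}: geometry drawn with {lighting}")
        time.sleep(1 / 60)                     # assumed 60 fps local loop
    stop.set()

stop = threading.Event()
threading.Thread(target=remote_lighting_server, args=(stop,), daemon=True).start()
local_render_loop(frames=30, stop=stop)

The property that matters is that the local loop never waits on the network; a late lighting update just means slightly stale shading for a frame or two.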
 
It would be interesting if someone could craft an engine that decouples complex lighting/shading calculations from simpler geometry calculations.
Something like:
1) Geometry engine just renders textures and geometry on local machine, no lighting calculations done. Use a very light GPU on local machine.
2) Lighting/ Shading engine renders everything correctly and sends it as textures to the Geometry engine across the network.

I'm not a graphics engineer, and I certainly don't expect the above to work as described, but something in that spirit: give the complex, beautiful calculations to the remote server and leave the simple, latency-sensitive geometry calculations local.
You can do that; it would be similar to how some asymmetric SLI technologies work. The problem would be the amount of data you would have to transmit back and forth: you would basically be moving the frame buffer back and forth, and I'm not sure there is a home network in existence that could handle that load. Intel has proposed a similar concept for its Arc GPUs, though, offloading some of the minor tasks to an onboard iGPU when it detects one, but it's only really PCIe 5.0 bandwidth that makes that feasible.
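The raw numbers behind that bandwidth concern, for an uncompressed RGBA frame buffer (a real pipeline would obviously compress, but the raw figure shows the scale):

width, height, bytes_per_px, fps = 3840, 2160, 4, 60
gbit_per_s = width * height * bytes_per_px * fps * 8 / 1e9
print(f"{gbit_per_s:.1f} Gbit/s for raw 4K60 RGBA")    # about 15.9 Gbit/s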
 
I'm a network engineer and cloud gaming will never be possible. The speed of light is absolute.
This is an inherently flawed point of view because it is so narrow it completely misses...well, almost everything.

Roundtrip speed of light from San Diego to Boston (basically the two most distant population centers in the US) is ~30ms. You are, by the slimmest technicality possible, correct on that front: A San Diego user who is playing on a cloud gaming service hosted in the most distant populated corner of the continent will never get better than 30ms latency (assuming no magic like quantum entanglement). Congrats on knowing a factoid, but let's look at how things actually work in real life:
  • You're talking about online gaming against human opponents located beyond the walls of your living room. How does that play out today?
    • The game server is hosted in some distant data center. It is the same data center, in fact, that hosts the vGPUs for the cloud gaming service.
    • Your computer sends data to that server (15ms), the server crunches some numbers (16+ms), the server sends the data back to your computer (15ms), your computer renders a frame and puts it onto the screen (16+ms).
    • From input to displayed frame, we're at 62ms minimum.
  • Ok, now let's compare that to a cloud gaming service with current-gen GPUs:
    • Your computer sends data to the server (15ms), the server crunches some numbers (16+ms), the server sends the data back to your computer (15ms), your computer decodes the frame (5ms) and puts it onto the screen (16+ms)
    • From input to displayed frame, we're at 67ms minimum, an increase of 5ms.
  • In the nearly 10 years since The Almighty CS:GO was released (August this year will mark the 10th anniversary), your reaction time has slowed by more than the difference
Why is the difference so small? It's because nearly everything that happens in the second scenario also happens in the first one - it's just that the physical location of where the frames are rendered is different.

The problems holding back cloud gaming today are not technical, they are simply the business case. Expensive GPUs are expensive no matter which chassis they're stuck in. Games are currently made for local rendering, and this means there is a lack of optimization when they're run in the cloud (i.e., they are all run as separate standalone sessions which then communicate with a standalone server, rather than the more time-efficient unified approach). This is because the business case for cloud gaming has not justified architecting major titles specifically for it - yet.

Oh, it's also worth the epiphany that CS:GO is a six-sigma outlier for gaming. Calling it a niche is overstating its size and importance by an order of magnitude. There is an entire world of gaming that doesn't rely on its users falsely thinking they need every fraction of a millisecond in latency reduction. Over 80% of Steam players are playing their titles at 60fps or less. There are strategy games, sports games, RPGs, non-twitch shooters, etc etc which all play great with realistic cloud latency. Saying categorically that cloud gaming could never work is just silly.

This all probably makes me sound like I'm in favor of cloud gaming. I'm actually very opposed to it. The idea of locally executed software with perpetual licenses getting replaced by SaaS disgusts me. You'll have to pry my perpetual owned licenses from my cold dead hands. The goal of cloud gaming is not to combat piracy, it is not to improve the user experience, it is not to reduce cheating. The goal is plain and simple: increase revenues by turning a $30 game that you play for five years into a $10/mo rental that ends up totaling $600 after playing for five years. Eff that and the horse it rode in on.
 