Google sidelining Stadia

Usage is largely cyclic (i.e., primetime-focused) with nearly zero usage in off-hours. Since the compute resources need to be in the same region as the users, that also means the provider couldn't simply assign the resources to regions in far-off timezones to increase utilization.
As long as it needs to be close, that is true, but if gaming hardware is very similar to hardware that can do other things (say, if mining on it were viable), you could fairly easily have that type of hardware in use nearly 24 hours a day, I imagine.
The goal is plain and simple: increase revenues by turning a $30 game that you play for five years into a $10/mo rental that ends up totaling $600 after playing for five years. Eff that and the horse it rode in on.

If that is true, it would create a giant incentive to create games people play a lot. If I actually loved a game so much that I played it nonstop for 5 years, I would not mind paying a lot for it, but I think I would end up paying a lot for games I do not play. I can see how it could be popular (like Netflix vs. a collection of DVDs).
 
I see it eventually coming for MMOs or other MMO-like multiplayer games. The game install is basically a sandboxed RDP client at that stage; input lag can be compensated for with various adaptive algorithms and frame timing.
Done right, input lag would be indistinguishable from server lag; throw in some direct VPN tech to minimize impact from external service providers and bam, for any MMO-esque title you've got a decent little setup. Take it a step further and utilize those TPM 2.0 modules to run the client in a self-contained, authenticated VM and you have a system that is going to be hard as hell to bot or generally hack.
bullshit, there is NO technology that can overcome the latency of a cloud-based system. no matter what clever network prediction is employed, there will always and forever be the delay between what is displayed on your monitor and whatever input you just made. always and forever, this cannot change, at least until the only thing limiting latency is literally the speed of light.

once everything goes to the cloud (which i, sadly, do agree will probably happen eventually) i will quit gaming forever (aside from "retro" gaming pre-cloud style) because i hate. hate. hate. hate input lag in all its forms. we had joyous input latency in the early days of videogaming, then it got wrecked as we transitioned from CRT to LCD tech, and just now we are finally getting close to how it used to be, and this shit is just going to do the equivalent of setting gaming back decades again.

edit: and i'm fully aware that there's a good chunk of players out there, young and old, who can (and are probably doing so right now) play with half a second of input lag and just keep truckin' because they don't know any better. i'm happy for them, i wish i could ignore it too but i can't.
 
bullshit, there is NO technology that can overcome the latency of a cloud-based system. no matter what clever network prediction is employed, there will always and forever be the delay between what is displayed on your monitor and whatever input you just made. always and forever, this cannot change, at least until the only thing limiting latency is literally the speed of light.

once everything goes to the cloud (which i, sadly, do agree will probably happen eventually) i will quit gaming forever (aside from "retro" gaming pre-cloud style) because i hate. hate. hate. hate input lag in all its forms. we had joyous input latency in the early days of videogaming, then it got wrecked as we transitioned from CRT to LCD tech, and just now we are finally getting close to how it used to be, and this shit is just going to do the equivalent of setting gaming back decades again.

edit: and i'm fully aware that there's a good chunk of players out there, young and old, who can (and are probably doing so right now) play with half a second of input lag and just keep truckin' because they don't know any better. i'm happy for them, i wish i could ignore it too but i can't.
Adaptive algorithms have gotten latency down to the mid-80ms range at 1080p; it's not until you get into 4K that you start seeing 200+ms. The controller input is a minor fraction of the actual delays: on a local PC, depending on the USB controller, it can easily be as bad as 26.6ms before you even get the monitor involved, but a mid-range PC with a cheaper 240Hz 1080p screen is going to be in the 40-60ms range overall. So somebody playing from a cloud service at 1080p can reasonably expect to see double the latency, but many online games, especially MMOs, have been dealing with methods for masking all sorts of lag for decades; they extend the polling times and call it the GCD. So yes, 200+ms at 4K is reasonably unplayable for basically anything, but as the latency differences from 1080p to 4K show, the bulk of said latency is in the transmission of the video stream. Better algorithms there can greatly cut down on those times, and they get better as bandwidth increases, but it is certainly not here yet.

In regard to being where we used to be, we're not even close: arcade systems from the '80s had control-to-screen latencies in the single digits, rarely getting as bad as the low teens. The PS5 makes great strides in its attempts to reattain those levels, but it isn't there yet, again mostly due to video transmission and HDMI speeds.
 
I could see Microsoft making WoW 2 a streamed game; with it running in their data center, it would be pretty hard to hack/bot/data mine when none of it lives on the local machine. It's already a subscription service, so that wouldn't be a hard sell.
My wife built a fishing bot and enjoyed using it for years until she finally got banned for it. It paid for her subscription for a long time.
 
My wife built a fishing bot and enjoyed using it for years until she finally got banned for it. It paid for her subscription for a long time.
I used a fishing bot in FFXI till they nerfed it. Made a crap ton of money while I was at work and never got banned. I did get jailed for 3 days for calling someone an obscenity, though. I am surprised I never got my character banned with all the crap I did. Used bots on kings and stuff. I never sold any, just made relics and stuff.
 
Yeah, she sold the fish for gold before using the gold to buy subscription tokens. She also gave away huge amounts of free fish to avoid people reporting the bot. This worked for years until some hater nudged the bot and noticed it kept the fishing motions up despite no longer being pointed at the water. We live in Arkansas and hear people use obscenities every day if we go out in public for very long. I don't know why people get upset about it on the Internet; it's not like someone is going to pull a 9 on you for calling someone names online, unless you somehow were to construe that into SWATting or associated terribleness.
 
Yeah, she sold the fish for gold before using the gold to buy subscription tokens. She also gave away huge amounts of free fish to avoid people reporting the bot. This worked for years until some hater nudged the bot and noticed it kept the fishing motions up despite no longer being pointed at the water. We live in Arkansas and hear people use obscenities every day if we go out in public for very long. I don't know why people get upset about it on the Internet; it's not like someone is going to pull a 9 on you for calling someone names online, unless you somehow were to construe that into SWATting or associated terribleness.
Oh, it was just a dick in my LS that I pissed off somehow who reported me. It wasn't something that was said in public.
 
  • You're talking about online gaming against human opponents located beyond the walls of your living room. How does that play out today?
    • The game server is hosted in some distant data center. It is the same data center, in fact, that hosts the vGPUs for the cloud gaming service.
    • Your computer sends data to that server (15ms), the server crunches some numbers (16+ms), the server sends the data back to your computer (15ms), your computer renders a frame and puts it onto the screen (16+ms).
    • From input to displayed frame, we're at 62ms minimum.
  • Ok, now let's compare that to a cloud gaming service with current-gen GPUs:
    • Your computer sends data to the server (15ms), the server crunches some numbers (16+ms), the server sends the data back to your computer (15ms), your computer decodes the frame (5ms) and puts it onto the screen (16+ms)
    • From input to displayed frame, we're at 67ms minimum, an increase of 5ms.
  • In the nearly 10 years since The Almighty CS:GO was released (August this year will mark the 10th anniversary), your reaction time has slowed by more than the difference (tallied in the quick sketch below).
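For anyone who wants to check the arithmetic, here is a minimal tally of the two scenarios above in Python. The per-stage numbers are the minimums assumed in the bullets, not measurements.

```python
# Back-of-the-envelope tally of the two scenarios above.
# Per-stage values are the assumed minimums from the bullets, in milliseconds.

def total_ms(stages):
    """Sum a chain of per-stage latencies (ms)."""
    return sum(stages.values())

online_local_render = {
    "client -> server": 15,
    "server tick": 16,
    "server -> client": 15,
    "local render + display": 16,
}

cloud_streamed = {
    "client -> server": 15,
    "server tick + render": 16,
    "server -> client (video)": 15,
    "client decode": 5,
    "display": 16,
}

print(total_ms(online_local_render))                             # 62
print(total_ms(cloud_streamed))                                  # 67
print(total_ms(cloud_streamed) - total_ms(online_local_render))  # 5
```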
From input to displayed frame for the first scenario should not include some distant data center. In CS I can completely lose network connection and still look around freely. The frames are rendering regardless of some server. This is important to me because, regardless of your conscious reaction time, you can still feel the difference of an extra frame or two. It feels sluggish, it feels not right. Every time vsync accidentally turns on in any video game I get irritated simply because of the added delay. 60ms might not mean much for your conscious reaction time, but can the motor/visual feedback system of my brain tell if there is an added 60ms delay from mouse input to video output? Yes. Easily.

Can I learn to play with this annoying delay? Maybe? I don't know.
 
From input to displayed frame for the first scenario should not include some distant data center. In CS I can completely lose network connection and still look around freely. The frames are rendering regardless of some server. This is important to me because, regardless of your conscious reaction time, you can still feel the difference of an extra frame or two. It feels sluggish, it feels not right. Every time vsync accidentally turns on in any video game I get irritated simply because of the added delay. 60ms might not mean much for your conscious reaction time, but can the motor/visual feedback system of my brain tell if there is an added 60ms delay from mouse input to video output? Yes. Easily.

Can I learn to play with this annoying delay? Maybe? I don't know.
If you're playing CS alone without a server, you aren't playing CS. You're going to have to invent a better fake example than that.
 
If you're playing CS alone without a server, you aren't playing CS. You're going to have to invent a better fake example than that.
Which part of my post are you disagreeing with or what did you not understand?
 
Adaptive algorithms have gotten latency down to the mid-80ms range at 1080p; it's not until you get into 4K that you start seeing 200+ms. The controller input is a minor fraction of the actual delays: on a local PC, depending on the USB controller, it can easily be as bad as 26.6ms before you even get the monitor involved, but a mid-range PC with a cheaper 240Hz 1080p screen is going to be in the 40-60ms range overall. So somebody playing from a cloud service at 1080p can reasonably expect to see double the latency, but many online games, especially MMOs, have been dealing with methods for masking all sorts of lag for decades; they extend the polling times and call it the GCD. So yes, 200+ms at 4K is reasonably unplayable for basically anything, but as the latency differences from 1080p to 4K show, the bulk of said latency is in the transmission of the video stream. Better algorithms there can greatly cut down on those times, and they get better as bandwidth increases, but it is certainly not here yet.

In regard to being where we used to be, we're not even close: arcade systems from the '80s had control-to-screen latencies in the single digits, rarely getting as bad as the low teens. The PS5 makes great strides in its attempts to reattain those levels, but it isn't there yet, again mostly due to video transmission and HDMI speeds.
That doesn't appear to be correct at all. nVidia has some good info on game latency but the long and the short of it is that for latency sensitive games like shooters, we can see them under 20ms on high speed hardware/monitors. That aside I think you are missing the biggest issue with cloud gaming and latency: No matter what, it will always be higher than what is local because it has ALL the same issues a local system has, plus network, plus video compression.

So if we discover a clever way to reduce the lag of the actual input from the controller, which isn't much these days but you can still see it on the graph that's great... but it works for a controller on your local system, as well as the cloud. So cloud still ends up worse.

You can't eliminate the fact that you have to compress (and decompress) the video unless you want to start to use 10+ gbit streams, and you can't ever eliminate the transmission time because of that pesky speed of light. Also there's the pesky issue of buffering. Internet connections are not perfectly reliable; there are missed packets that need to be retransmitted. To deal with that you need a buffer, and the more issues you want to avoid, the bigger the buffer needs to be. That's on top of any other buffers in the system (like a frame buffer in rendering). If you want smooth, glitch-free playback, you need a buffer (probably at least one frame because of how video compression works) and that puts in additional lag.

This is a bitch we fight with in realtime audio when you start talking complex setups. The ADC has a buffer, the network equipment (for Dante) has a buffer, the NIC/sound card has a buffer, etc. They all add up. Each one might not be very large, but you have to take them all into account; the more steps in the chain, the more buffers, and they are all additive when you are talking realtime latency.
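As a tiny illustration of how those buffers stack, here is a sketch with made-up per-stage values; the only point is that the delays are strictly additive, not that these numbers are real.

```python
# Each stage in a realtime chain contributes its own buffer delay.
# The per-stage values below are illustrative placeholders, not measurements.

buffers_ms = {
    "ADC buffer": 1.3,
    "Dante network buffer": 1.0,
    "NIC / sound card buffer": 2.7,
    "application / frame buffer": 5.3,
}

# Buffers are strictly additive: the chain can never be faster than the sum
# of its stages, and every extra hop only adds more.
print(f"end-to-end buffering: {sum(buffers_ms.values()):.1f} ms")
```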

As for the idea of MMOs compensating for lag yes they do, but it is a very different kind of lag. It is your client and the server communicating, not the game running. So when you do something, like change your camera or cast a spell, your client can react to that right away. Doesn't matter if the server finds out about it a little later, that's where the compensation comes in. Great, but doesn't work for streaming. It works because when I say "do X" the local system does X before the server has heard about it and confirmed it. With streaming, when you say "do X" it has to go out to the server, get rendered, compressed, sent back, and decompressed before you can see it.
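A minimal sketch of that client-side prediction idea, assuming a toy 1-D movement model and made-up method names: the client moves immediately and reconciles when the authoritative state arrives later, which is the step a pure video stream has no local simulation to do.

```python
# Minimal sketch of client-side prediction / reconciliation (the MMO-style
# compensation described above). Toy 1-D movement; names are illustrative.

class PredictedClient:
    def __init__(self):
        self.position = 0.0
        self.pending = []         # inputs the server hasn't acknowledged yet
        self.next_seq = 0

    def apply_input(self, dx):
        """Player pressed a key: move immediately, remember the input."""
        self.position += dx
        self.pending.append((self.next_seq, dx))
        self.next_seq += 1
        return self.next_seq - 1  # sequence number to send to the server

    def on_server_state(self, server_position, last_acked_seq):
        """Authoritative state arrives ~RTT later: rewind and replay."""
        self.position = server_position
        self.pending = [(s, dx) for (s, dx) in self.pending if s > last_acked_seq]
        for _, dx in self.pending:
            self.position += dx   # re-apply inputs the server hasn't seen yet
```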

Now that isn't to say you can't learn to cope with it. You can have a fair bit of latency in between you pressing a button and something happening and adapt to it... but it isn't going to feel as good, just like you can deal with a TV that doesn't have a game mode, but it doesn't feel as good. Cloud gaming will forever feel muddy and unresponsive compared to local.

There is no magic that gets you out of the speed of light.
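For a rough sense of what the speed-of-light floor looks like: signals in fiber travel at roughly two-thirds of c, about 200 km per millisecond, so the propagation part of the round trip is easy to bound. The distances below are just examples; real links add routing, queuing, and encode/decode time on top.

```python
# Lower bound on round-trip time from propagation alone (fiber ~= 2/3 c).
# Real routes add routing, queuing, serialization and codec delay on top.

KM_PER_MS_IN_FIBER = 200  # ~200,000 km/s

def min_rtt_ms(one_way_km):
    return 2 * one_way_km / KM_PER_MS_IN_FIBER

for km in (20, 100, 500, 2000):
    print(f"{km:>4} km one-way -> at least {min_rtt_ms(km):.1f} ms round trip")
```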
 
Which part of my post are you disagreeing with or what did you not understand?
In one scenario, you're dealing with low local latency and high server latency and therefore shooting at players who aren't there yet. In the other scenario, you have high server latency but every frame is perfectly synced with the server data so every player you shoot at is in the position shown in the frame. Both scenarios net out the same although they take different routes to get there.

There is no magic that gets you out of the speed of light.
People keep saying this, but saying it more times doesn't make it any less wrong. Your own post had absolutely nothing to do with this, so why mention it at all?
 
That doesn't appear to be correct at all. nVidia has some good info on game latency but the long and the short of it is that for latency sensitive games like shooters, we can see them under 20ms on high speed hardware/monitors. That aside I think you are missing the biggest issue with cloud gaming and latency: No matter what, it will always be higher than what is local because it has ALL the same issues a local system has, plus network, plus video compression.

So if we discover a clever way to reduce the lag of the actual input from the controller, which isn't much these days but you can still see it on the graph that's great... but it works for a controller on your local system, as well as the cloud. So cloud still ends up worse.

You can't eliminate the fact that you have to compress (and decompress) the video unless you want to start to use 10+ gbit streams, and you can't ever eliminate the transmission time because of that pesky speed of light. Also there's the pesky issue of buffering. Internet connections are not perfectly reliable; there are missed packets that need to be retransmitted. To deal with that you need a buffer, and the more issues you want to avoid, the bigger the buffer needs to be. That's on top of any other buffers in the system (like a frame buffer in rendering). If you want smooth, glitch-free playback, you need a buffer (probably at least one frame because of how video compression works) and that puts in additional lag.

This is a bitch we fight with in realtime audio when you start talking complex setups. The ADC has a buffer, the network equipment (for Dante) has a buffer, the NIC/sound card has a buffer, etc. They all add up. Each one might not be very large, but you have to take them all into account; the more steps in the chain, the more buffers, and they are all additive when you are talking realtime latency.

As for the idea of MMOs compensating for lag yes they do, but it is a very different kind of lag. It is your client and the server communicating, not the game running. So when you do something, like change your camera or cast a spell, your client can react to that right away. Doesn't matter if the server finds out about it a little later, that's where the compensation comes in. Great, but doesn't work for streaming. It works because when I say "do X" the local system does X before the server has heard about it and confirmed it. With streaming, when you say "do X" it has to go out to the server, get rendered, compressed, sent back, and decompressed before you can see it.

Now that isn't to say you can't learn to cope with it. You can have a fair bit of latency in between you pressing a button and something happening and adapt to it... but it isn't going to feel as good, just like you can deal with a TV that doesn't have a game mode, but it doesn't feel as good. Cloud gaming will forever feel muddy and unresponsive compared to local.

There is no magic that gets you out of the speed of light.
Yes, it can be very low on good monitors with a good motherboard, but an older board (say an AMD 350-series) or any USB port running off the chipset is dependent on the Windows poll rates, which are around 10ms, and simple frame cadence at something running 60fps means polling for an input every 16.6ms; then you add in the monitor's latency and the numbers can get pretty big. So yes, if you are running a newer machine with better hardware and pushing far higher than 60fps, your input latencies are going to be far, far better, but Stadia was never aimed at people with awesome gaming rigs; it was aimed at people who want to game from anything but a gaming rig.
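To put some numbers on that input side alone, here's a quick worst-case sketch using the figures above (a ~10ms poll interval and a 16.6ms frame at 60fps); the inputs are the post's assumptions, not measurements.

```python
# Worst case for the input side alone, before the monitor is even involved:
# the press lands just after a USB poll, and the polled input just misses a
# frame. Poll interval and frame rate are the assumptions from the post.

def worst_case_input_ms(poll_interval_ms, fps):
    return poll_interval_ms + 1000 / fps

print(f"10 ms poll @ 60 fps:  {worst_case_input_ms(10.0, 60):.1f} ms")   # roughly the 26.6 ms figure quoted earlier
print(f"1 ms poll @ 240 fps:  {worst_case_input_ms(1.0, 240):.1f} ms")   # a high-refresh gaming rig
```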

So no, it's not going to be better than a gaming machine, ever; that won't happen (unless it's 10+ years old, but then is it still a gaming machine?), but it's going to do a hell of a lot better than that $300 Acer Bob got from Walmart during Black Friday last year.

Stadia's whole problem is that they are essentially catering to casual or non-gamers who don't have ANY other option but to use their service, yet they are not only charging a subscription but also charging full price for a product that only works on said service. So they are in the position of being the dead-last pick for people who are unsure they even want to be doing what they are doing, and who are then asked to spend more money on it. From a financial perspective that is a pretty shitty place to be, and I am not at all surprised that Google is having to find a new market for it.

But yes, dealing with latencies above 80ms is something you can visually identify; it makes things feel spongy, you pick up on it, and it is not a pleasant sensation. Above 150ms it becomes actively unpleasant, because even the average person with sub-par hand-eye coordination can "feel" (not sure if that is the right word there) the disconnect between action and reaction.

With the current technologies and internet infrastructure around the world, cloud gaming is not viable for much of what is out there. If they were to develop their own first-party titles for the service, designed from the ground up to be played on a cloud service and taking all its limitations into consideration during every aspect of the game design, then perhaps it could be better, but who would they sell that game to? It would have to be one hell of a game, and that's not a risk I see Google undertaking any time soon.
 
Yes, it can be very low on good monitors with a good motherboard, but an older board (say an AMD 350-series) or any USB port running off the chipset is dependent on the Windows poll rates, which are around 10ms, and simple frame cadence at something running 60fps means polling for an input every 16.6ms; then you add in the monitor's latency and the numbers can get pretty big. So yes, if you are running a newer machine with better hardware and pushing far higher than 60fps, your input latencies are going to be far, far better, but Stadia was never aimed at people with awesome gaming rigs; it was aimed at people who want to game from anything but a gaming rig.

So no, it's not going to be better than a gaming machine, ever; that won't happen (unless it's 10+ years old, but then is it still a gaming machine?), but it's going to do a hell of a lot better than that $300 Acer Bob got from Walmart during Black Friday last year.
The problem isn't the competition against that Acer, it is against the Xbox and PS5. The issue is that there are pretty good gaming systems that you can get in your home (well, in theory, this chip shortage aside) that DON'T have the issues of cloud gaming. If it were just PCs vs cloud, yeah, I'd say cloud gaming would win easily. Most people don't want to spend on a high-end PC nor do they want to deal with the issues. But the consoles are something that lots of people are willing and able to afford; that's why they are all sold out all the time.

Another issue is that the lower end the gamer, the more their other equipment like TV and network are likely to suck, and thus the more the added cloud latency matters. If you have a setup with a superfast display with a hardware Gsync module and thus basically no processing latency, input devices that report 1000 times a second and so on but you play a console style game where it is 30fps, maybe 60fps, then sure adding on a frame of latency for a buffer and another frame or two for network is annoying and noticeable, but not game breaking. You still can easily have a total button-to-screen time under 100ms which will feel a little mushy but not bad. However if you have someone that has a cheap TV, with high processing latency, whatever cheap controller Stadia uses, and a poor network connection then you can have a much higher latency. It might have more than 100ms for a local game, and start to push up to the quarter second or more range on the cloud which gets REAL noticeable.

So for it to not matter much, you need someone who has good hardware and network. Those kind of people are less likely to want to deal with the tradeoff.

This is all aside from the fact that cloud providers have shown they are clearly unwilling to throw lots of hardware at the services. While Google claimed it would be way better than a high-end PC, in reality most games looked, at best, equal to the Xbox Series X version and usually worse.
 
The problem isn't the competition against that Acer, it is against the Xbox and PS5. The issue is that there are pretty good gaming systems that you can get in your home (well, in theory, this chip shortage aside) that DON'T have the issues of cloud gaming. If it were just PCs vs cloud, yeah, I'd say cloud gaming would win easily. Most people don't want to spend on a high-end PC nor do they want to deal with the issues. But the consoles are something that lots of people are willing and able to afford; that's why they are all sold out all the time.

Another issue is that the lower end the gamer, the more their other equipment like TV and network are likely to suck, and thus the more the added cloud latency matters. If you have a setup with a superfast display with a hardware Gsync module and thus basically no processing latency, input devices that report 1000 times a second and so on but you play a console style game where it is 30fps, maybe 60fps, then sure adding on a frame of latency for a buffer and another frame or two for network is annoying and noticeable, but not game breaking. You still can easily have a total button-to-screen time under 100ms which will feel a little mushy but not bad. However if you have someone that has a cheap TV, with high processing latency, whatever cheap controller Stadia uses, and a poor network connection then you can have a much higher latency. It might have more than 100ms for a local game, and start to push up to the quarter second or more range on the cloud which gets REAL noticeable.

So for it to not matter much, you need someone who has good hardware and network. Those kind of people are less likely to want to deal with the tradeoff.

This is all aside from the fact that cloud providers have shown they are clearly unwilling to throw lots of hardware at the services. While Google claimed it would be way better than a high-end PC, in reality most games looked, at best, equal to the Xbox Series X version and usually worse.
Yeah, like I said, it's for people who are left with no other option: people who can afford games and the subscription, who already have a device they can run Stadia on and a great internet connection, but who have no gaming PC or console. It's a very, very small demographic and I don't know who they think they are marketing it to, but that can't be a large enough demographic to pay for that degree of infrastructure; if somebody told me otherwise I would call them a bald-faced liar and I would need to see some damning evidence to convince me. Given that Google is having to find other markets to sell their services to, it looks like they couldn't find it either. I mean, in 15 years, on some 10G wireless network, could this work? Sure, maybe... ish? But today is not that day. I can't see how it would be a nice experience for anybody they are actually trying to sell this to; if anything it would put them off the concept of gaming entirely with an "I tried it a few times and really didn't enjoy it."
 
In one scenario, you're dealing with low local latency and high server latency and therefore shooting at players who aren't there yet. In the other scenario, you have high server latency but every frame is perfectly synced with the server data so every player you shoot at is in the position shown in the frame. Both scenarios net out the same although they take different routes to get there.
Yes, I do not disagree with that. But given the (bad) choice between the two, I prefer low local latency, and I have a feeling many others would as well. I find it easier to use analog interface controls (moving the camera viewpoint, moving the mouse cursor, moving UI windows around, etc.) when they are quick to respond.
 
Cloud gaming seems to be a relatively flawed concept. I think the only thing that keeps this failed concept on a perpetual return is the fact that the companies who push it stand to make a lot of money if it were successful. Similar to VR (especially with the push of Meta), it feels like brute force capitalism, where companies are going to shove bad ideas down the public's throat until they've been dumbed down and made desperate enough to accept them.
 
Maybe they knew it would fail and the whole thing is a cover for massive Crypto farms where they launder their dark money now that they are allowed to be evil?
 
Me, an owner of 3 Stadia controllers because they were free and/or the cheapest way to get Chromecast Ultra devices. I've used Stadia for literally 2-3 hours total out of my free 3+ months.


Shocked! Shocked!
Well, absolutely not at all shocked.

Seriously though, the controllers are pretty decent and the chromecast 4k's have been REALLY useful.
 
Yeah. I'm sure there will be some rumbling, but for the average person most will embrace it. Streaming from Netflix doesn't result in the best video quality, but most people choose to stream movies/TV shows.
Video quality doesn't affect gameplay as much as latency does. They had all of COVID to get people onto cloud gaming, and now Google is abandoning it.
Plus, the younger generation will grow up in a world of streaming/cloud subscriptions, and the idea of buying dedicated hardware and having to "bother" with buying a disc, downloading a game, or even downloading a patch on a console will likely seem like a big inconvenience.
If anything, the younger generation will continue to use tablets and smartphones to play Among Us 2, which will also run on a toaster. As inconvenient as downloading patches and inserting discs are, being disconnected online and losing progress would seem to be worse. Not to mention monthly fees are something the younger generation seems to avoid. Recurring revenue is nice, but what we know from Netflix- and Hulu-type services is that most people either share accounts or unsub and then resub when something interesting is on. I expect this sort of problem with Game Pass as time goes on, once people get their fill of the games they want to play and then unsub. That's assuming future cloud gaming services even work like Netflix, because right now only Sony does this with PS Now.
 
What I am seeing from the teens of the people I work with is the new trend of building custom keyboards for gaming: getting individual mechanical keys, and now some that are Hall-effect based, to try to squeeze more out of their reaction time, and getting rid of the arrows and the number pad to make the board smaller and easier to transport to a friend's house. It is kind of neat to see the hardcore gamer in another generation. I know that isn't everyone, but it's still awesome.
 
edit: and i'm fully aware that there's a good chunk of players out there, young and old, who can (and are probably doing so right now) play with half a second of input lag and just keep truckin' because they don't know any better. i'm happy for them, i wish i could ignore it too but i can't.
Imagine that they already have half a second of input lag from just horrible equipment, and now they have another half second or more of input lag on top of it. Do people really think they're willing to deal with a whole second of input lag? The problem is that a lot of people who think cloud gaming can work are also the same people on $2k-plus computers with a TV with super low latency. Not the person who's using the wifi that came with their ISP, which adds latency, and an awful 10+ year old cheap Walmart uWu-brand TV from China on which, even with game mode on, you still get a lot of latency. Don't forget the Bluetooth controller, which is fine when wired, but they're too ignorant to understand that wireless is just adding a ton of latency. This is the audience cloud gaming is catering to, and nobody got time for 1 second of input lag.

Adaptive algorithms have gotten latency down to the mid-80ms range at 1080p; it's not until you get into 4K that you start seeing 200+ms.
There is no magic algorithm that fixes latency. The most an algorithm can do is predict your actions and then go with that. Some emulators actually employ this; it's called run-ahead, and it literally runs parts of the game ahead of time to predict your actions. This works fine for emulating something like the SNES or Genesis, but not for a modern game that already taxes the CPU.
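For the curious, a very rough sketch of the run-ahead idea (as popularized by emulator frontends like RetroArch): step the real frame, speculate a couple of frames further assuming the input doesn't change, display the speculative frame, then roll back with savestates. The core.save_state / load_state / step_frame / render hooks here are placeholders, not any emulator's actual API.

```python
# Rough sketch of emulator run-ahead. The `core` object and its methods are
# placeholders for whatever savestate/step/render hooks an emulator exposes.

RUNAHEAD_FRAMES = 2  # how many frames of built-in game latency to hide

def run_frame_with_runahead(core, current_input):
    # Advance the real emulated frame with the latest input.
    core.step_frame(current_input)
    real_present = core.save_state()

    # Speculate further, assuming the input stays the same, and display
    # that future frame instead of the present one.
    for _ in range(RUNAHEAD_FRAMES):
        core.step_frame(current_input)
    core.render()

    # Restore the real timeline; if next frame's input differs from the
    # guess, the speculation is simply redone from the restored state.
    core.load_state(real_present)
```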


The controller input is a minor fraction of the actual delays
The DualShock 4 is known for horrible input lag in Bluetooth mode.
but many online games, especially MMOs, have been dealing with methods for masking all sorts of lag for decades; they extend the polling times and call it the GCD.
Yes, but also, since you're running a local client, the local client pretends to run instantly while the server does not. That's why sometimes things look like they happen one way but then a correction occurs and something else happened: the server disagreed with the client, and the client corrected.
People keep saying this, but saying it more times doesn't make it any less wrong. Your own post had absolutely nothing to do with this, so why mention it at all?
Because some people seem to believe that the speed of light isn't an issue and keep ignoring it. We don't want something like special relativity between us and our profits. With the power of capitalism we'll remove the speed of light.
 
The DualShock 4 is known for horrible input lag in Bluetooth mode.
Seems to depend on what type of BT adapter you use and the software/drivers on Windows. Otherwise, on the console and with the right BT setup, lag is pretty much negligible compared to being wired. I know Domingo is pretty knowledgeable on using the DS4 on Windows and was saying that until recently, unless you had the licensed PS BT adapter (which they don't sell anymore), there were lag issues and dropped inputs, but he reported recently that some driver update for Windows or something fixed all the lag issues he saw using his DS4 wirelessly on Windows.

Personally I just wire up my controllers to my PC anyway so there are no potential issues. PC gaming is annoying enough sometimes for me, trying to figure out how to optimize games to run or troubleshooting random software conflicts or bugs, that I try to keep my setup as simple as possible.
 
Seems to depend on what type of BT adapter you use and the software/drivers on Windows. Otherwise, on the console and with the right BT setup, lag is pretty much negligible compared to being wired. I know Domingo is pretty knowledgeable on using the DS4 on Windows and was saying that until recently, unless you had the licensed PS BT adapter (which they don't sell anymore), there were lag issues and dropped inputs, but he reported recently that some driver update for Windows or something fixed all the lag issues he saw using his DS4 wirelessly on Windows.

Personally I just wire up my controllers to my PC anyway so there are no potential issues. PC gaming is annoying enough sometimes for me, trying to figure out how to optimize games to run or troubleshooting random software conflicts or bugs, that I try to keep my setup as simple as possible.
I always have trouble getting the DS4 working right with games over BT. Some games just ignore it. It works fine when using a cable. Connect a PS5 controller and it works perfectly every time with BT.
 
Seems to depend on what type of BT adapter you use and the software/drivers on Windows. Otherwise, on the console and with the right BT setup, lag is pretty much negligible compared to being wired. I know Domingo is pretty knowledgeable on using the DS4 on Windows and was saying that until recently, unless you had the licensed PS BT adapter (which they don't sell anymore), there were lag issues and dropped inputs, but he reported recently that some driver update for Windows or something fixed all the lag issues he saw using his DS4 wirelessly on Windows.

Personally I just wire up my controllers to my PC anyway so there are no potential issues. PC gaming is annoying enough sometimes for me, trying to figure out how to optimize games to run or troubleshooting random software conflicts or bugs, that I try to keep my setup as simple as possible.

Yeah, for years and years the DS4s would drop inputs over Bluetooth. No idea why, but I tested a pile of different BT adapters (both USB and internal) and they all did it. You might not notice it, but it would regularly happen when doing complex movements with lots of button presses. The kind of input you'd use for a Zangief super in Street Fighter, for example (two 360-degree rotations + 3 buttons at once in under a second), would fail most of the time. Some Tekken sequences that require almost a dozen directional inputs in about a second were impossible. Yet using the official PS4 adapter, I could do these things every time. Sony stopped selling them and prices went through the roof even though the adapter launched at $25.

At some point either Bluetooth adapters got better or something else happened because it's no longer an issue. DS4's (and the DualSense) work wonderfully with newer BT adapters. My last two mobos have had built-in Intel BT adapters that work just as well as playing on the console if not better.
 
Yeah, for years and years the DS4s would drop inputs over Bluetooth. No idea why, but I tested a pile of different BT adapters (both USB and internal) and they all did it. You might not notice it, but it would regularly happen when doing complex movements with lots of button presses. The kind of input you'd use for a Zangief super in Street Fighter, for example (two 360-degree rotations + 3 buttons at once in under a second), would fail most of the time. Some Tekken sequences that require almost a dozen directional inputs in about a second were impossible. Yet using the official PS4 adapter, I could do these things every time. Sony stopped selling them and prices went through the roof even though the adapter launched at $25.

At some point either Bluetooth adapters got better or something else happened because it's no longer an issue. DS4's (and the DualSense) work wonderfully with newer BT adapters. My last two mobos have had built-in Intel BT adapters that work just as well as playing on the console if not better.

The Bluetooth spec allows a lot of latency etc. because it's focused on low power. PlayStation uses it differently from the standard spec so they can get lower latency, so if you're just using the Bluetooth hardware built into your PC it won't necessarily work the same way.
At least that was the case when the PlayStation 3 launched way back. Maybe on PS4 and PS5 that isn't the case anymore, and there are newer versions of Bluetooth that probably have low-latency modes.
 
There is no magic algorithm that fixes latency. The most an algorithm can do is predict your actions and then go with that. Some emulators actually employ this; it's called run-ahead, and it literally runs parts of the game ahead of time to predict your actions. This works fine for emulating something like the SNES or Genesis, but not for a modern game that already taxes the CPU.
Yeah, I don't know how it works; probably it has more to do with pre-rendering certain textures or lighting effects, or more likely dynamically turning down graphics settings so that it's faster to process and send the frames. In either event they claim it works some sort of magic to improve latency, but it's likely some sort of graphical cheat.
 
Yeah, I don't know how it works; probably it has more to do with pre-rendering certain textures or lighting effects, or more likely dynamically turning down graphics settings so that it's faster to process and send the frames. In either event they claim it works some sort of magic to improve latency, but it's likely some sort of graphical cheat.
From what I've seen so far, it's BS. They make handwavy claims about how they have some amazin' cloud tech that will just solve all problems, but nothing to indicate that it is anything more than marketing. Turning down graphics and such actually doesn't help, because you still run into the fundamental video compression issue: you need to have a frame rendered, then you process it and compress it, and if you want the compression to be good it has to be whole-frame compression that does deltas from previous frames, then send that frame out to the client to be decompressed.

Doesn't matter what the source frame is, you have to have it, then compress the whole thing, and also know what the previous (and ideally next) frame is to do the compression. While there is video compression that isn't full-frame, like Display Stream Compression you see on DP/HDMI, it isn't very good. Small, run length, segment-by-segment compression gets you the least. Better compression takes a whole frame and looks at all the similarities in it to deal with (intraframe compression, like JPEG or AVCIntra). Even better compression then uses delta frames, where you only compress a whole frame every so often, maybe 15 frames at most, and the rest are delta frames which basically just say "this is what changed since the last frame". The best compression then takes that to the max where you have a VERY long time between intra frames and you also use bidirectional delta frames, which reference not only the changes from the previous frame, but from the next one as well. This is how things like AVC/HEVC streams online work.

That's all well and good for things that aren't interactive, but the more you go up that efficiency train, the bigger an encode/decode buffer you need and there's no getting around it. You have to have one or more frames that you are looking at to handle things. Particularly with B frames where you have to "see the future" and know what frames are coming later to decode the current frame. So you can get a 4k60 stream that looks pretty good in maybe 25mbps, but only by using all those nifty coding techniques. If you want to do it intraframe you can be looking at 400mbps or more.
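A toy calculation of just the reordering part of that buffer: a B frame can't be produced until the later frame it references exists, so each run of consecutive B frames forces the encoder (and decoder) to hold that many frames. The GOP patterns below are illustrative.

```python
# Toy estimate of the frame-reordering delay B frames add to a live stream.
# Each consecutive B frame needs a later reference frame to exist first.

def reorder_delay_ms(consecutive_b_frames, fps):
    return consecutive_b_frames * 1000 / fps

for b in (0, 1, 3):  # 0 = I/P only (low latency), 3 = common for normal video
    print(f"{b} consecutive B frames @ 60 fps: +{reorder_delay_ms(b, 60):.1f} ms")
```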

Probably all they are actually doing is just tuning the encoders to use fewer B frames (often you get 3 in a row for normal video) or even none, which cuts the quality but lowers the size of the buffer, and acting like that is some amazing thing.
 
From what I've seen so far, it's BS. They make handwavy claims about how they have some amazin' cloud tech that will just solve all problems, but nothing to indicate that it is anything more than marketing. Turning down graphics and such actually doesn't help, because you still run into the fundamental video compression issue: you need to have a frame rendered, then you process it and compress it, and if you want the compression to be good it has to be whole-frame compression that does deltas from previous frames, then send that frame out to the client to be decompressed.

Doesn't matter what the source frame is, you have to have it, then compress the whole thing, and also know what the previous (and ideally next) frame is to do the compression. While there is video compression that isn't full-frame, like Display Stream Compression you see on DP/HDMI, it isn't very good. Small, run length, segment-by-segment compression gets you the least. Better compression takes a whole frame and looks at all the similarities in it to deal with (intraframe compression, like JPEG or AVCIntra). Even better compression then uses delta frames, where you only compress a whole frame every so often, maybe 15 frames at most, and the rest are delta frames which basically just say "this is what changed since the last frame". The best compression then takes that to the max where you have a VERY long time between intra frames and you also use bidirectional delta frames, which reference not only the changes from the previous frame, but from the next one as well. This is how things like AVC/HEVC streams online work.

That's all well and good for things that aren't interactive, but the more you go up that efficiency train, the bigger an encode/decode buffer you need and there's no getting around it. You have to have one or more frames that you are looking at to handle things. Particularly with B frames where you have to "see the future" and know what frames are coming later to decode the current frame. So you can get a 4k60 stream that looks pretty good in maybe 25mbps, but only by using all those nifty coding techniques. If you want to do it intraframe you can be looking at 400mbps or more.

Probably all they are actually doing is just tuning the encoders to use fewer B frames (often you get 3 in a row for normal video) or even none, which cuts the quality but lowers the size of the buffer, and acting like that is some amazing thing.

Yeah the cloud service PR is annoying and full of BS. I haven't been impressed by anything cloud providers have done so far.
Maybe they do have some super great compression algorithms, but the compression issue will eventually be solved with enough bandwidth and faster hardware anyways so I don't really find it impressive or interesting.

The only thing I've seen that was interesting was how they have the controller send input directly to the cloud instead of to a console/PC and then to the cloud.
I guess that's cool, but with a good controller that input latency is almost nothing anyways.

The way their marketing hypes it up they make it sound like the games would have engine changes with some sort of magic lag compensation algorithms.
But all they do is run the same exact game you would run locally on a cloud computer.
 
Yeah the cloud service PR is annoying and full of BS. I haven't been impressed by anything cloud providers have done so far.
Maybe they do have some super great compression algorithms, but the compression issue will eventually be solved with enough bandwidth and faster hardware anyways so I don't really find it impressive or interesting.
Nah, they can't actually. Not only is it super hard to develop compression algorithms, but to run on low end hardware you need an ASIC. Decompressing video on a CPU is super time intensive. Like your CPU might not be fast enough to do HEVC decompression in realtime, even if it is new. To do AVC or VP9 decompression at HD rates you need a fairly heavy hitter. Well that is right out, the whole idea is a system needs to be cheap, and ideally small and low power. So how then do these tiny things like Roku decode compressed video? With ASICs, same as your desktop actually. There is a bit of dedicated silicon that knows how to decode the various compression algorithms and is REALLY fast and efficient at it. Uses a tiny bit of power and decodes no problem. Every modern CPU has it baked in...

...but the downside is that it works ONLY for the algorithms it is specifically designed for. It isn't arbitrary silicon that can be used for anything. So if I make a new algorithm, I need it baked into CPUs that make it into devices before end users can use it. That is one of the things that keeps adoption of new algorithms down. Netflix hasn't switched to HEVC or AV1, despite them being better compression than VP9 and stable formats, because not enough devices have decode support for them. If they did, only newer stuff could play it back; older phones/Rokus/etc. would not work.

Thus if a game streaming company developed some amazing codec, they'd have to convince a bunch of manufacturers to include it in their CPU, and wait for those to get to market. I guess you could argue that Google kinda does that, they own VP9 and AV1, but those are not special for Stadia. VP9 is what Youtube uses, and AV1 is used for basically nothing at this point as it is too new.
 
Nah, they can't actually. Not only is it super hard to develop compression algorithms, but to run on low end hardware you need an ASIC. Decompressing video on a CPU is super time intensive. Like your CPU might not be fast enough to do HEVC decompression in realtime, even if it is new. To do AVC or VP9 decompression at HD rates you need a fairly heavy hitter. Well that is right out, the whole idea is a system needs to be cheap, and ideally small and low power. So how then do these tiny things like Roku decode compressed video? With ASICs, same as your desktop actually. There is a bit of dedicated silicon that knows how to decode the various compression algorithms and is REALLY fast and efficient at it. Uses a tiny bit of power and decodes no problem. Every modern CPU has it baked in...

...but the downside is that it works ONLY for the algorithms it is specifically designed for. It isn't arbitrary silicon that can be used for anything. So if I make a new algorithm, I need it baked into CPUs that make it into devices before end users can use it. That is one of the things that keeps adoption of new algorithms down. Netflix hasn't switched to HEVC or AV1, despite them being better compression than VP9 and stable formats, because not enough devices have decode support for them. If they did, only newer stuff could play it back; older phones/Rokus/etc. would not work.

Thus if a game streaming company developed some amazing codec, they'd have to convince a bunch of manufacturers to include it in their CPU, and wait for those to get to market. I guess you could argue that Google kinda does that, they own VP9 and AV1, but those are not special for Stadia. VP9 is what Youtube uses, and AV1 is used for basically nothing at this point as it is too new.

Video codecs always go: figure out the new one, wait for enough support (1-5%, depending on use case), support old and new, wait for devices that only support the old to die off (the threshold depends on use case again), then stop using the old one. If there's a great new codec that makes streaming gaming immensely better, then you've got to sell new doodads that support it, but you can say "buy a SuperNew Chromecast, it's gonna make your gaming better," or whatever. If there's a great new codec that makes streaming video better, you buy more disk space and add yet another series of encodings.

All of that isn't going to get my telephone company off its butt to get my packets 10 miles away with less than a 20ms round trip, though. And Google clearly isn't going to do it either; Google Fiber deployment has been dead these seven years. Adding at least 20ms of round trip on top of everything else makes cloud gaming DOA for me (add even more time if they don't have a datacenter in my metro area); I'm not even going to try it if I have to buy games in the service that I can't use outside of the service. I'd consider it for Civilization-style games, if it meant the turn times were decent at the end of the game though. :D
 
Nah, they can't actually. Not only is it super hard to develop compression algorithms, but to run on low end hardware you need an ASIC. Decompressing video on a CPU is super time intensive. Like your CPU might not be fast enough to do HEVC decompression in realtime, even if it is new. To do AVC or VP9 decompression at HD rates you need a fairly heavy hitter. Well that is right out, the whole idea is a system needs to be cheap, and ideally small and low power. So how then do these tiny things like Roku decode compressed video? With ASICs, same as your desktop actually. There is a bit of dedicated silicon that knows how to decode the various compression algorithms and is REALLY fast and efficient at it. Uses a tiny bit of power and decodes no problem. Every modern CPU has it baked in...

...but the downside is that it works ONLY for the algorithms it is specifically designed for. It isn't arbitrary silicon that can be used for anything. So if I make a new algorithm, I need it baked into CPUs that make it into devices before end users can use it. That is one of the things that keeps adoption of new algorithms down. Netflix hasn't switched to HEVC or AV1, despite them being better compression than VP9 and stable formats, because not enough devices have decode support for them. If they did, only newer stuff could play it back; older phones/Rokus/etc. would not work.

Thus if a game streaming company developed some amazing codec, they'd have to convince a bunch of manufacturers to include it in their CPU, and wait for those to get to market. I guess you could argue that Google kinda does that, they own VP9 and AV1, but those are not special for Stadia. VP9 is what Youtube uses, and AV1 is used for basically nothing at this point as it is too new.

That's basically what I was saying. You have your hardware encoder in the cloud and your hardware decoder on the device. It doesn't necessarily need to be a specially made ASIC; basically all video cards have them, and they're going to continually improve. That part of cloud computing is easily solvable.

The round trip time is the actual interesting problem to solve IMO and they've done nothing to solve it. Their PR keeps claiming they have some magic to do it yet they have nothing.

Early cloud gaming services claimed they would manage it by putting servers in every physical ISP location. But none have done that either.
 
Yeah, I don't know how it works; probably it has more to do with pre-rendering certain textures or lighting effects, or more likely dynamically turning down graphics settings so that it's faster to process and send the frames. In either event they claim it works some sort of magic to improve latency, but it's likely some sort of graphical cheat.
Games also have an inherent latency built into them. Higher frame rates produce lower input lag. I personally would rather buy a 1080p monitor at 144Hz than a 4K monitor at 60Hz just because of the improved response time. For years console gaming claimed that 30fps was good enough for anyone, and that's clearly a lie. If frame rates drop then input lag increases, and Google isn't using state-of-the-art graphics hardware for their servers. Their CPUs aren't high-clock-speed CPUs either, so again frame rates will drop below 60fps. They could just lower texture quality and turn down graphics, but that means the gameplay experience won't be on par with a high-end gaming PC, which is kind of their selling point.
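The frame-rate part of that is easy to quantify: frame time is just 1000/fps, and it shows up at least once (often twice) between sampling an input and scanning the result out to the display. A quick illustration:

```python
# Frame time puts a floor under input lag: an input can wait up to a frame
# before it is sampled/rendered, and the finished frame takes roughly
# another frame time to reach the screen on a fixed-refresh display.

def frame_time_ms(fps):
    return 1000 / fps

for fps in (30, 60, 144, 240):
    ft = frame_time_ms(fps)
    print(f"{fps:>3} fps: {ft:5.1f} ms/frame (~{2 * ft:5.1f} ms worst-case sample-to-scanout)")
```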
 
One thing they could do with cloud gaming that would at least mitigate some lag is basically set it up the same way consoles do split screen games.

Everyone in the match connects to the same machine, so physics, player input, etc. are all calculated on the same machine, and that machine has an array of GPUs rendering out the visuals individually for each player.

That way you've at least eliminated the traditional server to client lag and only have to deal with the streaming lag.

You could even do some cool stuff like having absurd amounts of items physically colliding, reacting to players, and being part of the actual gameplay, because all players see the exact same physics calculations. That's stuff that is a lot harder to do with traditional client-server architecture.
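A sketch of what that "split-screen in the datacenter" layout could look like: one authoritative simulation per match, with each player's viewpoint rendered on its own GPU and streamed back. Every name here (simulate, render, encode_and_stream, MatchHost) is an illustrative placeholder, not any provider's actual architecture.

```python
from dataclasses import dataclass, field

def simulate(state, inputs, dt):          # placeholder game-logic/physics step
    return {**state, "t": state.get("t", 0.0) + dt, "last_inputs": inputs}

def render(state, camera, gpu):           # placeholder per-player render
    return f"frame(t={state['t']:.2f}, cam={camera}, gpu={gpu})"

def encode_and_stream(player_id, frame):  # placeholder encode + send
    print(f"-> {player_id}: {frame}")

@dataclass
class MatchHost:
    cameras: dict = field(default_factory=dict)    # player_id -> camera
    gpu_slots: dict = field(default_factory=dict)  # player_id -> GPU index
    world: dict = field(default_factory=dict)

    def tick(self, inputs_by_player, dt=1 / 60):
        # One shared physics/game-logic step: every player sees the exact
        # same simulation, since there is no client-side copy to diverge.
        self.world = simulate(self.world, inputs_by_player, dt)
        # Fan out: render each player's viewpoint on its own GPU and stream
        # the encoded video back; only video and inputs cross the internet.
        for pid, cam in self.cameras.items():
            encode_and_stream(pid, render(self.world, cam, gpu=self.gpu_slots[pid]))

host = MatchHost(cameras={"p1": (0, 0), "p2": (5, 3)}, gpu_slots={"p1": 0, "p2": 1})
host.tick({"p1": "forward", "p2": "jump"})
```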

These are the kind of promises cloud providers were making a decade ago.
 