Digital Foundry Analyzes Google's Stadia Platform

AlphaAtlas · [H]ard|Gawd · Staff member · Joined Mar 3, 2018 · Messages: 1,713
Following Google's "Stadia" game streaming service announcement yesterday, Digital Foundry decided to take a closer look at the hardware behind the platform. Google says they use a "Custom 2.7GHz hyper-threaded x86 CPU with AVX2 SIMD and 9.5MB L2+L3 cache," and while they didn't mention the vendor, DF notes that they haven't seen such a configuration in any of AMD's currently shipping server CPUs, and that it should significantly outpace anything found in a modern console. Meanwhile, the GPU largely resembles a Vega 56 card with 16GB of HBM2, and the games are reportedly loaded from an SSD. Through their own testing, DF came away impressed with the platform's consistent frame pacing, and in some cases, total latency is on par with locally-run games on a console or PC.

Check out the analysis here.


Google has also demonstrated scalability on the graphics side, with a demonstration of three of the AMD GPUs running in concert. Its stated aim is to remove as many of the limiting factors impacting game-makers as possible, and with that in mind, the option is there for developers to scale projects across multiple cloud units: "The way that we describe what we are is a new generation because it's purpose-built for the 21st century," says Google's Phil Harrison. "It does not have any of the hallmarks of a legacy system. It is not a discrete device in the cloud. It is an elastic compute in the cloud and that allows developers to use an unprecedented amount of compute in support of their games, both on CPU and GPU, but also particularly around multiplayer."
 
I wonder just how much analytics their games will have. I'm not even going to bother to make an in-game ads joke.
 
Google said latency was 166ms on their PixelBook test with a 200 Mbps connection. How is that "on par" with a game running on my PC that has a total latency of about 4ms? Even running an Xbox One X game on my OLED is only going to be around 20ms. If you want to talk about local streaming, I was only getting 30-40ms of latency on my Surface Pro 4 connected wirelessly to a Steam Link. Even in the worst-case scenario I described, latency is still twice as good (in raw numbers) as their "PC 60 FPS" test case at 79ms.
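For perspective, end-to-end "click-to-photon" latency is a sum of stages, and streaming adds several that local play doesn't have. A back-of-the-envelope sketch (every number here is hypothetical, chosen only to show how the totals diverge):

```python
# Back-of-the-envelope click-to-photon latency budget, all values in ms.
# Every number is hypothetical; the point is the extra stages streaming adds.
local_pc = {
    "input_polling": 1,    # USB polling + OS input handling
    "game_sim": 8,         # one frame of game logic at ~120 fps
    "render_scanout": 8,   # render + scanout of the frame
    "display": 4,          # fast gaming monitor
}

streamed = {
    "input_polling": 1,
    "uplink": 15,          # input travels to the data center
    "game_sim": 16,        # server simulates/renders at 60 fps
    "encode": 8,           # video encode on the server
    "downlink": 15,        # encoded video travels back
    "decode": 8,           # client-side video decode
    "display": 20,         # laptop or TV panel latency
}

print(sum(local_pc.values()))   # → 21
print(sum(streamed.values()))   # → 83
```

With these made-up but plausible stage times, the local total lands near the ~20ms console figure above, and the streamed total near DF's 79ms "PC 60 FPS" case — the gap is structural, not a tuning problem.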
 
This: "Google has also demonstrated scalability on the graphics side, with a demonstration of three of the AMD GPUs running in concert."

What does that mean?
Does it require coding?
Is it already done at the engine level?
Are we talking some kind of Crossfire for CPU/GPU?

Does that mean that Navi could somehow bring Crossfire back in a meaningful way?
 
This is such a huge win... AMD will get loads of cash selling Google hardware. Google will then have an army of programmers improving the AMD open source drivers... Then... The entire enterprise will fail due to internet latency and bandwidth limits... But AMD and we the consumers will get a huge benefit.
 
Google said latency was 166ms on their PixelBook test with a 200 Mbps connection. How is that "on par" with a game running on my PC that has a total latency of about 4ms? Even running an Xbox One X game on my OLED is only going to be around 20ms. If you want to talk about locally streaming I was only getting 30-40ms of latency on my Surface Pro 4 connected wirelessly to a Steam Link. In the worst-case scenario I described latency is still twice as good (in raw numbers) as their test "PC 60 FPS" case at 79ms.

They are actually counting frames on a 240 fps camera between the input and the action appearing on screen, which is different from latency measured inside an application. Even then, I think what DF is saying is misleading, because network latency is going to feel WAY different from the display latency of a local console, even if both average the same amount of time. Network latency isn't constant, which is going to make input feel janky, especially on poor connections. I'd speculate they will try to bring AI to bear to "predict" player input and smooth this out, but who knows how successful that will be.
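The camera method is simple arithmetic: at 240 fps, each frame between the button press and the on-screen reaction represents 1/240 of a second. A quick sketch of the conversion, plus a toy illustration (hypothetical numbers) of how two latency traces can share a mean while feeling completely different:

```python
import statistics

def frames_to_ms(frame_count, camera_fps=240):
    """Convert a high-speed-camera frame count into milliseconds."""
    return frame_count / camera_fps * 1000.0

# 40 frames at 240 fps is roughly the quoted 166 ms figure.
print(round(frames_to_ms(40)))  # → 167

# Hypothetical per-input latency samples (ms): same mean, different jitter.
local  = [20, 21, 20, 19, 20, 21, 20, 19]   # steady local display latency
stream = [12, 35, 18, 28, 10, 33, 16, 8]    # variable network latency

print(statistics.mean(local), statistics.mean(stream))   # → 20 20
print(statistics.stdev(local) < statistics.stdev(stream))  # → True
```

Both traces average 20ms, but the streamed one swings by tens of milliseconds between inputs — which is the jank an "average latency" comparison hides.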


My question is: how the fuck are they going to handle big launches? Imagine trying to play GTA6 on launch day, or something like Apex dropping out of the sky again; there is no way their servers could keep up. To me that was always the glaring issue with streaming games that no one ever seems to want to talk about.
 
My question is: how the fuck are they going to handle big launches? Imagine trying to play GTA6 on launch day, or something like Apex dropping out of the sky again; there is no way their servers could keep up. To me that was always the glaring issue with streaming games that no one ever seems to want to talk about.

There is in fact a way for their servers to keep up: have a shit ton of servers. Anyone seriously going into this will need a provisioning model that accounts for regular play. People mostly play at the same times. The title changes, but the service shouldn't need to care much about that, in the same way your PC doesn't care what game you bought this week.

The simplest way to look at this is to imagine it's a PC. If you hot-seated it until it was busy 24 hours a day, you'd get a lot more gaming for the dollar out of it. Unfortunately, everyone near the PC wants to game at about the same time, so you can't do that.

In theory cloud gaming can do it better, but there's that nasty latency thing. Data center A isn't going to perform well when you try to share it among users at GMT-5, +6, and +11. But Google sells cloud services. During the day these units may do boring web and business work; during the evening they'll be playing games, or something like that. I'm guessing that regionally, Google, MS, and Amazon can over-provision for gamers by letting the gaming load subsidize under-provisioning of something else.
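The sharing argument can be sketched numerically. Assuming hypothetical hourly demand curves for gaming (evening peak) and business workloads (daytime peak) in one region, a pool sized for the combined hourly peak needs far less hardware than provisioning each workload for its own peak:

```python
# Hypothetical hourly demand in one region (thousands of concurrent sessions),
# indexed by local hour 0-23. Gaming peaks in the evening, business midday.
gaming   = [2, 1, 1, 1, 1, 2, 3, 4, 5, 5, 5, 6, 6, 6, 7, 8, 10, 14, 18, 20, 18, 12, 7, 4]
business = [1, 1, 1, 1, 1, 2, 4, 8, 14, 18, 20, 20, 18, 19, 20, 18, 14, 8, 4, 2, 2, 2, 1, 1]

# Provisioning each workload for its own peak:
separate = max(gaming) + max(business)

# One shared pool sized for the combined hourly peak:
shared = max(g + b for g, b in zip(gaming, business))

print(separate, shared)  # → 40 27
```

Because the two peaks barely overlap, the shared pool in this toy example needs about a third less capacity — which is the over-provisioning headroom a general-purpose cloud can offer gamers.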
 
I was only moderately satisfied with local Nvidia GameStream and Steam streaming, and that was wired on my local network. I've tinkered with Project Stream and such, and it was barely passable for a single-player game, but occasionally it just dropped to low resolution and got pixelated. Of course, the latency was very noticeable.

As long as they are shooting for the lowest common denominator here it's fine, but if the business model hurts small developers it's going to be a net negative.
 
Google said latency was 166ms on their PixelBook test with a 200 Mbps connection. How is that "on par" with a game running on my PC that has a total latency of about 4ms? Even running an Xbox One X game on my OLED is only going to be around 20ms. If you want to talk about local streaming, I was only getting 30-40ms of latency on my Surface Pro 4 connected wirelessly to a Steam Link. Even in the worst-case scenario I described, latency is still twice as good (in raw numbers) as their "PC 60 FPS" test case at 79ms.
Yeah, these tests are bought and sold in favor of Google. Linus Tech Tips did something similar where they tested a service at their headquarters and reported minimal latency. Digital Foundry used Google's Internet, whatever that means, but it certainly sounds like an unrealistic best-case scenario.

 
I don't care if they get it working OK, I'll not be subscribing to a gaming service. Especially on my laptop, which is used when I'm traveling. I'll just stick to games that I can install on my machines.
 
This is like buying movies through Comcast: you don't own them. Buying games through a streaming service is the same thing. Years later you won't be able to play those games if the streaming service decides it doesn't have the server power to run old titles.
 
They are actually counting frames on a 240 fps camera between the input and the action appearing on screen, which is different from latency measured inside an application. Even then, I think what DF is saying is misleading, because network latency is going to feel WAY different from the display latency of a local console, even if both average the same amount of time. Network latency isn't constant, which is going to make input feel janky, especially on poor connections. I'd speculate they will try to bring AI to bear to "predict" player input and smooth this out, but who knows how successful that will be.


My question is: how the fuck are they going to handle big launches? Imagine trying to play GTA6 on launch day, or something like Apex dropping out of the sky again; there is no way their servers could keep up. To me that was always the glaring issue with streaming games that no one ever seems to want to talk about.
There is just no way this will ever work satisfactorily. Some low-end gamers who play Angry Birds on cellphones will get sold on it.
 
This is such a huge win... AMD will get loads of cash selling Google hardware. Google will then have an army of programmers improving the AMD open source drivers... Then... The entire enterprise will fail due to internet latency and bandwidth limits... But AMD and we the consumers will get a huge benefit.
I bought some AMD around $18 a while back, but it was my initial purchase, so not a lot of shares. I was hoping to pick up more as it fell to $16ish... got that wrong. :)
I'm still happy to see the nice jump. I'm watching for additional pullbacks. I think AMD might have other good announcements in the near future. (No insider information from me, just speculation.)
 