Providing a game streaming platform is a multifaceted problem. Google is in a far better position than most: they already have the data center infrastructure in place, with loads of hardware and software expertise on their payroll.
The hardware side I see as essentially a solved problem. The encoders found on modern GPUs can read the frame buffer directly and support various low-latency encoding schemes (e.g. Motion JPEG 2000) if enough bandwidth is available. It is possible today to transmit 4K60 over Ethernet on a local LAN with less than a frame of latency. If you want any sort of quality at that resolution and refresh rate, 10 Gbit of bandwidth is a must; a 1 Gbit link does introduce compression artifacts, but the latency is roughly the same. Of course, few have that bandwidth at home, so some commodity streaming codec will need to be used. Hardware encoders and decoders for H.264/H.265 exist, but they tend to add some latency by the nature of the codec algorithms; still, around a frame is achievable. Also on the hardware end, recent GPU advances can save some latency by permitting the GPU to write directly to another card in the same system, in particular writing the result of the compression algorithm directly into the output packet buffer of a NIC. Again, this is about saving as much latency as possible, regardless of how small the gains.
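The bandwidth claims above are easy to check with back-of-the-envelope math; here is a quick sketch, assuming 24-bit RGB frames and ignoring protocol overhead:

```python
# Rough bandwidth math for the 4K60 claims above.
# Assumption: 24 bits per pixel (RGB), no protocol overhead counted.

def raw_bitrate_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bitrate in Gbit/s."""
    return width * height * bits_per_pixel * fps / 1e9

uncompressed = raw_bitrate_gbps(3840, 2160, 60)  # 4K at 60 Hz
print(f"4K60 uncompressed: {uncompressed:.1f} Gbit/s")  # ~11.9 Gbit/s

# A light intra-frame codec only needs ~1.2:1 to fit in 10 Gbit,
# which is why quality barely suffers there. Squeezing into 1 Gbit
# needs ~12:1, hence the visible compression artifacts.
print(f"Ratio needed for 10 Gbit: {uncompressed / 10:.1f}:1")
print(f"Ratio needed for  1 Gbit: {uncompressed / 1:.1f}:1")
```

So raw 4K60 slightly exceeds even a 10 Gbit link, which is why some compression is unavoidable at any consumer bandwidth.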
The remaining piece of hardware improvement is on the networking side: leveraging deterministic Ethernet. This guarantees bandwidth and provides feedback between the endpoints about when to expect data to arrive. Google's own data center switches likely support these features already, and they are starting to be deployed at backbones and exchanges. Once home users upgrade their own equipment, end-to-end support here can greatly assist various predictive and pre-emptive algorithms for gaming. With bandwidth guaranteed and latency bounded within a known min/max, overall quality will improve. This can't perform miracles, but it does make services like this more feasible. Of course, extending this beyond Google's own data centers is going to be a challenge, as ISPs are notorious for not improving their own infrastructure.
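To illustrate why bounded min/max latency helps, here is a hypothetical sketch of how a client might size its de-jitter buffer; the latency figures are invented for illustration, not measurements of any real network:

```python
def dejitter_buffer_ms(min_latency_ms, max_latency_ms, safety_ms=1.0):
    """Extra buffering a receiver needs to absorb network jitter.

    If the network guarantees delivery within [min, max] latency,
    jitter is bounded by (max - min) and the buffer can be that small.
    """
    return (max_latency_ms - min_latency_ms) + safety_ms

# Best-effort internet: latency could land anywhere from 20 to 80 ms.
print(dejitter_buffer_ms(20, 80))  # 61.0 ms of added buffering

# Deterministic path: latency guaranteed between 25 and 30 ms.
print(dejitter_buffer_ms(25, 30))  # 6.0 ms of added buffering
```

The point is that a tight latency bound shrinks the buffer the receiver must hold, and that buffering is pure added latency from the player's perspective.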
One other facet I see Google pushing is local receiver tricks, mainly variable refresh rate. A VRR display can update as soon as a frame is received, cutting a fraction of the latency out of the pipeline. Every little bit here helps. Some processing could also be done locally, in particular audio and overlays.
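The VRR saving can be estimated with a small simulation: on a fixed 60 Hz display, a frame arriving at a random point in the refresh cycle waits for the next vsync, on average about half a refresh interval, while a VRR panel can scan it out immediately.

```python
import random

REFRESH_HZ = 60
PERIOD_MS = 1000 / REFRESH_HZ  # ~16.67 ms per refresh

def fixed_vsync_wait(arrival_ms):
    """Time a frame waits for the next vsync on a fixed-refresh display."""
    return (PERIOD_MS - arrival_ms % PERIOD_MS) % PERIOD_MS

# Simulate frames arriving at random times relative to the scanout clock.
random.seed(0)
waits = [fixed_vsync_wait(random.uniform(0, 1000)) for _ in range(100_000)]
avg = sum(waits) / len(waits)
print(f"Average vsync wait: {avg:.1f} ms")  # ~8.3 ms; VRR cuts this to ~0
```

Roughly 8 ms of average latency recovered for free on the receiver side, which is a meaningful slice of a streaming pipeline's budget.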
Google just needs to capitalize on what they are already in a position to do.