Oculus Quest 2

In a world where latency, pixelation, and motion clarity are chief concerns of VR gaming, a company sees the future of VR in streaming. Startup PlutoSphere wants to use NVIDIA's cloud-based gaming service to bring games to the Quest 2 over the internet. Good luck with that, is all I have to say.

https://www.pcgamer.com/plutosphere-vr-streaming-pc-oculus-quest/

Oculus has rejected streaming apps like that from the Oculus app store because the experience isn't good enough.

But I actually did see a YouTube video of a guy who ran a PC remote desktop app on his phone and tethered the phone to his Quest 2 to play PC VR games on a remote streaming service. He was getting about 100 ms of latency and thought it was actually feasible for casual games. So I guess it's the same thing some people say about playing flat games on cloud streaming services.

But IMO that's not good enough for VR for the vast majority of people; even for casual games the latency would be too nauseating. And the internet infrastructure won't be good enough to do that for at least a decade.
 
Oculus has rejected streaming apps like that from the Oculus app store because the experience isn't good enough.

ShadowPC works just fine, but it does require a good connection. And near lag-free videogame streaming is definitely possible. As much as people give Stadia shit, a friend of mine uses it (Linux gamer, uses it to play games that don't work on Linux but are on Stadia) and constantly praises how well it works and how surprisingly low the latency is.

Anyway, banning videogame streaming apps was definitely a dick move by Oculus. Most people were all "hell yeah, 3rd party apps easily without SideQuest and dev accounts" when they heard that App Lab was coming, but some were expressing worry... and this is exactly the reason why; they were right all along. Honestly, I'm surprised it took this short a time before Facebook showed their true cards. Who are they to claim that the technology is not good enough? Hell, banning it outright makes sure it won't even be given a chance to become better! "It is not good enough now, so let's make sure it will never be good in the future" -- does that sound sensible in any way?

No, I believe Facebook has plans for game streaming of their own. It would allow PCVR games to be played on a small, weak mobile system like the Quest -- how cool is that!? They already lost wireless PCVR gaming to Virtual Desktop (it was simply too popular; imagine the shitstorm it would have caused if they had banned it or kept it SideQuest-exclusive forever), and now they are simply making sure they won't lose the monopoly on videogame streaming to another piece of 3rd party software.
 
Actually, no, the USB-C port on Turing GPUs is worse than using a motherboard's USB port for VirtualLink. Nvidia dropped the port entirely from their latest GPUs.
Okay, cool, thanks for clarifying. I was kind of worried something was amiss in my config.
 
Anyway, banning videogame streaming apps was definitely a dick move by Oculus.

It's kind of a dick move, but Oculus believes it's necessary to stop noobs from having a bad first experience and being turned off from VR completely. And I agree with them. It isn't some stupid conspiracy theory to sell their own cloud service.
 
In a world where latency, pixelation, and motion clarity are chief concerns of VR gaming, a company sees the future of VR in streaming.

How Local Movements Can Become Lagless In Future Streamed VR:

Firstly, let me preface. I prefer playing on my RTX 3080. But, streamed VR has some brilliant innovations:

There are some clever hybrid GPU co-processing tricks (cloud GPU rendering + local GPU reprojection).

Basically, the graphics are streamed, but reprojection is done locally, so that head turns are instantaneous even with laggy streaming.

This eliminates most VR nausea from moving around on a laggy streaming service, since the streamed world laglessly reprojects around you (like a simplified version of the Oculus ASW 2.0 trick used on the original Oculus Rift to perceptually losslessly convert 45fps to 90fps).

Those who have read the Frame Rate Amplification Technology (F.R.A.T.) article on Blur Busters (google it, or click the Research tab if you haven't seen it yet) will recognize the method: generating cheap 1000fps at UE5 detail levels from a midrange GPU by the year 2030. This is a similar kind of parallelism, except parts of it run remotely. It's a bit easier to conceptualize the GPU parallelism behind this if you've ever tried an original Oculus Rift and its ASW 2.0 trick.

Some frame rate amplification technologies of the 2030s conceptualize a theoretical GPU co-processor built into the monitor (to convert 100fps to 1,000fps for future 1000Hz displays), doing Oculus ASW 2.0-like tricks but for PC gaming. Technically, it can be cloud-vs-local co-processing too.

Also, a popular optimization on the original Oculus Rift was to download the third-party Oculus Tray Tool and force the VR game to 45fps permanently, using permanent reprojection (ASW 2.0) to get 90fps on a lower-performance GPU or at higher game detail levels. Some games worked really well with this frame rate amplification, others did not. But if the game is properly optimized, the frame rate of the GPU is decoupled from the frame rate the eyes are seeing -- with no interpolation artifacts, no noticeable extra lag, and no soap opera effect (at least in some games).
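
To make the decoupling concrete, here's a tiny Python sketch (toy numbers, no real SDK calls) of a 45fps render loop feeding a 90Hz display, where every displayed frame is warped with the freshest head pose:

```python
# Minimal sketch (hypothetical numbers/names): simulate how a 45fps render loop can
# feed a 90Hz display when every displayed frame is re-warped ("reprojected") with the
# freshest head pose, so perceived head motion updates at 90Hz even though the GPU
# only renders at 45fps.

RENDER_FPS = 45          # what the GPU can sustain
DISPLAY_HZ = 90          # what the headset refreshes at

def head_yaw(t):
    """Hypothetical head motion: a steady 60 deg/s turn."""
    return 60.0 * t

render_times = [i / RENDER_FPS for i in range(10)]      # when frames were rendered
display_times = [i / DISPLAY_HZ for i in range(20)]     # when frames are shown

for t_disp in display_times:
    # latest frame the GPU has finished by display time
    t_render = max(t for t in render_times if t <= t_disp)
    stale_yaw = head_yaw(t_render)          # pose baked into the rendered frame
    fresh_yaw = head_yaw(t_disp)            # pose sampled just before scanout
    correction = fresh_yaw - stale_yaw      # rotation the reprojection applies
    print(f"display t={t_disp*1000:5.1f} ms  rendered at {t_render*1000:5.1f} ms  "
          f"reprojection corrects {correction:4.2f} deg of yaw")
```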

The same concept can be thought of as GPU co-processing (a high-power GPU in the cloud, a low-power GPU in the headset). And the technology is already in the Quest 2. The Quest 2 GPU can do simple 3dof rotational reprojection during head turns (spherical rotation only, not the 6dof depth-buffer-aware, parallax-correct reprojection of PC Rift ASW 2.0), so streaming services will probably take advantage of that. 3dof is probably good enough for VR streaming services to the Quest 2 -- it's not as nauseating as you'd think, because of this clever GPU co-processing trick.
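
For the curious, here's a minimal sketch (assumed pinhole camera, random stand-in image, not Oculus code) of what rotation-only 3dof reprojection amounts to: rotate every pixel's view ray by the head-rotation delta and resample, no depth required.

```python
# Minimal sketch (assumed pinhole camera, small test image): 3dof ("rotation-only")
# reprojection as the Quest 2 does for head turns. Each pixel is treated as a view
# direction; directions are rotated into the rendered frame's camera space, then
# mapped back to pixel coordinates and resampled. No depth is needed, so translation
# (leaning, strafing) cannot be corrected -- that's what 6dof reprojection adds.
import numpy as np

def yaw_matrix(deg):
    a = np.radians(deg)
    return np.array([[ np.cos(a), 0, np.sin(a)],
                     [ 0,         1, 0        ],
                     [-np.sin(a), 0, np.cos(a)]])

def reproject_3dof(frame, fov_deg, yaw_delta_deg):
    h, w = frame.shape[:2]
    f = (w / 2) / np.tan(np.radians(fov_deg) / 2)          # focal length in pixels
    xs, ys = np.meshgrid(np.arange(w) - w / 2, np.arange(h) - h / 2)
    dirs = np.stack([xs, ys, np.full_like(xs, f, dtype=float)], axis=-1)
    # rotate each output ray into the rendered frame's camera space, then sample there
    rotated = dirs @ yaw_matrix(yaw_delta_deg).T
    u = rotated[..., 0] / rotated[..., 2] * f + w / 2      # back to pixel coords
    v = rotated[..., 1] / rotated[..., 2] * f + h / 2
    out = np.zeros_like(frame)
    ui, vi = np.round(u).astype(int), np.round(v).astype(int)
    ok = (ui >= 0) & (ui < w) & (vi >= 0) & (vi < h) & (rotated[..., 2] > 0)
    out[ok] = frame[vi[ok], ui[ok]]                        # nearest-neighbour resample
    return out

frame = np.random.randint(0, 255, (120, 160, 3), dtype=np.uint8)  # stand-in frame
warped = reproject_3dof(frame, fov_deg=90, yaw_delta_deg=5)       # 5 deg head turn
print(warped.shape)
```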

The Quest 2 can decode a video bitrate of up to 150 megabits per second (essentially perceptually lossless in HEVC), so if the streaming service blasts that much (which needs a gigabit internet connection), remote streaming over ~100ms latency is, for stationary and slow-moving content, in theory indistinguishable from Oculus Link -- with the exception of laggier-looking hand movements (due to the lack of 6dof reprojection), even though head turns are lagless (thanks to local 3dof reprojection).

The latency will still be observed during actions like shooting an arrow or interacting with other players. But I would hope a Quest 3 could do real-time 6dof reprojection with real-time compressed Z-buffer streaming. If 6dof reprojection is done in a way that completely undoes the 100ms latency, then even sudden local movements and sudden hand movements can become a lot more perceptually lagless (with minor artifacts) despite being remotely rendered. The lag would then only show up in things like remote players and hit registration.

The real question is the proper partitioning of the GPU co-processing frame rate amplification architecture (cloud GPU + local GPU co-operating), and the ratio of GPU power needed (how powerful the cloud GPU needs to be versus how powerful the local GPU needs to be) for the frame rate amplification tasks it splits up. Needless to say, research papers on this incredible stuff are probably pouring out of those companies by now.

Metaphorically, imagine the remote system doing the H.264 I-frames, and the local system filling in the P-frames and B-frames. That's a gross simplification of local-remote render partitioning, but the technologies that make it possible to convert 45fps to 90fps laglessly are what will be used for tomorrow's 100fps-to-1000fps conversion (for future 1000Hz monitors without unobtainium GPUs), and coincidentally for de-latencying VR cloud rendering -- as long as there's a modicum of a local GPU (like the Quest 2's) to do co-processing tasks like 3dof or 6dof reprojection to undo the streaming latency. This way, you can use a powerful cloud GPU plus a less powerful local GPU to retroactively 6dof-reproject away the 100ms latency and get UE5-class graphics perceptually laglessly!

The elephant in the room is the remote-GPU-vs-local-GPU power ratio needed for a worthwhile co-processing job -- if in-headset GPUs improve more rapidly than cloud GPUs, this may not work out long term (might as well use the local GPU instead). But today, 3dof reprojection is already automatically undoing 100ms of latency for lagless head turns in streamed VR game tests, since 3dof reprojection is already built into the Quest 2's stream player (that's how 360-degree 24fps videos still head-turn at 90 frames per second, and it's also used for Oculus Link as a simplified 3dof version of ASW 2.0).

Which means for VR game streaming to the Quest 2 today, you have lagless head turns (thanks to 3dof reprojection) but laggy sideways movements (due to the lack of 6dof reprojection). Fixing laggy sideways movements will require 6dof reprojection plus retroactive reprojection to undo the streaming latency. This is theoretically possible for a future Quest 3 because the headset knows its current instantaneous 6dof position and can thus locally reproject a lagged frame to retroactively correct its 6dof position -- as long as the streamed VR game also streams a compressed version of the Z-buffer for essentially parallax-artifact-free local reprojection. (Oculus ASW 2.0 requires the Z-buffer to do 6dof reprojection today on my original tethered Oculus Rift.) Then local movement latency is completely eliminated even if the VR stream is 100-200 milliseconds behind, and with local 6dof reprojection (with compressed Z-buffer streaming) of future VR streams, things like turning your head, crouching, looking under a desk, or leaning to look around a wall become lagless despite 100+ms of streaming lag.
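
And here's a companion sketch (again toy data and hypothetical camera parameters, not the actual ASW 2.0 implementation) of the depth-aware 6dof version -- the step that needs the streamed Z-buffer:

```python
# Minimal sketch (hypothetical camera parameters): 6dof reprojection using a streamed
# depth buffer. Each pixel of the lagged frame is unprojected to a 3D point using its
# depth, the point is transformed by the pose change since the frame was rendered
# (rotation + translation, e.g. leaning sideways), and splatted back into the view for
# the *current* head pose. This is the part that plain 3dof reprojection cannot do.
import numpy as np

def reproject_6dof(rgb, depth, fov_deg, R_delta, t_delta):
    """R_delta/t_delta: pose of the rendered camera expressed in the current camera frame."""
    h, w = depth.shape
    f = (w / 2) / np.tan(np.radians(fov_deg) / 2)
    xs, ys = np.meshgrid(np.arange(w) - w / 2, np.arange(h) - h / 2)
    # Unproject: 3D point in the rendered camera's frame
    pts = np.stack([xs / f * depth, ys / f * depth, depth], axis=-1)
    # Move the points into the current camera's frame
    pts = pts @ R_delta.T + t_delta
    # Project into the current view and forward-splat (nearest pixel wins)
    u = np.round(pts[..., 0] / pts[..., 2] * f + w / 2).astype(int)
    v = np.round(pts[..., 1] / pts[..., 2] * f + h / 2).astype(int)
    out = np.zeros_like(rgb)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h) & (pts[..., 2] > 0)
    out[v[ok], u[ok]] = rgb[ok]   # holes left black where parallax reveals hidden areas
    return out

rgb = np.random.randint(0, 255, (120, 160, 3), dtype=np.uint8)     # stand-in frame
depth = np.full((120, 160), 2.0); depth[:, 80:] = 0.7              # nearer object on the right
lean = np.array([0.05, 0.0, 0.0])                                  # 5 cm sideways lean
corrected = reproject_6dof(rgb, depth, 90, np.eye(3), lean)
print(corrected.shape)
```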

For local-remote GPU co-processing to work, the bitrate increase from the extra data needed (Z-buffer streaming) remains an issue, but it is rapidly falling. Some experiments have already been done with this and the concept is sound (with the right local:remote GPU co-processing ratio and sufficient codec/bandwidth performance), though the Quest 2 isn't yet powerful enough to handle high-quality, highly-compressed streamed Z-buffers for local 6dof reprojection. There are minor reprojection artifacts, but much less than in the Oculus ASW 1.0 days before ASW 2.0. It does require a few extra tens of megabits per second of compressed Z-buffer streaming to get fully lagless local body movements during VR streaming.

New Z-buffer compression approaches may be needed that can piggyback on HEVC or H.264 codecs (simply compress the monochrome depth maps) to keep the Z-buffer streaming bitrate low enough without very glitchy 6dof reprojection. An HEVC extension supports 16 bits per channel (48-bit color), and a 16-bit Z-buffer can simply be represented as a monochrome depth map. For streaming, we don't need to worry about Z-fighting, as the reprojection artifacts are only visible for near-distance objects (where Z-buffer resolution is much higher anyway), so 16 bits is sufficient precision for the "streaming + 6dof reprojection" combo. Either way, this allows compressing a Z-buffer through a commodity video codec, making Z-buffers compact enough for internet streaming and making local 6dof reprojection possible at roughly ASW 2.0 quality, for lagless local movements despite 100ms of VR streaming lag. VR streaming with Z-buffers then uses perhaps only about 1.5 times more bandwidth than without, but you gain lagless local movements from local 6dof reprojection without too many objectionable artifacts.
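
A quick sketch of the precision argument (near/far planes are assumed values): pack depth reciprocally into 16 bits and look at how fine the depth steps are up close versus far away:

```python
# Minimal sketch (hypothetical near/far planes): packing depth into a 16-bit monochrome
# image using the usual reciprocal-depth mapping, so precision is concentrated on nearby
# geometry -- exactly where reprojection parallax errors would be most visible.
import numpy as np

NEAR, FAR = 0.1, 100.0        # metres; assumed clip planes

def encode_depth16(z):
    """Map metric depth to a 16-bit code (more code values spent on near depths)."""
    t = (1.0 / z - 1.0 / FAR) / (1.0 / NEAR - 1.0 / FAR)
    return np.round(t * 65535).astype(np.uint16)

def decode_depth16(q):
    t = q.astype(np.float64) / 65535
    return 1.0 / (t * (1.0 / NEAR - 1.0 / FAR) + 1.0 / FAR)

for z in [0.3, 1.0, 3.0, 10.0, 50.0]:
    q = encode_depth16(np.array(z))
    step = decode_depth16(q) - decode_depth16(q + 1)   # size of one 16-bit code step here
    print(f"depth {z:5.1f} m -> depth resolution ~{abs(float(step))*1000:.2f} mm per code")
```

With these assumed planes, near objects get sub-millimetre steps while distant geometry gets coarse ones, which is why 16 bits is plenty for reprojection purposes.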

This is just a way to stream Z-buffers using existing commodity video compressors that were never designed for Z-buffers. To fix some shortcomings like compression artifacts, a local Z-buffer-optimized deblocking filter (shader-based and/or neural) can remove most Z-buffer compression artifacts -- like foreground/background pixels erroneously popping far/near at parallax edges. The algorithm can use the visible frame buffer as a hint, possibly with a small amount of AI (like how Quest 2 Passthrough mode uses the Snapdragon's neural hardware to stitch the four cameras in real time with no seams). By gluing together a few off-the-shelf technologies, real-time 6dof reprojection could probably be possible in a future Quest 3. I doubt the Quest 2 can do more than 3dof reprojection, but I'm willing to be pleasantly surprised. The neat effect is that local 6dof body-movement latency can remain constantly low regardless of varying internet conditions, with progressively worse latencies simply increasing the amount of reprojection artifacts during fast movements.
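
As a toy illustration of "use the visible frame buffer as a hint" (this is a crude colour-guided median filter made up for the example, not anything Oculus ships), note how the depth edge snaps back to the sharp colour edge:

```python
# Toy sketch: cleaning up a codec-damaged depth map by only pooling depth samples from
# neighbours whose COLOUR matches the centre pixel, so depth edges snap back to the
# sharp colour edges instead of the codec's smeared block edges. A real implementation
# would be shader-based and/or neural, as described above.
import numpy as np

def guided_depth_clean(depth, rgb, radius=2, color_thresh=30.0):
    """Replace each depth sample with the median depth of nearby, similarly-coloured pixels."""
    h, w = depth.shape
    gray = rgb.astype(np.float64).mean(axis=2)
    out = depth.copy()
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            similar = np.abs(gray[y0:y1, x0:x1] - gray[y, x]) < color_thresh
            out[y, x] = np.median(depth[y0:y1, x0:x1][similar])
    return out

# Toy example: a sharp colour edge at column 8, but the depth edge got smeared one
# column to the left by compression.
rgb = np.zeros((16, 16, 3), np.uint8); rgb[:, 8:] = 200
depth = np.full((16, 16), 2.0); depth[:, 7:] = 0.7
cleaned = guided_depth_clean(depth, rgb)
print(depth[8, 5:10].round(2), "->", cleaned[8, 5:10].round(2))
```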

At the end of the day, 100ms VR streaming latencies (for local movements) are eminently correctable via retroactive 6dof reprojection: two streamed eye views (at 90fps each) + two streamed Z-buffers (at 90fps each) + a local 6dof-reprojection GPU to undo the streamed latency for local movements in supported VR games. You'll definitely need at least ~100 Mbps for comfortable VR streaming, and the Z-buffers have to fit in that too...
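
Back-of-the-envelope math on that budget (the bits-per-pixel and overhead numbers are assumptions for illustration, not measurements of any real service):

```python
# Back-of-the-envelope sketch of the bandwidth budget mentioned above. All numbers are
# assumptions (compression ratio, Z-buffer overhead), not measurements.
EYE_W, EYE_H = 1832, 1920          # Quest 2 per-eye panel resolution
FPS = 90
BITS_PER_PIXEL_RGB = 0.10          # assumed: heavily compressed HEVC video
Z_OVERHEAD = 0.5                   # assumed: Z-buffer streams add ~50% (the "1.5x" above)

pixels_per_sec = EYE_W * EYE_H * FPS * 2            # two eye views
rgb_mbps = pixels_per_sec * BITS_PER_PIXEL_RGB / 1e6
total_mbps = rgb_mbps * (1 + Z_OVERHEAD)
print(f"RGB streams alone : ~{rgb_mbps:.0f} Mbps")
print(f"RGB + Z-buffers   : ~{total_mbps:.0f} Mbps")
```

With those assumed numbers, the total lands right around the ~100 Mbps figure above.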

Another optional theoretical enhancement in a future headset is automatically building a reprojection history over a series of frames, to do parallax fill-ins. For example: moving to hide behind a wall, then suddenly leaning to look again. Streaming latency would, in theory, make the view glitch a bit even if your local movements were lagless. But if the reprojector retroactively kept a history of the last few seconds of frames (uncompressed pairs of 2D frames + pairs of Z-buffers) and used DLSS-like tricks, you could eliminate most lagged-reprojection artifacts for most movement use cases, because the local VR headset would remember the last few seconds of 3D and the reprojector would jigsaw-stitch them seamlessly to prevent parallax-reveal glitching from streaming latency. Mind you, the VR games would need some minor modifications to improve the local-remote GPU co-processing experience.

BTW, I am cited in more than 20 different peer-reviewed research papers🔗 including this NVIDIA frame rate amplification related paper🔗 -- so I've got some reputation here and background knowledge of trends in future GPU workload parallelization.

Mind you, I still prefer the original games installed on my PC, but it's amazing how local movement latency can be decoupled from streaming latency, making cloud VR streaming possible without nausea in a much-better-than-Stadia experience. The science of frame rate amplification technologies applies to local:remote GPU co-processing too.

UPDATE: I am also reminded of the early Microsoft pre-VR reprojection experiments called Microsoft Hyperlapse (2014 version)🔗, which essentially photogrammetry-analyzes a video to build 3D scenery on the GPU, allowing a virtual camera path (motion-stabilized) different from the original path. The same "2D-3D-2D" reprojection technique can also be used as a frame rate amplification technology for video, by generating new intermediate camera positions along a camera path (whether the original non-stabilized path or a new Hyperlapse-stabilized path).
 
I'm at 29-35ms with my NanoHDs and Quest 2. I have a UniFi 6 LR on order, so I will see if WiFi 6 makes any difference.
I have a NanoHD and just ordered the Quest 2. Do you notice any issues or have any tips for setting it up?
I'm very curious to see how the Unifi 6 LR works out for you.
 
I have a NanoHD and just ordered the Quest 2. Do you notice any issues or have any tips for setting it up?
I'm very curious to see how the Unifi 6 LR works out for you.
I don’t have any tips. It just worked flawlessly for me. I haven’t played with my quest 2 lately. I’ll make some time and let you know how the UniFi 6 LR works.
 
When I'm connected via Link (wired) to my PC, just using it as a virtual desktop, there is low-volume music playing in the background. When I'm watching a movie or TV show, it's very annoying to hear this in quiet scenes. I haven't found a way to turn it off. There is only one volume control, and it turns off all sound if I use it. I've googled it, but I didn't see any answers for the Quest 2. Is it even possible to turn off this background sound?
 
When I'm connected via Link (wired) to my PC, just using it as a virtual desktop, there is low-volume music playing in the background.

What the... I've never had that kind of problem. There is no music playing in the background during gaming. The Link cable is often connected to my PC for charging even if I use VD for gaming.
 
It's not when gaming. It's when you are in the Link environment, with the dash. There is always quiet background music while I'm there. When I start a game, of course you are in the game, not the Link environment, and that is fine. But when I'm in the Link environment, I use the built-in virtual desktop to view my monitor and start a movie or TV show from a local source. That background music is still playing. It's low, so you can't hear it if any music in the movie or show is playing, but in quiet scenes you can. And there's no way to turn it off.
 
Just bought the Quest 2 with the cable. It looks better than the CV1. I have not tried Onward fully with it yet -- just loaded it on the wife's unit and had her play a bit. I really like it, which is why I got one. I am going to try the cable soon.
 
Hello, I have a scene created in Unreal (v4.25) and I have compiled it for the Oculus Quest 2. My problem is that the text in the scene is pixelated, as if antialiasing isn't working.
If I use SideQuest and set the Default Texture Size to 3072 in the tools, it exits the application, and when I go back in everything looks perfect.
My problem is that when I restart the Oculus, the values return to the defaults.
Is there a way to change the default settings, or is it more a matter of adjusting the APK of the application?
Thanks!!
 
Hello, I have a scene created in Unreal (v4.25) and I have compiled it for the Oculus Quest 2. My problem is that the text in the scene is pixelated, as if antialiasing isn't working.

I think you need to change it in your application; the settings in SideQuest only change the values temporarily, and from what I've heard there isn't a way to make them permanent. It has a pretty big performance impact on most games, and you would not want it on for everything.
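
For what it's worth, SideQuest's texture-size slider is (as far as I know) just setting Oculus debug properties over adb, which is why the value doesn't survive a reboot. Here's a rough sketch of re-applying it from a PC after each restart -- assuming the debug.oculus.textureWidth/textureHeight properties are what SideQuest sets, adb is installed, and the headset is in developer mode:

```python
# Minimal sketch, NOT an official workflow: re-apply the render-texture size after a
# headset reboot from a PC. Assumes (a) SideQuest's slider maps to the
# debug.oculus.textureWidth / debug.oculus.textureHeight properties and (b) `adb` is
# on the PATH with the headset connected and developer mode enabled.
import subprocess

SIZE = 3072  # same value as the SideQuest "Default Texture Size" setting in the post

for prop in ("debug.oculus.textureWidth", "debug.oculus.textureHeight"):
    subprocess.run(["adb", "shell", "setprop", prop, str(SIZE)], check=True)

print(f"Set eye-buffer texture size to {SIZE}x{SIZE} until the next reboot.")
```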
 
How Local Movements Can Become Lagless In Future Streamed VR:
So, to simplify (or maybe this is a different technique altogether): you aren't streaming video, or pixels, but more occasionally the texture and 3D data needed for the local computer to assemble and render for real-time playback? When standing still shaking your head, with no new objects or movement in the environment, nothing new needs to be streamed to the device.
 
Actually, it would be two concurrent video streams -- an RGB video stream and a Z-buffer video stream -- which are reprojected on the local side. Kind of like a reverse (rewind) "interpolation" that's more lagless and less artifacty. The rewind-reprojection undoes the latency of the streaming.
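
To picture it, here's a minimal sketch (field and function names are made up, not any real protocol) of what one streamed frame packet might carry and how the local side would use it just before display:

```python
# Minimal sketch with made-up field names: what each streamed VR frame might carry
# (RGB video + Z-buffer video + the head pose it was rendered for), and how the local
# side would reproject it against the freshest head pose just before display.
from dataclasses import dataclass
import numpy as np

@dataclass
class StreamedFrame:
    rgb: np.ndarray          # decoded colour frame (one eye)
    depth: np.ndarray        # decoded Z-buffer frame (same eye)
    render_pose: np.ndarray  # 4x4 head pose the cloud GPU rendered this frame for
    timestamp: float         # when it was rendered (seconds)

def display_frame(frame: StreamedFrame, current_pose: np.ndarray):
    # Pose delta between "where the head was when the cloud rendered this" and
    # "where the head is right now" -- this is what the rewind-reprojection undoes.
    delta = np.linalg.inv(current_pose) @ frame.render_pose
    R, t = delta[:3, :3], delta[:3, 3]
    # A depth-aware warp (like the reproject_6dof sketch earlier in this thread)
    # would be applied here using R, t and frame.depth.
    print("reproject with rotation:\n", R.round(3), "\ntranslation:", t.round(3))

pose_then = np.eye(4)
pose_now = np.eye(4); pose_now[0, 3] = 0.05           # head leaned 5 cm to the side
pkt = StreamedFrame(np.zeros((120, 160, 3), np.uint8),
                    np.full((120, 160), 2.0), pose_then, 0.0)
display_frame(pkt, pose_now)
```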

However, I hate using the phrase "reverse interpolation," because under the umbrella of Frame Rate Amplification Technologies (things that can increase frame rates) there are multiple methods of doing so, both spatially and temporally, including interpolation, extrapolation, and reprojection, all of which use different algorithms.

The goal is to make it perceptually lossless. Much like how Netflix (or Blu-ray, or E-Cinema) often delivers only one full native frame per second with 23 predicted frames in between (the I-frames, B-frames, and P-frames of the H.264 codec). Twenty years ago, you could see the compression artifacts and pulsing caused by imperfect predicted frames. But predicted frames on Netflix (at 1080p+) have become so good lately, almost Blu-ray quality in some cases, that you can't tell which are full native frames and which are predicted, whenever you watch Disney+, Netflix, Hulu, or whatnot.

However, yes, the boundaries could blur over time. A long-term future video format (e.g. an H.268 or H.269 era codec) may use geometry; theoretical framerateless video file formats may stream 3D geometry data rather than macroblock-based data. This concept has been discussed for years, but there wasn't enough GPU power until semi-recently.

Oculus Asynchronous Space Warp 2.0 aims to do something similar in three dimensions (at a 1:1 frame rate amplification ratio -- one generated frame per rendered frame -- converting 45fps to 90fps perceptually losslessly). Incidentally, this is also how 100fps will be converted to 1,000fps for tomorrow's 1000Hz displays, without unobtainiumly expensive GPUs, in the 2030s.

The boundaries between video codecs and 3D renderers are expected to blur (in the VR world) over the next 10-20 years, to solve lots of problems such as the wireless bandwidth between a PC and 8K cordless headsets. So all this technology isn't only for remote streaming of VR; it's also for streaming between the PC and the VR headset.
 
There was an update for Oculus today, and suddenly I seem to be locked at 72Hz. I'm using the Quest for Assetto Corsa, and after the update it's running at only 72fps. I didn't change anything in Assetto Corsa or the Oculus Link software. I even tried setting the refresh rate to 72Hz and then back to 90Hz, but it didn't change anything. Has anybody else updated and now has a problem getting 90Hz via Link?

Edit: Seems I'm not the only one. Lots of threads about it on Reddit. It was a bug when v27 was in beta, and they still released it as an official update.
 
Yeah, we are stuck with 72Hz since the last update. Sucks, as I really wanted to benchmark 90Hz on some titles with my CPU/RAM upgrade.

I would like to think it's because the 120Hz update is coming soon (probably just dreaming, but the hardware is capable and they have sort of hinted at it).
 
They are rolling out v28, which is supposed to fix the 72Hz bug and some other problems. They started rolling it out yesterday, so everyone should have it over the next couple of weeks.
 
Ordered one of the new xy? routers for the Quest 2. 5.2 or whatever GHz. I hope it works well so I can get rid of the cord.
 
Has anyone gotten the new experimental update yet? If so, how is Air Link? Just checked mine, and no dice.
 
Ordered one of the new xy? routers for the Quest 2. 5.2 or whatever GHz. I hope it works well so I can get rid of the cord.

It's AX, aka WiFi 6 -- still the same 5GHz signal as AC, but faster, with significant improvements to latency.

I haven't gotten the update yet. There could be even further improvements over the VD wireless implementation. I remember reading something about how he couldn't do (or couldn't figure out how to do) certain things that they could do with a native implementation to improve it even more.

I'm pretty excited to try it, even though using VD basically works flawlessly for me with my wifi 6 router. I can't wait for 120hz support.
 
Resident Evil 4 VR Announced Exclusively for the Oculus Quest 2



Not much of a Resident Evil player, but I am hyped for this game! The first proper AAA game for the Quest 2. I mean, as AAA as an underclocked XR2 chipset can manage. :p
 
Resident Evil 4 VR Announced Exclusively for the Oculus Quest 2


So essentially you're going to be looking at ass ugly outdated textures up close? It doesn't look much more visually appealing than the Skyrim VR version.
 
So essentially you're going to be looking at ass ugly outdated textures up close? It doesn't look much more visually appealing than the Skyrim VR version.

Remember that this is on a mobile XR2 chipset with a whopping 6GB of memory shared between graphics and CPU. Temper your expectations; the game looks just fine.

Now, if this turns out to be a timed exclusive and they eventually release it on PCVR or PSVR, then they had better push the graphics a bit.
 
How Local Movements Can Become Lagless In Future Streamed VR:
Hi chief! Doesn't ASW 2.0 work over Link? That's pretty much the reason people are excited for Air Link: Virtual Desktop doesn't do ASW, so although you get smooth head turns, you do get some lag in hand movement, some jitter, and positional stutter. That should be eliminated with Air Link and ASW 2.0.
 
Hi chief! Doesn't ASW 2.0 work over Link?
Virtual Desktop has long supported the Quest 2's built-in 3dof reprojection (smooth head turns despite game framerate drops), while Link will probably have 6dof reprojection (aka "ASW 2.0"). They probably had to include it to get AirLink to satisfactory performance in some use cases.

3dof reprojection = rotational only reprojection
6dof reprojection = positional reprojection (aka ASW 2.0)

3dof reprojection is done locally (on the Quest 2's GPU), but 6dof reprojection will have to be done remotely on the PC end (as the Quest 2 does not have the GPU horsepower for 6dof reprojection). So there won't be any positional latency-compensation benefits, due to the lack of local 6dof reprojection.

That said, even done remotely, 6dof reprojection for wireless operation would be a big boon, compensating for the GPU's extra overhead from video streaming. It's possible that the reprojection tasks will be partitioned: rotational reprojection done locally on the Quest 2, with positional reprojection done remotely on the PC GPU.
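
A rough structural sketch of that split (function names are mine, not Oculus'): the PC does the depth-aware positional pass before video encode, and the headset does a cheap rotation-only touch-up with the freshest tracking sample after decode.

```python
# Rough sketch of the partition described above (names are hypothetical, not an Oculus
# API): the PC GPU does the depth-aware positional part of reprojection before video
# encode, and the headset does a rotation-only pass with the freshest tracking sample
# right before scanout. The warps themselves are placeholders here.
import numpy as np

def remote_stage(rendered_rgb, rendered_depth, render_pose, predicted_pose):
    """Runs on the PC: positional ('ASW'-style) correction toward a predicted display pose."""
    # A real implementation would do a 6dof, depth-aware warp here using rendered_depth
    # and the render_pose -> predicted_pose delta, then feed the result to the encoder.
    corrected = rendered_rgb
    return corrected, predicted_pose

def local_stage(decoded_rgb, encoded_for_pose, latest_imu_rotation):
    """Runs on the headset: rotation-only ('ATW'-style) touch-up after video decode."""
    residual_rotation = latest_imu_rotation @ np.linalg.inv(encoded_for_pose[:3, :3])
    # A real implementation would do a 3dof warp of decoded_rgb by residual_rotation.
    return decoded_rgb, residual_rotation

pose = np.eye(4)
frame, for_pose = remote_stage(np.zeros((120, 160, 3)), np.full((120, 160), 2.0),
                               pose, pose)
_, residual = local_stage(frame, for_pose, np.eye(3))
print("residual rotation applied on-headset:\n", residual)
```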

This is a lot of programming complexity for Oculus’ development team, but if they pull it off, it could work more reliably than Virtual Desktop.

I often used the Oculus Tray Tool to force certain games to 45fps for perpetual 90fps ASW 2.0 reprojection on my original Oculus Rift, to prevent the small stutter during the transitions into and out of ASW mode.

For some games, it is better to have perma-45fps for perma-ASW 2.0 operation than to have the game rapidly modulate between 45fps and 90fps, creating changes in VR feel. This tactic is useful for fluctuating-framerate games when there is less GPU horsepower headroom.

Because of the way reprojection processing is partitioned between the local GPU (rotational reprojection) and the remote GPU (positional reprojection), there won't be stutter-cancellation for hand movements at 45fps (at this time) over AirLink for the Quest 2. At least, that's my 99%-certainty prediction -- there just isn't enough computing power in the current Snapdragon XR2 to frame-rate-amplify everything locally (yet).

6dof reprojection is usually global-environment at the moment, which means the whole room runs at 90fps (despite being rendered by the GPU at 45fps), but hand movements are a different motion than the head's positional motion. So hand movements often still stutter during reprojection, despite the lack of stutter in the game world you're playing (e.g. Half-Life: Alyx at 45fps reprojected to 90fps during ASW 2.0).

Future multi-object 6dof reprojection is more compute-intensive than DLSS or RTX in some ways. The XR2, while technically impressive (GTX 1070-league graphics in Star Wars: Tales from the Galaxy's Edge!), still has no hope in hell of pulling that off, barring miracle programmers more impressive than some of those who dealt with 8-bit machine limitations. This cannot be fixed by AirLink without local multi-object 6dof reprojection (which doesn't exist yet -- perhaps ASW 3.0).

Possibly a Quest 3 in late 2022 / early 2023 could. An Apple VR headset (2022-2023) using an M1X or M2 would probably have enough computing power to do global (non-multi-object) 6dof reprojection locally. Who knows?
 
Virtual Desktop has long supported the Quest 2's built-in 3dof reprojection (smooth head turns despite game framerate drops), while Link will probably have 6dof reprojection (aka "ASW 2.0").
According to this guy on Reddit, when Link is used, both ATW and ASW are done remotely on the PC rather than on the headset. ATW, I believe, is the rotational reprojection that keeps us from getting sick when there is higher latency for head turning.

This would make sense, rather than having the headset do rotational and the PC do positional.

Early reports from lucky users who have been able to upgrade to the latest Quest 2 build are reporting much smoother gameplay with AirLink compared to Virtual Desktop.

https://www.reddit.com/r/oculus/com...&utm_medium=ios_app&utm_name=iossmf&context=3
 
According to this guy on Reddit, when Link is used, both ATW and ASW are done remotely on the PC rather than on the headset.
It does make sense for ATW (rotational reprojection) and ASW (positional reprojection) to be combined into the same 6dof reprojection algorithm (ATW+ASW).

Note: different headsets use reprojection under different brand names than ATW/ASW, so I've settled on generic terminology (3dof reprojection = rotational only, 6dof reprojection = rotational and positional).

If programmed correctly, local ATW (rotationals) with remote ASW (positionals) would probably be even better.

In theory, AirLink has special considerations. Local ATW theoretically has less WiFi lag and needs less WiFi bandwidth than remote ATW, so as long as the latency difference in the temporally-pipelined reprojection is correctly compensated for, it would be superior to fully-remote ATW+ASW (full 6dof reprojection), even if the rotational reprojection happens temporally later than the positional reprojection for the same frame. There's nothing stopping this from being done in a temporally correct way, with the right mathematical compensations.
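
The "mathematical compensation" part is mostly just the residual rotation between the orientation the remote ASW pass already corrected the frame for and the orientation the headset samples right before scanout. A minimal quaternion sketch (numbers made up):

```python
# Minimal sketch (made-up numbers): the residual rotation between the orientation the
# remote ASW pass already corrected the frame for and the freshest orientation the
# headset IMU reports just before scanout. The local ATW pass only has to warp by
# this small residual.
import numpy as np

def quat_mul(a, b):
    w1, x1, y1, z1 = a; w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_conj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def yaw_quat(deg):
    a = np.radians(deg) / 2
    return np.array([np.cos(a), 0.0, np.sin(a), 0.0])   # rotation about the vertical axis

q_remote = yaw_quat(10.0)   # orientation the PC predicted/corrected the frame for
q_now    = yaw_quat(11.5)   # orientation the headset IMU reports right before scanout

q_residual = quat_mul(q_now, quat_conj(q_remote))       # what local ATW must still undo
residual_deg = np.degrees(2 * np.arccos(np.clip(q_residual[0], -1, 1)))
print(f"local ATW warps by a residual of ~{residual_deg:.2f} degrees")
```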

The USB Link doesn't need this as much as AirLink does. AirLink has WiFi bandwidth and WiFi latency to contend with, plus video codec latencies to worry about, and ATW can be done as a post-process on fresh head-tracking data immediately after the Link video decoding step.

I suspect Oculus may have already done this optimization to make AirLink better than Virtual Desktop. Somebody needs to test whether AirLink uses local ATW (rotational) with remote ASW (positional). If they skipped this and stuck completely with remote ATW+ASW and it still performs this well, that's some really good optimization. I'll need to try it out when the firmware hits my headset.

How To Test For Local ATW + Remote ASW
A good possible test: configure your PC with a static IP address (to prevent a DHCP lease loss from interrupting the game), start an AirLink game, then briefly interrupt the network for 1-3 seconds (unplug the Ethernet cable between the PC and the WiFi router) in the middle of the game, and see what happens. If rotationals continue smoothly uninterrupted (ATW keeps running) while the graphics are paused/frozen for a few seconds, then we've answered the question: local ATW FTW (pun... for the win!)

Early reports from lucky users who have been able to upgrade to the latest Quest 2 build are reporting much smoother gameplay with AirLink compared to Virtual Desktop.
I'm not surprised, given the big budgets Oculus has access to for making it as good as possible. I hope Virtual Desktop can specialize enough to stay useful and keep revenue coming for its great author -- a one-person programming team who did a lot.

I hope godin takes my suggestion of improved virtual-typing feel on nonexistent keyboards -- I have some pretty neat algorithms for that (which I told godin about on Discord), but he might be too busy working on other features.
 
It does make sense for ATW (rotational reprojection) and ASW (positional reprojection) to be combined into the same 6dof reprojection algorithm (ATW+ASW).
So apparently the Quest 2 does have the ability to run ASW on the headset while using Link, and presumably Air Link. If you open OculusDebugToolCLI.exe and type "help" at the prompt, it lists a command "asw.HmdAuto (no arguments)" with the description "Switches ASW to be performed on HMD, rather than PC (Link Only)". I verified this myself in the attached image. I guess the headset GPU is more free in the Link environment, since the PC GPU is doing most of the work. This may be something that is on by default in Air Link.
 

Attachments: hmdasw.png (86.8 KB)
So apparently the Quest 2 does have the ability to run ASW on the headset while using Link, and presumably Air Link.
Full 6dof ASW with the complete Z-buffer? (a la ASW 2.0 in PC VR play)

AFAIK, the Z-buffer is not streamed over Link.

It would look like ASW 1.0, which does not need much computing power and could fit within an XR2 GPU budget. So the reprojection looks worse, but has less lag.
 