AMD & NVIDIA GPU VR Performance: Island 359 @ [H]

Maybe, but that's gotta be awfully difficult to test on XB1. It just seems odd that MS would update to code that neither of their consoles currently supports when they're trying to bridge PC and XB.

It should be backward compatible, but to what degree is up in the air right now. The premise of what MS is doing now is different than in the past: they are trying to make their new APIs and shader models extensible, just like their OS. No more get-the-full-package type deals; things are changing to more Vulkan-like additive features. The outcome is anyone's guess right now.

The fact that future HMDs are looking at 120Hz would seem telling, and that works out to roughly 8.3ms frame times. So without some interesting adaptations we won't see much progress on VR hardware, as cards will simply spend all their time running the same content faster.
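Quick back-of-the-envelope check, since the per-frame budget is just 1000ms divided by the refresh rate:

```python
# Frame-time budget at common HMD refresh rates.
for hz in (90, 120, 200):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 90 Hz -> 11.1 ms per frame
# 120 Hz -> 8.3 ms per frame
# 200 Hz -> 5.0 ms per frame
```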

Well, I'm not going to presume things about VR since I haven't looked into it much, but from a human-body perspective, yeah, there are many other things VR could possibly do with higher frame rates and lower latency that haven't been discovered yet. Unlike with traditional monitors, there are many variables that are just unknown at this point.
 
Pretty sure razor was joking when he said 24Hz. A while back [H] linked an article saying that at the center of your vision, in full color, the limit is somewhere around 60Hz or a little higher. Black-and-white sensitivity is higher, and your peripheral vision can sense over 150Hz. All of that is "if I recall correctly", but yeah, refresh-rate perception is not black and white.


Oops, sorry, I miswrote that; I wasn't trying to be sarcastic or anything like that, lol.
 
"Just need a method to differentiate a pixel between white and black now."? That sentence still makes no sense in that context. Black and white pixels by definition are differentiated.
Talking about a single pixel here. Intensity varies to determine color already.

How does your suggestion fix the large gap between the last two frames (the last frame took longer to render because a big explosion occurred on screen or whatever)? The screen will be off for way longer while waiting for the next frame, so the perceived brightness will drop.
By not having a big gap between frames.

How is any of this better than just maintaining a framerate above 90?
Better performance? I'm sure there's some benefit to doing more with less.

Your suggestion still doesn't fix the problem. How would dimming the screen counteract the unpredictable brightness of low persistence VRR?
It's all about duty cycles and the brightness being predictable. That's the biggest key to what I proposed: you can actually predict the intervals. As for the dimming, it's the same effect that makes rapidly alternating black and white look gray, and that's no different from a low-persistence display if you treat the white as the refresh pulse. Your eyes perceive an accumulated effect from the light. Decreasing the refresh dims the image, so you'd counteract it with increased brightness, and the adjustments would be relatively small. It's low persistence, but your eyes still accumulate the effect.
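If it helps, here's the compensation I'm describing in a few lines. A rough sketch treating time-averaged brightness as pulses per second times per-pulse luminance; the helper name is made up and this is not from any display API:

```python
def compensated_luminance(base_luminance, base_hz, new_hz):
    """Keep time-averaged brightness constant when the refresh rate (and so
    the number of light pulses per second) changes, assuming the per-pulse
    width stays fixed. Hypothetical helper, purely illustrative."""
    return base_luminance * (base_hz / new_hz)

# 90 Hz -> 60 Hz means 2/3 as many pulses per second, so each pulse needs
# to be 1.5x brighter to look the same to the eye:
print(compensated_luminance(100.0, 90, 60))  # 150.0
```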

Yet another thought that makes no sense. How is the refresh rate predictable if you can change it?
The same way anything you predetermine is predictable: if you set the refresh to 95Hz, it's reasonable to assume your refresh is 95Hz. Given this technique you really wouldn't need to change it, but there could be some benefits to doing so. It might also make sense to vary the refresh with your displacement in game: if you're moving quickly a faster refresh likely helps, but during relatively idle activity you could free up performance by lowering it.
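Purely hypothetical sketch of that policy; no SDK exposes anything like this, and the thresholds are invented:

```python
def pick_refresh_hz(head_speed_deg_per_s):
    """Hypothetical policy: faster head motion -> higher refresh; near-idle
    -> lower refresh to free up GPU time. Thresholds are made up."""
    if head_speed_deg_per_s > 120:
        return 120
    if head_speed_deg_per_s > 30:
        return 90
    return 60

print(pick_refresh_hz(200))  # 120
print(pick_refresh_hz(5))    # 60
```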

I don't at all understand the rest of this or how it is beneficial to anyone. You want to guarantee a frame in <2-3ms? Meaning what, you're going to be doing ATW 350-500 times per second? How does that help? How does that make it less bad to vary the refresh rate on the displays?
Potentially. You'd only warp once per refresh, but the warp is far cheaper than rendering the scene again. At that point it would be pointless to vary the refresh rate, as that's what the ATW is effectively doing. I'm reasonably sure this is the direction everyone is going based on technology changes. It's simply a better solution than variable refresh if you want high perceived framerates. I wouldn't be surprised if we see 200Hz rates once the cables and displays can manage it. It would take a fairly beefy card to keep redrawing full scenes at that rate, and those redraws are largely wasteful.
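Roughly the loop I'm picturing, as a sketch with invented names rather than any vendor's actual compositor:

```python
# Sketch of "render slow, warp every refresh": the expensive scene render
# finishes at some fraction of the refresh rate, while a cheap reprojection
# of the most recent frame runs once per vsync with a fresh head pose.
def display_loop(renderer, compositor, display):
    latest_frame = None
    while True:
        if renderer.frame_ready():            # may only be true ~45x/sec
            latest_frame = renderer.take_frame()
        pose = display.sample_head_pose()     # sample as late as possible
        if latest_frame is not None:
            display.present(compositor.reproject(latest_frame, pose))
        display.wait_for_vsync()              # ticks at 90 Hz, 120 Hz, ...
```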

I'm not sure what AAA game you're referring to, and I have no idea what you mean in regards to console VR games. PSVR's refresh rate is fixed in a game just like every other headset's. The only difference is that it can run 90Hz or 120Hz depending on the game.
Doom and Fallout. I'm sure consoles have VR games in the works we haven't heard about. As for the 90/120Hz, do you really think they are redrawing scenes that quickly, or possibly using the ATW effect I described to fill in the blanks?
 
I'm not going to respond to everything in your post because I still don't know what it is you actually want.

The current VR systems are 90Hz low-persistence screens.

Vive does not use ATW but does drop to 45fps with reprojection when the game can't maintain 90fps.

Oculus does ATW when the next frame isn't going to make it in time for vsync, and also does a scheduled reprojection (synchronous timewarp) before displaying each frame that did complete on time.
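In other words, each vsync does something like this (a sketch with invented names, not actual Oculus SDK code):

```python
# Each vsync: an on-time frame gets the scheduled ("synchronous")
# reprojection with the latest head pose; a late frame is covered by
# re-warping the previous one (the ATW safety net).
def composite_for_vsync(new_frame_ready, new_frame, prev_frame, pose, warp):
    if new_frame_ready:
        return warp(new_frame, pose), new_frame   # sync timewarp
    return warp(prev_frame, pose), prev_frame     # ATW fallback
```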

Just describe the system that you actually want and how it differs from the available devices.


By not having a big gap between frames.


To not have gaps between frames you are doing one of two things:
1. Sample and hold - This is the opposite of low persistence, which is necessary for VR (given the relatively low-refresh screens that exist now).
2. Fixed refresh with low persistence - This is what the devices are doing currently!

Using VRR in any capacity only means you will be displaying frames at a sub-native rate some of the time. This benefits no one and doesn't actually improve perceived latency. For example, say you have a 120Hz screen in a VR headset. You can either run 120fps vsync and use ATW when the game can't hit 120fps, which still gives you very low latency. Or you can do some crazy VRR thing and only display frames when they are done, then timewarp, and the user will have experienced a longer gap between frames, with the same motion-to-photon latency, than they would have if the headset were just using vsync like a sane person would suggest.
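A toy model of that gap argument, with made-up render times; the point is that the ATW fallback keeps a fixed-refresh panel pulsing every vsync regardless:

```python
# On a 120 Hz low-persistence panel, fixed vsync + ATW pulses every 8.33 ms
# no matter how long frames take (missed frames are covered by a warped
# re-present), while "display when done" makes the pulse spacing track the
# render time directly.
VSYNC_MS = 1000 / 120
render_ms = [7.0, 7.5, 12.0, 7.2, 15.0, 7.1]   # one slow "explosion" frame

fixed_gaps = [VSYNC_MS] * len(render_ms)       # ATW fills any missed vsync
vrr_gaps = render_ms                           # pulse fires when frame is done

print([round(g, 1) for g in fixed_gaps])  # [8.3, 8.3, 8.3, 8.3, 8.3, 8.3]
print([round(g, 1) for g in vrr_gaps])    # [7.0, 7.5, 12.0, 7.2, 15.0, 7.1]
# Those 12 and 15 ms gaps are exactly the unpredictable pulse spacing (and
# brightness flicker) being objected to above.
```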

Potentially. You'd only warp once per refresh, but the warp is far cheaper than rendering the scene again. At that point it would be pointless to vary the refresh rate, as that's what the ATW is effectively doing. I'm reasonably sure this is the direction everyone is going based on technology changes. It's simply a better solution than variable refresh if you want high perceived framerates. I wouldn't be surprised if we see 200Hz rates once the cables and displays can manage it. It would take a fairly beefy card to keep redrawing full scenes at that rate, and those redraws are largely wasteful.

There are visual artifacts to an inconsistent framerate, even with ATW. Objects moving across the screen will judder, and I believe ATW itself has some artifacts, because while rotational timewarping is pretty simple, positional warping requires additional work and is not perfect.

Here is Oculus' position on the matter:

"On Gear VR Innovator Edition, ATW has been a key part of delivering a great experience. Unfortunately, it turns out there are intrinsic limitations and technical challenges that prevent ATW from being a universal solution for judder on PC VR systems with positional tracking like the Rift. Under some conditions, the perceptual effects of timewarp judder in VR can be almost as bad as judder due to skipped frames."

"In our experience, ATW should run at a fixed fraction of the game frame rate. For example, at 90Hz refresh rate, we should either hit 90Hz or fall down to the half-rate of 45Hz with ATW. This will result in image doubling, but the relative positions of the double images on the retina will be stable. Rendering at an intermediate rate, such as 65Hz, will result in a constantly changing number and position of the images on the retina, which is a worse artifact."

The best thing to do in VR is render at the native refresh of the display, with scheduled reprojection (aka synchronous timewarp) after the frame is completed to reduce motion to photon latency.
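The fixed-fraction rule from that Oculus quote is simple to express; here's a minimal illustrative selector (not Oculus SDK code):

```python
def pick_atw_fallback(refresh_hz, achievable_fps):
    """Fall back only to integer fractions of the refresh (90 -> 45 -> 30...),
    never an arbitrary rate like 65, so the doubled images sit at stable
    positions on the retina. Illustrative helper, names invented."""
    divisor = 1
    while refresh_hz / divisor > achievable_fps:
        divisor += 1
    return refresh_hz / divisor

print(pick_atw_fallback(90, 65))  # 45.0, not 65
print(pick_atw_fallback(90, 40))  # 30.0
```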

It is the best method now and will continue to be the best even with higher-refresh displays. There may come a point where refresh rates are so high, say a 500Hz display, that we wouldn't notice the artifacts as long as the game renders above 250fps or whatever the threshold ends up being. In that case, a fixed 500Hz refresh with best-effort FPS on the GPU plus sync timewarp would be good.


Doom and Fallout. I'm sure consoles have VR games in the works we haven't heard about. As for the 90/120Hz, do you really think they are redrawing scenes that quickly, or possibly using the ATW effect I described to fill in the blanks?


Fallout and Doom are being released on SteamVR, which does not have native ATW in any capacity. Doom is a well-optimized game, and most VR-capable systems should be able to run it at over 90fps. Fallout 4 is another matter; I don't know what they are doing with that. If you are saying Bethesda is developing their own ATW implementation, then I would like to see some evidence of that. PSVR games are mostly running at 60fps with synchronous (scheduled) timewarp to match the 120Hz screen. I think there are a couple running at 90fps and maybe one doing 120fps natively. It may have ATW, but like with Oculus, it is intended as a safety net and is not to be relied upon.

Edit: and for the millionth time, ATW is not necessary for a good VR experience. None of my 50+ Vive games/demos has it, and I have had no sickness issues (except in Project CARS, because I spun out a lot) or general complaints about the experience.
 
The only "bad" thing that I can see from all the VR reviews here is that the Fury X is so far behind the 980 Ti. I don't expect the RX 480 to be "premium VR", nor the 1060 6GB, but the Fury X is in a different league and should handle VR better than the 480. One explanation I can find is that the devs code in favor of one video card producer, and that NVIDIA is allocating a lot more resources to VR titles at the moment. Anyway, VR is in its infancy, no need to panic now :))
 
So, I read the review; it's been a while since I commented on one. I enjoyed reading it, and I got a good sense of exactly what is required from graphics cards to make VR a relevant experience at the highest levels possible. One positive thing I think will come out of VR technology this early is the reality that VR requires high-end graphics cards for the most optimal gaming experience, as your data has shown. The reason this is a positive thing is that it will push game developers and companies to really take advantage (or it should, at least) of the best video cards we have available, in order to make their games more immersive and extract full use out of VR as a platform.

I think it'll even out in the end, because even though it's expensive now, further advances in VR technology will make it more accessible and affordable for the average user as time passes.

I also honestly found the review informative in regards to why developers target specific frame rates, 45 and 90, to keep users from getting motion sickness; I didn't know that before. I used a friend's Oculus Rift setup and played the demo that came with the Rift, and that has been my only experience with VR so far, a limited one. I just find myself getting more and more excited as I see this technology improve, and game releases are starting to come out that will make this technology take off.

It's an exciting time to be alive.
 
Played this tonight.... wow. From chopper take off to the watch towers... VR immersion is top notch. I had a blast!
 
It’s no replacement for full-blown native 90Hz output, Oculus CEO Brendan Iribe says, but Asynchronous Spacewarp is effective enough that Oculus feels comfortable dropping the minimum hardware requirements for the Rift from an Nvidia GTX 970 down to an Nvidia GTX 960. The Rift’s CPU requirements are also being rejiggered, all the way down to an AMD FX-4350 or a dual-core Core i3-6100 processor. Previously, you needed a quad-core Core i5 or AMD FX processor that cost much, much more.

Link

I'll just leave this here. Oculus's CEO probably has more experience with VR than I do, and plenty of other tech sites are writing about it. Minimums drop from a 970 to a 960, and that means $500 VR-capable systems. I wouldn't be surprised if it helped higher-end cards too; we just need HMDs with higher refresh rates now. The drawbacks would likely lessen at higher framerates. The tech should get even better with the low-level APIs and a native ability to reconstruct the scene.
 
I'll just leave this here. Oculus's CEO probably has more experience with VR than I do, and plenty of other tech sites are writing about it. Minimums drop from a 970 to a 960, and that means $500 VR-capable systems. I wouldn't be surprised if it helped higher-end cards too; we just need HMDs with higher refresh rates now. The drawbacks would likely lessen at higher framerates. The tech should get even better with the low-level APIs and a native ability to reconstruct the scene.

Basically all they did is make it so 45 FPS with always-on reprojection is "acceptable" now. Maybe it is if they are using a different technique than the Vive, but I wouldn't want to always be in reprojection on the Vive.
 
Basically all they did is make it so 45 FPS with always-on reprojection is "acceptable" now. Maybe it is if they are using a different technique than the Vive, but I wouldn't want to always be in reprojection on the Vive.
Every frame is effectively a reprojection in the sense that there is always some input lag; it's simply the most accurate rendering you can get short of an infinitely fast GPU. It also more closely mimics how human vision works and why motion blur exists. I wouldn't be surprised if it's standard on next-gen HMDs with higher refresh rates. Adjusting the refresh on the HMDs, even if not variable a la FreeSync/G-Sync, probably offers some benefit as well. The only reason it's a straight 45->90 split is that DX11 is inherently serial. With low-level APIs a developer could use far more accurate reprojection techniques. Doom already does this with Vulkan, and they've been giving tips to the Fallout team, so I'd fully expect it from the big upcoming AAA games.
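For a sense of what "more accurate reprojection" could mean, here's a crude motion-vector frame-extrapolation sketch in the ASW spirit. It is not Oculus' actual algorithm; real implementations also handle disocclusion and run on the GPU:

```python
import numpy as np

def extrapolate_frame(prev_frame, motion_vectors, fraction=0.5):
    """Synthesize an in-between frame by shifting pixels along per-pixel
    motion vectors (in pixels per frame, assumed supplied by the renderer).
    Nearest-neighbor backward sampling; disocclusions are ignored here."""
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip((ys - motion_vectors[..., 1] * fraction).astype(int), 0, h - 1)
    src_x = np.clip((xs - motion_vectors[..., 0] * fraction).astype(int), 0, w - 1)
    return prev_frame[src_y, src_x]
```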
 