VR GPU power needed going forward

Found an interesting visual comparison of the graphics quality we can look forward to and the GPU rendering power that's going to be required for the Pimax 8K X. Not my work (it's from the Pimax forums), but interesting nonetheless. Note that they're taking into account the supersampling rendering power needed for the really good stuff:

[Image: chart comparing VR graphics quality levels with the GPU rendering power required to drive them]
 
This seems like the only practical application going forward for SLI/Crossfire. Single GPU solutions are simply not going to have the horsepower to push into higher resolution territory any time soon.
 
Agreed. Even the current top-end card, the Titan Xp, is only capable of a max resolution of 7680 x 4320 @ 60 Hz, which, while technically 8K, is a good 30 Hz shy of the needed VR refresh rate, and at that resolution gaming would pretty much turn into a stuttering slide show. Single cards capable of such feats are still a ways off. And then there's the minor problem that DisplayPort only supports a max of 60 Hz at that resolution as well. So 8K per eye is definitely off the table for the foreseeable future until the standards improve and the hardware has a chance to evolve and catch up. 4K per eye @ 90 Hz is very doable via DP, though, but we'll still need next-gen hardware (Volta) to get there without melting down a GPU (it'll be doable with two 1080 Tis as well, but only barely). Next-gen VR @ 4K per eye is going to need two very high-end cards, one driving each eye's display... not SLI. The only thing that may ease that requirement up some is if eye tracking and foveated rendering are able to enter the fray.
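To put rough numbers on the bandwidth side of that, here's a quick back-of-envelope check. This is a sketch only: it assumes 24 bpp and roughly 20% blanking overhead (both approximations) and ignores Display Stream Compression entirely.

```python
# Rough uncompressed video bandwidth vs. what a DisplayPort 1.4 link can carry.
# Assumptions: 24 bits per pixel, ~20% blanking overhead, no DSC.

DP14_PAYLOAD_GBPS = 25.92  # DP 1.4: 4 lanes x 8.1 Gbps HBR3, minus 8b/10b coding overhead

def required_gbps(width, height, hz, bpp=24, blanking=1.2):
    """Approximate uncompressed link bandwidth in Gbit/s."""
    return width * height * hz * bpp * blanking / 1e9

# 8K (7680 x 4320) @ 90 Hz -- far beyond a single DP 1.4 link
print(required_gbps(7680, 4320, 90))   # ~86 Gbps

# 4K (3840 x 2160) per eye @ 90 Hz -- one DP 1.4 link per eye fits
print(required_gbps(3840, 2160, 90))   # ~21.5 Gbps
```

Which is why 8K per eye has to wait for the standards to catch up, while 4K per eye @ 90 Hz squeaks under a single DP 1.4 link per display.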
 
I don't think an established brand will ship a headset with greater than 4K per eye without eye tracking. It's such a waste of power to render the full field of view at full res when you're only focused on such a tiny portion of it.

Fixed foveated rendering, checkerboarding, or sub-native rendering will also be pretty useful for alleviating power demands, because you don't need to feed a full-res signal to a high-res headset to get a lot of the benefit of the better screens (see the Pimax "8K" non-X).
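For a sense of how much fixed foveated rendering can save, here's an illustrative sketch. The region split and peripheral scale below are made-up example values, not any vendor's actual numbers.

```python
# Back-of-envelope pixel savings from fixed foveated rendering: render a
# central region at full resolution and the periphery at reduced density.
# The 25% fovea area and half-res periphery are illustrative assumptions.

def foveated_pixel_fraction(fovea_frac=0.25, periphery_scale=0.5):
    """Fraction of full-res pixel work, given the fovea covers `fovea_frac`
    of the frame area and the periphery is rendered at `periphery_scale`
    resolution per axis (so scale^2 pixel density)."""
    periphery_frac = 1.0 - fovea_frac
    return fovea_frac * 1.0 + periphery_frac * periphery_scale ** 2

# Fovea = 25% of the frame at full res, rest at half res per axis:
print(foveated_pixel_fraction())   # 0.4375 -> ~56% fewer pixels shaded
```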

The drawback of SLI/Crossfire for VR is that you lose the significant time/power savings of single pass stereo. Plus, it will never be universally supported because it will always be an expensive niche setup.
 
Right, not true SLI, but as Z06 mentioned, one card per display. Right now having two video cards is a niche market, and so is VR, but that is changing rapidly I think.

But you're both right, stopgap solutions that cheat the rendering should stretch it. I'm just skeptical about the effectiveness of eye tracking.
 
Really good eye tracking tech is around, it's just the problem of getting it into a consumer device at a reasonable cost and then orchestrating all of the rendering pipeline components to make use of it. Much like any new tech, it won't be cheap to start, but we'll look back and laugh at it in 10 years when it costs something like $29 for the embedded eye tracking module.
 
In 10 years we'll just be plugging a cable into the base of our skulls.... I can't wait for the Matrix.
But then you will have to clean that hole in your head, what if others use your stuff, and they have dirty holes?
 
Based on my Rift supersampling tests, I'm fairly confident my single Titan Xp will handle current games just fine at 2560x1440 x2 @ 90 Hz on my Pimax 8K, and that a single Volta will manage 3840x2160 x2 @ 90 Hz on the 8K X that will replace my 8K.
 
Volta would have to be 2x the power of your Titan to run 8K (if it's even true 8K).
Pimax is making great claims, but I'll wait for the dust to settle and see how good it really is...
 
Actually, slightly over twice as powerful, but I'm taking into account that my Titan Xp won't even break a sweat powering the regular "1440p" version of the 8K.
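The "slightly over twice" figure falls straight out of the pixel counts, comparing the 8K's 2560x1440-per-eye input signal to the 8K X's native 3840x2160 per eye at the same refresh rate:

```python
# Pixel-throughput comparison: upscaled 8K input vs. native 8K X input,
# both two eyes @ 90 Hz.

def pixel_rate(w, h, hz, eyes=2):
    return w * h * hz * eyes

ratio = pixel_rate(3840, 2160, 90) / pixel_rate(2560, 1440, 90)
print(ratio)   # 2.25 -> the native signal is 2.25x the pixel throughput
```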
 
Yeah, just sucks waiting for VR 2.0.....
 
It's going to be very exciting. I think around VR version 3.0 with true 8K resolution and advanced graphics VR will be a very convincing world.

I think VR timeline:

VR 1.5: Pimax 8K X - 2018/2019
VR 2.0: Next gen Rift/Vive - 2020
VR 3.0: 2024
 
Wow, I would think HTC and Oculus will have something new next year, not 2020. I also believe higher-resolution Windows MR headsets will keep coming out as time goes on (LG, Nvidia, Asus, Gigabyte, MSI, etc.); they all have the capability to push the envelope.
 
That was back in March, so that'd put a new Oculus at spring 2019 or so. It shouldn't be 2020; I'd expect the Christmas shopping season of 2019 at the very latest, spring 2019 at the earliest.

I agree, Gen 2 VR (i.e. true 4K per eye, not upscaled or upsampled) from the bigger players will hit sometime in 2019. I'd be very surprised if it came out much earlier than summer 2019. And it'll require top-tier graphics to power it effectively (e.g. Volta).
 
Competition may actually be good for HTC and Oculus. Given that the bulk of potential VR users don't have hardware capable of supporting current-gen VR, holding off their second-gen until the competition burns their dollars moving the market could be a good move. Releasing a gen 2 in the near future would likely be viewed as premature; it would either be cost-prohibitive to include all of the current available features (wireless, new controllers, better room-scale, higher res headset) or feel like a stripped down product. Neither situation would be ideal.
 
Well, with HTC's overpriced attachments, the slow uptake probably caused by them, and the highest-priced headset on the market, I just see HTC losing sales. Since the Vive is designed for modifications/upgrades, it might be prudent for HTC to offer more options, including better lenses/panels etc. $100 so your headset fits better and $100 for a sensor is stupid. While not next gen, it could keep giving them an edge until Gen 2 comes out, if the pricing weren't ridiculous.
 
It's pretty close, since even a 1060/1060+ should run current-gen VR pretty well. I've had zero problems with my 980 Ti. Next gen with full 4K per eye will be the real test.
 
I would expect VR development to be slow. Sales were far below initial predictions and it's still a very niche market. While the display industry is always several steps ahead of the graphics industry, it's not reasonable to expect to immediately have high performance graphics options available to drive the high end displays.
 
I would put the Pimax 8K/8K X at VR 2.0 because they're increasing not only the pixel density, but the FoV as well. And by a good margin on both. I can't imagine that anyone is going to go beyond the specs of the Pimax 8K / 8K X for a couple of years simply due to the GPU processing requirements. I think Pimax hit the nail on the head by offering both the upscaled 8K and the native 8K X. Nobody is going to have a video card to drive anything beyond those two anytime soon.

Although I do confess to being a bit mystified as to why the 8K and 8K X are completely separate models. It seems to me that it would have been just as easy to make the 8K with a signal pass-thru to allow users to choose between the native 3840 x 2160 signal or the upscaled 2560 x 1440. Then they would only need to make one top tier unit and it would be ready for not only today's GPUs, but those to come in the next year or two. After all, isn't upscaling the signal more difficult than simply displaying the native signal resolution? Or maybe I'm missing something here. I'm no electrical engineer.
 
So far everything I've read says it's the greatest VR yet, but I have yet to see anything substantial.
 
The 8K X will have two DisplayPort 1.4 TCONs and cables. You need an absolute top-tier GPU for that. A lot of wasted cost (overall headset price increase) sunk into something not many people would use. Not to mention delaying the headset until summer 2018.
 
You mean GPUs. If it's actual 4K @ 90, it will probably need more than one. It will be interesting to see what they do. Maybe have it auto-detect what you have and adjust the display accordingly?
I would buy two cards, no problem.
 
Give me true working full 4K resolution @ 90 fps per eye, along with functional/working drivers and an increased FOV over the Rift/Vive, and I would sooooo use this and wouldn't hesitate ponying up for whatever GPU hardware it would need. Yes, it's niche compared to the rest of the market, but I bet there are quite a few of us "niche" people now who have had a good taste of what VR offers... I would go there in a heartbeat. But I'm thinking Pimax won't be able to fully pull this off. My money will be on the Oculus or Samsung horse.
 
Eye tracking and variable resolution rendering should do the trick with the higher resolution panels for next generation cards and top cards now. I would expect that to be in the 2nd gen headsets.
 
I guess we'll have to wait for more info or a teardown after the HMDs are released to see just how they're constructed. I would think the 8K should actually cost more than the 8K X because it contains an upscaler; the 8K X is just displaying the native signal, albeit through two cables instead of one. Maybe that's trickier to do. I would have thought it simpler (logistically speaking) to combine the two headsets and release just one 8K/X model: plug a single DP cable into Port 1 and it upscales the signal just like the 8K; plug a second DP cable into Port 2 and the headset goes into 8K X mode, bypassing the upscaler. Or even just put a physical switch on the side of the headset.

This would allow Pimax to build and stock only one model instead of two. And it would give consumers more value as well. Put the headset in 8K mode for gaming. Then switch to 8K X mode for productivity, surfing, movie watching, etc where you don't need massive GPU rendering power but want maximum clarity. In the future when GPUs are more powerful/cheaper the headset could be left in 8K X mode.

Obviously they're not going to be doing any of this, but it should be doable. The pieces are all there.
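The mode selection itself could be as simple as the sketch below. Purely hypothetical, of course; this is not an actual Pimax design, just the cable-detection logic described above.

```python
# Hypothetical mode selection for the proposed combined 8K/8K X unit.
# Illustrative only -- not a real Pimax product behavior.

def select_mode(port1_connected: bool, port2_connected: bool) -> str:
    if port1_connected and port2_connected:
        return "8K X"      # both cables: bypass the upscaler, native signal per eye
    if port1_connected:
        return "8K"        # single cable: upscale the 2560x1440 input to the panels
    return "no signal"

print(select_mode(True, False))   # 8K
print(select_mode(True, True))    # 8K X
```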
 
Scalers aren't that expensive. I wouldn't be surprised if the PCB for the 8K X is totally different from the 8K's.
 
Why not just use the scaler on the video card? Why add extra cost, take up space, and maybe add a smidgen of weight unnecessarily? For those 8K headsets, the user can decide what resolution will work, and then it's scaled by the video card. Eye tracking (fast enough) and smart rendering will dramatically improve performance without decreasing quality.
 
That doesn't make any sense. You need the scaler on the headset to accept the low-resolution signal and upscale it. The pipe size (cable) is the limitation.
 
Oh, I'm sure it will be. In fact, that was kinda my point. If Pimax could have combined the scaler of the 8K with the dual input of the 8K X, they could have offered a single product instead of two. The PCB inside the HMD is likely going to be the only difference between the two models. Everything else, including the displays, will be identical between them.
 
Remember that the 48 Gbps speed only holds over 6-9 feet of cable. Not very useful for a VR headset.
 
True, but that's just for passive cables. The spec allows for active and converter cables as well, so I'd expect them to reach a bit further... perhaps 15-20 feet. (There's always the option of converters and optical fiber as well, but that gets stupid expensive fast.) One nice plus with the new HDMI 2.1 standard is that the much fatter pipe now easily supports 4K @ 90 fps, which it should be able to handle out to at least 5 meters or so, since it won't be using the full 48 Gbps pipe to deliver this capability... the 6-9 feet limit would apply more to 8K @ 120 Hz with Display Stream Compression (DSC).

The whole reason Pimax was considering DP over HDMI was the inherent bandwidth limitations of HDMI... but now it's DP that really begins to suffer signal degradation after approx. 9 feet. So it looks like Pimax may be able to go back to considering HDMI for the 8K X after all... and GPUs supporting HDMI 2.1 should be arriving by the time they have a serious product offering ready to ship... 8-12 months from now, I would expect.
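Quick sanity check on that HDMI 2.1 headroom claim. Same caveats as before: 24 bpp and ~20% blanking are approximations; the 16b/18b figure is FRL's coding efficiency as I understand the spec.

```python
# Uncompressed 4K @ 90 Hz vs. the HDMI 2.1 FRL link.
# Assumptions: 24 bpp, ~20% blanking overhead, 16b/18b FRL coding.

HDMI21_RAW_GBPS = 48.0
HDMI21_PAYLOAD_GBPS = HDMI21_RAW_GBPS * 16 / 18   # ~42.7 Gbps after coding overhead

signal_gbps = 3840 * 2160 * 90 * 24 * 1.2 / 1e9   # ~21.5 Gbps per eye
print(signal_gbps, signal_gbps < HDMI21_PAYLOAD_GBPS)
```

So 4K @ 90 Hz uses only about half of the usable link, which is what makes the longer cable runs plausible.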
 
Yes, of course passive cables, which make up almost 100% of cables in use today with regard to PCs and A/V. Active cables typically add significant cost and fragility. Once you get past 2-3 meters, the bandwidth capability drops quite fast.
 
Don't disagree; it's just not as dire or doom-and-gloom as you're painting it, is all. If you're in the market for Gen 2 VR, when it finally does arrive, the cost of an active cable or some form of booster/converter isn't necessarily going to be a show stopper. The newer HDMI 2.1 spec definitely helps with the added capability needed for next-gen VR.
 