Best VR/HMD?

Black5Lion

Limp Gawd
Joined
Jan 1, 2013
Messages
325
Hey!
So I know nothing's out yet, but based on what we already know about the Oculus Rift, HTC Vive, Glyph, and others(?):
Which do you think is the best, and why?

Also, I've wanted an HMD since I saw Sony's back in 2012, I believe.
It doesn't necessarily need head tracking, since I plan to use it mainly to replace my PC monitor (yes, replace). So I've been wondering if I should get a Vive, which would also be good for gaming (would it even work as a standalone display?), or a Glyph, which seems very promising with its trick of projecting light directly into your eyes to make the image extra sharp.
So... What do you guys think?
 
Oculus DK2 owner here
I can understand your logic, but once you try a VR headset for a while, I think you'll change your mind about using it as a monitor without any tracking. I haven't really tried that, but I think it would be very uncomfortable and straining. Even though I don't share the enthusiasm of most VR fans about feeling "presence" already with the DK2, I can tell you that some subconscious part of my brain gets fooled by what I am seeing and makes me sick when I move my head and what I see through the headset doesn't move at the same time. It's a terrible feeling for me personally, like trying to move my eyes and realising they (and my head) are bolted in a fixed position.

Other than that, I guess the Oculus Rift consumer version and the Vive will be the ones to get initially, with the Vive having shown a much more robust tracking and controls system so far. (Oculus haven't actually shown any controllers.)

For the visual part, the DK2 is 1920x1080, half of that for one eye and the other half for the other eye. That resolution spread over a FOV of 100 degrees is barely enough to make it playable. You can clearly see the subpixels of the screen and the distance between them, it is called "screen door effect" and makes it seem like you are in some sort of 90's arcade virtual reality at times, but it is still pretty interesting as a feeling and a well designed game can still be fun.

When everything is running fine, it runs at 75 fps and there is ZERO lag, it's better than a monitor in this regard because oculus have used every crazy trick they could think of to reduce it. I think any decent headset will need to be as advanced as that in order to have a chance.
 
For people who feel sick when using VR, isn't that due to the device's low framerate capability?
 
If you are not intending to develop games or applications for VR, I really cannot recommend buying a DK2, and the Vive devkit will likely have similar issues. This is NOT something you can just plug in and use.
For example, for your intended use on a 'headless' PC, you will at a minimum need a 'ghost' display adapter (a device that outputs an EDID to pretend to be a monitor) to act as the target display for desktop capture. And that's the easy part.

Let's say you want to emulate a typical 1920x1080 24" monitor at the normal 70cm from your head. The horizontal field of view taken up by that monitor is a hair under 40°. So, assuming square pixels on the display, and a rotationally symmetrical lens (i.e. not a cylindrical lens), we need 1920 pixels per 40°, or 48 pixels/degree. Call it 50 pixels/degree to make the numbers easier.
Now let's say we want to match the DK1 & DK2 and have a 90° horizontal field of view. That would require a horizontal resolution of 4500 pixels. But that's per eye, so we need a panel with a total horizontal resolution (assuming perfect coverage) of 9000 pixels.

So, we want a 9000x4500 panel for something like a normal desktop monitor at normal viewing distance, assuming a 90° hFoV HMD.
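The arithmetic above can be sketched directly. This is a minimal back-of-the-envelope calculation using the post's own figures (the 40° monitor FOV and the 50 px/degree rounding are the estimates from the text, not measured values):

```python
# Back-of-the-envelope panel-resolution estimate from the post above.
monitor_px = 1920        # horizontal pixels of a 1080p monitor
monitor_hfov_deg = 40    # ~24" 16:9 panel at 70 cm, per the estimate above

ppd = monitor_px / monitor_hfov_deg   # pixels per angular degree: 48.0
ppd_rounded = 50                      # rounded to make the numbers easier

hmd_hfov_deg = 90                        # per-eye FOV matching DK1/DK2
per_eye_px = ppd_rounded * hmd_hfov_deg  # 4500 px per eye
total_px = per_eye_px * 2                # 9000 px across the full panel

print(ppd, per_eye_px, total_px)  # 48.0 4500 9000
```

Same conclusion as above: roughly a 9000x4500 panel to match a desktop monitor's pixel density across a 90° field of view.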

For people who feel sick when using VR, isn't that due to the device's low framerate capability?
There are a whole load of factors that can cause simulator sickness:
Drops in framerate (missed/duplicated frames)
Frame latency
Tracking latency (lack of TimeWarp)
Poor tracking accuracy
Incorrect tracking gain (should always be 1:1)
Incorrect lens-compensation distortion
Incorrect IPD
Vestibular/Vection mismatch (the visual environment is telling you you are moving, but your vestibular system is not detecting acceleration)
And a whole load of other factors.
 
In all honesty, these are just mass-marketing competitions at this point. If you're savvy and own a higher-end Android device (especially OLED), there are numerous holsters you can buy like this and take advantage of a full catalog of Dive games/apps. These even have adjustable pupil/focal distances and a very wide FOV.

You can also stream PC games over wifi and USB, with full head-tracking capabilities. Trinus Gyre is one of the better solutions for this, and works with just about any game using the right settings. With an OLED-based phone (the Rift DK2 uses the exact same panel as the Note 3) and wireless AC, 60fps and excellent quality are a viable option. Sensitivity for each axis is fully customizable, and if your phone has USB3 you can say goodbye to frame drops completely.
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
In all honesty, these are just mass-marketing competitions at this point. If you're savvy and own a higher-end Android device (especially OLED), there are numerous holsters you can buy like this and take advantage of a full catalog of Dive games/apps. These even have adjustable pupil/focal distances and a very wide FOV.

You can also stream PC games over wifi and USB, with full head-tracking capabilities. Trinus Gyre is one of the better solutions for this, and works with just about any game using the right settings. With an OLED-based phone (Rift DK2 uses exact same panel as Note 3) and wireless AC, 60fps and excellent quality is a viable option. Sensitivity for each axis is fully customizable, and if your phone has USB3 say goodbye to frame drops completely.
This is going to give you an UTTERLY AWFUL experience for VR. It is not a viable option for anything other than the "here, put this on your face for 5 minutes" demo scenario Cardboard was intended for.

VR relies on very low tracking and display latencies, and accurate head tracking. An Android phone has a pretty awful IMU (Gear VR uses an external IMU for this very reason), has several compositing steps that result in multiple full frames of latency (Gear VR bypasses all compositing to write directly to the front buffer), and streaming video from a PC to a phone will add a whole pile of latency on top of that. Even Gear VR, which to some extent solves the tracking and display latency issues and implements low-persistence display driving (which no other mobile device will), will be a highly unpleasant experience with streaming video, and has no position tracking.
 
Also DK2 owner here. Would not recommend currently for your scenario!
 
This is going to give you an UTTERLY AWFUL experience for VR. It is not a viable option for anything other than the "here, put this on your face for 5 minutes" demo scenario Cardboard was intended for.

VR relies on very low tracking and display latencies, and accurate head tracking. An Android phone has a pretty awful IMU (Gear VR uses an external IMU for this very reason), has several compositing steps that result in multiple full frames of latency (Gear VR bypasses all compositing to write directly to the front buffer), and streaming video from a PC to a phone will add a whole pile of latency on top of that. Even Gear VR, which to some extent solves the tracking and display latency issues and implements low-persistence display driving (which no other mobile device will), will be a highly unpleasant experience with streaming video, and has no position tracking.

In practice, this latency is much lower than you make it out to be. The Gear VR IMU does add more precise tracking without external software, but not enough to warrant such a large investment. Using FreePIE and similar apps, it's actually quite simple to get positional tracking using the gyro and accelerometer. Direct drawing without triple buffering and low-persistence mode are advantages, and Google is actually working on a version of Android just to take advantage of features like these. I never claimed that this is a perfect or finished solution, but it's both more viable and accessible for far more users than any of the competition.
 
In practice, this latency is much lower than you make it out to be. The Gear VR IMU does add more precise tracking without external software, but not enough to warrant such a large investment.
I suggest you actually try comparing GearVR to another phone (or even the Note 4 for a 1:1 comparison) in one of the Cardboard derivative headsets. The difference is massive and immediately obvious.
Using FreePIE and similar apps, it's actually quite simple to get positional tracking using the gyro and accelerometer.
No, it is not possible to perform positional tracking using just a MEMS IMU. The double-integration error induced drift is far, far too large (on the order of 4-8 meters per second). If you want to do purely inertial positioning, the price of entry is using PIGA gyroscopes and SPIR accelerometers.
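To illustrate why double integration blows up, here is a toy simulation. The numbers are illustrative assumptions (a small, constant accelerometer bias, which is optimistic for a phone-grade MEMS part), not measurements of any specific device:

```python
# Toy illustration of double-integration drift: integrate a small constant
# accelerometer bias twice and watch the position error grow quadratically.
dt = 0.001          # 1 kHz sample rate
bias = 0.05         # m/s^2 residual accelerometer bias (optimistic for MEMS)

velocity = 0.0
position = 0.0
for _ in range(int(1.0 / dt)):     # simulate one second
    velocity += bias * dt          # first integration: accel -> velocity
    position += velocity * dt      # second integration: velocity -> position

print(round(position, 3))  # ~0.025 m of drift after just one second
```

And because the error grows as t², the same bias gives roughly 2.5 m of drift after ten seconds; real sensor noise and scale errors make it worse still.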
Direct drawing without triple buffering and low-persistence mode are advantages, and Google is actually working on a version of Android just to take advantage of features like these.
Because it is necessary for a good experience. If it were not, Google wouldn't be branching Android to do it.
I never claimed that this is a perfect or finished solution, but it's both more viable and accessible for far more users than any of the competition.
It may be accessible, but it's still bad. If you have access to a decent VR experience (DK2, Gear VR, rev.2 of OSVR, HTC Vive, rev 2 of Morpheus) then it may be worth trying vicarious Rube-Goldberg-machine thrills. If you do not have access to decent VR, then it will only result in frustration that 'this VR thing sucks and makes you ill'.

Google Cardboard derived headsets with local rendering are good enough for a 5-minute demo of "hey, VR is cool". Prolonged use is unpleasant, and the experience is entirely reliant on the 'wow' factor and short duration to hide all the flaws (totally wrong distortion shader leading to visual warp on head movement, poor tracking fidelity, tracking latency, lack of positional tracking, pixel persistence blurring, low update rates, inconsistent render times leading to judder, etc).
 
It is possible but not advisable
The resolution is too low atm and you can only use it for 3D
If you just feed it your Windows desktop the image will be split and distorted
And while there's software that can project your 2d windows into 3d space and there are VR operating systems in the making ...right now it isn't really consumer-ready yet
 
It is possible but not advisable
The resolution is too low atm and you can only use it for 3D
If you just feed it your Windows desktop the image will be split and distorted
And while there's software that can project your 2d windows into 3d space and there are VR operating systems in the making ...right now it isn't really consumer-ready yet

Is SGI going to be resurrected to bring back Fusion?

http://en.m.wikipedia.org/wiki/Fsn
 
I suggest you actually try comparing GearVR to another phone (or even the Note 4 for a 1:1 comparison) in one of the Cardboard derivative headsets. The difference is massive and immediately obvious.
No, it is not possible to perform positional tracking using just a MEMS IMU. The double-integration error induced drift is far, far too large (on the order of 4-8 meters per second). If you want to do purely inertial positioning, the price of entry is using PIGA gyroscopes and SPIR accelerometers.
Because it is necessary for a good experience. If it were not, Google wouldn't be branching Android to do it.
It may be accessible, but it's still bad. If you have access to a decent VR experience (DK2, Gear VR, rev.2 of OSVR, HTC Vive, rev 2 of Morpheus) then it may be worth trying vicarious Rube-Goldberg-machine thrills. If you do not have access to decent VR, then it will only result in frustration that 'this VR thing sucks and makes you ill'.

Google Cardboard derived headsets with local rendering are good enough for a 5-minute demo of "hey, VR is cool". Prolonged use is unpleasant, and the experience is entirely reliant on the 'wow' factor and short duration to hide all the flaws (totally wrong distortion shader leading to visual warp on head movement, poor tracking fidelity, tracking latency, lack of positional tracking, pixel persistence blurring, low update rates, inconsistent render times leading to judder, etc).

I'm not sure where you're getting the notion that I don't already own all of the aforementioned devices. Again, it's not necessary to have any additional hardware for inertial tracking, but there's nothing stopping you from using external hardware such as TrackIR for added accuracy. Drift can be compensated for using software tracking solutions. Yet another advantage to such a modular and accessible platform over the walled-sandbox competition.

Your rationale on Google's OS tailoring is also faulty. They are simply capitalizing on an opportunity to insert themselves into the market, by utilizing these features that give others a marketing (and yes, latency as well) advantage.

Here's an example of the former, and another using TrackIR for added accuracy. The latter is a 60fps example of a proper setup, and it's a very good illustration of actual latency here. There is no distortion, very little blur, and a consistent 60fps is easily achievable on both wired and wireless connections. Hardly what I'd consider a terrible experience, and as someone who's played through numerous games with a similar setup, I've yet to get sick. It's amusing to read your hyperbolic "Rube Goldberg" rants, but if those videos bear any resemblance to one, you should consider seeing an optician.
 
Again, it's not necessary to have any additional hardware for inertial tracking, but there's nothing stopping you from using external hardware such as TrackIR for added accuracy. Drift can be compensated for using software tracking solutions.
IMU drift cannot be 'compensated for' in software without an additional non-inertial tracking system. A MEMS IMU is not suitable for position tracking. The additional hardware (be it a tracking camera like the DK2 uses, or Valve's Lighthouse laser scanners, or Sixense's STEM, or a regular 3-point constellation with a webcam and FreeTrack, etc) is not for 'improved accuracy', but a fundamental requirement for position tracking to be viable. Otherwise, you only get orientation tracking (and with yaw drift in the absence of magnetometer compensation, which is itself tricky due to the local geomagnetic field being nowhere near as uniform and horizontally oriented as most people assume).
On top of that, the IMUs built into phones are running some pretty awful filtering software, hobbling their performance to be far below what the chips themselves can potentially do.
Yet another advantage to such a modular and accessible platform over the walled sandbox competition.
Ironic given your recommendation for TrackIR.
Your rationale on Google's OS tailoring is also faulty. They are simply capitalizing on an opportunity to insert themselves into the market, by utilizing these features that give others a marketing (and yes, latency as well) advantage.
They already have Cardboard to 'insert themselves into the market' (and indeed have a whole section of the Play Store for Cardboard). The issue is that Cardboard is not a great experience without significant modification to the way Android works at a low level.
Here's an example of the former, and another using TrackIR for added accuracy. The latter is a 60fps example of a proper setup, and it's a very good illustration of actual latency here.
And the latency is sufficiently bad to be visible in a 30fps video. 20ms head-tracking latency is not a 'nice target to shoot for', it's the absolute minimum acceptable for comfortable use.
There is no distortion, very little blur, and a consistent 60fps is easily achievable on both wired and wireless connections. Hardly what I'd consider a terrible experience,
It's no good being able to stream a solid 60fps if it takes 30ms to get from the framebuffer to the phone, and another 48ms to go through Android compositing (see: Carmack's Connect talk on Gear VR) before anything on-screen actually changes. That's well over the acceptable motion-photons latency. And that's ignoring the time taken to get the orientation from the phone (with the phone IMU's terrible filtering adding even more latency) to the PC in order to get a new frame rendered in the first place.
It's amusing to read your hyperbolic "rube-goldberg" rants, but if those videos draw any analogue to one you should consider seeing an optician.
It is a Rube-Goldberg machine. The motion-photons loop encompasses:
Tell phone to poll IMU
-> IMU reads out buffered orientation value (because its stupid firmware takes the END of an averaged string of values, not an updated value)
-> phone CPU cycles for a bit before getting the orientation data from the IMU (because you have to wait for the rest of the bus to finish responding, because the IMU does not have any priority)
-> phone reports the IMU to the PC over WiFi
-> PC renders frame using badly filtered orientation (makes forward prediction tricky if the orientation is already a time-shifted composite)
-> frame is read out over PCI-E to main memory
-> CPU encodes frame to compress it
-> PC sends encoded frame to phone over WiFi
-> phone decodes frame to buffer
-> buffer is passed to Android compositor
-> Android compositor buffers s'more because it doesn't have a mechanism to directly access the front buffer, or even the back buffer
-> image is read out to display.
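Summing illustrative per-stage figures makes the problem concrete. The 30ms streaming and ~48ms compositing numbers are the ones cited earlier in this thread; the remaining stage figures are assumptions for the sake of the sketch:

```python
# Hypothetical motion-to-photons budget for the streaming chain above.
# Only the WiFi-streaming and Android-compositing figures come from the
# thread; the others are illustrative guesses.
stages_ms = {
    "IMU read-out + filter lag":      5,
    "WiFi: orientation to PC":        5,
    "render frame (60 fps)":         16,
    "encode frame on PC":             5,
    "stream frame to phone (WiFi)":  30,  # figure cited earlier
    "decode frame on phone":          5,
    "Android compositing":           48,  # per Carmack's Connect talk
    "display scan-out":               8,
}

total = sum(stages_ms.values())
print(total)  # 122 ms, versus the ~20 ms ceiling for comfortable VR
```

Even with generous guesses for the smaller stages, the chain lands several times over the 20ms motion-to-photons ceiling discussed above.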

On the PC side, there's GPU companies going as far as exploiting DirectX's latching behaviour in order to shave milliseconds off of post-timewarp-to-display time, because latency is just that important.

The reason I'm going on such a rant about this is because latency is so key to a good VR experience. A horrible hacked orient-render-display path bouncing back and forth between two devices with poor IMUs and compressed frames is just bad.
 
I can't imagine having a headset hanging off of your face for long periods of time would be comfortable in a monitor-replacement scenario. Anyone who actually owns a VR headset have any input on that?
 
I can't imagine having a headset hanging off of your face for long periods of time would be comfortable in a monitor-replacement scenario. Anyone who actually owns a VR headset have any input on that?

I've only been wearing my DK2 for about 6 hours straight so far, but I could probably use it all day.
I think it's light enough already, and the only things somewhat uncomfortable are the headbands/cables.
 
In terms of head tracking, do you think the Vive/SteamVR headset has a leg up on the Oculus DK2?
 
I can't imagine having a headset hanging off of your face for long periods of time would be comfortable in a monitor-replacement scenario. Anyone who actually owns a VR headset have any input on that?

Okay, so I can tell you that the DK2 is not very comfortable for long periods of time. That will improve with future devices.
 