Watching Movies on the Oculus Rift

HardOCP News

It looks like the Oculus Rift is still a work in progress when it comes to watching movies on the device.

Some of these apps simulate the theater experience, complete with rows of seating and a giant screen on the wall in front of you—not to mention the use of the tilt sensors to emulate your head's movement. But for me, all I wanted was for the movie to be floating in the blackness in front of me—with no fancy tilt controls or anything else.
 
It's a dev kit. It isn't the consumer version. So, hurray for getting a dev kit and not using it for development, I guess. I've got a dev kit coming in soon, and I hope to do some actual development for it. But I'd estimate that 50% of dev kits have gone to people who do crappy Let's Plays on YouTube, without understanding that the early dev kits are low resolution and are missing features like position tracking. It's cool technology, and I can't fault anyone for wanting a cheap VR experience. But don't make any judgements until the actual consumer hardware is out.
 
I would like to plug my DSLR into an Oculus Rift so that I can see what my camera sees in weird angles and positions.
 
2D movies on the Oculus Rift are NOT the point. We are looking for something a little more immersive that nothing else can provide.
 
2D movies on the Oculus Rift are NOT the point. We are looking for something a little more immersive that nothing else can provide.

This being said, it would be kind of cool to pull up Netflix and watch a movie in a theater with friends who I can interact with IRL.
 
Oh look, another article talking about how the Rift's screen isn't very good.

Welcome to 5 months ago, kotaku.com. Oculus has already shown off a 1080p version. The screen will be better in the consumer model.
 
960x1080 per eye is what you're looking forward to? I was hoping for something better, being so close to your eyes.
 
Not surprising, since the dev kit is lower resolution than the final product. I think it's unprofessional not to mention that important point for the readers, but at least it was easy for someone actively following the product to know this from the "600x400" comment.

The final version should be 1080p according to the company.

960x1080 per eye is what you're looking forward to? I was hoping for something better, being so close to your eyes.

A future version a few years down the line will probably do that. Right now I think it's a combination of cost and actual hardware limitations - bandwidth demands, etc. - that makes it difficult to pull off.

They did say that's the eventual goal: higher res and lower latency when the technology is available to do so in a practical fashion. 1080p should be enough to make the experience sufficiently pleasant for now, and much better than the dev kit used in the review, that's for sure.
 
I am glad someone finally got around to looking at this. I have gotten nowhere discussing the subject with OR zealots on various gaming forums. Few seem to grasp how 3D, peripheral vision and movies work.

The OR is designed to have a wide angle field of view, much wider than a typical view on a TV (unless you sit extremely close). To watch a 3D Bluray, each eye needs a minimum of 1920x1080 individual pixels. The screen would have to be 3840x1080 minimum. Since the OR expands so far to the sides and up and down, the 1920x1080 image may need to fit within a smaller section of that entire FoV to be comfortable for viewing. This would mean the display would have to be even larger.

I have seen no confirmation that the shipping version of the OR is even intended to have a 3840x1080 display. I do not believe the first revision of the OR will be a desirable movie watching device.
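The resolution arithmetic in the post above can be sketched in a few lines, assuming the side-by-side layout it describes (one full 1080p image per eye, mashed together on a single panel):

```python
# Back-of-the-envelope check of the post's resolution argument.
# Assumption: a side-by-side layout with one full 1080p image per eye.
EYE_W, EYE_H = 1920, 1080   # Blu-ray 3D delivers this resolution per eye

panel_w = EYE_W * 2          # two eye images side by side
panel_h = EYE_H

print(f"minimum panel: {panel_w}x{panel_h}")    # minimum panel: 3840x1080
print(f"total pixels:  {panel_w * panel_h:,}")  # total pixels:  4,147,200
```

And as the post notes, that 3840x1080 is a floor: if the movie has to sit inside a smaller, comfortable slice of the Rift's wide FoV, the panel would need to be larger still.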
 
He was using the first generation dev kit with the 720p display rather than the second gen 1080p one, so I'm not surprised he had issues. However, it is never mentioned in the article whether this is a software limitation or with the VR headset alone. I'd like to know more about this...
 
I am glad someone finally got around to looking at this. I have gotten nowhere discussing the subject with OR zealots on various gaming forums. Few seem to grasp how 3D, peripheral vision and movies work.

The OR is designed to have a wide angle field of view, much wider than a typical view on a TV (unless you sit extremely close). To watch a 3D Bluray, each eye needs a minimum of 1920x1080 individual pixels. The screen would have to be 3840x1080 minimum. Since the OR expands so far to the sides and up and down, the 1920x1080 image may need to fit within a smaller section of that entire FoV to be comfortable for viewing. This would mean the display would have to be even larger.

I have seen no confirmation that the shipping version of the OR is even intended to have a 3840x1080 display. I do not believe the first revision of the OR will be a desirable movie watching device.

I'm curious to see what PC will be able to drive 3840x1080 stereoscopic at 30 fps minimum, let alone consoles. It's not that I don't believe you about that high resolution requirement for 3D movies, but can you give me a source so I can read about it?
 
Yes, the Rift will not be a good way to watch movies until the screen rez gets a lot better. IMAX might work well earlier since it could take advantage of the wider FOV. I doubt it will ever be even close to being a killer app. I guess it could be nice on trips or for people that live in closet sized apartments?
 
I'm curious to see what PC will be able to drive 3840x1080 stereoscopic at 30 fps minimum, let alone consoles. It's not that I don't believe you about that high resolution requirement for 3D movies, but can you give me a source so I can read about it?
He was talking about movies, not real-time rendering...

Most modern cards will have no trouble decoding h.264 at that resolution.
 
I'm curious to see what PC will be able to drive 3840x1080 stereoscopic at 30 fps minimum, let alone consoles. It's not that I don't believe you about that high resolution requirement for 3D movies, but can you give me a source so I can read about it?

Source? It's how Bluray, broadcast and 3D work. I never said anything about @ 30fps or greater for each eye, because those are not part of the Bluray spec. Blurays display movies at 24fps. The 3D specification calls for 24 fps for each eye. Each eye gets its own 1920x1080 image. Two 1920x1080 screens mashed together (as the OR would use) results in 3840x1080.

Most devices that can display Bluray 3D content must be able to display at least 48 fps (24 fps for each eye). Most displays alternate the frames (one for left, then one for right), but the OR would display both simultaneously. Their display could get by with 24 fps.

For a source, you can check Wikipedia, the Bluray Disk Association, or even use Google to find some relevant articles.
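The frame-rate arithmetic in that explanation is simple enough to write down. A minimal sketch, assuming the Blu-ray 3D figure of 24 fps per eye stated above:

```python
# Frame-rate arithmetic from the post above: Blu-ray 3D is 24 fps per eye.
FPS_PER_EYE = 24

# A frame-alternating display (e.g. an active-shutter TV) shows
# L, R, L, R..., so it needs double the per-eye rate.
alternating_refresh = FPS_PER_EYE * 2   # 48 fps minimum

# A split-screen HMD puts both eye images into the same panel refresh,
# so the per-eye rate is all the panel itself has to manage.
simultaneous_refresh = FPS_PER_EYE      # 24 fps would suffice

print(alternating_refresh, simultaneous_refresh)  # 48 24
```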
 
It's sad they still stick to 30 fps or less per eye, but it has to be technically feasible to sell products at a reasonable price.
On SBS or OU half-res movies, I can put them through a decent interpolator (SVP) to bump the framerate up, and they definitely look better.

I really want to see the Oculus Rift :)
 
I would like to plug my DSLR into an Oculus Rift so that I can see what my camera sees in weird angles and positions.
Unless I'm not understanding what you're suggesting, that would make you incredibly sick.

The final version should be 1080p according to the company.
Near as I'm aware, they've made no official announcements.

To watch a 3D Bluray, each eye needs a minimum of 1920x1080 individual pixels. The screen would have to be 3840x1080 minimum.
That claim doesn't make sense to me. You're suggesting that 3D Blu-rays can't be downscaled?
 
Unless I'm not understanding what you're suggesting, that would make you incredibly sick.
Depends on the setup. With the appropriate harness, you could mount a camera on a pole so that it's always looking down from above-and-behind you.

Stream the live feed to the rift, and you have real-life 3rd person view :eek:

You could also head-mount a camera with a super-wide field of view and compress it onto the display inside the rift, giving you super-human FOV.

That claim doesn't make sense to me. You're suggesting that 3D Blu-rays can't be downscaled?
Of course they can, but I believe he wants to watch them at full-sharpness.
 
Source? It's how Bluray, broadcast and 3D work. I never said anything about @ 30fps or greater for each eye, because those are not part of the Bluray spec. Blurays display movies at 24fps. The 3D specification calls for 24 fps for each eye. Each eye gets its own 1920x1080 image. Two 1920x1080 screens mashed together (as the OR would use) results in 3840x1080.

Most devices that can display Bluray 3D content must be able to display at least 48 fps (24 fps for each eye). Most displays alternate the frames (one for left, then one for right), but the OR would display both simultaneously. Their display could get by with 24 fps.

For a source, you can check Wikipedia, the Bluray Disk Association, or even use Google to find some relevant articles.

The math you are using is not as clear as you are supposing - you aren't accounting for the OR's distortion. The OR distortion makes the center of your FoV have a higher pixel density (which is beneficial) and the edges have lower pixel density. If you were to use a 3840x1080 screen with the OR, the pixels would not match 1:1 to the movie. First, the movie would be distorted, a la http://img46.imageshack.us/img46/7971/screenshot1364631423.jpg . Now, as you can see, a good portion of this image is blacked out, and those sections of screen simply aren't used by the rift. Then, the image would be distorted again by the lenses in the rift before it hits your retina.

Furthermore, applications which simply feed the movie into both eyes have not been very successful so far, so you would likely be using one of the virtual theaters. This means there would be even more unused space on the screen, leaving even less pixel density for the video content you're viewing.

Basically, if you absolutely must see every single pixel of a movie, even in your peripheral vision, we may need to start getting to 4K levels of resolution.
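The warp being described is the usual radial (barrel) pre-distortion applied before the image passes through the lenses. A minimal sketch, assuming the common polynomial model r' = r * (k0 + k1*r^2 + k2*r^4); the coefficients here are placeholders for illustration, not the Rift's actual values:

```python
# Minimal sketch of radial (barrel) pre-distortion, as described above.
# Assumption: polynomial model r' = r * (k0 + k1*r^2 + k2*r^4) with
# made-up example coefficients, not the Rift SDK's real ones.
def barrel_warp(x, y, k0=1.0, k1=0.22, k2=0.24):
    """Map a normalized screen coordinate (center = 0,0) outward."""
    r2 = x * x + y * y
    scale = k0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Identity at the center, increasing stretch toward the edges: this is
# why the center of the FoV keeps a higher effective pixel density
# while the periphery gets less, and why movie pixels can't map 1:1.
print(barrel_warp(0.0, 0.0))  # (0.0, 0.0) -- center untouched
print(barrel_warp(1.0, 0.0))  # edge sample pushed ~46% outward
```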
 
The OR isn't intended primarily as a movie player. It can do it, but its design criteria weren't centered around that, and indeed it's not really a good use for it due to the FOV and required resolution. The res would have to be greater than 3840x1080 to make for an enjoyable movie experience at full 1080p per eye without feeling you're sitting way too close to the screen, and the problem with making the consumer version that rez is twofold. One, it would be very expensive. Two, very few would have computers with the horsepower to drive anything in 3D with any graphical quality at that resolution, which is its primary purpose, not movie viewing. They are trying to take VR mainstream, not just make it for graphic quality whores with tri-SLI.

The 720p screen in the dev kit is plenty to be immersive. Yes you can see the pixels and yes stuff does get blurry in the distance, but it's a dev kit. If they find a good 1080p screen with small ipg, it will be very good. I don't expect more than 1080p due to the price, the computer power that would be needed to drive it at full rez, and their goal of wide consumer adoption.
 
The 720p screen in the dev kit is plenty to be immersive. Yes you can see the pixels and yes stuff does get blurry in the distance, but it's a dev kit. If they find a good 1080p screen with small ipg, it will be very good. I don't expect more than 1080p due to the price, the computer power that would be needed to drive it at full rez, and their goal of wide consumer adoption.
What's the big deal with driving 2x 1080p? That's only 4,147,200 pixels. :confused:

My current Eyefinity setup is very nearly double that...
 
What's the big deal with driving 2x 1080p? That's only 4,147,200 pixels. :confused:

My current Eyefinity setup is very nearly double that...

It's hard to drive that many pixels at a solid 60FPS. Not average FPS, minimum FPS, since dropped frames are very noticeable in VR. Also you can only get one screen in the unit and the selection above 1080p is still rather poor. Response time, viewing angles and pixel spacing are all still important factors for the screen as well, plus price of course.
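The "solid 60 FPS, not average 60 FPS" point comes down to frame-time budgets. A quick sketch of the arithmetic, assuming a vsync'd 60 Hz display where a missed deadline repeats the previous frame:

```python
# Frame-time budget behind the "minimum FPS, not average FPS" point.
budget_ms = 1000 / 60        # ~16.7 ms to finish rendering each frame

# Assumption: on a vsync'd display, one missed deadline means the old
# frame is shown twice, so that frame effectively takes a full 33 ms.
dropped_frame_ms = budget_ms * 2

print(f"{budget_ms:.1f} ms budget, {dropped_frame_ms:.1f} ms on a drop")
# prints "16.7 ms budget, 33.3 ms on a drop"
```

That doubling is what makes dropped frames so noticeable in VR: the world visibly stutters against your head movement, even when the average FPS still reads 60.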
 
It's hard to drive that many pixels at a solid 60FPS. Not average FPS, minimum FPS, since dropped frames are very noticeable in VR. Also you can only get one screen in the unit and the selection above 1080p is still rather poor. Response time, viewing angles and pixel spacing are all still important factors for the screen as well, plus price of course.

Very true for games, but for movies, 24 fps is adequate.
 
It's hard to drive that many pixels at a solid 60FPS. Not average FPS, minimum FPS, since dropped frames are very noticeable in VR.
Well, again, many of us are pushing double (or more) that resolution. It's not all that unwieldy.

Also you can only get one screen in the unit and the selection above 1080p is still rather poor. Response time, viewing angles and pixel spacing are all still important factors for the screen as well, plus price of course.
We were talking about a hypothetical future version...

But yeah, the currently-planned version with a single 1080p screen is going to be a doddle to drive, even with decidedly mid-range cards.
 
Well, again, many of us are pushing double (or more) that resolution. It's not all that unwieldy.


We were talking about a hypothetical future version...

But yeah, the currently-planned version with a single 1080p screen is going to be a doddle to drive, even with decidedly mid-range cards.

That's the whole point: it has to be able to be run by most computers. They're not after a niche market where everybody has high-end cards or SLI setups and can drive 4K resolution at 60 fps minimum without issue. They're after widespread adoption. The [H] crowd is not their intended market, and straight movie viewing is not their intended use, so saying you have the computer power to drive three 1080p screens, and that movies only need 24 fps anyway, ignores the whole purpose and goals of the OR.
 
If only a technology could be invented to take a lower-than-native resolution input and somehow...'adapt' it to a higher resolution output.

What would such a technology be called? How about "whaling"? "Curtailing", maybe?
 
I'm so looking forward to this. I remember when I was a kid, my aunt took me to the local fair and I got to play Duke Nukem 3D. I don't remember the setup exactly, but I had a helmet with a display inside that tracked my head movements. I can't believe that it has taken around 20 years to develop for home use.
 
Head-mounted displays have existed in the consumer space for a long time now, actually. There was considerable interest in them in the early- and mid-90's, when a large number of companies were producing consumer units. If you look at the MechWarrior 2 manual, for instance, you'll see details on how to use the "Virtual IO I-glasses" HMD with it.

As Carmack said, there was a lot of enthusiasm not backed up by much of anything. It was too early for that hardware to be any good. It's only just now barely good enough and affordable enough to make a reasonable consumer HMD.
 
That's the whole point, it has to be able to be run by most computers.
...and like I said, we're already pushing three to four times more pixels than 1080p on single-card high-end machines.

A fairly average box will have no trouble with the consumer version of the rift, which will be 1080p. I don't see what you're getting so worked up over...
 
...and like I said, we're already pushing three to four times more pixels than 1080p on single-card high-end machines.

A fairly average box will have no trouble with the consumer version of the rift, which will be 1080p. I don't see what you're getting so worked up over...

For VR and games, you're not running three to four times as many pixels as a 1080p screen without your FPS regularly dropping below 60 unless you've turned the settings down. Maintaining a solid 60 FPS matters for VR: not an average 60 FPS but a minimum of 60 FPS. A bunch of dropped frames will give you motion sickness with a Rift, so an average FPS of 60 won't cut it.

I've been taking a close look at Crysis 3 since it's the same engine as Star Citizen. Even with new mid-level cards (660 Ti, 760 or 7870), you can't run 1080p with max settings and get a minimum framerate of 60 FPS. http://hardocp.com/article/2013/07/09/gigabyte_gtx_760_oc_version_video_card_review/6
Their average FPS is only 50-60, and all of them drop to around half that regularly.

Even bumping up to the 770 or 7970 level doesn't get you a solid 60 FPS.
http://hardocp.com/article/2013/06/..._directcu_ii_video_card_review/6#.Uh-V2Rukq-8

So, either have a high end card or get ready to turn down some settings in games when using the Rift.
 
If you really want to just watch movies get one of the Sony HMZ-T models, it's built specifically for that. The Rift really isn't.
 
I've been taking a close look at crysis3 since it's the same engine as Star Citizen. Even with new mid level cards (660 Ti, 760 or 7870) you can't run 1080p with max settings and get a min framerate of 60 FPS
Uh... so do what PC gamers have been doing for years.

Don't run max settings.

Problem solved...
 
Agreed. It's considerably more expensive, but it also has OLEDs and all the format conversion niceties that the Rift eschews for the sake of latency.
 
Also, you're using a Crysis game as a comparison. This is the franchise that [H] had to drop their standards from "60 average" to "30 average" because nothing could run it properly. :rolleyes:

Huge number of games give mid-range cards no trouble, even at Eyefinity / Surround resolutions.
 
Also, you're using a Crysis game as a comparison. This is the franchise that [H] had to drop their standards from "60 average" to "30 average" because nothing could run it properly. :rolleyes:

Huge number of games give mid-range cards no trouble, even at Eyefinity / Surround resolutions.

He used Crysis 3 since Star Citizen (one of the first popular games with Rift support) is built on CryEngine 3. Since SC isn't out yet, we can use Crysis 3 to get a rough idea of how demanding SC will be? Following me? :p
 
I expect Star Citizen to be significantly more demanding than Crysis 3, particularly considering that the latter has the full might of Crytek's obviously adept graphics programmers behind it while the former does not.

That said...detail levels. These things can almost always be reduced to achieve a desired frame rate.
 
He used Crysis 3 since Star Citizen (one of the first popular games with Rift support) is built on CryEngine 3. Since SC isn't out yet, we can use Crysis 3 to get a rough idea of how demanding SC will be? Following me? :p
No, because CryEngine is incredibly versatile. Just because one developer decides to run it balls-out doesn't mean another will.

Two games being based on the same engine is no indication they will perform the same.
 