Pimax 8k kickstarter

ssnyder28

2[H]4U
Joined
May 9, 2012
Messages
3,714
So Pimax is releasing their 8K headset in early 2018. I received an email when they launched the Kickstarter with a goal of $200k, and an hour later another e-mail saying they had hit that goal in a little over an hour. Looks like right now it's already over $330k.

Hopefully with more competition on the VR front we'll get more innovation.

Kickstarter link below:

 
Yeah I've been keeping an eye on it, looks amazing.

Based on this hands-on from Tested, it's still very much a mixed bag:



There is potential, but it seems the lens system still needs work: there's a lot of distortion introduced by the way they stretch the image at the periphery to get the wider FOV. While it sounds/looks great on paper, things like IPD adjustment also don't work yet, and individual pixels are still quite visible. Still, the fact that small text is now readable in Big Screen is a very big plus!

The guy also really should have used an interpreter. It was quite painful watching him attempt to answer questions when he seemed to have a working English vocabulary of only about 40 words and understood only about 75% of what was being asked.

Can't wait for next-gen VR to arrive, but it appears that what they demoed is still pretty rough, and the lens system design needs more refinement. Oh, and the controllers looked like very cheap copies of the Vive wands. Glad they are just prototypes!
 
I don't consider pre-production versions when looking at an item like this.

On the Pimax 4K you can hardly see the pixels and there is no SDE. As for the warping effect, we'll have to see.
 
If this doesn't have eye tracking with foveated rendering, then kiss any semblance of playability goodbye...
 
So at 8K will it use 1 video card for each eye? At 4K it takes a 1080ti/Titan to run. It sounds nice but not sure the hardware is there unless you can run a dedicated card per eye.
 
So at 8K will it use 1 video card for each eye? At 4K it takes a 1080ti/Titan to run. It sounds nice but not sure the hardware is there unless you can run a dedicated card per eye.

They claim that their Brain Warp system reduces system requirements. That said, they still recommend a 1080 Ti.
 
They claim that their Brain Warp system reduces system requirements. That said, they still recommend a 1080 Ti.
Haha. So a single 1080ti can run 8K now with their technology.....

I would sell it to the video card makers and make a fortune! Imagine what it would do for regular gamers with lesser video cards.
 
They claim that their Brain Warp system reduces system requirements. That said, they still recommend a 1080 Ti.

Well, they are using two separate 4K displays, so there is no single 8K signal/image being displayed, but rather one 4K image per eye. From the interview, it sounds like they are using a scaler to up-convert the incoming signal to 4K for their lower-end HMD offering. Their top-tier offering, the "X", sounds like it will accept two separate 4K DisplayPort inputs, one for each eye... but that will most likely require two GPUs to render effectively, and current SteamVR tech doesn't yet support multi-GPU left/right-eye rendering. The good news is that none of this is appearing before next year, so perhaps Volta can arrive and be leveraged for this. A single 1080 Ti would be very hard pressed (except for very simple graphics) to maintain 90 FPS at 4K x 2.
 
They claim that their Brain Warp system reduces system requirements. That said, they still recommend a 1080 Ti.
Similar tech is already used and it introduces lag. I can't use it.
The higher the system spec, the fewer problems will be tolerated.
8K is not viable at this time unless it uses dual top-end cards AND they can make that work well.
Niche for quite some time, unfortunately.

They also need room-scale; without that it's got an arm missing.
 
Haha. So a single 1080ti can run 8K now with their technology.....

I would sell it to the video card makers and make a fortune! Imagine what it would do for regular gamers with lesser video cards.
It's technically not 8k, as it's 2x the pixels of 4k, whereas true 8k is 4x as many pixels as 4k.
 
It's technically not 8k, as it's 2x the pixels of 4k, whereas true 8k is 4x as many pixels as 4k.
And my statement still stands...... 4K, 8K, 189743786548732K. (like that last number?)
I would like to see what a 1080ti can do with it...
 
Brain Warp, multi-resolution rendering...there's going to be something that comes along that helps with the performance thing.

As for room-scale, this is supposedly going to work with the HTC Vive's Lighthouse system and motion controllers. So while it can be used stand-alone, I'm sure it could also be considered an "upgrade" to the Vive... if this is all true.
 
And my statement still stands...... 4K, 8K, 189743786548732K. (like that last number?)
I would like to see what a 1080ti can do with it...
As I said above, eye tracking with foveated rendering would cut the render load in half, essentially making it the equivalent of rendering a single 4k screen worth of pixels, still a big ask at 90hz.
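Quick back-of-envelope math on that claim. The ~50% saving is the post's assumption (real savings depend on fovea size and shading falloff), but it shows why halving the load would land you at roughly one 4K screen's worth of pixels:

```python
# Pixel budget for dual 4K eye buffers, and the assumed ~50% saving
# from eye-tracked foveated rendering (the post's figure, not measured).
full_load = 2 * 3840 * 2160        # both eyes at native 4K, pixels/frame
foveated_load = full_load // 2     # assumed ~half with foveation
single_4k = 3840 * 2160            # one 4K screen for comparison

print(full_load)                   # 16588800 pixels per frame
print(foveated_load == single_4k)  # True: one 4K screen's worth
print(foveated_load * 90)          # 746496000 pixels per second at 90 Hz
```

Even with that cut, ~746 million shaded pixels per second is a heavy load for any single current GPU.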
 
As I said above, eye tracking with foveated rendering would cut the render load in half, essentially making it the equivalent of rendering a single 4k screen worth of pixels, still a big ask at 90hz.

That depends upon the game, doesn't it? I imagine older games will easily beat 90 fps. Fraps says I get 175 fps on my SLI Maxwell Titans (roughly the same as a 1080 Ti) at 4K in the original Far Cry.
 
Seems to have serious drawbacks right now but at least they are pushing the tech forward. We'll see what the reviewers have to say about the final product next year!
 
Similar tech is already used and it introduces lag. I can't use it.
The higher the system spec, the fewer problems will be tolerated.
8K is not viable at this time unless it uses dual top-end cards AND they can make that work well.
Niche for quite some time, unfortunately.

They also need room-scale; without that it's got an arm missing.
What do you mean they need room-scale? They use Valve's tracking technology, so it has room-scale already. Also, the image is up-scaled to 8K, which is done today in many games and works just fine; it's not as intensive as rendering straight 4K. Using up-scaling, a single 1080 Ti can run 8K.


As I said above, eye tracking with foveated rendering would cut the render load in half, essentially making it the equivalent of rendering a single 4k screen worth of pixels, still a big ask at 90hz.

It includes a port to connect a third-party eye-tracking device. Foveated rendering is not a technology to be implemented by the headset manufacturer; that's for the software developer to implement, whether in the game or through the OpenVR SDK. The headset supports it through the use of third-party hardware. NVIDIA has already built a framework for implementing it in modern VR games. Hopefully the OpenXR group will implement foveated rendering in their API.
 
What do you mean they need room-scale? They use Valve's tracking technology, so it has room-scale already. Also, the image is up-scaled to 8K, which is done today in many games and works just fine; it's not as intensive as rendering straight 4K. Using up-scaling, a single 1080 Ti can run 8K.
The 4K doesn't have room-scale.
The 8K isn't out yet.
I trust that it will have room-scale, but my point was that it should not be like the 4K in this respect.

Upsampling is what all displays do when you can't run at native res.
It doesn't look as good as running the lower res on a display intended for it.
This will be a waste of a higher-res VR display unless they have a separate low-lag AA feature that does not run on the graphics card.
The one benefit a higher-res display with lower-res input will give is less screen door.
 
If this doesn't have eye tracking with foveated rendering, then kiss any semblance of playability goodbye...
Even without eye tracking it could help a ton... we're used to not being able to move our VR eyes much anyway.
 
That depends upon the game, doesn't it? I imagine older games will easily beat 90 fps. Fraps says I get 175 fps on my SLI Maxwell Titans (roughly the same as a 1080 Ti) at 4K in the original Far Cry.
If it will even be able to use SLI/CFX. If not, then it will be interesting.
 
Hmm, what I didn't like about the first Pimax was that it was essentially a beta product. It ran 1440p up-scaled to 4K, so it wasn't really a 4K VR device.

Now that the 8K X will have actual dual 4K displays via DP 1.4, I ordered it. Not shipping until May though.
So that really means late summer.

Their FAQ is kinda confusing for the non-X 8K. It says 4K up-scaled to 8K, but 8K in this sense isn't 7680×4320; it is 7680×2160.

So the actual INPUT of the "regular" 8K would be something like 3840x1080. That's why they are saying something like a lowly GTX 980 can run it: it is essentially dual 1920x1080 signals up-scaled to dual 4K displays.

The only "real" way you could ever call it 8K is the dual native 4K signal (8K X).
 
Oh, and that also raises the interesting fact that the 5K version will run dual 2560x1440 native OLED displays. So the 5K will actually take in a more demanding signal than the regular 8K version, and look better doing it!
 
Just read some more info: the "regular" 8K version will use the same input resolution as the 5K (5120x1440) and then up-scale it to 7680x2160.

IMO I would rank them from best to worst:

1. 8K X (7680x2160 LCD)
2. 5K (5120x1440 OLED)
3. 8K (5120x1440 LCD up-scaled)

Does anyone know if these displays will use any type of variable refresh technology?
 
The 8K X sounds the most appealing to me, but it's going to require some monster hardware to drive it. I'm thinking two 1080 Tis at a minimum to maintain 90 FPS if up-scaling isn't employed and all of that resolution is actually being used/pumped into the HMD... and the problem with that expectation is that VR tech to drive dedicated left- and right-eye signal inputs (two DisplayPort cables) has not been realized/demonstrated yet. (Nvidia has only shown some SLI VR driver/API tech, but it only provides for a single output and requires three separate cards!)

While their target is May for their 8K X HMD, I'm thinking it will be more like fall next year to get all the HW/SW kinks worked out (and that's being optimistic). The lens system is also going to have to be different than the other two HMD models given that there's a lot more resolution to contend with and the panels will most likely also be different as to performance specs and possibly physical form factor - close perhaps, but I doubt a 100% match.

Volta will probably be needed to drive it decently as that is a F-ton of pixels to push that fast...

Going to wait and see what develops, enjoying my Vive and Rift rather than jump in this early on their kickstarter.
 
Just read some more info: the "regular" 8K version will use the same input resolution as the 5K (5120x1440) and then up-scale it to 7680x2160.

IMO I would rank them from best to worst:

1. 8K X (7680x2160 LCD)
2. 5K (5120x1440 OLED)
3. 8K (5120x1440 LCD up-scaled)

Does anyone know if these displays will use any type of variable refresh technology?
How come other VR companies are not doing this? Does up-scaling look any better than the format/resolution it came from?!
 
This will most likely be marketed alongside Volta. Word on the street is Volta has 132% more processing power than Pascal.
 
How come other VR companies are not doing this? Does up-scaling look any better than the format/resolution it came from?!

All upscaling helps with is the screen door effect.

The downside is you lose sharpness because it isn't native resolution.

Without a low persistence display and good head tracking this thing is worthless to me.
 
All upscaling helps with is the screen door effect.

The downside is you lose sharpness because it isn't native resolution.

Without a low persistence display and good head tracking this thing is worthless to me.
Hopefully the Rift gets a version 2 soon.(even the Vive)
My kids and wife always laugh how I could never live without my VR now.
 
Just read some more info, the "regular" 8K version will use the same input resolution as the 5K (5120x1440) and then up-scale it to 7680x2160.

IMO I would rank them from best to worst:

1. 8K X (7680x2160 LCD)
2. 5K (5120x1440 OLED)
3. 8K (5120x1440 LCD up-scaled)

Does anyone know if these displays will use any type of variable refresh technology?
Variable refresh is not used on modern VR HMDs since it's rather incompatible with ULMB/strobing techniques, and every VR HMD since the Oculus DK2 strobes - even Samsung flagship phones docked in a Gear VR (at 60 Hz, which gets flickery if you look too closely).

Low persistence is that important when you have a head-tracked image that close to your eyes. Even the OLED panels used in the Rift aren't perfect, as there's a sorta brown/reddish black smearing that you get on some scenes. Same thing happens on my Note 4 even outside of the Gear VR, and it's kinda like a lesser version of the really smeary dark transitions on my Eizo FG2421 (VA LCD).

Anyway, as much as I'd love a boost in resolution (for the reasons you'd expect; any boost in cockpit readability and visually identifying distant targets in VR is greatly needed right now, with the Rift being just on the edge of "good enough" for clarity), my GTX 980 isn't cut out for any of this and it's too expensive to replace my Rift so soon. I'll probably hold out for another year or two before splurging on more VR hardware, since my current issue outside of flight sims is less "not enough resolution" and more "too many juddery framedrops, especially in SteamVR games, and I haven't figured out why".
 
All upscaling helps with is the screen door effect.

The downside is you lose sharpness because it isn't native resolution.

Without a low persistence display and good head tracking this thing is worthless to me.
Another downside of running below native res is the large increase in aliasing.
The only really effective AA I have found is supersampling, which defeats the point of going lower res.
Other AA types either blur the image, making it look even lower res, or need way too much GPU time.
(Although at higher res it might not matter so much, but anyway...)

This will either need a new VR-specific AA with very low overhead that runs on the GPU, or a separate high-quality AA processor that is extremely fast and well integrated to keep lag down.

There is a plus side I've not seen mentioned yet.
Nvidia's VR tech includes a way to avoid drawing things twice that appear in both eyes.
I suspect this will give around a 30% performance improvement, which will help negate some of the overhead of using VR.
But it's still going to have a tough time keeping quality settings up in VR at very high res.

I'm eager to see how they deal with this at 1/2 res.
 
This seems like a seriously misrepresented product. First off, it says that you need DP 1.4, but only the latest GPUs support that, and they say a 980 is fine when it only supports DP 1.2. Even with 1.4 there is not enough bandwidth for 4K @ 90 Hz x 2, so it seems they instead upscale to 4K from 1440p per eye to make it happen. At best this only seems to reduce the screen door effect of the headset.
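Rough numbers on the bandwidth point (24-bit color, blanking intervals ignored, so real figures run a bit higher; the payload rates are the standard HBR2/HBR3 values after 8b/10b coding):

```python
# Rough DisplayPort bandwidth check: can one cable carry dual 4K at 90 Hz?
def gbps(width, height, hz, bpp=24):
    """Uncompressed video payload in Gbit/s, blanking ignored."""
    return width * height * hz * bpp / 1e9

DP12_PAYLOAD = 17.28   # Gbit/s usable, HBR2 (DP 1.2) after 8b/10b coding
DP14_PAYLOAD = 25.92   # Gbit/s usable, HBR3 (DP 1.4) after 8b/10b coding

both_eyes = gbps(7680, 2160, 90)   # ~35.8 Gbit/s for dual 4K at 90 Hz
per_eye = gbps(3840, 2160, 90)     # ~17.9 Gbit/s for one 4K eye at 90 Hz

print(both_eyes > DP14_PAYLOAD)    # True: no single cable does native dual 4K@90
print(per_eye < DP14_PAYLOAD)      # True: one DP 1.4 cable per eye works
```

Note that the per-eye figure also exceeds DP 1.2's payload, which fits with the 8K X needing two DP 1.4 cables while the upscaling models get by with less.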
 
DisplayPort is backwards compatible, so a DP 1.2 980 will work fine for the regular 8K. The "real" dual-4K device, the 8K X, uses two DisplayPort 1.4 cables and takes two real 4K @ 90 Hz input signals.
 
Eye tracking that allows higher resolution where you are looking and lower resolution around it would help performance tremendously; the alternative, rendering high in the middle and low on the sides regardless of gaze, would not be as good. Nvidia hardware supports this, and I think RTG's does as well. With the wide FOV, I would think people will move their eyes more instead of constantly turning their head, so without eye tracking that could limit usefulness. Volta should be out around the same time, so more powerful hardware should be available; Vega 2 also, but I'm not sure what kind of improvement to expect from that.
 
I like the idea of the 8K X, but the fact that there are no demos of it in action leaves me a bit hesitant. Realistically you would want two 1080 Tis to drive it, and I would want to see people prove it works with existing SteamVR, Oculus apps, etc.

The interview was terrible, and I was confused as to why they would demo the system on a laptop. Kind of strange.
 
Decided to cancel my 8K X pledge. Two 4K displays aren't going to be pushed properly at 90 Hz even with a Volta GPU, and with no variable refresh rate technology I don't think this will lead to a very good experience. Plus I was not that excited about them using LCD displays. Decided to go with the Odyssey instead, which is OLED and should help with the screen door effect some.
 
What about the regular 8K Pimax headset? Wouldn't higher resolution LCD panels still be better (all other things being equal) than lower res OLED at eliminating SDE?
 
The displays are 4K per eye right now; it is upsampling the signal and using Brain Warp. What makes me curious is whether this is being done in hardware or software. Meaning, if GPU power in the future can push native 4K/90 Hz per eye, is it a software update, or is the HMD hardware not designed to handle it?

EDIT: from the kickstarter page:

What is the resolution of Pimax 8K?
The resolution of Pimax 8K is always 8K. You can see totally over 16.6 million pixels at any time. For signal that less than 8K, Pimax can upscale the signal to 8K with a video processing module in the headset.

Looks like it may be a future-proof HMD. How far in the future you would have to wait may determine its value now. I just got a Rift, otherwise I would have backed this.
 
What about the regular 8K Pimax headset? Wouldn't higher resolution LCD panels still be better (all other things being equal) than lower res OLED at eliminating SDE?

Depends, but LCD, even low-persistence, has ghosting that OLED doesn't, which is one of the reasons all of the high-end sets use OLED despite the larger pixel gaps. Early tests show not much ghosting on the Pimax, so that bodes well. I'll have to wait and see full reviews, but I could see getting this when it releases next year.
 
Depends, but LCD, even low persistence, has ghosting that OLED doesn't, which is one of the reasons all of the high end sets use OLED despite there being larger pixel gaps. Early tests show not much ghosting on the Pimax, so that bodes well. I'll have to wait and see full reviews, but I could see getting this when it releases next year.

Pimax uses a new LCD technology called CLPL that's designed to eliminate ghosting. I have yet to find an explanation of how it actually works. I'm half optimistic and half skeptical that it's marketing b.s.
 