SteamVR Performance Test

HardOCP News

Those of you wondering whether your system has what it takes to be "VR Ready" can use the SteamVR Performance Test to find out. If your system doesn't pass, the test will also tell you whether it's bound by the CPU, the GPU, or both.
 
I'm ready....
VR Ready.jpg
 
Oh...daddy ain't ready. Guess I should still upgrade my video card at some point.
notreadyvr.JPG


(Ooo I like this forum software!)
 
Well, here is mine.

Details it doesn't show in the system specs section:
- 2x 980 Tis in SLI
- CPU is overclocked via turbo to hit 4.8 GHz under load, but is still reported at its base clock of 3.2 GHz

SteamVR Test.jpg


- So, how does this thing work? Will the headset work with any title, implemented at the driver level, or do titles need to be written specifically to support it?

- Has anyone tried the development version out? Does the dual 1080x1200 resolution look good?

- The rendered test on my screen looked lower-res than 1080x1200 per eye. Can anyone confirm this? Why would they not test at the Vive's native resolution?
 
I find it hilarious that those with 980 TIs, and even 2x SLI 980 TIs, bothered to even see if they were ready... I mean really? o_O I think you all are just showing off that you have those cards, but in reality showed us you just totally wasted your time and 2GB of space. Lol!
 
Yawn. 2 gig download. Not wasting an hour of my time.

:eek: Took me 2 minutes to download, peaking at 18MB/s

> I find it hilarious that those with 980 TI's and even 2x SLI 980 TI's bothered to even see if they were ready... I think you all are just showing off that you have those cards...

Well, I ran it before I looked at anyone else's results.

There was a semi-recent front page news story (which I can't find now) suggesting that VR would require 7x higher GPU performance than current systems have. Couple that with the fact that in many titles my dual 980 Tis in SLI struggle to keep up with my 4K monitor, and the test makes perfect sense.
 
My overclocked CPU made the cut, but my old R9 280 would only let me play on low. This confirms my earlier plan of getting a new card at some point this year, and then getting whichever headset 'wins' the race when the price has come down.
 
> My overclocked CPU made the cut, but my old R9 280 would only let me play on low...
While the 280 is below spec, I believe there's something screwy going on with this utility and AMD cards. Check out this thread
 
I can't help but wonder why the resolution for all the first-gen VR devices is so low. Everything is starting to go 4K, and the benefits of 4K should be even greater on a VR-type device.

I would have expected a pixel count similar to a 4K screen on these things. The HTC Vive reportedly runs 1080x1200 per eye. I'd expect it to be higher, maybe 1920x2160 per eye, to get a similar pixel density.

I also wonder if there's an opportunity with VR devices to get better multi-GPU scaling: instead of SLI/CFX, use one GPU per eye with a separate video output for each side (synced, of course, or it would likely make you sick). It would avoid some of the overhead and lag associated with rendering every other frame.
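The pixel-density question above is easy to sanity-check with some back-of-the-envelope throughput math. A rough sketch: the Vive's 1080x1200 per eye comes from the thread; the 90 Hz targets and the hypothetical "4K-density" 1920x2160-per-eye panel are assumptions for illustration, not real specs.

```python
# Rough pixel-throughput comparison: how much rendering work per second
# each display configuration demands (ignoring supersampling and overdraw).

def pixels_per_second(width, height, hz, eyes=1):
    """Raw pixels a GPU must produce per second for this display."""
    return width * height * hz * eyes

vive  = pixels_per_second(1080, 1200, 90, eyes=2)  # HTC Vive @ 90 Hz, reported res
uhd60 = pixels_per_second(3840, 2160, 60)          # a 4K monitor @ 60 Hz
dense = pixels_per_second(1920, 2160, 90, eyes=2)  # hypothetical "4K density" per eye

print(f"Vive @ 90 Hz:         {vive / 1e6:.0f} Mpix/s")   # 233 Mpix/s
print(f"4K @ 60 Hz:           {uhd60 / 1e6:.0f} Mpix/s")  # 498 Mpix/s
print(f"hypothetical @ 90 Hz: {dense / 1e6:.0f} Mpix/s")  # 746 Mpix/s
```

So the hypothetical 4K-density headset would need roughly 1.5x the raw pixel throughput of a 4K/60 monitor, before accounting for the supersampling VR typically needs, which goes some way toward explaining the conservative first-gen resolutions.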
 
Apparently the first version of the app did not use SLI. It has now been updated to take advantage of SLI. Will have to re-run it when I get home :p
 
> I can't help but wonder why the resolution for all the first gen VR devices is so low...
>
> I also wonder if there is some opportunity when it comes to VR type devices to get better scaling for multiple GPU's... using one GPU per eye with a separate video output for each side...
>
> Apparently the first version of the app did not use SLI. it has now been updated to take advantage of SLI.


We are starting at lower resolutions because VR requires a much higher framerate and much lower input-to-output latency to be a cohesive experience. Both HMDs recommend a minimum of 90 fps for VR titles (which is why the tool notes frames below 90 fps). John Carmack has a nice blog post on latency in VR. Do keep in mind, though, that it's been stated a few times that refresh rate is proving more important than resolution in VR, and that some visual tricks that work on 2D screens actually reduce, or flat-out break, immersion in a VR environment. There's a whole new paradigm for game design being discovered for VR.
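That 90 fps minimum translates directly into a hard per-frame render budget; a quick calculation (the 90 fps figure is from the post above, the 60 fps desktop comparison is just for contrast):

```python
# Per-frame render budget at the 90 fps VR minimum vs. a 60 fps desktop target.
# Every frame that overruns its budget is a dropped (sub-90) frame,
# which is exactly what the test counts against you.

budget_vr      = 1000 / 90  # ms per frame at 90 fps
budget_desktop = 1000 / 60  # ms per frame at 60 fps

print(f"90 fps budget: {budget_vr:.1f} ms/frame")       # 11.1 ms
print(f"60 fps budget: {budget_desktop:.1f} ms/frame")  # 16.7 ms
```

In other words, a VR renderer gets about a third less time per frame than a 60 Hz desktop game, on top of drawing two eye views.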

At the moment, VR does not support multi-gpu setups because of the latency introduced. Nvidia and AMD are both working towards getting support there through various means but at the moment, it's not recommended.
 
> At the moment, VR does not support multi-gpu setups because of the latency introduced...

Ahh, it would be really cool if they could run two concurrent rendering instances on separate GPUs, each dedicated to one eye, with the framerates synced.

This ought to eliminate the latency associated with AFR SLI/CFX modes while giving near-perfect scaling, provided the CPU can keep up.
 
> Ahh, it would be really cool if they could run two concurrent GPU rendering instances on separate GPU's each rendering dedicated for one eye, with the framerate synced...

Some of that DX12 magic could be useful here, with that SFR rendering mode.
 
> Ahh, it would be really cool if they could run two concurrent GPU rendering instances on separate GPU's each rendering dedicated for one eye, with the framerate synced...
Ya, I think I read that one of the options on the table would require duplicate frame buffers to be kept, one for each eye, meaning any game that takes 2GB of VRAM on a display would require 4GB in VR. There's also probably a problem with ensuring both frames stay synced... I'd hate for one eye to end up a dozen ms ahead of the other.
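The "one GPU per eye, in lockstep" idea can be sketched as two workers meeting at a barrier every frame. This is a pure toy simulation, not real graphics code: the render step is a stand-in, and the barrier plays the role of the per-frame sync the posts above say would be needed.

```python
import threading

FRAMES = 3
# Barrier with 2 parties: neither eye advances to the next frame until
# both have finished the current one, so one eye can never run a frame
# (a dozen ms) ahead of the other.
vsync = threading.Barrier(2)
log = []

def render_eye(eye):
    for frame in range(FRAMES):
        log.append((frame, eye))  # stand-in for the actual per-eye GPU render
        vsync.wait()              # both eyes must reach here before either continues

threads = [threading.Thread(target=render_eye, args=(e,)) for e in ("left", "right")]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every frame number appears exactly once per eye, in frame order:
# the barrier enforces frame-level lockstep between the two "GPUs".
print(sorted(log))
```

The cost, as noted above, is that both "GPUs" hold their own copy of the frame's resources, which is where the doubled-VRAM concern comes from.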


Something's wrong there on your end. I've got a 290 with a slower CPU and I scored better than you, lol.
View attachment 471

Ya, there's another thread titled the same as this one, AMD owners are all over the map with this utility. The conspiracy theorist in me thinks Nvidia may have some under the table dealings with Valve/HTC.
 
i7-4770K at 4.5 GHz
AMD Fury X at stock clocks
16GB DDR3 at 1600 MHz
Score: 9.5

4ftpy1.png
 
Okay... I have a legit question. Is the thing it's rendering exactly what would be rendered if I were playing a game? If so, it gave me Not Ready, but the benchmark ran as fluid as can be.

I'm guessing it has to do with the frame rate?

I thought I would sneak in with an overclocked 7950 Boost and a 4770K @ 4.0
 
What was your "frames tested" number? If you look at all the screenshots, it appears that number varies most dramatically.

Is it something as simple as how many frames it can render in a set period of time?
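If the tool does work that way, the math would just be frames rendered over a fixed window. A sketch with hypothetical numbers (the actual run length and scoring formula aren't documented in the thread):

```python
# Hypothetical: turn a "frames tested" count over a fixed run length into an
# average fps, and compare it against the 90 fps VR minimum mentioned above.

VR_MINIMUM_FPS = 90

def avg_fps(frames_tested, run_seconds):
    """Average framerate over the benchmark run."""
    return frames_tested / run_seconds

# e.g. an assumed 120-second run:
print(avg_fps(10800, 120))  # 90.0 -> right at the VR minimum
print(avg_fps(8400, 120))   # 70.0 -> below 90, would likely rate "Not Ready"
```

That would also explain why a run can look perfectly fluid on a 60 Hz monitor while still failing: 70 fps average feels smooth on the desktop but falls short of the 90 fps bar.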
 