AMD & NVIDIA GPU VR Performance in Trials on Tatooine @ [H]

Damn Kyle, back at it again with the way-too-early reviews. This time with a worse choice than before, lol. Good job.

I've seen some comments from people wanting to buy 1070s based on this.

My best advice is not to buy anything until everything next gen is out next year. Well, apart from the Titan X... but the cards next year, with the tasty HBM2, are what I'm holding out for. I'll wait until then to decide on VR.
 
What VR does AMD do well in? Everything except UE4 games. That's all it comes down to.

We've already seen it. Unity games (VR or not) run well on AMD & NV. UE4 games run total ass on AMD.

Unity does not use any NV or AMD proprietary features by default. UE4 uses GameWorks & PhysX.

I mean, if [H] wanted to do proper tech journalism, why do they not ask the obvious question here: why is NV working toward fragmenting the PC ecosystem with its closed, proprietary approach? It's quite clear that if you play UE4 VR games, you need an NV GPU.

Should AMD do the same? Sponsor Crytek or Unity, inject some DLLs that only they can modify and optimize, and let the resulting game run like shit on NV GPUs? How is that any good for PC gaming as a whole when it leads to an exclusivity effect?

I remember the tech journalism of the good ole days, when they would have jumped all over this as a bad case of optimization for one IHV over the other, and anyone who did it would have been called out. Anandtech back then certainly had the balls to call a spade a spade.

Today, it's the same old bullshit: "here guys, this is our game, NVIDIA sponsored us, and now it runs like total ass on AMD GPUs, enjoy!"... No tech journalist these days bats an eye when they see stuff like this. Instead they blame it on AMD's bad drivers. Yeah, such an easy excuse.

How are AMD's drivers supposed to get around being forced to run proprietary code whose performance only NVIDIA controls? AMD cannot modify or optimize these GameWorks or PhysX libraries. Only NV can.

Don't you guys see it? When an IHV sponsors a game's development, they have a hand in it, and the final product reflects their say. Hitman runs worse on NV GPUs; is it a coincidence that AMD sponsored it? No, it isn't. It shouldn't be this way at all, but we're so used to seeing AMD and especially NV sponsor a game that ends up running total ass on the competitor's hardware. It's treated as normal, even acceptable.

AMD doesn't have the market share or the money (i.e. the clout) to do what nVidia did. Perhaps developers like nVidia's programs because they make their jobs easier? All this talk about low-level, "bare metal" APIs, and no one is mentioning the obvious: the reason high-level APIs were developed and adopted in the first place was compatibility, efficiency, and convenience/ease of use = less time and money for the developer.
 
Damn Kyle, back at it again with the way-too-early reviews. This time with a worse choice than before, lol. Good job.

I've seen some comments from people wanting to buy 1070s based on this.

My best advice is not to buy anything until everything next gen is out next year. Well, apart from the Titan X... but the cards next year, with the tasty HBM2, are what I'm holding out for. I'll wait until then to decide on VR.

How exactly is it too early? The hardware exists, the games and demos are out, and people want to know what to expect. I know you didn't bring up the Radeons, but if this is what AMD wants to tout as VR Ready, isn't it their own damn fault that their cards aren't meeting expectations in a widely used game engine?

I have a 1070 and it's terrific for gaming. I don't feel that I overspent, and it's meeting my expectations of performance based on what I read about it. I have no idea why you would steer clear of this card if it fits your budget and the timing is right.

To your final point, waiting for the next big thing will mean waiting perpetually. There's always something better on the horizon.

Milkman and dergreg - At some point we just have to put a cap on the amount of work we are doing. I did not see the 970 as really needed, seeing how it is overlapped by current products, which you can easily see in reviews done here and there. It should be around 1060-level in performance.
 
Thanks for the response. You can't blame a guy for wanting your take on it over someone else's online, can you?

No offense dergreg. ;) I still look forward to what you find.

None taken, they have a consistent test bed and methodology! I'd prefer their numbers to mine, too.
 
As you should. Marketing needs to be kept honest. The problem I see with AMD is that their marketing department is too aggressive and needs to reflect their products a bit closer to reality. I also understand that their job isn't easy, especially if they have to reach certain goals with lackluster products. Their jobs are on the line, so they must feel the pressure to market these products as best they can regardless of the issues.
If they did that, no one would wait for AMD products when a better or equal nVidia product was already available. The hype is what keeps the fanboys in line.
 
In reality the game should ALWAYS be in reprojection. The only catch is that this title apparently doesn't support DX12/Vulkan, where an async compute queue could be used to keep the framerate flexible. Reprojection should be running asynchronously at 90Hz via compute, accepting a variable framerate or resolution to keep performance acceptable. Instead they're locked together, using multiples of the framerate.
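To sketch what I mean (purely illustrative, with hypothetical stubs; this is not any real VR SDK's API): the warp pass lives on its own fixed 90Hz loop and consumes whatever frame the renderer finished last, so the two rates are decoupled:

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// Hypothetical types/stubs for illustration only.
struct Pose  { float yaw = 0, pitch = 0, roll = 0; };
struct Frame { Pose renderedPose; /* + color buffer in real code */ };

Pose   sampleHeadPose() { return {}; }                  // stub: read the tracker
Frame* renderScene()    { static Frame f; return &f; }  // stub: may take > 11.1 ms
void   warpAndPresent(const Frame&, const Pose&) {}     // stub: compute-shader warp

std::atomic<Frame*> latestFrame{nullptr};
std::atomic<bool>   running{true};

// Render loop: variable rate, publishes each completed frame.
void renderLoop() {
    while (running)
        latestFrame.store(renderScene(), std::memory_order_release);
}

// Reprojection loop: fixed 90 Hz, never blocked by the renderer.
// On DX12/Vulkan this work would sit on an async compute queue.
void reprojectLoop() {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::microseconds(11111);   // ~1/90 s
    auto next = clock::now();
    while (running) {
        if (Frame* f = latestFrame.load(std::memory_order_acquire))
            warpAndPresent(*f, sampleHeadPose());            // warp to a fresh pose
        next += budget;
        std::this_thread::sleep_until(next);
    }
}

int main() {
    std::thread render(renderLoop), reproject(reprojectLoop);
    std::this_thread::sleep_for(std::chrono::seconds(1));    // demo: run briefly
    running = false;
    render.join(); reproject.join();
}
```

The point of the structure is that a slow render pass costs you image freshness, not head-tracking latency, because the warp always presents at 90Hz against the latest pose.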
 
I am not into VR yet; I am on Intel Ivy Bridge graphics at the moment and can run my favorite games on low (StarCraft 2 and Heroes of the Storm). I will probably get the RX 470 on Black Friday.
I just wanted to say that I don't like the war between NVIDIA and AMD right now. It is dirty, very dirty, and unfortunately a once-loved company is now hated by a lot of people (yes, I am talking about AMD), including the reviewer community.
I mean, really, as a game developer you want to sell; you want your game to be bought and played by as many people as possible. So by default you should optimize your game for everyone, and there are really only two kinds of video cards on the market, so is it that difficult? How come every time (OK, not every time, but in a lot of cases) a new UE4 title comes out, it is so crippled on AMD that it is unplayable, or the performance is so bad that even the reviewers don't know what to say?
The conclusion of the article, as it is now written, is that AMD is crap in VR. But if it were me writing the article, the conclusion would be: "Hey, Lucasfilm ILMxLAB, is the final game going to be as badly optimized for my AMD card as it is now in the demo?"
 
Maybe AMD has tried that and was kept out of the kingdom? I am not saying that's what happened but how do you know? Do tell if you do.

My bad, just saw this.

Unless they're choosing to do it behind closed doors, all pull requests, branches, and commits are viewable with a free developer account on Epic's GitHub. Even if Epic weren't integrating into mainline, the changes would be available there as pullable third-party integrations. That said, even a cursory glance through the UE4 third-party directory shows some AMD hits, with some things directly integrated into the core engine. There's definitely someone doing work out there, though clearly not at the level of NVIDIA, who have had engineers working directly on UEx optimizations for years.
 
You mean the whole engine - everything - is in that public repo?
 
I am not into VR yet; I am on Intel Ivy Bridge graphics at the moment and can run my favorite games on low (StarCraft 2 and Heroes of the Storm). I will probably get the RX 470 on Black Friday.
I just wanted to say that I don't like the war between NVIDIA and AMD right now. It is dirty, very dirty, and unfortunately a once-loved company is now hated by a lot of people (yes, I am talking about AMD), including the reviewer community.
I mean, really, as a game developer you want to sell; you want your game to be bought and played by as many people as possible. So by default you should optimize your game for everyone, and there are really only two kinds of video cards on the market, so is it that difficult? How come every time (OK, not every time, but in a lot of cases) a new UE4 title comes out, it is so crippled on AMD that it is unplayable, or the performance is so bad that even the reviewers don't know what to say?
The conclusion of the article, as it is now written, is that AMD is crap in VR. But if it were me writing the article, the conclusion would be: "Hey, Lucasfilm ILMxLAB, is the final game going to be as badly optimized for my AMD card as it is now in the demo?"

Don't forget that three VR tests have been done so far, and AMD always finishes at the bottom (and of the two cards that are directly comparable, the GTX 1060 and RX 480, the GTX 1060 has delivered a far better VR experience so far), so that probably means something.
Granted, the results in the previous two VR tests aren't as bad as in Trials on Tatooine, but AMD is always at the bottom, far behind NVIDIA.
 
You mean the whole engine - everything - is in that public repo?

Everything except console-specific source for Xbone and PS4, which requires additional access through normal third party licensing. The core engine, as well as PC and mobile platforms for UE4 are effectively open source. The same goes for the new Unreal Tournament, which is elsewhere on Epic's github.
 
5% ;) for the "free license", which kicks in after the first $3,000 made. So far this is the cheapest and best indie game engine out there (tools, support, features, and asset libraries). Unity's support is not as good, and while it has more assets, it also costs more. CryEngine 5 is very attractive, but support is lacking since fewer people use it, and its asset library is nowhere near what the other engines have.

25% was for UE3, which didn't come with the source :)
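For anyone pricing that out, here's the arithmetic as I understand those terms; how the $3,000 threshold applies is my assumption, so double-check the actual license:

```cpp
#include <algorithm>
#include <iostream>

// Back-of-the-envelope royalty math under the terms described above:
// 5% of gross revenue beyond the first $3,000. The threshold handling
// here is an assumption for illustration, not a reading of the license.
int main() {
    const double gross     = 10000.0;   // example gross revenue
    const double threshold = 3000.0;
    const double rate      = 0.05;
    const double royalty   = rate * std::max(0.0, gross - threshold);
    std::cout << "royalty on $" << gross << ": $" << royalty << "\n";  // $350
}
```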
 
Everything except console-specific source for Xbone and PS4, which requires additional access through normal third party licensing. The core engine, as well as PC and mobile platforms for UE4 are effectively open source. The same goes for the new Unreal Tournament, which is elsewhere on Epic's github.

So, unless Epic rejects AMD's commits following some lizard agreement with Jen-Hsun, it's almost entirely AMD's fault.
 
Milkman: While my i3 system downloads Trials on Tatooine, I ran through it on mine and had slightly different (though comparable) results to [H]'s: 6.88ms average render time. I'm showing 0 dropped frames and reprojection only 19 times, but I'm probably cutting my data at a different spot than they are. That vrframes.csv is clunky in Google Sheets. I'm running an MSI 1070 Gaming and a 6700K, everything at stock settings, including the Vive.

Once this thing is done installing I'll bloop up a quick graph for you on the i3/970/8GB RAM and we can see how they compare.
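If you'd rather script it than wrestle the spreadsheet, something like this rough sketch gets the same numbers; note the column layout assumed here (render ms, dropped flag, reprojected flag) is a guess, so match it to the actual vrframes.csv header:

```cpp
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

// Rough sketch: summarize a frame-timing CSV. Assumes (hypothetically)
// three columns per row: render time in ms, dropped (0/1), reprojected (0/1).
int main(int argc, char** argv) {
    std::ifstream in(argc > 1 ? argv[1] : "vrframes.csv");
    std::string line;
    std::getline(in, line);                       // skip the header row
    double totalMs = 0;
    long frames = 0, dropped = 0, reprojected = 0;
    while (std::getline(in, line)) {
        std::stringstream ss(line);
        std::string ms, drop, repro;
        std::getline(ss, ms, ',');
        std::getline(ss, drop, ',');
        std::getline(ss, repro, ',');
        totalMs     += std::stod(ms);
        dropped     += std::stol(drop);
        reprojected += std::stol(repro);
        ++frames;
    }
    if (frames == 0) return 1;                    // empty or missing file
    std::cout << "frames: " << frames
              << "  avg render: " << totalMs / frames << " ms"
              << "  dropped: " << dropped
              << "  reprojected: " << reprojected << "\n";
}
```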
 
Funny how AMD's poor VR performance is AMD's fault, and NVIDIA's poor DX12 scaling is again AMD's fault.
 
So, unless Epic rejects AMD's commits following some lizard agreement with Jen-Hsun, it's almost entirely AMD's fault.

Even if they did reject them for an invalid reason, the record of that would be public. The changes would therefore be available to anyone, and the rejection would bring negative publicity to Epic. The point, I suppose, is that it's in Epic's best interest for AMD to have dedicated engineers working on UE4 (which, not least, would greatly benefit the consoles anyway), and definitely not in their interest to reject changes or technology integrations that AMD wants to include.
 
So I tried high image settings and either a) something's seriously wrong or b) the i3 6100/GTX 970 is a complete miss for VR on "high" in Trials on Tatooine: Repro: 10,855; Dropped frames: 1,842; Avg render: 14.78ms.

I got similar results after 2 runs, and there were some definite issues with the chaperone boundaries when they popped up (flickering, mostly), but that went away in Low mode. I didn't capture frame data for low/medium, but medium felt "fine". On high I had all kinds of issues with latency and got seriously motion sick. I don't have time today to swap the card into my i7 system, but now I'm curious how much of an impact it may have.

I know there's no consistency re: motherboard/cpu, so there could be other factors. I don't think anybody's going to be surprised by an i3/970 combo being "not quite" enough to run it at the highest settings.
 
How was CPU usage on the i3? Were its cores getting maxed out? The fact it handled medium "fine" is a good thing, though.
 
For those wondering what AMD is doing about UE4: CryEngine uses LiquidVR, which should boost performance for AMD GPUs. Also, LiquidVR is open source, so someone go ahead and patch it into UE4.

Sadly, not a single demo or title currently looks to use LiquidVR, so it's kind of hard to benchmark.
 
Sadly, not a single demo or title currently looks to use LiquidVR, so it's kind of hard to benchmark.
I am very much keeping my eyes open to what is going on. The moment we can add a somewhat GPU-intensive VR title that is outside of UE4 and Unity, we will be on top of it. Monday's VR review is going to be interesting.
 
Hopefully you will throw AMD a bone, because they have been getting pummeled by bad performance reviews lately. Not [H]'s fault, only AMD's.
We are staying fair in our VR coverage, but expect no bones to be thrown... ever, to Red or Green.
 
AMD is in control of their own VR destiny and it seems that Raja has put that in the hands of Roy Taylor.
 
How was CPU usage on the i3? Were its cores getting maxed out? The fact it handled medium "fine" is a good thing, though.

It looks like the CPU was basically pegged on high.

And yeah, I'm thinking medium is a good fit, especially for such an inexpensive system. Now if the Vive were $300 cheaper, we could see a solid fit for a broader audience.
 
This is the second brutal AMD fail in a row in UE4. I wonder what grief Sweeney has with AMD that Epic ignores them so much.

But then, I'll play devil's advocate here: the minimum specs for the Rift and Vive are a GTX 970 and a Radeon 290, and they are there for a reason: so that users know they will have a good experience with that hardware. With 80% reprojection on low settings on a 480, this game fails the VR spec.
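To put rough numbers on that (reading "80% reprojection" as 80% of displayed frames being synthesized rather than freshly rendered, which is my assumption):

```cpp
#include <iostream>

// Back-of-the-envelope: at 90 Hz each frame gets ~11.1 ms of GPU time.
// If 80% of displayed frames are reprojected, only 20% are new renders.
int main() {
    const double displayHz  = 90.0;
    const double budgetMs   = 1000.0 / displayHz;      // ~11.1 ms per frame
    const double reproRatio = 0.80;                    // 80% reprojected
    const double freshFps   = displayHz * (1.0 - reproRatio);
    std::cout << "frame budget: " << budgetMs << " ms\n"
              << "freshly rendered: " << freshFps
              << " fps of the 90 shown\n";             // 18 fps of new frames
}
```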
 
This is the second brutal AMD fail in a row in UE4. I wonder what grief Sweeney has with AMD that Epic ignores them so much.

But then, I'll play devil's advocate here: the minimum specs for the Rift and Vive are a GTX 970 and a Radeon 290, and they are there for a reason: so that users know they will have a good experience with that hardware. With 80% reprojection on low settings on a 480, this game fails the VR spec.
I think AMD will need to explain the lack of UE4 VR performance; it's an engine everyone knew would be used in many games. It seems like something AMD should have been on top of in a more serious way. If AMD actually started doing demos again using various engines and working with the developers of those engines, they would probably be in better shape getting VR to perform more consistently. Oh well, VR is in its infancy.
 
It's obvious it's not that the cards can't handle VR; it's the support they have in UE4 games, and it shows. So obviously there needs to be a lot of work done. Can we test a VR game that doesn't use UE4?

Why not test another game to show whether AMD does as poorly when it's not a UE4 game?

Do you plan on testing another VR game that is not UE4? I mean not a demo. Because we all know AMD has its work cut out for it when it comes to UE4.

It shows the games are very well optimized for NVIDIA hardware, and AMD has to do the heavy lifting here when it comes to optimization.
 
It's obvious it's not that the cards can't handle VR; it's the support they have in UE4 games, and it shows. So obviously there needs to be a lot of work done. Can we test a VR game that doesn't use UE4?

Why not test another game to show whether AMD does as poorly when it's not a UE4 game?

Do you plan on testing another VR game that is not UE4? I mean not a demo. Because we all know AMD has its work cut out for it when it comes to UE4.

It shows the games are very well optimized for NVIDIA hardware, and AMD has to do the heavy lifting here when it comes to optimization.

You sure about that?

How about the Unity engine? What The Gallery Ep. 1: Call of Starseed and VR All About - AMD & NVIDIA GPU VR Performance in Call of Starseed

And we looked at Raw Data too, but of course that does not count in AMD land, since while it is a real game that is selling very well, it uses UE4. What is Raw Data and VR All About - AMD and NVIDIA GPU Vive VR Performance in Raw Data

Maybe tomorrow we do Robot Repair, which the SteamVR Performance Test is based on?
 
Bah, Tatooine has a minimum space requirement of 2.5m x 2.5m. I couldn't even start to play it. I got to the "press the green button" part and had to quit because I'd have had to put the controller through my TV to press it. You know, if only the game dev would let me use that thing they call "The Force" that they keep talking about in the movies to push it, maybe it wouldn't need that 2.5x2.5 play area, lol.

Edit: Finally got to see the whole thing. It's really short, but it seems I managed to hit the buttons it wanted me to press by putting my hands over my TV and then behind it, kind of pressing from behind it, haha. Lightsaber FTW :)
 