Rise of the Tomb Raider DX11 vs. DX12 Review @ [H]

FrgMstr

Rise of the Tomb Raider DX11 vs. DX12 Review - Rise of the Tomb Raider recently received a new patch that adds DX12 API support. The patch also adds NVIDIA VXAO ambient occlusion technology, though only under DX11. In this evaluation we will find out whether DX12 currently benefits the gameplay experience and how it impacts certain GPUs.
 
Nice article. Everything DX12 is going to take time to mature; we will obviously see improvements as development continues, but it's a good start.
Going by the 390/390X/970 numbers and how many options were lowered, those cards look more suited to 1080p for this game.
 
While I could be wrong, my take on DX12 is that it puts more on the dev side to optimize the game for AMD and NVIDIA.
With DX12 the graphics driver mainly just opens a pathway to the low-level hardware. My fear with DX12 is that many games, at least starting out, will just do a generic DX12 optimization and not spend time tweaking for each vendor's (AMD/NVIDIA) hardware.

Over time, though, it will probably get better, as I am sure AMD and NVIDIA will help point out where and how to better optimize for each of their hardware.
Devs just have to work with each one.

PS: Thumbs up for doing the review without FPS; it still was/is very informative.
 
One comment on the format:

Work the resolution into all of the tables, or present the table that has them first, as it is supposed to provide quick info on where you guys found the limit of what was acceptably playable at which settings.
 
While I could be wrong, my take on DX12 is that it puts more on the dev side to optimize the game for AMD and NVIDIA.
With DX12 the graphics driver mainly just opens a pathway to the low-level hardware. My fear with DX12 is that many games, at least starting out, will just do a generic DX12 optimization and not spend time tweaking for each vendor's (AMD/NVIDIA) hardware.

Over time, though, it will probably get better, as I am sure AMD and NVIDIA will help point out where and how to better optimize for each of their hardware.
Devs just have to work with each one.

PS: Thumbs up for doing the review without FPS; it still was/is very informative.


It's like DX9, DX10, and DX11 all over again. But in this case game devs will also need to optimize their games for DX12, not just the drivers, especially if they want to exploit DX12's full potential.
 
The main issue is that a lot of the DX12 implementations right now are "tack-ons" - as in, the game wasn't designed with DX12 in mind, but they are adding support to do a couple of things and get on the hype train.

We won't really know what DX12 has under the hood until we start getting a run of games built with DX12 in mind - from what I've heard, there are a lot of low-level calls and optimizations, and to get the most out of DX12 you need to be programming for it from the beginning.

Seems like DX12 is a tradeoff - more complicated right now than DX11 to implement, due to a lot of interfacing directly with the hardware, but because of that low-level interaction it gives game creators a much larger degree of control and can push games harder without framerate loss.

I don't think we will be seeing any of those true DX12 games until later in the year.
 
The encouraging sign from the dev in this case is that they've said pretty clearly that they aren't done optimizing for DX12, so it will be interesting to see how much more performance they can achieve.
 
The encouraging sign from the dev in this case is that they've said pretty clearly that they aren't done optimizing for DX12, so it will be interesting to see how much more performance they can achieve.
Well, let's see exactly what the devs did say, instead of a paraphrase in your own words.

Jurjen Katsman, Studio Head at Nixxes Software

[Our developer blogs lift the curtain on the creation of Lara's first tomb raiding expedition, and the technology we use to constantly improve it. Following the release of Rise of the Tomb Raider for PC, the title will be one of the first in the industry to integrate DirectX 12 support, allowing fans with older PCs or newer rigs to run at higher framerates and higher graphical settings. Nixxes Studio Head Jurjen Katsman dives deep into the new technology below.]

Pushing the boundaries of technology on PC has always been a passion of the development team at Nixxes, and Crystal Dynamics and Square Enix have been great partners for us in doing so. One of the challenges with PC development is guaranteeing that players on as many different PC configurations as possible can have a great experience. For us this means ensuring that users with older PCs can still get a great gameplay experience, but also that users with higher-end machines can get the most out of their hardware, including the highest quality visuals, frame-rate, and other technical enhancements.

One thing we are very excited about to help us further realize those goals is the new DirectX 12 graphics API that is available on Windows 10. In the patch released today on Steam – and coming soon to the Windows 10 Store – we will be adding DirectX 12 support to Rise of the Tomb Raider.

At Nixxes we have a long history of working with consoles as well, and one of the large differences between developing for consoles and developing for PCs is the level of access to the hardware available to us. We can leverage every single hardware feature and every bit of CPU power available in the most efficient way possible. With DirectX 12 we are taking a massive step forward in bringing a lot of that flexibility to the PC as well. For Rise of the Tomb Raider the largest gain DirectX 12 will give us is the ability to spread our CPU rendering work over all CPU cores, without introducing additional overhead. This is especially important on 8-core CPUs like Intel i7's or many AMD FX processors.

Let me explain how this helps the performance of your game. When using DirectX 11, in situations where the game is under heavy load – for example in the larger hubs of the game – the individual cores may not be able to feed a fast GPU like an NVIDIA GTX 980 or even NVIDIA GTX 970 quickly enough. This means the game may not hit the desired frame-rate, requiring you to turn down settings that impact CPU performance. Even though the game can use all your CPU cores, the majority of the DirectX 11 related work is all happening on a single core. With DirectX 12 a lot of the work is spread over many cores, and the framerate the game will run at can be much higher for the same settings. Check out the picture below for a visual example of how the CPU work is distributed:

[Image: CPU core utilization with DirectX 11 vs. DirectX 12]
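For readers curious what "spreading the CPU rendering work over all cores" looks like in code, here is a minimal, hypothetical D3D12 sketch (my own illustration, not Nixxes' engine code): each worker thread records draws into its own command list, and only the final submission is serialized.

```cpp
// Hypothetical sketch of D3D12 multithreaded command-list recording.
// In DX11, this work would funnel through a single immediate context.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordAndSubmitFrame(ID3D12Device* device, ID3D12CommandQueue* queue,
                          unsigned workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < workerCount; ++i) {
        // Each thread gets its own allocator; allocators are not thread-safe.
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&lists, i] {
            // ... record this worker's slice of the scene's draw calls ...
            lists[i]->Close(); // recording happens in parallel across cores
        });
    }
    for (auto& w : workers) w.join();

    // Submission is the only serialized step.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```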

As an example to illustrate the point, below is a screenshot of a scene in the game running on an Intel i7-2600 processor with 1333MHz memory, paired with a GTX 970. Using DirectX 11 at High settings we would only get 46 fps.

[Image: DirectX 11, High settings, 46 fps]

Now look at the same location using the new DirectX 12 implementation: we can lift it up to 60!

[Image: DirectX 12, same scene, 60 fps]

The above advantage we feel is the most important one for Rise of the Tomb Raider, but there are many more advantages that make us excited about DirectX 12. Another big feature, which we are also using on Xbox One, is asynchronous compute. This allows us to re-use GPU power that would otherwise go to waste, and do multiple tasks in parallel. And there is a never-before-seen level of control over NVIDIA SLI and AMD CrossFireX configurations, which means that as developers we can take full control over those systems and ensure users get a great experience with them.
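In API terms, asynchronous compute boils down to creating a second, compute-type command queue alongside the graphics queue, so compute work can fill gaps while the graphics queue is busy. A bare-bones sketch (assumed for illustration, not taken from the game's code):

```cpp
// Hypothetical sketch: a separate D3D12 compute queue for async compute.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12CommandQueue> CreateComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // distinct from DIRECT (graphics)
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    // Work submitted here can overlap with the graphics queue; where the two
    // must meet, an ID3D12Fence signaled on one queue and waited on by the
    // other keeps them ordered.
    return queue;
}
```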

Being one of the first game titles out there using DirectX 12, there are still many more optimizations to make, and DirectX 11 remains available for the most predictable and proven experience. However, as seen above there are large gains to be found already, and we encourage you to check out DirectX 12 for yourself in our latest patch!
 
DX12 looks like a red herring for CPU optimisation with this game.
Using DX11, my 6700K @ 4.6GHz with HT on shows all cores heavily used with a clocked 980 Ti, averaging around 91fps @ 1080p with the benchmark.
(All settings maxed, MSAA, no motion blur... can't check if anything else has changed because their crap servers don't work!)
What can DX12 improve here?


I am getting pissed off with this game.
Every day before I can play, it insists I go to their server, copy a code into a box, get the return code, paste that into the game's box, and I'm away.
That's annoying enough.
Today, I entered the code and waited for the return code, and their website did nothing.
FFS, they'd better stop this crap or I won't buy their next game.
 
DX12 looks like a red herring for CPU optimisation with this game.
Using DX11, my 6700K @ 4.6GHz with HT on shows all cores heavily used with a clocked 980 Ti, averaging around 91fps @ 1080p with the benchmark.
(All settings maxed, MSAA, no motion blur... can't check if anything else has changed because their crap servers don't work!)
What can DX12 improve here?


I am getting pissed off with this game.
Every day before I can play, it insists I go to their server, copy a code into a box, get the return code, paste that into the game's box, and I'm away.
That's annoying enough.
Today, I entered the code and waited for the return code, and their website did nothing.
FFS, they'd better stop this crap or I won't buy their next game.

Yeah, the pirates don't have to deal with this issue. Just another reason DRM hurts the paying customers. At this point, they should just patch those checks out, since the game has been out for a while.
 
Tomb Raider just doesn't hit the CPU hard enough to get a good feel for DirectX 12's supposed benefits. It's basically the same thing: test with a slower CPU, or write an application that requires more CPU than is available. Until then, who knows? I do think testing on slower CPUs, even in a GPU test, will be interesting just as additional data points, to see whether DirectX 12 is even doing what it claims or not.
 
Tomb Raider just doesn't hit the CPU hard enough to get a good feel for DirectX 12's supposed benefits. It's basically the same thing: test with a slower CPU, or write an application that requires more CPU than is available. Until then, who knows? I do think testing on slower CPUs, even in a GPU test, will be interesting just as additional data points, to see whether DirectX 12 is even doing what it claims or not.
It does if you get a high enough framerate.
With a 6600K @ 4.7GHz in DX11, the last two levels of the demo maxed all cores out for a few seconds and made the first few seconds of those levels jerky.
With a 6700K @ 4.6GHz in DX11, the CPU maxes out on four cores and the other cores sit at around 98% for a split moment. It no longer jerks at the start of the levels.

This is averaging 91fps with a clocked 980 Ti.
 
Tomb Raider just doesn't hit the CPU hard enough to get a good feel for DirectX 12's supposed benefits. It's basically the same thing: test with a slower CPU, or write an application that requires more CPU than is available. Until then, who knows? I do think testing on slower CPUs, even in a GPU test, will be interesting just as additional data points, to see whether DirectX 12 is even doing what it claims or not.

I can't play it.
Instead of having a nice time chilling and gaming for an hour, I've been trying to fix their stupid problem and swearing.
Fucking idiots.
 
Based on the info Kyle just posted from the devs, to get a true feel for the DX12 benefits you need a lower-powered CPU/GPU combo, or even lower RAM speeds. So people with high-end and enthusiast-grade systems are going to see little benefit other than visual improvements, but people like myself with lower-to-mid-range systems should see a good improvement. Or did I interpret that wrong?
 
I disagree with the [H] stance that frame rates are finally unimportant, as they have been predicting for 13 years. I believe that frame rates are more important today than ever. It has been shown that frame rates above 90Hz are critically important to VR immersion and to preventing motion sickness. Those frame-rate-over-time plots will be essential for actual developers as they optimize their games to ensure nothing causes rates to dip below 90Hz. Consumers will need to know which hardware and game combinations will maintain 90+Hz.

If there was ever a time that frame rates were important, it is now. They are no longer a subjective, or snobby, issue. They are essential to the gaming experience.
 
Based on the info Kyle just posted from the devs, to get a true feel for the DX12 benefits you need a lower-powered CPU/GPU combo, or even lower RAM speeds. So people with high-end and enthusiast-grade systems are going to see little benefit other than visual improvements, but people like myself with lower-to-mid-range systems should see a good improvement. Or did I interpret that wrong?
This is exactly what I took from the dev's comments.
 
I disagree with the [H] stance that frame rates are finally unimportant, as they have been predicting for 13 years. I believe that frame rates are more important today than ever. It has been shown that frame rates above 90Hz are critically important to VR immersion and to preventing motion sickness. Those frame-rate-over-time plots will be essential for actual developers as they optimize their games to ensure nothing causes rates to dip below 90Hz. Consumers will need to know which hardware and game combinations will maintain 90+Hz.

If there was ever a time that frame rates were important, it is now. They are no longer a subjective, or snobby, issue. They are essential to the gaming experience.

With VR, possibly this is the case. I don't know; I haven't used a single unit, ever. But on a monitor, FPS is not everything. Some games I can play for hours at 60FPS (T-Raider 2013, GTAV, Grid: AS), but others, like the AC series, Dirt Rally, and MOST first-person games, give me a headache and eyestrain running at 60FPS. If it's lower, around 45ish, no headache/eyestrain. So it has to be the smoothness of those frames (frame time?); don't know why.

And that 90Hz is combined, it's 45 per eye. Is it not?
 
With VR, possibly this is the case. I don't know; I haven't used a single unit, ever. But on a monitor, FPS is not everything. Some games I can play for hours at 60FPS (T-Raider 2013, GTAV, Grid: AS), but others, like the AC series, Dirt Rally, and MOST first-person games, give me a headache and eyestrain running at 60FPS. If it's lower, around 45ish, no headache/eyestrain. Don't know why. And that 90Hz is combined, it's 45 per eye. Is it not?

Raw monitor FPS is not an issue for most people, except in cases of tearing. There is a percentage of people who get headaches at refresh rates below 60Hz.

I think that frame consistency should always be the more heavily weighted factor on 2D monitors. Frame dips, stalls, and tears are much more objectively bad than 45Hz vs. 60Hz.
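To make that concrete, here is a toy sketch (invented numbers, nothing measured from the game) showing how two runs with an identical average framerate can have very different worst-case frame times, which is exactly what a plain FPS average hides:

```cpp
// Toy example: identical average FPS, very different frame consistency.
#include <algorithm>
#include <cstdio>
#include <vector>

double PercentileMs(std::vector<double> frameTimesMs, double pct)
{
    std::sort(frameTimesMs.begin(), frameTimesMs.end());
    size_t idx = static_cast<size_t>(pct * (frameTimesMs.size() - 1));
    return frameTimesMs[idx];
}

int main()
{
    // Both hypothetical runs average 16.7 ms per frame (~60 fps)...
    std::vector<double> runA(100, 16.7);   // perfectly smooth
    std::vector<double> runB(95, 14.0);
    runB.insert(runB.end(), 5, 68.0);      // ...but B has 5 big hitches

    std::printf("run A 99th percentile: %.1f ms\n", PercentileMs(runA, 0.99));
    std::printf("run B 99th percentile: %.1f ms\n", PercentileMs(runB, 0.99));
    // A reports ~16.7 ms; B reports 68.0 ms. Same average, worse experience.
}
```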
 
I disagree with the [H] stance that frame rates are finally unimportant, as they have been predicting for 13 years. I believe that frame rates are more important today than ever. It has been shown that frame rates above 90Hz are critically important to VR immersion and to preventing motion sickness. Those frame-rate-over-time plots will be essential for actual developers as they optimize their games to ensure nothing causes rates to dip below 90Hz. Consumers will need to know which hardware and game combinations will maintain 90+Hz.

If there was ever a time that frame rates were important, it is now. They are no longer a subjective, or snobby, issue. They are essential to the gaming experience.
Frame rates are very important to the gameplay experience. We do not believe that we have to have access to those to be able to make a determination as to which cards perform better today on desktop monitors. As far as framerate pertains to VR, our statements here have nothing to do with that, and I am unsure as to why you think they would. We are very specific about what we are covering in our reviews and what those reviews cover.
 
With VR, possibly this is the case. I don't know; I haven't used a single unit, ever. But on a monitor, FPS is not everything. Some games I can play for hours at 60FPS (T-Raider 2013, GTAV, Grid: AS), but others, like the AC series, Dirt Rally, and MOST first-person games, give me a headache and eyestrain running at 60FPS. If it's lower, around 45ish, no headache/eyestrain. So it has to be the smoothness of those frames (frame time?); don't know why.

And that 90Hz is combined, it's 45 per eye. Is it not?

90 per eye.

Sorry, I missed the "90Hz per eye" question.

Barring frame-interpolation wizardry, the OR and Vive will need to receive a 90Hz or 120Hz signal at 2400x1080. Both eyes must simultaneously receive the same number of frames, and that needs to be a 90Hz minimum. That's a 25% increase in pixels over the 1920x1080 that is typical today, and at frame rates that are nearly unheard of.

This is the reason that many VR games look like fancy Wii games. The hardware to run most modern games in VR does not exist and DX12 is unlikely to change that.
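For the curious, here is the raw pixel-throughput arithmetic behind that claim, using the figures above (a quick back-of-the-envelope check, nothing more):

```cpp
// Back-of-the-envelope pixel throughput, using the numbers quoted above.
#include <cstdio>

int main()
{
    const double vrPixelsPerSec  = 2400.0 * 1080 * 90; // VR: 2400x1080 @ 90Hz
    const double stdPixelsPerSec = 1920.0 * 1080 * 60; // typical 1080p @ 60Hz
    std::printf("VR: %.0f pixels/s, about %.2fx a 1080p60 workload\n",
                vrPixelsPerSec, vrPixelsPerSec / stdPixelsPerSec);
    // Prints roughly 1.88x: a quarter more pixels at half again the framerate.
}
```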
 
Sorry, I missed the "90Hz per eye" question.

Barring frame-interpolation wizardry, the OR and Vive will need to receive a 90Hz or 120Hz signal at 2400x1080. Both eyes must simultaneously receive the same number of frames, and that needs to be a 90Hz minimum. That's a 25% increase in pixels over the 1920x1080 that is typical today, and at frame rates that are nearly unheard of.

This is the reason that many VR games look like fancy Wii games. The hardware to run most modern games in VR does not exist and DX12 is unlikely to change that.
It's not unrealisable by any stretch.
The total res is less than 1440p.

For playing current games not designed for VR, high-end games' quality settings will need to be reduced; such is life.
But lower-graphics-quality games and games designed for the platform will be built to utilise current hardware effectively.
The immersion factor is so great that slightly lower graphics quality has little effect.

Anyway, enough, this is a Tomb Raider thread :p
 
While the DX12 performance is not exactly optimized, I would say Tomb Raider is more optimized than GoWUE. The hack-job DX12 implementation they did there was so bad they didn't even care to announce the release of the game. I just hope it gets fixed one of these days.
 
Meaning you don't need to see FPS (OSD) as long as it plays smoothly? If 45FPS is just as smooth as 60FPS with every setting the same, who cares which brand it is? As long as there are no massive drops or stuttering, the game experience would be the same. Right?

That is the whole point of their testing method: to play it and see what is good enough.
It's not that 45fps is as good as 60fps; it's about "what is good enough" to give good gameplay.
If you aren't happy with anything but 60fps+, then you will have to find out for yourself, or extrapolate from their findings.
 
This is interesting. As a community we had assumed that DX12's advantage over DX11 would be its ability to help lower-end systems, BUT we didn't expect to see it hurting higher-end systems. Hopefully this won't stay true in the future; it would suck to have to make sure you were running the "right" DX version again.
 
back on topic



Meaning you don't need to see FPS (OSD) as long as it plays smoothly? If 45FPS is just as smooth as 60FPS with every setting the same, who cares which brand it is? As long as there are no massive drops or stuttering, the game experience would be the same. Right?

There have been instances where AMD had a major performance lead over NVIDIA in certain titles, but with the massive dips (or stuttering), the AMD product was deemed the lesser product, as the NVIDIA card provided a smoother gameplay experience. This is what Kyle refers to as "overall gameplay", rather than just relying on numbers to determine the better product.

If the numbers don't look like shit but the gameplay is shit, it's still shit.
 
That's all I was trying to clarify/point out. FPS is important, but not as important as smoothness/playability.
 
back on topic

Meaning you don't need to see FPS (OSD) as long as it plays smoothly? If 45FPS is just as smooth as 60FPS with every setting the same, who cares which brand it is? As long as there are no massive drops or stuttering, the game experience would be the same. Right?

To put it very simply, yes. We have seen instances in the past where, had you just looked at the framerate data, Card X would have been "faster" than Card Y. But when you play the game, you find that Card Y gives a much better gameplay experience. This was one of the reasons that NVIDIA invented its FCAT system: NVIDIA wanted to be able to quantify HardOCP results.
 
This is interesting. As a community we had assumed that DX12's advantage over DX11 would be its ability to help lower-end systems, BUT we didn't expect to see it hurting higher-end systems. Hopefully this won't stay true in the future; it would suck to have to make sure you were running the "right" DX version again.

If history has anything to do with it, you will still have to pick the "right" DirectX version. Unless you can afford a build like Kyle's.
 
If history has anything to do with it, you will still have to pick the "right" DirectX version. Unless you can afford a build like Kyle's.

Well, I'm not talking about the actual DX version stored on your PC; as it stands, your OS should have the latest available no matter what. What I meant was the game patch's DX version. Sorry if I was vague.
 
I just tried the 3D option, which uses half-res side-by-side.
Despite each eye getting half the resolution, performance almost halved.
It looks very good; the 3D is of the highest quality. Shame it doesn't perform anywhere near as well.
(It hangs around 50fps on a clocked 980 Ti, but I like 60 :))
One setting changed: Ambient Occlusion now only offers On or Off.

Don't know if I can be bothered to install 3D Vision and the driver bits.
 
I would definitely love to see how an old CPU like mine (i7-920 at stock) paired with my video card (GeForce 970) would handle the game under DX12 vs. DX11. I can play Far Cry 4 maxed out with all the GameWorks stuff turned on at 1920x1080 with this combination very smoothly, so I know that the CPU is STILL good for games at that resolution. I was hoping that with DX12 I could delay a CPU/motherboard upgrade even more!
 
Buckle up for GameWorks: The Next Generation.

With DX12, you have to optimize for every card yourself as a game dev, or at least engine developers do. How many can reasonably do that? How many would happily turn to a library provided by a card IHV that purports to almost-kinda fix this issue and save themselves a ton of work?

I really don't think most devs are up to the task of optimizing for a significant number of cards. It's a lot of work. Yes, closer to the metal can be great, but when that metal varies, it's a problem, and you look for higher-level abstractions, not unlike DX11.
 
No desire to reinstall this game. I'm still bummed they made Lara have a fat ugly face compared to the last Tomb Raider.

[Image: Lara's face in Rise of the Tomb Raider vs. the previous Tomb Raider]
 
I think this was the best review of a game that I've ever read from the [H]ardOCP crew. The only thing I'd add to help the younger folks with ADHD is some type of visual denoting whether a card is top tier or one setting down from the top in a category. Green, you're great; yellow is one setting down; red, you failed. Maybe a list of possible settings? Just trying to help the non-readers skim faster. Personally, I like reading the whole thing a few times through.

Including the canned benchmark is awesome. Can't wait for the card that gets 100 fps in the canned benchmark but stutters like an SOB in the regular game. That will be the sh*tstorm of the year.
 