Rise of the Tomb Raider DX11 vs. DX12 Review @ [H]

That's because one person's maximum playable settings aren't playable for someone else.

Someone might like 40-50 fps with maximum eye candy; I prefer the 80-120 fps range, even at the cost of settings.
Then I suggest you just stop typing right now and go somewhere else if you do not like the way we do GPU reviews. We have only been doing this for 13 years. If you don't like the way we are doing that, I get it. I am aware of your opinion. Please feel free to go post it elsewhere.
 
That's because one person's maximum playable settings aren't playable for someone else.

Someone might like 40-50 fps with maximum eye candy; I prefer the 80-120 fps range, even at the cost of settings.

You are the people he's talking about. You only see the FPS numbers and draw conclusions from them; Brent's reviews focus on the overall enjoyment of the game and tax it to the max before it becomes unenjoyable.

It's not about him finding a sweet spot for FPS numbers; he's finding a sweet spot to enjoy the game without worrying about a number.
 
Yeah! We can shed some light on the "DX12 is AMD's magic bullet!" myth!

The answer is: it's nothing different than with NVIDIA!

Though I will say, it sucks that it doesn't run better *yet*. I'm sure once we get a native DX12 game we will get a better view of the new benefits; it looks like grandfathering in an existing engine isn't enough to use DX12 properly.
 
The chart is perfect the way it is. I don't own the game but know enough about video features that I followed it easily. Every option side by side, easy. Also, they had previously linked to NVIDIA's graphics feature article like they usually do if one is available. So if someone didn't know what a feature was or how it affects performance, they could look there.

Edit: They are also finding the game's sweet spot for the average user, not the NEED MAX FPS MORE FPS MORRRREEE FPS guys...
 
I think you can still use Fraps to record fps for DX12 games. The on-screen fps display doesn't work, but it's able to generate benchmark files.
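If you do go the benchmark-file route, the numbers are easy to crunch yourself. Here is a minimal sketch, assuming the usual Fraps "frametimes" CSV layout (frame index plus a cumulative timestamp in milliseconds); the file name is just a placeholder.

```python
# Sketch: summarize a Fraps "frametimes" CSV (assumed columns: frame index,
# cumulative time in ms). File name and exact column layout are assumptions.
import csv

def summarize_frametimes(path):
    times_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            times_ms.append(float(row[1]))
    # Per-frame durations from the cumulative timestamps
    frame_ms = [b - a for a, b in zip(times_ms, times_ms[1:])]
    frame_ms.sort()
    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)
    # "1% low": average FPS over the slowest 1% of frames
    worst = frame_ms[-max(1, len(frame_ms) // 100):]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

if __name__ == "__main__":
    avg, low = summarize_frametimes("frametimes.csv")  # placeholder file name
    print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
```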
 
Thanks for the review. Well, I will stick with Win 7 64 for a while longer then. I have been waiting for reviews like this one. When games with real DX12 optimizations start coming out, especially one I want, then I'll move to Windows 10.

For now I'm perfectly happy with DX11 and Windows 7.
 
It's a test patch; they only just started to implement DX12 and it looks like they have their work cut out for them.

Ashes of the Singularity on Steam <- Releasing tomorrow with a full campaign along with multiplayer.

This is DX12 done right: performance improvements across the board vs. DX11.
 
Wasn't DX12 supposed to put most of those optimization techniques on the game-dev side of the equation? So why are we automatically blaming the inferior performance on NVIDIA and AMD?
 
"For Rise of the Tomb Raider the largest gain DirectX 12 will give us is the ability to spread our CPU rendering work over all CPU cores, without introducing additional overhead. This is especially important on 8-core CPUs like Intel i7’s or many AMD FX processors."


That claim from them was BS, as HT on my 4770K actually hurts performance a little instead of helping it. In fact, the more CPU-intensive I make the game (running at low resolution), the more HT hurts. It is only a few fps, but the point is that DX12 is not doing what they claimed it would.
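For anyone who wants to check the "spread across all cores" claim on their own box, a rough way is to log per-logical-core load while the benchmark runs. A minimal sketch, assuming the third-party psutil package is installed; the interval and duration are arbitrary.

```python
# Sketch: log per-logical-core CPU load while the benchmark runs, to see
# whether the DX12 path really spreads render work across all cores/threads.
# Requires the third-party psutil package; interval and duration are arbitrary.
import time
import psutil

def log_core_usage(duration_s=60, interval_s=1.0):
    end = time.time() + duration_s
    while time.time() < end:
        # One percentage per logical core (so 8 values on a 4-core + HT CPU)
        per_core = psutil.cpu_percent(interval=interval_s, percpu=True)
        line = " ".join(f"{load:5.1f}" for load in per_core)
        print(f"{time.strftime('%H:%M:%S')}  {line}")

if __name__ == "__main__":
    log_core_usage()
```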
 
"For Rise of the Tomb Raider the largest gain DirectX 12 will give us is the ability to spread our CPU rendering work over all CPU cores, without introducing additional overhead. This is especially important on 8-core CPUs like Intel i7’s or many AMD FX processors."


That claim from them was BS as HT on my 4770k actually hurts performance a little instead of helping it. In fact the more cpu intensive I make that game (running low res) the more that HT hurts. It is only few fps but the point is that DX12 is not doing what they claimed it was supposed to do.
That's at odds with my findings, though with a different CPU.

I ran a 6600K at 4.7GHz and got stuttering at the start of 2 levels in the benchmark due to being CPU limited.
All 4 cores were maxed at 2 specific points.

With a 6700K at 4.6GHz, it is completely smooth.
Four threads max out for a little less time now and the other four nearly max out at the same point.

Despite being a slightly slower CPU, HT did help.


Something worth checking for yourself, as it may help. It could make a difference on your CPU.
I tested MSI (Message Signaled Interrupts) on my 980 Ti and got the following results.
Windows: Line-Based vs. Message Signaled-Based Interrupts ... - Guru3D.com Forums
All settings maxed with VXAO and MSAA, no Motion Blur, no Film Grain.

84.19 fps - no MSI on 6600K - very jerky at start of 2 levels
89.66 fps - MSI on 6600K - a lot less jerky, but still jerked a bit

90.87 fps - no MSI on 6700K - smooth
91.00 fps - MSI on 6700K - smooth

For this game, DX12 cannot help me at 60fps.
If it reduces CPU use, it should benefit playing Tomb Raider on a VR headset at 90fps.
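For reference, the Guru3D method linked above boils down to a per-device registry value. A minimal, Windows-only sketch for checking it, assuming the usual key layout from that guide; the device instance path is a placeholder you would replace with your GPU's entry from Device Manager.

```python
# Sketch (Windows-only): check whether MSI mode is enabled for a GPU by reading
# the MSISupported value under the device's Interrupt Management key.
# The device instance path below is a placeholder, not a real device.
import winreg

DEVICE_INSTANCE = r"PCI\VEN_XXXX&DEV_XXXX\instance-id"  # placeholder

def msi_supported(device_instance):
    key_path = ("SYSTEM\\CurrentControlSet\\Enum\\" + device_instance +
                "\\Device Parameters\\Interrupt Management"
                "\\MessageSignaledInterruptProperties")
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
            value, _ = winreg.QueryValueEx(key, "MSISupported")
            return bool(value)
    except FileNotFoundError:
        return False  # key absent usually means line-based interrupts

if __name__ == "__main__":
    print("MSI enabled:", msi_supported(DEVICE_INSTANCE))
```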
 
The effectiveness of MSI is largely dependent on both your hardware and software configuration. Besides that, I think that if video cards worked better using MSI instead of line-based (LB) interrupts, then the video card drivers released by both AMD and NVIDIA would have it enabled by default. In my case, MSI made most games on my PC stutter with worse FPS versus LB.
 
Nice review. I don't know what all the complaining is about. The numbers speak for themselves. I kinda like the fact that [H] does all the time-consuming work of tweaking the settings for the best visual experience while staying playable. Maybe I missed it, but DX12 didn't improve IQ in this game (fps aside). That said, spreading the load across more cores should in theory help lower-clocked CPUs, assuming the additional latency incurred is negligible. Clearly it didn't. For the GTX 970 there was a drop in performance. My takeaway here is that developers are still learning the performance impact of the API. Being able to have better core utilization doesn't mean you should.
 
The GTX 970 may have suffered when exceeding 3.5GB of VRAM.
No other card has that little fast workspace.
ROTTR is pretty intense in VRAM use.
At 1080p I see very close to 6GB on my 980 Ti. It may not need that much to remain fluid, but with practically half the RAM at close to twice the resolution, it looks feasible.
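If anyone wants a second opinion besides an overlay, polling nvidia-smi works too. A minimal sketch, assuming nvidia-smi is on the PATH; it reads only the first GPU.

```python
# Sketch: poll GPU memory use once a second via nvidia-smi while playing,
# instead of eyeballing an overlay. Assumes nvidia-smi is on the PATH.
import subprocess
import time

def vram_used_mib():
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    first_gpu = result.stdout.strip().splitlines()[0]  # first GPU only
    used, total = (int(x) for x in first_gpu.split(","))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = vram_used_mib()
        print(f"VRAM: {used} / {total} MiB")
        time.sleep(1)
```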
 
I think a lot of you guys are missing that even in DX11 mode, this game is spreading work across all cores.
 
Yep, it makes heavy use of up to 8 threads in DX11 on a 4-core + HT CPU.
It scales the number of threads used based on workload as well.
 
I think you can still use Fraps to record fps for DX12 games. The on-screen fps display doesn't work, but it's able to generate benchmark files.

You are correct: capturing the FPS using AIDA64 logging with Fraps in Rise of the Tomb Raider DX12 works. AIDA64 Extreme can record the Fraps FPS. The logging feature was set for 0.5-second intervals. It can capture more than just FPS; at each interval mark it can record CPU speed/usage per core, GPU speed, VRAM usage, dynamic memory usage, and the list goes on. I saved it to a CSV file (Excel) which I can convert to graphs, etc. AIDA64 can record CPU/GPU/other temperatures at the exact same intervals. It does not appear to affect the performance of the ROTTR benchmark in the game.

I understand Brent's stance on the real playability of the gameplay experience, but without any kind of empirical data collected, confirmed, and evaluated it would be hard to accept. In other words, 80% real experience backed up by 20% data; they complement each other, and I would hope the emphasis on the experience will be maintained.
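As a rough example of turning one of those CSV exports into a graph without Excel, here is a minimal sketch; the file name and the "FPS" column header are assumptions, so adjust them to whatever your AIDA64 logging profile actually writes.

```python
# Sketch: turn an AIDA64 log export into a simple FPS-over-time graph.
# The file name and the exact column header ("FPS") are assumptions.
import csv
import matplotlib.pyplot as plt

def load_column(path, column="FPS"):
    values = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                values.append(float(row[column]))
            except (KeyError, ValueError):
                continue  # skip rows without a usable value
    return values

if __name__ == "__main__":
    fps = load_column("rottr_dx12_log.csv")          # placeholder file name
    seconds = [i * 0.5 for i in range(len(fps))]     # 0.5 s logging interval
    plt.plot(seconds, fps)
    plt.xlabel("Time (s)")
    plt.ylabel("FPS")
    plt.title("ROTTR DX12 benchmark - FPS over time")
    plt.savefig("rottr_fps.png")
```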
 
I am getting pissed off with this game.
Every day before I can play, it insists I go to their server, copy a code into a box, get the return code, paste that into the game's box, and I'm away.
That's annoying enough.
Today, I entered the code, waited for the return code, and their website did nothing.
FFS, they'd better stop this crap or I won't buy their next game.

LOL, that's Ubisoft for you and why I have been boycotting them for the past few years. They are a dbag company and will never see another penny from me - ever.
 
Aren't they the publisher? I see it on their front store page for $90.00 CAD, which is ten bucks more than everywhere else, so I figured they published it.
 
You have sunk to a new low. Thanks for your opinions though, you can go back to sharing political memes on Facebook now.
 
Brent,
any idea what caused the performance increase for NVIDIA in DX11 with the March update for this game?
Back in February, build 610.1_64 seemed better optimised for AMD using DX11, with each of its tiers consistently outperforming NVIDIA.
With this latest build that has changed, with NVIDIA now in front by a noticeable margin at each tier.

Is this to do with VXAO being implemented using the DX11.3 equivalent of conservative rasterization, something else, or a mixture of CR and other improvements?
Interested because it would be good to know if CR is improving performance in a real-world game for NVIDIA (and that would then probably apply to Intel Skylake as well).
I am focusing on DX11 as it seemed very well optimised even for AMD here, and it is an area where one would usually expect NVIDIA to beat AMD when a game is optimised for both - the February results showed the trend of some games now being better on AMD, or at least very well optimised for it (IMO that comes back to console influence).

Thanks
 
I'm starting to think any game developer that gets into bed with NVIDIA is going to see less than stellar DX12 results. It seems to me that any titles without interference from NVIDIA see marked improvements in DX12 over DX11 (assuming the hardware can actually do DX12 properly). I won't be buying this game (and before you call me an AMD fanboy, I have multiple gaming rigs with a mixture of Intel, NVIDIA and AMD gear). I'm not impressed at all.
 
Not really true; Hitman was an AMD partnership for DX12, and look at their DX12 performance in that game here.
hardocp: DX11 v DX12 AMD - Hitman 2016 Performance Video Card Review
We are off-topic, so better to take this to another thread, otherwise we risk our responses being deleted.
Cheers
 
The game plays great in DX11, so DX12 is a non-issue. It is a very fun game that looks spectacular.
 
Quick question on VRAM usage. I'm using a 4790K at 4.5GHz, 32GB of DDR3 @ 2400, and 2 OC MSI 980 Ti Goldens in SLI. I'm playing on an Acer Predator X34 at 3440x1440. Now, I was playing with everything maxed out including hair and textures. Before, I was playing on just 2 980s with textures on High and hair on "On". I upgraded to the Tis thinking I could run textures on Very High and hair on Very High, but last night the game/drivers crashed and gave me a message saying I ran out of video memory. Sure enough, when I checked the VRAM usage in Afterburner it was spiking past 6GB in the geothermal section standing near the waterfalls.

Just sucks that with two fucking 980 Tis I still can't run Very High textures. :( I'm guessing it's the resolution that's still too high? I didn't expect it to run at 4K, but at my resolution I thought I could crank everything to max. Oh well, it's still very pretty with only High textures and everything else on max.
 

Two things are at play: the wider aspect ratio puts more objects/assets on screen, such as textures and shaders, which have to be rendered, so more video RAM is needed, plus it is a higher resolution than 1440p. My Nano chokes (as does my 290X/290 CFX combo) using Very High textures. The good thing is the difference between High and Very High is not much, if anything, at 1440p. I think you need a higher resolution like 4K on a very good monitor to start seeing the difference.
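Just to put rough numbers on the resolution part of that (render targets only, the texture pool is a separate thing), here is the pixel-count arithmetic:

```python
# Rough arithmetic only: how render-target pixel counts (and hence the
# resolution-dependent part of VRAM use) scale between the resolutions
# mentioned above. Texture pools are separate and dominate at Very High.
RESOLUTIONS = {
    "2560x1440": (2560, 1440),
    "3440x1440 (ultrawide)": (3440, 1440),
    "3840x2160 (4K)": (3840, 2160),
}

base = 2560 * 1440
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x vs 2560x1440")
```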
 
Been playing with DX12 with no problems until today. I have had two hard lockups. Went back to DX11 and the problem has stopped. Guess all the bugs aren't worked out.

Seems that the newest driver from NVIDIA has cured the problem.
 
Just got this game, but from what I am reading, it seems like DX11 is still the way to go at the moment. I don't have two different GPUs, so no benefit there either...
 
I play with a 970 with textures on High and the game still uses nearly 4GB. Tried using Very High, but there are too many slowdowns.
 
Looks like only cards with 6GB and up need apply for Very High textures.
 
DX12 runs far better than DX11 here; it's smoother with fewer hiccups, though VRAM usage is outrageous.
 