Rise of the Tomb Raider Video Card Performance Review @ [H]

So far, the game is excellent, quite a bit better than the first, and that was the first game I'd actually beaten in a long time! It's running great on a slightly OC'd 970M with a 4720HQ; I'll try to post some frame rates when I get a chance. I can say that at 1080p, 100 Hz with G-Sync, I can run SMAA, 8x anisotropic filtering, HBAO+, most settings on High or Very High, and everything else On, and it is smooth as butter! I am super impressed with the 970M and especially this game engine.
 
Finished this game last week; it was great all the way through. The enhancement to the specular reflections in the recent patch really did clean up the image in places, and it's good to enable if the performance hit is acceptable.
 
Stupid question, but does anyone have any idea how crappy this might run on GTX 670 2GB SLI with an i5-3570K @ 4.2 GHz and 16 GB DDR3? Weee.
 
Very fun game. CrossFire is working well with a 290X/290 combo at 3440x1440. With everything maxed out I normally get over 50 fps, up to 70-something. In the initial sections of gameplay I do see some slowdowns that look like possible RAM limitations, but other than that it is very smooth.

I do believe this game will be receiving a DX12 patch, and it will be interesting to see whether async shaders (used on the Xbox One) will be present and how they affect gameplay. I also thought PureHair was an extension of TressFX (which is open source/public). This is the kind of advancement I like to see happen with gaming code, and kudos to AMD for allowing it to happen. In Rise of the Tomb Raider it is gorgeous, and I like it at the highest level.
 

This is pre-patch so performance will have changed, but GameGPU has a GTX 690 (barely faster than stock 670 SLI) in their review.

Rise of the Tomb Raider GPU test

I think SLI scaling has improved since then, but it doesn't help that they were using the highest settings, so it doesn't tell you what you can really run it at.
 
I just started playing this yesterday and the review perfectly matched my experience, pretty awesome. Really liking this game and the PureHair looks fantastic.
 
Nice review sir.

In case anyone is curious about 780 Ti SLI performance at 1440p: quite frankly, I am impressed. About 90% of the detail settings at their highest, with the exception of textures at Very High and hair set to On; AA set to 4x. It's still pretty easy to keep the VRAM inside my 3 GB limit with a slight compromise. Average in-game FPS is around 70.
 
Plan on buying this eventually and glad to see it's pushing the envelope graphically and not just another crappy port. Nice to see my 290x is still capable but it does look like it's going to be pushed pretty hard. May be time to start thinking about an upgrade.
 

Thanks, good call on looking up 690 performance. I found a few YouTube videos of people playing at 1440p at 45-60 fps on a 690 with half-decent settings (PureHair on!); they mostly need to turn textures down to Medium, but it seems OK. I only play at 1080p, so fingers crossed!
 
Agreed, it's a great game to play (I played it more than the last one), and it's finally nice that it works well with CrossFire. Just don't do tri-fire like I did.

 
Really enjoyed the game, and while the SLI scaling is excellent in some areas, there are many areas where frame rates and GPU usage drop severely no matter the system spec. Presumably it's bottlenecked by the CPU; I suspect this is an issue that simply couldn't be fixed by Nixxes during porting, due to some poor design in the XBone build.
 
AMD's 16.1.1 driver release notes claimed a fix for stuttering on Fury GPUs, and Computerbase.de found this to be true for single GPU.

In multi-GPU it still stutters. This isn't VRAM related.

You can clearly see that with a single GPU the Fury X does not have stutters or a major drop in minimum FPS; in fact, that's according to your own data in this article, as seen here:

[1455189919EDyKUcGV8E_7_3.gif: 1440p results chart from the article]


The Fury X is basically on par with the 980 Ti, or better, with its 35 fps minimum vs. 29. There is no VRAM bottleneck happening at 1440p.

And again here at 4K:

[1455189919EDyKUcGV8E_8_3.gif: 4K results chart from the article]


A VRAM bottleneck would result in constant stutter and sharp drops in minimum FPS.

This is what happens during a VRAM bottleneck, notice the constant sharp stutter spikes:



You guys have good benchmark practices but your analysis is sometimes very flawed.
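
For what it's worth, if anyone wants to check their own logs for this kind of thing, here's a rough sketch that counts frames spiking well above the median frame time. It assumes a frame-time CSV export (e.g. from PresentMon or FRAPS); the file name and column name are just placeholders for whatever your capture tool writes.

[code]
import csv
import statistics

def count_stutter_spikes(path, column="ms_between_presents", factor=2.0):
    """Count frames whose frame time exceeds `factor` times the median.

    A VRAM-bottlenecked run tends to show these spikes constantly,
    not just once when a new area streams in.
    """
    with open(path, newline="") as f:
        times = [float(row[column]) for row in csv.DictReader(f)]
    median = statistics.median(times)
    spikes = sum(1 for t in times if t > factor * median)
    return spikes, median

# "frametimes.csv" and the column name are placeholders for whatever
# your capture tool actually writes.
n, med = count_stutter_spikes("frametimes.csv")
print(f"{n} frames above 2x the median frame time ({med:.1f} ms)")
[/code]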
 
When testing performance I don't want any third-party utilities running in the background that could affect framerate, so I don't record VRAM usage when testing.

To do VRAM testing, it has to be a separate, dedicated test, which takes more time; basically double the work. The scope of this article is performance only.

That said, the follow-up will have the VRAM information you are seeking. Remember, we may not be able to cover every single thing in the first review; that is why we usually do two articles (or however many are needed) for big games like this.

What's wrong with using an application like MSI Afterburner that works for either AMD or Nvidia cards? You could set the timeframe of the graph logging function to 15 minutes or whatever and get a peak VRAM usage figure in each test. I don't see how this would affect the results, since it would run for every card and should take an equivalent amount of resources regardless of AMD or Nvidia.
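
To illustrate, here's a minimal sketch of pulling a peak VRAM figure out of a monitoring log, assuming the log is exported or converted to CSV; the file name and column header are hypothetical and depend on the tool.

[code]
import csv

def peak_vram_mb(log_path, column="Memory usage, MB"):
    """Return the peak value of a memory-usage column in a CSV log.

    The column header depends on the logging tool; adjust it to match
    whatever your tool writes out.
    """
    peak = 0.0
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            value = row.get(column)
            if value:
                try:
                    peak = max(peak, float(value))
                except ValueError:
                    pass  # skip malformed rows
    return peak

# "gpu_log.csv" is a placeholder path for the exported log.
print(f"Peak VRAM: {peak_vram_mb('gpu_log.csv'):.0f} MB")
[/code]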
 
Yeah, I'm not sure why the article reads like Nvidia has the edge, when the benchmark numbers seem to point to a better minimum and a better maximum on the AMD card by 5 fps, while the average is only 1 fps different. Looking at the graphs, it looks like AMD took this one, despite the fact that this is marketed as an Nvidia game (optimized most heavily for Nvidia). As a recent Fury X purchaser, I can't say I'm disappointed. I game at 2560x1600 on a Dell 3014. I'll probably pick this game up on account of your review and my traditional fondness for Tomb Raider games.
 
Great article. Love this game so far, but I stopped playing due to frame drops. I'm hoping a performance patch comes soon.
 
I played it on my XBO because I got it for free. I really enjoyed the game; it was a lot of fun overall. I think the gameplay mechanics were better this time around, with the story being a bit better in the previous one. The PC version sure looks good...
 
When I go to a new area I get a slowdown for about a second (high 20s to 30 fps), and then the area runs at around 60 fps. If I go back to the previous area the same thing happens, rinse and repeat. To me it looks like it is loading resources into memory, since at 4 GB per card it is pushing it. Since it's DX11 and mostly single-threaded, resources such as shaders and textures are limiting draw calls; once the DX12 patch hits I expect this to be less pronounced. 290X/290, FX-9590, 3440x1440 max settings. I am reducing the settings to see if this affects the behavior. Otherwise this game is gorgeous and very fun, and also very immersive with the character.
 
Would it be possible to add 1080p high-end tests where you'd need to maintain a 60 fps minimum? That's something you may want if you're using Steam to stream to a TV. It's a different approach, but the big screen can be nice if you have people over.
I've only had the game an hour or so, but it seems that at 1080p you can max everything and hug 60 fps with a stock-clocked GTX 980 Ti.
Even better, you can use 2331x1323 DSR and turn AA off.

The downside is that the game only lets you use its own version of DSR (SSAA) in windowed mode with borders, or at least that's all mine allows.
There is no option in the game to use the Nvidia DSR modes unless you set your desktop to the resolution you want to use first.
Then you can select that mode and enable Exclusive Fullscreen to get the best performance.

Using MSI Afterburner to report the FPS etc., it is a 60 fps straight line with the odd bump when a new section loads.

GPU use touches 99% a few times, but the framerate didn't drop from 60 fps. It is generally 70 to 90%, a lot of it around 80%+.
Max clock speed was 1304 MHz.
Graphics memory use went a bit over 4.5 GB.
CPU use on a clocked 6600K averages much less than 50%. One core peaked at near 70% for a moment;
three cores average around 30% each, the last core around 45%.
System RAM use with the page file on is less than 7 GB total for the OS and game (Win7 64).

Gameplay is incredibly fluid and looks fantastic!


TL;DR:
Works great at 1080p with everything maxed except AA, on a stock-clocked GTX 980 Ti (EVGA SC+ ACX 2.0+).
Instead of AA, use DSR: change the desktop res to the 2331x1323 DSR res, launch the game, change it to 2331x1323, and enable Exclusive Fullscreen.

I just found that once the DSR mode is set in game, you can launch it from a 1080p desktop.
FYI
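
In case anyone is wondering where the odd DSR resolutions come from: the DSR factors multiply the total pixel count, so each axis is scaled by the square root of the factor. A quick sketch, assuming the standard factor list (Nvidia's exact rounding may differ slightly):

[code]
from math import sqrt

def dsr_resolution(base_w, base_h, factor):
    # DSR factors multiply the total pixel count, so each axis is
    # scaled by sqrt(factor). Nvidia's exact rounding may differ
    # slightly from plain round().
    scale = sqrt(factor)
    return round(base_w * scale), round(base_h * scale)

for f in (1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00):
    w, h = dsr_resolution(1920, 1080, f)
    print(f"{f:.2f}x -> {w}x{h}")
[/code]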
 
Thanks Brent, great review. Luckily I didn't get the Fury and bought the R9 390X after I compared performance in DX12 gaming. The R9 390X can utilize its 8 GB of RAM, beating the R9 Fury in certain aspects. Guess I made the right choice.
 