Rise of the Tomb Raider Video Card Performance Review @ [H]

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
55,532
Rise of the Tomb Raider Video Card Performance Review - A new Tomb Raider game is out, Rise of the Tomb Raider. We take RoTR and find out how it performs and compares on no less than 14 of today's latest video card offerings from the AMD and NVIDIA GPU lineups from top-to-bottom using the latest drivers and game patch v1.0 build 610.1_64.
 
Superb review, gentlemen. Really comprehensive. I was wondering when you were going to hit this game up, and you didn't disappoint.

That Crossfire and SLI scaling is terrific, near 2x! Unfortunately, neither works on my Windows Store version, since dedicated fullscreen (along with any FPS monitoring overlay) is not allowed/supported. Good to see confirmation of the VRAM usage, but not surprising considering those excellent textures at very high settings.

As for the game itself, it's terrific. I'd highly recommend it to anyone.
 
Great review as always. We just started playing the game, good to see my FPS are close to what [H] is seeing.

I still find it hilarious that so many of us at [H] called out HBM as a poor decision by AMD: the performance increase would be marginal, there were possible cost and supply issues, and 4GB is a low VRAM capacity. Really makes you question their engineering department. I still think it's a fine card for a single-card application - I have one in my parents' SFF case and have not had any issues, but for Crossfire, where you want to turn up the VSR and features, it doesn't make sense.

So far the game is fantastic.
 
Any chance we can add 3440x1440 resolution to the mix? Those 21:9 monitors seem to be a hit right now.
 
The game is indeed fantastic, finally a forward looking game graphically speaking. I hope there are more this year.
 
Any chance we can add 3440x1440 resolution to the mix? Those 21:9 monitors seem to be a hit right now.

If it helps, multiply 3.7/5 * the 1440p results and that'll give you approximate numbers. 1440p is about 3.7MP and 3440x1440 is about 5MP, so it's not a huge jump.
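For what it's worth, that back-of-the-envelope estimate is easy to sketch in a few lines of Python (the 60 FPS input is just a made-up example, not a number from the review):

```python
# Rough FPS estimate for 3440x1440 extrapolated from 2560x1440 results,
# assuming framerate scales inversely with pixel count (a simplification:
# real scaling also depends on the game and the card).
def estimate_fps(fps_at_1440p):
    px_1440p = 2560 * 1440        # ~3.7 megapixels
    px_ultrawide = 3440 * 1440    # ~5.0 megapixels
    return fps_at_1440p * px_1440p / px_ultrawide

# A hypothetical card averaging 60 FPS at 2560x1440:
print(round(estimate_fps(60), 1))  # roughly 44.7 FPS
```

The 3.7/5 ratio works out to about 0.74, so knocking roughly a quarter off the 1440p numbers gets you in the ballpark.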

3440x1440 is an awesome res because you can play most games on high or ultra with a single card.
 
Any chance we can add 3440x1440 resolution to the mix? Those 21:9 monitors seem to be a hit right now.

I don't have one. The performance would be less than 4K performance and greater than 1440p performance. You could calculate the decrease in performance from 2560x1440 to 3440x1440, then the difference from 3440x1440 to 3840x2160, and estimate performance at that resolution from the data. It would be somewhere between 1440p and 4K performance, probably closer to the 1440p numbers; we are only talking about a horizontal resolution increase from 2560 to 3440, which is 880 pixels.

Basically, if it's playable at 4K, it will most assuredly be playable at 3440x1440; take what is playable at 1440p and figure it will run a little slower, and you may have to turn a quality setting down a notch.
 
Fury X beats the 980 Ti in this case and even ties with the non-X Fury. Interesting!

Wonder if this would be the case with future games.
 
Great review as always. We just started playing the game, good to see my FPS are close to what [H] is seeing.

I still find it hilarious that so many of us at [H] called out HBM as a poor decision by AMD: the performance increase would be marginal, there were possible cost and supply issues, and 4GB is a low VRAM capacity. Really makes you question their engineering department. I still think it's a fine card for a single-card application - I have one in my parents' SFF case and have not had any issues, but for Crossfire, where you want to turn up the VSR and features, it doesn't make sense.

So far the game is fantastic.

Yep, if I had time I'd go back and find some of the threads lampooning people for decrying the use of only 4GB. But it wasn't really a bad engineering decision as much as the high-density chips weren't there when AMD started putting out Fiji.

In any case, major props to Square Enix for doing a PC port right. Last year's TR redux was excellent, and this one is even better (finer graphics, better gameplay, more upgrades, more puzzles, bigger spaces, etc.). And yes, it really pushes cards to their limits.
 
Thanks for the heads up on "Exclusive Fullscreen".
I have been getting a constant 36 FPS in Surround with SLI on custom very high.

While the game runs very well, it would be nice to see a bit more FPS.
 
Great article. I was particularly impressed with the multi-GPU scaling. I wonder if that continues beyond two cards?

However, I must point out one glaring omission, one that I've raised before: nowhere do you give specific VRAM usage figures. I'm not just talking about VRAM usage at 4K, but at all resolutions. This box has two 12 GB Titan X cards, but the reserve box to my left has a 3 GB GTX 780 Ti and uses a 1200p screen. Will I see VRAM problems on that? You wonder about the scaling difference between the 980 Ti and the 390X. Could near-6 GB VRAM usage be part of that difference? I can't tell from the article. Imagine that I have a G-Sync monitor and need two new GPUs now; do I buy two 980 Tis or two Titan Xs for this game? Indeed, I was somewhat disappointed to see that you did not test the Titan X in SLI, but I guess you didn't have two with which to test.

VRAM usage numbers are critical, so please include them.
 
Great article. I was particularly impressed with the multi-GPU scaling. I wonder if that continues beyond two cards?

However, I must point out one glaring omission, one that I've raised before: nowhere do you give specific VRAM usage figures. I'm not just talking about VRAM usage at 4K, but at all resolutions. This box has two 12 GB Titan X cards, but the reserve box to my left has a 3 GB GTX 780 Ti and uses a 1200p screen. Will I see VRAM problems on that? You wonder about the scaling difference between the 980 Ti and the 390X. Could near-6 GB VRAM usage be part of that difference? I can't tell from the article. Imagine that I have a G-Sync monitor and need two new GPUs now; do I buy two 980 Tis or two Titan Xs for this game? Indeed, I was somewhat disappointed to see that you did not test the Titan X in SLI, but I guess you didn't have two with which to test.

VRAM usage numbers are critical, so please include them.

When testing performance I don't want any third-party utilities running in the background that could affect framerate, so I don't record VRAM usage when testing.

To do VRAM testing, it has to be a separate specific test for that which takes more time, basically double the work. The scope of this article is performance only.

That said, the follow-up will have the VRAM information you are seeking. Remember, we may not be able to cover every single thing in the first review; that is why we usually do two articles, or however many are needed, for big games like this.
 
When testing performance I don't want any third-party utilities running in the background that could affect framerate, so I don't record VRAM usage when testing.

That's an excellent point. Perhaps one worth mentioning in the article, given the other comments here about VRAM?

That said, the follow-up will have VRAM information you are seeking.

Excellent.

Remember, we may not be able to cover every single thing in the first review; that is why we usually do two articles, or however many are needed, for big games like this.

And that's why your reviews are so much better than the rest.
 
What's the rationale behind different graphics options and performance optimizations between different store versions?
 
Really good performance review, Brent. That multi-gpu scaling is crazy awesome. It's too bad more games don't see that kind of scaling. The game itself is great, possibly even better than the reboot. Nixxes really hit a home run on the PC version. Can't wait to see what they do with the PC version of Deus Ex: Mankind Divided.
 
Really good performance review, Brent. That multi-gpu scaling is crazy awesome. It's too bad more games don't see that kind of scaling. The game itself is great, possibly even better than the reboot. Nixxes really hit a home run on the PC version. Can't wait to see what they do with the PC version of Deus Ex: Mankind Divided.

They have also been quick with patches, which is a good sign of support.

I too am very excited for the next Deus Ex game, one of my favorite game series.
 
They have also been quick with patches, which is a good sign of support.

I too am very excited for the next Deus Ex game, one of my favorite game series.

Very. The game runs really well at 3440x1440 after the patches, too. I can probably tweak things a little more, but I'm running with soft shadows, DoF, and AO set to "On" and AA at "FXAA"; everything else is set to max. Makes for one heck of a stunning-looking game.
 
Interesting to see the R9 390 outperform the GTX 970, pretty sure it was the opposite at launch. Oh well, makes me happier about my purchase!
 
Fury X beats the 980 Ti in this case and even ties with the non-X Fury. Interesting!

Wonder if this would be the case with future games.

Where did you see that? At 2560x1440 the 980 Ti has a slightly higher average framerate than the Fury X, but they're virtually tied. The Fury X beats the 980 Ti at 4K, though. At both resolutions the regular Fury is using lower graphical settings than either the Fury X or the 980 Ti.

What's the rationale behind different graphics options and performance optimizations between different store versions?

My guess would be that, much like the consoles, patches have to go through an "approval" process for games hosted on the Microsoft Store, while they can be pushed through as soon as they're ready on Steam. Why the Microsoft Store version doesn't support Exclusive Fullscreen or SLI/Crossfire is beyond me.
 
3840x2160 - 4K
3440x1440 - User Interest
2560x1440 - 1440p

Slowest
Less Slow
Fastest

It's not your day today :D You're missing a word there, like performance hit or similar, or less and greater need to switch places.
 
What's the rationale behind different graphics options and performance optimizations between different store versions?

Certain store/digital distribution applications are limiting options and support for the game. I know Windows Store at least forces users to use borderless windowed mode in the game, which degrades performance on some cards and is not supported by multi-card configs (at least, not easily), and vsync is locked to On. In addition, Windows Store (not sure about others) hasn't received the most recent patch yet. This is the first time I remember a distribution service not releasing a patch in line with its competitors. Can't speak for other services, but I know these are not issues on the Steam version of the game.
 
My guess would be that, much like the consoles, patches have to go through an "approval" process for games hosted on the Microsoft Store, while they can be pushed through as soon as they're ready on Steam. Why the Microsoft Store version doesn't support Exclusive Fullscreen or SLI/Crossfire is beyond me.

Certain store/digital distribution applications are limiting options and support for the game. I know Windows Store at least forces users to use borderless windowed mode in the game, which degrades performance on some cards and is not supported by multi-card configs (at least, not easily), and vsync is locked to On. In addition, Windows Store (not sure about others) hasn't received the most recent patch yet. This is the first time I remember a distribution service not releasing a patch in line with its competitors. Can't speak for other services, but I know these are not issues on the Steam version of the game.

That's just...retarded. Oh, Microsoft, you never cease to amaze.
 
Certain store/digital distribution applications are limiting options and support for the game. I know Windows Store at least forces users to use borderless windowed mode in the game, which degrades performance on some cards and is not supported by multi-card configs (at least, not easily), and vsync is locked to On. In addition, Windows Store (not sure about others) hasn't received the most recent patch yet. This is the first time I remember a distribution service not releasing a patch in line with its competitors. Can't speak for other services, but I know these are not issues on the Steam version of the game.

It's these kinds of things that kill my enthusiasm for future cross-platform releases like the forthcoming Quantum Break.
 
That's just...retarded. Oh, Microsoft, you never cease to amaze.

Yea... I suppose it's MSFT's plan to bring a "console experience" to PC users :rolleyes: I can deal with lowering the settings a bit to play in borderless windows (and not leverage my second GPU), but by forcing vsync they're preventing the use of G-sync or Freesync -- that's just dumb.
 
What has that got to do with different game options and patch notes for Steam vs other versions?

He did not understand your question. As to the answer, we have no idea.
 
I am running this game on my 980 with a G-Sync monitor at 2560x1440. It has been silky smooth. Really enjoying it.

It would be interesting to see the HardOCP reviewers use their process for evaluating performance with an NVIDIA card on a G-Sync monitor vs. an AMD card on a FreeSync monitor vs. AMD and NVIDIA cards on a regular monitor. Can you get a better experience with higher settings on an x-Sync monitor (without stuttering) than you would on a non-Sync monitor?
 

Yes, Brent started that wrong/backwards.

We will not be adding another resolution to testing. Obviously the performance would land between 1440p and 4K.
 
Yea... I suppose it's MSFT's plan to bring a "console experience" to PC users :rolleyes: I can deal with lowering the settings a bit to play in borderless windows (and not leverage my second GPU), but by forcing vsync they're preventing the use of G-sync or Freesync -- that's just dumb.

It may be audience driven?
Maybe the demographic that uses the store doesn't know how to adjust PC settings?
Maybe they don't use multi-card GPU setups?

Sort of smacks of pre-setting the game. Exactly as you said... to give that console experience to the PC gamer, i.e., you don't have to touch anything, just load the game and play... meh.

I'll stick with Steam, thank you.
 
I am running this game on my 980 with a G-Sync monitor at 2560x1440. It has been silky smooth. Really enjoying it.

It would be interesting to see the HardOCP reviewers use their process for evaluating performance with an NVIDIA card on a G-Sync monitor vs. an AMD card on a FreeSync monitor vs. AMD and NVIDIA cards on a regular monitor. Can you get a better experience with higher settings on an x-Sync monitor (without stuttering) than you would on a non-Sync monitor?

Since we test with vsync off, I'm going to say no, it wouldn't change the data.
 
Again, thanks for the tip on using that exclusive full screen .

I've seen my performance go from 36 FPS to a constant 65 FPS at 5760x1200 on very high settings.

Yikes!!!
 
Again, thanks for the tip on using that exclusive full screen .

I've seen my performance go from 36 FPS to a constant 65 FPS at 5760x1200 on very high settings.

Yikes!!!

I know, right? And most people probably do not know about this behavior. There are people out there suffering lower performance than they should because of one simple check box. I hope this info really gets out there, so people can easily have a better experience.

I found out about this by trial and error myself, no documentation I had covered this.

It is a pretty big deal.
 
Would it be possible to add 1080p high end tests where you'd need to maintain 60fps minimum? Something you may desire if you're using Steam to stream to a TV. It's a different approach, but the big screen can be nice if you have people over.
 
Good review, testing multiple cards from each company at different resolutions mostly based on price. I have a question on the conclusion, however.

"For single-GPU our order of recommendation for this game is AMD Radeon R9 380X 4GB, AMD Radeon R9 390, AMD Radeon R9 390X, GeForce GTX 980, GeForce GTX 980 Ti."

Didn't the 390X slightly outperform the 980? In addition to being cheaper and having more VRAM, I find it odd that the 980 gets the recommendation over the 390X. Any more thoughts on this?
 