Rise of the Tomb Raider Graphics Features Performance @ [H]

FrgMstr
Just Plain Mean
Staff member
Rise of the Tomb Raider Graphics Features Performance - We will take the new game Rise of the Tomb Raider and deep-dive into the performance of specific graphics features. We will find out how each one affects performance and whether AMD or NVIDIA GPUs are better at certain features. We will also evaluate the VRAM utilization of each feature. We are using the latest patch, v1.0 build 623.2.
 
Thanks for the article, Brent and Kyle.

The low VRAM usage on the 390X is really interesting. It's weird that it used a little dynamic memory instead of going over 4GB. I wonder if the 390X acts that way in every game or if it's unique to this one.

I'm sure these kinds of articles are time-consuming, but I'd be really interested in seeing more of them in the future. It was a very interesting read.
 
I am really enjoying this game with my 2 x 390X cards. However, my game crashed when I put textures at Very High (with everything else maxed); I assume this is a bug in the drivers.

Great article.
 
Great article! Since more cards are moving toward HBM and utilizing system memory, any chance of a system memory shoot-out featuring the likes of Corsair and others? DDR4 style, since that's where Intel and AMD are headed in the future.
 
Hand-tuning games pretty much becomes a moot point with HBM2, which can have 8GB or more of memory.
 
Hand-tuning games pretty much becomes a moot point with HBM2, which can have 8GB or more of memory.

Well, AMD is still launching a Fury X2 as far as I know. That means cards like the Nano, Fury, and Fury X will be a part of AMD's new lineup. Of course new Polaris cards are releasing with HBM2, but not everyone is going to buy a flagship card.
 
It'll be fun to tweak the settings as I go.
Also glad to see the hair effects requiring less from the GPUs. That TressFX was very demanding.

Lovely write-up!
 
Interesting situation regarding the VRAM. I was hoping you'd toss a GTX 970 into the mix to see if there is a stuttering issue, as some have reported with Very High textures at 1440p. Nice article!
 
Great write-up for sure! It's very interesting how well balanced and optimized each performance setting is, considering just how great this game looks (and it doesn't just look great, it's also a really good and fun game!).

I gotta say though... and I'm just nitpicking here... but I would have liked to see some screenshots showing how some of the setting levels affect the game scene.
 
I would love to see an analysis of how AMD's dynamic VRAM usage affects performance based on PCIe link width and main system RAM type.

For example, PCIe x8 has never suffered much of a performance hit compared to x16, but would it make a difference when the GPU needs to access the system's main memory? Or is PCIe x8 still going to offer sufficient bandwidth even with the additional overhead of accessing main memory?

And what about RAM speed? Will higher-speed memory now have a greater effect on performance when the GPU's VRAM is full and the system's main memory starts being used?
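For a rough sense of scale, here's a back-of-the-envelope sketch (my own numbers for the theoretical peaks, nothing measured from the article):

```python
# Theoretical one-direction bandwidth comparison (rough, assumes ideal links).
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding.
PCIE3_GBPS_PER_LANE = 8 * (128 / 130) / 8  # ~0.985 GB/s per lane

def pcie3_bandwidth(lanes):
    """Theoretical PCIe 3.0 bandwidth in GB/s for a given link width."""
    return lanes * PCIE3_GBPS_PER_LANE

def ddr_bandwidth(mt_per_s, channels):
    """Theoretical DDR bandwidth in GB/s: transfers/s x 8 bytes per 64-bit channel."""
    return mt_per_s * 8 * channels / 1000

print(f"PCIe 3.0 x8           : {pcie3_bandwidth(8):5.1f} GB/s")     # ~ 7.9
print(f"PCIe 3.0 x16          : {pcie3_bandwidth(16):5.1f} GB/s")    # ~15.8
print(f"DDR3-1600 dual channel: {ddr_bandwidth(1600, 2):5.1f} GB/s")  # 25.6
print(f"DDR4-2400 dual channel: {ddr_bandwidth(2400, 2):5.1f} GB/s")  # 38.4
```

If those peaks are anywhere near right, the PCIe link, not the DIMM speed, would be the first chokepoint once spill-over traffic starts, which makes the x8 vs. x16 question the more interesting one.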
 
Excellent article. Thank you for including the VRAM usage figures. I've read elsewhere how well SLI works with Tomb Raider, so I'm in for a treat when I get it.
 
The pretty screenshots are all on NVIDIA's website: Rise of the Tomb Raider Graphics & Performance Guide

With the crashing mentioned above, do you get any errors or just a gray screen? I keep getting DXGI_ERROR_blahblahdik if my 970M is OC'd, and it seems, although rarely, that it will happen even if it's not OC'd. Every other game is just darn groovy, and I've seen this in the past with other games. It's frustrating; what about a game makes it work differently than another at certain clocks? Does it have to do with how the frames are presented and clock cycles or some crap? Anyone have a clue? Or is it just a driver issue? I've used different versions, so probably not, but who knows!
 
[H] pictures would be, if any had been snapped, and they weren't. NVIDIA's guides are really good; the only thing they could improve on is actual frame rates and percentages, which [H] does provide.
 
Where can I find the other performance reviews? The conclusion says this is the first time they've seen HBM perform like AMD claimed. From my experience this happens in every game I've played.
 
Great follow-up analysis here. Part I was the best article out there on this game; Part II makes it the best by leaps and bounds. Really interesting stuff on Fury's use of dynamic VRAM/RAM.

Looks like ROTR is going to prove to be this gen's Far Cry/Crysis/Metro 2033, i.e. a boundary-pushing title that only the next gen of GPUs will run at the highest settings.
 
Where can I find the other performance reviews? The conclusion says this is the first time they've seen HBM perform like AMD claimed. From my experience this happens in every game I've played.
Because it does happen in every game. AMD tunes it on a per-game basis.

4GB HBM is enough for 4K for AMD!

Scream it at the top of your lungs. It's the truth.
 
I would love to see an analysis of how AMD's dynamic VRAM usage affects performance based on PCIe link width and main system RAM type.

For example, PCIe x8 has never suffered much of a performance hit compared to x16, but would it make a difference when the GPU needs to access the system's main memory? Or is PCIe x8 still going to offer sufficient bandwidth even with the additional overhead of accessing main memory?

And what about RAM speed? Will higher-speed memory now have a greater effect on performance when the GPU's VRAM is full and the system's main memory starts being used?

Good questions. I'd like to have those answered as well.
 
Because it does happen in every game. AMD tunes it on a per-game basis.

4GB HBM is enough for 4K for AMD!

Scream it at the top of your lungs. It's the truth.

So they noticed it for the first time because it's the first time they actually looked.
 
So they noticed it for the first time because it's the first time they actually looked.
You would be incorrect, sir. Please take your lies and made-up stories elsewhere.
 
Where can I find the other performance reviews? The conclusion says this is the first time they've seen HBM perform like AMD claimed. From my experience this happens in every game I've played.
Please share with me YOUR results.
 
AMD Radeon R9 Fury X CrossFire at 4K Review @ [H]

My results mirror the ones in the Tomb Raider review, but in every game. The only reason I say this is because the review strongly emphasizes that this is the first time HBM has performed this way in a game, so I'm wondering what other reviews analyzed HBM like this.
As noted above, can I see your results in EVERY GAME that back up your statements?

And you are confusing what we present publicly with what we actually look at during our real-world gaming testing; those are two different things. I have a message in to Brent to respond to you on this, since he is the person with the hands-on experience specific to your issue. If VRAM is something we have "just been missing" for the last year, he would be able to confirm that.
 
The one limiting factor may be PCIe bandwidth when operating under CrossFireX, which uses XDMA.

This might explain the bad scaling for Fiji with CrossFire under certain titles where the minimum FPS drops below 30.
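As a sanity check on that theory, here's a rough sketch of the traffic XDMA adds just shuttling finished frames at 4K (my own assumptions about frame format and AFR, nothing from the article):

```python
# Rough estimate of XDMA CrossFire frame traffic at 4K in 2-way AFR.
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4                       # assumes a 32-bit RGBA back buffer

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1e6   # ~33.2 MB per frame
fps_total = 60
slave_fps = fps_total / 2                 # the second GPU renders half the frames

xdma_gb_s = frame_mb * slave_fps / 1000
print(f"Frame size:   {frame_mb:.1f} MB")
print(f"XDMA traffic: {xdma_gb_s:.1f} GB/s at {fps_total} fps total")
# ~1 GB/s against ~15.8 GB/s of PCIe 3.0 x16: frame transfers alone look
# harmless; the question is what dynamic VRAM spill-over adds on top of it.
```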
 
Evidently, if you test a game prior to AMD performing driver optimizations for it, then you're going to have issues.
I guess that is a possibility. AMD has not been transparent about any changes in game tuning, though.
 
So this solution would give you all the benefits of HBM even if a game requires a larger-than-4GB framebuffer.

The problem would be possible PCIe saturation when using two graphics cards in a CrossFireX XDMA configuration.

Yeah, I get that. It is what AMD has said for a year. All our initial testing did not exhibit this at all. And interestingly enough, AMD has never come to us and said, "hey, look at this," as by AMD's own admission this has to be tuned per game. AMD and its PR is sometimes its own enemy.

I do not think we are seeing a PCIe saturation issue, quite frankly. I am not saying it could not all of a sudden become a problem, but every time we have investigated this over the years, we have found little if any impact in this area. When PCIe Gen3 came around, it made these issues even less of a concern. There is a LOT of bandwidth there. And we have talked to AMD and NVIDIA about these issues in the last year and whether we should be concerned about a negative performance impact in these areas... and both basically said no.
 
One interesting test would be to add CrossFireX and SLI numbers to your review.
Considering it took us over two weeks to pull together this article as it is, we were not looking at doubling our workload. :)
 
Crazy tin-hat thought of the day--does the 390X really have 8 GB of RAM? Is this another 970-4 GB-3.5 GB slice of insanity?!?
 
What application are you using to monitor the VRAM usage?

Is it reporting the R9 390X usage properly? Because wouldn't the R9 390X show spillage into system memory if it were gimped GTX 970 style and only had access to 4GB?
It was a joke.
 
What I don't quite understand is that in the initial Tomb Raider review you said a couple of things which contrast badly with this review.

first one:

Brent said:
The AMD Radeon R9 Fury CrossFire and AMD Radeon R9 390X CrossFire both allowed the same level of gameplay at "High" settings. Now, technically yes, the AMD Radeon R9 Fury CrossFire is faster. However, its limitation of only having 4GB of VRAM, versus 8GB on the R9 390X, causes us trouble at higher settings. We would get bad stuttering and drop-outs in performance. It was as if CrossFire would stop working for a brief time, then kick back in. Therefore we had to stick with "High" settings; we just couldn't play at any higher level of settings. The Radeon R9 390X CrossFire combination is slower, but due to its 8GB of VRAM it had no issues with stuttering or falling out of performance.


second:
Brent said:
This game is sensitive to VRAM capacity. This is a problem on the AMD Radeon R9 Fury and Fury X. While CrossFire scales great on the Radeon R9 Fury, its limited VRAM capacity ultimately keeps us from being able to utilize that performance at 4K. Therefore we end up having to use lower settings so the game doesn't stutter.

However, AMD Radeon R9 390X has 8GB of VRAM per GPU. This allowed a 99% scaling efficiency, and "High" level of quality settings at 4K for a more affordable price than two GeForce GTX 980 Ti's in SLI would cost. AMD Radeon R9 390X CrossFire costs about $400 per video card, so around $800 for a CrossFire setup. Therefore, AMD Radeon R9 390X CrossFire is by far the best "value" for 4K with around an $800 cost.



And then why is VRAM not an issue this time, in the same game, 14 days after the first review?
 
And then why is VRAM not an issue this time, in the same game, 14 days after the first review?

That's what I'm getting at. I think that CrossFireX plus this memory management optimization from AMD might be causing PCIe bus saturation.

It could also be:
Higher API overhead/CPU bottleneck
Or maybe these optimizations don't work with CrossFireX?
 
I get that, but the review showed an odd result: only 4GB used on the R9 390X and no dynamic RAM, which begs the question of why.

Either the tool you're using is misreporting, or...?

Trust me, I know you were kidding.

Well, the fact is... just exactly like we wrote, we do not know why. Funny thought: if AMD is tuning game drivers to use less than 4GB of VRAM for HBM cards, are those same drivers impacting other cards? That was my initial thought, but it's just a thought, nothing more, nothing less.

Brent is aware of this thread, if he wants to take time to answer some of our specific questions, he can. I am not going to be a go-between on a lot of this.
 
I get that, but the review showed an odd result: only 4GB used on the R9 390X and no dynamic RAM, which begs the question of why.

Either the tool you're using is misreporting, or...?

Trust me, I know you were kidding.

Programs used were the latest versions of GPU-Z and HWiNFO; both show Dedicated and Dynamic RAM for AMD GPUs, and both were used to verify results.

Total Dynamic RAM used on the 390X went up to 230MB max.

It isn't the card. I am working on another quick VRAM article, and this card has exceeded 4GB in my current testing, so there really is 8GB addressable on board.

As mentioned, I think it is a difference in the way the game or driver is managing RAM; it is possible it is spilling over from Fury/X tweaks. Perhaps because there is 8GB there, you don't get stutter, since there is plenty of headroom to fill, flush, and re-fill with overspill, if that makes sense. Just a guess/theory.
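For anyone who wants to double-check their own numbers, here's a minimal sketch for pulling the peaks out of a GPU-Z sensor log (the column names are assumptions from my own logs; check your CSV header, as they vary by GPU-Z version and GPU vendor):

```python
import csv

# Pull peak dedicated/dynamic VRAM usage out of a GPU-Z sensor log (CSV).
# Column names below are assumptions -- verify against your own log header.
DEDICATED = "Memory Usage (Dedicated) [MB]"
DYNAMIC = "Memory Usage (Dynamic) [MB]"

def peak_vram(log_path):
    peaks = {DEDICATED: 0.0, DYNAMIC: 0.0}
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            for col in peaks:
                try:
                    peaks[col] = max(peaks[col], float(row[col]))
                except (KeyError, ValueError):
                    pass  # missing column or blank sample; skip it
    return peaks

for col, mb in peak_vram("gpuz_log.csv").items():
    print(f"{col}: {mb:.0f} MB")
```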
 
Where can I find the other performance reviews? The conclusion says this is the first time they've seen HBM perform like AMD claimed. From my experience this happens in every game I've played.

There is no way that we have played every game available, but we do get our hands on many games and test them. There are times we test games that don't make it to article status.

We do have many 4K and CrossFire reviews with Fury and Fury X you can check out on our site.

GPU / Video Cards Page 1
 
So they noticed it for the first time because it's the first time they actually looked.

I think the fact that the game can use up to 4GB of Dynamic VRAM, and exceeds 4GB of Dedicated VRAM, indicates that 4GB of Dedicated VRAM isn't enough. Rise of the Tomb Raider can utilize a lot of VRAM, as witnessed on the NVIDIA GPUs. So yes, Fury and Fury X can use Dynamic RAM, but ultimately wouldn't it be better to keep everything in local HBM VRAM? These issues will be resolved with HBM2, but this game shows us that games right now are utilizing way more than 4GB of VRAM.
 
After reading this yesterday I watched some YouTube of Ms. Croft.

The game looks amazing, and it likely wasn't even recorded at the best settings.

Up till now I have pretty solidly planned to get a 980 Ti when my bonus check hits, and maybe a second on the next.

Do you guys see other games also going over the 6GB mark? Will NVIDIA suddenly start to support the use of system RAM for caching textures?

I currently play at 5760x1080, which is more pixels than 2560x1440 (6,220,800 vs. 3,686,400).

Having watched in the past, I know I have existing games I play that get close to my 4GB without AA.

Is 2GB more enough?

Any of the nice Titan X persons out there with 4K or surround and ESO/Witcher and the like up for maxing it all out and reporting VRAM usage?

I know Kyle is not an MMO person. The newer MMOs use textures very differently, I think, than shooters. With all the armor, spells, clothing, and zones with differing geography that can be moved through rather quickly, I wonder how high things can stack up and how quickly.
 
"At its heart, PureHair is based off of AMD's TressFX, but is custom built in-house by Crystal Dynamics, it is not using TressFX leveraged by AMD."

What?
 
I like this game a lot, and it performs pretty well on my Fury at maximum settings at 1920x1080. There is occasional slowdown, and a lot of the time gameplay dips below what 60fps normally feels like. Not a big deal, though.

HBAO+ is the big frame-rate killer here for me. If I turn it off, I'm as smooth as can be. Unfortunately, that is also the option that makes a difference in a lot of the visuals, so I keep it on.
 