Shadow of the Tomb Raider Demo Is Hobbled by Real-Time Ray Tracing and Other Tech

Ok, let's use some common sense here. We've been over 60 fps at 1080p for YEARS, and the 1080 Ti can do 4K60 in some titles. So you expect a new card with 800 extra CUDA cores, better RAM, and higher clocks CAN'T do 4K at 30-70 fps with ray tracing, when it specifically has hardware ray tracing capabilities?

Who's not using common sense?
Your level of ignorance is unprecedented. You are a goddamn fool if you think they were getting that kind of performance with ray tracing and max settings at 4K. Anyone with an IQ above a fish stick knows that the developers and everyone else would be shouting from the rooftops if this were at 4K. It's at 1080p, and that is a fact.
 
Your level of ignorance is unprecedented. You are a goddamn fool if you think they were getting that kind of performance with ray tracing and max settings at 4K. Anyone with an IQ above a fish stick knows that the developers and everyone else would be shouting from the rooftops if this were at 4K. It's at 1080p, and that is a fact.

What exactly do you think the RT cores are doing? If RT is turned on, meaning fewer CUDA cores are being used to calculate all those shadow effects, what do you think those extra 800 cores were doing? Also, no one said the game was running at max settings. No one. Not the capturer, not the dev. AA could have been completely turned off and the game running at 4K for all we know at this time. Nobody said this was running DLSS or SMAA or anything. You are the goddamn fool for believing we are seeing a $1,100 GPU having hiccups at 1080p.
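For what it's worth, you can put rough numbers on that "extra 800 cores" point. A quick back-of-envelope (the counts here are the usual spec-sheet figures for the two cards, assumed, not anything confirmed from the demo):

#include <cstdio>

int main() {
    // Spec-sheet CUDA core counts (assumed from published specs).
    const int cuda_1080ti = 3584;  // GTX 1080 Ti
    const int cuda_2080ti = 4352;  // RTX 2080 Ti
    const int extra = cuda_2080ti - cuda_1080ti;
    std::printf("Extra CUDA cores: %d (%.1f%% more than a 1080 Ti)\n",
                extra, 100.0 * extra / cuda_1080ti);
    return 0;
}

That prints 768 extra, about 21% more shader throughput, and that's before the RT cores do anything at all.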
 
What exactly do you think the RT cores are doing? If RT is turned on, meaning fewer CUDA cores are being used to calculate all those shadow effects, what do you think those extra 800 cores were doing? Also, no one said the game was running at max settings. No one. Not the capturer, not the dev. AA could have been completely turned off and the game running at 4K for all we know at this time. Nobody said this was running DLSS or SMAA or anything. You are the goddamn fool for believing we are seeing a $1,100 GPU having hiccups at 1080p.
Every single reviewer and tech site is reporting this was at 1080p, and even the developer did not dispute that when responding. You find me one fucking bit of proof that this is at 4K.
 
There are a few of these ray tracing tech demos out. The Battlefield V one was interesting, and they have a secondary video where they break down where they're using the tech (no prebaked cube maps).

Time will tell if it's actually playable. I doubt it will be for a few generations if you want to maintain a decent resolution and a higher framerate.
 
I am surprised people turn down ambient occlusion and shadows first. They are what makes a game look realistic. Model detail and distortion/post-processing settings are usually less noticeable and help perf when turned down. Shadows and lighting are critical, to me at least.
 
I am surprised people turn down ambient occlusion and shadows first. They are what makes a game look realistic. Model detail and distortion/post-processing settings are usually less noticeable and help perf when turned down. Shadows and lighting are critical, to me at least.
And most people don't even realize that shadows are actually a mostly CPU-limited setting in many, if not most, modern games.
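For anyone who wants the intuition on the CPU point: every shadow-casting view basically replays the scene's draw calls, and submitting draw calls is CPU-side work. A rough sketch with invented scene numbers (not from any particular engine):

#include <cstdio>

int main() {
    // All numbers invented for illustration.
    const int main_pass_draws = 1500;  // draw calls for the main camera view
    const int sun_cascades    = 4;     // cascaded shadow maps for the sun
    const int spot_lights     = 3;     // one shadow map each

    // Each shadow view roughly replays the scene's draw calls, and
    // issuing draw calls is CPU work in the engine and the driver.
    const int shadow_views = sun_cascades + spot_lights;
    const int shadow_draws = main_pass_draws * shadow_views;

    std::printf("Main pass:     %d draw calls\n", main_pass_draws);
    std::printf("Shadow passes: %d draw calls (%dx the main pass)\n",
                shadow_draws, shadow_views);
    return 0;
}

Turning shadow quality down usually cuts the number of shadow views or their draw distance, which is why it tends to relieve the CPU more than the GPU.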
 
Every single reviewer and tech site is reporting this was at 1080p, and even the developer did not dispute that when responding. You find me one fucking bit of proof that this is at 4K.

Ya, from what I know about tech sites/reviewers, they will trumpet info ASAP to get clicks without verifying. I'll eat humble pie if I'm wrong, but it makes zero sense that we'd go so far backwards with a card that, even with more CUDA and RT cores, can't hit 60 fps at 1080p. Only two outlets have actually seen the data in person. One is the German site pcgameshardware.de, which only states that they captured at 1080p60; they mention nowhere that they saw the in-game resolution set to 1080p. The other is PCGamesN, and he states in his article that he didn't see the resolution or graphics settings. He's also assuming 1080p, but he says GFE was "capturing" in 1080p, so he's assuming the game was 1080p... that's a pretty bold assumption.

As for your proof: other than the fact that the monitor used was a 4K Acer G-Sync 60 Hz monitor, and that there's no proof it's 1080p either, since NO ONE saw the graphics settings, this tells me neither of us has the whole picture, so I choose to fall back on logic. And my logic differs from yours: you think we could have fallen that far off the pace just due to ray tracing. Even Jensen mentioned all the demos in his presentation were at 4K. So if those demos, which granted weren't great in terms of showing off running around in game, weren't running at 2 fps at 4K, it's pretty safe to assume this demo was at 4K, probably with AA off altogether.
 
Well, soon the truth will come out, as we will have reviews. It really has to make you wonder why Nvidia releases these cards with absolutely zero reviews. Bet your sweet rear end, though, that Nvidia is going to limit the hell out of any kind of testing that would put the cards in a bad light.
 
I am surprised people turn down ambient occlusion and shadows first. They are what makes a game look realistic. Model detail and distortion/post-processing settings are usually less noticeable and help perf when turned down. Shadows and lighting are critical, to me at least.

It's all about the frame rate for some people. A lot of FPS players will have $5,000+ machines while turning every option off.

Really, I think the first quality uses will be for simple games. Maybe a remaster of Portal 1 & 2, where you can see what ray tracing is capable of while not really taxing the system.
 
Is it really 'unfinished code'?... The game went gold a few weeks back... It will be hilarious when 2080 Ti owners have to disable ray tracing in all games to get acceptable frame rates lol
 
Hunt: Showdown looks better without all that ray tracing crap.

And who the hell goes running around in game, when they're trying to survive, to say, "Oh hey, look at the cool shadows"?

I have to say, I would love to see a new Thief game with ray tracing (and a decent story like the original two games).
 
Is it really 'unfinished code'?... The game went gold a few weeks back... It will be hilarious when 2080 Ti owners have to disable ray tracing in all games to get acceptable frame rates lol

The ray tracing portion of the DirectX API isn't complete right now. It's "experimental" at the moment, so anything could change. The game companies and hardware companies have been working side by side to create the SDK.
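For the curious, this is roughly what engines will do once it firms up: ask the runtime whether the device supports DXR at all, and fall back to raster if not. A minimal sketch against the feature check in the current DXR headers (D3D12_FEATURE_D3D12_OPTIONS5 / RaytracingTier); since the API is still settling, names like these could change:

#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    // Create a device on the default adapter.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }

    // Ask the runtime what ray tracing tier, if any, this device supports.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    const bool dxr =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    std::puts(dxr ? "DXR tier 1.0+ reported by the runtime."
                  : "No DXR support reported; fall back to raster effects.");

    device->Release();
    return 0;
}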
 
Isn't that game crazy dependent on system memory as well? I thought I saw a video comparing this game's performance with 8 GB, 16 GB, and 32 GB of RAM, and it jumped up considerably each time.
 
And nowhere did it say 4K for Shadow of the Tomb Raider; in fact, they made it clear that everything has been pointing to 1080p so far.

Wccftech is one of those terrible sites that adds new info on top of already-discussed info, so it can be confusing to read, since the bottom contradicts the middle. Essentially, the "new" info is only near the top. They, like other sites, just assume once again that even though they are talking about 100 fps at 4K ultra without RTX, when TechRadar says "We have multiple reports on the performance of the new cards with RTX enabled and the game ran at around 50-60 FPS on average on a single GPU", it OBVIOUSLY must mean a change to 1080p now...

Let's go to the source:

https://www.techradar.com/reviews/nvidia-geforce-rtx-2080-ti

They don't mention 1080p anywhere.
 
Jensen admitted during the launch that they were reaching too far ahead with RT. (Slightly spin-doctored with enthusiasm, of course.) It reminds me of when SM3.0 came out and none of the "capable" cards could handle decent frame rates for another two generations. It was common practice to turn graphics settings way down for playability, and ATI was competing with older cards by merely disabling features at the driver level.

I sense some incoming history repeating.
 
SM3.0 was more for developers; it removed a big roadblock with shaders and what devs could accomplish with them.

There were games that went to the next level, which kind of proved the point.
 
What exactly do you think the RT cores are doing? If RT is turned on, meaning fewer CUDA cores are being used to calculate all those shadow effects, what do you think those extra 800 cores were doing? Also, no one said the game was running at max settings. No one. Not the capturer, not the dev. AA could have been completely turned off and the game running at 4K for all we know at this time. Nobody said this was running DLSS or SMAA or anything. You are the goddamn fool for believing we are seeing a $1,100 GPU having hiccups at 1080p.

https://www.digitaltrends.com/compu...responds-to-rtx-2080-ti-performance-concerns/

It was at 1080p.

Here is another one.

https://www.pcgamesn.com/nvidia-rtx-2080-ti-hands-on
 
Ok, let's use some common sense here. We've been over 60 fps at 1080p for YEARS, and the 1080 Ti can do 4K60 in some titles. So you expect a new card with 800 extra CUDA cores, better RAM, and higher clocks CAN'T do 4K at 30-70 fps with ray tracing, when it specifically has hardware ray tracing capabilities?

Who's not using common sense?

I think you have no idea just how hard it is to do ray tracing in real time. It's incredibly complex.
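Some napkin math on why: even one ray per pixel at 4K60 is about half a billion rays per second, before shadows, reflections, or bounces add more. A tiny sketch (illustrative numbers only):

#include <cstdio>
#include <initializer_list>

int main() {
    // One ray per pixel at 4K, 60 fps.
    const double primary_rays = 3840.0 * 2160.0 * 60.0;  // rays per second
    std::printf("1 ray/pixel at 4K60: %.2f Gigarays/s\n", primary_rays / 1e9);

    // Shadows, reflections, and bounces each add rays per pixel,
    // so the budget multiplies quickly.
    for (int rpp : {2, 4, 8})
        std::printf("%d rays/pixel -> %.1f Gigarays/s\n",
                    rpp, primary_rays * rpp / 1e9);
    return 0;
}

Against the roughly 10 Gigarays/s Nvidia quoted for the 2080 Ti on stage, there isn't much headroom once you go past a few rays per pixel.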
 

Your 2nd post: the guy said he didn't get to see the resolution setting. The first one is using the German site's off-screen capture, and they don't mention the resolution at all other than to say they captured at "Full HD". Again, I'll eat humble pie if I'm wrong, but my post from TechRadar states they viewed many demos running at 4K, and when discussing SotTR it only mentioned the fps, not the resolution. Also, this makes way more sense than Nvidia releasing cards with less-than-1080p60 capabilities (2070/2080) for 1080p.
 
So on the upside, stock dual cooler :)


Didn't expect much from ray tracing since tech demos have been so shady in the past.
 
I think you have no idea just how hard it is to do ray tracing in real time. It's incredibly complex.

No, I know exactly how hard it is. That's why it's taken 10 years for them to release a card with the capabilities implemented. We are also not talking about full-on ray tracing of the whole scene. I'm sure when the whitepapers are released, they will show some cheating going on, just like when Shader Model 1.0 came out and some games were, and still are, baking their shadows in (Doom).
 
No, I know exactly how hard it is. That's why it's taken 10 years for them to release a card with the capabilities implemented. We are also not talking about full-on ray tracing of the whole scene. I'm sure when the whitepapers are released, they will show some cheating going on, just like when Shader Model 1.0 came out and some games were, and still are, baking their shadows in (Doom).

For this reason, I think it was at 1080p. Since when is the first implementation of anything new ever well done? It generally won't be playable for at least another generation... how would this be any different?

Also, if it was really 4K, they would make sure we all knew it was 4K. The very fact that we are questioning this is a caution sign to me.
 
Your 2nd post: the guy said he didn't get to see the resolution setting. The first one is using the German site's off-screen capture, and they don't mention the resolution at all other than to say they captured at "Full HD". Again, I'll eat humble pie if I'm wrong, but my post from TechRadar states they viewed many demos running at 4K, and when discussing SotTR it only mentioned the fps, not the resolution. Also, this makes way more sense than Nvidia releasing cards with less-than-1080p60 capabilities (2070/2080) for 1080p.

Sorry man, you are really reaching there. The first site gives a link so the reader can go view the off-screen capture, but he clearly states that the game was running at 1920 by 1080. The second site says specifically that GeForce Experience was capturing at the game's resolution, and that resolution was 1080p.

IF it was truly ray tracing at 4K and getting 50-57 fps like TechRadar claims, I don't think the developer would have been asked to explain the poor performance.
 
No, I know exactly how hard it is. That's why it's taken 10 years for them to release a card with the capabilities implemented. We are also not talking about full-on ray tracing of the whole scene. I'm sure when the whitepapers are released, they will show some cheating going on, just like when Shader Model 1.0 came out and some games were, and still are, baking their shadows in (Doom).

If the demo they are using to sell the new ray tracing power of the new cards was run using some kind of cheats and not full-on ray tracing, then anybody who has pre-ordered a 2080 Ti for $1,000 has been duped.
 
Sorry man, you are really reaching there. The first site gives a link so the reader can go view the off-screen capture, but he clearly states that the game was running at 1920 by 1080. The second site says specifically that GeForce Experience was capturing at the game's resolution, and that resolution was 1080p.

IF it was truly ray tracing at 4K and getting 50-57 fps like TechRadar claims, I don't think the developer would have been asked to explain the poor performance.

First, the fact that it was dropping to 30 fps many times during the run, when the expectation for this card was a minimum of 60 fps, is why someone was asking about the performance. I don't believe I'm reaching. Who's reaching here? According to TechRadar, the card can apparently hit 100 fps at 4K ultra with RTX off, which, given the core counts and so on, doesn't seem too far off from expectations. Yet rather than the card taking a 50-75% hit to turn on RTX at 4K, you suggest that a card that's better than the 1080 Ti on paper, which can run RotTR at an average of 209 fps at 1080p (https://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page7.html), would be running this at 30-50.
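To spell out the arithmetic behind the two theories, using only numbers already quoted in this thread (TechRadar's ~100 fps at 4K ultra with RTX off, and TechSpot's 209 fps RotTR average; yes, comparing SotTR to RotTR is apples-to-oranges, but that's the comparison on the table):

#include <cstdio>

int main() {
    // Numbers quoted in this thread, not measured by me.
    const double rtx_off_4k      = 100.0;  // TechRadar: ~100 fps, 4K ultra, RTX off
    const double demo_lo         = 30.0;   // low end of the reported demo range
    const double demo_hi         = 70.0;   // high end of the reported demo range
    const double gtx1080ti_1080p = 209.0;  // TechSpot: RotTR average on a 1080 Ti

    // Theory 1: the demo was 4K. Implied cost of turning RTX on.
    std::printf("4K theory:    RTX costs %.0f%%-%.0f%% of the frame rate\n",
                100.0 * (1.0 - demo_hi / rtx_off_4k),
                100.0 * (1.0 - demo_lo / rtx_off_4k));

    // Theory 2: the demo was 1080p. Implied cost vs last year's game on a 1080 Ti.
    std::printf("1080p theory: RTX costs %.0f%%-%.0f%% of the frame rate\n",
                100.0 * (1.0 - demo_hi / gtx1080ti_1080p),
                100.0 * (1.0 - demo_lo / gtx1080ti_1080p));
    return 0;
}

The 1080p theory implies roughly a 67-86% hit from RTX; the 4K theory implies 30-70%.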
 
If the demo they are using to sell the new ray tracing power of the new cards was run using some kind of cheats and not full-on ray tracing, then anybody who has spent $1,000 on a 2080 Ti has been duped.

The point is we are still seeing a significant visual change in the scenes, so cheating or not, it's doing its job just fine. I mean, look at character models versus their actual source models. There is a massive reduction in polys when assets are brought into a game engine, but with tessellation and other tricks we don't notice the reduction as much. I anticipate the next few years will be about implementing more ray tracing pathways for developers and removing the possible cheats in the system.
 
First, the fact that it was dropping to 30 fps many times during the run, when the expectation for this card was a minimum of 60 fps, is why someone was asking about the performance. I don't believe I'm reaching. Who's reaching here? According to TechRadar, the card can apparently hit 100 fps at 4K ultra with RTX off, which, given the core counts and so on, doesn't seem too far off from expectations. Yet rather than the card taking a 50-75% hit to turn on RTX at 4K, you suggest that a card that's better than the 1080 Ti on paper, which can run RotTR at an average of 209 fps at 1080p (https://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page7.html), would be running this at 30-50.

Ok, what are you talking about? I am discussing the ray tracing performance of the new cards because that's all the information we have. The gaming performance is still a complete unknown. Do you not find it funny that the TechRadar guy has completely different information than the other sites? Does that not make you doubt the rest of his claims? He was claiming that Shadow of the Tomb Raider was running at a steady 50-57 fps at 4K with full-on ray tracing, when every other site and the developer state the game had pretty poor performance, and the developer hopes to improve things.

Isn't that odd? And you are clinging to that one site to say the game was running at 4K when all the other sites state that the game was running at 1080p. Here, I will quote the lines.

From PcGamesn:-

"We weren’t able to see what settings the game was running at as the options screens were cut down in the build we were capturing, but GeForce Experience was capturing at the game resolution and the RTX footage we have is 1080p."

From Digital Trends:-

"The machine running the Tomb Raider demo included the just-revealed RTX 2080 TI while the game itself was only set to a 1,920 x 1,080 resolution. With FRAPS running in the background, ray tracing turned on and the game set to its highest detail settings, the framerate would fluctuate between 30 and 70 frames per second"

And lastly, with how he went on and on about giga this and mega that, don't you think he would have mentioned real-time ray tracing at 4K? I mean, that's a pretty big deal.
 
I don't think it matters whether it was at 1080p or 4K, because according to the developers it was running on an early, unfinished segment of the game with early, non-optimized drivers. Basically, the results are not going to be indicative of, or really anywhere close to, what you'll see once the card and the game launch.
 
I don't think it matters whether it was at 1080p or 4K, because according to the developers it was running on an early, unfinished segment of the game with early, non-optimized drivers. Basically, the results are not going to be indicative of, or really anywhere close to, what you'll see once the card and the game launch.
the game went gold weeks ago...
 
I don't think it matters whether it was at 1080p or 4K, because according to the developers it was running on an early, unfinished segment of the game with early, non-optimized drivers. Basically, the results are not going to be indicative of, or really anywhere close to, what you'll see once the card and the game launch.

There are going to have to be some serious performance optimisations to go from struggling to reach 60 fps at 1080p to being playable at 1440p and the higher resolutions the 2080 Ti is marketed towards. And they had to be pretty far along with the optimisations already to be using it as the demo for their latest, greatest cards.
 
Ok, what are you talking about? I am discussing the ray tracing performance of the new cards because that's all the information we have. The gaming performance is still a complete unknown. Do you not find it funny that the TechRadar guy has completely different information than the other sites? Does that not make you doubt the rest of his claims? He was claiming that Shadow of the Tomb Raider was running at a steady 50-57 fps at 4K with full-on ray tracing, when every other site and the developer state the game had pretty poor performance, and the developer hopes to improve things.

Isn't that odd? And you are clinging to that one site to say the game was running at 4K when all the other sites state that the game was running at 1080p. Here, I will quote the lines.

From PcGamesn:-

"We weren’t able to see what settings the game was running at as the options screens were cut down in the build we were capturing, but GeForce Experience was capturing at the game resolution and the RTX footage we have is 1080p."

From Digital Trends:-

"The machine running the Tomb Raider demo included the just-revealed RTX 2080 TI while the game itself was only set to a 1,920 x 1,080 resolution. With FRAPS running in the background, ray tracing turned on and the game set to its highest detail settings, the framerate would fluctuate between 30 and 70 frames per second"

And lastly, with how he went on and on about giga this and mega that, don't you think he would have mentioned real-time ray tracing at 4K? I mean, that's a pretty big deal.

Jensen said all the demos he was going to show on stage were run at 4K, and a demo of SotTR was shown during the stage presentation. IF you think the guys who couldn't and didn't see the resolution settings, yet claim 1080p at 30-70 fps (on a 4K monitor, btw; and yes, I see the confusion about GFE capturing at the game resolution, but without actually seeing the setting, that can't be confirmed), were correct, wouldn't you expect the demo Jensen showed at 4K to be running at about 5-10 fps? Clearly it wasn't. OR do you think that maybe these two guys, who again didn't see the resolution, simply got it wrong or misheard, versus Jensen and TechRadar (and again, I'll give you that TechRadar stated it was running at 50-57 fps; maybe they were discounting the lower-fps stutters as driver issues... who knows)?

We are in a world of 4K now when talking about high-end graphics cards, cards that cost $800-1,100. I'm not going to say it's impossible that it's 1080p, but again, it makes way more sense that it's running at 4K considering the other information. If the video showed a constant 30 fps, I'd be disappointed, but I'd still say it was running at 4K, and it's plausible for a first-gen card to only hit 4K30; we have no idea how things are being produced on these cards. But to suggest it's only getting 30-60 fps at 1080p is going the other way in terms of realistic expectations.
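For what it's worth, here's the naive scaling behind that 5-10 fps expectation (crude, since ray tracing cost doesn't scale purely with pixel count):

#include <cstdio>

int main() {
    // 4K pushes exactly four times the pixels of 1080p.
    const double scale = (1920.0 * 1080.0) / (3840.0 * 2160.0);  // 0.25

    // If 30-70 fps really was 1080p, naive pixel scaling predicts this
    // at 4K. Crude: assumes cost scales linearly with pixel count.
    std::printf("Pixel ratio 1080p : 4K = %.2f\n", scale);
    std::printf("Naive 4K estimate: %.1f-%.1f fps\n",
                30.0 * scale, 70.0 * scale);
    return 0;
}

Pure pixel scaling of the reported 30-70 fps lands at roughly 7.5-17.5 fps at 4K, which is the ballpark I mean.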
 
Well, soon the truth will come out, as we will have reviews. It really has to make you wonder why Nvidia releases these cards with absolutely zero reviews. Bet your sweet rear end, though, that Nvidia is going to limit the hell out of any kind of testing that would put the cards in a bad light.

It's not released. It's a "paper launch". The card was revealed and is up for pre-order, but consumers cannot get it yet. The NDA will probably expire on launch day, which is pretty common for PC hardware.
 
the game went gold weeks ago...

There are going to have to be some serious performance optimisations to go from struggling to reach 60 fps at 1080p to being playable at 1440p and the higher resolutions the 2080 Ti is marketed towards. And they had to be pretty far along with the optimisations already to be using it as the demo for their latest, greatest cards.

From the developers' own words:

“The Nvidia Ray Tracing technology currently being shown in Shadow of the Tomb Raider is an early work in progress version,” the developers state via Twitter. “As a result, different areas of the game have received different levels of polish while we work toward complete implementation of this new technology.”

An early, incomplete section, or an early and incomplete implementation of a brand-new technology. In other words, probably not indicative of the 2080 Ti's performance once it's released.

Does it really make sense that Nvidia's $1,000 flagship card is only capable of 30-40 fps at 1080p and slower than its two-year-old predecessor?
 
They weren't paper launched; it was a pre-release. If on the actual launch date we only have 10 cards go out, then it was a paper launch.

I have to say, this is odd of Nvidia. We've seen early announcements before, but have we seen pre-orders?
 
The Hairworks comparison makes me laugh, but Nvidia never crippled their GPUs by adding Hairworks cores that did nothing but Hairworks.
 