Shadow of the Tomb Raider Demo Is Hobbled by Real-Time Ray Tracing and Other Tech

cageymaru

Fully [H]
Joined
Apr 10, 2003
Messages
22,392
PCGamesHardware was able to view a demo of Shadow of the Tomb Raider running on a GeForce RTX 2080 Ti with real-time ray tracing and possibly max graphics settings enabled at 1080p. In addition, a frame rate counter (FRAPS) was enabled. The game was running in the low 30s for extended periods of time. Of course this is a demo running unfinished code, possibly at max settings, and the real-time ray tracing will be added as a post-launch patch, but the game does release next month. Here is a link to the video with FRAPS-enabled footage at the beginning. As DSOGaming noticed, the official trailer for Shadow of the Tomb Raider has uneven frame rates.

Our friends over at PCGamesHardware have shared an off-camera video, showing the game running with real-time ray tracing on an NVIDIA GeForce RTX 2080Ti. Thankfully, a FRAPS overlay was present and we can see that this new GPU can run the game with 30-70fps.
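For context on what those numbers mean in practice, a quick illustrative calculation (mine, not from the article) converts the quoted frame rates into per-frame time budgets:

```python
# Convert a frame rate (fps) into the time budget per frame, in milliseconds.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# The demo reportedly ran between 30 and 70 fps; 60 fps is the usual target.
for fps in (30, 60, 70):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

So dropping from 60 fps to the low 30s means each frame takes roughly twice as long to render (about 33 ms instead of about 17 ms).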
 
Beautiful game. I wonder how much of a hit the real-time ray tracing will cause in the finished product.
 
I thought I noticed slowdown during the stream when they turned RTX on in game, though it could have just been the shit streaming quality of NVIDIA's broadcast.
 
Anyone expecting to flip the ray tracing switch and not see your performance plummet needs a very, very serious lesson in "the way shit works in the PC gaming world". Want to know how long it took for cards to go from introducing AA/AF to actually being able to turn it on (without all your games turning into MYST)? :) Answer: A little while.
How's that PhysX revolution going? Etc, etc.

Buy it for the extra 40 FPS in GTAV at 4K with Grass on ULTRA, not for any of its magical proposed features.
 
First gen of anything sucks... check tessellation, the different types of AA, etc. Generally it takes 2-3 generations before it offers great performance. But hey, gotta start somewhere... The last TR game is very difficult to run at Very High settings even on modest hardware.
 
Shadows: the thing that literally does nothing for me in games. I just don't notice them when playing, and they're the first thing I turn off.

Is it HDR, or why do many of the surfaces look like they're overexposed and burned in?

I have to admit, I'm not seeing anything special in either the linked video or the promo video on YouTube.
 
Looks like it's time for Nvidia to use the CPU to do Ray-Tracing and leave the GPU to do traditional rendering.
 
Shadows: the thing that literally does nothing for me in games. I just don't notice them when playing, and they're the first thing I turn off.

Is it HDR, or why do many of the surfaces look like they're overexposed and burned in?

I have to admit, I'm not seeing anything special in either the linked video or the promo video on YouTube.

Hardware HDR doesn't do that unless the dev tells it to. The other (last) game looks amazing in HDR on Xbox One X...
 
Those are some pretty brutal graphical settings for the new flagship GPU in 2018 to not be able to hit 60 fps at 1080p.
Yeah, that's what shocked me the most. At 4K, or hell, even 1440p... but at 1080p it feels pretty underwhelming.

*edit - to not see 60 minimum on any game with a flagship Ti product.
 
I don't believe this was running at 1080p (the capture was 1080p60). The monitor was running at 4K.
 
Those are some pretty brutal graphical settings for the new flagship GPU in 2018 to not be able to hit 60 fps at 1080p.

Yeah, that's what shocked me the most. At 4K, or hell, even 1440p... but at 1080p it feels pretty underwhelming.

*edit - to not see 60 minimum on any game with a flagship Ti product.
Shaping up to be the next Hairworks from the look of it.

Average gamers will not give a flying fuck about this if it makes games play worse than previous generation cards.
 
Nvidia has to sell us another decade of video cards; I wouldn't expect them to put a ton of RT performance in their first-gen cards. No, they'll give us a little more each revision. Since chasing resolution is now kind of silly, they can't use that as a way to entice us. Enter new rendering methods, and we're back to square one.
 
This is why you need a pair of 2080 Tis.

Duh-uh.

(I'm seriously considering it)
 
Hunt: Showdown looks better without all that Ray crap.

And who the hell goes running around in game, when they're trying to survive, to say, "oh hey, look at the cool shadows"?
 
Hunt: Showdown looks better without all that Ray crap.

And who the hell goes running around in game, when they're trying to survive, to say, "oh hey, look at the cool shadows"?

I know right. I play the original Tomb Raider because all those fancy gfx are just useless...

I'm surprised you even mentioned Hunt, I mean all the gfx in that game are just useless too. Been complaining since the original Splinter Cell came out.

Getting back to reality, the point of ray tracing and the RT cores is to offload those computational tasks from the raster and CUDA cores. Technically, turning on RT should actually improve fps with the correct drivers. Not saying we are going to get that day 1, but ray tracing has been the dream for decades in real-time computer graphics. We are already at a point where 4K60 is not out of the question (I think the 2080 and 2080 Ti will do it just fine), so we need the next thing to bring out photo-realism.
 
Not terribly surprising. Of course it's going to cost extra performance. However, they are likely showing off unfinished tech in that demo. Since ray-tracing support is not going to be added until after launch, I'd be surprised if this was more than something quickly slapped together for Gamescom. It will be interesting to see how things go once everything is final.
 
Nvidia has to sell us another decade of video cards; I wouldn't expect them to put a ton of RT performance in their first-gen cards. No, they'll give us a little more each revision. Since chasing resolution is now kind of silly, they can't use that as a way to entice us. Enter new rendering methods, and we're back to square one.

I'm way more excited about the new AA TBH.
 
Literally the first game quality setting I lower is shadows.

Ok, fair enough, but the first thing that needs to be corrected in modern graphics is lighting and shadows. Otherwise everything will keep looking fake in games, as it does now. There is no other way around it, and I am glad NVIDIA is trying to push it. The way they push it, and whether it will succeed, is another story.

I don't know whether they could bring it to market at 50% lower prices, but it's not going mainstream, hence widely supported, until the prices become palatable for the mainstream.

What some of us are willing to spend, even what we conceive as midrange, is far off from what the majority considers justifiable. Ask a normal consumer to buy their kids a $500 GPU just to play and see their reaction.
 
Didn't watch video, but max settings could include 4x SSAA.
LOL, no. They are trying to show this tech in the best light when it comes to performance, so do you really think they would be stupid enough to enable that demanding a level of AA?
 
Didn't like what I saw even a bit. I'm getting Doom 3 vibes, except this isn't nearly as revolutionary.
 
I don't believe this was running at 1080p (the capture was 1080p60). The monitor was running at 4K.

I could have sworn Jensen stated the demo was running at 4K 60 fps because of vsync, and that it was recorded at 1080p 60 fps. Several times he referred to the beautiful 4K monitors they were running the presentation on, and said the game footage was run on one 2080 Ti for the game demos. So if that is 100% legit, you can definitely tell it runs faster than what this guy is reporting.

I watched it through Gordon Ung's stream and bingo session. Afterwards I watched quite a few videos, and people were straight out of nowhere with their comments; there was one really squeaky kid with a tech channel who was over-the-top wrong. People are just being overdramatic from the sticker shock.
 
1080p and only 40-50fps? On a $1200 graphics card? I can't pass any harder!
 
do a little more research as yes it was running at 1080p and even the developers responded
Actually, no. The dev never stated it was running at 1080p. The response from the dev just mentioned that the frame rate was not indicative of the final release; nothing was said about resolution. There are many articles mentioning it running at 1080p, yet they are all just reiterating the capture resolution.
 
Actually, no. The dev never stated it was running at 1080p. The response from the dev just mentioned that the frame rate was not indicative of the final release; nothing was said about resolution. There are many articles mentioning it running at 1080p, yet they are all just reiterating the capture resolution.

Whatever helps you sleep at night, bub.
 
Actually, no. The dev never stated it was running at 1080p. The response from the dev just mentioned that the frame rate was not indicative of the final release; nothing was said about resolution. There are many articles mentioning it running at 1080p, yet they are all just reiterating the capture resolution.
Well, let's just use some logic and basic common sense here. If the game was actually running at 4K, then everybody would be talking about that, including the developers. Some of you live in a magic fairy land if you think this game was getting that kind of performance at 4K on max settings with ray tracing enabled.
 
Well, let's just use some logic and basic common sense here. If the game was actually running at 4K, then everybody would be talking about that, including the developers. Some of you live in a magic fairy land if you think this game was getting that kind of performance at 4K on max settings with ray tracing enabled.
Ok, let's use some common sense here. We've been over 60 fps at 1080p for YEARS, and the 1080 Ti can do 4K60 in some titles. So you expect a new card with 800 extra CUDA cores, better RAM, and higher clocks CAN'T do 4K at 30-70 fps with ray tracing, when it specifically has hardware ray-tracing capabilities?

Who's not using common sense?
 