Shadow of the Tomb Raider Demo Is Hobbled by Real-Time Ray Tracing and Other Tech

The bottom line, though, is that in the worst case it's not going to be slower than the old generation. Europe, along with several other regions, has the right to return a product built into its consumer laws. If the RTX weren't at least a good 5% faster than the previous generation, it would be a massive problem for Nvidia.

The reason ray tracing was the focus is that it's the future, even if the future isn't September 20. Who's going to spend money on a card that only has the typical 10% performance bump? For that matter, who's going to watch a two-hour talk about a graphics card that's merely faster than the previous generation?
 
They weren't paper launched; it was a pre-release. If only ten cards go out on the actual launch date, then it was a paper launch.

I have to say it's odd of Nvidia to do this. We've seen early announcements before, but have we seen pre-orders?

I don't think it's that odd. People are going to buy these on launch day regardless of what reviews say, so they might as well take advantage of that. It's a smart move from the business end of things. It also lets them gauge early consumer demand for cards at this price. If pre-order numbers are crap, they have time to adjust prices or come up with some kind of pre-order bonus/discount to entice people to hop on. If pre-order numbers are strong, it shows them that people are fine with these kinds of prices.
 
This reminds me of reading the Nvidia blog where they were implementing HDR on the PC version of Rise of the Tomb Raider...
 
I don't think it's that odd. People are going to buy these on launch day regardless of what reviews say, so they might as well take advantage of that. It's a smart move from the business end of things. It also lets them gauge early consumer demand for cards at this price. If pre-order numbers are crap, they have time to adjust prices or come up with some kind of pre-order bonus/discount to entice people to hop on. If pre-order numbers are strong, it shows them that people are fine with these kinds of prices.
I'm not against pre-orders for hardware, but I think it's silly to preorder anything when there isn't anything telling you how it performs. From the business side of things, they make sense. I just say it's odd for Nvidia, as they haven't done this before. At least I don't think they have.

This reminds me of reading the Nvidia blog where they were implementing HDR on the PC version of Rise of the Tomb Raider...
Look at the HDR implementation in Far Cry 1: it was a BIG seller for Nvidia, and it also tanked performance. That didn't stop them from using it as a major selling point, and people actually wanted to use it.

We will see how RT works and is received by the public soon enough.
 
The bottom line, though, is that in the worst case it's not going to be slower than the old generation. Europe, along with several other regions, has the right to return a product built into its consumer laws. If the RTX weren't at least a good 5% faster than the previous generation, it would be a massive problem for Nvidia.

The reason ray tracing was the focus is that it's the future, even if the future isn't September 20. Who's going to spend money on a card that only has the typical 10% performance bump? For that matter, who's going to watch a two-hour talk about a graphics card that's merely faster than the previous generation?

But by your words, 5% could mean any aspect of the card. The 20 series is obviously going to be at least 5% faster at ray tracing, so consumers won't have a leg to stand on even if the card were 5% slower at everything else.
 
But by your words, 5% could mean any aspect of the card. The 20 series is obviously going to be at least 5% faster at ray tracing, so consumers won't have a leg to stand on even if the card were 5% slower at everything else.

It's not going to be slower than the equivalent 1080-series card in non-ray-traced workloads. In terms of ray tracing, it's going to be leagues faster than a 1080. But I'm not worried about using a feature that would normally run at 1 fps at 1080p and now runs at 24 fps at 1080p.

It's going to be a generational update with some new features that are there to get things started. The sooner we start implementing ray tracing, even if only in minimal ways, the sooner we'll have cards that can run it at a decent frame rate.

If you're worried so much, just don't preorder, and wait for a review.
 
I'm not against pre-orders for hardware, but I think it's silly to preorder anything when there isn't anything telling you how it performs. From the business side of things, they make sense. I just say it's odd for Nvidia, as they haven't done this before. At least I don't think they have.


Look at the HDR implementation in Far Cry 1: it was a BIG seller for Nvidia, and it also tanked performance. That didn't stop them from using it as a major selling point, and people actually wanted to use it.

We will see how RT works and is received by the public soon enough.

Shit, HDR is still a problem in Far Cry 5. Apparently the fucking morons don't want you using it on every TV that's HDR capable. Yet Sony managed to get a massive number of titles working at 4K HDR on their shitty hardware. And every consumer UHD Blu-ray player and a fucking $60 Roku have no problem.
 
Didn't like what I saw one bit. I'm getting Doom 3 vibes, except this isn't nearly as revolutionary.

Watch their Metro Exodus demo again. When they turn on direct lights only, THAT is Doom 3. Turning on ray tracing is a revolutionary improvement over that.
 
144Hz monitor price drop.
Finally, we can all chase that silky smooth 23.976 fps.

 
Just wait for later video cards until the technology matures. It's easier to wait in Canada, since we pay more for the same cards here. I'm going to wait until more games support ray tracing and the tech has actually matured.
 
For what it's worth, AnandTech seems pretty confident that SotTR and BFV, at least, were running at 1080p and struggling to hold 60 FPS with ray tracing on.

If they say it, I fully believe them.
 
If it's stable at 1080p (60+ fps) at ultra settings (nearly everything maxed) with ray tracing, okay, maybe I "would" let that price slide, but dipping into 30 fps at 1080p is not really good. I'm pretty sure "optimization" isn't a good excuse at this point; 1080p is like the 720p of the Full HD era now. A good demo would run unoptimized code at 1080p at 60 FPS, then brag that with optimization you could go even higher (especially with the advent of G-Sync and FreeSync).

On the positive side, it's really a testament that ray tracing is finally arriving for general consumer/gaming use.
 
Still waiting for MMX to make the big leap in games... lol
The game looks really good, but how do you enjoy those graphics and stay alive at the same time? lol
I'm still having fun with my 1060 at 1080p and my Rift... maybe I'll upgrade when the 4080 Ti comes out... or I win the lottery... whichever comes first... I'm putting my money on the 4080 Ti though :)
 
Jensen said all the demos he was going to show on stage were run at 4K, and there was a SotTR demo during the stage presentation. If you think the guys who couldn't and didn't see the resolution settings, yet claim 1080p at 30-70 fps (on a 4K monitor, by the way, and yes, I see the confusion about GFE recording at game resolution, but without actually seeing it, that can't be confirmed), were correct, wouldn't you expect the demo Jensen showed at 4K to be running at about 5-10 fps? It clearly wasn't. Or do you think that maybe these two guys, who again didn't see the resolution, simply got it wrong or misheard, versus Jensen and TechRadar (and again, I'll grant that TechRadar stated it was running at 50-57 fps; maybe they were writing off the lower-fps stutters as driver issues, who knows). We are in a world of 4K now when talking about high-end graphics cards, cards that cost 800-1,100 dollars. I'm not going to say it's impossible that it's 1080p, but it makes way more sense that it was running at 4K considering the other information. If the video showed a constant 30 fps I'd be disappointed, but I'd still say it was running at 4K and call it plausible for a first-gen card to only hit 4K at 30 fps; we have no idea how things are being produced on these cards. But to suggest it's only getting 30-60 fps at 1080p is going the other way in terms of realistic expectations.

AnandTech has said the same thing: the games they demoed were running at 1080p. We know the monitor they used for the demo was 1080p. Come on, man, you can't keep ignoring the amount of information saying the games were running at 1080p with ray tracing on. If it was 4K, why didn't they use 4K monitors?

"For Battlefield V, the situation was similar with a 1080p 144Hz monitor, playing on the Rotterdam map over LAN"

So here are the facts: no BS.

We know that two of the games journalists demoed weren't running smoothly. That's from several sites and the developers themselves.
We know the demos were run on a 1080p monitor.
We know GeForce Experience was recording at the game resolution, and while the site couldn't see what resolution the game was running at, they could see that GFE was recording at 1080p.

There's too much evidence pointing to 1080p. Come on, I know you want it to be 4K, but it's just not the case.
 
I mentioned a few things about this in another thread: the new run of TR games have always been beasts that can bring down even the biggest GPU giants. I still use the first one for various new card/display benches. With every in-game setting maxed, ROTR at 4K will bring my heavily OC'd 1080 Ti to a crawl (but a playable 40-60 fps at 1440p). P.S. VRAM also hits 11 GB at 4K in the Russian forest/camp areas with those settings. It's staggering to watch each AA setting chew up more VRAM in this game. That being said, I would expect no less from the newest one.

On the other hand, I'm pretty sure most people have spent the last two years saying they wanted a 4K/60 fps card to play the games they already have, not another new feature that keeps them crawling. If RT is going to take this much of a performance hit, then its cost is no different than 16x AA was 10+ years ago, or than 4x MSAA still is at 4K.
 
AnandTech has said the same thing: the games they demoed were running at 1080p. We know the monitor they used for the demo was 1080p.

Being totally facetious here, but maybe they were upscaling! LOL
 
PCGamesHardware was able to view a demo of Shadow of the Tomb Raider running on a GeForce RTX 2080 Ti with real-time ray tracing and possibly max graphics enabled at 1080p. In addition, a frame rate counter (FRAPS) was enabled. The game was running in the low 30s for extended periods of time. Of course, this is a demo running unfinished code, possibly at max settings, and the real-time ray tracing will be added in a post-launch patch, but the game does release next month. Here is a link to the video with FRAPS-enabled footage at the beginning. As DSOGaming noticed, the official trailer for Shadow of the Tomb Raider has uneven frame rates.

Our friends over at PCGamesHardware have shared an off-camera video showing the game running with real-time ray tracing on an NVIDIA GeForce RTX 2080 Ti. Thankfully, a FRAPS overlay was present, and we can see that this new GPU can run the game at 30-70 fps.

...
https://www.techradar.com/reviews/nvidia-geforce-rtx-2080-ti
...
People keep reporting different numbers. However, Eidos' Twitter account lit up afterwards explaining that it was not optimized at this early stage and they were casting more rays than they were supposed to in the demo; unlike DICE, they don't have it all together yet. SotTR may release next month, but RT support is coming through a patch later. Either way it will be interesting. At this point I would only buy a 20-series card for the legacy gaming numbers, as RT is a gamble in terms of the performance hit one might take.
...
Honestly, IMO, *if* RT is that impactful, I don't mind running games at 1080p around the 120 fps mark because of how game-changing the tech actually is; 1440p at 60 fps would also be fine, because the toll for the feature would still come at near-modern resolutions (I know, but keep in mind that 1080p and the 1060 are still the most widely used). It would be disappointing to drop $1,200 on something that can't do 4K RT reasonably. We have a month to let this stew until we get actual numbers, then probably up to six months to wait for proper benchmarking tools (Heaven/3DMark) and games. But I wouldn't base my expectations on games where the feature was just tacked on, like SotTR, as they will in all likelihood be the worst representation of the tech.
 
Does it really make sense that Nvidia's $1,000 flagship card is only capable of 30-40 fps at 1080p, and slower than its two-year-old predecessor?

I am sure that in normal games the 2080 Ti will be faster than the 1080 Ti. I don't know how much faster, but maybe 20 to 30%.

In ray tracing games, I am sure it will be faster than the 1080 Ti as well. And I would say that in real-time ray tracing the 1080 Ti would only manage single-digit frame rates at 1080p. Yes, real-time ray tracing is that demanding.
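
To put "that demanding" into rough numbers, here's a quick back-of-envelope sketch. The two-rays-per-pixel figure is purely an assumption for illustration, and real hybrid renderers also pay for BVH traversal, shading and denoising on top of the raw ray count:

# Rough ray budget for real-time ray tracing at 1080p / 60 fps.
# Assumption: 2 rays per pixel (say, one shadow ray + one reflection ray);
# actual games vary widely, this is only meant to give a sense of scale.
width, height = 1920, 1080
fps = 60
rays_per_pixel = 2

rays_per_frame = width * height * rays_per_pixel
rays_per_second = rays_per_frame * fps

print(f"{rays_per_frame / 1e6:.1f} million rays per frame")
print(f"{rays_per_second / 1e9:.2f} billion rays per second")
# -> ~4.1 million rays per frame, ~0.25 billion rays per second,
#    before any shading, denoising or BVH rebuild cost is counted.

Even with those modest assumptions you're already at hundreds of millions of rays per second, which is roughly why a card without dedicated RT hardware falls to single digits.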
 
I mentioned a few things about this in another thread: the new run of TR games have always been beasts that can bring down even the biggest GPU giants. I still use the first one for various new card/display benches. With every in-game setting maxed, ROTR at 4K will bring my heavily OC'd 1080 Ti to a crawl (but a playable 40-60 fps at 1440p). P.S. VRAM also hits 11 GB at 4K in the Russian forest/camp areas with those settings. It's staggering to watch each AA setting chew up more VRAM in this game. That being said, I would expect no less from the newest one.

On the other hand, I'm pretty sure most people have spent the last two years saying they wanted a 4K/60 fps card to play the games they already have, not another new feature that keeps them crawling. If RT is going to take this much of a performance hit, then its cost is no different than 16x AA was 10+ years ago, or than 4x MSAA still is at 4K.

I mean, if you're using a computer monitor (35 inches, 4K), from a playability standpoint running 4K with AA ramped all the way up is overkill, and it's usually left to the special gamers out there so focused on running ULTRA settings in every game. 2x AA at most is all that's needed; the 2080 Ti may fix that stupidity in legacy gaming in some games. At this point it depends on the quality of the game experience with RT enabled and whether it's worth the trade-off. Also, on SotTR, see my other comment.
 
I don't trust TechRadar's ray tracing numbers. They are so at odds with what other tech websites have reported and with what the developer said. Unless, of course, TechRadar was special and got a 4K demo that ran at 50+ fps while the other peasant tech sites were only shown a 1080p demo that ran at 30+ fps. :p:)

I don't know; during the presentation it was running at 4K capped at a 60 Hz refresh, as Jensen explained. IDK, to be fair, but I wouldn't use SotTR as an example of RT performance; it's obvious they are having multiple problems implementing it. I wouldn't use any game that had the feature tacked on halfway through to the end of its development as a benchmark of any kind. We really have to wait and see.
 
I don't know; during the presentation it was running at 4K capped at a 60 Hz refresh, as Jensen explained. IDK, to be fair, but I wouldn't use SotTR as an example of RT performance; it's obvious they are having multiple problems implementing it. I wouldn't use any game that had the feature tacked on halfway through to the end of its development as a benchmark of any kind. We really have to wait and see.

I am not just basing it on SotTR. Battlefield V wasn't able to run at 60 FPS at 1080p either, according to the tech websites.
 
Watch their Metro Exodus demo again. When they turn on direct lights only, THAT is Doom 3. Turning on ray tracing is a revolutionary improvement over that.
I know what the guy means; I also thought of Doom 3 watching this demo... things look a bit too shiny/plasticky, especially skin tones... maybe some added filtering or "camera muckification" might help.

It's probably the same problem we saw going from SD to HD in film, or the difference between Lord of the Rings and The Hobbit... the content creators/artists need to put extra work into compensating for the added realism, which is now exposing how fake things really are.
 
I don't know; during the presentation it was running at 4K capped at a 60 Hz refresh, as Jensen explained. IDK, to be fair, but I wouldn't use SotTR as an example of RT performance; it's obvious they are having multiple problems implementing it. I wouldn't use any game that had the feature tacked on halfway through to the end of its development as a benchmark of any kind. We really have to wait and see.

Go look at the demonstration again. Whenever they enable RT the frame rate plummets; you can see it in the animation of the character models. Those are also fairly static scenes without a ton of models or effects, outside of lighting and shadows. If those moments, likely specifically designed to show off RT, had FPS drops, imagine what happens with the actual game running.
 
That is *super* heavy...

We're basically entering holy grail territory though, so I'm not super surprised by the rough patch. This has been a really long time coming. It's still ludicrously brutal performance-wise, though.

I guess it also remains to be seen whether we can dial some settings back and get 80% of the visuals at half the cost or something. For all we know, everything is ratcheted to 11.
 
I know what the guy means; I also thought of Doom 3 watching this demo... things look a bit too shiny/plasticky, especially skin tones... maybe some added filtering or "camera muckification" might help.

It's probably the same problem we saw going from SD to HD in film, or the difference between Lord of the Rings and The Hobbit... the content creators/artists need to put extra work into compensating for the added realism, which is now exposing how fake things really are.

Oh, I see what you mean.
 
A graphics card company wastes silicon on some useless feature that may or may not be useful in the future, and requires developers to spend development time to utilize. Sounds familiar.
 
AnandTech has said the same thing: the games they demoed were running at 1080p. We know the monitor they used for the demo was 1080p. Come on, man, you can't keep ignoring the amount of information saying the games were running at 1080p with ray tracing on. If it was 4K, why didn't they use 4K monitors?

"For Battlefield V, the situation was similar with a 1080p 144Hz monitor, playing on the Rotterdam map over LAN"

So here are the facts: no BS.

We know that two of the games journalists demoed weren't running smoothly. That's from several sites and the developers themselves.
We know the demos were run on a 1080p monitor.
We know GeForce Experience was recording at the game resolution, and while the site couldn't see what resolution the game was running at, they could see that GFE was recording at 1080p.

There's too much evidence pointing to 1080p. Come on, I know you want it to be 4K, but it's just not the case.

I'm not ignoring it; I'm asking for clarification. I've asked all of them so far: a couple of YouTubers who were there and didn't mention 1080p or 4K, a couple of article authors (including AnandTech's, PCGamesN's, etc.). One place said the monitors were 4K displays; AnandTech says they were 1080p displays. One site says 4K at 50 fps with RT; a couple of others say 1080p at 30-50 fps... big differences that just seem odd. Jensen showing 4K on screen without it being a slideshow, versus 30 fps at 1080p, seems like a stretch, but I'm willing to find out. If the 2080 Ti is struggling that much, what should the 2070/2080 expect? Not much at all, which doesn't bode well for sales. At least we know some benchers have the 2080 Ti in hand, so hopefully we get some leaks.
 
I'm not ignoring it; I'm asking for clarification. I've asked all of them so far: a couple of YouTubers who were there and didn't mention 1080p or 4K, a couple of article authors (including AnandTech's, PCGamesN's, etc.). One place said the monitors were 4K displays; AnandTech says they were 1080p displays. One site says 4K at 50 fps with RT; a couple of others say 1080p at 30-50 fps... big differences that just seem odd. Jensen showing 4K on screen without it being a slideshow, versus 30 fps at 1080p, seems like a stretch, but I'm willing to find out. If the 2080 Ti is struggling that much, what should the 2070/2080 expect? Not much at all, which doesn't bode well for sales. At least we know some benchers have the 2080 Ti in hand, so hopefully we get some leaks.

Does the fact that they were using 1080p monitors not ring any alarm bells? It would be a very odd choice for them to use 1080p monitors to demo 4K ray tracing.

The one other site that says 4K at 50 fps, along with TechRadar, is Wccftech, and you know they are just copying and pasting from the TechRadar article. The one place that says 4K monitors is TechRadar. But we actually know the monitor that was used, and it's not 4K; it's a 1080p Asus. I will get the exact model and post it.

The onstage demo was a fairly static scene, though, with not much going on, whereas the journalists were actually playing the game. I am not sure what drugs TechRadar was on.

Yes, roll on the leaks. But other people have said it's really strange that we haven't had any leaks of any kind yet.
 
I am surprised people turn down ambient occlusion and shadows first. They are what make a game look realistic. Model detail and distortion/post-processing effects are usually less noticeable and help performance. Shadows and lighting are critical, to me at least.

The max setting on shadows is always WAAAAAAAAYYYYY too sharp. Show me a real shadow whose outline is an exact, super-sharp outline of the object casting it, EXCEPT when the lighting is set up specifically for that effect. And even then, you are not going to get the super-sharp shadow edges that show up in a lot of games when the shadow setting is on max.

Real shadows under regular daytime or artificial lighting are going to have pretty soft outlines.

Turning the shadows down a notch or two (or three, depending on the game) makes them look way more realistic.
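
For anyone curious what "softer" shadows actually are under the hood, here's a minimal sketch of percentage-closer filtering (PCF), the common way games blur a shadow-map edge; the function name and the plain-list shadow map are just illustrative, not any particular engine's API:

def pcf_shadow(shadow_map, u, v, receiver_depth, radius=1, bias=0.002):
    """Percentage-closer filtering: average several binary depth tests
    around (u, v) so the shadow edge becomes a gradient instead of a hard
    step. A larger radius gives softer (more realistic) penumbrae."""
    h, w = len(shadow_map), len(shadow_map[0])
    lit, samples = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x = min(max(u + dx, 0), w - 1)
            y = min(max(v + dy, 0), h - 1)
            # The sample counts as lit if the closest occluder stored in the
            # map is farther from the light than the surface being shaded.
            if shadow_map[y][x] > receiver_depth - bias:
                lit += 1
            samples += 1
    return lit / samples  # 0.0 = fully in shadow, 1.0 = fully lit

# Toy example: a 4x4 shadow map whose right half is occluded.
shadow_map = [[1.0, 1.0, 0.3, 0.3]] * 4
print(pcf_shadow(shadow_map, 1, 1, receiver_depth=0.9))  # ~0.67, a soft edge

A single hard depth test per pixel is what produces the razor-sharp, stair-stepped edges; averaging a small neighbourhood is the cheap way engines fake the soft penumbra you see in real lighting.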
 
I mean, if you're using a computer monitor (35 inches, 4K), from a playability standpoint running 4K with AA ramped all the way up is overkill, and it's usually left to the special gamers out there so focused on running ULTRA settings in every game. 2x AA at most is all that's needed; the 2080 Ti may fix that stupidity in legacy gaming in some games. At this point it depends on the quality of the game experience with RT enabled and whether it's worth the trade-off. Also, on SotTR, see my other comment.

True, but I also use a 55" HDR TV from about 10 feet away and a 4K/HDR projector at 200+ inches.

In either case, I'm a sucker for a Ti and I'll probably get one. The hardest part is that Nvidia has gotten really bad lately about releasing endless variations of things, and I can't help but wonder what will come out around spring.
 
Jensen said all the demos he was going to show on stage were run at 4K, and there was a SotTR demo during the stage presentation. If you think the guys who couldn't and didn't see the resolution settings, yet claim 1080p at 30-70 fps (on a 4K monitor, by the way, and yes, I see the confusion about GFE recording at game resolution, but without actually seeing it, that can't be confirmed), were correct, wouldn't you expect the demo Jensen showed at 4K to be running at about 5-10 fps? It clearly wasn't. Or do you think that maybe these two guys, who again didn't see the resolution, simply got it wrong or misheard, versus Jensen and TechRadar (and again, I'll grant that TechRadar stated it was running at 50-57 fps; maybe they were writing off the lower-fps stutters as driver issues, who knows). We are in a world of 4K now when talking about high-end graphics cards, cards that cost 800-1,100 dollars. I'm not going to say it's impossible that it's 1080p, but it makes way more sense that it was running at 4K considering the other information. If the video showed a constant 30 fps I'd be disappointed, but I'd still say it was running at 4K and call it plausible for a first-gen card to only hit 4K at 30 fps; we have no idea how things are being produced on these cards. But to suggest it's only getting 30-60 fps at 1080p is going the other way in terms of realistic expectations.

AGAIN, it was 1080p. Even Digital Foundry had hands-on time themselves and flat out said they were running the Shadow of the Tomb Raider demo at 1080p on the 2080 Ti, with the frame rate regularly dipping well below 60 fps. They also were not very impressed with the look of the game with RT on.

Go to the 18:05 mark.

 
I am not just basing it on SotTR. Battlefield V wasn't able to run at 60 FPS at 1080p either, according to the tech websites.

According to posts from the devs, they didn't have the hardware until two weeks ago, so they were shooting blind with their code until that point. I wouldn't take these demos as any kind of indication. 3DMark is releasing a new benchmark at the end of September, and until we get games that were designed around the tech from the start of development, I wouldn't let it sway you on RT just yet. Keep in mind you are not going to get 100+ fps at 4K with RT; I'm not saying that. There will be a toll to pay, but I seriously doubt it's that bad. AdoredTV did a video and clearly states they had no time to implement it for the demo and the game was running in an alpha/beta state, so, as he said, those are probably worst-case numbers for the tech. Also, finalized drivers aren't due out until September, which actually isn't that bad from a technical standpoint. For legacy gaming performance, we can see what we're getting, but you should still wait for the actual benchmarks.
 
I don't trust TechRadar's ray tracing numbers. They are so at odds with what other tech websites have reported and with what the developer said. Unless, of course, TechRadar was special and got a 4K demo that ran at 50+ fps while the other peasant tech sites were only shown a 1080p demo that ran at 30+ fps. :p:)

They aren't ray tracing numbers; they are legacy gaming numbers they're reporting. I've seen 50-60 fps quoted for the RT demos, which is possible if they moved one of the computers, as they were all clumped together blowing heat at one another, so it's possible some were thermal throttling. The 30-40 fps quote is from only one source that everyone keeps repeating, the one that had FRAPS up, and even then they couldn't verify 1080p; they just assumed that was the capture resolution, so it could well be skewed.

On RT, I wouldn't use any of the garbage numbers from the event to form an opinion about what we will see. The absolute best thing you can do is wait for finished games that were developed with the tech from start to finish. These demos were tacked-on experiences because the hardware wasn't available to see or use until then; multiple sources have said they only had the cards two weeks prior to the event... there is nothing like shooting code in the dark without something to test it on. If you have time, I suggest watching AdoredTV's video; he puts it at its worst and at its best. I don't think I have ever heard him congratulate and damn Nvidia at the same time like that, and he goes through with a fine-tooth comb about how basically you're an idiot for forming an opinion this early on something that is clearly in an alpha/beta state. I agree with him on that.
 
lol, yes, the Nvidia HairWorks FPS hit was ridiculous for what it did in TW3 on my 980 Ti at the time. It plays fine today at 4K on a 1080 Ti, though.

TBF, TressFX was the same thing, and Nvidia just offered their version of it because games like Lichdom went out of their way to lock Nvidia users out of it at AMD's request (there are statements from the devs about this on Steam; I'm not linking sources, go find them yourself, and I'm not going to argue with fanboys over this).

It was wildly demanding of compute from the card. Both AMD and Nvidia won't compromise their lower-end workstation products (which cost thousands versus hundreds) just so gamers can have the experience at 100+ frames with TressFX/HairWorks; this is a limitation of marketing and of maintaining separate product lines for content creation and gaming, since both bring in revenue. People will cry over the 2080 Ti's price, but the amount of compute pretty much demands it; otherwise they would be cannibalizing the Quadro-tier products. I have no doubt in my mind that smaller amateur video production companies and individuals will be buying 2080 Tis ($2.4-4.8k) for that purpose rather than gaming, to save money over the $20k package, and the same goes for amateur scientific communities that lack funds and resources.
 
They aren't ray tracing numbers; they are legacy gaming numbers they're reporting. I've seen 50-60 fps quoted for the RT demos, which is possible if they moved one of the computers, as they were all clumped together blowing heat at one another, so it's possible some were thermal throttling. The 30-40 fps quote is from only one source that everyone keeps repeating, the one that had FRAPS up, and even then they couldn't verify 1080p; they just assumed that was the capture resolution, so it could well be skewed.

On RT, I wouldn't use any of the garbage numbers from the event to form an opinion about what we will see. The absolute best thing you can do is wait for finished games that were developed with the tech from start to finish. These demos were tacked-on experiences because the hardware wasn't available to see or use until then; multiple sources have said they only had the cards two weeks prior to the event... there is nothing like shooting code in the dark without something to test it on. If you have time, I suggest watching AdoredTV's video; he puts it at its worst and at its best. I don't think I have ever heard him congratulate and damn Nvidia at the same time like that, and he goes through with a fine-tooth comb about how basically you're an idiot for forming an opinion this early on something that is clearly in an alpha/beta state. I agree with him on that.

Keep reaching. Not sure why you're bringing Adored into the discussion; I never mentioned him, and I have only been discussing ray tracing in this thread.
 