Shadow of the Tomb Raider Demo Is Hobbled by Real-Time Ray Tracing and Other Tech

Keep reaching. Not sure why you're bringing Adored into the discussion. I never mentioned him and have only been discussing ray tracing in this thread.

I'm not in the slightest. The main part of TechRadar's report was about 4K gaming without RT, so what are you doubting, exactly? Because, as I have stated before, if you are judging RT performance on the demos, you are a moron: knowing and seeing everything surrounding them, you are still forming opinions based on conjecture, without real hard numbers from an actual product, only unfinished ones.
 

If you don't understand my posts so far, then me explaining it even further would be of no use. If you want to believe TechRadar's numbers, that's fine. Whatever makes you happy.
 

Because Adored is one of the very few objective people at the moment; everyone else is going nuclear with minimal knowledge. Gordon Ung hit the nail on the head as well: we have traditional performance increases, and Nvidia's slides coincide with TechRadar's report on that. The unknown is RT performance, and as I have stated before, any numbers on RT are complete garbage and useless. You need to wait, and that is all there is to it. Spending countless hours arguing with multiple people in one forum does not make your point right.
 

Oh, I fully understand, and I am telling you straight out: judging RT based on the event is foolish, and it's okay if you can't understand this. There was too much BS going on in the background, from conflicting reports, to DICE, Eidos and Digital Eclipse saying they didn't have the cards until two weeks before the event, to judging performance on an unfinished product. People are going nuclear for zero reason, and it is utterly stupid.

On top of that, I posted TechRadar's article on non-RT performance to show that normal 4K gaming got a nice bump, and that it falls in line with the Nvidia slides claiming 35 to 50% over the previous gen. If you think for one second that RT isn't going to have an fps toll, then you're an idiot. Don't expect 100+ frames at 4K with RT; it's not happening. I doubt 4K will realistically even be an option with RT, not this early and not with first-generation RT hardware. But going the exact opposite, negative-Nancy route and reporting numbers that mean absolutely nothing makes zero sense as well. There were too many variables to get anything conclusive.
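
A quick back-of-the-envelope sketch of those numbers: the 60 fps previous-gen baseline and the 60% RT toll below are purely assumed illustration values, not benchmarks, but they show why a 35-50% raster uplift can coexist with a big fps drop once RT is switched on.

```python
# fps arithmetic for the "35 to 50% over previous gen" claim.
# prev_gen_fps and rt_toll are assumed values for illustration only.
prev_gen_fps = 60.0
uplifts = (1.35, 1.50)

# Rasterization-only frame rates from the claimed gen-on-gen uplift:
raster_fps = tuple(round(prev_gen_fps * u, 1) for u in uplifts)
print(raster_fps)  # (81.0, 90.0)

# Hypothetical toll: if enabling RT costs 60% of the frame rate,
# the generational gains are wiped out and then some.
rt_toll = 0.60
rt_fps = tuple(round(f * (1 - rt_toll), 1) for f in raster_fps)
print(rt_fps)  # (32.4, 36.0)
```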

Wait for the real numbers on RT. Even then, we need actual benchmarks and games built from the ground up with RT enabled, not tacked-on experiences.

This has been my whole point the entire time.
 

OK, TechRadar claims they were getting between 50 and 57 fps at 4K with ray tracing on, claims which were completely at odds with other websites. And your excuse for that was that maybe they moved the machine when TechRadar was using it, so it got better airflow.

Don't talk to me about being an idiot; your whole argument is not making sense. You came into a discussion where I was telling another person that the games were running at 1080p and that 4K was unlikely because ray tracing is too demanding, that the TechRadar numbers aren't right for ray tracing, and that if they aren't right for ray tracing, can we trust their numbers for normal gaming? Then you joined, to tell me that I am wrong, that the developers didn't have enough time to work on these. And then you told me that the TechRadar numbers are probably right, that they might have moved the computer, that it might not be 1080p, that all the results might be skewed. And now, with your latest post, you are still vouching for TechRadar's numbers while at the same time saying that 4K ray tracing is probably not even realistically possible (which is what I have been saying since I came into this thread, BTW).

So which is it? Are TechRadar's performance figures accurate, or is 4K ray tracing not a realistic goal at this point in time? If you say 4K is unrealistic at this time, surely you have to wonder how TechRadar played a game at 4K with full ray tracing on at 50-57 fps? And if they are that far out with those figures, can you even trust the rest of what they said?

And maybe you have to start believing the other websites, who said the games were running at 1080p and were struggling even at that. I totally understand that they are still working on them. But when 4A Games, who are working on Metro Exodus, come out and say they are aiming for 60 fps at 1080p, then it really looks like the ray tracing demos were only 1080p, and that even with further optimisation they still might only be hitting 60 fps at 1080p on release.
 
Oh, I am not a negative Nancy. I question all the information that's coming out, good and bad. But I am a lot more critical of any leaked information that sounds much better than what's coming out from other sources.
 
I don't get why people keep comparing different games with the only metric being 4K @ 60 FPS or whatever. You can run Pac-Man at 4K/60 FPS on a GTX 960. 4K/60 FPS/ray tracing means NOTHING in a vacuum. Maybe they hit 4K/50 FPS with ray tracing in Tomb Raider, and maybe completely different games, with completely different levels of ray-tracing utilization, more complex textures, or whatever else, could barely keep 60 FPS @ 1080p.

Rocket League runs @ 60 FPS with everything on, on almost any modern graphics card. That doesn't mean the same card can run every game @ 60 FPS.

Until real benchmarks come out, it's all just conjecture.
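
The resolution point can be made concrete with pixel counts: 4K is exactly four times the pixels of 1080p, so any per-pixel ray budget quadruples before game-to-game differences in how ray tracing is used even enter the picture. A minimal sketch (the one-primary-ray-per-pixel budget is an arbitrary assumption):

```python
# Pixel counts: 4K (2160p) is exactly 4x 1080p, so any per-pixel cost
# (rays cast, shading) scales 4x before game-specific workload differences.
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_4k = 3840 * 2160      # 8,294,400 pixels
scale = px_4k / px_1080p
print(scale)  # 4.0

# Hypothetical ray budgets at 60 fps with 1 primary ray per pixel:
rays_1080p60 = px_1080p * 60   # ~124 million rays/second
rays_4k60 = px_4k * 60         # ~498 million rays/second
print(rays_1080p60, rays_4k60)
```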
 

Yup, I definitely have to eat humble pie. But, as you said, you question everything, and so do I, and there were just too many places saying various things. I just don't understand why they would even bother with RT this gen if the 2070 may not even be able to handle 1080p30 (if Metro's target is for the 2080 Ti). Also, while I understood RT was going to be a hit to performance, I didn't expect it to be that much, given the dedicated hardware. I figured 100 fps without it, and 30-50 with it, wasn't too out of the question at 4K... but man, was I wrong. AMD doesn't even have to worry at this point about implementing it until after Navi. Also, the price point now looks even more egregious if the 2070 can't even do 1080p30 with RT on, because then the 2080 probably can't do 1080p60, which IMO is the bare minimum for that card to make sense.
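
Those fps targets translate directly into per-frame time budgets, which is what makes 1080p30 vs 1080p60 such a hard line. A rough sketch; the ~10 gigarays/s throughput is Nvidia's advertised figure for the 2080 Ti and is treated here as an assumption, not a measurement:

```python
# Frame-time budgets implied by the fps targets:
budget_30fps_ms = 1000 / 30   # ~33.3 ms per frame at 1080p30
budget_60fps_ms = 1000 / 60   # ~16.7 ms per frame at 1080p60
print(round(budget_30fps_ms, 1), round(budget_60fps_ms, 1))  # 33.3 16.7

# Back-of-the-envelope rays per pixel at 1080p60, assuming the advertised
# ~10 gigarays/s were fully usable (a marketing figure; real frames also
# spend time on shading, BVH traversal, and everything else):
rays_per_second = 10e9
pixels_1080p = 1920 * 1080
rays_per_pixel = rays_per_second / (pixels_1080p * 60)
print(round(rays_per_pixel))  # 80
```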
 
1) Because their pro series has it.
2) To get the ball rolling on development.
3) Good advertising.

I will always go back to HDR when it was first implemented: it sucked out all the performance, ruined AA, and had very limited support, but it was a popular add-on feature people wanted to use. RT could be that thing that sucks out performance but is good enough to make people say fuck it and use it. We also don't know final performance numbers for games that are building it into their engine versus patching it in. I know the Metro Exodus creators said they are aiming for 60 fps at 1080p with everything else maxed (maybe I made that "everything else maxed" part up, but still...).
 
A graphics card company wastes silicon on some feature that may or may not be useful in the future, and that requires developers to spend development time to utilize. Sounds familiar.

There is no "may or may not" here. It will be useful in the future; it's just that that future probably isn't for a while. However, it has to start somewhere.
 

Hey man, it's no biggie, and full respect to you. We were just having a discussion based on what we both knew, or thought we knew, at the time. It could have easily been me eating the humble pie. I really enjoyed the discussion, and it was great to have an argument with somebody without it descending into name-calling and other crap. They had to start somewhere: with AMD introducing Radeon Rays 2.0 into Vulkan earlier in the year, and now Nvidia releasing the first hardware-based ray tracing for consumer GPUs, the signs are that it will move along pretty fast. With all new tech, the early adopters are always hit hardest. I am still hopeful that the performance will be OK when the cards finally get here, and that the patch Microsoft is releasing for DX in October will bump up the performance a lot. And I know this goes against everything I have been saying, but maybe 30 fps at 4K might be achievable with enough time and patches.
 