AMD RDNA 2 gets ray tracing

Yes, performance much better than Turing, since Turing has pretty much sucked for RT gaming from a performance perspective, and there has been a lack of really innovative games, or games where one would say "I want that!".
Do you have any proof to support this "much better performance" claim? It can be a simple leak, as I would not expect anything more substantial at this point in time anyway.
Not only do we not have any leaks about that performance, we didn't even really get leaks about what kind of hardware resources AMD will put into their chips.
At this point all we can really say is that high-end RDNA2 models will support DXR, and nothing else.
Best case scenario, RDNA2 simply does DXR much better than Turing and can process more ray intersections, and that is that. Worst case scenario, RDNA2 will only partially accelerate DXR and the rest will muddy the waters in ways I'd better not go into.

Nvidia does not make games so what is this lack of innovative games argument even about?

The minimum for RT should be something like 1440p 100fps+ with max or near-max settings and RT, and 4K 60fps+. Preferably 100fps+ at 4K, but most normal games can't even achieve that with a 2080 Ti, except maybe Doom Eternal and older titles.
Nvidia provided hardware ray tracing with the level of performance they were able to make available at that time and that is that.
We are all benefiting from the fact that they did; otherwise we would not have a few RT games now, we would have zero.

Let's wait and see if AMD can deliver the kind of RT performance that is satisfactory to you.
 
Do you have any proof to support this "much better performance" claim? It can be a simple leak, as I would not expect anything more substantial at this point in time anyway.
Not only do we not have any leaks about that performance, we didn't even really get leaks about what kind of hardware resources AMD will put into their chips.
At this point all we can really say is that high-end RDNA2 models will support DXR, and nothing else.
Best case scenario, RDNA2 simply does DXR much better than Turing and can process more ray intersections, and that is that. Worst case scenario, RDNA2 will only partially accelerate DXR and the rest will muddy the waters in ways I'd better not go into.

Nvidia does not make games so what is this lack of innovative games argument even about?


Nvidia provided hardware ray tracing with the level of performance they were able to make available at that time and that is that.
We are all benefiting from the fact that they did; otherwise we would not have a few RT games now, we would have zero.

Let's wait and see if AMD can deliver the kind of RT performance that is satisfactory to you.
Going from AMD's raytracing demo...I would not hold my breath...
 
Going from AMD's raytracing demo...I would not hold my breath...
Did you check the Unreal demo for PS5? The lighting makes a huge improvement in visuals to me, more than the everything-is-shiny demos that we keep getting. I'm hoping to see more like that from "raytracing".

Edit: I'm not saying this makes AMD better/worse than Nvidia, just pointing out that the quality isn't all about reflective surfaces.
 
Did you check the Unreal demo for PS5? The lighting makes a huge improvement in visuals to me, more than the everything-is-shiny demos that we keep getting. I'm hoping to see more like that from "raytracing".

Edit: I'm not saying this makes AMD better/worse than Nvidia, just pointing out that the quality isn't all about reflective surfaces.

If you watch the video, and pay attention, you will find that they use "hacks" to simulate raytracing...this is not what you think it is...a "competitor" to RT.
In fact you can see the lag in their bounces quite clearly...unless you only play on a console and are not used to better.

AMD
NVIDIA
Intel.

They are all going DXR...good luck staying in the past.
 
If you watch the video, and pay attention, you will find that they use "hacks" to simulate raytracing...this is not what you think it is...a "competitor" to RT.
In fact you can see the lag in their bounces quite clearly...unless you only play on a console and are not used to better.

AMD
NVIDIA
Intel.

They are all going DXR...good luck staying in the past.
Nvidia raytracing is a hack, tacked on top of rasterization, with a limited number of samples for ray-traced IQ. AMD will probably be similar in this regard as well. Can it improve IQ and not destroy performance? That is the real question.
Do you have any proof to support this "much better performance" claim? It can be a simple leak, as I would not expect anything more substantial at this point in time anyway.
Not only do we not have any leaks about that performance, we didn't even really get leaks about what kind of hardware resources AMD will put into their chips.
At this point all we can really say is that high-end RDNA2 models will support DXR, and nothing else.
Best case scenario, RDNA2 simply does DXR much better than Turing and can process more ray intersections, and that is that. Worst case scenario, RDNA2 will only partially accelerate DXR and the rest will muddy the waters in ways I'd better not go into.

Nvidia does not make games so what is this lack of innovative games argument even about?


Nvidia provided hardware ray tracing with the level of performance they were able to make available at that time and that is that.
We are all benefiting from the fact that they did; otherwise we would not have a few RT games now, we would have zero.

Let's wait and see if AMD can deliver the kind of RT performance that is satisfactory to you.
No claim, no idea if it's better or worse. For AMD RT to be useful, I would say it will need to be better than Turing, since Turing performance sucked when RTX was used. I was responding to your own statement previously. The only saving grace is DLSS 2, which is very good. Will next-gen RT console games use DLSS when the games are ported to the PC? Then again, the UE5 demo, without using RT, looked better than any current RT game out there, and Crytek will be using non-hardware-specific RT, which we should get a taste of with the remake of Crysis. In the scheme of things, RTX on Turing has been not only disappointing but almost useless in games. As for AMD, the demo they produced was also very discouraging.
 
If you watch the video, and pay attention, you will find that they use "hacks" to simulate raytracing...this is not what you think it is...a "competitor" to RT.
In fact you can see the lag in their bounces quite clearly...unless you only play on a console and are not used to better.

AMD
NVIDIA
Intel.

They are all going DXR...good luck staying in the past.
Really not sure what you're going on about. My point was that you can (and games do) use raytracing to do lighting as well as shiny surfaces. I am more interested in better lighting of the scene than in having really shiny ground with puddles all over. This is done USING DXR, so no clue what you're talking about with "they are going to DXR" and "good luck staying in the past". I'm literally talking about using DXR (or the equivalent Vulkan API) to improve the quality of a scene. This IS their upcoming game engine that includes hardware DXR to do the lighting (as well as other stuff like geometry scaling). Welcome to the present? This applies equally to all three of those you listed, yay. This isn't a purely console thing, the demo just happened to be on a console. Similar things are being done in PC games, casting rays to determine if a surface is in shadow or not and bouncing rays to apply some ambient lighting (via Monte Carlo or similar). Thanks for letting me know to stay in the past while looking at current and future implementations. Maybe you need to keep up?
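To make the shadow-ray and Monte Carlo ambient lighting part concrete, here is a rough, hardware-agnostic sketch in Python of a Monte Carlo occlusion estimate. The sphere scene, the function names, and the sample count are invented purely for illustration; this is not code from any engine, just the shape of the technique:

import math, random

def ray_sphere_hit(origin, direction, center, radius):
    # Nearest positive hit distance of a (normalized) ray against a sphere, or None.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c              # a = 1 because direction is unit length
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def hemisphere_dir(normal):
    # Random unit direction, flipped into the hemisphere around the surface normal.
    while True:
        d = [random.gauss(0.0, 1.0) for _ in range(3)]
        n = math.sqrt(sum(x * x for x in d))
        if n > 1e-6:
            d = [x / n for x in d]
            break
    if sum(a * b for a, b in zip(d, normal)) < 0.0:
        d = [-x for x in d]
    return d

def ambient_occlusion(point, normal, occluders, samples=64, max_dist=2.0):
    # Fraction of random hemisphere rays that escape without hitting an occluder:
    # 1.0 means fully open sky, 0.0 means fully blocked.
    open_sky = 0
    for _ in range(samples):
        d = hemisphere_dir(normal)
        hits = (ray_sphere_hit(point, d, c, r) for c, r in occluders)
        t = min((h for h in hits if h is not None), default=None)
        if t is None or t > max_dist:
            open_sky += 1
    return open_sky / samples

# Hypothetical scene: a point on a floor next to a unit sphere resting on that floor.
occluders = [((0.0, 1.0, 0.0), 1.0)]
print(ambient_occlusion((1.5, 0.0, 0.0), (0.0, 1.0, 0.0), occluders))

A shadow ray is the same idea with a single ray aimed at the light instead of random hemisphere directions; the hard part in real time is doing millions of these per frame and denoising the result.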
 
Nvidia raytracing is a hack, tacked on top of rasterization, with a limited number of samples for ray-traced IQ. AMD will probably be similar in this regard as well. Can it improve IQ and not destroy performance? That is the real question.

No claim, no idea if it's better or worse. For AMD RT to be useful, I would say it will need to be better than Turing, since Turing performance sucked when RTX was used. I was responding to your own statement previously. The only saving grace is DLSS 2, which is very good. Will next-gen RT console games use DLSS when the games are ported to the PC? Then again, the UE5 demo, without using RT, looked better than any current RT game out there, and Crytek will be using non-hardware-specific RT, which we should get a taste of with the remake of Crysis. In the scheme of things, RTX on Turing has been not only disappointing but almost useless in games. As for AMD, the demo they produced was also very discouraging.
I hope it's better than Turing, as it will be competing with Ampere, which will have improved RT performance (not sure how much yet, but the claims are pretty hefty). I was under the impression (maybe incorrectly) that the UE5 demo was using rays to perform the lighting, not to do reflections. I agree, the AMD demo was not the best showing; I much prefer accurate lighting that makes scenes realistic over unnatural mirrors all over the place.
 
I hope it's better than Turing, as it will be competing with Ampere, which will have improved RT performance (not sure how much yet, but the claims are pretty hefty). I was under the impression (maybe incorrectly) that the UE5 demo was using rays to perform the lighting, not to do reflections. I agree, the AMD demo was not the best showing; I much prefer accurate lighting that makes scenes realistic over unnatural mirrors all over the place.
It appears to be a totally different approach, not using hardware raytracing in this demo, though it may in the future, from what I gather. RTX, for example, finds the light intersections after geometry setup; Lumen works very differently and does not do normal triangle rendering. Lumen and Nanite (whose geometry uses primitive shaders) may have to work together. The link below may be useful:

"Lumen uses ray tracing to solve indirect lighting, but not triangle ray tracing," explains Daniel Wright, technical director of graphics at Epic. "Lumen traces rays against a scene representation consisting of signed distance fields, voxels and height fields. As a result, it requires no special ray tracing hardware."
https://www.eurogamer.net/articles/...eal-engine-5-playstation-5-tech-demo-analysis
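To make the signed-distance-field tracing idea above concrete: sphere tracing marches a ray forward by repeatedly asking the distance field how far away the nearest surface is and stepping exactly that far, so no triangles are ever tested. A toy Python sketch; the one-sphere-plus-floor scene and the names are made up for illustration and have nothing to do with Lumen's actual code:

import math

def scene_sdf(p):
    # Signed distance to the nearest surface: a unit sphere at (0, 1, 0) plus a floor at y = 0.
    x, y, z = p
    d_sphere = math.sqrt(x * x + (y - 1.0) ** 2 + z * z) - 1.0
    d_floor = y
    return min(d_sphere, d_floor)

def sphere_trace(origin, direction, max_steps=128, max_dist=100.0, eps=1e-3):
    # March along the ray; the SDF value is a safe step size that cannot cross a surface.
    t = 0.0
    for _ in range(max_steps):
        p = [o + t * d for o, d in zip(origin, direction)]
        dist = scene_sdf(p)
        if dist < eps:
            return t          # hit: distance along the ray
        t += dist
        if t > max_dist:
            break
    return None               # miss

# Example: a ray from an eye point straight at the sphere hits at t = 4.
print(sphere_trace((0.0, 1.0, -5.0), (0.0, 0.0, 1.0)))

The trade-off is that the distance field is a blurred stand-in for the real geometry, which is why this kind of tracing is cheap but not pixel-exact the way triangle ray tracing is.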

Now of note, though I have no real clue here: if AMD raytracing follows their patent, the ray intersections for AMD are done inside the texture block, and not separately after geometry setup as in RTX, which to me means AMD's method would support non-traditional geometry setups such as how Nanite operates. That may mean RT games could support the AMD hardware method but not Nvidia's (which is not ideal unless Nvidia changes RTX). Not 100% sure on any of this.
 
Nvidia raytracing is a hack, tacked on top of rasterization, with a limited number of samples for ray-traced IQ. AMD will probably be similar in this regard as well. Can it improve IQ and not destroy performance? That is the real question.

No claim, no idea if it's better or worse. For AMD RT to be useful, I would say it will need to be better than Turing, since Turing performance sucked when RTX was used. I was responding to your own statement previously. The only saving grace is DLSS 2, which is very good. Will next-gen RT console games use DLSS when the games are ported to the PC? Then again, the UE5 demo, without using RT, looked better than any current RT game out there, and Crytek will be using non-hardware-specific RT, which we should get a taste of with the remake of Crysis. In the scheme of things, RTX on Turing has been not only disappointing but almost useless in games. As for AMD, the demo they produced was also very discouraging.
I've been preaching this problem for a while: without a ray tracing benchmark, a lot of the RTX enhancements are actually garbage. The biggest offender is Minecraft. That demo video made me want to hurl. Yep, can't have reflections without Nvidia! 🙄
 
Nvidia raytracing is a hack, tacked on top of rasterization, with a limited number of samples for ray-traced IQ. AMD will probably be similar in this regard as well. Can it improve IQ and not destroy performance? That is the real question.
DXR (or, as you call it, "Nvidia raytracing") is fully programmable and can be used without any rasterization at all. You can do a fully path-traced game with it, and in fact it was already done with Quake 2 RTX and probably Minecraft RTX as well.
The hybrid ray tracing approach was touted by Nvidia as a way to get sensible frame rates in modern games with some effects enabled.

No claim, no idea if it's better or worse. For AMD RT to be useful, I would say it will need to be better than Turing, since Turing performance sucked when RTX was used. I was responding to your own statement previously. The only saving grace is DLSS 2, which is very good. Will next-gen RT console games use DLSS when the games are ported to the PC? Then again, the UE5 demo, without using RT, looked better than any current RT game out there, and Crytek will be using non-hardware-specific RT, which we should get a taste of with the remake of Crysis. In the scheme of things, RTX on Turing has been not only disappointing but almost useless in games. As for AMD, the demo they produced was also very discouraging.
The UE5 demo most likely used voxel-based global illumination http://on-demand.gputechconf.com/gt...2-rt-voxel-based-global-illumination-gpus.pdf
Unreal Engine has had this feature for a while now. Ray tracing is not the only way to do GI, and it certainly isn't the most efficient. It is, however, potentially the most accurate, and it has the easiest implementation.
Of course, something else entirely might have been used in the UE5 demo. The only real clue we have is an answer from someone at Epic that RT was not used for GI.

RDNA2 does not really need to be faster.
If their DXR implementation is slower than Turing then so what? What can you do?
 
I've been preaching this problem for a while: without a ray tracing benchmark, a lot of the RTX enhancements are actually garbage. The biggest offender is Minecraft. That demo video made me want to hurl. Yep, can't have reflections without Nvidia! 🙄

I never would have thought you'd feel this way.
 
DXR (or, as you call it, "Nvidia raytracing") is fully programmable and can be used without any rasterization at all. You can do a fully path-traced game with it, and in fact it was already done with Quake 2 RTX and probably Minecraft RTX as well.
The hybrid ray tracing approach was touted by Nvidia as a way to get sensible frame rates in modern games with some effects enabled.


The UE5 demo most likely used voxel-based global illumination http://on-demand.gputechconf.com/gt...2-rt-voxel-based-global-illumination-gpus.pdf
Unreal Engine has had this feature for a while now. Ray tracing is not the only way to do GI, and it certainly isn't the most efficient. It is, however, potentially the most accurate, and it has the easiest implementation.
Of course, something else entirely might have been used in the UE5 demo. The only real clue we have is an answer from someone at Epic that RT was not used for GI.

RDNA2 does not really need to be faster.
If their DXR implementation is slower than Turing then so what? What can you do?
Not even sure what you are driving at. The GCN and RDNA architectures will do full raytracing, fully programmable; I use it in ProRender. I played Quake II RTX with 2x 1080 Ti's, since it supports mGPU. V-Ray supports using the RT cores, giving about a 40% boost over not using them. Nowhere close to being real time when actually doing a typical-quality ray traced scene with V-Ray.

Even with Turing hardware for RT, the lack of games after approaching two years, and the quality and performance achieved, have been disappointing, to put it kindly. The UE5 demo was a head turner, real groundbreaking stuff. RT may never work well in games other than to give a more accurate model for something, be it reflections, GI, shadows, etc.
 
I never would have thought you'd feel this way.
Let me make it easy. If you can't quantify how much ray tracing is in a game, how do you know ray tracing is making the game better than other available technologies?
 
Well you can play a game and enable/disable RT and see the difference for yourself.

In Control, for example, there is a big difference in quality for RT and it is worth turning some settings down.

I had to play Control with RT High, other settings Medium, to get good perf at 1080p but it was worth it.

Metro has a more subtle global lighting effect but it is noticeable. Some games like BF5 weren't the best but the devs are learning.
 
Well you can play a game and enable/disable RT and see the difference for yourself.

In Control, for example, there is a big difference in quality for RT and it is worth turning some settings down.

I had to play Control with RT High, other settings Medium, to get good perf at 1080p but it was worth it.

Metro has a more subtle global lighting effect but it is noticeable. Some games like BF5 weren't the best but the devs are learning.
That does not in any way dictate that it can only be done via "Nvidia RT". The loss of detail does not mean that RT was or wasn't involved. That's your guess.
 
I see what you are getting at, but there is no way to know for certain what a game would look like if they used different technology.

But you can say for certain that, on a particular game, the addition of ray tracing is an improvement versus the base level.

As I said with Control or Metro you can enable/disable DXR and A/B test yourself.

Though with Unreal Engine 5 and CryEngine, they are doing RT effects without RT hardware (just with programmable shaders) so time will tell if that is a better approach.
 
I see what you are getting at, but there is no way to know for certain what a game would look like if they used different technology.
Yup! Although the problem could be solved if a benchmark was created that tested raytracing on video cards and just included that.

But you can say for certain that, on a particular game, the addition of ray tracing is an improvement versus the base level.

As I said with Control or Metro you can enable/disable DXR and A/B test yourself.

Though with Unreal Engine 5 and CryEngine, they are doing RT effects without RT hardware (just with programmable shaders) so time will tell if that is a better approach.
That's the thing, you can't even say that. Yet all of the reviewers have been gleefully asking for a feature that they can't benchmark. I don't care who caused the problem, but it's a major problem, and reviewers should be the first ones noticing it. If they can create or find tools for frame times, then they can find the same for ray tracing. You shouldn't promote something you can't quantify, but that's what's been happening, and that's *ucked no matter who is doing it.
 
That does not in any way dictate that it can only be done via "Nvidia RT". The loss of detail does not mean that RT was or wasn't involved. That's your guess.
If RT were not involved and only initialization code were preventing games from running on non-DXR cards, then you could enable it on any GTX GPU that passes those checks and get pretty decent/comparable performance.

What is it with "Nvidia RT" anyway? Nvidia pushed this tech and only NV cards even have it, be it in hardware or software, but when RDNA2 comes out you will be able to use it there too and do benchmarks to compare RT performance, so I do not get your whole point.

Global illumination, reflections, etc. can be achieved using different methods, but each has advantages and disadvantages, and it is not as straightforward as assuming that these different approaches are faster/better. The ray tracing approach is the most straightforward and can potentially give the most accurate results, but it is also very computationally expensive, and even with hardware RT some simplifications need to be made to make it work at decent frame rates on current hardware. The same is true for any method; each has its own quirks and implementation effort, and you need to reduce precision to make it viable for real-time applications.
 
If RT were not involved and only initialization code were preventing games from running on non-DXR cards, then you could enable it on any GTX GPU that passes those checks and get pretty decent/comparable performance.

What is it with "Nvidia RT" anyway? Nvidia pushed this tech and only NV cards even have it, be it in hardware or software, but when RDNA2 comes out you will be able to use it there too and do benchmarks to compare RT performance, so I do not get your whole point.
The benchmarks should exist now because the cards exist now. It's not that hard to understand that a buyer should know just how much RT they are getting.

Global illumination, reflections, etc. can be achieved using different methods, but each has advantages and disadvantages, and it is not as straightforward as assuming that these different approaches are faster/better. The ray tracing approach is the most straightforward and can potentially give the most accurate results, but it is also very computationally expensive, and even with hardware RT some simplifications need to be made to make it work at decent frame rates on current hardware. The same is true for any method; each has its own quirks and implementation effort, and you need to reduce precision to make it viable for real-time applications.
Not making any assumption at all, actually. Merely stating that gleefully exclaiming that an effect is ray traced and couldn't be done any other way, while at the same time not being able to quantify it, sounds ridiculous.
 
The benchmarks should exist now because the cards exist now. It's not that hard to understand that a buyer should know just how much RT they are getting.
There are benchmarks to quantify how good RT performance is. You can also use various games which support DXR and compare cards against each other.
Of course you cannot compare against AMD cards, but that is only AMD's fault for not providing this feature.

Not making any assumption at all, actually. Merely stating that gleefully exclaiming that an effect is ray traced and couldn't be done any other way, while at the same time not being able to quantify it, sounds ridiculous.
Take any single feature of a GPU and you can say exactly the same thing about it. There are many ways to accomplish a similar visual result, some easier on the hardware, some easier for developers.
And if we are talking about the RTX cards' feature of accelerating ray intersection through the DXR API, you can always see how these cards compare to cards which need to do the same ray-intersection calculations on shaders, e.g. https://wccftech.com/geforce-gtx-ray-tracing-1080p-performance-snapshot/
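For context on what "ray-intersection calculations" means at the bottom of the stack: the operation the RT cores accelerate (together with BVH traversal) is basically the ray-triangle test, e.g. the classic Möller-Trumbore algorithm. A plain Python illustration of the math only; a GPU obviously does not run it like this:

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    # Moller-Trumbore: solve origin + t*direction = (1-u-v)*v0 + u*v1 + v*v2.
    edge1, edge2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, edge2)
    a = dot(edge1, h)
    if abs(a) < eps:                 # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:           # intersection outside the triangle
        return None
    q = cross(s, edge1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(edge2, q)
    return t if t > eps else None    # hit distance along the ray, or None if behind origin

# Example: a ray along +z hits a triangle in the z = 0 plane at t = 1.
print(ray_triangle((0.0, 0.0, -1.0), (0.0, 0.0, 1.0),
                   (-1.0, -1.0, 0.0), (1.0, -1.0, 0.0), (0.0, 1.0, 0.0)))

A BVH exists purely so that each ray only runs this test against a handful of triangles instead of millions.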

I am not really sure what your issue is. If you want to directly compare implementations of algorithms for global illumination, reflections and such, then I guess you are out of luck, because no one has made such benchmark software. You could make it yourself, e.g. with Unreal Engine, which is pretty much free at this point and should already have support for various technologies built in.
 
There are benchmarks to quantify how good RT performance is. You can also use various games which support DXR and compare cards against each other.
Of course you cannot compare against AMD cards, but that is only AMD's fault for not providing this feature.
Really, which? In what review? How is that AMD's fault? It's Nvidia's cards we are trying to benchmark here, not AMD's. FPS does not in any way tell us how many rays are being calculated.
There are benchmarks to quantify
Take any single feature of a GPU and you can say exactly the same thing about it. There are many ways to accomplish a similar visual result, some easier on the hardware, some easier for developers.
Which benchmarks? Can you link a review please? Thanks.

And if we are talking about the RTX cards' feature of accelerating ray intersection through the DXR API, you can always see how these cards compare to cards which need to do the same ray-intersection calculations on shaders, e.g. https://wccftech.com/geforce-gtx-ray-tracing-1080p-performance-snapshot/

I am not really sure what your issue is. If you want to directly compare implementations of algorithms for global illumination, reflections and such, then I guess you are out of luck, because no one has made such benchmark software. You could make it yourself, e.g. with Unreal Engine, which is pretty much free at this point and should already have support for various technologies built in.
Not my job, because I'm not a reviewer. The questions I'm asking now should have been asked and answered long ago. How we could get this far without anyone asking such a simple question is beyond me.

So no one made a benchmark, and because of this it's acceptable to take an effect at face value as being awesome without questioning its validity, effectiveness, or implementation? Doesn't sound too productive.
 
Well 3DMark does have a ray tracing benchmark.
https://benchmarks.ul.com/news/3dmark-port-royal-ray-tracing-benchmark-now-available

But I think the real test of validity is looking at a game or demo running and deciding for yourself if it looks better than previous games.

For example, this Star Wars demo would not have been possible on previous generations, and that is clear just by watching it, without any other data.



I don't think knowing the rays per second or anything like that would matter, just as polygons per second is no longer used because it doesn't mean anything.
 
Well 3DMark does have a ray tracing benchmark.
https://benchmarks.ul.com/news/3dmark-port-royal-ray-tracing-benchmark-now-available

But I think the real test of validity is looking at a game or demo running and deciding for yourself if it looks better than previous games.

For example, this Star Wars demo would not have been possible on previous generations, and that is clear just by watching it, without any other data.



I don't think knowing the rays per second or anything like that would matter, just as polygons per second is no longer used because it doesn't mean anything.

We should count money this way.
 
kac77
What you are complaining about is the lack of synthetic benchmarks.
I agree that it would be nice to have a good benchmark of that kind, especially for comparing various aspects of DXR (and other features) between RDNA2 and Ampere to see how they compare and what exactly improved.
Overall, however, synthetic benchmarks are kinda useless, because in the end what matters is how the hardware performs in programs/games.

When AMD finally releases cards with DXR it will be possible to compare performance with or without synthetic benchmarks.
Of course some people will be like "These games favor Nvidia because they were developed for RTX cards, bla bla bla" 🤣
 
kac77
What you are complaining about is the lack of synthetic benchmarks.
I agree that it would be nice to have a good benchmark of that kind, especially for comparing various aspects of DXR (and other features) between RDNA2 and Ampere to see how they compare and what exactly improved.
Overall, however, synthetic benchmarks are kinda useless, because in the end what matters is how the hardware performs in programs/games.

When AMD finally releases cards with DXR it will be possible to compare performance with or without synthetic benchmarks.
Of course some people will be like "These games favor Nvidia because they were developed for RTX cards, bla bla bla" 🤣
And some folks will say "These games favor AMD because they were developed using console RT methods, bla bla bla." One big circle jerk. The question is whether the new implementation will actually be useful and worthwhile; Turing clearly was not, in general, with a few exceptions.

Cyberpunk 2077 will be getting an update to support the Xbox Series X console's new features; the original console version will work from the start, but what the update will do to take advantage of the new console features remains to be seen. It could be using AMD's RT method for raytracing. I would think that would also carry over to RDNA2 GPUs that support RT.

https://www.gamesradar.com/cyberpunk-2077-next-gen-xbox-series-x/
 
What you are complaining about is the lack of synthetic benchmarks. I agree that it would be nice to have a good benchmark of that kind, especially for comparing various aspects of DXR (and other features) between RDNA2 and Ampere to see how they compare and what exactly improved.

It would be nice to have benchmarks comparing different implementations of the same feature. For example there are lots of ways to render shadows, ambient occlusion, god rays, light volumes etc but there's never been a standard benchmark comparing them.

I don't know why anyone would expect a benchmark comparing different implementations of GI or any other effect. It's just not done as it's a ton of work and there is no "standard" way of implementing these effects so the benchmark would be meaningless.

The best we can hope for is a synthetic rays per second test like the pixel and texturing tests back in the day.

What we do know is that RT is the most physically accurate implementation of GI. Everything else is even more of a hack. The question is whether those hacks can look good enough with acceptable performance. The only way to answer that is to compare actual implementations in shipping games.
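To put "most physically accurate" in concrete terms: ray/path tracing directly estimates the rendering equation, which in the usual notation is roughly

L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (\omega_i \cdot n) \, d\omega_i

where L_o is the outgoing light at point x, L_e is emitted light, f_r is the surface's BRDF, and the integral gathers incoming light over the hemisphere around the normal n. Screen-space effects, light probes, voxel cone tracing and the rest all approximate pieces of that integral with cached or lower-resolution data, which is exactly where both their speed and their artifacts come from.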
 
Not my job, because I'm not a reviewer. The questions I'm asking now should have been asked and answered long ago. How we could get this far without anyone asking such a simple question is beyond me.

You can answer the question yourself. Just compare raytraced effects to the non RT versions in actual games. No need for a synthetic benchmark.

Some people know RT is awesome because of math and physics. Other people know it's awesome because it enables a level of image quality that hasn't been achieved before. If that's not enough evidence then you'll just have to wait until somebody tries to achieve those things without RT and you can compare for yourself.
 
Let me make it easy. If you can't quantify how much ray tracing is in a game, how do you know ray tracing is making the game better than other available technologies?
Crytek Neon Noir is a great example of doing RT without dedicated hardware (in reality the Nvidia RT cores are just doing one aspect of ray tracing, while most of it is done in the shaders). Neon Noir does both triangle-object raytracing and voxel raytracing; only the triangle raytracing could get hardware support with RTX, while AMD's upcoming RDNA2 may be able to do both if it is done with the texture units. Here Turing does do better at the time of this video, though since then it appears RDNA has had some driver optimizations. Anyway, we should see this in action when Crysis comes back out on their modern engine, and it is looking really good. How much RT, what options will be available to turn on and off, and how much dedicated hardware support there is remains to be seen. Crysis may be a great game for comparing RT when it comes out, given how agnostic it appears to be and the possible future hardware support for triangle and voxel RT. Crytek already showed RT shadows, near-field ambient occlusion, etc., so this could be a more feature-packed RT title. A title that may make RT a must-have for enthusiasts and graphics whores.
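For anyone wondering what "voxel raytracing" actually does per ray: the core of it is stepping the ray through a 3D grid one cell at a time (the classic Amanatides-Woo traversal) and checking whether each cell is occupied, which is far cheaper than testing real triangles. A toy Python sketch with a made-up occupancy set, not anything taken from CryEngine:

import math

def traverse_voxels(origin, direction, occupied, max_steps=64):
    # Walk a ray through a unit-cell grid (Amanatides & Woo style DDA) and return
    # the first occupied cell, or None. 'occupied' is a set of (ix, iy, iz) tuples.
    cell = [int(math.floor(c)) for c in origin]
    step, t_max, t_delta = [0, 0, 0], [math.inf] * 3, [math.inf] * 3
    for i in range(3):
        if direction[i] > 0:
            step[i] = 1
            t_delta[i] = 1.0 / direction[i]
            t_max[i] = (cell[i] + 1 - origin[i]) / direction[i]
        elif direction[i] < 0:
            step[i] = -1
            t_delta[i] = -1.0 / direction[i]
            t_max[i] = (cell[i] - origin[i]) / direction[i]
    for _ in range(max_steps):
        if tuple(cell) in occupied:
            return tuple(cell)
        axis = t_max.index(min(t_max))   # cross the nearest voxel boundary next
        cell[axis] += step[axis]
        t_max[axis] += t_delta[axis]
    return None

# Hypothetical occupancy: a single filled voxel at (5, 2, 3).
print(traverse_voxels((0.5, 2.5, 3.5), (1.0, 0.0, 0.0), {(5, 2, 3)}))

Real engines typically trace cones through a mipmapped voxel volume rather than a single flat grid, but the cost model is similar: a handful of cheap grid steps per ray instead of many triangle tests.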

 
kac77
What you are complaining about is the lack of synthetic benchmarks.
I agree that it would be nice to have a good benchmark of that kind, especially for comparing various aspects of DXR (and other features) between RDNA2 and Ampere to see how they compare and what exactly improved.
Overall, however, synthetic benchmarks are kinda useless, because in the end what matters is how the hardware performs in programs/games.

When AMD finally releases cards with DXR it will be possible to compare performance with or without synthetic benchmarks.
Of course some people will be like "These games favor Nvidia because they were developed for RTX cards, bla bla bla" 🤣

You can answer the question yourself. Just compare raytraced effects to the non RT versions in actual games. No need for a synthetic benchmark.

Some people know RT is awesome because of math and physics. Other people know it's awesome because it enables a level of image quality that hasn't been achieved before. If that's not enough evidence then you'll just have to wait until somebody tries to achieve those things without RT and you can compare for yourself.
Nope. I can tell you already, especially with the Minecraft upgrade, that many of those effects can be done via rasterization. That's why I posed the problem. Using your eye in no way guarantees anything; it's not technical in any way. Why do you think Nvidia picks older games?

The whole point of increasing rasterization performance has been to trick the eye with reflections and lighting angles. What, are we determining FPS and frame times by the human eye now instead of benchmarks?
 
Yeah, someone could make a benchmark that looked at specific metrics, like ray-triangle intersection speed, etc. That would be like what you are saying.

But in terms of games, these are base components of much more complex algorithms, and games today are not fully ray-traced. It's still a hybrid model with many tricks and optimizations.

So the number of intersections per second could be useful for comparing two GPUs (all other hardware and software being the same), but it wouldn't say anything about how good games will look, and it wouldn't be a good indication of real-world game performance.
 
Nope. I can tell you already, especially with the Minecraft upgrade, that many of those effects can be done via rasterization. That's why I posed the problem. Using your eye in no way guarantees anything; it's not technical in any way. Why do you think Nvidia picks older games?

If the effects can be done with rasterization then why weren’t they? We’ve had hardware accelerated rasterization for over 20 years. What’s your technical reasoning?

The whole point of increasing rasterization performance has been to trick the eye with reflections and lighting angles. What, are we determining FPS and frame times by the human eye now instead of benchmarks?

Not sure what that means. Nobody’s determining FPS and frame times by eye.
 
If the effects can be done with rasterization then why weren’t they? We’ve had hardware accelerated rasterization for over 20 years. What’s your technical reasoning?
Um, because Nvidia halted the upgrade in Minecraft and paid for the RT additions. You didn't know? I also did a stint in video game design. Basic reflections are totally doable, but in the upgrade basic reflections are reserved for the RT path. It's easily disingenuous. Why no one mentions it is beyond me, but basic reflections with only 1 view port have been doable since the PlayStation 1, N64, and Sega Saturn.
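For reference, that one-view-port style of reflection boils down to mirroring the camera (or any point) across the reflective plane and rendering the scene again from the mirrored position into a texture. The mirroring itself is just a reflection across a plane; a tiny Python sketch with arbitrary example numbers:

def reflect_point_across_plane(p, plane_normal, plane_d):
    # Reflect point p across the plane dot(n, x) + d = 0, with n a unit normal.
    # A mirror camera for planar reflections is built the same way: reflect the eye
    # position (and flip the view direction), render the scene from there, then
    # sample that image on the reflective surface.
    dist = sum(n * c for n, c in zip(plane_normal, p)) + plane_d
    return [c - 2.0 * dist * n for c, n in zip(p, plane_normal)]

# Floor at y = 0 with an upward normal: a camera 3 units above it mirrors to 3 units below.
print(reflect_point_across_plane([1.0, 3.0, -2.0], [0.0, 1.0, 0.0], 0.0))   # [1.0, -3.0, -2.0]

That covers flat mirrors and water; curved or rough surfaces and off-screen content are where it gets harder and where other techniques (cube maps, SSR, or RT reflections) come in.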

Not sure what that means. Nobody’s determining FPS and frame times by eye.
Exactly that's the point.
 
Um, because Nvidia halted the upgrade. You didn't know? I also did a stint in video game design. Basic reflections are totally doable, but in the upgrade basic reflections are reserved for the RT path. It's easily disingenuous. Why no one mentions it is beyond me, but basic reflections with only 1 view port have been doable since the PlayStation 1 and Sega Saturn.

Yeah this isn’t just about Minecraft. Why are you so convinced that raytraced effects are better or even possible with rasterization and that raytracing in games is some nvidia scam? If you’re right then shouldn’t all the big engines have implemented those effects years ago using rasterization?

Repeating claims over and over with no technical evidence is just conspiracy theory nonsense.
 
Yeah this isn’t just about Minecraft. Why are you so convinced that raytraced effects are better or even possible with rasterization and that raytracing in games is some nvidia scam? If you’re right then shouldn’t all the big engines have implemented those effects years ago using rasterization?

Repeating claims over and over with no technical evidence is just conspiracy theory nonsense.
Didn't say it was. You're twisting words, hoping for an out. Feel free to say that reflections are only doable via ray tracing.

The problem is that, since no one questioned the claims being made, it's all of a sudden difficult to defend them. That's not my problem. In fact I've been asking why no one has even asked, instead of taking everyone at face value. I really, really don't care who makes a claim, but to not even test the validity of it is a new bar for sure.

Right now there is absolutely no test to validate ray tracing in a AAA game. This is a problem, and to claim otherwise is absurd. It's not fanboyism, it's just basic inquisitiveness.
 
Didn't say it was. You're twisting words, hoping for an out. Feel free to say that reflections are only doable via ray tracing.

You’re arguing against claims that no one has made.

Right now there is absolutely no test to validate ray tracing in a AAA game. This is a problem, and to claim otherwise is absurd. It's not fanboyism, it's just basic inquisitiveness.

That’s an interesting proposal. What sort of test would validate raytracing? How about SSR vs RT reflections in Control? That’s a pretty valid comparison.
 
You’re arguing against claims that no one has made.
Of course they have. "Wow, look at <insert title here> with ray tracing!" is said quite often without even knowing how much of the asset is in fact ray traced, or what, when, or whether it really couldn't be done another way to achieve the same result. This is quite ubiquitous. Heck, until Crytek did their demo, people actually believed that RT could only be done with the latest cards, when actually any DX12 card could do it. There was better performance with Nvidia's latest cards, but the point is that it could still be done using non-RTX cards.

That’s an interesting proposal. What sort of test would validate raytracing? How about SSR vs RT reflections in Control? That’s a pretty valid comparison.
I would say the only way would be to do a synthetic benchmark and just stick to that when it comes to testing it.
 
Yeah this isn’t just about Minecraft. Why are you so convinced that raytraced effects are better or even possible with rasterization and that raytracing in games is some nvidia scam? If you’re right then shouldn’t all the big engines have implemented those effects years ago using rasterization?

Repeating claims over and over with no technical evidence is just conspiracy theory nonsense.
Didn't say it was a scam. Those are your words. What I've been saying is that without quantifying the effect there's no way to test/validate its use. If you want a technical description of how reflections are done sans RT and how they can come quite close to ray tracing, here's a white paper on that topic. In the Minecraft update there's a sign which, with RT, has a reflection; without RT there is none. This is one of the areas I'm talking about. That's a basic reflection that could easily still be done otherwise. Now, Nvidia paid to put the effect in there, so obviously it's a showcase of their technology, but the alternate path could still have had a reflection there, and that's just a fact, unless we are to believe that reflections weren't possible before.
 
Of course they have. "Wow, look at <insert title here> with ray tracing!" is said quite often without even knowing how much of the asset is in fact ray traced, or what, when, or whether it really couldn't be done another way to achieve the same result. This is quite ubiquitous. Heck, until Crytek did their demo, people actually believed that RT could only be done with the latest cards, when actually any DX12 card could do it. There was better performance with Nvidia's latest cards, but the point is that it could still be done using non-RTX cards.

Of course any card can do raytracing, even the 8800 GTX from 2006. It's a simple algorithm that has been around since the 1970s. The only thing that has changed is performance. Crytek hasn't performed any magic here; they achieved acceptable performance on shaders by limiting the amount of ray tracing work actually done. Their special sauce is liberal use of voxel cone tracing, which requires a lot less horsepower than raytracing.

From Crytek: "One of the key factors which helps us to run efficiently on non-RTX hardware is the ability to flexibly and dynamically switch from expensive mesh tracing to low-cost voxel tracing, without any loss in quality. Furthermore, whenever possible we still use all the established techniques like environment probes or SSAO. These two factors help to minimize how much true mesh ray tracing we need and means we can achieve good performance on mainstream GPUs. "

https://www.cryengine.com/news/view...-ray-traced-reflections-in-cryengine-and-more

I would say the only way would be to do a synthetic benchmark and just stick to that when it comes to testing it.

Your position on this doesn't make any sense. There is an actual shipping game that provides exactly what you're asking for but you choose to ignore it. I can only guess at the reasons for that.
 
Didn't say it was a scam. Those are your words. What I've been saying is that without quantifying the effect there's no way to test/validate its use. If you want a technical description of how reflections are done sans RT and how they can come quite close to ray tracing, here's a white paper on that topic. In the Minecraft update there's a sign which, with RT, has a reflection; without RT there is none. This is one of the areas I'm talking about. That's a basic reflection that could easily still be done otherwise. Now, Nvidia paid to put the effect in there, so obviously it's a showcase of their technology, but the alternate path could still have had a reflection there, and that's just a fact, unless we are to believe that reflections weren't possible before.

Minecraft is almost 10 years old and developers could have added reflections at any time. The lack of non-raytraced reflections in Minecraft has nothing to do with Nvidia or RTX. Again, it's baffling why you're associating those two things.
 