AMD RDNA 2 gets ray tracing

Of course any card can do raytracing, even the 8800 GTX from 2006. It's a simple algorithm that has been around since the 1970s; the only thing that has changed is performance. Crytek hasn't performed any magic here. They achieved acceptable performance using shaders by limiting the amount of ray tracing work actually done. Their special sauce is liberal use of voxel cone tracing, which requires a lot less horsepower than raytracing.

From Crytek: "One of the key factors which helps us to run efficiently on non-RTX hardware is the ability to flexibly and dynamically switch from expensive mesh tracing to low-cost voxel tracing, without any loss in quality. Furthermore, whenever possible we still use all the established techniques like environment probes or SSAO. These two factors help to minimize how much true mesh ray tracing we need and means we can achieve good performance on mainstream GPUs. "

https://www.cryengine.com/news/view...-ray-traced-reflections-in-cryengine-and-more
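As a rough illustration of the mesh-vs-voxel switch Crytek describes, here is a hypothetical sketch (my own names, thresholds, and cost model, nothing from CRYENGINE) of how a renderer might pick the cheaper technique per surface:

```cpp
// Hypothetical sketch of per-surface technique selection, loosely modeled on the
// idea Crytek describes above. Names, thresholds, and the cost model are invented.
#include <cstdio>

enum class Trace { MeshRayTrace, VoxelConeTrace, EnvProbeOnly };

// Pick the cheapest technique that still looks acceptable for this surface.
Trace pickReflectionTechnique(float roughness, float screenCoverage, float frameBudgetLeftMs) {
    if (roughness > 0.6f)                 // very rough: a prefiltered env probe is enough
        return Trace::EnvProbeOnly;
    if (roughness > 0.2f || screenCoverage < 0.01f || frameBudgetLeftMs < 1.0f)
        return Trace::VoxelConeTrace;     // glossy, tiny, or over budget: cheap voxel cones
    return Trace::MeshRayTrace;           // mirror-like and prominent: pay for mesh tracing
}

int main() {
    struct Surface { const char* name; float roughness, coverage; } surfaces[] = {
        {"chrome bumper", 0.05f, 0.08f}, {"wet asphalt", 0.35f, 0.40f}, {"matte wall", 0.85f, 0.30f},
    };
    for (const auto& s : surfaces) {
        Trace t = pickReflectionTechnique(s.roughness, s.coverage, 2.5f);
        std::printf("%s -> technique %d\n", s.name, static_cast<int>(t));
    }
}
```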



Your position on this doesn't make any sense. There is an actual shipping game that provides exactly what you're asking for but you choose to ignore it. I can only guess at the reasons for that.
I'm not looking for a game. What I'm looking for is an acknowledgement that, without a shift in how games are benchmarked, one that actually details how much of a scene is ray traced and at what performance level, we're going to see a whole lot of needless back and forth, when developing a tool to measure it would fix the problem.

This isn't exclusively an Nvidia thing; this is an anyone-who-benchmarks-a-game thing.

Also, not every video card can do ray tracing. I think I have an S3 ViRGE or Diamond Stealth 24X video card you can test that theory on.
 
I'm not looking for a game. What I'm looking for is an acknowledgement that, without a shift in how games are benchmarked, one that actually details how much of a scene is ray traced and at what performance level, we're going to see a whole lot of needless back and forth, when developing a tool to measure it would fix the problem.

The reason for the back and forth on raytracing is mostly people hating because their favorite IHV doesn’t support it yet or because the price of entry is too high. The fix for both of those problems will arrive later this year with RDNA2 and Ampere.

Once AMD supports ray tracing nobody will care how much of a scene is raytraced. Just like nobody is asking for micro benchmarks of dynamic lighting vs pre-baked lightmaps today. People will just care which hardware is faster in the games they play.

It will finally be an even playing field and there will be no need to go out of the way to diminish the impact of RT.
 
Looks like a combination of methods will be used, since pure RT in real time is not even remotely possible for modern games with complex scenery. And the rather gross misrepresentations, or hints, around real-time RT have not helped matters, I feel.

For example, in BFV, after geometry setup the RT cores find intersections for reflections; nothing is rendered directly from ray-traced lighting. The triangles that are intersected from the reflected surface are then shaded with the usual rasterization path, shaders, lights, etc. Not one pixel gets ray-traced lighting; tracing only determines which objects will be visible in reflective surfaces, and those are then rasterized. It is like one drop of orange juice dropped into a gallon of water and then calling the whole thing orange juice. I just don't like how Nvidia perverted what ray tracing means. It would have been clearer and more honest to come up with a new marketing name than to try to steal what the term used to mean.

All of this could be done before, just with much poorer performance on older hardware. Then again, Intel did this with just CPUs back in the late 2000s. I don't see Nvidia doing anything really significant with RTX RT; they added some dedicated fixed-function units for intersection finding. Their AI and Tensor cores are a whole different matter, though, and are very innovative.
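For what it's worth, the BFV-style hybrid reflection flow described above can be sketched roughly like this; every type and function here is hypothetical, not DICE's or Nvidia's actual code. The ray only finds what the reflection sees, and the hit is then shaded by the ordinary shader/light path:

```cpp
// Illustrative-only sketch of a hybrid reflection pass: rays are used solely to
// find what a reflective pixel "sees"; the hit itself is shaded with the normal
// rasterization-style lighting. All types and functions are stand-ins.
#include <cstdio>
#include <optional>

struct Vec3 { float x, y, z; };
struct Hit  { int triangleId; Vec3 position; Vec3 normal; };

// Stand-in for the RT work: BVH traversal plus triangle intersection.
std::optional<Hit> traceRay(Vec3 /*origin*/, Vec3 /*dir*/) {
    return Hit{42, {0, 1, 0}, {0, 1, 0}};     // stub: pretend we hit triangle 42
}

// Stand-in for the ordinary shading path: materials, analytic lights, shadow maps, etc.
Vec3 shadeWithRasterPipeline(const Hit& /*hit*/, Vec3 /*viewDir*/) {
    return {0.8f, 0.6f, 0.4f};                // stub color
}

// The ray only decides WHAT the reflective pixel sees; HOW it is lit stays conventional.
Vec3 reflectionColor(Vec3 surfacePos, Vec3 reflectedDir, Vec3 viewDir, Vec3 envFallback) {
    if (auto hit = traceRay(surfacePos, reflectedDir))
        return shadeWithRasterPipeline(*hit, viewDir);
    return envFallback;                        // ray missed everything: use an environment probe
}

int main() {
    Vec3 c = reflectionColor({0, 0, 0}, {0, 1, 0}, {0, 0, -1}, {0.2f, 0.2f, 0.2f});
    std::printf("reflection color: %.2f %.2f %.2f\n", c.x, c.y, c.z);
}
```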
 
The reason for the back and forth on raytracing is mostly people hating because their favorite IHV doesn’t support it yet or because the price of entry is too high. The fix for both of those problems will arrive later this year with RDNA2 and Ampere.

Once AMD supports ray tracing nobody will care how much of a scene is raytraced. Just like nobody is asking for micro benchmarks of dynamic lighting vs pre-baked lightmaps today. People will just care which hardware is faster in the games they play.

It will finally be an even playing field and there will be no need to go out of the way to diminish the impact of RT.
I doubt very seriously it's going to end when that happens; quite the contrary. In fact, you've already started to see people say, "well, that's not real ray tracing." Based on what tool? What benchmark is around to back up that statement? Either way, I'm waiting on Intel and couldn't be happier that they are entering the GPU market.
 
In fact, you've already started to see people say, "well, that's not real ray tracing."

The “not real raytracing” people somehow think that every effect in a game has to be 100% raytraced in order for it to count. I hope you understand how ridiculous that is. You can raytrace shadows and reflections while rasterizing primary lighting. Just like you can render some effects deferred and others forward. There is no law against combining techniques in a single frame.

Either way, the not real raytracing camp is sorta irrelevant to the rest of us. They can choose to disable RT until it’s real enough for them.
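A frame that combines the techniques might be ordered roughly like this (a hypothetical pass list, not any particular engine's render graph):

```cpp
// Hypothetical ordering of passes in a hybrid frame: rasterize primary visibility
// and lighting, then use ray tracing only for the effects that benefit from it.
#include <cstdio>

void rasterizeGBuffer()        { std::puts("raster: depth / normals / albedo"); }
void rasterizeDirectLighting() { std::puts("raster: analytic lights"); }
void traceShadowRays()         { std::puts("RT: shadow rays for selected lights"); }
void traceReflectionRays()     { std::puts("RT: reflection rays for glossy pixels"); }
void compositeAndPostProcess() { std::puts("composite + denoise + tonemap"); }

int main() {
    rasterizeGBuffer();          // primary visibility stays rasterized
    rasterizeDirectLighting();
    traceShadowRays();           // RT replaces shadow maps where enabled
    traceReflectionRays();       // RT replaces screen-space reflections where enabled
    compositeAndPostProcess();
}
```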
 
The “not real raytracing” people somehow think that every effect in a game has to be 100% raytraced in order for it to count. I hope you understand how ridiculous that is. You can raytrace shadows and reflections while rasterizing primary lighting. Just like you can render some effects deferred and others forward. There is no law against combining techniques in a single frame.

Either way, the not real raytracing camp is sorta irrelevant to the rest of us. They can choose to disable RT until it’s real enough for them.
Most folks do already :p, probably more to do with performance and the gaming experience degrading when other settings have to be turned down. I couldn't care less what they call it; it is the final package or experience that counts. It is a hybrid approach with very little ray tracing. When 1 to 2 rays per pixel are being used, it is really hard to pretend it is raytracing; it is a different approach that helps rasterization do some things better. A low-quality ray-traced image uses around 200 rays per pixel in a typical scene, 1000+ for high quality.

Now, who thinks every effect in a game has to be 100% raytraced? Even non-RTX games normally involve a lot of ray tracing via lightmaps and texture baking, which is the main reason games look so good when it comes to lighting; the problem is that it is static rather than dynamic. The non-real-time raytraced parts of games, BFV for example, far exceed the RTX parts. There has already been a lot of ray tracing work in games for over two decades; many have no clue what they are looking at. I am more for the overall user experience and not some pretend show which, when the curtain is pulled back, is an empty dark room.
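Just for scale on those rays-per-pixel numbers, a quick back-of-the-envelope calculation (resolution and frame rate are arbitrary example values):

```cpp
// Back-of-the-envelope ray counts at 1080p / 60 fps; purely illustrative numbers.
#include <cstdio>
#include <initializer_list>

int main() {
    const double pixels = 1920.0 * 1080.0;   // ~2.07 million pixels
    const double fps = 60.0;
    for (double raysPerPixel : {1.0, 2.0, 200.0, 1000.0}) {
        double raysPerSecond = pixels * raysPerPixel * fps;
        std::printf("%6.0f rays/pixel -> %8.2f billion rays per second\n",
                    raysPerPixel, raysPerSecond / 1e9);
    }
}
```

Roughly 0.12 billion rays per second at 1 ray per pixel versus about 25 billion at 200, which is the gap being described.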
 
Whether it's 10 million rays or 10 billion rays it's still raytracing. No need to pretend.
If it makes you happy calling it raytracing in such a very limited manner, with devastating performance loss, have at it. Nvidia's AI potential is way more useful and interesting. UE5's GI and geometry handling are light years ahead of what RTX brought: no need for any lightmaps (no need to raytrace static lightmaps), no need for normal maps since there is no need to reduce geometry or maintain LODs of objects, the list goes on and on. Will so-called realtime raytracing even be relevant? Turing's hardware design for this is utterly irrelevant on the RT side; using it for reflections with such a magnitude increase in geometry/triangle complexity seems doubtful, but I will leave that open. So far Nvidia has mostly shown how useless their hardware RT design is. Not only do you still have to do everything else in your game, from normal maps to raytraced static lightmaps, you are also limited in scene complexity. The only saving grace, and it is a big one, is their AI, with DLSS being the first application to have significant impact.

As for AMD? Not sure what route they expect to take; their demo was utter crap in the scheme of things and didn't show much of anything useful. It looks like the developers, Epic and Crytek, are the ones really moving the ball forward for game advancement. My bet would be on Nvidia's AI advancing how scenes are lit; there is much potential there to create great dynamically lit scenes from very limited information, such as the limited number of rays cast, and that would not be raytracing either but a step up from it, perhaps coined "intelligent rendering." Too bad AMD hasn't come up with a terminology that sounds as original and unique, as in "AMD Intelligence."
 
AdoredTV did a video today where he estimates that if AMD does release a 500 mm²+ "big Navi," it should be around 21 TF based on what is known about it.
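For reference, the usual FP32 throughput arithmetic behind estimates like that looks like this; the CU count and clock below are just example values I picked that happen to land near 21 TF, not confirmed specs:

```cpp
// FP32 TFLOPS = shader ALUs x 2 ops per clock (FMA) x clock speed; example inputs only.
#include <cstdio>

int main() {
    const int computeUnits = 80;            // assumed CU count, not a confirmed spec
    const int aluPerCU     = 64;            // stream processors per CU on GCN/RDNA
    const double clockGHz  = 2.05;          // assumed boost clock
    double tflops = computeUnits * aluPerCU * 2.0 * clockGHz / 1000.0;
    std::printf("%.1f TFLOPS FP32\n", tflops);   // ~21.0 with these inputs
}
```

Swap in whatever CU count and clock you believe and the estimate moves accordingly.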
 
If it makes you happy calling it raytracing in such a very limited manner

Raytracing is a well-defined technique, and clearly industry experts disagree with you. What makes you or me happy doesn't matter.

UE5's GI and geometry handling are light years ahead of what RTX brought: no need for any lightmaps (no need to raytrace static lightmaps), no need for normal maps since there is no need to reduce geometry or maintain LODs of objects, the list goes on and on. Will so-called realtime raytracing even be relevant?

UE5 is doing a few really interesting things, mostly with geometry detail and compute-based rasterization. It will be interesting to see how it scales on hardware that doesn't have the benefit of the PS5's lightning-fast storage for streaming high-fidelity assets.

UE5's lighting is less revolutionary. They're using a combination of voxel, SDF, and screen-space tracing, basically lower-resolution alternatives to triangle raytracing, and all done before. So if you're not impressed by raytracing, it's hard to see how you're impressed by UE5's lighting engine.
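For anyone curious what SDF tracing boils down to, here is a toy sphere-tracing loop against a single signed distance field; it's purely illustrative, not UE5 code:

```cpp
// Toy sphere tracing: march a ray through a signed distance field, stepping by the
// distance to the nearest surface each iteration. Illustrative only, not UE5's code.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

// SDF of a sphere of radius 1 centered at (0, 0, 5).
double sceneSDF(Vec3 p) {
    double dx = p.x, dy = p.y, dz = p.z - 5.0;
    return std::sqrt(dx * dx + dy * dy + dz * dz) - 1.0;
}

bool sphereTrace(Vec3 origin, Vec3 dir, double& tHit) {
    double t = 0.0;
    for (int i = 0; i < 128; ++i) {                 // bounded number of marching steps
        Vec3 p = {origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t};
        double d = sceneSDF(p);
        if (d < 1e-4) { tHit = t; return true; }    // close enough: call it a hit
        t += d;                                     // safe step: nothing is closer than d
        if (t > 100.0) break;                       // ray escaped the scene
    }
    return false;
}

int main() {
    double t = 0.0;
    if (sphereTrace({0, 0, 0}, {0, 0, 1}, t))
        std::printf("hit at t = %.3f\n", t);        // expect roughly t = 4.0
    else
        std::puts("miss");
}
```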
 
It's ray tracing. I know people want to say it's "fake"; well, games are all "fake". It's just a trick to make graphics better, like any other technique used over the years.

The UE5 stuff looked great, but we don't know how well that will hold up in all cases. For example, I didn't see any reflections in the demo, maybe because crystal-clear reflections would look better with triangle ray tracing.
 
If it makes you happy calling it raytracing in such a very limited manner, with devastating performance loss, have at it. Nvidia's AI potential is way more useful and interesting. UE5's GI and geometry handling are light years ahead of what RTX brought: no need for any lightmaps (no need to raytrace static lightmaps), no need for normal maps since there is no need to reduce geometry or maintain LODs of objects, the list goes on and on. Will so-called realtime raytracing even be relevant? Turing's hardware design for this is utterly irrelevant on the RT side; using it for reflections with such a magnitude increase in geometry/triangle complexity seems doubtful, but I will leave that open. So far Nvidia has mostly shown how useless their hardware RT design is. Not only do you still have to do everything else in your game, from normal maps to raytraced static lightmaps, you are also limited in scene complexity. The only saving grace, and it is a big one, is their AI, with DLSS being the first application to have significant impact.

As for AMD? Not sure what route they expect to take; their demo was utter crap in the scheme of things and didn't show much of anything useful. It looks like the developers, Epic and Crytek, are the ones really moving the ball forward for game advancement. My bet would be on Nvidia's AI advancing how scenes are lit; there is much potential there to create great dynamically lit scenes from very limited information, such as the limited number of rays cast, and that would not be raytracing either but a step up from it, perhaps coined "intelligent rendering." Too bad AMD hasn't come up with a terminology that sounds as original and unique, as in "AMD Intelligence."
We literally got the first GPUs where ray tracing is just a small addition, merely a few percent of the die allocated to it, and all the ray tracing added to games is an afterthought. What did you expect would happen?

RT will be the future of real-time computer graphics, just as it already is for offline rendering. Eventually everything will be path-traced, simply because it is simpler and yields more accurate results.
You cannot expect the first few GPUs and games with this feature to be jaw-dropping. It would be nice, and sometimes it happens with new features, but usually it does not. Most features only become relevant long after the first cards supporting them are obsolete.

BTW, DLSS was considered crap by pretty much everyone before DLSS 2.0.
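And since "path traced" gets thrown around a lot, here is a minimal sketch of what a single-pixel path-tracing estimate looks like; the scene (one diffuse ground plane under a uniform sky) and all names are invented purely for illustration:

```cpp
// Toy path tracer for one pixel: diffuse ground plane at y = 0 under a uniform sky.
// Cosine-weighted sampling means each diffuse bounce just multiplies by the albedo.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec { double x, y, z; };
static Vec add(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec mul(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static Vec mulv(Vec a, Vec b) { return {a.x * b.x, a.y * b.y, a.z * b.z}; }

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    const double kPi = 3.14159265358979323846;

    const int spp = 256;                      // paths (samples) per pixel
    const int maxBounces = 4;
    const Vec sky    = {0.7, 0.8, 1.0};       // uniform sky radiance
    const Vec albedo = {0.5, 0.4, 0.3};       // diffuse ground plane albedo

    Vec sum = {0, 0, 0};
    for (int s = 0; s < spp; ++s) {
        Vec org = {0, 1, 0};
        Vec dir = {0, -0.5, 1};
        dir = mul(dir, 1.0 / std::sqrt(dir.x * dir.x + dir.y * dir.y + dir.z * dir.z));

        Vec throughput = {1, 1, 1};
        for (int b = 0; b < maxBounces; ++b) {
            if (dir.y >= 0) {                 // ray escapes upward: gather sky light, path ends
                sum = add(sum, mulv(throughput, sky));
                break;
            }
            double t = -org.y / dir.y;        // intersect the ground plane y = 0
            org = add(org, mul(dir, t));
            throughput = mulv(throughput, albedo);   // diffuse bounce attenuates the path

            // Cosine-weighted hemisphere sample around the +Y normal.
            double u1 = uni(rng), u2 = uni(rng);
            double r = std::sqrt(u1), phi = 2.0 * kPi * u2;
            dir = {r * std::cos(phi), std::sqrt(1.0 - u1), r * std::sin(phi)};
        }
    }
    Vec pixel = mul(sum, 1.0 / spp);
    std::printf("estimated pixel radiance: %.3f %.3f %.3f\n", pixel.x, pixel.y, pixel.z);
}
```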
 
Whether the current implementation of RT is fake or not, it sure looks good. Ever since I ended up with my 2080 Ti, I've had a lot of fun playing Control with all the RT effects turned on with DLSS 2.0. Performance isn't the highest at 4K, but it's very playable when using a controller. Since RT will only get better with more powerful GPUs, I'm very much looking forward to the future of it.
 
Whether the current implementation of RT is fake or not, it sure looks good. Ever since I ended up with my 2080 Ti, I've had a lot of fun playing Control with all the RT effects turned on with DLSS 2.0. Performance isn't the highest at 4K, but it's very playable when using a controller. Since RT will only get better with more powerful GPUs, I'm very much looking forward to the future of it.
Personally, I don't buy games because they support a fancy new feature of my card. I'm glad you found one you like, as they are few and far between.
The highly touted extra features for my library have been a complete bust.
 
Well, Control is also a really awesome game, and the RT is actually noticeable, unlike in some of the earlier games.

For me, it made spending $1,200 on the Ti worth it.
 
Well I feel like PC gaming is where you can spend the extra money and play the future of gaming before everyone else.

I also used to import consoles from Japan. Still remember freshman year of college people were blown away that I had a Dreamcast before it even came out.
 
Well I feel like PC gaming is where you can spend the extra money and play the future of gaming before everyone else.

I also used to import consoles from Japan. Still remember freshman year of college people were blown away that I had a Dreamcast before it even came out.
I used to own a store with imported Super Famicoms. We cut out the cartridge slot so they could use Super Nintendo games.

But yeah, if you have the funds to play out your dreams on a $1,200 card for a game, good on you. For many like myself, at $1,150 CAD, that dream never came true, nor does it look like it will. But again, no one put a gun to my head, and it was still the only purchase decision that made sense for me, for what I wanted to do.
 
Reminds me of Quantum Break.
Playing it non-raytraced and it is a good game. In both cases it is still rasterized :D

I am sure I would like the added effects using ray tracing techniques to enhance the rasterization of the pixels. In the end it is the final quality, however it is calculated, that counts. Even Nvidia calls it a hybrid, which is the most accurate description.
 
Playing it non-raytraced and it is a good game. In both cases it is still rasterized :D

I am sure I would like the added effects using ray tracing techniques to enhance the rasterization of the pixels. In the end it is the final quality, however it is calculated, that counts. Even Nvidia calls it a hybrid, which is the most accurate description.

Hybrid = Rasterization + Raytracing....no matter how you skin it.
 
I agree, it is a hybrid method, which has some very good potential to push graphics forward.

No, raytracing IS the future... these are the first steps.
Trying to downplay raytracing (and its importance) is a bunch of FUD...
 
No, raytracing IS the future... these are the first steps.
Trying to downplay raytracing (and its importance) is a bunch of FUD...
What is a bunch of FUD is you saying or hinting that I am downplaying RT. I'd like to see more effective, viable results using GPUs for real-time rendering, and I believe that will come, hopefully with Crysis and Cyberpunk. As for raytracing being the future, it has pretty much been used for decades now, since it is a good model for lighting that, for the most part, just works so to speak when done right. It has been used in games as well (not in real time) for a long while.
 
Perhaps over time we will see GPUs slowly moving from prioritizing rasterization and pure compute on consumer cards to prioritizing RT and AI.

Even now Nvidia could make RTX cards real monsters at RT, but given zero software support it would make zero sense.
For now the most probable scenario is that Nvidia will try to optimize its RT cores as much as possible and spend as few additional transistors on them (relative to Turing) as possible, just like they did with Turing.
 
What is a bunch of FUD is you saying or hinting that I am downplaying RT. I'd like to see more effective, viable results using GPUs for real-time rendering, and I believe that will come, hopefully with Crysis and Cyberpunk. As for raytracing being the future, it has pretty much been used for decades now, since it is a good model for lighting that, for the most part, just works so to speak when done right. It has been used in games as well (not in real time) for a long while.

DXR is a standard by Microsoft.
This is where EVERYONE'S going.
Intel will support DXR.
AMD will support DXR.
NVIDIA supports DXR.

Raytracing is where things are going, which puts an expiration date on rasterization.
It has already begun... it is not a question of "if", it is a question of "when".

It should be mandatory to spoon-feed these facts to the people who cannot separate Microsoft's extension of the DX12 API with DXR from NVIDIA's branding of their cards with RT cores that support Microsoft's DXR (although some of NVIDIA's Turing and Pascal cards support shader-based DXR).

Raytracing IS the future.
Even Vulkan is basically doing a "copy" of DXR and will have feature parity with DX12 DXR.

The sole reason people are whining about Raytracing is because their favorite peevee company has not yet entered the party.

If you told people 15 years ago that some people would respond to real-time raytracing in a FUD/dismissive way... they would assume the future had gone "Idiocracy"...
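The API-versus-hardware distinction is easy to see in code: DXR support is queried through a standard D3D12 feature check, regardless of which vendor (or driver fallback) provides it. A minimal Windows-only sketch, with error handling mostly omitted:

```cpp
// Query DXR (DirectX Raytracing) support through the standard D3D12 feature check.
// The check reports what the device/driver exposes; it says nothing about whether
// dedicated RT cores are doing the work. Windows-only sketch, minimal error handling.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device)))) {
        std::puts("no D3D12 device available");
        return 1;
    }
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5)))
        && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("DXR is supported (whether via RT cores or a shader-based driver path)");
    } else {
        std::puts("DXR is not supported by this device/driver");
    }
    return 0;
}
```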
 
DXR is a standard by Microsoft.
This is where EVERYONE'S going.
Intel will support DXR.
AMD will support DXR.
NVIDIA supports DXR.

Raytracing is where things are going, which puts an expiration date on rasterization.
It has already begun... it is not a question of "if", it is a question of "when".

It should be mandatory to spoon-feed these facts to the people who cannot separate Microsoft's extension of the DX12 API with DXR from NVIDIA's branding of their cards with RT cores that support Microsoft's DXR (although some of NVIDIA's Turing and Pascal cards support shader-based DXR).

Raytracing IS the future.
Even Vulkan is basically doing a "copy" of DXR and will have feature parity with DX12 DXR.

The sole reason people are whining about Raytracing is because their favorite peevee company has not yet entered the party.

If you told people 15 years ago that some people would respond to real-time raytracing in a FUD/dismissive way... they would assume the future had gone "Idiocracy"...
Seriously, I am getting sick and tired of the people crying about how pointless ray tracing is. It is the future. They have to start somewhere. Game development takes years; developers can't just add ray tracing to their engines with a few keystrokes. It hasn't even been two years since RTX came out. Games will feature it more now that it is incorporated into engines like Unreal Engine 5 and is much more streamlined. At least Nvidia is pushing tech forward and not falling into stagnation like Intel has.
 
Seriously, I am getting sick and tired of the people crying about how pointless ray tracing is. It is the future. They have to start somewhere. Game development takes years; developers can't just add ray tracing to their engines with a few keystrokes. It hasn't even been two years since RTX came out. Games will feature it more now that it is incorporated into engines like Unreal Engine 5 and is much more streamlined. At least Nvidia is pushing tech forward and not falling into stagnation like Intel has.
It's been the future for a long time... This is a stepping stone. How impressive a stepping stone is up to interpretation. I don't know why there is so much hate both ways. It's cool tech, but not everyone cares. Just because it came out and people don't want to drop massive sums of money on it doesn't mean they hate ray tracing. It just means they don't feel the entry price or current state is worth their hard (or not hard) earned cash. Why do you take this so personally?

Having written a raytracing engine many moons ago, I know what goes into it and am glad to see it being integrated into games in this limited fashion, as it can bring improved visuals. I think it can do more for lighting than reflections at this point (as most things in real life are not super shiny, but light does affect them). That doesn't mean I'm going to blow my family budget on a 2080 Ti. If the game selection is limited and someone uses that as a reason not to spend money, it's a valid reason; not everyone wants to pay for new tech. If someone else loves the latest and greatest and can afford it and wants to buy it, that's valid too. Neither has to be right or wrong; it's just where they are at.
 
The Unreal Engine 4 Elemental demo showed off voxel-based GI back in 2012. In 2020, eight years later, there are zero UE4 games using SVOGI.

The first RT demo came out in late 2018. In the first year we got four blockbuster titles using raytracing for various effects, with more in the pipeline. We have standards-based support for hardware-accelerated RT in DirectX and Vulkan. Yet we constantly hear the peanut gallery whining about insufficient adoption.

The only logical explanation is brand hate since the adoption rate has been fantastic relative to competing tech. There is absolutely no technical reason to hate on RT.
 
It's been the future for a long time... This is a stepping stone. How impressive a stepping stone is up to interpretation. I don't know why there is so much hate both ways. It's cool tech, but not everyone cares. Just because it came out and people don't want to drop massive sums of money on it doesn't mean they hate ray tracing. It just means they don't feel the entry price or current state is worth their hard (or not hard) earned cash. Why do you take this so personally?

Having written a raytracing engine many moons ago, I know what goes into it and am glad to see it being integrated into games in this limited fashion, as it can bring improved visuals. I think it can do more for lighting than reflections at this point (as most things in real life are not super shiny, but light does affect them). That doesn't mean I'm going to blow my family budget on a 2080 Ti. If the game selection is limited and someone uses that as a reason not to spend money, it's a valid reason; not everyone wants to pay for new tech. If someone else loves the latest and greatest and can afford it and wants to buy it, that's valid too. Neither has to be right or wrong; it's just where they are at.
Sure, it is expensive right now and probably not worth it to most people, but it has to enter the market at some point. If Nvidia hadn't started it with the 2xxx series, we would have no ray tracing games at the moment nor any in development. It will get cheaper, but it just irks me how people complain about it on this site, a site about pushing the latest and greatest computer tech. I honestly don't think the 2xxx cards would have come in at a much lower price point without ray tracing.
 
The Unreal Engine 4 Elemental demo showed off voxel-based GI back in 2012. In 2020, eight years later, there are zero UE4 games using SVOGI.

The first RT demo came out in late 2018. In the first year we got four blockbuster titles using raytracing for various effects, with more in the pipeline. We have standards-based support for hardware-accelerated RT in DirectX and Vulkan. Yet we constantly hear the peanut gallery whining about insufficient adoption.

The only logical explanation is brand hate since the adoption rate has been fantastic relative to competing tech. There is absolutely no technical reason to hate on RT.
This. If AMD had been first to market with ray tracing and only one game, they would have been praised to high heaven for pushing the industry forward.
 
This. If AMD had been first to market with ray tracing and only one game, they would have been praised to high heaven for pushing the industry forward.
Some would, just as some are praising Nvidia to high heaven now... See how that works? Like I said, both sides are guilty of feeling like theirs is the only viewpoint. In reality some will think it's worth it, others won't. I don't feel like either is right or wrong. Me personally, I'm not spending a ton of money on features I won't use, or that would see very minimal use if I did. Not because I think it's stupid, but because I have other things I feel are more important. Not everyone saying it's not worth it right now is saying it because they think it's stupid... Some really just don't think the value proposition and usefulness are there yet.

Early adopters always pay a premium to play with new things. It's always been this way, and the first version of something is rarely as good as the follow-up releases built on lessons learned. As an enthusiast I still get to choose for myself whether I believe a feature is worth paying for. If you disagree that's fine; I don't tell anyone not to buy it if they think it's worth it. As for the ones who just say things to get under your skin or are complete fanboys (AMD, Nvidia, Intel, doesn't matter)... they exist and they aren't going away, so don't let them get you too discouraged.
 
Sure, it is expensive right now and probably not worth it to most people, but it has to enter the market at some point. If Nvidia hadn't started it with the 2xxx series, we would have no ray tracing games at the moment nor any in development. It will get cheaper, but it just irks me how people complain about it on this site, a site about pushing the latest and greatest computer tech. I honestly don't think the 2xxx cards would have come in at a much lower price point without ray tracing.
I agree it had to start somewhere, but if someone is asking for a suggestion I'm not going to say "get RTX because it's awesome." I explain its limited use right now, but that it's pretty neat. If they have extra money it's probably worth getting, as it will become more popular, but by then other hardware that performs better will most likely be out too. If they are into new tech and don't mind, then they buy it; if they don't care for the few games and plan to upgrade once new hardware comes out, they save a few dollars and get something that also works well. I don't know what the 2xxx cards could have cost without RTX or Tensor cores, probably a bit lower though. I am glad to see hardware and software continue to evolve and am glad they came out with a first iteration to start the ball rolling, so to speak. This doesn't mean it's (currently) for everyone, enthusiast or not. I know some people (the ones you speak of) do exist and are purely trying to put it down because it's not their favorite vendor. But not everyone that says it's not ready for the masses and/or not worth it for most is saying it's stupid, just that it's early tech and not widely in use yet.
 
Perhaps over time we will see GPUs slowly moving from prioritizing rasterization and pure compute on consumer cards to prioritizing RT and AI.
Yeah, I could see GPUs going totally programmable in the future. It makes more sense as it has been moving in that direction for a while.
 
Sure that’s a personal choice and nobody is telling anyone how to spend their money. What’s annoying is the intellectual dishonesty on the underlying technology.
Yeah, it can be frustrating, just know not everyone is like that ;).
 
Sure that’s a personal choice and nobody is telling anyone how to spend their money. What’s annoying is the intellectual dishonesty on the underlying technology.
Yes indeed. Agree 100%
 