AMD RDNA 2 gets ray tracing

If developers are programming against the DXR API, then I don't see how writing for AMD is going to be any easier or different at all. It's the same API.

Now, maybe there are paths that give better performance on one versus the other, so maybe some care is involved, but it can't be totally different.
 
Looks promising: Quantum Error, exclusive to PS5/PS4, 4K, 60 fps, 100% ray-traced reflections and 3D audio (which should also be using raytracing) on PS5. The game looks and feels like a combination of Doom 3, F.E.A.R. and Dead Space.
https://gamingbolt.com/quantum-error-will-use-real-time-ray-tracing-on-ps5

Interview with the developer talking about Quantum Error, basically a Q&A; read:


Trailer:
 
AMD discussion of their use of DXR 1.1 in the demo:



It appears RDNA2 will support DXR 1.0 as well; hopefully older games will play fine too.
 
It appears RDNA2 will support DXR 1.0 as well; hopefully older games will play fine too.
DXR 1.1 compatibility requires DXR 1.0 compatibility.

When it comes to games, I only expect Quake 2 RTX and Minecraft RTX not to work on AMD cards; all normal retail games should work just fine.
 
DXR 1.1 compatibility requires DXR 1.0 compatibility.

When it comes to games, I only expect Quake 2 RTX and Minecraft RTX not to work on AMD cards; all normal retail games should work just fine.
I hope that is the case, and it looks like it should be. It's just that the new efficiency improvements from DXR 1.1 and AMD's design will not be used effectively unless developers update the DXR games already out.
 
I hope that is the case, and it looks like it should be. It's just that the new efficiency improvements from DXR 1.1 and AMD's design will not be used effectively unless developers update the DXR games already out.

Why do you insist on posting outside your "knowledge zone"?
Your "arguments" are technically ignorant, and you skate from false statements to uneducated guesses like a trainwreck.

If you have NO clue about the DXR 1.0 -> DXR 1.1 transition, don't base your posts on it.

Your posts went into la-la-land territory with "serial raytracing" and have only gotten worse from there...
 
Why do you insist on posting outside your "knowledge zone"?
Your "arguments" are technically ignorant, and you skate from false statements to uneducated guesses like a trainwreck.

If you have NO clue about the DXR 1.0 -> DXR 1.1 transition, don't base your posts on it.

Your posts went into la-la-land territory with "serial raytracing" and have only gotten worse from there...
Your posts are just pure BS; enlighten us if you know so much. Show why my posts are out of left field. Except you can't, because I see little clue in anything you post, a typical troll. I have links, research, and professionals telling it like it is, while you and XOR have the typical nothingness yet lash out in vain. You can even read the patent, Mark Cerny's excellent layman's-level presentation, and developers with a 60 fps 4K game coming that uses 100% ray-traced reflections and 3D audio. I suppose in your mind you are smarter than they are as well. Except the one showing their ignorance is you.
 
Your posts are just pure BS; enlighten us if you know so much. Show why my posts are out of left field. Except you can't, because I see little clue in anything you post, a typical troll. I have links, research, and professionals telling it like it is, while you and XOR have the typical nothingness yet lash out in vain. You can even read the patent, Mark Cerny's excellent layman's-level presentation, and developers with a 60 fps 4K game coming that uses 100% ray-traced reflections and 3D audio. I suppose in your mind you are smarter than they are as well. Except the one showing their ignorance is you.

Still waiting for your documentation of "serial raytracing"... don't throw rocks when you are inside a glass castle.

And you have also not responded to the fact that any DXR 1.0 GPU supports DXR 1.1, as it is a driver update, not new hardware in the GPU.

How long until you document your claims?
 
Your posts are just pure BS; enlighten us if you know so much. Show why my posts are out of left field. Except you can't, because I see little clue in anything you post, a typical troll. I have links, research, and professionals telling it like it is, while you and XOR have the typical nothingness yet lash out in vain. You can even read the patent, Mark Cerny's excellent layman's-level presentation, and developers with a 60 fps 4K game coming that uses 100% ray-traced reflections and 3D audio. I suppose in your mind you are smarter than they are as well. Except the one showing their ignorance is you.
You are starting to get aggressive, and it is not a good sign... 🤨
It really looks like you are making wild claims from no real performance data and from documents you cannot read correctly. The most obvious sign you cannot read them correctly is that you think you understand them and state what you think as fact. If you were an AMD engineer, then you could state facts.

We are all waiting for RDNA2 performance results, as they will affect many people, including me; if not on PC, then on consoles for sure.
 
Thing is, we are all basically speculating.

Whether on rumors or on patents that may not even be used in real hardware.

Companies patent all sorts of stuff just in case that never materializes.
 
Not all of it is speculation: how Mark Cerny described RDNA2 raytracing, albeit briefly, does sound an awful lot like what the patent says. As for Factum not even understanding the posted frame rendering showing RT cores being virtually serial during the intersection stage, I cannot help him. Then again, that was also limited by DXR 1.0; DXR 1.1 allows shaders (pixel, compute, etc.) to call for raytracing as needed, and Turing should not be so serial unless the BVH is local in memory with the shader (program). I would say it can get expensive to read and write to Turing's RT cores if you don't have a memory scheme shared with where the shaders execute. Anyone interested can look at the various Microsoft videos dealing with DXR 1.1; they're kind of dry, but the information is there. The one that exposes shader control of raytracing is called Inline Raytracing.

As for what kind of performance we will get with Ampere, what improvements, etc., it's anyone's guess. We do have some significant information and some performance numbers from actual developers using hardware which we know has only 36 CUs, while AMD may be delivering gaming PCs a video card with up to 80 CUs. If anyone actually watched the interview: at least with the PS5 there is a new upscaler, so will AMD have a viable alternative to DLSS? One not hamstrung by games having to be programmed for it, as with DLSS? I would rather have a very good upscaler that I can use with any game, since that would benefit me much more than a few titles supporting it that I don't care to play.

Having ray tracing features that are broadly usable, easy to program for, that do not severely hamper performance, and that give clear benefits people would want is good for everyone; Turing never really delivered any of that as far as I am concerned. In the end it will be best to wait until the cards become available to decide whether they are worth it. What game developers say about the hardware can also be a good indicator.
 
Not all of it is speculation: how Mark Cerny described RDNA2 raytracing, albeit briefly, does sound an awful lot like what the patent says.

Cerny actually said very little. All he said was that AMD has added hardware accelerated intersection and that raytracing can run in parallel with other work. Just like on Turing. Is there something specific he said that I'm missing?

As for Factum not even understanding the posted frame rendering showing RT cores being virtually serial during the intersection stage, I cannot help him. Then again, that was also limited by DXR 1.0; DXR 1.1 allows shaders (pixel, compute, etc.) to call for raytracing as needed, and Turing should not be so serial except

You should worry less about others and more about correcting your own understanding. DXR 1.0 does not prevent raytracing work from running in parallel with shading. I honestly don't know why you keep claiming guesswork as fact when you should instead just read the readily available documentation on these things. At this point I'm starting to believe you might just be trolling.

https://devblogs.nvidia.com/introduction-nvidia-rtx-directx-ray-tracing/

" All ray tracing related GPU work is dispatched via command lists and queues that the application schedules. Ray tracing therefore integrates tightly with other work such as rasterization or compute, and can be enqueued efficiently by a multithreaded application. "

"The application retains the responsibility of explicitly synchronizing GPU work and resources where necessary, as it does with rasterization and compute. This allows developers to optimize for the maximum amount of overlap between ray tracing, rasterization, compute work, and memory transfers."


unless the BVH is local in memory with the shader (program). I would say it can get expensive to read and write to Turing's RT cores if you don't have a memory scheme shared with where the shaders execute.

Shaders run off of a register file or local data store on both AMD and Nvidia architectures. In AMD's patent the RT cores and texture units access data via an L1 cache and have to ship results back to the shader core, just like on Turing. What gave you the impression that AMD's RT cores and shader cores share the same memory pool?
 
I also quoted this line from that same page:
Ray tracing shaders are dispatched as grids of work items, similar to compute shaders. This lets the implementation utilize the massive parallel processing throughput of GPUs and perform low-level scheduling of work items as appropriate for the given hardware.

However, I was told that page was for writing fallback code for non-RTX Nvidia cards. That doesn't make sense to me, since what would be the point of a fallback if you have to code specifically for it rather than to the API spec?
 
I also quoted this line from that same page:


However, I was told that page was for writing fallback code for non-RTX Nvidia cards. That doesn't make sense to me, since what would be the point of a fallback if you have to code specifically for it rather than to the API spec?

I read somewhere that Microsoft didn't support the software fallback layer in DXR; it was up to the GPU manufacturer to provide support.

DXR 1.1 has the software fallback layer for DX12 GPUs, and it's vendor-agnostic.
 
Turing is pretty much not important at this time. Look at how games process RT code: pretty much whenever the RT cores are in use, the SMs are tied up almost exclusively with that. They may, and probably can, multitask with the rest of the SM, but are not very good at it. RT cores have one function, which is to traverse the BVH and find intersection points; the rest of the RT calculations and the shader programs are done inside the SMs.

The PS5 reveal was very interesting; it looks like there will be more ray-traced (hybrid) games at launch than in the last two years with Turing, and some look outstandingly good. If the PS5 with 36 CUs can do what has been shown, at just the beginning of the next console generation, with developers still not fully proficient, the PC GPUs are looking extremely promising. I also hope Ampere will bring some very nice surprises and improvements over Turing in general.
 
Turing is pretty much not important at this time. Look at how games process RT code: pretty much whenever the RT cores are in use, the SMs are tied up almost exclusively with that. They may, and probably can, multitask with the rest of the SM, but are not very good at it. RT cores have one function, which is to traverse the BVH and find intersection points; the rest of the RT calculations and the shader programs are done inside the SMs.
And basis for this "fact" of yours is what?

The PS5 reveal was very interesting; it looks like there will be more ray-traced (hybrid) games at launch than in the last two years with Turing, and some look outstandingly good. If the PS5 with 36 CUs can do what has been shown, at just the beginning of the next console generation, with developers still not fully proficient, the PC GPUs are looking extremely promising. I also hope Ampere will bring some very nice surprises and improvements over Turing in general.
Which titles of the PS5 reveal event were confirmed one way or another to use ray tracing?
Or, like everything you write, did you make the wild assumption that everything uses ray tracing and state it as fact?
 
Which titles of the PS5 reveal event were confirmed one way or another to use ray tracing?

Ratchet and Clank was confirmed to use ray-traced reflections, and Digital Foundry did a pixel count to find it runs at native 4K. However, it's only 30 fps. The reflections in GT7 looked ray traced as well. I forget if there were others, but we still don't know what frame rate a lot of the games in the PS5 reveal will run at, so they could have ray-traced effects but only run at 30 fps.
 
Ratchet and Clank was confirmed to use ray-traced reflections, and Digital Foundry did a pixel count to find it runs at native 4K. However, it's only 30 fps. The reflections in GT7 looked ray traced as well. I forget if there were others, but we still don't know what frame rate a lot of the games in the PS5 reveal will run at, so they could have ray-traced effects but only run at 30 fps.

Just for reference: prior to the stream, they announced that it would only be at 30 fps, to make it possible for all these studios to deliver the content needed under lockdown restrictions, without the massive crunch these demos normally require (studios put a huge amount of effort into these gameplay reveals, sometimes months' worth, so it's unlikely they could have reached the level of quality many would have preferred, coronavirus aside).

There were many times when games hitched or dropped frames during the reveals, which I would chalk up to the teams not being able to polish them as much as they would have preferred. Considering that many of them won't be out until 2021, I wouldn't be too concerned.
 
Just for reference: prior to the stream, they announced that it would only be at 30 fps, to make it possible for all these studios to deliver the content needed under lockdown restrictions, without the massive crunch these demos normally require (studios put a huge amount of effort into these gameplay reveals, sometimes months' worth, so it's unlikely they could have reached the level of quality many would have preferred, coronavirus aside).

There were many times when games hitched or dropped frames during the reveals, which I would chalk up to the teams not being able to polish them as much as they would have preferred. Considering that many of them won't be out until 2021, I wouldn't be too concerned.

True, but a few of the same trailers have been released on YouTube at 60 fps for games confirmed to run at it. I'm sure a lot of the games revealed will end up running at 60 fps, but I'm skeptical that the ones at 4K using ray-traced effects will all hit it.
 
Yeah, I watched it live and the stream quality was subpar.

I mean, you could see it but it didn't look any better than a PS4 game.

Watching the 4K60 trailers on YouTube was a totally different world.
 
Ratchet and Clank was confirmed to use ray-traced reflections, and Digital Foundry did a pixel count to find it runs at native 4K. However, it's only 30 fps. The reflections in GT7 looked ray traced as well. I forget if there were others, but we still don't know what frame rate a lot of the games in the PS5 reveal will run at, so they could have ray-traced effects but only run at 30 fps.
Good rundown of what they think is RT, the frame rates, and the resolutions -> They don't seem absolutely sure, but they appear to have a solid basis for their statements.

 
And basis for this "fact" of yours is what?


Which titles of the PS5 reveal event were confirmed one way or another to use ray tracing?
Or, like everything you write, did you make the wild assumption that everything uses ray tracing and state it as fact?
How about you post some facts, data, links? Really, you're not worth the effort.
 
That video is a full-length feature film. Is there a CliffsNotes version?
Basically they commented on the whole Sony reveal non-stop. I am sure others will summarize it. A lot of the titles are still works in progress. Sony looks well focused and closer to a successful launch, while Microsoft does not look that way. Ratchet and Clank is a great demonstration of what can be done with a very fast SSD with compression and a unique controller allowing for a number of priorities.
 
How about you post some facts, data, links? Really, you're not worth the effort.
I have no facts to post and I didn't post any.

It is you who posts all sorts of crazy things as if they were confirmed truth. At this point in time we have zero confirmed information or benchmark results, so everything is at most wild speculation.
 
True, but a few of the same trailers have been released on YouTube at 60 fps for games confirmed to run at it. I'm sure a lot of the games revealed will end up running at 60 fps, but I'm skeptical that the ones at 4K using ray-traced effects will all hit it.
There will probably be some frame drops here and there. Which will be fine as long as your TV supports VRR.

I expect ray tracing effects to be optimized rather well in console games. They have to be if a game is to have them at all. This is in stark contrast to current RT implementations, where these effects seem to be added as an afterthought and without much, if any, real optimization.
 
There will probably be some frame drops here and there. Which will be fine as long as your TV supports VRR.

I expect ray tracing effects to be optimized rather well in console games. They have to be if a game is to have them at all. This is in stark contrast to current RT implementations, where these effects seem to be added as an afterthought and without much, if any, real optimization.
I'm expecting the same, but at the same time hitting 4K and 60 fps with ray tracing may be tough for a lot of games, though I'm sure we'll get some. GT7 looks to have ray tracing and to run at 4K/60 fps, so there is one already.
 
AMD RT patent application, hot off the press. This one is interesting: it describes a method for grouping rays during traversal to avoid performance-killing divergence in the CU. Grouping is done by material, i.e. all rays that bounced off glass are processed together.

http://appft.uspto.gov/netacgi/nph-...681.PGNR.&OS=DN/20200193681&RS=DN/20200193681

A scheduler launches waves by grouping together multiple data items associated with the same material. The rays in a wave are processed as continuation rays rather than the full original rays. A continuation ray starts from the previous point of intersection and extends in the direction of the original ray. These steps help counter the divergence that would occur if a single shader program that inlined the intersection and any-hit shaders were executed.
 
AMD RT patent application, hot off the press. This one is interesting: it describes a method for grouping rays during traversal to avoid performance-killing divergence in the CU. Grouping is done by material, i.e. all rays that bounced off glass are processed together.

http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&u=/netahtml/PTO/search-adv.html&r=1&p=1&f=G&l=50&d=PG01&S1=20200193681.PGNR.&OS=DN/20200193681&RS=DN/20200193681
Thanks,

Full document with images here:
https://pdfaiw.uspto.gov/.aiw?PageN...PGNR.%26OS=DN/20200193681%26RS=DN/20200193681

May have to print it out, grab a cup of coffee, and plow through this one; it looks like a software patent, but I have not gone through it yet.
 
Even if Big Navi is revealed to be an amazing card, can we really trust AMD drivers?
This is LITERALLY the only reason I won’t consider AMD cards at the moment. The number of AMD driver issues I’ve seen posted across several different hardware forums completely turns me off from their GPUs.
 
Don't want to derail the thread. AMD driver issues are real but somewhat overblown.

I mean, it's true there were some issues, but it was not everyone (I personally didn't have any issues, nor did others on this board).

Usually the complainers are way louder than the satisfied customers. You don't necessarily go post on the forum to say everything works as expected.
 
A review and talk dealing with the ray-traced titles on PS5. Rather amazing for a 36 CU part.



Nice analysis as usual from DF.

Really exciting to see devs putting RT hardware to use in the first wave of next generation console games. We won’t be seeing 4K raytraced effects anytime soon. It seems Gran Turismo is getting away with checkerboard 1080p reflections.

At minimum it shows even with these early builds that AMD has usable RT performance on even console class hardware.
 
Some bizarre ideas about Big Navi here: 3D-stacked memory, Nvidia pricing, and so on. Worth a look if you don't have something more important to do; no idea whether any of this will pan out.

 
Some bizarre ideas about Big Navi here: 3D-stacked memory, Nvidia pricing, and so on. Worth a look if you don't have something more important to do; no idea whether any of this will pan out.




Well, if we go only by what AMD have already told us, which is up to 50% better performance per watt, that is a very big performance hint.

So, it really seems like we can guess how well RDNA2 / Big Navi will perform by just guessing how high they are willing to go in TDP.

Typically, stock GPUs tend to top out at about 250W; if that's where they go, expect something up to 60% faster than the current 5700 XT, which means it would be trading blows with a 2080 Ti.

AMD have not been shy about upping the TDP a whole lot more in the past with fancy AIO coolers, though. If they are willing to go up to 350W again, like they did with one of the liquid-cooled Vega 64 cards (Frontier Edition or something?), we could be talking 125% faster than the 5700 XT, which would make it the fastest consumer GPU on the market, at least until Ampere hits.

It's going to be interesting to see where this one lands.

The performance of the raytracing is a big unknown.
 