AMD RDNA 2 gets ray tracing

GotNoRice

[H]F Junkie
Joined
Jul 11, 2001
Messages
9,403
Is this hardware or software ray tracing? Keep in mind that even Nvidia has already enabled ray tracing on their older cards like the GTX 1080, but it's done in software. Only the RTX series has actual dedicated ray-tracing hardware on it. From what I've read, the ray tracing in DX12 Ultimate WILL be able to make use of the dedicated ray-tracing hardware in Nvidia RTX cards.

Either way, it's great that the new AMD cards will support ray tracing, but will they support it like the Nvidia RTX cards do (in hardware), or will they "support" it like the Nvidia GTX cards do (in software)?
 

fightingfi

Look at Me! I need the attention.
Joined
Oct 9, 2008
Messages
2,887
Ya, GotNoRice, I was wondering the exact same thing: will there be any difference in speed, software vs. hardware, etc.?
 

Factum

[H]ard|Gawd
Joined
Dec 24, 2014
Messages
1,986
A MUCH better link... straight from Microsoft:

https://devblogs.microsoft.com/directx/announcing-directx-12-ultimate/

DirectX Raytracing 1.1
DirectX Raytracing (DXR) brings a new level of graphics realism to video games, previously only achievable in the movie industry. The effects achievable by DXR feel more real, because in a sense they are more real: DXR traces paths of light with true-to-life physics calculations, which is a far more accurate simulation than the heuristics based calculations used previously.

We’ve already seen an unprecedented level of visual quality from titles that use DXR 1.0 since we unveiled it, and built DXR 1.1 in response to developer feedback, giving them even more tools with which to utilize DXR.

DXR 1.1 is an incremental addition over the top of DXR 1.0, adding three major new capabilities:

  • GPU Work Creation now allows Raytracing. This enables shaders on the GPU to invoke raytracing without an intervening round-trip back to the CPU. This ability is useful for adaptive raytracing scenarios like shader-based culling / sorting / classification / refinement. Basically, scenarios that prepare raytracing work on the GPU and then immediately spawn it.
  • Streaming engines can more efficiently load new raytracing shaders as needed when the player moves around the world and new objects become visible.
  • Inline raytracing is an alternative form of raytracing that gives developers the option to drive more of the raytracing process, as opposed to handing work scheduling entirely to the system (dynamic-shading). It is available in any shader stage, including compute shaders, pixel shaders etc. Both the dynamic-shading and inline forms of raytracing use the same opaque acceleration structures.
When to use inline raytracing
Inline raytracing can be useful for many reasons:

  • Perhaps the developer knows their scenario is simple enough that the overhead of dynamic shader scheduling is not worthwhile. For example, a well constrained way of calculating shadows.
  • It could be convenient/efficient to query an acceleration structure from a shader that doesn’t support dynamic-shader-based rays. Like a compute shader or pixel shader.
  • It might be helpful to combine dynamic-shader-based raytracing with the inline form. Some raytracing shader stages, like intersection shaders and any hit shaders, don’t even support tracing rays via dynamic-shader-based raytracing. But the inline form is available everywhere.
  • Another combination is to switch to the inline form for simple recursive rays. This enables the app to declare there is no recursion for the underlying raytracing pipeline, given inline raytracing is handling recursive rays. The simpler dynamic scheduling burden on the system can yield better efficiency.
Scenarios with many complex shaders will run better with dynamic-shader-based raytracing, as opposed to using massive inline raytracing uber-shaders. Meanwhile, scenarios that have a minimal shading complexity and/or very few shaders will run better with inline raytracing.

If the above all seems quite complicated, well, it is! The high-level takeaway is that both the new inline raytracing and the original dynamic-shader-based raytracing are valuable for different purposes. As of DXR 1.1, developers not only have the choice of either approach, but can even combine them both within a single renderer. Hybrid approaches are aided by the fact that both flavors of DXR raytracing share the same acceleration structure format, and are driven by the same underlying traversal state machine.

Best of all, gamers with DX12 Ultimate hardware can be assured that no matter what kind of Raytracing solution the developer chooses to use, they will have a great experience.
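The difference between the two modes the blog describes can be sketched in plain code. Below is a toy Python model of the idea (all names here are hypothetical, and this is nothing like the real HLSL/D3D12 API): in the dynamic-shading style the system finds the closest hit and invokes a callback shader for you, while in the inline style the calling code drives traversal itself with a query object.

```python
import math
from dataclasses import dataclass

@dataclass
class Sphere:
    center: tuple
    radius: float
    material: str

def hit_t(sphere, origin, direction):
    """Nearest positive ray/sphere hit distance, or None (direction normalized)."""
    ox, oy, oz = (origin[i] - sphere.center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - sphere.radius ** 2
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

# Dynamic-shading style: the "system" traverses the scene, finds the closest
# hit, and calls the appropriate shader (hit or miss) on the app's behalf.
def trace_ray_dynamic(scene, origin, direction, on_hit, on_miss):
    hits = [(t, s) for s in scene if (t := hit_t(s, origin, direction)) is not None]
    return on_hit(*min(hits, key=lambda h: h[0])) if hits else on_miss()

# Inline style: the calling code owns the loop, stepping through candidate
# hits itself and deciding in ordinary control flow what to do with each.
class RayQuery:
    def __init__(self, scene, origin, direction):
        cands = [(t, s) for s in scene if (t := hit_t(s, origin, direction)) is not None]
        self._cands = iter(sorted(cands, key=lambda h: h[0]))
        self.hit = None
    def proceed(self):
        self.hit = next(self._cands, None)
        return self.hit is not None

scene = [Sphere((0, 0, 5), 1.0, "red"), Sphere((0, 0, 9), 1.0, "blue")]

color = trace_ray_dynamic(scene, (0, 0, 0), (0, 0, 1),
                          on_hit=lambda t, s: s.material,
                          on_miss=lambda: "sky")
print(color)  # red (the closer sphere wins)

q = RayQuery(scene, (0, 0, 0), (0, 0, 1))
while q.proceed():
    print(q.hit[0], q.hit[1].material)  # 4.0 red, then 8.0 blue
```

Either way, the acceleration structure being traversed is the same; the question is just who owns the loop, which is the tradeoff the blog is describing.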
Now say after me:

D X R

DirectX RayTracing.

Raytracing is DXR...
 

fightingfi

Look at Me! I need the attention.
Joined
Oct 9, 2008
Messages
2,887
So with the new DX12 Ultimate, will it be able to fully support the new cards that are out now (5700, 2070, 2060, 2080), or are we going to have to upgrade to get the shiny DX12 sticker to tell us we're now ready to game on?
 

MavericK

Zero Cool
Joined
Sep 2, 2004
Messages
29,338
So with the new DX12 Ultimate, will it be able to fully support the new cards that are out now (5700, 2070, 2060, 2080), or are we going to have to upgrade to get the shiny DX12 sticker to tell us we're now ready to game on?
Read and learn - down at the bottom clicking "Supported GPUs"

https://www.nvidia.com/en-us/geforce/technologies/directx-12-ultimate/

Of course that's just on the nVidia end. I don't believe any of the current AMD cards support it, but someone can correct me if I'm wrong.
 

cybereality

Supreme [H]ardness
Joined
Mar 22, 2008
Messages
5,181
From what I understand, it requires the RDNA2 architecture, so no current cards would work (and they would probably be too slow anyhow with a software-based approach).
 

jologskyblues

Weaksauce
Joined
Mar 20, 2017
Messages
117
It might be just my observation, but I have the feeling that Nvidia may have indirectly helped AMD achieve their real-time hybrid raytracing implementation through Microsoft, via NV and MS' collaboration on DXR 1.0 and MS' decision to put that framework into their next Xbox console, since MS defined the requirements AMD followed in developing the custom APU for the new console.

Maybe Nvidia was willing to let this happen to make sure their R&D and transistor/die-space investments in Turing (RTX, Mesh Shaders, VRS) get widespread adoption. If these technologies become baseline features for the next-gen consoles, and GPUs from AMD and Intel gain similar implementations via common API specs/requirements, they stay relevant instead of fading into irrelevance like some of Nvidia's other technological investments (PhysX, Simultaneous Multi-Projection, etc.). With DX12 Ultimate and Vulkan Ray Tracing, Nvidia has pretty much had a major hand in defining how hybrid real-time ray tracing is done in the industry.

According to the Digital Foundry videos featuring the Xbox Series X, AMD's RDNA2 will also have its own "RT cores" to handle BVH in hardware. I do wonder how de-noising will be handled by each vendor going forward. Turing can possibly use its tensor cores, while RDNA2 might be able to do it using shaders.
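For reference, the BVH mentioned here (bounding volume hierarchy) is a tree of boxes over the scene's geometry: traversal can skip an entire subtree the moment a ray misses that subtree's bounding box, and that traversal is the part the dedicated hardware accelerates. A toy CPU sketch of the idea, with made-up names and nothing resembling the actual RDNA2 or Turing units:

```python
from dataclasses import dataclass

@dataclass
class Node:
    lo: tuple          # AABB min corner
    hi: tuple          # AABB max corner
    children: list     # child Nodes, or [] for a leaf
    items: list        # primitives stored at a leaf

def ray_hits_box(origin, direction, lo, hi):
    """Slab test: does the ray's parameter interval overlap the box on all axes?"""
    t_near, t_far = 0.0, float("inf")
    for o, d, a, b in zip(origin, direction, lo, hi):
        if d == 0:
            if not a <= o <= b:    # parallel to this slab and outside it
                return False
            continue
        t1, t2 = (a - o) / d, (b - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

def traverse(node, origin, direction, out):
    """Collect primitives whose leaf boxes the ray touches,
    pruning whole subtrees whose bounds it misses."""
    if not ray_hits_box(origin, direction, node.lo, node.hi):
        return
    if not node.children:
        out.extend(node.items)
    for child in node.children:
        traverse(child, origin, direction, out)

left = Node((-2, -1, 4), (-1, 1, 6), [], ["tri_a", "tri_b"])
right = Node((1, -1, 4), (2, 1, 6), [], ["tri_c"])
root = Node((-2, -1, 4), (2, 1, 6), [left, right], [])

hits = []
traverse(root, (1.5, 0, 0), (0, 0, 1), hits)
print(hits)  # only the right leaf's triangles are reached
```

The win is that the left leaf's triangles are never touched: the ray fails the left box test, so everything under that node is skipped in one comparison.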
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
708
It might be just my observation, but I have the feeling that Nvidia may have indirectly helped AMD achieve their real-time hybrid raytracing implementation through Microsoft, via NV and MS' collaboration on DXR 1.0 and MS' decision to put that framework into their next Xbox console, since MS defined the requirements AMD followed in developing the custom APU for the new console.
There's no intellectual capital in DXR. Its concepts have been common knowledge in the graphics industry for decades. The secret sauce is in the hardware implementation, and Nvidia certainly didn't share that with AMD.

And of course it’s in everyone’s best interest to define a common api for raytracing (DXR/Vulkan RT). Just like for any other graphics feature that you want developers to use.

I do wonder how de-noising will be handled by each vendor going forward. Turing can possibly use its tensor cores, while RDNA2 might be able to do it using shaders.
No games use tensors for denoising on Turing. RDNA will denoise just fine.
 

jologskyblues

Weaksauce
Joined
Mar 20, 2017
Messages
117
There's no intellectual capital in DXR. Its concepts have been common knowledge in the graphics industry for decades. The secret sauce is in the hardware implementation, and Nvidia certainly didn't share that with AMD.
I made no such assertion. I just said the framework for the hardware implementation of DXR, as well as VRS and Mesh Shading, is very much based on Microsoft and Nvidia's work on them, which in turn is based on the hardware capabilities Nvidia built into Turing. I never implied that Nvidia directly shared any proprietary technology with AMD. The DXR 1.0 API spec was developed by MS and NV, after which MS used the resulting requirements in separately developing the custom GPU in the Xbox Series X with AMD.


And of course it’s in everyone’s best interest to define a common api for raytracing (DXR/Vulkan RT). Just like for any other graphics feature that you want developers to use.
I totally agree, but my point is that it's a win-win for AMD, MS and Nvidia. AMD gets hybrid RTRT sooner rather than later; Nvidia gets broader developer support for their technologies, especially RTX, so their cards stay relevant in the midst of the AMD-powered next-gen consoles; and MS achieves a more unified platform across the Xbox and Windows gaming PCs.

No games use tensors for denoising on Turing. RDNA will denoise just fine.
I'm aware of that, which is why I specifically said "going forward" and "possibly".
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
708
I made no such assertion. I just said the framework for the hardware implementation of DXR, as well as VRS and Mesh Shading, is very much based on Microsoft and Nvidia's work on them, which in turn is based on the hardware capabilities Nvidia built into Turing. I never implied that Nvidia directly shared any proprietary technology with AMD. The DXR 1.0 API spec was developed by MS and NV, after which MS used the resulting requirements in separately developing the custom GPU in the Xbox Series X with AMD.
Yeah, I get what you're saying. I just don't think that DXR itself was particularly helpful in designing hardware.

I'm sure certain hardware optimizations were possible given the constraints imposed by the DXR API, but the core problems to be solved, ray/box intersection and ray/triangle intersection, have been around since long before DXR.
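Those two primitives are indeed textbook algorithms. Sketched in Python for reference, using the standard "slab" method for ray/box and Möller–Trumbore for ray/triangle (illustrative only, not any vendor's hardware implementation):

```python
def ray_aabb_hit(origin, direction, box_min, box_max):
    """Slab method: intersect the ray with each axis-aligned slab and
    check that the three [t_near, t_far] intervals overlap."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if d == 0:
            if not lo <= o <= hi:    # parallel to this slab and outside it
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

def ray_tri_hit(origin, direction, v0, v1, v2, eps=1e-9):
    """Möller–Trumbore: solve for barycentric (u, v) and distance t;
    returns t on hit, else None."""
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    cross = lambda a, b: (a[1] * b[2] - a[2] * b[1],
                          a[2] * b[0] - a[0] * b[2],
                          a[0] * b[1] - a[1] * b[0])
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:               # ray parallel to the triangle plane
        return None
    s = sub(origin, v0)
    u = dot(s, p) / det
    if not 0.0 <= u <= 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) / det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) / det
    return t if t > eps else None

print(ray_aabb_hit((0, 0, 0), (0, 0, 1), (-1, -1, 4), (1, 1, 6)))              # True
print(ray_tri_hit((0, 0, 0), (0, 0, 1), (-1, -1, 5), (1, -1, 5), (0, 1, 5)))   # 5.0
```

The hard part was never these formulas; it's doing billions of them per second in fixed-function silicon, which is exactly the vendor-specific piece DXR doesn't specify.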
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,341
I made no such assertion. I just said the framework for the hardware implementation of DXR, as well as VRS and Mesh Shading, is very much based on Microsoft and Nvidia's work on them, which in turn is based on the hardware capabilities Nvidia built into Turing. I never implied that Nvidia directly shared any proprietary technology with AMD. The DXR 1.0 API spec was developed by MS and NV, after which MS used the resulting requirements in separately developing the custom GPU in the Xbox Series X with AMD.
This isn't correct, not even close. All DX12 cards support DXR. The difference lies in the implementation. Nvidia contributed no more to the API than AMD did.

That's why I've said multiple times that it's really disingenuous to say Nvidia developed ray tracing in games; it's not true at all. Imagination has done more to put ray tracing in games than even Nvidia.

What you can say, though, is that Nvidia is the only one that ships hardware today that has DXR enabled. That's true. But go beyond that and you're getting into areas where it's just not accurate.
 

jologskyblues

Weaksauce
Joined
Mar 20, 2017
Messages
117
This isn't correct, not even close. All DX12 cards support DXR. The difference lies in the implementation. Nvidia contributed no more to the API than AMD did.

What you can say, though, is that Nvidia is the only one that ships hardware today that has DXR enabled. That's true. But go beyond that and you're getting into areas where it's just not accurate.
If you're just talking about DirectX 12 as it was released back in 2015, you are correct, but I was referring specifically to DX12 Ultimate, where the four new key features require back-end hardware support that is only available at the moment on RTX Turing GPUs, which have been out since 2018. It's blatantly obvious at this point that Nvidia contributed more, and for much longer, to DX12 Ultimate, seeing that it's their hardware that's being fully supported first, and it's in their best interest to do so. Coincidence? I think not. Please don't ignore that MS and NV were the main collaborators in developing the DXR 1.0 API in Windows, which leverages RTX hardware and has actually been used in games that support it. I'm not saying that AMD didn't contribute anything to DX12 Ultimate, but at best, AMD's contributions to the new API stem more from their work with MS on the Xbox Series X, which looks to me to still be based on what RTX already has (RT Cores, Mesh Shading, VRS, Sampler Feedback). Remember, AMD and Intel only announced future support for raytracing in their roadmaps after RTX came out. Not before.

That's why I've said multiple times that it's really disingenuous to say Nvidia developed ray tracing in games; it's not true at all. Imagination has done more to put ray tracing in games than even Nvidia.

What you can say, though, is that Nvidia is the only one that ships hardware today that has DXR enabled. That's true. But go beyond that and you're getting into areas where it's just not accurate.
I didn't say what you think I said, because I was clearly talking about the DX12 Ultimate API, which is based on RTX Turing hardware capabilities, not "who invented ray tracing in games". It's Nvidia's own hardware-accelerated real-time hybrid ray tracing for PC games, along with Mesh Shading and Variable Rate Shading, that was adopted in DirectX 12 Ultimate as the basic underlying hardware feature-set design, and those features were evidently established with the RTX Turing feature set back in 2018. Surely it wasn't just a coincidence for Turing to support four out of four DX12 Ultimate features at the outset? AMD and Intel will have to release similar hardware implementations that support these newly announced DX12 features in due time.
 

cybereality

Supreme [H]ardness
Joined
Mar 22, 2008
Messages
5,181
I mean ray tracing in software has existed for decades, for example used for movie graphics.

Nvidia didn't invent ray tracing, what they did was make it viable in real-time for the general public, more so than any other company in recent times.

DXR followed from Nvidia proving RTX worked. MS would never have made a theoretical API for hardware that didn't exist. Nvidia has everything to do with DirectX and Vulkan now adopting ray tracing.
 

jologskyblues

Weaksauce
Joined
Mar 20, 2017
Messages
117
Just to clear things up: I wasn't talking about who invented raytracing and whatnot. I was talking about the actual hardware that the new features introduced in the DirectX 12 Ultimate API were initially based upon.
 

dave343

[H]ard|Gawd
Joined
Oct 17, 2000
Messages
1,679
Got it... So anyone who has bought, or will be buying, a Navi-based card, whose architecture is less than 9 months old, can't make use of the new DX12 Ultimate features... Nice (y)
 