NVIDIA Interview – Discussing Ray Tracing Support And ‘Proprietary’ Extensions

chameleoneel (Supreme [H]ardness, joined Aug 15, 2005)
https://wccftech.com/nvidia-intervi...ary-extensions/amp/?__twitter_impression=true

A short, straightforward interview from Wccftech.

NVIDIA says only three games with ray tracing use custom extensions. Those are Vulkan games, and that's because the Vulkan ray tracing standard was not finished by the time those games needed it.
All other games (as far as NVIDIA is aware) follow the DXR standard and should work on any other GPU that also follows it.

I'm paraphrasing there. It's a good little interview. Check it out!
 
It's good to hear that they're embracing the ray tracing built into standard APIs, but I couldn't help noticing that there was no mention of the GameWorks RT module they had previously announced. Hopefully they've scrapped it.
 
I would have liked to ask why the AMD RX 6800 seems to do better with ray tracing when DLSS is off. Since DLSS is a software solution and the Tensor cores are the hardware side of ray tracing, something there is not adding up. Wccftech reported last week that an RX 6800 (non-XT) pulled slightly ahead in ray tracing in some games with DLSS off. Shocking to me if true, especially when you consider where the 6800 sits in the pecking order; this bodes well for the rest of the stack. Even if it was a controlled environment made to show AMD in the best light, it is still ominous in its implications.

The Super Resolution feature of FidelityFX (the DLSS alternative) is currently under development but won't be ready at the time of the 6000 series release. I for one am wishing it great success. It's expected to gain wide industry support (according to KitGuru), much the same way as FreeSync, so who knows where we'll be by the end of 2021, but I'd say AMD fans are going to be quite happy.
 
Wasn't that when it was put against a 3070? If DLSS is off, wouldn't that make it an apples-to-apples comparison?
If the marketing data is to be believed, I am not surprised that it would pull ahead without DLSS running on the NVIDIA card.
 
It will be interesting to see how many games hit in the next six months for the next-gen consoles and PC, built for the MS standard API. As I have said for a long time, AMD has been working on RT for years now for this generation of consoles. NVIDIA realized they could get a product out first, as they were building in AI bits that could be repurposed to do the same job. I think they honestly thought that if they got something working fast enough, they might be able to embed their way of doing things deeply into the game developer market and perhaps lock things into NV. It's almost like they absorbed 3dfx at some point. I half kid... NV seems to have that proprietary-API-to-lock-in-sales thinking deeply ingrained.

Of course we still need to wait and see for sure how AMD stacks up with RT content. Based on what we know of their ray units being part of each CU, I tend to think it will scale very well. NV has to cut a lot of the Tensor/RT bits off on their lower-end SKUs; a 2060 only has 30 RT cores. Based on what we have heard so far about AMD's solution, it seems like they have a simple ray generation unit in each CU, directly attached, which should have far lower internal latency and potentially thousands of cores that could be used by developers even on their lowest-end and console parts. Going to be an interesting comparison, as the techs seem so very different.
 
I think it's more going to come down to what content is and isn't ray traced. At this stage in the tech's deployment I would expect vast performance differences between titles, and differences in what is or isn't ray traced. I think it's going to be another generation or two before we have a "standard" sense of what it means when something is ray traced.
 
It will be interesting to see how many games hit in the next six months for the next-gen consoles and PC, built for the MS standard API. As I have said for a long time, AMD has been working on RT for years now for this generation of consoles. NVIDIA realized they could get a product out first, as they were building in AI bits that could be repurposed to do the same job. I think they honestly thought that if they got something working fast enough, they might be able to embed their way of doing things deeply into the game developer market and perhaps lock things into NV.

That's a load of nonsense. All the games out so far, apart from three, are using the MS standard API. NVIDIA has been working with Microsoft for a long time on ray tracing. They did a demo of Metro Exodus back in March 2018 with Microsoft, and they worked with Microsoft to get DXR 1.0 out by November 2018.

Those three games are using Vulkan, and there is no finished standard out yet; that's why you need vendor-specific extensions to do ray tracing in Vulkan games at the moment. There is nothing stopping AMD from writing their own extension that works with their cards in those three games.
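To make the extension situation concrete, here is a toy sketch (plain Python, not real Vulkan bindings) of how an engine might pick its ray tracing path from the extension strings a driver reports. The extension name strings are the real Vulkan identifiers; the helper function and its return values are hypothetical.

```python
def pick_rt_backend(device_extensions):
    """Prefer the cross-vendor Khronos ray tracing extension; fall back to
    NVIDIA's vendor extension (what the early Vulkan RT games shipped
    against before the standard was finalized); otherwise no RT path."""
    if "VK_KHR_ray_tracing_pipeline" in device_extensions:
        return "khr"   # finalized cross-vendor standard (late 2020)
    if "VK_NV_ray_tracing" in device_extensions:
        return "nv"    # vendor-specific extension used before the standard
    return "none"      # rasterization only

# A pre-standard NVIDIA driver exposes only the vendor extension:
print(pick_rt_backend({"VK_NV_ray_tracing", "VK_KHR_swapchain"}))  # -> nv
# A driver implementing the finalized Khronos standard:
print(pick_rt_backend({"VK_KHR_ray_tracing_pipeline"}))           # -> khr
```

This is exactly why "nothing stops AMD" here: a driver that exposes its own extension string just adds another branch to a check like this.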
 
It's good to hear that they're embracing the ray tracing built into standard APIs, but I couldn't help noticing that there was no mention of the GameWorks RT module they had previously announced. Hopefully they've scrapped it.
No, it is a thing that has been worked into a lot of professional tools, but it also uses the Microsoft DXR libraries; it is just a collection of tools NVIDIA has created to assist developers.

"Introducing GameWorks for Ray Tracing
To allow game developers to take advantage of these new capabilities, NVIDIA also announced the NVIDIA GameWorks™ SDK will add a ray-tracing denoiser module. This suite of tools and resources for developers will dramatically increase realism and shorten product cycles in titles developed using the new Microsoft DXR API and NVIDIA RTX. With these capabilities, developers can create realistic, high-quality reflections that capture the scene around it and achieve physically accurate lighting and shadows. Making these capabilities available on an industry-standard platform like Microsoft DXR means every PC game developer will have access to levels of realism never before possible."
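As a rough illustration of why a denoiser module matters at all: with one shadow ray per pixel, the visibility signal is binary and noisy, and a filter recovers a smooth penumbra. The actual GameWorks denoiser described above is far more sophisticated (edge-aware and temporal); this toy 1D box filter is only a sketch of the concept, and the function name is made up.

```python
def box_denoise(samples, radius=1):
    """Average each sample with its neighbors (window clamped at edges)."""
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - radius), min(len(samples), i + radius + 1)
        window = samples[lo:hi]
        out.append(sum(window) / len(window))
    return out

# Noisy 1-sample-per-pixel shadow visibility across a penumbra (0 = shadow):
noisy = [0, 0, 1, 0, 1, 1, 0, 1, 1, 1]
print(box_denoise(noisy))  # binary flicker becomes a smooth 0-to-1 ramp
```

The point is that denoising lets games trace far fewer rays per pixel than would otherwise be needed, which is what makes real-time ray tracing feasible at all.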
 
Wasn't that when it was put against a 3070? If DLSS is off, wouldn't that make it an apples-to-apples comparison?
If the marketing data is to be believed, I am not surprised that it would pull ahead without DLSS running on the NVIDIA card.
Yes, and my point exactly. It matters little that AMD or NVIDIA collaborated with Microsoft and use the DX12 API or developed tools for developers. We all know what happens with proprietary BS like SLI or G-Sync, and now Tensor cores. NVIDIA tools are just another hoop to go through, and what we end up with is games optimized for them and not others. While NVIDIA separates itself from the competition and expects the industry to follow its lead under the guise of innovation, AMD collaborates, conforms, reinvents, and works well with everybody. When a competitor one-ups you on your supposed innovation and only software makes the difference, you're forced to compare apples to apples and end up with a whole lot of egg on your face.

That said, when NVIDIA resolves their node issues they will likely jump to the top of the heap again, but for now/this generation things seem to be leaning AMD's way, which is great for bringing prices back down to reasonable and will spur real innovation. Bottom line: don't be fooled by NVIDIA hype under the guise of innovation, or you'll spend $1,300 on a 2080 Ti only for it to be obsolete and useless for RT way before its time. All we did was provide NVIDIA much-needed cash for R&D.
 
No, it is a thing that has been worked into a lot of professional tools, but it also uses the Microsoft DXR libraries; it is just a collection of tools NVIDIA has created to assist developers.

"Introducing GameWorks for Ray Tracing
To allow game developers to take advantage of these new capabilities, NVIDIA also announced the NVIDIA GameWorks™ SDK will add a ray-tracing denoiser module. This suite of tools and resources for developers will dramatically increase realism and shorten product cycles in titles developed using the new Microsoft DXR API and NVIDIA RTX. With these capabilities, developers can create realistic, high-quality reflections that capture the scene around it and achieve physically accurate lighting and shadows. Making these capabilities available on an industry-standard platform like Microsoft DXR means every PC game developer will have access to levels of realism never before possible."
We all know that NVIDIA has used the GameWorks tools to make it hard for AMD to optimize their drivers, even when it's built on standard APIs and runs on both, not to mention obfuscating the fact that they were doing things like tessellating at the sub-pixel level because it hurt then-current AMD cards much more than theirs. In this case it would likely be similar to the latter, with them pushing reflections and toning down AO and lighting. To be fair, we also know that developers used these tools because it simplified their work and saved development time.

I'd rather neither company play these games, and it sounds like AMD is at least flirting with doing the same thing with their FidelityFX suite, which could be bad for users if it escalates into a tit-for-tat situation with certain games favoring one vendor and some favoring the other. Of course I could be wrong, and it might end up being a situation where both end up with tools that make it easier for devs to get the most out of both, but that sounds too good to be true.
 
That's the problem with "optimizing" (I really hate that term): during the process you are fine-tuning the elements to run best in a specific environment, and that means when taken out of that environment it will not work as well. It's not necessarily malicious, just an unfortunate part of what it means to optimize something. But yes, it would be friendlier to stop said processes before they reach the point of no longer being hardware-agnostic. The tools now tend not to optimize so much for a set of hardware but instead for a chosen game engine and a performance profile. The optimization process is then more accurately handled at the driver level; it's why NVIDIA is so proud of its day-0 game patches for large launches. They can fine-tune the profiles internally from there to work with changes or specifics the game engine or texture/lighting process created.
This, though, is one of the things the DXR libraries really hope to speed up. The Unreal team talked about it during their PS5 demo: the ability to import movie-quality textures and objects and let the tools strip them down to basic forms, then let the engine's ray tracing effects calculate the needed data. It drastically cuts down on the "optimization" that needs to be done and the amount of information stored in objects and textures, which speeds up development times and ultimately leads to more consistent performance across platforms.
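The "author at full detail, derive cheaper representations automatically" idea above can be sketched in miniature. Real pipelines use error-driven mesh decimation that is far smarter than this; the toy function below (name and keep-every-k strategy both made up for illustration) just thins a dense polyline to show the shape of the workflow.

```python
def decimate(vertices, keep_every=4):
    """Naive LOD pass: keep the first vertex, every k-th vertex after it,
    and always keep the last vertex so the shape's extent is preserved."""
    kept = vertices[::keep_every]
    if vertices and vertices[-1] != kept[-1]:
        kept.append(vertices[-1])
    return kept

dense = [(i, i * i) for i in range(100)]  # 100-vertex "full detail" asset
lod = decimate(dense, keep_every=10)      # 11 vertices survive
print(len(dense), "->", len(lod))         # prints: 100 -> 11
```

The appeal for developers is that the expensive hand-tuning step disappears: the same source asset can be thinned to different budgets per platform, with lighting recomputed by the ray tracing path rather than baked into the stripped data.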
 