A bit of a scholarly fluff piece but not an inaccurate one.
This is exactly how it works. It's really the same as any other task. Both computation and even manual labor go faster through two approaches:
0) Decide what work is sufficient
1) Do less work
2) Do work faster
The AI-based denoiser allows less work to be done. The RT cores allow for the remaining work to be done faster.
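The "do less work" half of that split is easy to see with a toy Monte Carlo estimate (a generic sketch, nothing Nvidia-specific): the noise in a Monte Carlo result shrinks roughly as 1/sqrt(N), so cutting samples makes things fast but grainy, and that grain is exactly what the denoiser buys back.

```python
import random
import statistics

# Toy Monte Carlo estimate of pi. The point: noise in the estimate
# shrinks roughly as 1/sqrt(N), so fewer samples = faster but grainier.

def estimate_pi(n_samples, rng):
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

rng = random.Random(0)

def spread(n_samples, trials=200):
    # Standard deviation across repeated estimates at a fixed sample count.
    return statistics.stdev(estimate_pi(n_samples, rng) for _ in range(trials))

# 100x more samples -> roughly 10x less noise (1/sqrt(N) scaling).
# "Do less work" means accepting the noisy cheap estimate and letting
# a denoiser clean it up afterwards.
print(spread(100), spread(10_000))
```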
Thanks, that was very interesting.
The most interesting part for me was around 8:45. My understanding from the video is that Nvidia has taken ray tracing from something that took months to render a frame to something that can be done in milliseconds (aka real-time), partly by reducing the number of light projection samples needed to obtain an image. The low number of light rays basically gives you a rough, dotted image, but then Nvidia uses AI to fill in the gaps, which results in an impressively clear image (the narrators didn't specifically say it was an AI algorithm, but I assume it is, given the impressive results). It's kind of like DLSS, except for light rays instead of video frames.
It's crazy, though. Between AI-augmented ray tracing and DLSS, I wonder if pretty soon video cards are going to be used primarily for executing neural nets rather than rendering graphics during gameplay. I mean, it's not totally surprising given how useful GPUs have been for AI applications for years now, but until this point they have always been two separate worlds: frame rendering for gamers and CUDA-core neural net execution for AI developers. It makes sense that over time these two worlds would gradually merge.
Uh, step 1 covered that already.
This step is neither part of the denoiser nor something the RT cores can do. They could probably use the tensor cores in some way, but it's very likely just CPU code in the driver.
I want on-GPU, real-time, AI-based NPC AI in games.
Well... better AI has been possible for decades, but people don't find it as fun to play against because they don't like losing.
Don't @ me with facts and aKcHuAlLiEs - I want what I want
Yes, that's basically what denoising is. They just shoot one ray per pixel, which isn't nearly enough; you need to shoot a thousand or more to really get an image with no noise in it. So you get the super grainy image you see, where a lot of pixels don't even get a ray returning to them. But then you throw a denoising filter on that and holy shit, it looks pretty good. Not 100% the same as an output that has thousands of passes, but way closer than you ever thought possible. They are doing a bunch of other things to speed it up as well, but the denoiser is really the magic part.
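Here's a toy sketch of that 1-sample-per-pixel-plus-denoise idea. Everything in it is made up for illustration: a plain box filter stands in for Nvidia's actual AI denoiser, and the "ground truth" is just a smooth gradient, but it shows how averaging neighbours trades a little blur for a big drop in noise.

```python
import numpy as np

# Toy illustration of "1 noisy sample per pixel, then denoise".
# A box filter stands in for the real (AI-based) denoiser.
rng = np.random.default_rng(42)

# "Ground truth": a smooth gradient, as if rendered with thousands of passes.
h, w = 64, 64
truth = np.linspace(0.2, 0.8, w)[None, :].repeat(h, axis=0)

# 1-spp render: every pixel gets a single, very noisy sample of the truth.
noisy = truth + rng.normal(0.0, 0.25, size=truth.shape)

def box_filter(img, radius=2):
    # Average each pixel with its neighbours (np.roll wraps at the
    # edges, so borders pick up a little extra error -- fine for a toy).
    out = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out / (2 * radius + 1) ** 2

denoised = box_filter(noisy)

def mse(a, b):
    return float(np.mean((a - b) ** 2))

# Smoothing trades a little bias (blur) for a big variance reduction,
# so the denoised image is much closer to the converged reference.
print(mse(noisy, truth), mse(denoised, truth))
```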
That's done at the user interface level in rendering packages, or pre-configured by the dev in real-time setups like games. In rendering, AI denoising has been a game-changer, allowing an 80% reduction in passes for many scenes. Combined with dedicated RT hardware, this has changed many artists' workflows.
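As a back-of-envelope for what an 80% pass reduction means in wall-clock terms (the per-pass time below is a made-up number, not something from the video):

```python
# Hypothetical numbers: a frame that needed 1000 passes to converge
# now looks clean after 200 passes plus denoising.
passes_before, passes_after = 1000, 200
seconds_per_pass = 0.5  # made-up per-pass cost

time_before = passes_before * seconds_per_pass  # 500 s per frame
time_after = passes_after * seconds_per_pass    # 100 s per frame
savings = 1 - time_after / time_before          # 0.8, i.e. 80% less render time
print(time_before, time_after, savings)
```

Scaled across a whole animation or a studio's asset pipeline, that ratio is where the "saves millions" argument comes from.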
This is why ray tracing will gradually take over traditional raster-only methods. The time it saves in workflow and man-hours is insane for HQ assets. It saves studios too much money not to use it; if they can save millions on development by making us spend an extra $200 on hardware, it's an easy decision for them.