NVIDIA Previews and Releases Path Tracing SDK at GDC 2023

erek

[H]F Junkie
“NVIDIA has also announced a new set of ray tracing features that will be available in the NVIDIA Caustics branch of Unreal Engine 5, bringing path tracing to that same game engine. NVIDIA Caustics, as it is called, is a set of features in Unreal Engine 5 that includes ray tracing effects such as depth of field in translucent objects, something that is apparently pretty hard to do with traditional rasterization workflows.

NVIDIA released two videos showing a bit more information about the DLSS Frame Generation, Path Tracing, and Nsight developer tools in Cyberpunk 2077 and its Ray Tracing: Overdrive mode.

More information from NVIDIA should be available as GTC 2023 kicks off later today.”


Source: https://www.techpowerup.com/306188/nvidia-previews-and-releases-path-tracing-sdk-at-gdc-2023
 
Need to sell the $2500 5090 ...
The good news: it's only a 6-slot card and not the rumored 7. However, it does require a rugged chassis with extra-thick steel and/or titanium. It can be powered via fairly typical gas or diesel generators; there's even a Powerwall+ solution to handle very short-term requirements. However, it is recommended that the board be kept away from hardwood floors or other combustible materials. Early prototype, originally called the "4090 Ti":

 
"According to NVIDIA, the thing that makes Path Tracing possible now, and more accessible to developers, is the combination of previously available NVIDIA technologies, as well as some new ones, including the new performance multiplier in DLSS 3, called the DLSS Frame Generation."

So, even Nvidia's cards can't really do it.
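
For a rough sense of what that "performance multiplier" actually buys, here's a back-of-envelope sketch. All the numbers are my own illustrative assumptions, not NVIDIA's figures: render path-traced frames at a lower internal resolution, upscale with DLSS Super Resolution, then let Frame Generation interpolate one extra frame per rendered frame.

```python
# Back-of-envelope sketch of the "performance multiplier" idea.
# All numbers here are illustrative assumptions, not NVIDIA's figures.

def effective_fps(native_pt_fps: float,
                  upscale_speedup: float = 3.0,   # assumed gain from rendering fewer pixels
                  frame_generation: bool = True) -> float:
    """Estimate displayed fps from native path-tracing fps."""
    fps = native_pt_fps * upscale_speedup          # render fewer pixels, upscale the rest
    if frame_generation:
        fps *= 2.0                                 # one AI-generated frame per rendered frame
    return fps

# A card that path-traces ~16 fps natively would display ~96 fps:
print(effective_fps(16.0))  # -> 96.0
```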
 
Just try to imagine how good PC games will look at the end of this console gen: this, plus devs knowing how to extract everything from Unreal 5, plus multiple, multiple gens of cards that can do ray tracing scattered about 🤤

Looking forward to the "wow" that Division 1 last gave me, once again (though RE Engine games can look damn good too now)
 
Now if only they didn't need gimmicks to make the cards "fast" enough to actually do RT.
 
Now if only they didn't need gimmicks to make the cards "fast" enough to actually do RT.
What about the denoising requirements for ray tracing? Seems there’s a loss of fidelity somewhere somehow in the chain
 
Now if only they didn't need gimmicks to make the cards "fast" enough to actually do RT.
I'm guessing you consider things like Anisotropic Filtering, Anti-Aliasing, Tessellation, etc. gimmicks then? Because, you know, all of those are gimmicks to get a desired picture displayed on a screen.
 
Look, more stuff people can turn on, look at for a minute, and then turn back off so they can actually play the game at enjoyable frame rates.
 
That sounds like a personal choice. Are you grumpy about being given more choices?
It's not like there's an NVIDIA line of powerful cards without dedicated RT hardware to buy from (and such cards would come at a cost as well). The Internet is mostly about complaining, and at least here, in a convoluted way, one could draw a causal link between the high price of their AMD gaming card and that focus.

What about the denoising requirements for ray tracing? Seems there’s a loss of fidelity somewhere somehow in the chain
As was shown in a recent thread about the DreamWorks rendering tech, today this is even used in non-realtime rendering by some because of how good it has gotten.

It is a loss of fidelity versus having every pixel actually calculated, but at the same time and power budget it gives much higher fidelity than any other solution.
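
To make that trade-off concrete, here's a toy illustration (my own sketch, not NVIDIA's actual denoiser): a 1-sample-per-pixel Monte Carlo estimate is very noisy, but even a crude spatial filter recovers most of the fidelity for far less compute than brute-forcing more samples would cost.

```python
# Toy illustration of "few samples + denoise" vs. raw Monte Carlo noise.
# The box filter stands in for a real denoiser; everything is assumed.
import numpy as np

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, np.pi, 256)) ** 2       # "ground truth" radiance along a scanline
noisy_1spp = truth + rng.normal(0, 0.3, truth.shape)  # 1 spp: high-variance estimate

def box_denoise(x: np.ndarray, radius: int = 4) -> np.ndarray:
    """Average each pixel with its neighbors -- a crude stand-in for a real denoiser."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(x, kernel, mode="same")

rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
print(f"1 spp raw:      RMSE {rmse(noisy_1spp, truth):.3f}")
print(f"1 spp denoised: RMSE {rmse(box_denoise(noisy_1spp), truth):.3f}")
```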
 
It's not like there's an NVIDIA line of powerful cards without dedicated RT hardware to buy from (and such cards would come at a cost as well). The Internet is mostly about complaining, and at least here, in a convoluted way, one could draw a causal link between the high price of their AMD gaming card and that focus.


As was shown in a recent thread about the DreamWorks rendering tech, today this is even used in non-realtime rendering by some because of how good it has gotten.

It is a loss of fidelity versus having every pixel actually calculated, but at the same time and power budget it gives much higher fidelity than any other solution.
What are they doing, using dithering or what?

https://link.springer.com/chapter/10.1007/978-3-031-23473-6_17

Light field rendering and displays are emerging technologies that produce more immersive visual 3D experiences than the conventional stereoscopic 3D technologies, as well as provide a more comfortable virtual or augmented reality (VR/AR) experience by mitigating the vergence–accommodation conflict. Path tracing photorealistic synthetic light fields in real time is extremely challenging, since it involves rendering a large amount of viewpoints for each frame. However, these viewpoints are often spatially very close to each other, especially in light field AR glasses or other near-eye light field displays. In this paper, we propose a practical real-time light field path tracing pipeline and demonstrate it by rendering […]
 
I'm guessing you consider things like Anisotropic Filtering, Anti-Aliasing, Tessellation, etc. gimmicks then? Because, you know, all of those are gimmicks to get a desired picture displayed on a screen.
Oh no! Color grading, HDR, physics-based particle effects, all gimmicks too! THE CHOICES AHHHHHHHH! :p
 
Oh no! Color grading, HDR, physics-based particle effects, all gimmicks too! THE CHOICES AHHHHHHHH! :p
What about the dithering gimmick from 6-bit to apparent 8-bit displays done in nVidia hardware support?
 
What are they doing, using dithering or what?
I'll take a look; not sure how much I will understand (the documentation is not out yet), but I imagine they do some intelligent light importance sampling and other tricks to kill rays as fast as possible, and that denoising has gotten extremely good, meaning you need way fewer rays per pixel for impressive results.
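
For anyone curious what those two guessed-at tricks look like, here's a minimal sketch under my own simplified assumptions (the names and numbers are illustrative, not from NVIDIA's SDK): Russian roulette kills dim paths early while reweighting survivors so the estimate stays unbiased, and light importance sampling picks bright lights more often while compensating with a 1/pdf weight.

```python
# Sketch of (1) Russian roulette path termination and
# (2) power-proportional light sampling. Illustrative only.
import random

def russian_roulette(throughput: float, bounce: int, min_bounces: int = 3):
    """Return reweighted throughput if the path survives, else None."""
    if bounce < min_bounces:
        return throughput                  # always keep the first few bounces
    survive_p = min(0.95, throughput)      # dim paths are more likely to die
    if random.random() >= survive_p:
        return None                        # path terminated early
    return throughput / survive_p          # reweight so the estimator stays unbiased

def sample_light(lights):
    """Pick a (name, power) light proportionally to power; return (name, 1/pdf)."""
    total = sum(power for _, power in lights)
    r, acc = random.random() * total, 0.0
    for name, power in lights:
        acc += power
        if r <= acc:
            return name, total / power     # 1/pdf, since pdf = power/total
    return lights[-1][0], total / lights[-1][1]

# e.g. the sun gets sampled ~100x more often than a small lamp:
print(sample_light([("sun", 100.0), ("lamp", 1.0)]))
```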

This seems to be another trick used:
https://www.mitsuba-renderer.org/~wenzel/papers/decomposition.pdf

Playing around with it a little bit, there is still a good difference between a 4000-sample render that takes more than a minute and one that takes ~2500x less time, running in real time (20 fps or so on a 3070).
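
As a quick sanity check on that gap: Monte Carlo error falls as 1/sqrt(N), so cutting the sample budget ~2500x raises the raw noise ~50x, and that 50x is roughly what the denoiser has to paper over.

```python
# Monte Carlo error scales as 1/sqrt(N): a ~2500x smaller sample
# budget means ~50x more raw noise for the denoiser to hide.
import math

offline_spp = 4000                       # the >1-minute render
realtime_spp = offline_spp / 2500        # ~1.6 samples per pixel at ~20 fps
noise_ratio = math.sqrt(offline_spp / realtime_spp)
print(f"~{realtime_spp:.1f} spp in real time, ~{noise_ratio:.0f}x the raw noise")
```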

I'm still not entirely clear on how this differs from existing RT, but OK.
This seems close to an animated-movie type of solution, making it easier to build the next Quake/Portal RTX path-traced type of game. Visually it is a bit like moving the camera around in Blender with OptiX on, but aiming at 20 fps on a 3070 instead of a frame every 2-3 seconds, which I imagine will be a locked 60 fps with DLSS 3 on a 4060 Ti or above.
 
This is great, but I'm really hoping to see a next-gen UE5 game other than Fortnite before adoption of the new-for-2023 APIs begins in earnest...
 