“Back Stage” Ray Tracing Tech Demo - Created By Luminous Productions

https://www.dusterwald.com/2016/07/path-tracing-vs-ray-tracing/

Basically, ray tracing still isn't perfectly physically accurate.

(Ray Tracing) "It gives us reflections and refractions virtually for free and it gives very nice hard shadows (unfortunately in the real world shadows are rarely if ever perfectly sharp). So just like rasterization engines have to cheat to achieve reflections and refractions (pay close attention to reflective surfaces in games, they either reflect only a static scene, or are very blurry or reflect only objects that are on screen), a ray tracer has to cheat to get soft shadows, caustics, and global illumination to name a few effects required to achieve photo realism."

"Now a path tracer is like a ray tracer on steroids. Instead of sending out one ray it sends out tens, hundreds or even thousands of rays for each pixel to be rendered. When it hits a surface it doesn’t trace a path to every light source, instead it bounces the ray off the surface and keeps bouncing it until it hits a light source or exhausts some bounce limit. It then calculates the amount of light transferred all the way to the pixel, including any colour information gathered from surfaces along the way. It then averages out the values calculated from all the paths that were traced into the scene to get the final pixel colour value. "

So path tracing is even more brute-force than ray tracing... :p dang
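
To make the quoted description concrete, here's a minimal path-tracer sketch in Python-style pseudocode. The helpers (Color, camera.generate_ray, scene.intersect, material.scatter, etc.) are made up for illustration, not any real renderer's API:

Code:
import random

MAX_BOUNCES = 8
SAMPLES_PER_PIXEL = 256   # "tens, hundreds or even thousands of rays" per pixel

def render_pixel(scene, camera, x, y):
    # Average many random paths through this pixel (Monte Carlo integration).
    total = Color(0, 0, 0)
    for _ in range(SAMPLES_PER_PIXEL):
        ray = camera.generate_ray(x, y, jitter=random.random())
        total += trace_path(scene, ray, depth=0)
    return total / SAMPLES_PER_PIXEL

def trace_path(scene, ray, depth):
    if depth > MAX_BOUNCES:            # exhausted the bounce limit
        return Color(0, 0, 0)
    hit = scene.intersect(ray)
    if hit is None:
        return scene.background(ray)
    if hit.material.is_emissive:       # the path finally reached a light source
        return hit.material.emission
    # Bounce off the surface in a (usually importance-sampled) random direction
    # and keep going, tinting the result by the surface colour along the way.
    new_ray = hit.material.scatter(ray, hit)
    return hit.material.albedo * trace_path(scene, new_ray, depth + 1)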

Fun stuff. Even for offline rendering, this artist recommends turning off ray-traced global illumination and faking it with rasterization tricks, which tells me that the wholly deformable, ray-traced universe that gamers want will not come to be any time soon.
 
This blog post helps a bit too.

https://news.developer.nvidia.com/t...active-path-tracing-scenes-from-a-short-film/

I thought that path tracing meant choosing ray paths completely at random, using lots of rays, and then figuring out which rays happen to hit a light source. Nvidia seems to be using a slightly different definition that involves selectively choosing which lights to shoot rays at.

So it's not completely random, but it doesn't waste as many rays as brute-force path tracing.
 
Yeah, path tracing is basically the method we'll wanna go for, but it's expensive. 18 years ago when I first tried it (the SplutterFish Brazil renderer) it was insanely slow, but for arch-viz work focused primarily on lighting it was obviously the future as we got more speed/memory. Nowadays, something like OTOY gets pretty close to realtime, at least with a heavy dose of denoising.

Combined with some of that AI/machine-learning stuff, who knows... I don't think it's "a few hundred years" away. I mean, folks are tricked every day into thinking video games are real footage; a lot of us here are just so tuned into what is/isn't real because we're actively looking for it.
 
Nowadays, something like OTOY gets pretty close to realtime, at least with a heavy dose of denoising.

In my experience, Octane is nowhere close to being suitable for realtime gaming, even on four GPUs. That said, it is massively better than "good enough" for an artist building a scene who needs quick real-time or near-real-time feedback while painting it. It's beyond anything we had imagined as recently as 6-7 years ago.

This is clearly where 3D rendering, including gaming, is going - and very, very quickly. A new computational paradigm is being developed along with specialized hardware to run it. Everybody here should be pretty excited for this, even if it's going to be another generation or two before we have $150 GPUs that work with it for gaming. Path tracing truly is the next step in the evolution of gaming, and everything that happens between now and then is simply polishing the old tech to keep us happy until the future is now.
 
Looks one step above HL2. I'm not impressed at all; it looks just as fake as something from the last 3 generations.
 
Looks one step above HL2. I'm not impressed at all; it looks just as fake as something from the last 3 generations.
OK, if there is not much difference, you can play games with 2003 graphics.
 
Aside from reflections on the lipstick (who cares), most of this could be simulated in current-gen engines.
 
Aside from reflections on the lipstick (who cares), most of this could be simulated in current-gen engines.

I mean... am I the only one here who caught that for the entire demo, the only things not rendered via ray tracing are the mirror itself and the things stuck to it?
 
https://www.dusterwald.com/2016/07/path-tracing-vs-ray-tracing/

Basically, ray tracing still isn't perfectly physically accurate.

This is one of the reasons why some of us curmudgeonly types (like me) have been shitting on all the ray tracing hype. Classical eye-based ray tracing isn't a complete solution to the rendering equation. It has some neat benefits that rasterization doesn't, but it has plenty of drawbacks as well and is computationally expensive. It isn't a magic solution that gets you photorealism without any additional "cheats". For that matter, neither is path tracing. It comes close... but it doesn't handle light scattering properly. Volumetric path tracing is an extension that does, and thus can properly simulate light going through smoke and so on. Basically, we are VEEEERY far away from being able to have a rendering engine that handles everything, in real time, with no cheating, and gives us photorealistic results.
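
For reference, the rendering equation all of these techniques are approximating (in its surface form, which ignores participating media like that smoke) is:

L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i

In words: outgoing light at a point is whatever the surface emits plus the integral, over the whole hemisphere, of incoming light weighted by the BRDF and the cosine term. Path tracing estimates that integral with random samples; volumetric variants add scattering and absorption terms along the ray as well.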
 
I was under the assumption that most of the RTX games are path traced, and that's why they need the denoiser? Quake 2 RTX seems to be path traced. Minecraft seems to be path traced. Am I wrong?
Yeah, I don't think we're ever going to see full path tracing in games. It's too wasteful. Raytracing with multiple rays per pixel + multiple bounces + denoising is probably the best we can expect for a few hundred years :)
Multiple rays per pixel + multiple bounces... didn't you just describe path tracing? I thought plain vanilla ray tracing was a single ray per pixel, do some bouncing depending on the surface, and then trace to each light source.
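
For what it's worth, that single-ray-per-pixel, trace-to-each-light approach is classic Whitted-style ray tracing. A rough sketch using the same made-up scene/material helpers as the path-tracer pseudocode above (reflect/refract and the material fields are also placeholders):

Code:
def whitted_trace(scene, ray, depth):
    if depth > MAX_BOUNCES:
        return Color(0, 0, 0)
    hit = scene.intersect(ray)
    if hit is None:
        return scene.background(ray)
    color = Color(0, 0, 0)
    # Direct lighting: one shadow ray per light source, which is exactly
    # why classic ray tracing gives perfectly hard shadows.
    for light in scene.lights:
        if not scene.occluded(hit.point, light.position):
            color += hit.material.shade(hit, light)
    # Deterministic secondary rays only for mirror-like and transparent surfaces,
    # which is why reflections and refractions come "virtually for free".
    if hit.material.is_reflective:
        color += hit.material.reflectance * whitted_trace(scene, reflect(ray, hit), depth + 1)
    if hit.material.is_refractive:
        color += hit.material.transmittance * whitted_trace(scene, refract(ray, hit), depth + 1)
    return color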
 
It looks quite a bit better.

Of course, if it was an AMD demo, you'd be singing its praises...

Again, it looks better, and that's expected following Moore's law and cramming more face rigging and pixels onto the screen. But it looks just as fake. It's comparing a sock puppet to a muppet. Fake fake fake fake fake.

Haha, and I only have Nvidia gear, you hack.
 
I was under the assumption that most of the RTX games are path traced, and that's why they need the denoiser? Quake 2 RTX seems to be path traced. Minecraft seems to be path traced. Am I wrong?

Multiple rays per pixel + multiple bounces... didn't you just describe path tracing? I thought plain vanilla ray tracing was a single ray per pixel, do some bouncing depending on the surface, and then trace to each light source.

Yeah, essentially. The method described by Nvidia isn't classic stochastic path tracing, though. It also traces the final bounce to a light source and doesn't just scatter rays randomly into the scene and hope to hit a light.
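
That trick is generally known as next event estimation (explicit light sampling): at each bounce you also pick a light and fire a shadow ray straight at it, rather than waiting for a random bounce to stumble onto one. A hedged sketch of the difference, reusing the same hypothetical helpers as the earlier pseudocode:

Code:
import random

def trace_path_nee(scene, ray, depth):
    if depth > MAX_BOUNCES:
        return Color(0, 0, 0)
    hit = scene.intersect(ray)
    if hit is None:
        return scene.background(ray)
    color = Color(0, 0, 0)
    # Next event estimation: explicitly sample a light source at every bounce,
    # so even short paths pick up direct lighting instead of relying on luck.
    light = random.choice(scene.lights)
    if not scene.occluded(hit.point, light.sample_point()):
        color += hit.material.albedo * light.contribution(hit)
    # ...then continue the random walk for the indirect bounces as before.
    new_ray = hit.material.scatter(ray, hit)
    color += hit.material.albedo * trace_path_nee(scene, new_ray, depth + 1)
    return color

(A real implementation also has to avoid double-counting lights that the random walk happens to hit directly, but that's the basic idea.)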
 
Which is the case for every incremental improvement in 3D graphics for the past 30 years. You have a better suggestion?

Bad argument: "I don't like this singer" -> "Well, can you sing better?"

If I could make it better I'd be working for Nvidia and not arguing with a MangoSeed on a dead forum. I don't know what they need to do; it just looks like shit. They cram more and more polygons and lighting into each gen and it still looks fake fake fake. I'm never fooled into thinking I am watching video of a person as soon as they start moving. I've said this before: I don't think there is anyone, like I dunno a photographer or an artist, taking a step back, looking at the big picture, and giving feedback to whatever engineers make these demos. It's just more and more lifeless tech. We are missing something that is required to make the jump to lifelike.

No idea what that is, and I don't need to know to have an opinion about it, because I am the consumer of their product and I can say it looks like shit whenever I want. I've done you one better and even provided specific reasons. Like superhero movies, some people get validation from and identify with these companies, so naturally they get offended when you don't join in the circle jerk for their product.
 
If you just focus on the lighting it looks pretty good. The animations and some other things throw it off, but that’s not the point.

Like the cups, her shirt when she sits down, etc.
 
Bad argument: "I don't like this singer" -> "Well, can you sing better?"

If I could make it better I'd be working for Nvidia and not arguing with a MangoSeed on a dead forum. I don't know what they need to do; it just looks like shit. They cram more and more polygons and lighting into each gen and it still looks fake fake fake. I'm never fooled into thinking I am watching video of a person as soon as they start moving. I've said this before: I don't think there is anyone, like I dunno a photographer or an artist, taking a step back, looking at the big picture, and giving feedback to whatever engineers make these demos. It's just more and more lifeless tech. We are missing something that is required to make the jump to lifelike.

No idea what that is, and I don't need to know to have an opinion about it, because I am the consumer of their product and I can say it looks like shit whenever I want. I've done you one better and even provided specific reasons. Like superhero movies, some people get validation from and identify with these companies, so naturally they get offended when you don't join in the circle jerk for their product.

Calm down there cowboy. I didn’t say you need to be a super genius to have an opinion.

I meant, what is the point of whining about incremental improvements in technology? Seems your only other option is to freeze yourself and come back in a few hundred years.

And why do people keep talking as if raytracing belongs to one company? It's just an algorithm that has been around forever. It's not some proprietary thing owned by one "side".
 
Visual gains don't justify the performance penalty at this time.
 
Bad argument: "I don't like this singer" -> "Well, can you sing better?"

Bad argument.

No idea what that is, and I don't need to know to have an opinion about it, because I am the consumer of their product and I can say it looks like shit whenever I want. I've done you one better and even provided specific reasons. Like superhero movies, some people get validation from and identify with these companies, so naturally they get offended when you don't join in the circle jerk for their product.

You seem to go out of your way to point out that you are "not in a circle jerk for <whatever you are bitching about at the moment>".

Sorry, we don't know what's wrong with you or how to help you get the "validation" you appear to crave. Perhaps you should talk to someone.

btw, not the slightest bit offended by your opinions... opinions are like assholes, everybody has one. In mine it does look better. And we know yours.

Please come back and tell us we are "getting validated", "shills", or whatever else you come up with next...

I will agree that some people fit your description, but I would suggest you not assume that about everyone whose opinion doesn't align with yours. My 2¢, free.
 
Bad argument.



You seem to go out of your way to point out that you are "not in a circle jerk for <whatever you are bitching about at the moment>".

Sorry, we don't know what's wrong with you or how to help you get the "validation" you appear to crave. Perhaps you should talk to someone.

btw, not the slightest bit offended by your opinions... opinions are like assholes, everybody has one. In mine it does look better. And we know yours.

Please come back and tell us we are "getting validated", "shills", or whatever else you come up with next...

I will agree that some people fit your description, but I would suggest you not assume that about everyone whose opinion doesn't align with yours. My 2¢, free.

no u
 
People are talking all this technical detail about whether current raytracing is good enough or not.

Who cares.

Just look at Control, and in particular at how good the reflections are. It adds enough to the graphics that it's worth it if you can run it at a decent frame rate. Who gives a shit if it's 100% or not.

This would be like complaining back in the 3DFX days that true 3D engines like Quake 2 weren't 'real' full raster engines because, in order to run fast enough, things were only being done with 16-bit precision. Who cares, if it's running well enough and still offering enough fidelity? You don't need CGI-movie-level raytracing/raster with perfect precision, nor are games CAD software where perfect precision is required.
 
It adds enough to the graphics that it's worth it if you can run it at a decent frame rate.

It adds enough to the graphics that it's worth it if you can afford the hardware... IMHO most gamers can't, and that's why they are playing console or mobile games. Another factor is games... for the past 2 years the PC game releases have been anything but awe-inspiring, so why lay out lots of cash for ray tracing hardware?
 
It adds enough to the graphics that it's worth it if you can afford the hardware... IMHO most gamers can't, and that's why they are playing console or mobile games. Another factor is games... for the past 2 years the PC game releases have been anything but awe-inspiring, so why lay out lots of cash for ray tracing hardware?

Good point, but the same could have been said when the first GPUs were out. You could play Quake 2, Monster Truck Madness, etc. all under a software renderer if you didn't really care and/or didn't have the money for a GPU.
 
Good point, but the same could have been said when the first GPUs were out. You could play Quake 2, Monster Truck Madness, etc. all under a software renderer if you didn't really care and/or didn't have the money for a GPU.

The big difference is that hardware-accelerated Quake 2 was faster and ran at a higher resolution than the software version.

Right now it doesn't really accelerate anything, and reflections, even if imperfect, have existed for years without raytracing and were good enough. Heck, Doom 3 had a pretty neat mirror in the bathroom, and it made sense for a mirror to be there and not just everywhere, the way it's overused with RT on at the moment.
 
The big difference is that hardware-accelerated Quake 2 was faster and ran at a higher resolution than the software version.

Right now it doesn't really accelerate anything, and reflections, even if imperfect, have existed for years without raytracing and were good enough. Heck, Doom 3 had a pretty neat mirror in the bathroom, and it made sense for a mirror to be there and not just everywhere, the way it's overused with RT on at the moment.

Screen-space and environment-map-based reflections are actually quite terrible. SSR by definition can't reflect things that are off screen. The idea that this is OK is mind-boggling.

Environment maps are low resolution, don't support self-reflection, and incorrectly assume all reflected objects are the same distance away.
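
The off-screen limitation is baked into how SSR works: the reflected ray is marched through the depth buffer in screen space, and as soon as it leaves the screen there is simply nothing left to sample. A rough Python-style sketch with made-up buffer and helper names (MAX_STEPS, project_to_screen, depth_of, etc.), just to show where the failure happens:

Code:
def screen_space_reflection(depth_buffer, color_buffer, view_pos, view_dir, normal):
    # March the reflected ray through the depth buffer a fixed number of steps.
    reflected = reflect(view_dir, normal)
    pos = view_pos
    for _ in range(MAX_STEPS):
        pos = pos + reflected * STEP_SIZE
        sample_uv = project_to_screen(pos)
        # Ray left the screen: the reflected object was never rendered into the
        # colour buffer, so there is no data -- fade out or fall back to an env map.
        if not (0.0 <= sample_uv.x <= 1.0 and 0.0 <= sample_uv.y <= 1.0):
            return None
        # Ray went behind visible geometry: reuse whatever colour is already on screen.
        if depth_of(pos) >= depth_buffer.sample(sample_uv):
            return color_buffer.sample(sample_uv)
    return None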

Just because we’re used to terrible things doesn’t make them good enough.
 
Older reflections were tricks. They would do things like have a floor (or mirror) really be a semi-transparent plane, and then actually duplicate everything in the scene that needed to be reflected and place the copies (flipped) behind the plane.

This actually works well for a bathroom mirror, but it has many limitations, such as the surface having to be a flat plane, and it can only be used in a limited way or else performance would tank.
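
In practice the "duplicate and flip" is usually done with a reflection matrix: render the scene a second time mirrored about the mirror plane, masked to the mirror surface with a stencil or clip plane. A small sketch of the math (assuming a plane through the origin; draw_scene is a hypothetical stand-in, not any particular engine's API):

Code:
import numpy as np

def reflection_matrix(normal):
    """4x4 matrix that mirrors points about a plane through the origin with unit normal `normal`."""
    n = np.asarray(normal, dtype=float)
    m = np.eye(4)
    m[:3, :3] = np.eye(3) - 2.0 * np.outer(n, n)   # Householder reflection
    return m

# Render pass sketch: draw the scene normally, then draw it again mirrored and
# stenciled to the mirror polygon so the copy only shows up "behind the glass".
# draw_scene(view_matrix)                                       # normal pass
# draw_scene(view_matrix @ reflection_matrix([0.0, 0.0, 1.0]))  # mirrored pass

Flipping the geometry also flips its winding order, so a real implementation would swap the face-culling mode for the mirrored pass.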
 
For the massive performance hit, yeah, they were good enough. Especially when you account for how little is fully chromed in reality. But whatever, enjoy perfectly reflective puddles everywhere and streets that look mirror-polished, all for the low, low cost of a couple of notches of resolution or half your performance on the current generation.
The comparison to hardware-accelerated Quake 2 from back in the day vs. software is still wrong.
 
For the massive performance hit, yeah, they were good enough. Especially when you account for how little is fully chromed in reality. But whatever, enjoy perfectly reflective puddles everywhere and streets that look mirror-polished, all for the low, low cost of a couple of notches of resolution or half your performance on the current generation.
The comparison to hardware-accelerated Quake 2 from back in the day vs. software is still wrong.

What are you complaining about: that you don't think the demo looks good, or that the hardware is too expensive? And the comparison to Voodoo cards isn't too far off. Better graphics, more expensive.
 