NVIDIA RTX: Cinematic Real-Time Ray Tracing

DooKey
[H]F Junkie. Joined: Apr 25, 2001. Messages: 13,500
There's been a lot of buzz this week since Nvidia announced their new RTX line at SIGGRAPH. We've covered it pretty extensively over the last couple of days. Yesterday they released another cool video that shows cinematic real-time ray tracing, and I think you'll agree that it looks pretty dang awesome. I want my games to look like this from now on. Hopefully we'll see developers jump on this bandwagon right away.

Watch the video here.
 
That looked amazing.
Also I liked how it went from serious to fun.
 
The video doesn't tell me a whole lot though. How do I know that is real time, and not pre-rendered? Show me what it would look like without real-time ray-tracing, so I can see what the difference is. The only thing that is "standing out" to me is that there are a lot of shiny reflections, which seem overdone. Maybe there are other things that make it look "better" without the technology, but without a "before and after" how would I know?
 
Damn you Nvidia....

 
The video doesn't tell me a whole lot though. How do I know that is real time, and not pre-rendered? Show me what it would look like without real-time ray-tracing, so I can see what the difference is. The only thing that is "standing out" to me is that there are a lot of shiny reflections, which seem overdone. Maybe there are other things that make it look "better" without the technology, but without a "before and after" how would I know?

Agreed. Show me what the difference is.
 
Well that was fucking stupid.

Also, while I'm sure ray tracing was obviously used, I didn't see anything THAT impressive myself.

Try looking at the subtleties in the lighting, all computed in real time. There isn't anything else like it. If you don't see what is actually happening on the surfaces, then I'm sorry for you. Continue playing Doom. Rendering something like this in the past would have been impossible in real time; it would have taken hours to complete each frame.

The video doesn't tell me a whole lot though. How do I know that is real time, and not pre-rendered? Show me what it would look like without real-time ray-tracing, so I can see what the difference is. The only thing that is "standing out" to me is that there are a lot of shiny reflections, which seem overdone. Maybe there are other things that make it look "better" without the technology, but without a "before and after" how would I know?

How do you know? Well, I guess you don't. But do you really think nVidia would showcase this saying it's real time when it's not, when companies will be spending half a million dollars on rendering farms? This obviously isn't a product you understand.
 
I don't think it really looks any better than most games. However, I wouldn't expect the first generation of this stuff to eclipse current game engines, and it remains to be seen how this will work in a larger "map" than a small room. But there isn't any aliasing, which is nice, and the quality of the diffuse lighting, the glow from light sources, and the depth-of-field effects are all really high.
 
About time the actual gameplay could look as good as the pre-rendered cut-scenes.
 
...

How do you know? Well, I guess you don't. But do you really think nVidia would showcase this saying it's real time when it's not, when companies will be spending half a million dollars on rendering farms? This obviously isn't a product you understand.

Well they've done it before. And they would have gotten away with it if it weren't for those pesky woodscrews.
 
The video doesn't tell me a whole lot though. How do I know that is real time, and not pre-rendered? Show me what it would look like without real-time ray-tracing, so I can see what the difference is. The only thing that is "standing out" to me is that there are a lot of shiny reflections, which seem overdone. Maybe there are other things that make it look "better" without the technology, but without a "before and after" how would I know?
It's much different to have a demo than to have a game. Lots of demos were ahead of their time, but when all you're rendering is a person in a confined room, anything can look really good. Look at this Ruby "Double Cross" demo from 2004, meant to show off the Radeon X800. It pretty much looks like what games are like today, even though 14 years have passed.



And again, the Nvidia Dawn demo from 2003. Again, a single 3D model in a room without any game mechanics beyond a scripted cut scene. If Nvidia wants to impress me, take Fallout 4, mod it to do ray tracing, and then let's talk.

 
Do you all really believe that this technology is going to be affordable to mid-budget gamers anytime soon? Because that's exactly what's needed to kickstart adoption among game designers...

Also, do you think an affordable gamer's card from nVidia is going to be able to do ray tracing at anything over 1080p 30fps? Hell, it will probably struggle even with that.

Don't get me wrong, this is amazing tech that I have been waiting for ever since buying my first 3dfx card so many years ago. But I'm also a realist who knows nVidia is not all about the gamers, and this tech is not going to be mainstream for many years to come.
 
Do you all really believe that this technology is going to be affordable to mid-budget gamers anytime soon? Because that's exactly what's needed to kickstart adoption among game designers...

Also, do you think an affordable gamer's card from nVidia is going to be able to do ray tracing at anything over 1080p 30fps? Hell, it will probably struggle even with that.

Don't get me wrong, this is amazing tech that I have been waiting for ever since buying my first 3dfx card so many years ago. But I'm also a realist who knows nVidia is not all about the gamers, and this tech is not going to be mainstream for many years to come.

I expect it to be mainstream quickly.

The differentiator will be just how much ray-tracing is being used, given that this will start out as a 'second path' for lighting.

The big kicker would be AMD putting it in the next console SoCs. Obviously those are still deep in development, so we'll see varying levels used to varying effect on desktop games, but that will be the 'critical mass' moment.
 
It's much different to have a demo than to have a game. Lots of demos were ahead of their time, but when all you're rendering is a person in a confined room, anything can look really good. Look at this Ruby "Double Cross" demo from 2004, meant to show off the Radeon X800. It pretty much looks like what games are like today, even though 14 years have passed.



And again, the Nvidia Dawn demo from 2003. Again, a single 3D model in a room without any game mechanics beyond a scripted cut scene. If Nvidia wants to impress me, take Fallout 4, mod it to do ray tracing, and then let's talk.


It'd probably be easier to create a whole new engine from scratch. Gamebryo reminds me of those old beat-up cars you see with different-colored parts.
 
If you don't see what is actually happening on the surfaces, then I'm sorry for you. Continue playing Doom. Rendering something like this in the past would have been impossible in real time; it would have taken hours to complete each frame.

Oh, sod off with your condescending BULLSHIT! What's with people like you needing to be insulting over such stupid shit? Yes, I saw the subtleties, but it's not some mind-blowing shit. Neither is it something that would have taken "hours each frame"... did GPU power just jump ahead a decade in one generation?
 
It'd probably be easier to create a whole new engine from scratch. Gamebryo reminds me of those old beat-up cars you see with different-colored parts.

Several engines, notably Frostbite, Unreal, and Unity, already support both Vulkan's and DX12's ray tracing APIs, so the support is there. NVIDIA is simply adding its own framework on top that enables performance boosts on its cards. Without the specialized HW/SW there simply isn't a way to do ray tracing at a playable FPS today.

What likely happens long term is that NVIDIA's solution gets rolled into the next API revision, at which point AMD will catch up. No different from when NVIDIA and ATI each made separate pixel shader implementations; they eventually got combined and rolled into DX9.
 
Oh, sod off with your condescending BULLSHIT! What's with people like you needing to be insulting over such stupid shit? Yes, I saw the subtleties, but it's not some mind-blowing shit. Neither is it something that would have taken "hours each frame"... did GPU power just jump ahead a decade in one generation?

With respect to ray-tracing, it just did. Going from shader processors to dedicated hardware has been quoted as a 25x jump in ray-tracing performance.

And with respect to relative quality- just getting ray-tracing to the 'level' that current raster methods can reach is an achievement. Not because that's a jump in graphical fidelity, but because it shows that developers can start piling on the detail in ways that the raster engines simply cannot.
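For scale, a rough ray budget is easy to work out. Assuming the roughly 10 gigarays per second figure NVIDIA has quoted for Turing's RT cores (an assumption brought in here, not a number from this thread), 1080p at 60 fps leaves on the order of 80 rays per pixel per frame:

```python
# Back-of-the-envelope ray budget, assuming (hypothetically) a
# throughput of 10 billion rays traced per second.
RAYS_PER_SECOND = 10e9
WIDTH, HEIGHT, FPS = 1920, 1080, 60

pixels_per_second = WIDTH * HEIGHT * FPS          # ~124 million
rays_per_pixel = RAYS_PER_SECOND / pixels_per_second
print(f"{rays_per_pixel:.0f} rays per pixel per frame")  # ~80
```

A few dozen rays per pixel is nowhere near offline film quality (hundreds to thousands of samples), which is why the hybrid approach pairs a handful of traced rays with aggressive denoising.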
 
Oh, sod off with your condescending BULLSHIT! What's with people like you needing to be insulting over such stupid shit? Yes, I saw the subtleties, but it's not some mind-blowing shit. Neither is it something that would have taken "hours each frame"... did GPU power just jump ahead a decade in one generation?

Reflection and refraction of light are near impossible to do realistically without a ray tracing engine; that's why those effects are the focus of this demo. Remember how "god rays" kill performance even in modern games? You get those for free in a ray tracing engine.

I won't speak for performance, but adding specialized HW to handle Ray Tracing is the right way to go if you care about performance.
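To illustrate why a ray tracer gets mirror reflections almost for free: once you have a hit point and a surface normal, the reflected ray is one formula and one more trace of the same function. A toy, self-contained sketch (one hard-coded sphere and a fake sky, nothing from any real engine):

```python
import numpy as np

def reflect(d, n):
    # Mirror the incoming direction d about the unit surface normal n.
    return d - 2.0 * np.dot(d, n) * n

def hit_sphere(origin, direction, center, radius):
    # Standard ray/sphere intersection; returns distance t or None.
    oc = origin - center
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - np.sqrt(disc)
    return t if t > 1e-4 else None

def shade(origin, direction, depth=0):
    # One hard-coded mirrored sphere; the "sky" fades with ray height.
    center, radius = np.array([0.0, 0.0, -3.0]), 1.0
    t = hit_sphere(origin, direction, center, radius)
    if t is None or depth >= 2:
        return np.array([0.5, 0.7, 1.0]) * max(direction[1], 0.0)
    point = origin + t * direction
    normal = (point - center) / radius
    # The mirror reflection is just one more trace of the bounced ray.
    return 0.8 * shade(point, reflect(direction, normal), depth + 1)
```

A rasterizer has no equivalent of that recursive call; it has to approximate the bounce with environment maps or screen-space tricks, which is why reflections are where ray tracing stands out first.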
 
Several engines, notably Frostbite, Unreal, and Unity, already support both Vulkan's and DX12's ray tracing APIs, so the support is there. NVIDIA is simply adding its own framework on top that enables performance boosts on its cards. Without the specialized HW/SW there simply isn't a way to do ray tracing at a playable FPS today.

What likely happens long term is that NVIDIA's solution gets rolled into the next API revision, at which point AMD will catch up. No different from when NVIDIA and ATI each made separate pixel shader implementations; they eventually got combined and rolled into DX9.
If it's anything like the existing multi-GPU support that's also built into some of those engines, then I'm not too optimistic about it.
 
That's pretty insane; this single card is doing the workload of four previous ones.

This is a fairly exciting release.
 
Oh, sod off with your condescending BULLSHIT! What's with people like you needing to be insulting over such stupid shit? Yes, I saw the subtleties, but it's not some mind-blowing shit. Neither is it something that would have taken "hours each frame"... did GPU power just jump ahead a decade in one generation?
Some people look up at the stars and just see several dots of light, walking away bored and unimpressed. Others look at those same dots and see the beauty that lies beneath, the subtleties that make each star unique, and the stories they tell.

You claim you noticed the subtleties, yet you seem to be missing the details that others are appreciating. Forest and the trees, so to speak.

Meanwhile, you call fwiler1 condescending, yet your mastery of sophisticated word choices doesn't strengthen your original comment or your reply; it only weakens them.
 
It'd probably be easier to create a whole new engine from scratch. Gamebryo reminds me of those old beat up cars you see with a different color parts.
Speaking of cars, here's a car being ray traced on three PS3s. I'm sure it doesn't impress anyone here because it needs that many PS3s, until you realize that a modern PC can emulate the PS3 easily, and emulating anything requires at least 10x more processing power than the hardware being emulated. The IBM demo is just a 3D model in a confined space. Doesn't that sound familiar?



Do you all really believe that this technology is going to be affordable to mid-budget gamers anytime soon? Because that's exactly what's needed to kickstart adoption among game designers...

Also, do you think an affordable gamer's card from nVidia is going to be able to do ray tracing at anything over 1080p 30fps? Hell, it will probably struggle even with that.

Don't get me wrong, this is amazing tech that I have been waiting for ever since buying my first 3dfx card so many years ago. But I'm also a realist who knows nVidia is not all about the gamers, and this tech is not going to be mainstream for many years to come.
I don't expect Nvidia's version of ray tracing to be affordable, but I do expect AMD's to be mainstream. I really, really doubt Nvidia's ray tracing is 100% on the GPU. I'm willing to bet it uses the CPU with some post-processing on the GPU to limit the number of rays cast.

Several engines, notably Frostbite, Unreal, and Unity, already support both Vulkan's and DX12's ray tracing APIs, so the support is there. NVIDIA is simply adding its own framework on top that enables performance boosts on its cards. Without the specialized HW/SW there simply isn't a way to do ray tracing at a playable FPS today.
Oh really? Then why can I download this demo made in 2012 that uses DX11 and gives me real-time ray tracing? Nvidia is adding the same shit we saw from Creative's EAX and Nvidia's PhysX: it runs on top of existing APIs but is proprietary enough to lock out competitors.

 
Not sure why anyone thinks anything is really changing any time soon. I have no doubts about this being real time... but what were they rendering it on?

How many shiny new RTX 8000s were in that system? $40 thousand worth? Perhaps only $20 thousand... or was it rendered on one 8000, which would only put the hardware cost at $10k. lmao

These demos are wonderful and cool and all... however, the hardware to power this at gaming frame rates hasn't been shown to be widely available at realistic end-user pricing. This is just more "hey, it might be cool in 7 or 8 years" territory... which is where real-time ray tracing has been for 20 years.

Their 2080 is coming, sure... but is it going to have the horsepower to render that demo at 60 fps at even 1080p? I doubt it. I'd be happy to be wrong.
 
Oh really? Then why can I download this demo made in 2012 that uses DX11 and gives me real-time ray tracing? Nvidia is adding the same shit we saw from Creative's EAX and Nvidia's PhysX: it runs on top of existing APIs but is proprietary enough to lock out competitors.

Nah bro, apparently people like you and me are just idiots who don't really "get it".
 
Ray tracing in real time would be great, but this demo is lackluster as hell. Maybe that's because the building blocks are crap. It reminds me of those nineties demos where every surface was overly shiny just because we could do it. In this static environment you could have faked all of this with static maps; there's no need for real-time ray tracing.
That's probably why it doesn't look impressive at all: we have already seen much better scenes in games rendered in real time. It doesn't matter if the lighting was faked if the end result is convincing.

Long story short: it's not enough to just do something with ray tracing and call it a day. Use it wisely, and sparingly, where it would really stand out if the lighting were faked. This scene could be done at equal graphical fidelity without real-time ray tracing; that's why this is a meh.
 
It looks impressive, yes.

When I see it running in a real game at 1440p 60 FPS, then I'll be onboard.
 
This looks amazing, simply beyond what we have today, WOW

You either have old hardware or haven't played newer games.

Try Vermintide 2
Try Star Wars Battlefront 2
Try Hunt Showdown
Heck, even Doom...

Yeah - I'm not sure I see anything on this demo that looks beyond what I've already seen my current system capable of doing...
 
Need to start somewhere. The hardware needs to come first. Then the games later.
What I'm curious about is whether this helps animated movies. Will it make render times faster and look better for animated shows on TV?
 
The video doesn't tell me a whole lot though. How do I know that is real time, and not pre-rendered?
If we trust Jensen, he stated that it was being rendered in real time during the Siggraph presentation. I believe he also stated that it was being rendered in real time using eight Turing Quadros.
(Not sure if he actually needed all 8, or if it was just a matter of... "if we have 'em why not use 'em?")


Show me what it would look like without real-time ray-tracing, so I can see what the difference is.
One of the fun things about nVidia's keynote is that he actually showed a ton of examples of scenes with ray tracing on and ray tracing off. He didn't show them for this particular scene, but he showed similar examples, from an arch-viz apartment demo with robots down to the simplest Cornell box. They actually did a great job showing the differences.


The only thing that is "standing out" to me is that there are a lot of shiny reflections, which seem overdone. Maybe there are other things that make it look "better" without the technology, but without a "before and after" how would I know?
Well, actually, you're right. Ray tracing provides basically three features over what rasterization can do: reflections, refractions, and shadows. That's pretty much it. So yes, shiny reflections, blurry reflections, soft dynamic shadows, and refractive transparent objects are pretty much all we're ever going to get from this. But reflections and shadows are SO HUGE when you're trying to render photorealistic scenes for film. Any increase in accuracy in how light reflects and bounces from surfaces and how shadows are cast makes a massive difference in how real we perceive the image to be.
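Of those three, refraction is the one rasterizers fake worst, and per hit it is just Snell's law. A minimal sketch (plain Python, scalar angles rather than full direction vectors, not code from any NVIDIA SDK):

```python
import math

def refract_cos(cos_i, n1, n2):
    """Snell's law, n1*sin(i) = n2*sin(t): given the cosine of the
    incident angle and the two refractive indices, return the cosine of
    the transmitted angle, or None on total internal reflection."""
    sin_i = math.sqrt(max(0.0, 1.0 - cos_i * cos_i))
    sin_t = (n1 / n2) * sin_i
    if sin_t > 1.0:
        return None  # total internal reflection: trace the mirror ray instead
    return math.sqrt(1.0 - sin_t * sin_t)

# Air (n=1.0) into glass (n=1.5) at 45 degrees: the ray bends toward
# the normal, to roughly 28 degrees.
cos_t = refract_cos(math.cos(math.radians(45)), 1.0, 1.5)
print(math.degrees(math.acos(cos_t)))
```

A ray tracer just spawns a new ray in that bent direction at every glass surface; a rasterizer has no natural way to follow it, which is why transparent objects in games are usually alpha-blended rather than truly refractive.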


Speaking of cars, here's a car being ray traced on three PS3s. I'm sure it doesn't impress anyone here because it needs that many PS3s, until you realize that a modern PC can emulate the PS3 easily, and emulating anything requires at least 10x more processing power than the hardware being emulated. The IBM demo is just a 3D model in a confined space. Doesn't that sound familiar?
The problem with real-time ray-traced scenes until now is that they had to be confined to tiny, simple demos like that car because the frame buffer simply wasn't large enough for production use. The big change here is the 96 GB frame buffer. That is the first frame buffer that can realistically be used with an actual arch-viz or VFX production scene.

Not sure if you saw the Spider-Man scene in the keynote, but the cityscape they used was the total opposite of a confined space.


You either have old hardware or haven't played newer games.

Try Vermintide 2
Try Star Wars Battlefront 2
Try Hunt Showdown
Heck, even Doom...

Yeah - I'm not sure I see anything on this demo that looks beyond what I've already seen my current system capable of doing...
That is true... and... not true. Everything we saw in the video can be done in DirectX 11, but (and this is kind of a huge but) it has to be baked first, using offline ray tracing. So if you change the lighting, everything looks wrong. This is terrible for both arch viz and film, because most of what you do is experiment with and match lighting schemes. Any change or movement of the lights and things start to look... and I don't mean this derogatorily... like a game.


Need to start somewhere. The hardware needs to come first. Then the games later.
What I'm curious about is whether this helps animated movies. Will it make render times faster and look better for animated shows on TV?
Yes, it does. Obviously I can't go into detail, but the Octane 4 beta already uses nVidia's AI lights, denoising, and AA, and it's SO much freaking faster than Octane 3. I'm getting similar feedback from the folks in the Arnold and VRay betas.
 
Still won't buy from 'em, but it's a pretty good-looking demo. I'll wait for AMD (or maaaybe Intel in a few years) to bring its flavor. #NeverGreen, too many shitty moves. A shiny demo doesn't hide the greed...
 
Still won't buy from 'em, but it's a pretty good-looking demo. I'll wait for AMD (or maaaybe Intel in a few years) to bring its flavor. #NeverGreen, too many shitty moves. A shiny demo doesn't hide the greed...
So edgy. Shooting your own gaming in the foot to spite a company that's never heard of you. A+++++ post, would laugh at again.
 
The most beautiful UE4 cheesy-dancing Fortnite ad ever. Take my money... in 5 to 7 years.
 
Looks decent until you notice the temporal filtering falling apart when the robot arms start moving rapidly and occluding other objects in screen space. (And this is with a static, heavily scripted camera; with user control of the camera it will look much worse.) It amazes me that they can't come up with a solution to this yet.

I'd rather have better image quality (read: not awful low-quality TAA that costs barely more than FXAA per frame) than better shading.
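The ghosting described above falls directly out of how temporal accumulation works: each frame blends only a small fraction of the new sample into a history buffer, so stale history from a disoccluded region decays slowly. A toy scalar sketch (the alpha value is illustrative, not taken from any shipping engine):

```python
def accumulate(history, current, alpha=0.1):
    # TAA-style exponential blend: keep 90% of history, take 10% new.
    return alpha * current + (1.0 - alpha) * history

# Disocclusion: the true color jumps from 1.0 (old surface) to 0.0,
# but the history buffer still holds the old surface's value.
history, frames = 1.0, 0
while history > 0.1:  # ghost stays visible until history decays below 10%
    history = accumulate(history, current=0.0)
    frames += 1
print(frames)  # 22 frames of visible ghosting at alpha = 0.1
```

Real TAA mitigates this with neighborhood clamping and motion-vector rejection of stale history, which is exactly what appears to break down on the fast-moving robot arms in the demo.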
 