Microsoft Introduces DirectX Raytracing

rgMekanic ([H]ard|News) · Joined: May 13, 2013 · Messages: 6,943
At GDC this morning, Microsoft announced support for hardware-accelerated raytracing coming to DirectX 12, known as DXR. Microsoft stated: "Raytracing is an upcoming Windows 10 feature and a new paradigm for both DirectX and PIX on Windows and consequently we plan to evolve PIX on Windows significantly in this area based on input from developers."

PCPerspective is on the scene reporting on NVIDIA RTX Technology, which NVIDIA describes as a combination of hardware and software to improve the performance of ray tracing algorithms on NVIDIA hardware, working hand in hand with DXR. RTX will only run on Volta GPUs; however, PCPer states that NVIDIA was light on the details as to what RTX actually is. Alongside the announcement, NVIDIA also announced "GameWorks Ray Tracing."

On the AMD side, a PR rep sent this over earlier today:
"AMD is collaborating with Microsoft to help define, refine and support the future of DirectX12 and ray tracing. AMD remains at the forefront of new programming model and application programming interface (API) innovation based on a forward-looking, system-level foundation for graphics programming. We’re looking forward to discussing with game developers their ideas and feedback related to PC-based ray tracing techniques for image quality, effects opportunities, and performance."

Raytracing has long been the holy grail of PC graphics, and with DirectX integration, the time may finally be upon us. That is, if we can get graphics cards with enough horsepower to run it.
 
So is the graininess just post processing? Or is it a "feature" of ray tracing? I don't recall that on the Doom 3 Arena RT demo from Intel a few years back.
 
So is the graininess just post processing? Or is it a "feature" of ray tracing? I don't recall that on the Doom 3 Arena RT demo from Intel a few years back.

Doom 3 Arena RT demo??? WHAT?
 
I would guess post-process. Film grain hides imperfections in the material, and your brain fills it in, making it look "better" with it on.
 
I would guess post-process. Film grain hides imperfections in the material, and your brain fills it in, making it look "better" with it on.

If that's so, I ain't on the bandwagon...
I've never seen a raytracing demo that exceeds the visual quality you can get without it.
 
That graininess is an artefact of RT: each photon is simulated in an all-or-nothing manner, either it hits an object or it doesn't, with no partial hits. So you get a huge amount of noise unless you use tons of photons to diffuse all the grainy individual photon hits.

Edit: This is why I've always said raytracing will never, NEVER replace raster graphics. If you showed these graphics to the world 10 years ago and said they were realtime, people would have shat their pants, but nowadays we can accomplish BETTER and more realistic visuals without all the grainy surface reflection using raster graphics. And in ten years, when PCs are powerful enough to render EVEN BETTER raytracing, raster visuals will be that much further ahead again.

Raytracing will be useful as a PART of a deferred pipeline (like for calculating AO or GI which many engines actually already do) but you will always be behind visually when you try to use RT to do the entire pipeline.
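The all-or-nothing sampling described above can be sketched in a few lines of Python (a toy Monte Carlo model, with `hit_probability` standing in for actual scene visibility; the estimate's standard error falls off as 1/sqrt(N), which is why low sample counts look grainy):

```python
import random

def shade_pixel(num_samples, hit_probability=0.3, seed=0):
    """Toy Monte Carlo pixel estimate.

    Each ray either hits the bright object (contributes 1.0) or misses
    (contributes 0.0) -- the all-or-nothing hits described above. The
    estimate converges to hit_probability as num_samples grows.
    """
    rng = random.Random(seed)
    hits = sum(1 for _ in range(num_samples) if rng.random() < hit_probability)
    return hits / num_samples

noisy = shade_pixel(8)        # few samples: grainy, can land far from 0.3
clean = shade_pixel(100_000)  # many samples: converges close to 0.3
```

With 8 samples, the per-pixel estimate can be wildly off its true value; with 100,000 it lands within a fraction of a percent, which is exactly the noise-versus-samples trade-off being argued about here.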
 
I would guess post process. Film grain hides imperfections in the material an your brain fills it in making it look "better" with it on.


That's not film grain; that's a raytracing artefact from a lack of samples.
 
That graininess is an artefact of RT: each photon is simulated in an all-or-nothing manner, either it hits an object or it doesn't, with no partial hits. So you get a huge amount of noise unless you use tons of photons to diffuse all the grainy individual photon hits.

Edit: This is why I've always said raytracing will never, NEVER replace raster graphics. If you showed these graphics to the world 10 years ago and said they were realtime, people would have shat their pants, but nowadays we can accomplish BETTER and more realistic visuals without all the grainy surface reflection using raster graphics. And in ten years, when PCs are powerful enough to render EVEN BETTER raytracing, raster visuals will be that much further ahead again.

Raytracing will be useful as a PART of a deferred pipeline (like for calculating AO or GI which many engines actually already do) but you will always be behind visually when you try to use RT to do the entire pipeline.


Rasterization has a finite ceiling. Raytracing, in theory, can perfectly recreate our physical world.

This is less about raytracing being awesome *now*, and more about the framework/foundation to allow raytracing to be awesome in 8-10 years.
 
This is fake raytracing or something?

If it's fake, then it's just another marketing stunt to get people to switch to Win 10.
 
Oh, F off already. Microsoft is like a drowning cat at this point, trying to latch onto any gimmick to make 10 seem relevant.

Just like DX12 itself, if "DXR" is Windows 10 only, it's already DOA. Developers will just ignore it. And likewise, if NVIDIA's RTX requires Windows 10, it's equally DOA. I don't think NVIDIA is that short-sighted, though.

improve the performance of ray tracing algorithms on NVIDIA hardware, working hand in hand with DXR

Gotta have DXR to have RTX, gotta have DirectX 12 to have DXR.
 
Nope, not there yet... They need to bump those samples up a bit, but if they do, that film becomes a slideshow at best.
 
Realtime raytracing seems like the proverbial carrot on a stick. It doesn't matter how fast we run, pixel counts and expectations of image fidelity rise just as fast.
 
If you watch past the opening part, after it stops being dark in the room the system they are using starts looking better.

Yes, you can see the sampling is just a little short of producing a grain free image but holy hell that was CLOSE. And the quality of the reflections off of... everything. It's eye candy of the highest order. My brain was really liking how that whole scene looked in motion.

Maybe we are almost there.
 
Rasterization has a finite ceiling. Raytracing, in theory, can perfectly recreate our physical world.

This is less about raytracing being awesome *now*, and more about the framework/foundation to allow raytracing to be awesome in 8-10 years.

Raster graphics do not have a finite ceiling; in fact, no graphics pipeline has an inherent ceiling. As I said earlier, no matter how much time you give raytracing, raster pipelines receive the same amount of time, meaning the IQ of raster will always be ahead of raytracing.

The best way to utilise RT in graphics (like I said) is in hybrid approaches that integrate it into a deferred pipeline, the same way Unreal creates pseudo-RT through volumetric textures using Distance Field maps, or 2.5D PBR reflections using Screen Space Reflections.

A hybrid approach will always produce better IQ in a given render time than pure RT alone.
 
Might finally set up a Win 10 VM if this all comes together at a reasonable price, in my lifetime. Of course it would be connected to the "cloud" as little as possible (well maybe satellite internet will be cheap, fast and unlimited by then...ha ha).
 
I'm really confused by this. On one hand, we have NVIDIA working with Microsoft to help develop a ray tracing API. Seems good, everyone wins. But then there's RTX: a GameWorks-type middleware that only works on Volta and newer GPUs.

There’s a lot of information lacking here. Unless some of this was explained in the video (didn’t watch that yet).
 
I'm really confused by this. On one hand, we have NVIDIA working with Microsoft to help develop a ray tracing API. Seems good, everyone wins. But then there's RTX: a GameWorks-type middleware that only works on Volta and newer GPUs.

There’s a lot of information lacking here. Unless some of this was explained in the video (didn’t watch that yet).

Microsoft DXR = the API path for raytracing, using compute shaders

Nvidia RTX = hardware-level optimisation for specific Nvidia software that utilises Microsoft's API path.
 
Sure, raytracing for DirectX might seem impressive, but I'm still waiting for the Enjoyment add-on for DirectX that makes games better, not just marginally prettier. Better visuals don't make a game more fun, enjoyable, or better, especially when devs are on ever-tighter deadlines and most AAA studios seem to sacrifice playability before they'd sacrifice nicer cut-scenes.
 
I guess I don't understand what raytracing is.

Looked like a standard CGI demo to me.
 
So is the graininess just post processing? Or is it a "feature" of ray tracing? I don't recall that on the Doom 3 Arena RT demo from Intel a few years back.
That's because this uses path tracing. Path tracing is infamous in the 3D rendering world for how grainy it is compared to other ray tracing systems (and also for the amount of fireflies it produces).

On the other hand, the advantage of path tracing is that if you increase the samples enough to remove the noise (increasing render time), it's beautiful, and it allows both the texturing process and the final render to be very physically accurate.

I wouldn't have chosen path tracing for anything realtime, but with AI noise removal it improves by leaps and bounds. Check the promotional materials for Octane version 4, which show how much better the interactive preview works with AI noise removal, or NVIDIA's own work on this (http://www.gpurendering.com/technology/nvidiaAiForImageNoiseReduction.html). Still, although it's really good for speeding up offline rendering and previews, I think scanline "fake raytrace" techniques have gotten so good that real realtime ray tracing will actually look like a step back for a couple of generations.
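For a concrete sense of why fireflies are such a problem for path tracing, here's a toy Python sketch of one common workaround, sample clamping (the `clamp` threshold is an arbitrary illustrative value, not anything from Octane or NVIDIA's denoiser):

```python
def average_with_clamp(samples, clamp=10.0):
    """Average radiance samples, clamping outliers ("fireflies") first.

    A path tracer occasionally produces a sample thousands of times
    brighter than its neighbours; clamping trades a little energy loss
    for far less visible noise in the averaged pixel.
    """
    return sum(min(s, clamp) for s in samples) / len(samples)

# One firefly sample (5000.0) dominates the plain average of a pixel
# whose true brightness is around 0.5:
samples = [0.4, 0.5, 0.45, 5000.0, 0.5]
plain = sum(samples) / len(samples)    # ~1000.37, a blinding white dot
clamped = average_with_clamp(samples)  # ~2.37, much closer to its neighbours
```

AI denoisers attack the same problem statistically across neighbouring pixels rather than by clamping each pixel in isolation.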
 
I guess I don't understand what raytracing is.

Looked like a standard CGI demo to me.

..or substandard; I think I've seen current games (in real time) look better than that on a decent system...
 
I guess I don't understand what raytracing is.

Looked like a standard CGI demo to me.

It's a different way of rendering computer graphics. Wikipedia has a decent article on it if you're interested in the technical details, but it is another way of rendering a scene. It has some advantages over the rasterization GPUs do now, namely that it scales very well to high polygon counts, and disadvantages, namely resolution.
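The "different way of rendering" boils down to asking, per pixel, what the nearest object along a ray is, instead of projecting triangles onto the screen as rasterization does. A minimal Python sketch of that core visibility query, ray-sphere intersection (all names here are my own illustration):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance t to the nearest hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t >= 0
    via the quadratic formula; direction is assumed normalized.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                   # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t >= 0 else None

# Ray from the origin looking down +z toward a sphere at z=5, radius 1:
t = ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)  # t == 4.0
```

A real tracer runs a query like this against every object (via an acceleration structure) for every ray, then spawns shadow, reflection, and refraction rays from each hit point.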
 
That's because this uses path tracing. Path tracing is infamous in the 3D rendering world for how grainy it is compared to other ray tracing systems (and also for the amount of fireflies it produces).

On the other hand, the advantage of path tracing is that if you increase the samples enough to remove the noise (increasing render time), it's beautiful, and it allows both the texturing process and the final render to be very physically accurate.

So why would they use this vid as their "tech demo" if it looks like ass??

And if that's the case, they should really be calling it DXP and be straight up with people, instead of showing us a grainy-ass YouTube video and telling us this is why we should dump Win7...
 
I experimented with raytracing back in the '90s, first with the DOS versions of 3D Studio Max and Lightwave, and later POV-Ray. I rendered some pretty cool still frames, but I never understood the details of the technology.


How does raytracing differ from what Direct3D, OpenGL, Vulkan, etc. do today?
 
That graininess is an artefact of RT: each photon is simulated in an all-or-nothing manner, either it hits an object or it doesn't, with no partial hits. So you get a huge amount of noise unless you use tons of photons to diffuse all the grainy individual photon hits.

Edit: This is why I've always said raytracing will never, NEVER replace raster graphics. If you showed these graphics to the world 10 years ago and said they were realtime, people would have shat their pants, but nowadays we can accomplish BETTER and more realistic visuals without all the grainy surface reflection using raster graphics. And in ten years, when PCs are powerful enough to render EVEN BETTER raytracing, raster visuals will be that much further ahead again.

Raytracing will be useful as a PART of a deferred pipeline (like for calculating AO or GI which many engines actually already do) but you will always be behind visually when you try to use RT to do the entire pipeline.

I used to believe that myself, until I started heavily researching quantum ray-tracing. Needless to say, the results blew me away and ultimately showed that when quantum computers become feasible and practical for companies, they're going to accelerate the technological capabilities of CGI beyond rasterizing. Here is a QC simulator using superposition to approximate samples and dramatically decrease the noise in an image.

On a different note, anyone who does computer modeling and rendering for a living will appreciate how impressive these results are, because rendering a single frame of a highly photo-realistic image can take hours, let alone instantaneously producing these types of results. Kudos.
 
Oh boys, we has more GameWorks. Can't wait for my games to get slower for no reason.
 
I used to believe that myself, until I started heavily researching quantum ray-tracing. Needless to say, the results blew me away and ultimately showed that when quantum computers become feasible and practical for companies, they're going to accelerate the technological capabilities of CGI beyond rasterizing. Here is a QC simulator using superposition to approximate samples and dramatically decrease the noise in an image.

On a different note, anyone who does computer modeling and rendering for a living will appreciate how impressive these results are, because rendering a single frame of a highly photo-realistic image can take hours, let alone instantaneously producing these types of results. Kudos.

It is impressive, no doubt about it. But hybrid raster graphics will always match or beat that image quality, only faster.

Once quantum computers get cheap enough to be used for offline rendering by art studios, my grandchildren will enjoy the movies they make. But I don't see supercooled quantum chips being a household item in this century.
 
Amazing. I remember running a ray tracing program on my Commodore Amiga 1000 in 1986, and it would take 2-3 hours to render a single 640x400 image. Now, we may FINALLY get to see real time, >30FPS ray traced graphics. Well, it only took about 32 years. :)
 
"RTX will only run on Volta GPUs"

Hah, I knew it! NVIDIA had to have a selling point that would make Pascal look old hat in order to sell Volta hardware! Whether it works is another matter... all I know is that my GTX 980's getting a bit long in the tooth for VR (it's fine outside of that) and I've been waiting for the Volta equivalent of the 1080 Ti for a while now.

As for raytracing in general, I kinda wish there was a real-time raytracer that represented that distinct '90s CGI quality you'd see in games like MegaRace, System Shock, Crusader and so forth in their pre-rendered intros, banding, lighting and everything, as the foundation for a throwback game aesthetic - sorta like playing out those pre-rendered scenes in real-time decades later. I mean, real-time raytracing was already demonstrated years ago back on HD Radeon 4000-class GPUs, from what I recall. It's been possible, just never really developed on a standard API for modern GPGPUs.

Amazing. I remember running a ray tracing program on my Commodore Amiga 1000 in 1986, and it would take 2-3 hours to render a single 640x400 image. Now, we may FINALLY get to see real time, >30FPS ray traced graphics. Well, it only took about 32 years. :)
Please tell me you've still got that Amiga 1000 kicking around somewhere... weak as it sounds now, it's worth money like every other bit of Amiga hardware nowadays, especially since it's the first of the Amiga line. Heck, I'm sure someone will find a way to cram a Vampire accelerator board into it soon enough.
 
why isn't Vulkan being adopted more?

Because generic APIs have the universal problem of being unintuitive: they have to cover every possible use case. From that perspective, DX has better performance, debug tools, and ease of use.

Both APIs screwed up though; they went too low level and no one wants to deal with all the headaches that involves.
 
Very cool, someday we'll really get there. The stuff I always like with this kind of tech is how light bounces around: in a game, instead of "baking" light into the textures where the actual lights aren't doing their job, one raytraced light can light a scene for you (and the more times you let a ray bounce, the more realistic the lighting becomes, but every bounce gets more computationally expensive). It's supposed to help with different types of "materials" too, so depending on how opaque or transparent something is, it can let some light pass through an object (like light shining on a yellow piece of "plastic", where the light on the other side takes on that same-ish color, etc.).

Though the last time I studied the subject was going on 15 years ago, so the tech has probably changed in how they're trying to accomplish it, like how they used photon mapping in Final Fantasy: The Spirits Within. It's still such an expensive way to produce because you have to choose how many photons to use in a scene and all that other stuff, which can dramatically change a scene and the time it takes to render a single frame. Keep in mind this was back in the late '90s, as it took four years to make The Spirits Within, but for the nearly 150k frames, each frame took 90 minutes to render (Maya/RenderMan) on a massive server farm. Could you imagine doing a 3-minute scene and then saying "nope, doesn't look right" and starting over? It takes a long time to re-render ~4k frames, lol. It may not have been the best of animated movies, but it was, at the time, a tech marvel that ultimately resulted in Square Pictures being shut down :( oops. ANYWAYS, because of stuff like this we've found much easier ways to "fake" it and get really good results. So yeah, been waiting for true ray-tracing for some time now, lol, because that stuff looks amazing.
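The per-bounce cost blow-up mentioned above is easy to put numbers on. A toy Python sketch (the 4-rays-per-hit figure is purely illustrative; real path tracers dodge this by following one random bounce per path):

```python
def total_rays(pixels, rays_per_bounce, max_bounces):
    """Rays traced when every hit spawns rays_per_bounce secondary rays.

    Cost grows geometrically with bounce depth, which is why renderers
    cap the number of bounces.
    """
    total = 0
    rays = pixels  # one primary ray per pixel
    for _ in range(max_bounces + 1):
        total += rays
        rays *= rays_per_bounce
    return total

# 1920x1080 frame, 4 secondary rays per hit:
one_bounce = total_rays(1920 * 1080, 4, 1)     # 5x the pixel count
three_bounces = total_rays(1920 * 1080, 4, 3)  # 85x the pixel count
```

Going from one bounce to three multiplies the ray count seventeen-fold for this toy setup, per frame, sixty times a second, which is the trade-off the post above is describing.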
 