Nvidia enables DXR on Pascal GPUs

euskalzabe

So... there you go.

This is what you do when sales are not going well. What they could have enabled from day one, a software/driver-based implementation of ray tracing, is now turned on for most Pascal GPUs.

Sure, RTX will accelerate things, 3x according to Nvidia (and that sounds like a fair estimate). But they now realize they need far more GPUs capable of running DXR before studios will actually use it.

Also, this is further confirmation that AMD's Navi will do the same, at a minimum. Same for next-gen consoles.

I do hope this means DXR is here to stay, now that it can be run by the majority of GPU users. It's finally worth it for studios to invest the time and money in implementing these effects. By next year, there'll be an actual decent reason to buy DXR-accelerating hardware.
 
It's good that people will be able to try out raytracing for themselves even if it runs like crap on an older card. I just wonder if we will see any solutions where it will be reasonable to use it, especially on consoles that typically have far worse hardware. Exciting times for sure.
 
Sweet, I was looking for another graphics option to disable in order for my aged 1070 to maintain minimum framerates.
 
This is a, well.. interesting development.

If you look at the BFV image, there is only maybe a 10-15% performance boost on the 2080 from using the RT cores versus not using them.

But yeah, RT is struggling as it is even with RTX, so running it on something like a 1060 would make no sense.
 
It's a good move though, and gives me hope AMD may be able to cook up something similar.
 
Very good development!
There are many people who would like to play with this technology but lack the hardware, or are plainly locked out of it because they use laptops.
Now they can experiment with it.

Also, NV should have done this right from the start, as a way to show that RT and tensor cores actually do speed things up.
 
Really?
This is how people flaunt their ignorance?

DXR = DirectX Raytracing.

HARDWARE AGNOSTIC.

All it requires is a DX12 GPU.
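
For anyone who wants to verify that on their own machine, here's a minimal C++ sketch of the standard D3D12 feature-support query an application uses to see whether the installed driver exposes DXR at all (struct and enum names are from the public DX12 headers; error handling is trimmed, and this only reports support, it doesn't say anything about speed):

#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

// Link against d3d12.lib. Creates a device on the default adapter and asks
// the runtime which ray tracing tier, if any, the driver reports.
int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No DX12 device available.");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("DXR supported (tier 1.0 or higher).");
    } else {
        std::puts("DX12 device present, but the driver exposes no DXR support.");
    }
    return 0;
}

On a Pascal card this check only starts reporting tier 1.0 once the new driver is installed, which is exactly what this announcement is about.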

No effing news!

Links:
https://en.wikipedia.org/wiki/DirectX_Raytracing
https://devblogs.microsoft.com/directx/directx-raytracing-and-the-windows-10-october-2018-update/
https://devblogs.microsoft.com/directx/announcing-microsoft-directx-raytracing/

Now give us performance numbers... and the key to why NVIDIA put RT cores in their RTX cards is revealed...

ANY DX12 GPU will be able to do ray tracing... but there will be LARGE differences in performance. A GTX 1080 Ti has about 10% of the RT performance of an RTX 2080 Ti...


Must be a slow news day...(ah...yes, the source is a rumor-site...)
 
Interesting that the gains are shit on BFV. Reinforces the past observations that the engine is terrible for DX12 and RT. Metro has much better utilization (bigger increases).
 
BFV's DXR implementation was done for the Titan V and uses a low number of ray intersections. It runs complex shaders for each ray, so those quickly hog shader performance.
RT cores save shader time on intersection testing, but since the per-ray shading takes so much processing power, the intersection calculations themselves don't take much time in comparison.
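
To put that reasoning in numbers, here's a tiny back-of-the-envelope sketch (the percentages and speedup factor are illustrative assumptions, not measured values): if only a small slice of the per-ray cost is intersection testing, accelerating that slice with RT cores can't buy much overall.

#include <cstdio>

// Amdahl's-law style estimate with made-up numbers: assume 15% of the ray
// tracing workload is BVH traversal/intersection and 85% is per-ray shading,
// and that RT cores make the intersection part 10x faster.
int main() {
    const double intersect_share = 0.15;   // assumed share of cost in intersection
    const double shade_share     = 0.85;   // assumed share of cost in shading
    const double rt_core_speedup = 10.0;   // assumed intersection speedup

    const double new_cost = shade_share + intersect_share / rt_core_speedup;
    std::printf("Overall ray tracing speedup: %.2fx\n", 1.0 / new_cost);  // ~1.16x
    return 0;
}

With shading dominating like that, the end-to-end gain lands in the 10-15% range, roughly what the BFV comparison above shows; flip the ratio toward many cheap rays and the RT cores start to matter a lot more.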

The way to use RTX cards to their full potential would be to use much simpler shaders and a lot more rays. Ray tracing itself can produce even better effects than emulating them with shaders and only using ray tracing to calculate new points in space to feed the usual shaders.

With GTX cards getting DXR and hopefully AMD cards soon, we might instead see more implementations like the one BFV uses... which is not that bad actually, because market saturation of RT-accelerated GPUs is pretty pathetic right now and will be for quite some time. In the meantime we can expect projects like Quake 2 RTX to fill the gap for fully path-traced games, which will be the benchmark for how GPU graphics will look in the future when everyone has RT-like cores in their GPUs.

Imho it is a great move from NV to add DXR capability to older cards.
Now AMD should just do the same. Is AMD up to the task though? AMD? AMD... AMD!!!!!111
 