NVIDIA Extends DirectX Raytracing (DXR) Support to Many GeForce GTX GPUs

Not that this isn't cool, but I'm left wondering what the utility is. Maybe for that high-quality screenshot function Nvidia has?

Otherwise, I don't see this as being useful for actually playing games.


[and the possibility of using a dedicated RT GPU seems extremely remote; that'd be harder to get right than SLI]
 
Well, the rumors appear to be true. It'll be interesting to test this out later. If they'd go a step further and allow us to use a dedicated card for RT/DXR, I'd be fully impressed.

https://www.techpowerup.com/254528/...ytracing-dxr-support-to-many-geforce-gtx-gpus
Offloading ray tracing to another card won't be the magical panacea people are imagining. For ray tracing to work, the entire framebuffer for each frame needs to be known, meaning it has to be synchronized across a bus, and you would essentially take at least twice as long to render a mixed rasterized/ray traced image across two separate cards. Programming for it would be a nightmare. No game would ever support it.
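To put rough numbers on that synchronization cost, here is a back-of-envelope sketch in C++. The ~32 bytes per pixel of G-buffer data and the ~13 GB/s of effective PCIe 3.0 x16 bandwidth are assumptions on my part, not figures from Nvidia:

```cpp
#include <cstdio>

int main() {
    // Assumptions: 4K G-buffer at ~32 bytes per pixel (depth, normals,
    // albedo, roughness, motion vectors) and ~13 GB/s of effective
    // PCIe 3.0 x16 bandwidth. Both numbers are rough guesses.
    const double pixels        = 3840.0 * 2160.0;
    const double bytesPerPixel = 32.0;
    const double pcieBytesPerS = 13.0e9;

    const double frameBytes = pixels * bytesPerPixel;
    const double copyMs     = frameBytes / pcieBytesPerS * 1000.0;

    // Prints roughly: G-buffer ~253 MiB, one-way copy ~20 ms.
    // That is longer than an entire 16.7 ms frame at 60 fps, before the
    // second card has traced a single ray or sent anything back.
    std::printf("G-buffer: %.0f MiB, one-way PCIe copy: %.1f ms\n",
                frameBytes / (1024.0 * 1024.0), copyMs);
    return 0;
}
```

Even if the real working set is half that size, the copies alone eat a big slice of a 60 fps frame budget, which is exactly the point being made above.
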
Not that this isn't cool, but I'm left wondering what the utility is. Maybe for that high-quality screenshot function Nvidia has?

Otherwise, I don't see this as being useful for actually playing games.


[and the possibility of using a dedicated RT GPU seems extremely remote; that'd be harder to get right than SLI]
Ja, I'm happy that 1060 owners can now play ray traced games at 5 FPS.
 
This wasn't a "rumor"; it was announced about a month ago by Nvidia. They just prioritized RTX on the cards that support it.

This is a definite improvement over the transition to programmable shaders in DX8, where you couldn't even see the demos and early games. Now you can try every demo and game option.

Capping it at the GTX 1060 also means you will not be greeted with 1 fps on your 1050 (never fun to watch), so it's definitely good advertising.
 
I still think it's lame to have it supported on cards like the 1060 but not the 980 Ti, given that both support DX12 feature level 12_1.
 
Just did a little testing with Metro using the rig in my profile. That 1080 Ti is clocked at 2012-2025 MHz / 5858 MHz. At 4K with everything maxed it was an amusing slideshow. Dropped RT to 'high' and got ~20 fps. Dropped to 1080p at 'high' and it wasn't horrible, but it varied greatly from the 40s to the 60s. Yeah, can't wait to see all the sites publishing 1660/1060 slideshow results. Well, at least this might shut up some of those people who claim the RT/Tensor cores are useless.
 
This wasn't a "rumor"; it was announced about a month ago by Nvidia. They just prioritized RTX on the cards that support it.

This is a definite improvement over the transition to programmable shaders in DX8, where you couldn't even see the demos and early games. Now you can try every demo and game option.

Capping it at the GTX 1060 also means you will not be greeted with 1 fps on your 1050 (never fun to watch), so it's definitely good advertising.

Well, since it was announced for April, many people thought it was an April Fools' joke.


I'm going to try the SW reflection demo on my GTX 1070 Ti and see how it goes.
 
Offloading ray tracing to another card won't be the magical panacea people are imagining. For ray tracing to work, the entire framebuffer for each frame needs to be known, meaning it has to be synchronized across a bus, and you would essentially take at least twice as long to render a mixed rasterized/ray traced image across two separate cards. Programming for it would be a nightmare. No game would ever support it.

Ja, I'm happy that 1060 owners can now play ray traced games at 5 FPS.

I read in a forum somewhere that you could split the scene in two, so the cards could each do half.
 
Translation: "Now that we've met our initial sales goals for RTX, we will be releasing it for the rest of the hold outs to try". "Thank you for being an RTX Beta Tester!".
 
I tried it in UE4 and it's slow as fuck on a GTX 1080. It naturally scales really hard with your resolution.

Ray traced shadows work alright. They look great and performance is vaguely passable. They're probably quite fast on an RTX card (and are basically the cheapest ray tracing effect).

Anything else just completely rocks your shit. Reflections will absolutely tank if the majority of your screen is reflective, like looking down at a plane of water. Ambient occlusion is super heavy as well. Global illumination will just destroy you instantly; enjoy 1 fps.
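
For anyone else poking at this in UE4, the early ray tracing support is driven by console variables, so you can switch each effect on or off and watch the frame time move. A minimal sketch, assuming the cvar names from the 4.22-era DXR preview (r.RayTracing.Shadows, r.RayTracing.Reflections, r.RayTracing.AmbientOcclusion, r.RayTracing.GlobalIllumination); check them against your engine build, as the names may differ:

```cpp
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

// Toggle the individual ray traced effects from game code inside a UE4 module.
// The cvar names below are assumptions based on the 4.22 DXR preview builds.
static void SetRayTracedEffects(bool bShadows, bool bReflections, bool bAO, bool bGI)
{
    const struct { const TCHAR* Name; bool bEnabled; } Effects[] = {
        { TEXT("r.RayTracing.Shadows"),            bShadows     },
        { TEXT("r.RayTracing.Reflections"),        bReflections },
        { TEXT("r.RayTracing.AmbientOcclusion"),   bAO          },
        { TEXT("r.RayTracing.GlobalIllumination"), bGI          },
    };

    for (const auto& Effect : Effects)
    {
        if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Effect.Name))
        {
            // 0 falls back to the raster path, 1 enables the ray traced version.
            CVar->Set(Effect.bEnabled ? 1 : 0);
        }
    }
}
```

Turning them on one at a time is a quick way to reproduce the ranking above: shadows first, then reflections and AO, with GI left for last.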
 
Tried out Quake 2 with the RT mod on my 1080 Ti, and the only way to get relatively smooth performance was to play at 720p. At 4K it struggled to reach double-digit frame rates before crashing after taking a few steps in the first level. At 1080p I was lucky when it went over 20 fps, and it was just barely playable enough that I could check out the game. 720p got me to playable levels, with the frame rate staying in the 40s most of the time and getting as high as the 50s in smaller areas. Even though it brought my 1080 Ti to its knees and then some, it was still cool seeing all the ray traced effects (before it crashed a few levels in). The only thing is that the dynamic lights and shadows looked very noisy up close. My guess is that this is due to the lack of tensor cores on the 1080 Ti to denoise the effects.
 
I still think it's lame to have it supported on cards like the 1060 but not the 980 Ti, given that both support DX12 feature level 12_1.
It sucks for all 980 Ti owners, but to give NV some slack, implementing such a feature needs at the very least a lot of testing = money.
I do, however, believe it could be done...

This is a definite improvement over the transition to programmable shaders in DX8, where you couldn't even see the demos and early games. Now you can try every demo and game option.
It seems NV listened for once... or used their own marketing brain muscles, which seem to have some serious SLI input lag ;)

Tried out Quake 2 with the RT mod on my 1080 Ti, and the only way to get relatively smooth performance was to play at 720p. At 4K it struggled to reach double-digit frame rates before crashing after taking a few steps in the first level. At 1080p I was lucky when it went over 20 fps, and it was just barely playable enough that I could check out the game. 720p got me to playable levels, with the frame rate staying in the 40s most of the time and getting as high as the 50s in smaller areas. Even though it brought my 1080 Ti to its knees and then some, it was still cool seeing all the ray traced effects (before it crashed a few levels in). The only thing is that the dynamic lights and shadows looked very noisy up close. My guess is that this is due to the lack of tensor cores on the 1080 Ti to denoise the effects.
So Vulkan is also supported. Nice :)
 
And remember... this is not "equal" RT... Pascal is running a lesser version compared to Turing GPUs with Tensor and RT cores:
[Image: Nvidia GDC update slide comparing the DXR path on Pascal vs. Turing]
 
And just this tidbit, as most people don't seem to understand the technical side of DXR:
 
Translation: "Now that we've met our initial sales goals for RTX, we will be releasing it for the rest of the hold outs to try". "Thank you for being an RTX Beta Tester!".

...and for paying handsomely for the privilege....
 
Translation: "Now that we've met our initial sales goals for RTX, we will be releasing it for the rest of the hold outs to try". "Thank you for being an RTX Beta Tester!".

You must be one of those people who can't tell an X from an O, right?

Not all RT is created equal...but I guess those pesky facts are something you either don't care about...or are ignorant about?

So which is it...ignorance or dishonesty? :)
 
Now Nvidia can release a chart that shows the 20-series cards increasing fps by 500% over the 10-series cards.
 
Was that Ray Tracing on Ultra or High? All the benches Nvidia showed were with Ultra, which is kinda skewing the numbers even worse. The game looks great with RT on High.
It was on Ultra, as I was just trying to look at a worst-case scenario at 1080p.
 
Not that this isn't cool, but I'm left wondering what the utility is. Maybe for that high-quality screenshot function Nvidia has?

It's strictly a marketing gimmick, to make you want an RTX-enabled card. It probably means RTX is ~not~ hitting their sales goals, and they are trying to illustrate why you should go out and buy one. If RTX were doing fine, I can't see that there would have been any reason to do this.

Factum has a good point: Pascal -> Turing is pretty darn lackluster, and this gives them some material to just point at and say, "No, it really is better, see!"
 
It was on Ultra, as I was just trying to look at a worst-case scenario at 1080p.

Appreciate you posting that. I can definitely see RT doing something there. Maybe in motion it looks different, but just based on the SS, I kinda prefer it without RT. It's not so dark, and that lets some of the texture detail pop better. But with a moving image that could be different, and I may just be reading the SS wrong.
 
This is reverse psychology at its finest.

Consumer complains to company with regard to holding out on said "software add-on" feature.

Company delivers said feature as "free software upgrade".

Consumers are, yet again, disappointed as said feature does not deliver expected experience.

After much deliberation, consumers are then enlightened as to why said feature was never introduced in the first place, thus validating what the company was saying all along.

Enlightened consumer now sees the path to righteousness and promptly empties their wallet for a "proper experience".

In either case, the seeds are planted. Now it's just a waiting game.

Well played, Nvidia.
 
It's strictly a marketing gimmick, to make you want an RTX-enabled card. It probably means RTX is ~not~ hitting their sales goals, and they are trying to illustrate why you should go out and buy one. If RTX were doing fine, I can't see that there would have been any reason to do this.

Factum has a good point: Pascal -> Turing is pretty darn lackluster, and this gives them some material to just point at and say, "No, it really is better, see!"

Well, it really is better. As for enabling it on architectures that lack the hardware, this could be for a variety of reasons, including sales. I could see lower-end games making use of it in limited ways, particularly as interest in implementing ray tracing grows. It also allows those without the hardware to test the software, as was done with the software renderers of old. This also unifies their product stack a bit from a driver perspective.
 
Well played, Nvidia.

They got ray traced hardware with full multi-API, multi-OS driver and application support done in one hardware generation.

That's well played. I don't really mind that they are making a buck or two off the process (or not).
 
You must be one of those people who can't tell an X from an O, right?

Not all RT is created equal...but I guess those pesky facts are something you either don't care about...or are ignorant about?

So which is it...ignorance or dishonesty? :)


No complaints here, I got my RTX card for free :)
 
It's a subtle difference, but while playing, the lighting does feel more realistic and accurate.
Yeah, looking again at the darker parts, it has more contrast and shadowing and a bit more light definition and accuracy from the sources. But it feels a bit overdone, because it's damn bright sunshine outside, in a room full of windows, yet it's dark as a coal mine on the bottom part. This is what has always pissed me off about new graphics tech like this, e.g. 3D: they often push it to the limit for the sake of it (and the $ for including it), and it's never as much of an experience as when it's done in a complementary, natural, more mature way, where I can see it add true value and pleasure to the gaming experience.
 
And remember... this is not "equal" RT... Pascal is running a lesser version compared to Turing GPUs with Tensor and RT cores:
[Image: Nvidia GDC update slide comparing the DXR path on Pascal vs. Turing]
My understanding is that since performance on a non-RTX card is low, you are supposed to use "low" settings and low resolution to get playable framerates, not that the DXR implementation is any different from the RTX one.
 
My understanding is that since performance on a non-RTX card is low, you are supposed to use "low" settings and low resolution to get playable framerates, not that the DXR implementation is any different from the RTX one.

Yeah, I don't see why it would look any different, and so far in my fucking around, it doesn't. You can't really cheat your implementation.

Just runs like shit.
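
That matches how DXR is exposed at the API level: a game just asks D3D12 what ray tracing tier the device reports, and with the new driver a GTX 1060 or better should report Tier 1.0 the same way an RTX card does; whether the rays are serviced by RT cores or by compute shaders is the driver's problem. A minimal sketch of that check (error handling trimmed, and how a given title actually gates its RT options is up to the developer):

```cpp
#include <windows.h>
#include <d3d12.h>

// Query the device for DXR support. The same check passes on Turing and,
// with the DXR-enabled driver, on GTX 1060+/16-series cards; the tier value
// says nothing about whether dedicated RT hardware is doing the work.
bool SupportsDxr(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
    {
        return false;
    }
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

So the image should come out the same either way; the only thing the fallback can't hide is how long each frame takes.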
 
And remember... this is not "equal" RT... Pascal is running a lesser version compared to Turing GPUs with Tensor and RT cores:
[Image: Nvidia GDC update slide comparing the DXR path on Pascal vs. Turing]

I'm not sure this is really true. I just tried this with my 1080 Ti in Shadow of the Tomb Raider, and it appears to offer the same ray traced shadow options that an RTX card would have. It ran at about 30 FPS at 1080p, but it really did look amazing. Especially given how terrible everyone says RTX performance is, I was impressed it was even sorta-kinda playable. I expected single digits.

With Pascal you're obviously missing out on the hardware acceleration that the 20-series cards have, but it sure looks to me like it's mathematically the same on Pascal vs. Turing; the main difference is that Turing just has hardware to do the math faster.

Credit where it's due, it really did make me want a 2080 Ti. Most of my time with Shadow of the Tomb Raider so far has been on my 1050 Ti equipped laptop, but I thought it was really impressive how much more convincing the game looked with the ray traced shadows.

Edit: That was the "ultra" ray traced shadow preset. I didn't try the lower settings, which I gather they expect you to use with Pascal cards.
 
I'm not sure this is really true. I just tried this with my 1080 Ti in Shadow of the Tomb Raider, and it appears to offer the same ray traced shadow options that an RTX card would have. It ran at about 30 FPS at 1080p, but it really did look amazing. Especially given how terrible everyone says RTX performance is, I was impressed it was even sorta-kinda playable. I expected single digits.

With Pascal you're obviously missing out on the hardware acceleration that the 20-series cards have, but it sure looks to me like it's mathematically the same on Pascal vs. Turing; the main difference is that Turing just has hardware to do the math faster.

Credit where it's due, it really did make me want a 2080 Ti. Most of my time with Shadow of the Tomb Raider so far has been on my 1050 Ti equipped laptop, but I thought it was really impressive how much more convincing the game looked with the ray traced shadows.

Edit: That was the "ultra" ray traced shadow preset. I didn't try the lower settings, which I gather they expect you to use with Pascal cards.

It cost me a lot to upgrade all my stuff back around the holidays, probably around $6k, and I've still got about $1200 to pay off, but I upgraded to a 7.1 Dolby Atmos audio system, a 65" 4K/HDR TV, and a 2080 Ti for the rig in my signature. Shadow of the Tomb Raider looks and sounds absolutely amazing on it. I play with RT on Medium and just about everything else maxed with v-sync on. I keep lens flares, chromatic aberration, and DOF off. With these settings I get a solid 60 fps. If I use my OC settings I can run High and hold the mid 40s to high 50s, but considering how many things (vegetation, birds, bugs, people, and all their shadows) are often moving on screen, it's a lot nicer to my eyes to see 60+ all the time in this game. Ultra brings it down into the 30s, and I had a hard time telling the difference between it and High; this seems to be common for other games as well. So far this is the first game to truly check all the boxes for modern gaming tech. I honestly feel like their implementation of RT/DLSS improved both visuals and game performance. To me the added blur of DLSS is only just barely noticeable and well worth the gains; so far this is the first game I can say that for. I've got both Metro and BFV, and even though both show improvements, neither feels as optimized as this game does. Not to mention that Metro's HDR is still busted in 4K with RT/DLSS. I know each game has used RT in different ways, such as shadows, global illumination, etc., but Tomb Raider really knocks it out of the park. I imagine if someone were to pair a 2080 Ti with a G-Sync 4K/HDR monitor, they might get really close to 60 fps regularly with RT on High.
 